Writing a Page for the User NOT for Google

The primary reason to write any page, or to add any content to your site, should be to encourage your visitors to do what you want them to do. That could be to subscribe to your newsletter, to buy from you, to contact you, or even just to come back later on. The page or content is not necessarily the final step in the process of getting you more business, but it could well be a ‘link in the chain’ and therefore needs to be carefully crafted.

You will note that I say the page needs to be written for the user, and not Google. This may seem a strange thing for an SEO expert to say, but it is said with good reason. The fact is that Google, with its advanced algorithms (like RankBrain), is now looking for content that answers people’s needs and questions, not for a document liberally sprinkled with keywords and phrases. That is old school now and an avenue no longer worth pursuing.

What Sort of Content Should You Write

It is pretty easy to write a 2,000-word article that says very little, provides minimal information and does not answer any questions. What is hard is creating compelling content that will stand out from the crowd and meet the user’s expectations.

For instance, this article is designed to help you know what to write about, what to include in your page at a technical level – the headers, titles and descriptions – and why this is needed. It also must give you some idea of where to start, that is, what you should be writing about in the first place.

What to Write About

So, let’s start here: what should you write about? The first thing, when you can, is to write about something that you know and care about, something on which you can perhaps add your own personal view, one that may not be in tune with others. Of course, this is not always possible, so when you are ‘forced’ to write about something you do not know enough about, the first rule is to do the required research. This will enable you to create an article that is worth reading, one that is (hopefully) factually correct, whilst at the same time giving the reader your own slant on the subject.

But before you can start writing, you need to know what to write about. One of the best ways of helping your users (and attracting Google’s interest) is to start answering one of the thousands (millions?) of questions being asked every day online.

Finding the Questions to Answer

One of the best sources for questions is provided by KWFinder – https://app.kwfinder.com. This app allows you to search for the questions being asked as well as carry out conventional keyword research.

Examples of questions being asked on the web and autocompleted by Google

As you can see, many of these searches are not used very often, but that is the case for any long-tail phrase (longer than 3 words) and is of no matter. What is important is that you KNOW that someone is searching for the answer to these questions, and that, if you write a good enough article, it could answer a lot more than one question. It should also be kept in mind that people who use these long phrases are often far further down the buying cycle and therefore much more likely to purchase something. This means your article could well result in a sale…

Now You Have Your Topic

Once you have decided upon the topic, you can start writing the copy. This will be easy for some, but very hard for many, and is no doubt a skill in itself. However, even though you might not be Shakespeare, I am sure that you can have a pretty good go. All you have to do is to remember what question you are trying to answer and to break down that answer into simple steps and ensure that the layout does not present the user with a wall of text. Break up the copy with images and whitespace as needed.

Titles, Descriptions and Headers

The Title of a page always sits in the <head> block at the beginning of a web page’s source code. The title tag is the text wrapped in the <title> HTML tag, and it is shown (in most cases) as the headline of the search listing on results pages, as well as on the user’s browser tab. Its purpose is to describe the overarching intent of the page and the type of content a user can expect to see when they visit it. You can use up to 70 characters here, but many experts recommend a maximum of 60.

The Meta Description is used by the search engines to provide a bit more information about the page, this being shown underneath the Title in the SERPs. It does not directly affect the ranking of that page, but as it is used as a factor in the overall ‘quality’ of a site, it is something that deserves your attention.

Paying close attention to three things when writing a meta description can pay dividends, the three areas being branding, user intent, and what is already working well for others in your marketplace. It is in effect an ‘elevator speech’, and these 180-300 characters offer a special opportunity for your page to stand out from the crowd.

Headers are the next thing for you to consider. These ‘section headers’ (H1-H6) were originally intended to size text on a webpage, with the H1 being used for the primary title of a document and shown as the largest text on the page, and H2 to H6 being progressively smaller. However, the advent of Cascading Style Sheets (CSS) in the late 1990s meant that few designers used them for this purpose (indeed many misuse them today). Now their main purpose is to help Google understand the importance of each element on the page, i.e. what is the most important and what is the next most important.
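To make this concrete, here is a minimal sketch of how these three elements might sit in a page's HTML. The page topic, business name and wording are invented purely for illustration, not a recommendation for any particular site.

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Title: shown as the headline of the search listing and on the browser tab (aim for roughly 60 characters) -->
  <title>How to Choose a Garden Office | Acme Garden Rooms</title>

  <!-- Meta description: the 'elevator speech' shown under the Title in the SERPs -->
  <meta name="description" content="A plain-English guide to choosing a garden office: sizes, insulation, planning rules and typical costs.">
</head>
<body>
  <!-- One H1 describing the page as a whole, then H2s for each sub-topic -->
  <h1>How to Choose a Garden Office</h1>

  <h2>What size do you need?</h2>
  <p>...</p>

  <h2>Do you need planning permission?</h2>
  <p>...</p>
</body>
</html>
```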

Internal and External Linking

The purpose of any link on a page should be to enhance the user experience of any reader. In many cases this could mean linking out to an external site, which could mean you lose the visitor, but if it helps the user to better understand your message and to answer the question, then it should be used.

It may also help the SEO / ranking of the page (because Google is said to like pages that help others), but this is a debatable point, so it is best to use external links ‘where you think it helps’ and for no other reason.

As for internal links, again these should not be used for SEO, but instead to help users move around your site and better understand the answer to their problem.
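As a simple illustration (the URLs and anchor text below are made up), both types of link are just ordinary anchors; the point is that the anchor text describes what the reader will find, rather than being chosen purely for SEO.

```html
<!-- Internal link: helps the reader move to a related page on the same site -->
<p>If you are not sure which size to pick, see our
  <a href="/garden-offices/size-guide/">garden office size guide</a>.</p>

<!-- External link: points to another site where that genuinely helps the reader -->
<p>The current planning rules are summarised in this
  <a href="https://www.example.com/planning-guidance/">independent planning guide</a>.</p>
```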

Hopefully this will have provided you with some idea of what to write and how to construct the page in a way that will impress your visitors and Mr Google…

 

About the author

Graham Baylis was born in 1957 and has therefore seen the birth of the Internet and experienced at first hand, just how it has changed the World we live in. He has been involved with computers since 1983 and helped set up the first electronic mail system used by the Civil Service in 1986. He has gained several qualifications in Computing and Marketing and has written hundreds of blogs which you can find around the web, many being on customer’s websites. He has over 19 years of experience with Pay Per Click and SEO and has an answer for most Search Marketing questions, and, for those that he hasn’t is quick to find one. Continually coming up with solutions, Graham is one person it is worth having a chat with.

 

Some Reasons Why People and Google Hate Your Website

The reasons that Google could dislike your website are many; in fact there are over 200 so-called ‘signals’ that Google takes into account, some of which are known and some of which are not.

As to why people hate your site, well, those reasons are much more noticeable and easily fixed.

Let’s start with this easy area.

  • It takes a long time to load
  • It does not work well on mobile devices
  • The navigation is poor
  • It has auto play videos
  • The copy is badly written and is stuffed with keywords
  • The fonts are poor
  • It has too many Google Ads, especially at the top of the page
  • No About Us page
  • There is no easily found contact info
  • There is no blog

Now some of these are also on Google’s list of dislikes too, which makes them doubly important.

Slow Loading Sites

It is understandable that sites that load slowly will really annoy people, time being so precious in today’s fast moving world. However, Google have also made it very clear that they will rank fast loading sites higher than those that are slow.

There are many reasons that sites load slowly, one being the fact that the host being used is overloaded, something that is not always considered. Other reasons include images that have not been optimized for the web, and pages that make too many ‘calls’ on the server or are simply too code heavy.

Whatever the reason, this is an area that needs to be addressed.
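On the image side, for instance, a couple of small, widely supported HTML attributes can help; this is only a sketch of the idea, and the file name and dimensions are invented.

```html
<!-- Serve an appropriately sized, compressed image, tell the browser its dimensions
     so the layout does not jump, and defer loading until it is near the viewport -->
<img src="/images/garden-office-800w.jpg"
     width="800" height="533"
     loading="lazy"
     alt="Timber garden office with double doors">
```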

Not Optimised For Mobile Devices

With the rise in the use of mobile devices to access the internet, it is vital that any site can be viewed on the many mobile devices in use today. If it is not, then users could well leave the site just as soon as they arrive, never giving the site the chance to engage with that potential customer.

But there is another reason: if a site is not mobile optimized, then this is one of those signals that Google is looking for, and it will result in a poorer ranking for the site than it otherwise deserves.

Poor Navigation

This is another one of those areas that covers both the efficiency of a site at converting visitors and in the way Google views the site. The first is easy to understand, as if a user can’t find their way around a site easily, they are very likely to leave. However, Google also measures User Experience levels, and somehow, it can tell that a navigation system is poor and will mark it accordingly.

Auto Playing Videos

It is well known that users hate videos ‘playing at them’ when they arrive on a site, and it is quite possible that Google will also note this fact and again mark a site down.
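If a video genuinely belongs on the page, one option is simply to let the visitor start it themselves; a minimal sketch, with invented file names, might look like this.

```html
<!-- Show a poster frame and controls, but do not autoplay or pre-download the whole file -->
<video controls preload="metadata" poster="/video/showroom-tour-poster.jpg" width="640">
  <source src="/video/showroom-tour.mp4" type="video/mp4">
</video>
```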

Poor Copy

There is little doubt that poor content is a big turn off for viewers, this including poor grammar and spelling, as well as obviously uninteresting stuff. Putting it simply, if a viewer does not get something out of the interaction with a page, be it education, fun, or advice, they are likely to leave and never return.

Google also dislikes poor copy, and will not rank such a page well, plus, if they find too many such pages on a site they are likely to downgrade the entire site.

Badly Chosen Fonts

Some fonts look better than others, and because the fonts used on a site can now be controlled (at least to some degree), it is wise to choose one that looks the part.
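For example, a web font can be loaded alongside sensible fallbacks, so the text still looks acceptable if the font is slow to arrive; the font chosen here is just an illustration.

```html
<!-- Load one web font, then fall back to common system fonts -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link href="https://fonts.googleapis.com/css2?family=Open+Sans&amp;display=swap" rel="stylesheet">
<style>
  body { font-family: "Open Sans", Arial, Helvetica, sans-serif; line-height: 1.6; }
</style>
```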

Using Too Many Google Adverts

There is nothing against placing Google adverts on a site, but if you use too many, especially ‘above the fold’, then Google will downgrade the site and users too may leave earlier than they otherwise would.

No About Us Page

About Us pages are yet another way that both Google and users can be affected by the contents of a site. Users are known to like to check out the history of a business before they commit to contacting it or buying from it, and thus will often want to see what the business has to say about itself. Failure to provide this information, or providing very little of it, is likely to have a negative effect.

Similarly, Google likes a good About Us page, believing that this is a way of increasing the level of User Experience. So again, in order to boost the overall site quality, it is vital to include a detailed About Us page.

Contact Information Not Easily Found

Most sites have some contact information on them, but in some cases it is not easily found, requiring the user to visit the Contact Us page before finding what they want. If, on the other hand, the contact information is easily found, then both the user and Google will be pleased.

This is thought to be one of those signals that Google looks for and thus it is a very good idea to include contact information on every page of a site.

Lack of a Blog

Blogs are a great way of providing users with the sort of information they want, as well as creating content that Google too can rate. One big reason to have a blog is that before they ever contact a business, potential customers will want to know the business can help them. So, if they find answers to some of their questions they are much more likely to enter into the sales process.

Conclusion

As you can see, there are many ways you can get things right on a site and many ways you can get them wrong too. Paying attention to the factors above will help your site not only convert better, but also obtain higher rankings on Google.

 


 

How SEO Trends Have Changed Over Time (And What It Means for Your Site)

Anyone who’s not a snake oil salesman will tell you there’s no magic bullet when it comes to Search Engine Optimization (SEO). If you want to get to the top of the Search Engine Results Pages (SERPs) and, more importantly, stay there, you have to be on board with the latest SEO trends and best practices.


So how do we do this for our clients?

Simply put, knowing what the latest trends in SEO are enables us to adjust the way we do things, which helps us keep our clients a step ahead of the competition. The good news is that SEO fundamentals tend not to change too much (despite what many SEO companies say), so we don’t need to continually rebuild the strategies we use from the ground up every time Google puts out an update.

In this article, we’ll give you a brief crash course on SEO and its history, and then we’ll move on to four of the most recent trends in SEO and discuss how they can be used to get your site more traffic.

A Short History of SEO

The search engines (like Google) take a lot of factors into consideration when determining when and where to show your website on their results pages. However, not all of these are known and, in any case, their effect and value are bound to change over time, so you cannot just concentrate on one or two; you have to cover as many as you can, and then look for ways to rank more highly by optimizing your content and structure (hence the term ‘SEO’).

SEO has been around since the search engines started up

At the start, the Search Engines were quite easy to manipulate, but over time they have become more sophisticated and that has had a huge impact on the way you have to do SEO.

Simply put, if your website relies on search engines for its traffic, then you just have to take SEO into account. The problem is that search engine algorithms are constantly changing, which means that even though the basics (high-value content and links) remain the same, there isn’t a magic set of rules that will guarantee you high ranks and traffic.

4 Recent SEO Trends (And How to Harness Them)

Before you start to implement any of the issues we cover below, we would recommend that you first check on how well your site is optimised and whether you need to fix any basic issues (you will be surprised how many sites we see that simply don’t even have these right). The good news here is that we offer a FREE SEO check, so please contact us if you would like a report on your site.

Prioritize Mobile Optimization

You may not know it, but mobile traffic has long since overtaken desktop traffic. In other words, most of the people who visit your website will probably do so from their phones or other mobile devices.

This is a problem for lots of sites, as they were built with the idea that they would be viewed by people using a desktop or a laptop PC. This means they may not be optimized to provide a good mobile experience, and that’s a big problem, because a poor user experience on mobile will more than likely lead to people leaving very quickly.

With all that potential traffic in danger of being lost, you just have to take mobile optimization seriously, and this is even more vital now that Google rates sites on the basis of how they appear on a mobile device, whether or not the user is actually on a desktop PC.

There are a lot of things you can do to make your website more mobile-friendly, including:

  • Make sure your site is fully responsive (see the sketch after this list)
  • Test your website using multiple mobile devices, to ensure that it works perfectly across them all
  • Optimize your website’s construction to ensure it loads quickly
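On the first point, responsiveness starts with the viewport meta tag and styles that adapt to the width of the screen; a minimal sketch, with an invented class name, is shown below.

```html
<!-- Tell mobile browsers to use the device width rather than a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .feature-grid { display: flex; gap: 1rem; }

  /* Stack the columns on narrow screens */
  @media (max-width: 600px) {
    .feature-grid { flex-direction: column; }
  }
</style>
```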

Optimize Your Content for Voice Searches

The way people search for things on mobile devices is often quite different from how they do so on desktops or laptops. For example, say you get a craving for a Chinese meal and decide to look up what’s nearby.

If you’re on a PC, you’ll probably jump over to Google and type in something like “best Chinese restaurants”. However, if you’re on a phone, you’re more likely to use a voice search and ask something along the lines of “what are the best Chinese restaurants near me?”

Whichever search method is used, the search engines rely heavily on the keywords in the content to determine when to show which page; but when it comes to voice searches, people are more likely to use long-form keywords and ask full questions. This means that if you only optimize your content for “Chinese restaurant” you may be missing out on some traffic.

The experts are predicting that by 2020, half of all online searches will be voice-based, so it looks to be a smart move to start including long-form keywords within your SEO strategy. However, whilst the format might be a little different, at Serendipity we have always used / recommended long-tail keywords, something that will particularly benefit businesses with a local presence, since users are more likely to use mobile devices to look for nearby resources.

Add Structured Data Markup to Your Site

These days, search engines do a pretty good job of determining what is relevant content and what isn’t, relying on an increasingly complex system of content analysis. However, it is still a good idea to give them a little help.

There are many ways you can do this, such as by using relevant Meta Titles, descriptions and Header and subheading tags, adding alt tags to your images, and more.

However, if you want to go a step further, you can also add structured data to your content.

Structured data markup is a way of giving search engines more information about what your content involves.

Search engines, in turn, can use this data to display ‘rich snippets’, which are search results that contain all that extra information.

There are many types of structured data that you can add to your content, ranging from articles to recipes, online products, and more, but it is easy to get this wrong and so you have to approach this with caution, not to mention a little knowledge.
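Structured data is usually added as a block of JSON-LD in the page's HTML, using the schema.org vocabulary. The sketch below shows the general shape for a product page; the product name, price and image URL are invented for illustration.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example 40 inch Smart TV",
  "description": "A 40 inch smart TV with built-in streaming apps.",
  "image": "https://www.example.com/images/smart-tv-40.jpg",
  "offers": {
    "@type": "Offer",
    "price": "299.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```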

Optimize Your Content for Google Answer Boxes

Sometimes, when people ask Google a simple question, they will see the answer right within the SERPs in an ‘answer box’ format.

Answer boxes tell searchers: “Hey, here’s what you were looking for, so you don’t need to check out any other sources.” In fact, having one of your pages show up within a Google answer box can increase click-through rates by up to 32%. The problem is that you can’t just ask Google to feature your content within its answer boxes.

What you can do is to optimize your content to increase your chances. First, it’s important to understand that not all types of content work with answer boxes and also they will only show up when it’s a question Google is ‘confident’ it can answer.

That means most complex queries are out of the question, but if you’re confident that your content is a good fit for answer boxes, you can increase your chances of it showing up that way by following three simple steps:

  • Ensure that you clearly and concisely answer the question.
  • Break down your response using bullet point lists (search engines just love lists)
  • Optimize the meta description, so that it is clear that you’ll provide an answer within the content

Do bear in mind that you’re probably competing with a lot of other websites for the same answer boxes, so you will need to do your very best to beat the competition.
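In practice, that usually means stating the question as a heading and answering it immediately, before expanding on the detail. A rough sketch of the sort of layout that tends to suit answer boxes is below; the question and answer are invented for illustration.

```html
<h2>How often should you service a boiler?</h2>
<p>Most manufacturers recommend a service once a year, ideally before winter.</p>
<ul>
  <li>Book the service in late summer or early autumn</li>
  <li>Use a suitably qualified engineer</li>
  <li>Keep the paperwork, as it is often required by the warranty</li>
</ul>
```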

Conclusion

Search engines are constantly updating their algorithms, this being done to stop people from ‘gaming’ the system. This means that, apart from doing the obvious (great content and good quality links), understanding the latest trends can give you an advantage over the competition and enable you to overtake them in the SERPs.

 


 

 

SEO Trends for 2019

I have been working in the SEO field for some 17 years now, and am one of the few SEO consultants who state that overall, things are not that much different from the past, at least at the basic level.

I say this as today:-

  • Links are important, just as they were in 2000
  • Content is important, just as it was in 2000
  • Technical SEO is important, again, just as it was in 2000.

The difference is that in the first two cases Google has become more savvy (through the many changes to its algorithm) about what is good and what is not. In the case of Technical SEO, in 2000 you had to ensure that your site was well built, in that it was ‘clean’, not because you wanted to please Google, but because you had to contend with very slow download speeds (hence the reason for the ALT tag; this was not for SEO at all, its purpose being to allow users to switch off image downloads because they sometimes simply could not wait for the image to appear).

That’s not to say that there have not been changes though…


The changes in Links (also known as Off Page SEO)

I have touched on the subject of backlinks (a backlink is a link from another site to yours, the anchor text used in the link – the bit you click – being ‘read’ by Google and used to build an idea of what subject your site covers), stating that Google has become much cleverer at spotting links that have been built solely for the purpose of getting better rankings. Today, if you try to trick Google by creating lots of links with your ‘money’ phrases like ‘pet cremation’ or ‘video marketing’, you are more than likely to be penalised.

However, what this has really done is weed out those SEOs who do not keep up with the changes. For those that do, all it means is that you have to create more Brand and Natural links. The position is the same as in 2000 though: Google likes sites that have lots of links (from lots of domains).

EAT – Expertise – Authority – Trust

The other change in the linking world is that the ‘relevance’, ‘trust’ and ‘authority’ of the sites providing the links has become more important. This has been the case for many years though. Trust is relatively easy to gain, all you need are links from sites that themselves have a high Trust rating, this also giving you some ‘authority’. The ‘expert’ rating of a site is becoming a more important factor and one that is hard to influence easily.

The changes in content

Here the changes are in my opinion very much deeper. In 2000, you could keyword stuff a page and get away with it, getting good rankings in the process. In my view this was never a really practical method, as even though you could get a good ranking, such pages never had that good a conversion rate, thus you got a lot of ‘horses to the water’ but few of them drank, which made that practice a poor one.

Today, Google is very much better at working out what good content is (which means you need to improve your copywriting skills), looking for a whole host of ‘signals’, these including how well it is written, how many synonyms are used and whether it includes images and videos.

Content Development and Marketing

With the importance of content growing so much over the years (the reason content has become more important is that in 2000 Google relied heavily on Pagerank, which was all about links, whilst today, Google can better understand content, this now accounting for at least 50% of the points that Google give any page), two new buzz phrases have been brought into play:-

  • Content Development
  • Content Marketing

As you would imagine the first is about creating the content, whilst the second is about getting it noticed, the planning for this being given the grand title of ‘content marketing strategy’.

Google has also improved its ability to spot duplicate content, this being the reason so many Ecommerce stores have to tweak the descriptions of their products. Failure to do this means that the pages on their sites are just the same as countless others, which makes it harder to rank…

Meeting User Intent is the key

However, the biggest change since 2000 is that Google is now looking for pages that match the ‘intent’ behind the search. Saying this, at the moment Google is still guessing most of the time, its AI helper ‘RankBrain’ still having a long way to go. In the meantime, we are all ‘rats in the maze’, Google constantly checking which sites people stay on for a given term. This way it can start to associate sites with phrases and, by looking at the content of those sites, deduce (to some degree) what the user wanted in the first place.

As I say, this has a long way to go, but it is going to get more important in 2019.

The changes in Technical SEO since 2000

The obvious change here is that website construction has come a long, long way since 2000, but that has to some degree been a double-edged sword, with many companies offering web design services taking advantage of the faster download speeds available and not optimising the code-to-text ratio (the amount of code versus the actual words seen by the visitor). This can lead to a site that is too code heavy, and that inevitably has an impact on speed, which is not good at all.

The rise of the use of mobile devices (with their lower download speeds) has however put this issue back under the microscope, so again Google are looking for sites that download nice and fast AND offer the required level of usability when viewed on the smaller screens that most mobile phones and tablets have.

Actionable Changes – What can you do in 2019 to improve your rankings?

 

Satisfy the Intent of the user

“You need to understand what someone is expecting to find when they query a word or phrase and you need to give them the answer in the simplest way possible,” said Mindy Weinstein, CEO of Market Mindshift.

To me this sounds like an excellent point, but it is not always that easy. There is no problem when someone asks a direct question, e.g. ‘how do you fix a leaking tap’ or ‘what ratio of links should use money phrases’, but when it comes to more generic phrases like ‘maps’ (20 million searches in the USA in October 2018) or ‘entertainment’ (16 million in the USA), things are much harder. You can have a guess at both, but it would be impossible to know exactly what the user was searching for.

The good news is that ranking for such phrases is pretty useless anyway, but even some ‘long tail’ phrases beg the question ‘what is the user looking for?’, a great example for me being ‘bed bath and beyond’ (6 million searches in the USA in Oct 2018)…

So how can you win here?

The advice I always give to my clients is that, as you cannot really guess what someone is looking for, you should provide them with the answers to some questions that you can help them with. So, in relation to the term ‘maps’, a site could provide information on how it can supply ‘large scale digital maps’, or maps that can be used to support ‘planning applications’. That way, it is going to strike lucky some of the time, and as long as the content is really informative / useful, it’s bound to help the site’s standing in Google’s eyes.

To achieve this, we go through a very detailed process of finding out what sort of content is already ‘liked’ by Google, then, after writing a useful article / page / post, we compare it with those pages that have proven their worth, altering it to include as many of the relevant words as possible, whilst of course maintaining good readability and thus user experience.

You can also tune a site so that Google will use the content as a rich snippet.

“Answer boxes, recipes, the knowledge graph, carousels, and who knows what else will take an even bigger bite out of organic traffic,” said Ian Lurie, CEO and founder of Portent. “That makes SEO even more important, because exposure is as much about visibility in the SERPs as it is about clicks.”

This can be a really good idea, but you can only do it for certain terms. If the cap fits, though, it is a great idea to wear it.

Structured Data Markup

“With AI becoming increasingly important for Google, structured data is becoming more important as well,” Tandler said. “If Google wants to move from a mobile-first to an AI-first world, structured data is key. No matter how good your AI is, if it takes too long to ‘crawl’ the required information, it will never be great. AI requires a fast processing of contents and their relations to each other.”

You cannot use structured data markup on every page, but as with rich snippets, if you can integrate this code into your site then it is yet another thing that will help your SEO in 2019.

Do beware though: if you use structured data on a page incorrectly (that is, to include data that is not relevant to the actual page) then Google can actually penalise your site.

Voice Search

This is an area tipped to be more important in 2019, however, the amount of work needed is, at the moment, not matched by the expected gains.  This is one to keep an eye on in 2019.

On Page SEO

This is an area that is often not attended to, yet one that has been a ‘winner’ for many years. Putting it simply, on page SEO is all about making sure that the important areas of a page are populated with the right keywords; those that tell Google what the page is all about.

These areas are:-

The Title of the Page. This is the text you see in the tab on your browser and in the SERPs listing. It remains the most important and vital piece of ‘web real estate’ for 2019.

The Header Tags – These are a throwback to the time before Cascading Style Sheets were introduced, but are still very important. It is best to use just one H1 and then use H2, H3, H4, H5 and H6 tags to introduce deeper and deeper topics within the content.

Keywords and Attributes – Use your keywords (or a synonym) and then apply the bold, italic or list attributes to highlight them.

Lastly, the Meta Description. This does not have a huge effect on the rankings of a page, but it is important because the words in this meta tag are normally used in the Google SERPs. As such, its main job is to act as an ‘elevator speech / pitch’, the idea being to encourage people to click on the link. Besides this ‘positive’ use, a site that has a lot of duplicate meta tags can have its overall quality rating reduced, something that is best avoided.

Other Things to do

If you have not yet registered your site for two of the very best web analytics tools –  Google Webmaster Tools (now called Google Search Console) and Google Analytics, I would suggest that you do so immediately as it will help immensely. Together they allow you to see:-

  • what keyword phrases your site is being found for
  • whether you are getting the targeted traffic you desire

You should also check to see if your site needs some local search marketing, the answer being a clear yes if it is not appearing in the Google 3-pack (the map with the pins) for a relevant term. Local search marketing optimisation covers much the same ground as SEO, but with the added issue of Local Citations.
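If local search matters to the business, one practical step (alongside the Local Citations mentioned above) is LocalBusiness structured data, so that the name, address and phone number shown on the page are also machine-readable; the business details below are invented.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Ltd",
  "telephone": "+44 1234 567890",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "Worcester",
    "postalCode": "WR1 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 08:00-17:00"
}
</script>
```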

If your site is not doing well in 2018, then you will have to go back to basics: go through the process of keyword analysis (to make sure you are targeting the right words), get an SEO Audit (a detailed form of website analysis) to make sure your site is SEO friendly and is not breaking any of Google’s webmaster guidelines, and more than likely take advantage of the link building services and other professional SEO services that companies like Serendipity provide.

 

Hopefully, the above data will help you improve your SEO tactics for 2019.

 


Aiming for Perfect On Page Search Engine Optimisation

Those of you that have been researching or running SEO will no doubt have come across ‘TheHoth’, a very useful site that provides all sorts of information and services, whilst also running a very good blog.

Their latest post is all about ‘On Page Search Engine Optimisation’. It covers a lot of ground that has been ‘well trodden’ over the past few months, but is nevertheless a useful read:-

https://www.thehoth.com/blog/on-page-seo/

They cover what they define as the top 3 steps to getting On Page SEO right:-

  • Keyword Research
  • Optimising the Titles, Description, and H1 (Header tags)
  • Not stuffing the pages with the target keywords

There are other connected issues, such as using ALT tags for images and using ‘descriptive URLs’ for both file and image names (i.e. ‘picture-of-dog.jpg’ rather than ‘image001.jpg’), plus of course ‘technical SEO’ issues like the speed at which the site downloads and the ‘geographical tagging’ of images (very useful for Local SEO).
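Putting the file name and ALT text points together, the difference is simply in how the image is named and described; the dog picture is just an example.

```html
<!-- Descriptive file name plus ALT text that says what the image actually shows -->
<img src="/images/golden-retriever-puppy-in-garden.jpg"
     alt="Golden retriever puppy playing in a garden">

<!-- The same image named unhelpfully tells Google (and screen readers) nothing -->
<img src="/images/image001.jpg" alt="">
```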

Getting Keyword Research Right

This is an essential part of any SEO project; you simply cannot guess what people are using to find the sort of service or product you are selling. Even if you did guess correctly, without other research into the ‘power’ of the sites you are competing against (and you can bet your last pound that you will have lots of competition) you could well select keyword phrases that you simply cannot get a good rank for, the competition outgunning you.

In their blog, TheHoth point out the value of using SEMRush (this now being added to their services). I can only agree with them about this tool, it providing a host of information about what keywords are being used, together with an idea of how often they are used and the competition levels. Also, and this is VERY important, it allows you to see what your competition is being found for, this being especially useful for Local SEO.

But, there is another set of tools that you should also know about, these being provided by Mangools (https://mangools.com/). Their suite of tools is most useful indeed, covering not only Keyword Research (with a lot of information about the power of the competition included), but also data on the strength of sites too. All very useful and well worth a look if you are looking to carry out SEO on your own site.


Optimising Your Titles, Description and Header Tags for SEO

If you want a page to be ranked for a certain phrase, then you simply MUST ensure that the Title contains some relevant words (normally the phrase that you are targeting). There is some debate as to whether this needs to be in the form of a sentence, or whether you can use the keywords as they are, separating them with a pipe symbol |. Either seems to work, but it is possible that the sentence version could increase conversion rates (from being listed in the SERPs to getting a click).
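For example, the two styles might look like this (the phrases are invented); either can work, though the sentence form arguably reads a little better in the results.

```html
<!-- Sentence style -->
<title>Pet Cremation Services in Worcester from Example Pet Care</title>

<!-- Keywords separated with a pipe symbol -->
<title>Pet Cremation | Pet Cremation Worcester | Example Pet Care</title>
```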

As with everything in the SEO world, you should not use your target words too many times in the Title, and it is best to keep to about 70 characters, even though Google, for example, will read / index many more.

On a side note, I am AMAZED at the number of sites that do not use these areas of a page ‘correctly’. This seems to be madness to me, as they are quite simply denying themselves lots of free, targeted traffic….

 

The Meta Description

TheHoth goes on to mention that you should then make sure your Description is completed, but doesn’t tell you that the Meta Description is not used that much for SEO (as the words within it are not taken into account in the same way as the Title or body copy). The main thing to do here is to NOT REPEAT the same Meta Description throughout the site, as this looks ‘lazy’ and can reduce the overall ‘Quality Score’ for the domain (this being a different ‘Quality Score’ to that used in Google’s AdWords system).

The other thing that TheHoth says is that you should not repeat the keywords in the Description. I am not so sure of that, although of course you should not ‘stuff’ this area either.

The Real Purpose of the Meta Description Tag

By the way, the real purpose of the Description text is to act as the ‘elevator speech’ for a page, the idea being to get the viewer to click the link when it is displayed in the Search Engine Results.

The Header Tags

Looking at TheHoth’s article we see that they mention the importance of the H1 tag. Again this is an area of contention, as I have seen many pages performing very well with no H1’s, or with tags that do not use any of the target keywords (or anything like them).

However, I agree that it is best practice to use H1’s (and only one per page), whilst also using H2’s down to H6 in a cascading manner, as best fits the way the copy is divided up.

They also point out that you could over-optimise a page by using the target Keywords in the H1. Again I am not sure that this is totally true, as I have seen that work too. I feel that the best approach is to write the copy, some Good Solid, Useful Copy and then add Headers that look ‘right’, all the while making sure that you do not ‘over-cook’ the area of keyword usage.

Variations of Target Words

This is where the use of variations (synonyms) of the target words comes into play, and it pays to use them throughout the site’s pages, especially in the body copy, for the simple reason that Google ‘likes it’. It is therefore reasonable to also use these variations in the Title / Description and Headers, if it looks OK.

This leads nicely on to the third area that TheHoth covers, that of Keyword Density.

Keyword Stuffing and The Use of Synonyms

In the old days of SEO, you could quite happily repeat your target keyphrases again and again and again, and the Search Engines would reward your site with lots of top ranks. However, in the game of cat and mouse that SEO is, things have changed quite dramatically in this area; now, of course, repeating your keyphrases has dire results…

This change, like the majority, has been made with one aim in mind: that of increasing the quality of the pages listed in the Search Engine Results for any given phrase. The other reason is that there were many SEOs who were gaming the system and skewing the results, which was something that Google was not prepared to put up with.

Hence the many changes to how pages are ranked – this including Off Page SEO, which is another topic.

The problem Google had (and has) is how to calculate quality, especially when copy or an article that is deemed great by one reader is thought of as rubbish by another. This is made even more difficult when a computer is being used to score pages; after all, ‘English’ is not exactly the first language of a computer…

To this end, Google (being the front runner here) built a sort of Artificial Intelligence into its algorithm, its job being to deduce the real ‘meaning’ of a page from the words on the page. Here it was not looking from the angle of ‘what keyphrases’ does this page target, but rather ‘what is the page talking about as a whole’. To this end it was trained to look for words that are associated with each other, so that for example, a page that uses the words ‘cat’, ‘feline’, ‘kitten’, ‘purr’, ‘pet’ and ‘bed’ would automatically be associated with the phrase ‘cat beds’, in a stronger way than a page that just used the words ‘cat beds’ would be.

How to Deduce What Synonyms to Use

One way is to spend your life looking at a thesaurus, but perhaps the best way is to examine the words used on the pages that Google is known to like (we can tell because they are ranked well for any given phrase). The list created from such research can then be used in the copy, in the knowledge that the words must be relevant to some degree. This is the way that Serendipity Online Marketing goes about the matter of copy creation, and it has been seen to work.

Long Tail Keywords

Any site gets a huge percentage of its non-Brand traffic from what are called ‘long tail keywords’, these being phrases of 3 or more words. Individually these phrases are not used very often, which in turn means that they do not appear on the ‘SEMRush radar’ and thus cannot be directly targeted.

Instead, what you have to do is write the copy with a view to providing information to the reader; copy which, if it is good enough, WILL be associated with many of the relevant long tail searches made, whether or not they actually use the words in question.

There Is More of Course

TheHoth’s article finishes at this point, but there is of course a lot more to why pages are ranked and why they are not. This includes Off Page SEO, an area that cannot be ignored, as a page which uses the very best copy, placed on a site that uses the Titles, Descriptions and Headers to full effect, is more than likely to be beaten to the top spots by other pages, simply because they have more links…

This is one reason why Keyword Research is so important, as when selecting the target words, it is vital, especially for a new domain, to select the possible, rather than the impossible.

 


Combined Arms is as necessary in SEO as it is in warfare

Any army general or historian will tell you that in order to win a battle, you need to use all the different types of offensive weaponry available, this being the same across the ages, from the time of 1066 (and earlier) to today.

This is called ‘Combined Arms’, it being the process where an army will use the three main types of fighting unit together in order to win. In the past, these were represented by the infantry, the cavalry and the artillery. Over time, of course, this has changed, the role/type of cavalry changing the most, with tanks replacing the horse.

You need to use the three arms of SEO just as you do when at war.

But regardless of the change, all three are needed: the artillery to subdue the enemy before the main attack, with the cavalry, after performing the important job of reconnaissance, taking on the task of pinning the enemy in place, thus allowing the infantry to advance to take the objective.

Of course all this has changed since the days of Napoleon, but the principles are still basically the same today, all three arms being needed (with air power taking over a large part of the role of artillery, of course).

I understand all of this not because I have ever fought in a war (or am an acclaimed war historian) but rather because of my hobby, that of wargaming. I play games that cover battles fought by the Vikings, the Persians and the Crusaders, as well as the Napoleonic period, my favourite being World War Two. Here I have ‘fought’ in the deserts of Egypt, on the steppes of Russia and in Normandy, all of which has been immensely enjoyable. In each battle I have learnt the importance of using the three different arms together, something that lies at the heart of the hobby.

But what has this got to do with Search Engine Optimisation?

Well, putting it quite simply, there is no ‘magic bullet’ when it comes to getting the best possible rankings on Google (other search engines are available). Instead, you have to make sure that the three areas of Technical Site Build, Content, and Linking Structure are all properly attended to. It is vitally important that these three areas work together, just as the three ‘arms’ do on the battlefield, if success is to be achieved.

The reason for this is that Google looks at all three areas, giving ‘marks’ for each. To fail on any of them risks losing the chance of a top ranking, the reasons and the details of each being covered below:-

Technical Site Build

This is an area that is often forgotten, but it is vital, as if the site is not built to allow Google to find all the pages easily, it will fail at the very first step. Besides this, it is also vital to allow users to move around the site easily and to make the navigation easy to understand and use.

Perhaps the greatest area here, however, is that of site speed, Google now more than ever (with the advent of the Mobile Index) looking for sites that download in the shortest time possible. If a site is built in a manner that slows page delivery down, or is placed on a server that is overloaded, Google will downgrade it severely, preventing it from gaining the high positions that it may otherwise deserve.

Content

The reasons that people visit sites are many, but in all cases they are looking for something, maybe the answer to a problem, or a particular product or service. If the pages of a site do not provide these answers, or give enough detail on a product or service, they will fail to meet the needs of the visitor. And hence, as Google’s aim is to only list pages that are ‘worth the time of their users’, they will fail this important test.

It is therefore necessary to ensure that the pages of a site meet the needs of the visitor. This means that there is not only enough text on the page, but also that it contains pictures and where possible video content (this being another example of ‘combined arms’).

There is by the way a ‘hidden’ advantage to having lots of text on a page (as long as this is laid out in a manner that allows it to be easily absorbed – ‘walls of text’ not being a good idea – whitespace being important). This is all to do with what is known as ‘long tail keywords’ and the capturing of such searches on Google.

Long Tail Keyword Search Phrases.

It is well understood that users use different types of search phrases when they are looking for a product or service. For example, when looking for a TV, they may search for ‘large screen TV’ only to find that the number of search results is too large and that it is impossible to know where to start.

In such cases it is normal for the search phrase to be changed with a view to getting a better list of sites to check. Perhaps the phrase will be altered to ‘40 inch Smart TV’ at this stage.

Further pages and terms will be used until the searcher finds the model that they want. This is the ‘buying stage’ of the search ‘lifecycle’ and is therefore most important. A term used here could be ‘Sony 40EXDB Smart TV in black’. In such cases, it is vital to make sure that your website is in a position to capture the query.

There are many examples of long tail keywords, and in many cases they are the best ones to capture, as they are often used towards the end of a search for a product etc, at the very time the searcher is ready to purchase.

This is just what using a lot of text on a page can do for any website owner, allowing them in effect to put more hooks in the ‘water’ of the internet. More hooks lead to more fish being caught, this translating into more visitors and thus, hopefully, sales.

The overlap with Technical SEO

There is also an overlap with the area of technical search engine optimisation to consider here, that of ensuring that the important areas on a page are used to best effect. These range from the Title of a page (the most important real estate a page has) to the Header tags (the H1 being the most important, and best used only once). Besides this, the other attribute tags like bold, italic and list should not be overlooked, these all being places where part of the content of a page can be given extra weight.
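In HTML terms these attributes are nothing exotic; a small sketch, with invented copy, of a keyword and its synonyms appearing in a header, in emphasised text and in a list might look like this.

```html
<h2>Choosing a cat bed</h2>
<p>A good <strong>cat bed</strong> should be easy to wash and the right size for your <em>feline</em> friend.</p>
<ul>
  <li>Machine-washable covers</li>
  <li>Raised sides that a kitten can still climb over</li>
  <li>A non-slip base</li>
</ul>
```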

Linking Structure

The third arm of the SEO trilogy is still very important today, but it must be said that it is not as all-powerful as it was in the not so distant past (when it was said to be possible to get a blank page to position 1 on Google).

Today, it is still necessary to ensure that a domain, and the pages within the site, have a good number of links, the numbers needed being very different from market sector to market sector and from niche to niche, some being more highly contested than others.

There was a big change in the recent past however, a very big change, one that started with the introduction of the so-called ‘Penguin’ update by Google. Google felt that they had to make these changes to their algorithm as SEO professionals had started creating links in huge numbers to get the pages they wanted to the top of the SERPs.

Basically, this algorithm checks (it is now not run every now and then, but is integrated into the rule set that is used every day) the links pointing to a site, penalising those that have created too many ‘spammy’ links or a linking structure that uses too high a percentage of ‘money keywords’ (the phrases that are thought to bring in the sales / enquiries).

All of this means that this part of the ‘combined arms’ team needs to be very carefully handled indeed. So carefully that at Serendipity Online Marketing we use specially designed software to handle the whole process, thus ensuring that we only build links that will enhance the standing of the sites we work on.

 

So that is it, to succeed in SEO you have to use all three ‘arms’ and use them correctly.


Google’s New Quality Guidelines and What They Mean

Today, content and making a page ‘User Friendly’ are more important than ever when it comes to the current ‘SEO rules’ that Google use. This of course only covers On Page SEO (linking is another kettle of fish) but it is an area that covers a lot of ground.

With this in mind (and acknowledging that I do not know it all) I contacted an SEO Guru to find out if I had all the bases covered. I’m glad to say that I had, but the reply I got back did highlight the fact that Google have just changed their ‘Quality Guidelines’.

But back to my question. I wanted to know more about how content and UX are graded, especially as, before a page is visited by a human (and could hence provide Google with data via the Chrome browser about time on page etc., if it wanted to take that into account), Google MUST have a means of calculating the ‘value’ of the content and how user friendly it is.

To me this is the ‘egg’ part of the chicken and egg story: a page not yet seen by Google or anyone else is analysed and given a ‘value’ ranking. This is then used as a basis for any later search-related ranking procedure, pages with ‘higher value’ rankings being more likely to get a position at the top of the SERPs.

I listed the signals that I thought Google use, these being:-

  • Title of Page
  • Description of Page (not truly used, but a lot of poor ones can degrade an entire site’s quality, so I have been told)
  • Header Tags on the pages (although these are not as powerful as they once were, and many a site breaks the ‘rules’ about using them and still gets high ranks)
  • Bold, Italics, lists
  • Words used (more on this later)
  • Links out to relevant / useful sites (although I have seen comments from SEO professionals who also say this is not a useful signal)
  • Embedding videos
  • Using Images with ‘descriptive’ file names and ALT text, and GeoTagging them for LOCAL SEO

Plus, on the UX side:-

  • The Speed of the Page
  • Using whitespace
  • Not allowing too many adverts at the head of page
  • Ensuring that the above the fold area is not just images (the use of Carousels is said by some to be harmful, but is used extensively and many sites still get a high rank…)

The Words Used on the page:

Here I pointed out that as Google uses a computer programme to analyse any page, this in turn means it must use a lot of TRUE / FALSE checks, which leads on to the use of words in the content. To me this is an important fact, as it would take a committee of ‘experts’ viewing a page to tell whether it was truly good and useful (and they would surely disagree in many cases), and as this is just not how Google works (even with the power of RankBrain), it surely MUST be making its decisions at a far lower, more ‘mechanical’ level.

The problem about what words to use has been overcome by the use of LSI and Reverse Engineering, and from the reply I got back, I would say that this is still the case today.

Of course, if you want to ‘get a message’ across to Google about what a page is all about, with some specific keyword phrases in mind, you just CANNOT stuff the page with those target words, this being a dangerous method now.

Google’s Quality Guideline Update

It must be said that these guidelines ARE NOT A PART of the SEO algorithm, but they are important as they form a part of the ‘feedback’ process that Google use when evaluating their own SERPs listings…

The way it Works…

We know a fair bit about the way Google rates pages for any given term, and we also know that Google is constantly changing these rules. In the past, they had to keep changing the rules as SEO professionals were constantly ‘taking advantage’ of an anomaly in the algorithm, but today, with Google’s more holistic approach (also known as Semantic SEO), I believe that the changes they make are all about presenting the best possible results.

Google however has a problem here, as they need some way of checking that they are getting it right…

This is where their army of human evaluators comes in. They have been around for many years, of course, and were responsible for the rule set that Google uses to highlight sites whose general quality is low. The sort of thing they found was that sites which use a lot of duplicate meta data or titles, or that have a lot of pages with ‘thin content’ (low word count), tend to provide a poor user experience and are basically not worth Google’s time to include in the results.

In order to help these evaluators, Google provided them with an aide-mémoire listing all the things that should be checked on a site’s pages. We will cover this in more detail later in this post.

So, how does Google use the results of the human evaluators?

Of course, they don’t give you the full picture, but looking at it logically, if the human evaluators rate a page as being of the Highest quality AND this page is NOT listed in the results for a relevant term, then the algorithm may well need some work. The same would be the case if pages that were considered to be of Low quality WERE in the rankings.

So, even though you cannot directly influence the part of the ‘Quality Assessment’ that is carried out by human evaluators rather than by Google’s computerised rule set, you can still help Google get it right.

This is important because, if a human evaluator rates a page on your site (or a page like it) highly, this feedback process will eventually help ensure that your page gets the best possible rank…

The Google Quality Rules

There is a very detailed blog post on this, and you can also download the full details from Google if you want. But to help, the information below (taken from a part of the post mentioned) will enable you to ensure that all of your pages are of the highest quality.

 

Page Quality Ratings

Overall Page Quality Rating

Google has completely rewritten this part of their guidelines, expanding the section from the very brief version it had before.

Old version:

The overall Page Quality rating scale offers five rating options: Lowest, Low, Medium, High, and Highest.

New version:

At a high level, here are the steps of Page Quality rating:

  1. Understand the true purpose of the page. Websites or pages without any beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. No further assessment is necessary.
  2. Otherwise, the PQ rating is based on how well the page achieves its purpose using the criteria outlined in the following sections on Lowest, Low, Medium, High, and Highest quality pages.

Here you can see that Google is putting the focus on the beneficial purpose of the page.

 

Page Quality Rating: Most Important Factors

Google’s changes to this section yet again put the focus on the purpose of the page, but also bring in the ‘reputation of the creator’ of the content.

Here are the changes to this section, with the additions in italics:

Here are the most important factors to consider when selecting an overall Page Quality rating:

  • The Purpose of the Page
  • Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. Use your research on the additional factors below to inform your rating.
  • Main Content Quality and Amount: The rating should be based on the landing page of the task URL.
  • Website Information/information about who is responsible for the Main Content: Find information about the website as well as the creator of the MC.
  • Website Reputation/reputation about who is responsible for the Main Content: Links to help with reputation research will be provided.

 

 

Expertise, Authoritativeness and Trustworthiness (E-A-T)

Again there are some significant changes here. First, the instances where Google referred to “high quality” have now been changed to “high EAT”.

Here we believe Google is directing its human evaluators to look beyond simple quality and consider other aspects that contribute to the value of that content.

So, Google has added this new part:

Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating.

For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important. Please consider:

  • The expertise of the creator of the MC.
  • The authoritativeness of the creator of the MC, the MC itself, and the website.
  • The trustworthiness of the creator of the MC, the MC itself, and the website.

Later in the section, they make some changes specific to the content creators in several key areas, including medical, news, science and financial sites.

Here are those changes, with the changes in italics:

  • High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis.
  • High E-A-T news articles should be produced with journalistic professionalism—they should contain factually accurate content presented in a way that helps users achieve a better understanding of events. High E-A-T news sources typically have published established editorial policies and robust review processes (example 1, example 2).
  • High E-A-T information pages on scientific topics should be produced by people or organizations with appropriate scientific expertise and represent well-established scientific consensus on issues where such consensus exists.
  • High E-A-T financial advice, legal advice, tax advice, etc., should come from trustworthy sources and be maintained and updated regularly.
  • High E-A-T advice pages on topics such as home remodeling (which can cost thousands of dollars and impact your living situation) or advice on parenting issues (which can impact the future happiness of a family) should also come from “expert” or experienced sources that users can trust.
  • High E-A-T pages on hobbies, such as photography or learning to play a guitar, also require expertise.

Here you can see that  Google is putting a lot of stress on the content creators as well, this being all the more important for YMYL (Your Money or Your Life) sites.

 

High Quality Pages

Characteristics of High Quality Pages

Google has also expanded this section, mentioning the new title requirement for the first time, as well as saying more about the beneficial purpose of a page. Changes/additions are in italics.

High quality pages exist for almost any beneficial purpose, from giving information to making people laugh to expressing oneself artistically to purchasing products or services online.

What makes a High quality page? A High quality page should have a beneficial purpose and achieve that purpose well.  In addition, High quality pages have the following characteristics:

  • High level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  • A satisfying amount of high quality MC, including a descriptive or helpful title.
  • Satisfying website information and/or information about who is responsible for the website. If the page is primarily for shopping or includes financial transactions, then it should have satisfying customer service information.
  • Positive website reputation for a website that is responsible for the MC on the page. Positive reputation of the creator of the MC, if different from that of the website.

 

This is all very useful stuff, but hidden in the text is the interesting phrase ‘A satisfying amount of high quality MC, including a descriptive or helpful title’. This is important as it highlights the fact that there is no set number of words, and that titles need to be descriptive and relevant (clickbait titles could well result in penalisation).

The Highest Quality Pages

Highest Quality Pages

Again, beneficial purpose is added as a requirement for a highest quality page.

They have also added “and quantity of MC” as a marker for the distinction between high and highest quality. This does raise a question about whether all content lengths are really considered equal in the eyes of Google. Both Gary Illyes and John Mueller have stated that you don’t need to write an essay for a piece of content that doesn’t need it, and that you should write as much as you need to in order to answer the question the title presents. But here, quantity of the main content is something raters should specifically look for when deciding if a page is highest quality or only high quality.

And we see yet another reference to the need of having a “very positive reputation of the creator of the main content, if different from that of the website.”

But they have removed references to this on pages for stores or other financial transactions.

Here is the old version:

Highest pages are very satisfying pages that achieve their purpose very well. The distinction between High and Highest is based on the quality of MC as well as the level of EAT and reputation of the website.

What makes a page Highest quality? A Highest quality page may have the following characteristics:

  • Very high level of Expertise, highly Authoritative, and highly Trustworthy for the purpose of the page (EAT), including the EAT of the publisher and/or individual author for news articles and information pages on YMYL topics.
  • A satisfying amount of high quality MC.
  • Highly satisfying website information and/or information about who is responsible for the website or, for stores and pages involving financial transactions, highly satisfying customer service reputation is very important.
  • Very positive website reputation for a website that is responsible for the MC on the page.

And the updated version:

Highest quality pages are created to serve a beneficial purpose and achieve their purpose very well. The distinction between High and Highest is based on the quality and quantity of MC, as well as the level of reputation and E-A-T.

What makes a page Highest quality? In addition to the attributes of a High quality page, a Highest quality page must have at least one of the following characteristics:

  • Very high level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  • A very satisfying amount of high or highest quality MC.
  • Very positive website reputation for a website that is responsible for the MC on the page. Very positive reputation of the creator of the MC, if different from that of the website.

 

 

And for Low Quality Pages…

This entire section on low quality pages has been updated.  Some was removed as it was replaced with something more concise, while other areas were expanded, particularly around reputation and beneficial content.

Low Quality Pages

The first paragraph has been updated completely.

This was removed:

Low quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well. These pages lack expertise or are not very trustworthy/authoritative for the purpose of the page.

And it was changed to this:

Low quality pages may have been intended to serve a beneficial purpose. However, Low quality pages do not achieve their purpose well because they are lacking in an important dimension, such as having an unsatisfying amount of MC, or because the creator of the MC lacks expertise for the purpose of the page.

Here is the reference to beneficial purpose once again.  But this time it also concedes that sometimes these pages were intended to serve a beneficial purpose but something on the page – or missing from it – means it is still low quality.

Google has removed the possibility that some pages that meet their “low quality pages” criteria might not be considered low. Now, raters must always rate a page as Low – or Lowest – if one or more of the criteria applies.

Here is what the section used to be:

If a page has one of the following characteristics, the Low rating is usually appropriate:

  • The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking EAT.
  • The quality of the MC is low.
  • There is an unsatisfying amount of MC for the purpose of the page.
  • MC is present, but difficult to use due to distracting/disruptive/misleading Ads, other content/features, etc.
  • There is an unsatisfying amount of website information for the purpose of the website (no good reason for anonymity).
  • The website has a negative reputation.

And here is the new revised version:

If a page has one or more of the following characteristics, the Low rating applies:
● An inadequate level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
● The quality of the MC is low.
● There is an unsatisfying amount of MC for the purpose of the page.
● The title of the MC is exaggerated or shocking.
● The Ads or SC distracts from the MC.
● There is an unsatisfying amount of website information or information about the creator of the MC for the purpose of the page (no good reason for anonymity).
● A mildly negative reputation for a website or creator of the MC, based on extensive reputation research.

If a page has multiple Low quality attributes, a rating lower than Low may be appropriate.

Note that it no longer includes the reference that anonymity for some content might be appropriate.

Lacking Expertise, Authoritativeness, or Trustworthiness (E-A-T)

This section has been completely rewritten, and was formerly section 6.5.

Removed:

Some topics demand expertise for the content to be considered trustworthy. YMYL topics such as medical advice, legal advice, financial advice, etc. should come from authoritative sources in those fields, must be factually accurate, and must represent scientific/medical consensus within those fields where such consensus exists. Even everyday topics, such as recipes and house cleaning, should come from those with experience and everyday expertise in order for the page to be trustworthy.

You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.

Revised:

Low quality pages often lack an appropriate level of E-A-T for the purpose of the page. Here are some examples:

  • The creator of the MC does not have adequate expertise in the topic of the MC, e.g. a tax form instruction video made by someone with no clear expertise in tax preparation.
  • The website is not an authoritative source for the topic of the page, e.g. tax information on a cooking website.
  • The MC is not trustworthy, e.g. a shopping checkout page that has an insecure connection.

 

User Generated Content Guidelines

Google has also made some slight changes to the user-generated content section, which now specifically includes references to social networking pages, video sharing sites, and wiki-type sites.

Old version:

User-generated websites span the Page Quality rating spectrum. Note that in some cases, contributors choose their own topics with no oversight and may have very poor writing skills or no expertise in the topic of the page. Contributors may be paid per article or word, and may even be eligible for bonuses based on the traffic to their pages. Depending on the topic, pages on these websites may not be trustworthy.

New version:

Note: Websites with user-generated content span the Page Quality rating spectrum. Please pay careful attention to websites that allow users to publish content with little oversight, such as social networking pages, video sharing websites, volunteer-created encyclopedias, article sharing websites, forums, etc. Depending on the topic, pages on these websites may lack E-A-T.

The user generated content section is noteworthy, because they aren’t automatically discounting user generated content as low or lowest, but are rather treating it as something that warrants further investigation before rating it. There are plenty of examples of high quality user generated content, but it seems the majority is definitely lacking in quality and EAT.

Google has also changed the note at the end from “Important: Lacking appropriate EAT is sufficient reason to give a page a Low quality rating.” to “Important: The Low rating should be used if the page lacks appropriate E-A-T for its purpose.” So Google now makes it explicit that EAT is judged relative to the purpose of the specific page.

 

Low Quality Main Content

This section has been significantly reduced, although some of it was incorporated into new individual sections Google has added to the guidelines, so just because something is noted as removed here doesn’t mean it was removed entirely. But we also get new guidance on clickbait-style titles versus the actual content, which Google now wants its human evaluators to rate as Low.

They entirely removed this part which was an example used to illustrate types of low quality content, as well as the differentiation between professional websites and those from hobbyists:

One of the most important criteria in PQ rating is the quality of the MC, which is determined by how much time, effort, expertise, and talent/skill have gone into the creation of the page, and also informs the EAT of the page.

Consider this example: Most students have to write papers for high school or college. Many students take shortcuts to save time and effort by doing one or more of the following:

  • Buying papers online or getting someone else to write for them.
  • Including inaccurate information, such as making things up, stretching the truth, or creating a false sense of doubt about well-established facts.
  • Writing quickly with no drafts or editing.
  • Failing to cite sources, or making up sources where none exist.
  • Filling the report with large pictures or other distracting content.
  • Copying the entire report from an encyclopedia, or paraphrasing content by changing words or sentence structure here and there.
  • Using commonly known facts, for example, “Argentina is a country. People live there. Argentina has borders.”
  • Using a lot of words to communicate only basic ideas or facts, for example, “Pandas eat bamboo. Pandas eat a lot of bamboo. Bamboo is the best food for a Panda bear.”

 

Here Google point out that the content of some webpages is similarly created. So, where you find content like this, it should be rated as Low quality if it is created without adequate time, effort, expertise, or talent/skill. Inaccurate or misleading information presented as fact is also a reason for Low or even Lowest quality ratings. Pages with low quality MC do not achieve their purpose well.

 

Keep in mind that we have very different standards for pages on large, professionally-produced business websites than we have for small amateur, hobbyist, or personal websites. The quality of MC we expect for a large online store is very different than what we might expect for a small local business website.

All Page Quality ratings should be made in the context of the purpose of the page and the type of website.

Important : Low quality MC is a sufficient reason to give a page a Low quality rating.

The much-abbreviated version of this section adds specifics about clickbait:

The quality of the MC is an important consideration for PQ rating. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well.

In addition, please examine the title on the page. The title of the page should describe the content.

Exaggerated or shocking titles can entice users to click on pages in search results. If pages do not live up to the exaggerated or shocking title or images, the experience leaves users feeling surprised and confused. Here is an example of a page with an exaggerated and shocking title: “Is the World about to End? Mysterious Sightings of 25ft Sea Serpents Prompt Panic!” as the title for an article about the unidentified remains of one small dead fish on a beach. Pages with exaggerated or shocking titles that do not describe the MC well should be rated Low.

Important : The Low rating should be used if the page has Low quality MC.

 

Unsatisfying Amount of Main Content

Here there is a small change, but it does make an evaluator aware that the amount of content needed is relative to the purpose of the page.

Old version:

Important : An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.

New version:

Important : The Low rating should be used if the page has an unsatisfying amount of MC for the purpose of the page.

 

Lack of Purpose Pages

This is a very important area, Google stating that  “Some pages fail to achieve their purpose so profoundly that the purpose of the page cannot be determined. Such pages serve no real purpose for users.”

Pages that Fail to Achieve Their Purpose

This is another section that was reorganized and rewritten.  Here is the updated version:

Lowest E-A-T

One of the most important criteria of PQ rating is E-A-T. Expertise of the creator of the MC, and authoritativeness or trustworthiness of the page or website, is extremely important for a page to achieve its purpose well.

If the E-A-T of a page is low enough, users cannot or should not use the MC of the page. This is especially true of YMYL topics. If the page is highly inexpert, unauthoritative or untrustworthy, it fails to achieve its purpose.

Important : The Lowest rating should be used if the page is highly inexpert, unauthoritative, or untrustworthy.

No/Little Main Content

Pages exist to share their MC with users. The following pages should be rated Lowest because they fail to achieve their purpose:
● Pages with no MC.
● Pages with a bare minimum of MC that is unhelpful for the purpose of the page.

Lowest Quality Main Content

The Lowest rating applies to any page with Lowest Quality MC. Lowest quality MC is content created with such insufficient time, effort, expertise, talent, and/or skill that it fails to achieve its purpose. The Lowest rating should also apply to pages where users cannot benefit from the MC, for example:

  • Informational pages with demonstrably inaccurate MC.
  • The MC is so difficult to read, watch, or use, that it takes great effort to understand and use the page.
  • Broken functionality of the page due to lack of skill in construction, poor design, or lack of maintenance.

Have high standards and think about how typical users in your locale would experience the MC on the page. A page may have value to the creator or participants in the discussion, but few to no general users who view it would benefit from the MC.

Copied Main Content

An interesting part they removed from the beginning of this section is the comment that “Every page needs Main Content.”

They also combined the two sections “Copied Main Content” and “More About Copied Content”, although the wording is nearly identical.

They did remove the following:

If all or most of the MC on the page is copied, think about the purpose of the page. Why does the page exist? What value does the page have for users? Why should users look at the page with copied content instead of the original source?

That is a curious part to remove, since it is a valid way to determine whether the content has any value despite being copied or syndicated.

Auto-Generated Main Content

This section was renamed from “Automatically-Generated Main Content”, perhaps to change it to match industry lingo.

This section is primarily the same, but added “Another way to create MC with little to no time, effort, or expertise is to create pages (or even entire websites)” to the first paragraph.

 

 

 

Conclusion

There is a lot here as you can see, but for me the main point is that a page should be USEFUL and be WORTH READING.

Curiously though, the guidelines do not state that copied content is necessarily a bad thing. I read this as meaning that if a page uses content from another site but then goes on to Add Value, that page should not be down-rated.

It also points out that there are no firm guidelines on the amount of content that should be considered too low. BUT it does state that the quantity of content can be used to help identify those pages that are of the highest value…

I do hope that this information helps, and thanks again for the work done by Jennifer Slegg.

 

About the author

Graham Baylis was born in 1957 and has therefore seen the birth of the Internet and experienced at first hand just how it has changed the world we live in. He has been involved with computers since 1983 and helped set up the first electronic mail system used by the Civil Service in 1986. He has gained several qualifications in Computing and Marketing and has written hundreds of blogs which you can find around the web, many being on customers’ websites. He has over 19 years of experience with Pay Per Click and SEO and has an answer for most Search Marketing questions, and, for those that he hasn’t, is quick to find one. Continually coming up with solutions, Graham is one person it is worth having a chat with.

 

Local SEO – The way forward for many small local companies

Local SEO is crucial in 2018

 

Why is Local SEO becoming so important?

The fact of the matter is that things are changing because of one thing: the growth in the use of mobile devices, especially smartphones, to access the Internet. The figures are quite startling when you look at them, rising from 41% in 2016 to over 61% in 2017.

How People use Smartphones to Search the Internet

When you then consider that most smartphone searches are related to immediate / local needs, you can see why appearing on that Google search screen has become more important: YOU NEED TO BE THERE to get that all-important click (and the business that comes with it).

You also have to consider that when you search on a smartphone you automatically get local results (Google, after all, knows where you are) instead of having to ‘manually signal’, i.e. use a geographic term along with the search phrase (e.g. ‘security companies Hereford’ rather than just ‘security companies’). The latter, when used on a smartphone, automatically gives you local results for Hereford (as long, of course, as you are in Hereford at the time of the search).

 

So What Makes Google Choose One Site Over Another?

The first thing to remember is that Google uses different signals to decide who to list in the Local Pack (the map section often shown at the top of the listings) than it does for the standard organic listings.

The Six Local Pack Ranking Factors

  • Proximity of the address used in your Google My Business (GMB) listing
  • The quality of the GMB listing, photographs, details of services etc
  • The linking structure for the website mentioned in the GMB listing
  • The On Page SEO of the website mentioned in the GMB listing
  • The number of Citations for the website mentioned in the GMB listing
  • The number of Reviews the GMB listing has received

Local SEO and Organic SEO have similarities

As you can see, it is IMPORTANT to ensure that your website is properly optimised, as many of the factors used in Local SEO are also used in the normal organic listings. It is of course possible to get a listing in the so-called 3 Pack without having a website, but you are far more likely to get a position if you have a high quality, optimised site linked to from your GMB listing.

How to get a Place in the 3 Pack?

Obviously the first thing you need is a Google My Business listing. You may well find that one exists already, in which case you need to ‘claim’ it, a process that can sometimes be carried out by phone, whilst in other cases Google will send you an email or a postcard.

If you don’t have a listing to claim, then you must first log into your Google account and then visit https://www.google.com/intl/en/business/, following the process laid out there.

Whichever process is used, you will eventually end up with a GMB listing that you can edit.

Are you listed already and if not who is?

You may also want to check to see if you are, for some reason, already listed in the 3 Pack, or maybe you want to find out who is. The best way of doing this is to use the Google AdWords Ad Preview tool, which can be found at https://adwords.google.com/apt/anon/AdPreview.

See if your business is already listed in the 3 Pack

 

3 Pack Listing for Caple Security

Editing your Google My Business Listing

The first thing to do is to remember to keep to the rules, these being available at https://support.google.com/business/answer/3038177?hl=en-GB.

Another set of interesting information can be found at https://support.google.com/business/answer/7091?hl=en.

What to include in your GMB listing

  • At least 5 photographs are needed, it also being important to ‘geotag’ them (this tells Google they are relevant to a certain location). This is an easy enough process; see https://www.geoimgr.com/ for more information, or the rough scripted sketch after this list.
  • Provide as much information as you can about the services you offer and your times of operation / opening.
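For those who prefer to script the geotagging step, below is a minimal Python sketch. It assumes the third-party piexif library (the geoimgr.com tool mentioned above does the same job with no code at all), and the file name and coordinates are just placeholders for your own photo and business location.

# Rough geotagging sketch using the third-party 'piexif' library (an assumption;
# the geoimgr.com tool above does the same job without any code).
import piexif

def to_dms_rationals(value):
    """Convert a decimal coordinate to EXIF degree/minute/second rationals."""
    value = abs(value)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = round((value - degrees - minutes / 60) * 3600 * 100)
    return ((degrees, 1), (minutes, 1), (seconds, 100))

def geotag(filename, lat, lng):
    exif_dict = piexif.load(filename)
    exif_dict["GPS"] = {
        piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
        piexif.GPSIFD.GPSLatitude: to_dms_rationals(lat),
        piexif.GPSIFD.GPSLongitudeRef: b"E" if lng >= 0 else b"W",
        piexif.GPSIFD.GPSLongitude: to_dms_rationals(lng),
    }
    piexif.insert(piexif.dump(exif_dict), filename)  # writes the EXIF back into the JPEG

geotag("shopfront.jpg", 52.0565, -2.7160)  # placeholder file; coordinates roughly Hereford

Whichever route you take, the point is the same: the EXIF GPS data ties the photo to your business location before you upload it to your GMB listing.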

Getting Citations and Reviews

As mentioned above, having a correctly optimised website is a key requirement for getting a 3 Pack listing, but that is a separate matter (and you can find a lot about ‘search engine optimisation’ elsewhere in this blog, so please do have a look around), so this post will just cover the important area of Citations and Reviews.

Of the two, Citations are the more important, it being quite possible to get a 3 Pack listing with no reviews at all.

If you already have a GMB listing that is not appearing in the 3 Pack AND you have a fully optimised site, then this is more than likely caused by having too few Citations, or by the fact that the ones you have are ‘confusing’ Google. This is often because the Citations use different variations of your business name, address or phone number. Sometimes this is just human error; in other cases it is because your business has moved address or changed phone number.

Whatever the reason, having citations that use different Names, Addresses or Phone numbers (the three together being known by the acronym NAP) can be a real killer to your chances of getting a place, so the first thing to do is to check what citations you do have and ‘clean them up’.

If you don’t have any, then fine, you can start building them up, but ALWAYS ensure that you use EXACTLY the same information on ALL of the citations you create (a rough consistency-check sketch follows below).
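If you have a list of your existing citations, even a very simple script can flag the inconsistent ones. The sketch below is a minimal illustration in Python: the citation data is entirely made up, and the normalisation (lower-casing, stripping punctuation, keeping only the digits of the phone number) is deliberately naive.

# Minimal NAP consistency check: normalises name/address/phone strings from a
# list of citations and flags any that differ from your 'master' record.
# All of the data here is made up purely for illustration.
import re

def normalise(nap):
    name, address, phone = nap
    return (
        " ".join(name.lower().split()),
        " ".join(re.sub(r"[.,]", " ", address.lower()).split()),
        re.sub(r"\D", "", phone),  # keep digits only
    )

master = ("Caple Security Ltd", "1 High Street, Hereford, HR1 2AB", "01432 123456")
citations = {
    "yell.com":     ("Caple Security Ltd", "1 High St, Hereford HR1 2AB", "01432 123456"),
    "thomsonlocal": ("Caple Security", "1 High Street, Hereford, HR1 2AB", "+44 1432 123456"),
}

for source, nap in citations.items():
    if normalise(nap) != normalise(master):
        print(f"Mismatch at {source}: {nap}")

In practice you would want smarter matching (St vs Street, +44 vs 0, and so on), but even this crude check shows how easily variations creep in.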

Don’t forget Press Releases

Online Press Releases are also a way of helping things along, so please also consider posting these as many times as you can.

Local SEO also helps in getting business if you are not in the 3 Pack.

In some cases, you simply won’t be able to get a 3 Pack listing in the town / city you want, for the very good reason that your office (as given in the GMB listing) is not close enough to the centre of that city / town for Google to consider you ‘local enough’ for a 3 Pack placement.

One way around this is to create a ‘virtual’ office in the town that you wish to target, and whilst this, if done properly, can work, it is fraught with danger (i.e. getting your business thrown off ALL 3 Pack listings). It is therefore better to make sure that your website is fully optimised for ‘Local Searches’, as that way, although you cannot get a 3 Pack listing, you can get an organic listing.

As some of the results in Google don’t have a 3 Pack to get listed in in the first place, this is really a great way to go. See below for our customer’s No. 1 listing, one of many too.

 

Local SEO outside the 3 Pack

 

I do hope that this has provided you with some useful information and please do contact us if you want more information or help with your Local SEO / 3 Pack problems. We are here to help!

SEO Is A Marathon – Not A Sprint

There is a common mistake that many new website owners make, and it’s about the concept of SEO (Search Engine Optimisation). For the most part, those who don’t have experience with it believe it’s something you do once or twice, then you forget about it. In reality, SEO is a consistent practice. So, it is in every website owner’s best interest to understand that SEO is all about the long haul and that it’s not going to be instant. That means all the experts promising you higher overnight rankings are not being completely truthful.

Now you might be wondering why you need to constantly tweak your SEO strategy, or whether there’s a shortcut you can use. To find out the answers, keep reading.

Why SEO Is A Marathon

The best way to explain why SEO is like a marathon is to look at the way search engines operate. If you were to compare how Google operates now, and what it was like ten years ago, you would notice a dramatic difference.

This is because search engines never stop adapting to the habits of their users. Instead, they continue to add and fine-tune technology in such a way that users become completely dependent on their services.

And you can bet your site the changes will never stop. Because the more people get comfortable with something, the more they search for change.

What does this all mean for you exactly? Well, with SEO you are trying to make your site more visible. And getting onto the first page is definitely possible, but there are many variables involved.

Not only do you have to keep the changing search algorithms in mind, but you also have to stay up to date with trends and your competition. Everything around you will keep changing, becoming more focused on making users happy, which means you need to keep changing if you want to stay relevant.

The Current Situation

As it stands, search engines make it pretty obvious what they want from websites. And by doing a little research, you’ll learn all about these requirements, ranging from informative content to page loading speeds.

Once you have all these areas covered according to the set requirements, it becomes a matter of maintenance. So, no, you won’t have to do a massive amount of work each day, but there are certain things that need to be done to stay consistent. For example, posting quality content two or three times a week, making sure the site is responsive, checking for important software updates, etc.

Basically, everything you do can be considered a step towards the next thing that needs to get done. And if you take the right steps, it’s a lot easier to maintain an SEO campaign.

SEO Takes Time

Another tough reality is the fact that SEO takes time to show results. Don’t expect things to happen in a day or two, because there are simply no guarantees involved and you don’t control all the variables.

All you can do is think smart and do the work. And if you really want to take a shortcut, the best way to save loads of time is to work with a professional.

 

Clients can often make tough demands but some of these are literally impossible, such as being able to block a specific country from viewing a particular webpage…

Ah, clients. We love them. They are our bread and butter and helping them realise their search goals is naturally what we’re all about. Of course, helping them understand what their search goals actually are – or should be – is another kettle of fish.

Read four of the most common yet misguided client requests below.

1. We want to rank top in Google for ‘biscuit news’ but we don’t actually want to mention biscuits in the content

If only this was possible! Actually, on second thoughts, no. Could you imagine the state the web would be in if optimisation really worked like this? It all goes back to search intent – if content does not meet user expectations then search engines such as Google will not present it to them. You can’t trick engines by mentioning something in a headline but talking about something entirely different in the body – they are much, much cleverer than that. In fact, since 2017 Google et al have been demanding to see even more relevant detail in body copy than ever before. Deep diving into a topic and its associated interests is really what it’s all about.

2. We need to rank #1 for this keyword by next week

Perhaps the most common request of all. And in our heads we’re thinking, “Of course you do – but so do 500+ other websites.” Lots of clients I’ve come into contact with do not consider SEO to be a time-consuming process – most seem to be under the impression that adding a smattering of big number keywords to a page will work itself out because “Google will do the rest”. No. It. Won’t. Even a new Telegraph webpage, which comes from a domain of longstanding authority, can take weeks or even months to settle.

But expecting to rank at number one – even without a given timeframe – is a mistake. It could be that the client’s website is not authoritative enough to land a page-one slot, or that the keyphrase they’re gunning for is so competitive and so far removed from any conversion opportunity that it would be a pointless exercise. It’s also worth pointing out that search engines can rank and un-rank content without obvious reasons, so even if your efforts are improving client ranks today, tomorrow might be a different story. Clients should therefore be encouraged to take the long-term view and create content pieces that will enable them to build authority in the relevant field. Targeting keywords big and small is essential – those long-tail keywords that are cheaper and “less popular” are actually often closer to points of conversion and should be considered “low-hanging fruit”.

3. I need every trace of this webpage wiped from the internet by X date

This demand crops up when clients are working to specific campaign timelines, meaning certain messaging or detail can expire on certain dates. It’s understandable, therefore, that pages or content marketing efforts which feature this detail will need to be edited or removed from the front and centre.

Unfortunately, a page that goes live on the web cannot simply be erased with the push of a button. Certainly, we can delete pages from a content management system and yes, we can redirect old links into new destinations on servers, but we cannot control search engine indexes, old social posts that refer to content pieces and any other platforms that may have lifted the content and re-published it on their end.

If content detail is so sensitive to a client’s marketing schedule then discussions should be had at the beginning to manage expectations.

Read more: https://www.telegraph.co.uk/spark/marketing-guides/funny-seo-requests-from-clients/

 

Importance Of Fast Loading Web Pages For SEO

Page speed is often confused with ‘site speed.’ Page speed is actually the time it takes for a webpage to load and fully display its available content. With that being said, page speed is a crucial element when it comes to both SEO and user satisfaction. Not only will you be able to get your website to rank better in the search engines, but it will help you convert more of your traffic as well. Below, we will be going over why fast loading web pages are so important.

Why Fast Loading Web Pages Are So Important For SEO:

1. Optimization.

One of the biggest reasons it is so crucial to achieve a fast loading web page is optimization. Search engines want to provide the best experience for their own users. Thus, in order to do so effectively, they need to ensure that the websites they rank highly are well optimized. Having a fast loading web page is one of the biggest indicators of a well-optimized website.

2. Mobile Responsiveness.

Another reason why having a fast loading webpage is so important for achieving a high ranking within the search engines is that it suggests your website works well for mobile visitors. Because a majority of traffic now comes from mobile devices, search engine providers are making a concerted effort to boost the rankings of websites that adhere to this trend. Thus, if your website is not fast loading, or if it is unoptimized for mobile traffic, you are bound to experience some sort of penalty within the search engine ranks.

Now that we have gone over some of the main reasons it is so important to have fast loading web pages, we will be going over some of the top ways to achieve it.

Ways To Achieve Fast Loading Web pages:

1. Compression.

One of the best ways to achieve faster load times for your web pages is by enabling compression. By reducing the transferred size of your CSS, HTML, and JavaScript files, you will be able to make your web pages load at a much faster overall rate. A quick way to check that compression is actually switched on is sketched below.
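This is a minimal check, not a full audit: it assumes the third-party requests package, the URL is a placeholder, and it simply reports whether the server answered with a compressed encoding.

# Quick check that a server is serving compressed responses.
# Assumes the third-party 'requests' package; the URL is a placeholder.
import requests

response = requests.get(
    "https://www.example.com/",
    headers={"Accept-Encoding": "gzip, deflate"},
    timeout=10,
)
encoding = response.headers.get("Content-Encoding", "none")
size_kb = len(response.content) / 1024  # requests decompresses the body for you
print(f"Content-Encoding: {encoding}, decompressed body size: {size_kb:.1f} KB")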

2. Reduce Redirects.

Another successful way to minimize load times is by reducing the number of redirects you use. Each time a page redirects a user to another page, additional time is added to the load time. Thus, by reducing this redirect pattern, you should be able to achieve significantly reduced load times. One example of a redirect could be redirecting your visitor to a mobile version of your website. A quick way to see the redirect chain for any URL is sketched below.
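The short sketch below walks the redirect chain for a URL so you can see how many hops a visitor is paying for. Again it assumes the requests package and the URL is only a placeholder.

# Show the redirect chain a URL goes through before the final page loads.
# Every hop in response.history adds latency; the URL is a placeholder.
import requests

response = requests.get("http://example.com/old-page", allow_redirects=True, timeout=10)
for hop in response.history:
    print(f"{hop.status_code}  {hop.url}  ->")
print(f"{response.status_code}  {response.url}  (final)")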

Overall, there are plenty of different ways you are going to be able to minimize the load times that each of your webpages has. By implementing the tips above, you should be able to achieve highly optimized web pages that load fast. By achieving fast load times, you will be able to boost your website’s organic rankings in the search engines and even achieve a higher conversion rate for the traffic that you do end up generating.

 

Success in search engine optimization (SEO) requires not only an understanding of where Google’s algorithm is today but an insight to where Google is heading in the future.

Based on my experience, it has become clear to me that Google will place a stronger weight on the customer’s experience of page load speed as part of its mobile-first strategy. Given the investment Google has made in page performance, there are some indicators that help us understand how critical this factor is now and will be in the future. For example:

  • AMP — Specifically designed to bring more information into the search engine results pages (SERPs) in a way that delivers on the customer’s intent most expeditiously. Google’s desire to quickly serve the customer “blazing-fast page rendering and content delivery” across devices and media begins with Google caching more content in their own cloud.
  • Google Fiber — A faster internet connection for a faster web. A faster web allows for a stronger internet presence in our everyday lives and is the basis of the success of the internet of things (IoT). What the internet is today is driven by content and experience delivery. When fiber installations reach critical mass and gigabit becomes the standard, the internet will begin to reach its full potential.
  • Google Developer Guidelines — A 200-millisecond server response time and a one-second above-the-fold page load time: more than a subtle hint that speed should be a primary goal for every webmaster.

Now that we are aware page performance is very important to Google, how do we as digital marketing professionals work speed and performance into our everyday SEO routine?

A first step would be to build the data source. SEO is a data-driven marketing channel, and performance data is no different from positions, click-through rates (CTRs) and impressions. We collect the data, analyze, and determine the course of action required to move the metrics in the direction of our choosing.

Tools to use

With page performance tools it is important to remember a tool may be inaccurate with a single measurement. I prefer to use at least three tools for gathering general performance metrics so I can triangulate the data and validate each individual source against the other two.

Data is only useful when the data is reliable. Depending on the website I am working on, I may have access to page performance data on a recurring basis. Some tool solutions like Dynatrace, Quantum Metric, Foglight and IBM Tealeaf collect data in real time but come with a high price tag or limited licenses. When cost is a consideration, I rely more heavily on the following tools:

  • Google Page Speed Insights — Regardless of what tools you have access to, how Google perceives the performance of a page is really what matters.
  • Pingdom.com — A solid tool for gathering baseline metrics and recommendations for improvement. The added capability to test using international servers is key when international traffic is a strong driver for the business you are working on.
  • GTMetrix.com — Similar to Pingdom, with the added benefit of being able to play back the user experience timeline in a video medium.
  • WebPageTest.org — A slightly rougher user interface (UI) design, but you can capture all the critical metrics. Great for validating the data obtained from other tools.

Use multiple tools to capitalize on the specific benefits of each, and look to see if the data from all sources tells the same story. When the data is not telling the same story, there are deeper issues that should be resolved before performance data can be actionable. A minimal example of pulling one of these data points programmatically is sketched below.
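The sketch below pulls a single Lighthouse performance score from the PageSpeed Insights v5 API so it can sit alongside the numbers from the other tools. The API key and URL are placeholders, and the exact JSON field names may change between API versions, so treat this as an illustration rather than a finished integration.

# Minimal sketch: fetch the Lighthouse performance score from the
# PageSpeed Insights v5 API. API key and URL are placeholders.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_performance_score(page_url, api_key, strategy="mobile"):
    params = {"url": page_url, "key": api_key, "strategy": strategy}
    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
    # Lighthouse reports the performance category score in the range 0..1
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

print(psi_performance_score("https://www.example.com/", "YOUR_API_KEY"))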

Sampling approach

While it is more than feasible to analyze a single universal resource locator (URL) you are working on, if you want to drive changes in the metrics, you need to be able to tell the entire story.

I always recommend using a sampling approach. If you are working on an e-commerce site, for example, and your URL focus is a specific product detail page, gather metrics for that specific URL, and then take a sample of, say, 10 product detail pages to produce an average. There may be a story unique to the single URL, or the story may be at the page level. A rough sketch of this sampling idea follows.
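Below is a minimal sampling sketch. The URLs are placeholders, and the response time used here (roughly time to first byte, via the requests package) is only a stand-in for whichever metric you actually collect, be that a PSI score, fully-loaded time or anything else.

# Rough sampling sketch: average a simple timing metric over a handful of URLs
# from the same template, rather than trusting a single page. URLs are placeholders.
import requests

sample_urls = [
    "https://www.example.com/product/1",
    "https://www.example.com/product/2",
    "https://www.example.com/product/3",
]

times = []
for url in sample_urls:
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()  # time until the response headers arrived
    times.append(seconds)
    print(f"{url}: {seconds:.2f}s")

print(f"Average over {len(times)} pages: {sum(times) / len(times):.2f}s")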

Read more: https://searchengineland.com/making-website-speed-and-performance-part-of-your-seo-routine-291449