SEO Trends for 2019

I have been working in the SEO field for some 17 years now, and am one of the few SEO consultants who state that, overall, things are not that much different from the past, at least at the basic level.

I say this because, today:-

  • Links are important, just as they were in 2000
  • Content is important, just as it was in 2000
  • Technical SEO is important, again just as it was in 2000

The difference is that, in the first two cases, Google has become more savvy (through the many changes to its algorithm) about what is good and what is not. In the case of Technical SEO, in 2000 you had to ensure that your site was well built and ‘clean’, not because you wanted to please Google, but because you had to contend with very slow download speeds. This, incidentally, is the reason for the ALT tag: it was not for SEO at all, its purpose being to give users who had switched off image downloads (often because they simply could not wait for the images to appear) a text alternative.

That’s not to say that there have not been changes though…

The changes in Links (also known as Off Page SEO)

I have already touched on the subject of backlinks (a backlink is a link from another site to yours; the anchor text used in the link – the bit you click – is ‘read’ by Google and used to form an idea of what subject your site covers), stating that Google has become much cleverer at spotting links that have been built solely for the purpose of getting better rankings. Today, if you try to trick Google by creating lots of links with your ‘money’ phrases, like ‘pet cremation’ or ‘video marketing’, you will more than likely be penalised.

However, what this has really done is to weed out those SEOs who do not keep up with the changes. For those that do, all it means is that you have to create more Brand and Natural links. The position is the same as in 2000 though: Google likes sites that have lots of links (from lots of different domains).

EAT – Expertise – Authority – Trust

The other change in the linking world is that the ‘relevance’, ‘trust’ and ‘authority’ of the sites providing the links have become more important. This has been the case for many years though. Trust is relatively easy to gain: all you need are links from sites that themselves have a high Trust rating, which also gives you some ‘authority’. The ‘expert’ rating of a site is becoming a more important factor, and one that is hard to influence easily.

The changes in content

Here the changes are, in my opinion, much deeper. In 2000, you could keyword stuff a page and get away with it, getting good rankings in the process. In my view this was never a really practical method: even though you could get a good ranking, such pages never had that good a conversion rate, so you led a lot of ‘horses to the water’ but few of them drank, which made the practice a poor one.

Today, Google is very much better at working out what good content is (which means you need to improve your copywriting skills), looking for a whole host of ‘signals’, including how well the copy is written, how many synonyms are used, and whether it includes images and videos.

Content Development and Marketing

With the importance of content growing so much over the years (it has become more important because, in 2000, Google relied heavily on PageRank, which was all about links, whilst today Google can better understand content, this now accounting for at least 50% of the points Google gives any page), two new buzz phrases have been brought into play:-

  • Content Development
  • Content Marketing

As you would imagine, the first is about creating the content, whilst the second is about getting it noticed, the planning for this being given the grand title of ‘content marketing strategy’.

Google has also improved its ability to spot duplicate content, this being the reason so many Ecommerce stores have to tweak the descriptions of their products. Failure to do this means that the pages on their sites are just the same as countless others, which makes it harder to rank…

Meeting User Intent is the key

However, the biggest change since 2000 is that Google is now looking for pages that match the ‘intent’ behind the search. That said, at the moment Google is still guessing most of the time, its AI helper ‘RankBrain’ still having a long way to go. In the meantime, we are all ‘rats in the maze’, with Google constantly checking to see which sites people stay on for a given term. This way it can start to associate sites with phrases and, by looking at the content of those sites, deduce (to some degree) what the user wanted in the first place.

As I say, this has a long way to go, but it is going to get more important in 2019.

The changes in Technical SEO since 2000

The obvious change here is that website construction has come a long, long way since 2000, but that has, to some degree, been a double-edged sword, with many companies offering web design services taking advantage of the faster download speeds available by not optimising the code-to-text ratio (the amount of code versus the actual words seen by the visitor). This can lead to a site that is too code-heavy, which inevitably has an impact on speed, and that is not good at all.

The rise in the use of mobile devices (with their lower download speeds) has, however, put this issue back under the microscope, so again Google is looking for sites that download nice and fast AND offer the required level of usability on the smaller screens that most mobile phones and tablets have.

Actionable Changes – What can you do in 2019 to improve your rankings?


Satisfy the Intent of the user

“You need to understand what someone is expecting to find when they query a word or phrase, and you need to give them the answer in the simplest way possible,” said Mindy Weinstein, CEO of Market Mindshift.

To me this sounds like an excellent point, but it is not always that easy. There is no problem when someone asks a direct question, e.g. ‘how do you fix a leaking tap’ or ‘what ratio of links should use money phrases’, but it is another matter when it comes to more generic phrases like ‘maps’ (20 million searches in the USA in October 2018) or ‘entertainment’ (16 million in the USA). You can have a guess at both, but it would be impossible to know exactly what the user was searching for.

The good news is that ranking for such phrases is pretty useless anyway, but even some ‘long tail’ phrases beg the question ‘what is the user looking for?’, a great example for me being ‘bed bath and beyond’ (6 million searches in the USA in Oct 2018)…

So how can you win here?

The advice I always give to my clients is that, as you cannot really guess what someone is looking for, you should provide them with the answers to some questions that you can help them with. So, in relation to the term ‘maps’, a site could provide information on how it could supply ‘large scale digital maps’, or maps that could be used to support ‘planning applications’. That way, it is going to strike lucky some of the time, and as long as the content is really informative / useful, it is bound to help the site’s standing in Google’s eyes.

To achieve this, we go through a very detailed process of finding what sort of content is already ‘liked’ by Google; then, after writing a useful article / page / post, we compare it with those pages that have proven their worth, altering our copy to include as many of the relevant words as possible, whilst of course maintaining good readability and thus a good user experience.
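By way of illustration, here is a minimal sketch of that comparison step in Python. It assumes you have already saved the text of your draft and of a few well-ranking pages to local files (the file names are made up), and it uses simple word counts rather than our actual in-house process:

  # Find words common to the top-ranking pages that the draft lacks.
  from collections import Counter
  import re

  def words(text):
      # Lowercase words of four letters or more.
      return [w for w in re.findall(r"[a-z']+", text.lower()) if len(w) >= 4]

  def common_words(paths, min_pages=2):
      # Words appearing on at least min_pages of the reference pages.
      seen = Counter()
      for path in paths:
          with open(path, encoding="utf-8") as f:
              seen.update(set(words(f.read())))
      return {w for w, n in seen.items() if n >= min_pages}

  with open("draft.txt", encoding="utf-8") as f:
      draft = set(words(f.read()))

  missing = common_words(["top1.txt", "top2.txt", "top3.txt"]) - draft
  print(sorted(missing))  # candidate words to weave into the draft

Any words a sketch like this suggests still need a human eye: only those that genuinely fit the copy should be added.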

You can also tune a site so that Google will use the content as a rich snippet.

“Answer boxes, recipes, the knowledge graph, carousels, and who knows what else will take an even bigger bite out of organic traffic,” said Ian Lurie, CEO and founder of Portent. “That makes SEO even more important, because exposure is as much about visibility in the SERPs as it is about clicks.”

This can be a really good idea, but you can only do it for certain terms. If the cap fits, though, then it is a great idea to wear it.

Structured Data Markup

“With AI becoming increasingly important for Google, structured data is becoming more important as well,” Tandler said. “If Google wants to move from a mobile-first to an AI-first world, structured data is key. No matter how good your AI is, if it takes too long to ‘crawl’ the required information, it will never be great. AI requires a fast processing of contents and their relations to each other.”

You cannot use structured data markup on every page, but as with rich snippets, if you can integrate this code into your site then it is yet another thing that will help your SEO in 2019.

Do beware though: if you use structured data on a page incorrectly (that is, to include data that is not relevant to the actual page), then Google can actually penalise your site.
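To make this concrete, here is a minimal sketch that builds a page’s structured data as JSON-LD (the format Google recommends) using Python. The article details are placeholders, and the printed snippet is what you would paste into the page’s HTML head:

  import json

  # Describe the page using the schema.org 'Article' type.
  article = {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "SEO Trends for 2019",
      "author": {"@type": "Person", "name": "Graham Baylis"},
      "datePublished": "2018-11-01",  # placeholder date
  }

  # Wrap it in the script tag that carries JSON-LD within a page.
  print('<script type="application/ld+json">')
  print(json.dumps(article, indent=2))
  print('</script>')

Whatever you generate, it is worth running the result through Google’s structured data testing tool before it goes live.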

Voice Search

This is an area tipped to be more important in 2019; however, the amount of work needed is, at the moment, not matched by the expected gains. It is one to keep an eye on.

On Page SEO

This is an area that is often not attended to, and one that has been a ‘winner’ for many years. Putting it simply, on page SEO is all about making sure that the important areas of a page are populated with the right keywords; those that tell Google what the page is all about.

These areas are:-

The Title of the Page. This is the text you see in the tab of your browser and in the SERPS listing. It remains the most important and vital piece of ‘web real estate’ for 2019.

The Header Tags. These are a throwback to the time before Cascading Style Sheets were introduced, but are still very important. It is best to use just one H1 and then use H2, H3, H4, H5 and H6 tags to introduce deeper and deeper topics within the content.

The Body Copy. Use your keywords (or a synonym) and then apply the bold, italic or list attributes to highlight them.

Lastly, the Meta Description. This does not have a huge effect on the rankings of a page, but it is important because the words in this meta tag are normally used in the Google SERPS. As such, its main job is to act as an ‘elevator speech / pitch’, the idea being to encourage people to click on the link. Besides this ‘positive’ use, a site that has a lot of duplicate meta descriptions can have its overall quality rating reduced, something that is best avoided.
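Pulling these areas together, here is a minimal sketch that fetches a page and reports its Title, Meta Description and H1s, so they can be checked against the target phrase. It assumes the third-party requests and beautifulsoup4 packages, and the URL is a placeholder:

  import requests
  from bs4 import BeautifulSoup

  url = "https://www.example.com/"  # placeholder URL
  soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

  title = soup.title.get_text(strip=True) if soup.title else ""
  meta = soup.find("meta", attrs={"name": "description"})
  description = meta.get("content", "").strip() if meta else ""
  h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]

  print(f"Title ({len(title)} chars): {title}")
  print(f"Description: {description}")
  print(f"H1 tags ({len(h1s)}): {h1s}")  # best practice is exactly one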

Other Things to do

If you have not yet registered your site with two of the very best web analytics tools – Google Webmaster Tools (now called Google Search Console) and Google Analytics – I would suggest that you do so immediately, as it will help immensely. Together they allow you to see:-

  • what keyword phrases your site is being found for
  • whether you are getting the targeted traffic you desire (see the sketch below)
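If you want to pull this keyword data out programmatically, the Search Console API can help. Below is a minimal sketch using Python and the google-api-python-client package; it assumes you have already created OAuth credentials for a verified site (the set-up steps are not shown, and the site URL and dates are placeholders):

  from googleapiclient.discovery import build

  # 'creds' must be OAuth credentials created beforehand in the Google
  # API console; obtaining them is not shown in this sketch.
  service = build("webmasters", "v3", credentials=creds)

  response = service.searchanalytics().query(
      siteUrl="https://www.example.com/",  # placeholder property
      body={
          "startDate": "2018-10-01",
          "endDate": "2018-10-31",
          "dimensions": ["query"],
          "rowLimit": 25,
      },
  ).execute()

  for row in response.get("rows", []):
      print(row["keys"][0], row["clicks"], row["impressions"])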

You should also check to see if your site needs some local search marketing, the answer being a clear yes if it is not appearing in the Google 3 pack (the map with the pins) for a relevant term. Local search marketing optimisation covers much the same ground as SEO, but with the added issue of Local Citations.

If your site is not doing well in 2018, then you will have to go back to basics: go through the process of keyword analysis (to make sure you are targeting the right words), get an SEO Audit (a detailed form of website analysis) to make sure your site is SEO friendly and is not breaking any of Google’s webmaster guidelines, and, more than likely, take advantage of the link building services and other professional SEO services that companies like Serendipity provide.


Hopefully, the above will help you improve your SEO tactics for 2019.


Aiming for Perfect On Page Search Engine Optimisation

Those of you that have been researching or running SEO will no doubt have come across ‘TheHoth’, a very useful site that provides all sorts of information and services, whilst also running a very good blog.

Their latest post is all about ‘On Page Search Engine Optimisation’. It covers a lot of the ground that has been ‘well trodden’ over the past few months, but is nevertheless a useful read:-

https://www.thehoth.com/blog/on-page-seo/

They cover what they define as the top 3 steps to getting On Page SEO right:-

  • Keyword Research
  • Optimising the Titles, Description, and H1 (Header tags)
  • Not stuffing the pages with the target keywords

There are other issues that are connected, such as using ALT tags for images and using ‘descriptive URLs’ for both file and image names (e.g. ‘picture-of-dog.jpg’ rather than ‘image001.jpg’), plus of course ‘technical SEO’ issues like the speed at which the site downloads, and the ‘geographical tagging’ of images (very useful for Local SEO).
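As a quick illustration of the ALT tag and file name points, the sketch below scans a saved HTML file (the file name is made up) and flags images with missing ALT text or non-descriptive names. It assumes the beautifulsoup4 package:

  import re

  from bs4 import BeautifulSoup

  with open("page.html", encoding="utf-8") as f:
      soup = BeautifulSoup(f.read(), "html.parser")

  for img in soup.find_all("img"):
      src = img.get("src", "")
      alt = (img.get("alt") or "").strip()
      if not alt:
          print(f"Missing ALT text: {src}")
      # Names like 'image001.jpg' or 'DSC_1234.jpg' say nothing to Google.
      if re.search(r"(?:img|image|dsc)[_-]?\d+\.", src, re.I):
          print(f"Non-descriptive file name: {src}")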

Getting Keyword Research Right

This is an essential part of any SEO project: you simply cannot guess what people are using to find the sort of service or product you are selling. Even if you did guess correctly, without other research into the ‘power’ of the sites you are competing against (and you can bet your last pound that you will have lots of competition), you could well select keyword phrases that you just cannot get a good rank for, the competition simply outgunning you.

In their blog, TheHoth point out the value of using SEMrush (this now being added to their services). I can only agree with them about this tool, it providing a host of information about what keywords are being used, together with an idea of how often they are used and the competition levels. Also, and this is VERY important, it allows you to see what your competition is being found for, this being especially useful for Local SEO.

But there is another set of tools that you should also know about, these being provided by Mangools (https://mangools.com/). Their suite is most useful indeed, covering not only Keyword Research (with a lot of information about the power of the competition included) but also data on the strength of sites. All very useful, and well worth a look if you are looking to carry out SEO on your own site.

Optimising Your Titles, Description and Header Tags for SEO

If you want a page to be ranked for a certain phrase, then you simply MUST ensure that the Title contains some relevant words (normally the phrase that you are targeting). There is some debate as to whether this needs to be in the form of a sentence, or whether you can use the keywords as they are, separating them with a pipe symbol (|). Either seems to work, but it is possible that the sentence version could increase conversion rates (from being listed in the SERPS to getting a click).

As with everything in the SEO world, you should not use your target words too many times in the Title, and it is best to keep to about 70 characters, even though Google, for example, will read / index many more.
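As a quick sanity check on length, a couple of lines of Python will flag any titles that run past that rough 70-character display limit (the example titles here are placeholders):

  TITLE_LIMIT = 70  # rough display limit in the SERPS, not a hard rule

  titles = [
      "Pet Cremation Services | Example Company",
      "How Do You Fix a Leaking Tap? A Step by Step Guide for Beginners, 2019 Edition",
  ]
  for title in titles:
      if len(title) > TITLE_LIMIT:
          print(f"{len(title)} chars, may be truncated: {title}")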

On a side note, I am AMAZED at the number of sites that do not use these areas of a page ‘correctly’. This seems to be madness to me, as they are quite simply denying themselves lots of free, targeted traffic…


The Meta Description

TheHoth goes on to mention that you should then make sure your Description is completed, but does not tell you that the Meta Description is not used that much for SEO (the words within it are not taken into account in the same way as the Title or body copy). The main thing to do here is to NOT REPEAT the same Meta Description throughout the site, as this looks ‘lazy’ and can reduce the overall ‘Quality Score’ for the domain (this being a different ‘Quality Score’ to that used in Google’s AdWords system).
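Spotting such repetition is easy to automate. Here is a minimal sketch that groups pages by their Meta Description and reports any shared ones; it assumes the requests and beautifulsoup4 packages, and the URLs are placeholders:

  from collections import defaultdict

  import requests
  from bs4 import BeautifulSoup

  urls = [
      "https://www.example.com/",
      "https://www.example.com/about/",
      "https://www.example.com/services/",
  ]

  by_description = defaultdict(list)
  for url in urls:
      soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
      meta = soup.find("meta", attrs={"name": "description"})
      description = meta.get("content", "").strip() if meta else ""
      by_description[description].append(url)

  for description, pages in by_description.items():
      if len(pages) > 1:
          print(f"Shared by {len(pages)} pages: {description!r}")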

The other thing that TheHoth says is that you should not repeat the keywords in the Description. I am not sure about that, although of course, you should not ‘stuff’ this area either.

The Real Purpose of the Meta Description Tag

By the way, the real purpose of the Description text is to act as the ‘elevator speech’ for a page, the idea being to get the viewer to click the link when it is displayed in the Search Engine Results.

The Header Tags

Looking at TheHoth’s article, we see that they mention the importance of the H1 tag. Again, this is an area of contention, as I have seen many pages performing very well with no H1s, or with tags that do not use any of the target keywords (or anything like them).

However, I agree that it is best practice to use an H1 (and only one per page), whilst also using H2s down to H6s in a cascading manner, as best fits the way the copy is divided up.
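To see how a page’s headings actually cascade, the sketch below prints the heading outline of a saved HTML file (the file name is made up) and flags a missing or repeated H1, or a skipped level. It assumes the beautifulsoup4 package:

  from bs4 import BeautifulSoup

  with open("page.html", encoding="utf-8") as f:
      soup = BeautifulSoup(f.read(), "html.parser")

  headings = [(int(h.name[1]), h.get_text(strip=True))
              for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]

  if sum(1 for level, _ in headings if level == 1) != 1:
      print("Best practice is exactly one H1 per page")

  previous_level = 0
  for level, text in headings:
      if previous_level and level > previous_level + 1:
          print(f"Skipped a level before this H{level}")
      print("  " * (level - 1) + f"H{level}: {text}")
      previous_level = level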

They also point out that you could over-optimise a page by using the target keywords in the H1. Again, I am not sure that this is totally true, as I have seen that work too. I feel that the best approach is to write the copy – some good, solid, useful copy – and then add Headers that look ‘right’, all the while making sure that you do not ‘over-cook’ the keyword usage.

Variations of Target Words

This is where the use of variations (synonyms) of the target words comes into play, and it pays to use them throughout the site’s pages, especially in the body copy, for the simple reason that Google ‘likes it’. It is therefore reasonable to also use these variations in the Title / Description and Headers, if it looks OK.

This leads nicely on to the third area that TheHoth covers, that of Keyword Density.

Keyword Stuffing and The Use of Synonyms

In the old days of SEO, you could quite happily repeat your target keyphrases again and again and again, and the Search Engines would reward your site with lots of top ranks. However, in the game of cat and mouse that SEO is, things have changed quite dramatically in this area; now, of course, repeating your keyphrases has dire results…
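If you want a rough, do-it-yourself check on your own copy, a density calculation takes only a few lines. The 3% threshold below is an illustrative rule of thumb (not a figure from Google), and the file name is made up:

  import re

  def density(text, phrase):
      # Percentage of the words in the text taken up by the phrase.
      total_words = len(re.findall(r"\w+", text))
      hits = len(re.findall(re.escape(phrase), text, re.IGNORECASE))
      return 100.0 * hits * len(phrase.split()) / max(total_words, 1)

  with open("draft.txt", encoding="utf-8") as f:
      copy_text = f.read()

  d = density(copy_text, "cat beds")
  print(f"Density: {d:.1f}%" + (" - possible stuffing" if d > 3 else ""))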

This change, like the majority of them, was made with one aim in mind: increasing the quality of the pages listed for any given phrase in the Search Engine Results. The other reason is that there were many SEOs who were gaming the system and skewing the results, which was something that Google was not prepared to put up with.

Hence the many changes to how pages are ranked – this including Off Page SEO, which is another topic.

The problem Google had (and still has) is how to calculate quality, especially when copy that is deemed great by one person is thought of as rubbish by another. This is made even more difficult when a computer is being used to score pages; after all, English is not exactly the first language of a computer…

To this end, Google (being the front runner here) built a sort of Artificial Intelligence into its algorithm, its job being to deduce the real ‘meaning’ of a page from the words on the page. Here it was not looking from the angle of ‘what keyphrases does this page target’, but rather ‘what is the page talking about as a whole’. It was trained to look for words that are associated with each other, so that, for example, a page that uses the words ‘cat’, ‘feline’, ‘kitten’, ‘purr’, ‘pet’ and ‘bed’ would be associated with the phrase ‘cat beds’ more strongly than a page that just used the words ‘cat beds’.
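A toy version of that idea can be sketched in a few lines: score a page by how many distinct topic-related terms it uses, rather than by raw repetition. The term list is purely illustrative, and this is of course vastly simpler than anything Google does:

  import re

  RELATED_TERMS = {"cat", "feline", "kitten", "purr", "pet", "bed"}

  def topic_score(text):
      # Count how many DISTINCT related terms appear in the text.
      found = {w for w in re.findall(r"[a-z]+", text.lower())
               if w in RELATED_TERMS}
      return len(found), sorted(found)

  page_a = "Our cat bed range suits every feline; each kitten will purr."
  page_b = "cat bed cat bed cat bed cat bed"
  print(topic_score(page_a))  # broader vocabulary, stronger association
  print(topic_score(page_b))  # repetition alone scores poorly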

How to Deduce What Synonyms to Use

One way is to spend your life looking at a thesaurus, but perhaps the best way is to examine the words used on the pages that Google is known to like (we can tell, because they are ranked well for any given phrase). The list created from such research can then be used in the copy, in the knowledge that the words must be relevant to some degree. This is the way that Serendipity Online Marketing goes about the matter of copy creation, and it has been seen to work.
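That research step can be sketched along the following lines: tally the words used across a handful of well-ranking pages (saved to the placeholder files below) and review the most frequent ones by hand. The tiny stop-word list is purely illustrative:

  import re
  from collections import Counter

  STOP_WORDS = {"that", "with", "your", "this", "from",
                "have", "will", "they", "been", "what"}

  counts = Counter()
  for path in ["rank1.txt", "rank2.txt", "rank3.txt"]:
      with open(path, encoding="utf-8") as f:
          text = f.read().lower()
      counts.update(w for w in re.findall(r"[a-z]+", text)
                    if len(w) > 3 and w not in STOP_WORDS)

  for word, count in counts.most_common(30):
      print(word, count)  # keep only the genuinely relevant terms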

Long Tail Keywords

Any site gets a huge percentage of its non-brand traffic from what are called ‘long tail keywords’, these being phrases of three or more words. Individually, these phrases are not searched very often, which in turn means that they do not appear on the ‘SEMrush radar’ and thus cannot be directly targeted.

Instead, what you have to do is write the copy with a view to providing information to the reader; copy which, if it is good enough, WILL be associated with many of the relevant long tail searches, whether or not it actually uses the words in question.

There Is More of Course

TheHoth’s article finishes at this point, but there is of course a lot more to why pages are ranked and why they are not. This includes Off Page SEO, an area that cannot be ignored: a page with the very best of copy, placed on a site that uses the Titles, Descriptions and Headers to full effect, is more than likely to be beaten to the top spots by other pages, simply because they have more links…

This is one reason why Keyword Research is so important, as when selecting the target words, it is vital, especially for a new domain, to select the possible, rather than the impossible.


About the author

Graham Baylis was born in 1957 and has therefore seen the birth of the Internet and experienced at first hand just how it has changed the world we live in. He has been involved with computers since 1983 and helped set up the first electronic mail system used by the Civil Service in 1986. He has gained several qualifications in Computing and Marketing and has written hundreds of blogs, which you can find around the web, many being on customers’ websites. He has over 19 years of experience with Pay Per Click and SEO and has an answer for most Search Marketing questions; for those that he hasn’t, he is quick to find one. Continually coming up with solutions, Graham is one person it is worth having a chat with.