Research into the FRED Google update, confirming why sites lost rankings.

Google’s Fred update caused quite a ripple in the SEO world, with many sites losing rankings, and hence traffic, by up to 90% in some cases. I have been doing quite a bit of digging and have asked some gurus some pointed questions about what has happened and why.

The overall view is that Google penalised sites that had poor content, or ones that were simply there to make money without giving anything back to the visitor in the form of useful data or information.

User Experience is Another Factor

Other thoughts on the matter were more to do with the User Experience that a page gave its visitors. Here, the sites said to have been hit included those that placed their copy below the fold of the screen or, in some cases, were very slow to load.

However, some of the sites that were hit were not simply ‘out to make money’, but seem to have been ‘lumped in’ with those that are because of the lack of content on their pages.

Having a Lot of Links Did Not Save Sites

There was also talk that Fred checked the quality of the links to sites too. This may turn out to be the case; further research is needed on this matter. However, what we can say is that sites that fell foul of Fred’s on-page quality checks were not saved by having a lot of links. Instead, their positions were taken by sites that had inferior linking profiles, at both page and domain level.

This research only covers 9 sites, so it can hardly be said to be definitive, but the evidence so far is pretty consistent. Further research into the sites that were affected but did not fit the profile of sites that ‘should have been affected’ by Fred is the next step. More on the ‘efficiency’ of Fred later.

The FRED Data

In each case, the sites that held a first page rank for a given term before Fred were compared with the sites that now hold the first page for that term. The sites that had lost their first page rank (they had to have held a position of 7 or better pre-Fred) were then checked, with a view to seeing what could have caused them to lose their rank and whether this fitted the profile of sites that Fred ‘should have hit’.

The phrases checked covered a range of topics, from ‘lqf fruit’ to ‘chemical companies’, so should be diverse enough to give some firm data.

Search Phrase ‘lqf fruit’

Before and After Fred

Screenshot: the Google search results before and after Fred

Here two sites lost their first page rank:-

Site 1 – screenshot: not enough text for Fred

This site had lost a rank of 5, and when checked, we saw that the actual page shown when you clicked the link was https://www.thespruce.com/what-does-iqf-mean-995719, a page not even on the stated domain, something that is sure to annoy Google to start with. Furthermore, this page had very thin content and seemed to be there only to provide a place for Google Ads and other advertisements. Being a prime target for Fred, it is not surprising to see that it was hit.

Site 2 – screenshot: the fruitbycrops site, with content that is too thin

Again, a site with very thin content, just 155 words with an advert at the very top: another prime target for Fred.

 

Search Phrase ‘chemical companies’

Before and After Fred

Screenshot: the Google results for the term before and after Fred

Again two sites affected:-

Site 1 – screenshot of the affected page

This is a big website with a lot of links, some 222,000 to the domain (although only 3 to the page itself). The reason it lost its rank seems to come down to the fact that the page in question was just not relevant enough, the topic being covered by only one short item on the page.

Site 2 – screenshot: an example of a penalised site

Was this site penalised because its copy was not ‘good enough’? That seems the most likely explanation.

This is another page that held just a small amount of what I would call ‘filler text’, not really ‘saying anything’ (at least in my view), the total length being just 251 words. Again, a prime target for the Fred update.

 

Search Phrase ‘welding supplies uk’

Screenshot: the Google results pre and post the Fred update

Two sites here:-

Site 1 – screenshot: the Weldingshop site, one of many hit by the Fred update

This site is not that bad in reality, although some may think it a bit old fashioned, and it is certainly no worse than many sites that have held onto their first page ranks. The most likely cause of the page’s loss of rankings is that the main copy is only 340 words long, which leads me to think that the length of the copy falls below the ‘satisfactory’ level laid down in the Google Quality Guidelines.

Site 2 – screenshot: too little copy, and placed below the fold; possible reasons for the site being hit by Fred

This page lost a rank of 7, the amount of copy again being the likely cause of the drop: only 270 words were on the page, and they sat below the fold, a factor that Google stated (back in 2012) degrades the value of any copy.

Search Phrase ‘metal fabricators’

Screenshot: how Fred altered the Google results for this phrase

Three sites had lost their ranks for this phrase:-

Site 1 – screenshot: too few words for Fred?

Yet another page that lost its ranks, apparently down to the lack of content, the copy amounting to just 154 words.

Site 2 – screenshot: text below the fold, a possible reason for a Fred hit

This site had a rank of 4 before Fred, and it does have a fair number of words, over 600 in all. However, 90% of them are below the fold on the screen, and this looks to be the reason for the drop.

Site 3 – screenshot: yet another site hit by Google’s Fred

This page lost its 6th position, again being a ‘low volume of copy’ casualty, the copy amounting to just 170 words.

 

Conclusion

In all cases we can see that the sites affected by Fred did seem to fit the patterns suggested by the gurus and by other research, in that they mostly had very thin copy or ‘hid’ the copy below the fold on the page.

The next step is to see whether the pages we currently look after, SEO-wise, that also suffered a drop in rankings fit this pattern too.

Watch out for another report on this later in April.

Smart Scientific SEO Strategies for 2017

It’s been a fair few weeks since we managed to post anything on our blog, and frankly I’m amazed at how fast the year has gone so far, at the rate at which things seem to be changing, and at the amount of really useful software that has become available.

The post we’ve highlighted today (see below) comes from a series published by a well-respected web design and SEO company called AimInternet. It is certainly a useful piece and highlights the fact that the information in Google’s Webmaster Tools (now called Google Search Console) is very useful indeed. The main reason I say this is that Google (for reasons of privacy, they say…) stopped reporting in Analytics the keyword phrases used by visitors to a site. You can tell they come from Google, but not what search words they used. All very annoying when trying to work out which words are converting and which are resulting in a high bounce rate.

Google Search Console fills this gap, to a degree, in that it gives you a good idea of the phrases being used, the number of times a phrase has resulted in someone seeing a Google listing for the site, the Click Through Rate (very useful, this, as it gives you an idea of whether your Title and Meta Description are well tuned to get clicks) as well as the average position in Google. But it does not tell you which page those searchers land on, or whether they stay or ‘bounce’.

You can start extrapolating the data to make some intelligent guesses about what is going on (there is software that will do this for you) but they are only guesses (you could always run an Adwords campaign to check, but that is another story).

But to get back to what the article is about.

Scientific Organic Search Strategy

In the article AimInternet mention that they had increased the ‘number of keywords present’, by which I think they mean the number of different search phrases (or ‘Queries’ in Google Search Console speak) that were associated with a site. They made a big difference (something that we too pride ourselves on being able to achieve), increasing the number of associated phrases from 300 to 800. What this really means is that the ‘footprint’ of the site on Google has more than doubled, hence it is more likely to be seen and thus get a click! All very good.

The process by which they reached this point is covered in earlier posts and no doubt they follow the same ‘Scientific’ path as we do. If they do, they will first carry out research to find the words being used by people searching for their customers’ services and products. Then they will weave these into the site and construct content that supports the drive for rankings for the chosen target phrases.

What they ‘might’ not do is to check on the sites that currently have the best positions for these target phrases and then ‘Reverse Engineer’ them. By following that path you ‘know’ the words that Google likes to see and can thus use them in the content. This system also gives you a list of all the similar words and phrases that should be used, which avoids keyword stuffing and gets the ‘message’ across to Google in the way that we know it likes.

Add some links (that themselves have to be intelligently added – there is software that helps with that now too) and the site WILL, like Eagle, be associated with more query phrases, get better rankings and thus more traffic.

But the trick is in carrying out each of these phases in a controlled scientific manner…

One very interesting point that Aim made is that once you have a list of the phrases that Google associates with a site, you should build on this and write content (around these phrases) that will make the site that bit more interesting and helpful. This will not only cement your position with Google but will no doubt improve the rankings for the site and, more importantly, give your readers more reasons to come back for more, and even, hopefully, buy from you.

They also make the point that visitors don’t always come in through the front door (the home page) so you should make your interior pages interesting too. This is not really new though, in that most of the pages on a site should be doing their best to engage with viewers by providing useful content, each page targeting a different set of keyphrases.

So a very interesting article.

To read the whole post on A Smart Organic Search Strategy please click the link

How We Use A Smart Organic Search Strategy To Get Our Clients On The First Page Of Google

This week we expand on looking at how to get your website on the first page of Google by using a smart organic search strategy.

In our last blog, we looked at the importance of getting on the first page of Google. And, we examined how our methods of using local marketing tools are driving traffic to the homepage – and producing fantastic results – for a client of ours. This week, we’ll expand on part of that methodology – using an organic search strategy to drive traffic to particular product pages or blog pages which then link through to specific product pages. We also do this via Adwords, although this is something we’ll look at in more detail in following blogs.

What Is An Organic Search Strategy?

In brief, an organic search strategy consists of finely keyworded product pages or blogs, which get picked up by Google each time one is published on a website. At this point, you might be thinking “I’ve already got all the information about the products or services I offer on one page of my site so I’ve nailed it, right?” or “I make rubber plugs, why the heck do I need a blog about those, who is going to read it?!”.

OK, so you might not be totally wrong about the last point (but hey, you never know, there might just be a rubber plug enthusiast out there who would LOVE to read your blog about them!).

Getting back to business…

Creating separate product pages on your site and posting blogs is all part of your organic search strategy. Simply, doing so creates more pages on your website containing the relevant keywords that you want your website to be found for, which Google can then index. The more relevant and unique pages and content you have on your site, the more shots on target you have at being shown on the first page of Google.

The important things to note here are relevant and unique. Google is smart and will penalise your site if you post up a load of duplicate pages and content. The same goes if you keyword stuff (make your content unintelligible by jamming in too many keyword phrases) your posts and pages.

We won’t go into it here but recommend that you take some time to familiarise yourself with good content practice. That includes following referencing protocols if you are using content from another site. For example, you might choose to do a blog post which rounds up the “5 best things about rubber plugs” and which uses information from other websites. That’s absolutely fine, but just remember to acknowledge and reference your sources correctly.

Why Do This?

How many pages are currently on your website? Probably not that many? So, if you currently have one page that discusses your 10 different products, by separating them out into individual pages you just added 10 extra pages to your site virtually overnight. You’ll be able to expand the content around each product, and so the mentions of the relevant keyword, too. So, whereas on the original page, you may have only listed the type of products you sell, you can now go into more detail about each one on their own page. This naturally allows for an articulate way of including more of your desired keywords on your site – avoiding the extreme no-no practice of keyword stuffing.

Google likes new and relevant content. Each page becomes a new way for traffic to come to your site. Of course, once the core pages of your site are done it’s likely that you won’t be updating those that often. Which is why, as part of any organic search strategy, we advise our clients to do regular blogging. And, in the case of blogging, the more regularly you post the better.

Employing an organic search strategy such as this might mean that traffic enters your website not via the traditional route of arriving at the homepage. Instead it might enter on a product page or a blog post page written around a specific topic, which then links to a product page. Typically, we notice that customers will land on one of the product pages of our client’s websites, because of the organic search that we’ve set up for the client.

If you’re in the pressed parts trade you might do a search in Google for “copper plating”. Google will take into account your location (it gets this information from your settings) and present to you the most relevant results. Let’s say you’re Midlands based, as is EC Williams.

As a result of this search, people enter EC Williams’ site on the Copper Plating product page. Once on the page, you are presented with all of the information you need about “copper plating” along with some important trust points about the company. Our analysis shows us that from landing on this entry point people also then navigate to other pages on the site. From this example in particular, we can see that “zinc plating” is the next most popular page. Once on their website, this alternative page is now easily found in the navigation bar above, under “Plating Services”. From our research, most people stay on the “zinc plating” page, as they’ve found what they want. But, if they want more depth they’ll go onto “zinc nickel plating”.

The point of this is that once on the EC Williams’ website, the customer is presented with everything they need to make a purchasing decision. And, if you were that person looking for a company who were experts in the field of coating pressed-parts, then, bingo – you just found them.

Straight away, serious buying customers get a snapshot of relevant information once they are on the site. Because of the trade they’re in (pressed parts), they become interested in making an enquiry straight away. We’ve measured this extensively on EC Williams’ site plus many others’, and know that it works. You need to make it easy for your customers to find information on your site and this method works by doing just that. Everything has to be there for the user so that they’re not having to look for things too much.

How Organic Search Strategy Works

Most people will find you through a long-tail keyword search. These are keywords that tend to be more specific. Your website content should be driven by the keywords that your SEO advisor gives you. They need to advise your outsourced content providers of these keywords so that they can write content around them.

Take a look at www.eagleplastics.co.uk. They are another client of ours. Again, you can see that, similar to www.ecwilliams.co.uk, everything a customer requires is there, easy to find on the homepage, above the fold.

From an SEO perspective, when we started working with Eagle Plastics, the number of keywords we had to work with was much smaller than it is now. The site was receiving much less traffic than it does today, which meant that there were nowhere near as many clicks or impressions being recorded. This impacted on the number of keywords being presented to us by Google. At the time we were only getting about 300 keywords presented, yet a year or so on, Google is now presenting 800 keywords.

This is a result of the organic search strategy we have implemented, like the one we discussed earlier. Traffic gets signposted to the Eagle Plastics website based around these 800 keywords. And, now we have more of those, we can start creating content based on different keywords and keyword phrases.

Through testing the blogs, we are able to determine which keyword phrases are the most successful by analysing which ones have the best impressions.

On Eagle Plastics, “High Impact Polystyrene” is a key term for them. We know that this keyword phrase works well for them so we use it regularly in their blog headlines, in the h2 sub-headers and throughout the blog text. Of course though, we ensure we use it professionally and never keyword stuff.

As a result of this organic search strategy, we are providing more content to Google. This is recognised by them and results in Google starting to suggest more keywords which are relevant. We then create content based around these suggested keywords and their variations. As we post regular content which uses those keywords, Google views this as quality content and so provides us with even more relevant keywords. We then use these to continue to push the search and content strategy. The result is more traffic. But more than that, in getting more traffic, Google rewards you for quality content. And so it continues…

As little as five years ago, most searches were conducted using just two keywords. Today people use an average of five words per keyword search term. What was once a keyword search for “plugs” is now a more unique phrase such as “the best luxury rubber plugs”. As you can see, the one-word keyword has become a keyword phrase made up of multiple words. Searches are now more unique and these long-tail keyword phrases more specific.

Ultimately, it’s important to remember that every keyword search represents an intent by someone to find some information out. Long-tail keywords help you to better address that user intent by creating unique tailored content.

Statistics show that of the 3 billion searches made each day, 20% are unique. That’s a heck of a lot of unique searches – and to get displayed on the first page of Google, you need a successful organic search strategy to be found amongst all of that noise.

SEO Ho Ho – Search Engine Optimisation in 2016 – Xmas Message

The year is nearly at an end and Xmas has been and gone, but there is still a lot of cheer in the air and pleasant memories of all the festivities to boot (amongst them our company Xmas card – see the image below – which went down very well with our customers).

Image: our ‘SEO Ho Ho’ Xmas card

But there are other reasons to be grateful about 2016, in that in my view Google has made some really good moves to make the results fairer and more accurate, the latest Penguin update really sorting things out.

This has been somewhat of a relief to SOM, as we have been ‘preaching’ what we call ‘Proper’, ‘Scientific’ SEO. What we mean by this is that we research the words that people are searching for in a market area, and find the words that Google ‘wants to see’ for these phrases so that they can be incorporated into the copy. Then we add some relevant links (with a natural anchor text and source type mix) and, hey presto, things start to happen.

The best part of this is that it is all totally ‘Google legal’ and can never, in our view, be subject to any penalties that Google may dream up at some time. We can say this as all we are trying to do is to make sure that any site we optimise offers some of the best information there is on a given subject, and of course we make sure that there are enough links to the site’s pages so that Google thinks the same. We call this link building programme ‘priming the pump’, as once the site gets traffic, the links will start building organically. Link building is still required in many cases, but perhaps only because others are trying to get their sites’ rankings higher too…

As to the blog post we have included below, we certainly agree about the rise of AI and believe that Google searchers have for some time been ‘rats in the Google maze’, in that Google has been analysing what we click on and what sites we like, thus getting closer and closer to its goal of truly understanding the real intent behind a given search term.

The other interesting thing raised here is the increased importance being given to mobile search, not really surprising when you realise that people are accessing the web using mobile devices more and more these days.

For 2017 we see more of the same, Google getting cleverer and cleverer at separating the good sites (the ones that deserve rankings) from the ones that don’t, all of which means you just have to ‘do SEO properly’ or suffer the consequences…

To see the full article on SEO in 2016 and some predictions for 2017 please click the link.

What we’ve learned about SEO in 2016?

Since the inception of the search engine, SEO has been an important, yet often misunderstood industry. For some, these three little letters bring massive pain and frustration. For others, SEO has saved their business. One thing is for sure: having a clear and strategic search strategy is what often separates those who succeed from those who don’t.

As we wrap up 2016, let’s take a look at how the industry has grown and shifted over the past year, and then look ahead to 2017.

A growing industry

It was only a few years ago when the internet was pummeled with thousands of “SEO is Dead” posts. Well, here we are, and the industry is still as alive as ever. SEO’s reputation has grown over the past few years, due in great part to the awesome work of the real pros out there. Today, the industry is worth more than $65 billion. Companies large and small are seeing how a good search strategy has the power to change their business.

As search engines and users continue to evolve, SEO is no longer just an added service brought to you by freelance web designers. With the amount of data, knowledge, tools and experience out there, SEO has become a power industry all on its own.

Over the course of the year, my agency alone has earned a number of new contracts from other agencies that are no longer able to provide their own search efforts. A large divide between those that can deliver SEO and those that can’t is beginning to open up across the board.

The rise of AI

Artificial intelligence (AI) is now prevalent in many of our lives. Google, IBM, Amazon and Apple are very active in developing and using Artificial Narrow Intelligence (ANI). ANI can be used to automate repetitive tasks, like looking up product details, shipping dates and order histories and performing countless other customer requests.

The consumer is becoming more and more comfortable with this technology and has even grown to trust its results. Sundar Pichai, Google CEO, announced during his Google I/O keynote that 20 percent of queries on its mobile app and on Android devices are voice searches.

RankBrain, Google’s machine-learning artificial intelligence system, is now among the top three ranking signals for Google’s search algorithm. Why? Google handles more than 3.5 billion searches per day, and 16 to 20 percent of those are unique queries that have never been searched before. To handle this, the team at Google has harnessed the power of machine learning to help deliver better results.

While we can’t “control” RankBrain, what we can do is learn more about how Google is using it and then help the tool by creating good content that earns shares and links, building connections with others in our niche or related niches, and building trust in very targeted topics.

We are still in the beginning stages of this technology, but as more and more homes become equipped with smart tools like Amazon Echo and Google Home, we can be sure that these tech giants will use the knowledge they gain from voice search to power their AI technology.

The “Google Dance”

Every so often, Google likes to surprise us with a major algorithm update that has a significant impact on search results — some years we get one, and other years we get a little more.

While they do make nearly 500 tweaks to the algorithm each year, some are big enough to garner more attention. Let’s look back at four of 2016’s most memorable updates.

Mobile-friendly algorithm boost

A little under a year after “Mobilegeddon,” an event marked by the launch of Google’s mobile-friendly ranking algorithm, the search giant announced that it would soon be increasing the effects of this algorithm to further benefit mobile-friendly sites on mobile search. That boost rolled out on May 12, 2016, though the impact was not nearly as significant as when the mobile-friendly ranking algorithm initially launched.

Penguin 4.0

While this ended up being a two-phase rollout, Penguin 4.0 made its entrance on September 23, 2016. This has been considered the “gentler” Penguin algorithm, which devalues bad links instead of penalizing sites. The second phase of Penguin 4.0 was the recovery period, in which sites impacted by previous Penguin updates began to finally see a recovery — assuming steps were taken to help clean up their link profiles.

“Possum”

While this update was never confirmed by Google, the local SEO community noted a major shake-up in local pack and Google Maps results in early September 2016.

Fellow Search Engine Land columnist Joy Hawkins noted that this was quite possibly the largest update seen in the local SEO world since Pigeon was released in 2014. Based on her findings, she believes the update’s goal was “to diversify the local results and also prevent spam from ranking as well.”

Divided index

As mobile search continues to account for more and more of the global share of search queries, Google is increasingly taking steps to become a mobile-first company. In November, Google announced that it was experimenting with using a mobile-first index, meaning that the mobile version of a website would be considered the “default” version for ranking purposes instead of the desktop version:

“To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results.”

The time to say goodbye to 2016 is fast approaching, and I am truly excited to see what 2017 has in store for the world of SEO!

95% of websites are HURTING their Own Google Rankings

We have checked hundreds of websites over the years and the sad fact is that 95% of them are actually doing things that will make it harder (or impossible) to get rankings on Google.

Is Your Site One of the 95%?

The question that you (as a business website owner) might well be asking is ‘Is MY site one of the 95%?’. Of course, you may not be bothered, thinking that your site’s ‘job’ is just to ‘be there’ when someone wants to check up on you. But that is really a waste: your site could be doing so much more than just sitting back, waiting for the occasional visitor…

Brochure Sites

Brochure sites are sites that are just meant to act, well, as an online brochure, a means to impart information about a business to anyone who is interested. They are often visited only by people who, having heard about a company (or maybe having met someone at a networking event), want a bit more information before they contact them for a quote etc.

A Wasted Marketing Opportunity?

This is a good way of using the power of the Internet (it saves on a lot of brochure printing for a start), BUT is it also a wasted opportunity? The thing is, here you have a website, full of (hopefully) interesting stuff about your business, the services that you offer and ‘what makes you special’, and yet no great efforts are being made to get more people to read it all. This must be a wasted opportunity, as any one of those visitors (that the site is not getting) could be a potential customer…

So What Are These Sites Doing Wrong?

The fact is that there are many ways in which business sites are ‘getting it wrong’ when it comes to getting Google to ‘like’ them, and thus give their pages a prominent position for a given search term. Some of them are quite basic mistakes too and could easily be fixed with a few clicks (and a little bit of thought).

Some Examples of the Mistakes Sites Make

The Title Tag

You may not notice this one (although Google always does), as it is a bit hidden, but if you take a look at the top of your Internet browser window, you will see the ‘Title’ information for the page you are looking at. In many cases you will see words like ‘Home’ or ‘About Us’. Whilst not being incorrect (as you would be looking at the Home or About Us page), they are not really very informative to the very ‘person’ you really want to impress, and that of course is Google.

Think about it: would not a phrase like ‘IT Support Services | Computer Repairs’ ‘tell’ Google a bit more than the word ‘Home’? It really is a no-brainer and so very easy to fix…
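
By way of a quick sketch (the page and the wording here are invented, purely to illustrate the point), the change is a one line edit in the page’s <head>:

    <!-- Before: tells Google nothing about what the page offers -->
    <head>
      <title>Home</title>
    </head>

    <!-- After: describes the services the page is actually about -->
    <head>
      <title>IT Support Services | Computer Repairs</title>
    </head>

Nothing on the page itself changes for the visitor; only the ‘label’ that Google (and the browser tab) sees is different.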

The Meta Description

When you look at a page you don’t see this at all (not even at the top of the browser), it only being visible in Google’s search results, under the Title and the URL of a site. This might make you think that it is worthless from an SEO point of view, but you would be wrong. It is true that the words in the Description do not have a lot of clout SEO-wise, but if you leave the field empty, or use the same one on many pages, you run the risk of making the site appear ‘lazy’ as far as Google is concerned, and that ‘black mark’ could make all the difference when Google has to decide which site to list for a phrase you want to be found for.

Again, a few clicks on the keyboard can make the problem go away.

The Elevator Speech

Another thing you should bear in mind is that a good Description can make all the difference when it comes to getting that all important click from the Google search results. Think of this 160 character text block as your ‘elevator’ speech and create one that would make someone just have to click through to your site, as it is only then that you get a chance to start that dialogue that could result in a sale or enquiry.
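
As a small sketch (the wording below is made up for illustration), the Description sits in the page’s <head> alongside the Title, and each page should have its own:

    <head>
      <title>IT Support Services | Computer Repairs</title>
      <!-- Not shown on the page itself, but used as the snippet under your
           listing in Google's results. Keep it to roughly 160 characters and
           give people a reason to click. -->
      <meta name="description" content="Friendly IT support and computer repairs for small businesses. Fixed price call-outs, same day response and free initial advice.">
    </head>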

The Header Tags

This is another of those things that you will probably not have noticed (and yes, you guessed it, Google is looking at this too), other than that the text might look a bit bigger. But why is the correct use of Header tags important? To explain this I need to give you a bit of a history lesson, starting with the way that documents are constructed. This actually goes back to the time when newspapers were laid out using lead type, as the editors had to be able to let the people laying out the type know which bits were the important ones, that is, which words (like the headlines) needed to be big. This was all done using a ‘Header Tag’ number ranging from 1 to 6 (or something similar).

This rule set was used when the code that describes how a page would be displayed on word processors and screens was written, it again being used to control how words would be displayed. This in turn fed through to the language that controls printers and also, most recently, to how web pages are rendered by browsers, this of course being HTML.

The Advent of CSS Styles

In the early days of the Internet there were in fact only a few ways you could control how big the words on a page were, these Header tags being one of them. Today, of course, you can control the font, size and colour of the text on your web pages using CSS styles, but the importance of the Header tag lives on, as Google still uses these to work out which words on a web page it should take more notice of, something that is vitally important when trying to get your page to the top of the results.

A Problem With Web Designers

It must be said that most sites use these Header tags, but the problem is they are often used incorrectly, the majority of web designers still using them to control the size of text, often compounding the issue by then using them for such terms as ‘Home’, ‘Contact Us’ or ‘Blog’. Highlighting words like these to Google is useless; it is far better to use them to point out to Google the words that you want to be found for, like ‘IT Support Prices’ or ‘Best Anti Virus Software’.

Putting this right is a little harder than both of the above, but it is still not that big a job and makes your site that bit better in Google’s eyes and thus that bit more likely to get a good listing in their results.
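
As an illustration (the phrases and sizes here are invented), the fix is to keep words like ‘Home’ out of the Header tags, reserve them for the phrases you want to be found for, and let CSS decide how big things look:

    <head>
      <style>
        /* CSS controls the appearance... */
        h1    { font-size: 1.6em; }
        h2    { font-size: 1.2em; }
        nav a { font-size: 1.4em; } /* big navigation text, no Header tag needed */
      </style>
    </head>
    <body>
      <nav>
        <a href="/">Home</a>
        <a href="/contact">Contact Us</a>
      </nav>
      <!-- ...while the Header tags tell Google which words matter -->
      <h1>IT Support Prices</h1>
      <h2>Best Anti Virus Software for Small Businesses</h2>
      <p>The main copy of the page goes here.</p>
    </body>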

Links – The Popularity Voting System of the Internet

Whilst the majority of the power that links bestow comes from links to a site from other sites (so-called ‘backlinks’, as they link back to you), the links FROM a web page to other sites and the INTERNAL links within a site are also important. The first tells Google that you are a part of the community that makes up your market place (as well as pointing them at some other valuable resources, which Google likes to see), whilst the second type helps Google understand what each of your pages is about, as well as helping people move about your site. As Google rates sites that offer the best ‘user experience’ higher than others, such internal links can only help.
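
As a small, made up example (the page names and the external address are placeholders), both types are just ordinary links sitting in the copy:

    <p>
      We explain our charges on our
      <!-- internal link: helps visitors move around and tells Google what the target page is about -->
      <a href="/it-support-prices">IT support prices</a> page, and for general
      advice on keeping your machines patched, see this
      <!-- outbound link: points at a relevant external resource -->
      <a href="https://www.example.com/patching-guide">guide to software updates</a>.
    </p>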

Incoming Links

Whilst the links to a site cannot be put right by making changes to the site itself, they are a vital part of the ‘battle’ to get a site listed on Google, accounting for about 40% of the marks that Google allocates when deciding which site to list for which term. However, the fact is that the majority of sites either don’t have any (or enough) links, or have the wrong sort. Both of these can really hinder a site’s chances of getting a first page (or any) ranking. Fixing them can take a long time and a lot of work though, and has to be done very CAREFULLY.

 

SEMANTIC SEO and the Words on the Page

Semantic SEO is all about making sure that Google understands what a site is all about, thereby ensuring that its ‘meaning’ is fully comprehended. This is easier to do than you might think, the major thing to get right being to make sure you use the right words on the page. The right words, of course, are the words that Google wants to see. The good news is that Google will tell you what these words are; all you have to do is ask in the right way, this being done by ‘Reverse Engineering’ the top pages on Google…

Writing the Right Copy

Armed with these words and phrases, and a good understanding of the subject (it helps if you are a genuine expert), you can then write the right copy, adding some images and, if you can, audio and video components as you go. Sprinkle in some internal and external links at the same time and you have gone a long, long way towards cracking this particular nut.

 

Polishing the Spitfire

You may not believe it, but it is said that back in World War 2 they used to polish the photo reconnaissance Spitfires (as well as painting them pink so that they were harder to spot in the dawn or dusk skies) just so that they could gain a few mph, something that could make all the difference, life or death in this instance, when being chased by enemy fighters.

If you follow the guidance above and fix any of the items mentioned, it will in effect polish your website a little, perhaps gaining just enough extra speed to get your site onto Page 1 of Google and thus bring in the extra traffic that could make all the difference to your business.

 

Need Help With the Polishing?

However, if you need help with the polishing, even if it’s just some assistance in finding out what bits to polish the hardest, please do give us a call. We are here to help and offer a lot of free advice and assistance.

WHAT IS SCIENTIFIC SEO?

First a bit of history about Search Engine Optimisation

SEO can trace its history way back to 1994 when the early pioneers discovered that they could use the Internet to drive traffic to their sites and hence sell their goods. As this idea became more accepted, people started competing with each other for traffic and that meant that they had to ‘convince’ the Search Engine of the day to list their site for appropriate terms.

The Search Engine of the Day has changed over the years, AltaVista, Ask Jeeves and Yahoo all having been top dog at some time. However, today the big player is Google, and thus that is the engine everyone wants to get listings on, and that of course means you have to understand the rules.

 

The Rules of The Old SEO

The rules that the Search Engines use have altered drastically over the years, as they have become more and more sophisticated. At the start, it was easy to ‘trick’ the Engines: all you needed to do was to stuff the pages with your keywords and get some links to the site (Google’s first ranking system was based on something called PageRank, which basically is all about the number of links to a site – and not much else).

These ‘old’ rules however had one big problem, in that the SEO professionals of the day kept finding ways around them and thus the Engines had to keep taking steps to close these ‘holes’ in their rule sets.

This process escalated over the years, especially after 2010, until Google decided that enough was enough and settled on a whole new approach, one that could not be tricked and relied on one thing: perceived quality.

 

The New SEO and Perceived Quality

Today, with the advent of something called ‘SEMANTIC SEO’ (the meaning of a site, what it is really all about), things are a lot different, it being all about the quality of the content of a site.

But Why use the term Perceived Quality?

I use this term as I believe that there are limits to what Google can do, in that its computer algorithms cannot ‘really’ decide what is genuine ‘quality’ content and what is not. Also, as mentioned above, links had, and still have, a vital role to play in how Google decides which site to list for what. But it cannot always tell if these links are ‘real’ or have been created artificially, thus in all cases Google looks at a page/site and decides (using its rule sets) whether it is quality or not.

This is why I say it is the quality that Google perceives in a site that is important. So how can you convince Google that your content is good enough to get a top ranking?

The Rules of the NEW SEO in Detail

Many changes have taken place in the world of SEO since 1994, but all of them are based on four things, one of which has only recently come to the fore.

The Four Things SEO is and was Based Upon

 Site Construction

The way a site is built is important, as if it is constructed in the wrong way then Google cannot (or may just not be bothered to) find all the pages in the site. Also, if the site is built in such a way that it is very slow, or is not mobile friendly, Google will downgrade the site in various ways.

One thing that does not cause so much of a problem today is that of the ‘Code to Text’ ratio (the amount of code that is used to build a site versus the number of words visible to the visitor). In the old days, too much ‘construction code’ was an issue, but today, with the advent of WordPress and the like, Google has been ‘forced’ to ignore this area, virtually all sites being very code heavy.

You MUST however ensure that the site can easily be navigated, a failure in that department being very serious indeed. Plus you should also use a fair number of internal links (not just the navigation) to highlight to Google what each page is about.

Words, Pictures and Videos

This is the area most affected by the new SEMANTIC SEO, it being vitally important to use all the ‘right’ words in a page. Gone are the days of just stuffing a page with the words you want to be found for. Today you need to understand what words Google wants to see and then make sure you include them in the copy, also making sure that you include pictures and where possible audio and video content on the page.

Reverse Engineering is the Key

This is where reverse engineering can help, the idea being that if you know what words are being used on the top pages (for a given term) then by including them (using correct grammar of course, as this is also checked) you must be getting closer to the perfect page.

Links

In the early days of SEO, links were vitally important; in fact they could, all by themselves, get a page listed. However, today things have changed a lot. Links are still important, counting for some 40% of the reason for a site getting a rank, but they are not as all-powerful as they used to be.

Google is Watching You

Besides not being as important as they used to be, the links to a site are now carefully checked by Google. Their aim? To make sure that the links to a site are ‘natural’ and not all built by an SEO company (although they know, of course, that the practice goes on all the time).

This checking is carried out by Google, the process being labelled ‘Penguin’. Basically this checks a site’s linking structure to see if it complies with the ‘rules’ and is hence seen to be natural. Here the number of links using the domain or URL of the site as the anchor text (the bit we humans click on) is checked, as is the number of links using ‘money words’ (the terms that a site wants to be found for) and of ‘noise’ links, like ‘see this site’ or ‘click here’. If the balance is not right, or the links seem to have been created too fast, then a site can be heavily penalised.

This means that a site’s links have to be built very carefully over time and not all in a rush.

Social Media

This is very new in SEO terms and the amount of ‘power’ that social media chit chat, comments on Facebook and Twitter provide is not fully understood. In my view, the importance of Social Media is more to do with other marketing channels, but nevertheless, obtaining links via things like ‘Social Bookmarks’ can be useful.

Putting it All Together – Scientific SEO

So, what does all this mean? Basically, it means that you must:

 

  1. Find the words you want your site to be found for – KEYWORD RESEARCH
  2. Find the words you need to include in the copy of the page(s) using Reverse Engineering – CONTENT RESEARCH
  3. Build the links to the site, CAREFULLY
  4. If you can get some Social Media comments going (more important for sites selling direct to the public than B2B sites)
  5. Monitor the progress and make changes to improve matters further

 

 

I hope this helps you understand how the matter of SEO has to be approached today.

What Google Wants…

So What Does Google Want?

If the full answer to this question was indeed understood, you can bet just about every site that wanted top rankings would make changes to the way it looked and worked in pretty short order, the prize, that top place on the first page of Google, being worth a lot of money…

But of course, Google won’t tell anyone just what they want; instead they just give out information about some of the things they want to see and, as importantly, don’t want to see. Whilst the knowledge that is imparted is useful, it only gives us a part of the picture.

 

A Vital Point – Google Often Ignores Its Own Rules

The biggest problem, from my point of view (as an SEO professional), is that not only does Google not tell you the rules, it also doesn’t keep to the ones you do know about. This makes applying any scientific approach to the process difficult; it’s just like trying to find the boiling point of water when someone is altering the air pressure all the time. One time water will boil at 100°C, whilst at another it will boil at 90°C…

You can see this ‘not following their own rules’ phenomenon all the time (if you know what to look for), with sites that break the rules still enjoying top ranking positions. This does make life difficult, but it does not invalidate the data you can obtain by checking a site’s linking or page structure, as it is more than possible (especially when it comes to links) that Google has yet to impose some form of penalty, the site then potentially losing the rankings it currently has.

Some of the Known SEO Rules

Keyword Stuffing

In the early days of the web, it was quite easy to trick the Search Engines into providing a first page rank simply by using the target words over and over again… Things have moved on now though, and if you try this trick today you will (more than likely) get worse rankings, not better ones.

Status – Avoid…

Use of Title Tag

The Title tag is not on the page, but is shown in the browser window and is used by Google to ‘understand’ what topic the page in question is all about. It also, importantly, forms the phrase that is seen when a site is listed by a Search Engine, so it is something that needs to be carefully chosen.

Status – Use wisely (best to keep to 65-70 characters)

Meta Description Tag

Like the Title tag, the words in this area are not shown on the actual page; instead they are used in the Search Engine listings and are, to all intents and purposes, an ‘elevator speech’. Their effect on SEO is very limited, except that if the same text is used on lots of pages it is believed they may have a negative effect.

Status – Ensure that every page has a unique ‘elevator speech’ that is 165-170 characters long.

Header Tags

These tags have a long history, their use dating back to the days when newspapers were printed using lead type in blocks. More recently, they formed a part of the PostScript language that allowed computers to communicate with printers. They were then subsumed into HTML and, at the very start of the Internet, were the only way of creating bigger text on the screen. There are 6 Header styles, from H1 (the most important) to H6 (the least).

Google have stated in the past that they use the text within these tags (<H1>the text</H1>) as pointers to what the page is about, but now that CSS styles are used to control the size of the text on pages, there is some debate as to whether Google also treats any BIG text as important.

Status – Use, but only for important phrases (not for Navigation) and only have one H1 tag.

Word Count

There is evidence that the top pages for many search terms are ones that have over 1,000 words of copy, although this ‘requirement’ can go up and down depending on the level of competition. The most important factor here is to use the ‘right words’ on the page (these are best found by reverse engineering the top sites for any term) and to use as many of them as you can. Size is important here for two reasons: the first is (as explained above) that Google likes lots of words (words are its food, after all), but there is another, equally important reason to have a lot of text.

This second reason is based on the fact that obtaining traffic for ‘long tail searches’ can be great for business, such search terms (normally 4 words or longer) often being used by people who are nearer the end of the buying process and thus more likely to convert.

Status – Try to create pages that are 1,000 words or longer which contain relevant words and terms

Tabs and Accordions (Copy Triggered by User Interaction)

One of the reasons that pages are often too light on copy is that the site owner (and the designer) rightly point out that a page that looks like a ‘wall of text’ is likely to be off-putting to viewers and would therefore increase bounce rates and reduce conversion rates.

There is a way of placing the text on the page so that Google can read it but it is at the same time ‘hidden’ from viewers until, that is, they want to see it. There are various methods of doing this, but in every case it is a user action that causes the text to be made visible. This process is not treated as hiding text (in the old days people used white text on a white background, would you believe), something that Google frowns upon and which could get a site banned, but it is a practice that Google have reportedly said they are not altogether happy about.

I find this stance of Google’s somewhat strange, as they also want sites to offer the best possible ‘user experience’, and it makes me feel that Google want to have their cake and eat it too. But as I don’t believe they are actively penalising sites that use this in their interface, it seems the best way of providing Google with the words it needs whilst giving users the best way of assimilating the site’s message.
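
One simple way of doing this (sketched below with invented questions and answers) is the standard HTML details/summary element; the copy is in the page source, so Google can read it, but it only appears when the visitor clicks the heading. Tabs and JavaScript accordions work on the same principle.

    <h2>Frequently Asked Questions</h2>

    <details>
      <summary>How quickly can you respond to a call-out?</summary>
      <!-- This text is in the source and readable by Google, but stays
           hidden until the visitor clicks the question above. -->
      <p>We aim to respond to all call-outs within four working hours.</p>
    </details>

    <details>
      <summary>Do you support both Windows and Mac?</summary>
      <p>Yes, our engineers are trained on both platforms.</p>
    </details>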

Status – Use with caution.

Internal Links and Links From Your Site

The power of links to sites is well understood, the right type and number enhancing the possibility of a site getting better rankings. However, it is not so widely understood that the links from a site also have their place. The reason they are important is that they ‘prove’ to Google that the site is a part of the wider community (in that market area), as well as potentially helping users locate other relevant information.

Internal links also have a role in that they allow users to move through a site in an easier way than just using the navigation system. Used carefully these link types can really assist in improving both the ‘user experience’ and Search Engine rankings.

Status – Do implement links to relevant sites, the more powerful the better. Also, consider what internal links you could place on your pages.

 

Links To Your Site.

Links to a site are still very powerful, accounting for at least 50% of the reason that a site is selected by Google for a ranking and form a VITAL part of any plan to get better Search Engine listings.

There are however some important factors to bear in mind…

  • Ensure that the links come from a wide number of locations / sites
  • Make sure that the anchor text used contains no more than 25% of ‘money phrases’
  • Check to see that the number of ‘other phrases’ is high, at least 30-40%.
  • Remember that a site is more than just a home page, links to internal pages also being needed
  • Plus, when building links, make sure that you don’t build too many too quickly.
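
To illustrate what is meant by the different anchor text types in the list above (the addresses below are placeholders, not recommendations), a natural looking profile contains a mixture of all three:

    <!-- 'money phrase' anchor: the term the site wants to be found for (keep these in the minority) -->
    <a href="https://www.example.com/welding-supplies">welding supplies uk</a>

    <!-- brand / URL anchor: the sort of link that occurs naturally all the time -->
    <a href="https://www.example.com/">www.example.com</a>

    <!-- 'noise' anchor: 'see this site', 'click here' and the like -->
    <a href="https://www.example.com/welding-supplies">click here</a>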

 

There is more to SEO of course, so please do see our site for more details and assistance.

What is the aim of Search Engine Optimisation?

The aim of Search Engine Optimisation is obviously to get traffic, the right sort of traffic, from the Search Engines, this being achieved by making a website more ‘attractive’ to Google, Bing etc, so that the site is listed when someone searches for certain phrases.

The process of making a site appeal to the Engines is well understood, the rules dictating where you place the words that are important to the site’s SEO being ones that Google, for example, are happy to share. Some of the things that you should NOT do have also been shared, but the content of a site is only half the story, the power of links still being something that cannot be ignored.

It is this latter point that is to a degree strange, it being an effect of the very start of Google, when its PageRank algorithm powered the way in which sites were graded. PageRank was based on the idea that sites that had lots of links MUST be good (otherwise why did people take the time to create the links?). It was not just the number of links that counted though: the PageRank system looked at the page that held the link and at what pages linked to it, then checked the links to those pages, and so on. I am not sure of the ‘depth’ that Google went to here, but it was quite deep and in the beginning it worked quite well.

The Start of The SEO ‘Battle’

As soon as people became aware of how the PageRank system worked, and bearing in mind the pot of gold that this form of marketing seemed to offer, companies sprang up offering services that were designed to create links, thus ‘fooling’ Google into thinking that a site was more popular than it really was.

Besides the links, Google also (at this early stage) took a cursory look at the words on the pages, it being heavily influenced by the simple inclusion of a phrase, hence the start of pages that were ‘stuffed’ with strings of words, no real effort being made to make the page appeal to anyone other than the Search Engines themselves.

Once this ‘war’ started, Google began to fight back, their systems starting to spot and penalise sites that stuffed their pages with the words they wanted to be found for, while also starting to check on the linking structures of sites in greater and greater detail, both of course with the idea of stamping out the ‘cheating’ that was going on.

Like most wars, both sides got cleverer and cleverer, one thinking of ways to get around the checks and rules that were created, the other trying to combat the attempts, one of the results being the birth of two of Google’s animals, Panda and Penguin.

Google’s Penguin – The Link Checker

As mentioned above, at the start it was links that mattered more than anything else, it being said by some that they could get a blank HTML page ranked if they created enough links to it. Google of course tightened its rule sets to try to counter such practices, in the end deciding to run periodic checks on the links to a site, the rule set being named ‘Penguin’.

Penguin’s aim is to ensure that the linking structure looks ‘normal’ (that is, one that has not been manipulated too much), and there are many checks that we know it runs (and many more that I suspect we do not know about), these including the type of sites the links come from and the words used as the ‘anchor text’ (the bit you click on). Failure to keep your linking structure looking ‘normal’ could result in an automatic penalty, one that could cause a site to lose rankings and potentially to be removed from the listings entirely.

However, the real change is not so much about checking the links, but the way that Google evaluate sites in an overall manner.

Google’s Panda

Besides links, it is the power of the content that Google measures. In the beginning, it was quite easy to ‘fool’ Google by simply including the words you wanted to be listed for; the quality of the site was not important. Of course Google, who wanted to make money from advertising, could not allow these poor quality sites to dominate its rankings, as that would cause people to switch to another Engine, and with Google competing against the likes of Yahoo and Ask Jeeves this was important…

What Google needed was a system by which they could ensure that the pages they listed first were relevant and offered the information or service which people needed and wanted. Poor quality sites with little content, or copied content, were not wanted…

Thus the Panda rule set was born, its job being to sniff out sites and pages that were of poor quality, this including sites that were not updated frequently enough, or that seemed not to be ‘bothered’ enough even to create the right Meta Descriptions and Titles; in short, sites that appeared to be ‘lazy’.

Panda also checked for copied and duplicated content, as well as looking for pages that were ‘thin’ on words (fewer than 250), at the same time giving points to sites that included videos and images as well as links to interesting and relevant sites.
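As a rough illustration (the 250-word figure comes from the paragraph above; the duplicate check and the rest are simplified guesses, not Panda itself), a crude ‘thin content’ audit might look like this:

```python
import re

THIN_WORD_LIMIT = 250   # the rough threshold mentioned above

def audit_page(url, html_text, seen_fingerprints):
    """Flag a page as thin or duplicated using very crude heuristics."""
    text = re.sub(r"<[^>]+>", " ", html_text)   # strip the HTML tags
    words = text.split()
    issues = []

    if len(words) < THIN_WORD_LIMIT:
        issues.append(f"thin content ({len(words)} words)")

    # Naive duplicate check: fingerprint the first 50 words of the page.
    fingerprint = " ".join(w.lower() for w in words[:50])
    if fingerprint in seen_fingerprints:
        issues.append("content duplicated from another audited page")
    seen_fingerprints.add(fingerprint)

    return issues

seen = set()
pages = {
    "https://example.com/guide":  "<p>" + "useful detail " * 300 + "</p>",
    "https://example.com/advert": "<p>just an advert and a few words</p>",
}
for url, html in pages.items():
    print(url, "->", audit_page(url, html, seen) or "looks OK")
```

Nothing about this is Google’s code, but it shows how easily ‘lazy’ pages can be picked out once you have a list of tell-tale characteristics.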

Then Came (or Comes) the Semantic Web

Both Penguin and Panda (in their various forms – they kept being altered to make them tougher to fool) were attempts by Google to ensure that the sites they listed were the very best (for any given search phrase), but they were not perfect and time after time the SEO community found ways around them.

Google could see this was going to be a never-ending battle, so it turned its attention to creating a set of rules that could not be fooled. This rule set looks not just at pages and links, but at the overall ‘meaning’ of a site (this is what Semantic SEO is all about): what it is really all about, what problems it is trying to solve and what services it provides.

The Current State of Play

Once Google has perfected this rule set, it is pretty certain that only the best sites and pages will appear at the top of its rankings. BUT, and it is still a BIG BUT, they are far from achieving this at the moment; you only have to look at the current top results to see that in many cases there are far better sites that should be occupying those coveted first page positions.

This of course is extremely annoying for any site owner who KNOWS their site deserves better treatment, but until Google (for one) really does implement this strategy in full, all you can do is your best to promote your site, and that means carrying out work both on and off the page.

SEO – What You Need To Do

There are two main areas that need to be handled in the right manner: one is making sure that the pages are full of useful content, the other making sure that the site has enough links so that Google ‘believes in it’ enough to rank it. This is, as you can imagine, quite a wide topic, but one that we cover in great detail.

So, please see the other pages of this site for more details on what Search Engine Optimisation is all about and the services we offer at Serendipity Online Marketing Ltd.

SEO Poem (Created for a local business networking group)

We all know that there is more to marketing and selling your products than using the Search Engines to bring in traffic and sales (although this is obviously a VERY useful channel), and with this in mind I do a certain amount of networking with local businesses.

The challenge with such events is to ‘stand out’ from the crowd and be remembered for what you do, and to a certain extent for who you are (having a memorable personality is useful here).

With this in mind, and as there was already a ‘poet’ in the group, I wrote this little poem that I hope shows just what we do…

“I am a little website
  As pretty as can be

But no one it seems
  Wants to visit me..

Mr Google came a few times
  That is very true

But he seemed not to understand me
  And left without even a to do!

So I sit here just a twiddling
  My little website thumbs

When what I really want
  Is lots of new friendly chums

But how can this goal be reached
  Just what am I to do

If I go on like this for much longer
  I’ll end up feeling really blue..

Of course I get lots of calls
  All a promising me

A first page position on Google
  All for a very reasonable fee

But I have heard they’re all rubbish
  Promising more than they can give

Some say it’s just cheating
  Whilst others say it’s just one big fib

But there is light in the tunnel
  I’m sure that you’ll agree

It’s talking to Graham Baylis
  Now there’s someone who’s worth their fee!

He checks to see what is wrong
  A delving oh so deeply

Coming back with so much data
  That I’m afraid it makes me sleepy..

But the good news is that I
  Just have to listen a little

For Graham knows what to do
  And soon goes into battle!

Soon Mr Google understood me
  And now when he comes along

He no longer leaves straight away
  So I know that at last I belong

Next a sprinkle of links are added
  To make a bigger impression

And soon I see the results
  With many a friendly session

So if you know another website
  That is feeling lost in the crowd

Do call in Mr Graham
  I know he’ll do them proud…”

Top Tips on How NOT to Do Link Building for SEO in 2016

At Serendipity Online Marketing, we are always amazed at the paradox that is SEO, particularly as far as Google is concerned…

Recently we have seen that Google wants rich content, loving, it seems, pages that are 1,000-plus words long (which is a lot of text, believe me), whilst at the same time saying that the use of ‘tabbed content’ (where the words are there for SEO but are not visible until called into being, so to speak, by the user clicking a button) is frowned upon.

The reason this is strange is simple: Google also wants sites to offer a great user experience, and one requirement would seem to be at odds with the other, in that tabbed content stops pages looking like a ‘wall of words’, which is harder to read.

Linking – Another Paradox

The issue with links is perhaps even more strange in that Google ‘needs’ links to sites as they use them to decide what is good content or not, and yet they do not really want anyone to build them (in any artificial manner) as they want them to grow ‘naturally’ as people find and link to the content on a site.

All this is fine, but there is a large chicken-and-egg question here: for new sites at least, how do people find the content (so that they can link to it) in the first place? Without links the site is not likely to get good rankings, and without rankings it cannot attract links, which means it cannot improve its rankings…

Of course, other means can be used to get people to a site so that they can ‘like and link to’ its content, perhaps using Google AdWords…

However, there is a school of thought that says that building links the right way (as long as there is something good to point to, which means also attending to the content of the site) is a GREAT way of boosting rankings, and the tests and experience we have prove this 100%: building links the RIGHT WAY ALWAYS increases traffic.

BUT, and it is a big BUT, there is a wrong way of building links, and if you do too much of any of the things below, you are more than likely steering a course for disaster, at least as far as getting rankings, and thus traffic, from the Search Engines (particularly Google) is concerned.

For the full article on how not to build links click the link!

Link building has had a rough year. Thanks in part to Google’s John Mueller’s comments that link building, in general, is a strategy to avoid, a number of SEO practitioners have moved away from the practice.

Naturally Acquired Links are Best

More specifically, they’ve flocked to a more natural form of link building involving the creation, syndication and promotion of thoroughly researched original content; the idea here is to attract or earn links naturally without ever manually building a link on an outside source.

I’m a big fan of this approach. It’s safe, natural and can earn you a ton of links if your content is good enough. However, I still believe there’s a place for manual link building — as long as your focus is on providing valuable content to your readers.

Good and Bad Links

So what exactly differentiates a “good” link from a “bad” link in Google’s eyes? How can you be sure that a link you’ve manually built isn’t just going to get your website penalized?

As long as you can avoid these seven characteristics, all of which can make a link “bad,” you’ll remain in good standing:

1. It’s On A Low-Authority Or Questionable Domain

The higher your site’s authority is, the higher you’re going to rank in Google. Links on already-high authority sites pass far more authority to your site than those on low-authority sites. If your link appears on a site with a poor reputation, it could do active harm to your organic search visibility.

Generally, unless you’re perusing spam sites or blacklisted pages, you won’t have to worry much about this. Google looks for patterns that it can verify with a high level of certainty, so a single low-authority link won’t hurt you; but hundreds or thousands sustained over the course of a month or more certainly could.

Overall, it’s in your best interest to get links on the highest-authority sources you can find, while avoiding disreputable ones.

2. It’s Pointing To A Source Irrelevant To Its Content

Context is important in Google’s modern search algorithm. It’s not enough to have a link pointing to your site — that link needs to be associated with content that’s somehow relevant to your site, as well.

For example, if you’re a manufacturer, and you post a link to your site in an article about hamburger production in a butcher shop forum, chances are it will raise some red flags.

Keep all your links context-specific, and pay close attention to the types of sources you rely on — the closer they are to your industry, the better.

3. It’s Repeated Too Many Times On The Domain

Quantity is important when it comes to links, but more links isn’t always better. Diversity is also important. If Google sees too many links pointing back to your domain on a certain site, it may flag that as suspicious.

Instead, Google likes to see lots of links pointing to your domain from multiple sources. Since each link after the first on a single domain suffers a downgrade in value, it’s in your best interest to diversify your link sources.

4. It’s A Part Of A Reciprocal Exchange

If you have a buddy who owns a similar site, it might seem like a clever idea to exchange links between the two in an effort to boost both your domains. However, Google can recognise reciprocal linking patterns, and large-scale ‘you link to me, I’ll link to you’ exchanges are treated as link schemes, so any benefit is likely to be discounted, or worse.

5. It’s Embedded In Suspiciously Keyword-Matched Anchor Text

Back in the days when keyword-focused optimization was synonymous with SEO, anchor text for links was a big deal. It was a best practice to embed your link in anchor text using the exact keyword you wanted to rank for — today that isn’t going to work.

6. It’s Isolated From Any Meaningful Content

Posting any kind of link without content accompanying it is bad—it doesn’t matter if you do it in a blog comment, forum post or any other medium.

Your links need to have some kind of semantic context to them, and preferably in the body of a detailed, meaningful post. Guest posts on outside blogs are your best friends here. Use them.

7. It’s A Part Of A Scheme

Link schemes aren’t as popular as they used to be, but somehow they’re still floating around. Participating in complex systems like link wheels or link pyramids is a violation of Google’s terms of service.

If you’re caught deliberately participating in a link scheme, you won’t just drop a rank or two — you could earn a bona fide Google penalty.

Final Thoughts

Thoroughly comb through your existing link profile to make sure none of your links possess these seven characteristics. You can use Moz’s Open Site Explorer, Ahrefs, Majestic or any other tool that functions as a search engine for links.

If you notice any that are questionable, work to remove them. It’s far better to remove a dubious link than leave it and suffer the potential consequences. Then, put safeguards in place to ensure your future link building efforts avoid these factors at all costs.
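As a practical (and deliberately simplified) illustration of that kind of audit, the sketch below filters an exported backlink list against a few of the seven characteristics above. The field names, sample data and thresholds are all invented; a real export from Ahrefs, Majestic or a similar tool will look different and need mapping to this shape:

```python
from collections import Counter

# Hypothetical backlink export: (linking domain, domain authority 0-100,
# anchor text, topic of the linking page).
backlinks = [
    ("news-site.example", 72, "Acme Widgets", "manufacturing"),
    ("spammy-directory.example", 4, "cheap widgets", "unrelated"),
    ("spammy-directory.example", 4, "cheap widgets", "unrelated"),
    ("trade-blog.example", 41, "widget buying guide", "manufacturing"),
]

MY_TOPIC = "manufacturing"
MONEY_KEYWORD = "cheap widgets"

per_domain = Counter(domain for domain, *_ in backlinks)

for domain, authority, anchor, topic in backlinks:
    reasons = []
    if authority < 10:                    # characteristic 1: low-authority domain
        reasons.append("low-authority domain")
    if topic != MY_TOPIC:                 # characteristic 2: irrelevant source
        reasons.append("irrelevant linking page")
    if per_domain[domain] > 1:            # characteristic 3: repeated on the domain
        reasons.append("multiple links from the same domain")
    if anchor.lower() == MONEY_KEYWORD:   # characteristic 5: keyword-matched anchor
        reasons.append("exact-match keyword anchor")
    if reasons:
        print(f"Review link from {domain} ('{anchor}'): {', '.join(reasons)}")
```

Anything the script flags is a candidate for removal or, failing that, for Google’s disavow process.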

Google’s RankBrain – Is It Really That New?

I came across an article about a big Google announcement concerning something called ‘RankBrain’, which is said to be all about ‘machine learning’: the process by which a computer program ‘learns’ the right and wrong ways to ‘do things’.

Google’s RankBrain (image: public domain, from Pixabay)

This sort of thing is not new at all; computer devices that can learn the route through a maze are one example. As soon as they recognise the ‘maze’, they can quickly navigate their way through it.

Google has been using this sort of system for years in one form or another anyway. Take, for example, the huge test they ran on sites around the world. In this test, they rated sites for ‘usefulness’ and at the same time checked and noted certain information about each site’s pages.

Then, later, they checked to see whether there was any correlation between the sites rated as ‘not very useful’ and on-page factors, such as the way things like the Meta Title were used.

It was no surprise when they found that the ‘poor’ sites lacked detail in certain areas. Using this data, they could therefore, with some certainty, deduce how useful a site is likely to be simply by comparing it against the list of characteristics that the ‘poor’ sites demonstrated.
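To make that idea concrete (this is a toy sketch of the principle only; the characteristics and weights are invented, not anything Google has published), a page could be scored against a checklist learned from the ‘poor’ sites:

```python
# Traits that, in this invented example, the human-rated 'poor' sites tended
# to share, each with a penalty weight.
POOR_SITE_TRAITS = {
    "missing_meta_title": 3,
    "missing_meta_description": 2,
    "under_250_words": 3,
    "no_images_or_video": 1,
    "not_updated_in_a_year": 2,
}

def predict_usefulness(page_traits):
    """Return a crude 0-10 usefulness estimate from a set of trait flags."""
    penalty = sum(weight for trait, weight in POOR_SITE_TRAITS.items()
                  if page_traits.get(trait, False))
    return max(0, 10 - penalty)

example_page = {
    "missing_meta_title": True,
    "under_250_words": True,
    "no_images_or_video": True,
}
print("Estimated usefulness:", predict_usefulness(example_page), "out of 10")
```

The ‘learning’ part is simply that the trait list and the weights come from comparing rated sites, rather than being hand-written rules.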

This is machine learning, but it seems that the new system will take things one step further. At the moment you only see the sites that are marked well ‘out of 10’, the really poor ones not getting (or even being considered for) a ranking.

But with the new algorithm, more sites may be considered when it comes to matching search terms with sites. Or it could be that, by learning which sites are best received for any phrase (that is, the visitor does not bounce back to the listings after visiting a site), Google can in future list sites in a better way.

Rats in Google’s Maze

However, if it is the latter, then even this is not new; Google has for some time been using its users as ‘rats in a maze’.

The process is really quite simple, if vast (Google handles some 3.5 billion searches a day), and goes like this:-

(1) Google has already matched sites to phrases to some degree; that is, it has a method whereby, using current data, it can return a list of sites (very quickly) for any search term.

Users are thus presented with a list of sites to choose from for any search term. This list will be subtly different for each user because of ‘personalisation’ and Factor X, this being the way in which Google includes new sites every now and then to test users’ reactions. This Factor X is important for (3).

(2) Google then sees what people click on from the listings. If a site/page gets few or no clicks (a poor Click Through Rate, or CTR), it may be removed from the listings for certain terms. However, if it does get clicks, Google then checks to see whether people ‘bounce’ back to Google to try another site.

(3) If they bounce back in sufficient numbers, Google ‘knows’ that the site is not a good match for that phrase. Good matches are allowed to remain in the listings; bad ones are removed. This is why the addition of some new sites in the listings is important, as it widens the pool of data and helps Google to ‘understand’ what is ‘behind’ a user’s search term and why it is being used, this in turn being deduced from what seems to satisfy the request…

Of course this takes time, many thousands of searches being required before any decision can be made. Google, however, has the time and the resources to do this, and the entire process must be considered a form of ‘machine learning’, Google ‘learning’ to tell what is a good match and what is not.
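A toy version of that feedback loop (the figures and thresholds below are made up, and Google’s real system is vastly more sophisticated) might aggregate the click and bounce data like this:

```python
# Hypothetical behaviour data for one search phrase: how often each listed
# page was shown, clicked, and bounced straight back from.
behaviour = {
    "https://good-match.example/guide":  {"impressions": 10_000, "clicks": 900, "bounces": 180},
    "https://poor-match.example/advert": {"impressions": 10_000, "clicks": 150, "bounces": 130},
}

MIN_CTR = 0.03          # invented: below this, searchers ignore the result
MAX_BOUNCE_RATE = 0.60  # invented: above this, visitors did not find what they wanted

for url, stats in behaviour.items():
    ctr = stats["clicks"] / stats["impressions"]
    bounce_rate = stats["bounces"] / stats["clicks"]
    if ctr < MIN_CTR:
        verdict = "drop from the listings for this phrase (nobody clicks it)"
    elif bounce_rate > MAX_BOUNCE_RATE:
        verdict = "drop from the listings for this phrase (visitors bounce back)"
    else:
        verdict = "keep - it seems to satisfy the search"
    print(f"{url}: CTR {ctr:.1%}, bounce rate {bounce_rate:.0%} -> {verdict}")
```

Run over thousands of searches, exactly this sort of aggregation lets a system ‘learn’ which results satisfy a phrase without anyone ever writing an explicit rule for it.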

All of this means that RankBrain may not be that new after all.

The article that sparked off mine is included in part below. For the full article on Google’s RankBrain please click the link.

All About The New Google RankBrain Algorithm

Google’s using a machine learning technology called RankBrain to help deliver its search results.

Here’s what we know about it so far.

Yesterday, news emerged that Google was using a machine-learning artificial intelligence system called “RankBrain” to help sort through its search results.

Wondering how that works and fits in with Google’s overall ranking system? Here’s what we know about RankBrain.

The information covered below comes from three sources. First, the Bloomberg story that broke the news about RankBrain yesterday (see also our write-up of it). Second, additional information that Google has now provided directly to Search Engine Land. Third, our own knowledge and best assumptions in places where Google isn’t providing answers. We’ll make clear where any of these sources are used, when deemed necessary, apart from general background information.

What Is RankBrain?

RankBrain is Google’s name for a machine-learning artificial intelligence system that’s used to help process its search results, as was reported by Bloomberg and also confirmed to us by Google.

What Is Machine Learning?

Machine learning is where a computer teaches itself how to do something, rather than being taught by humans or following detailed programming.

What Is Artificial Intelligence?

True artificial intelligence, or AI for short, is where a computer can be as smart as a human being, at least in the sense of acquiring knowledge both from being taught and from building on what it knows and making new connections.

True AI exists only in science fiction novels, of course. In practice, AI is used to refer to computer systems that are designed to learn and make connections.

How’s AI different from machine learning? In terms of RankBrain, it seems to us they’re fairly synonymous. You may hear them both used interchangeably, or you may hear machine learning used to describe the type of artificial intelligence approach being employed.

So RankBrain Is The New Way Google Ranks Search Results?

No. RankBrain is part of Google’s overall search “algorithm,” a computer program that’s used to sort through the billions of pages it knows about and find the ones deemed most relevant for particular queries.

What’s The Name Of Google’s Search Algorithm?

It’s called Hummingbird, as we reported in the past. For years, the overall algorithm didn’t have a formal name. But in the middle of 2013, Google overhauled that algorithm and gave it a name, Hummingbird.

So RankBrain Is Part Of Google’s Hummingbird Search Algorithm?

That’s our understanding. Hummingbird is the overall search algorithm, just like a car has an overall engine in it. The engine itself may be made up of various parts, such as an oil filter, a fuel pump, a radiator and so on. In the same way, Hummingbird encompasses various parts, with RankBrain being one of the newest.

In particular, we know RankBrain is part of the overall Hummingbird algorithm because the Bloomberg article makes clear that RankBrain doesn’t handle all searches, as only the overall algorithm would.

Hummingbird also contains other parts with names familiar to those in the SEO space, such as Panda, Penguin and Payday designed to fight spam, Pigeon designed to improve local results, Top Heavy designed to demote ad-heavy pages, Mobile Friendly designed to reward mobile-friendly pages and Pirate designed to fight copyright infringement.

I Thought The Google Algorithm Was Called “PageRank”

PageRank is part of the overall Hummingbird algorithm that covers a specific way of giving pages credit based on the links from other pages pointing at them.

PageRank is special because it’s the first name that Google ever gave to one of the parts of its ranking algorithm, way back at the time the search engine began in 1998.