Research into the FRED Google update, confirming why sites lost rankings.

The Fred Update by Google caused quite a ripple in the SEO world, with many sites losing rankings, and hence traffic, by up to 90% in some cases. I have been doing quite a bit of digging and have asked some gurus some pointed questions about what has happened and why.

The overall thinking on the matter is that Google penalised sites that had poor content, or ones that were simply there to make money and gave nothing back to the visitor in the form of useful data or information.

User Experience is Another Factor

Other thoughts on the matter were more to do with the user experience that a page gave its visitors. Here the sites said to be hit included those that placed their copy below the fold of the screen, or in some cases had very slow load times.

However, in some cases sites were hit that were not just ‘out to make money’, but seem to have been ‘lumped in’ with those that are because of the lack of content on their pages.

Having a Lot of Links Did Not Save Sites

There was also talk that Fred checked the quality of the links to sites too. This may turn out to be the case; further research is needed on this matter. However, what we can say is that sites that fell foul of Fred’s on-page quality checks were not saved by having a lot of links. Instead, their positions were taken by sites with inferior linking profiles, at both page and domain level.

This research only covers 9 sites, so it can hardly be said to be definitive, but the evidence so far is remarkably consistent. Further research into the sites that were affected but did not fit the profile of sites that ‘should have been affected’ (by Fred) is the next step. More on the ‘efficiency’ of Fred later.

The FRED Data

In each case, the sites that held a first page rank for a given term before Fred were compared with the sites that hold the first page now (for that term). The sites that had lost their first page rank (they had to have held a position of 7 or better pre Fred) were then checked, with a view to seeing what could have caused them to lose their rank, and whether this fitted the profile of sites that Fred ‘should have hit’.

The phrases checked covered a range of topics, from ‘lqf fruit’ to ‘chemical companies’, so should be diverse enough to give some firm data.
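Expressed as a tiny script, that comparison step looks something like the sketch below; the domains and positions are made up purely for illustration.

```python
# A tiny sketch of the comparison step: given the first-page results for a
# term before and after Fred, list the sites that dropped off page one.
# All domains and positions here are hypothetical.

def dropped_sites(pre_fred: list[str], post_fred: list[str]) -> list[tuple[int, str]]:
    """Return (pre-Fred position, domain) for each site no longer on page one."""
    still_there = set(post_fred)
    return [(pos, site) for pos, site in enumerate(pre_fred, start=1)
            if site not in still_there]

pre = ["site-a.com", "site-b.com", "site-c.com"]   # positions 1..n before Fred
post = ["site-b.com", "site-d.com", "site-e.com"]  # positions 1..n after Fred

for pos, site in dropped_sites(pre, post):
    print(f"Lost first-page rank (was #{pos}): {site}")
```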

Search Phrase ‘lqf fruit’

Before and After FRED

Screenshot: the Google results before and after Fred.

Here two sites lost their first page rank:-

Site 1

Screenshot of the site: not enough text for Fred.

This site previously held a rank of 5, and when checked, we saw that the page shown when you clicked the link was https://www.thespruce.com/what-does-iqf-mean-995719, a page not even on the listed domain, something that is sure to annoy Google to start with. Furthermore, this page had very thin content and seemed to be there only to provide a place for Google Ads and other advertisements. Being a prime target for Fred, it is not surprising to see that it was hit.

Site 2

Screenshot: content too thin (the fruitbycrops site).

Again a site with very thin content, just 155 words with an advert at the very top; again, a prime target for Fred.


Search Phrase ‘chemical companies’

Before and After FRED

Screenshot: the results for the term before and after Fred.

Again two sites affected:-

Site 3

This is a big website with a lot of links, some 222,000 to the domain (although only 3 to the page itself). The reason it lost its rank seems to be that the page in question was just not relevant enough to the term, the topic being covered by just one short item on the page.

Site 4

Screenshot: an example of a penalised site.

Was this site penalised because its copy was not ‘good enough’? That seems most likely.

Another page that held just a small amount of what I would call ‘filler text’, not really ‘saying anything’ (at least in my view), the total length being just 251 words. Again, a prime target for the Fred update.


Search Phrase ‘welding supplies uk’

Before and After FRED

Screenshot: the results from Google, pre and post the Fred update.

Two sites here:-

Site 5

Screenshot: the Weldingshop site, one of the many hit by the Fred update.

This site is not that bad in reality, although some may think it a bit old fashioned, and it is certainly not as bad as many that do hold onto first page ranks. The most likely cause of the page’s loss of rankings is that the main copy is only 340 words long, which leads me to think that the length of the copy falls below the ‘satisfactory’ level laid down in the Google Quality Guidelines.

Site 6

Screenshot: too little copy, and what there is sits below the fold – possible reasons for the site being hit by Fred.

This page previously held a rank of 7. Again the small amount of copy is the likely cause of the drop, only 270 words being on the page, all of it below the fold, a placement that Google stated (back in 2012) degrades the value of any copy.

Search Phrase ‘metal fabricators’

Before and After FRED

Screenshot: the Google results pre and post the Fred update.

Three sites lost their ranks for this phrase:-

Site 7

Screenshot: too few words for Fred? Another site hit, more than likely due to the small amount of copy.

Yet another page that lost its ranks, apparently down to the lack of content, the copy amounting to just 154 words.

Site 8

Screenshot: a page with over 600 words, but with the text below the fold – a possible cause of a Fred hit.

This site had a rank of 4 before Fred, and does have a fair number of words, over 600 in all. However, 90% of the copy is below the fold on the screen, and this looks to be the reason for the drop.

Site 9

Screenshot: yet another site hit by Google’s Fred.

This page lost its 6th position, again a ‘low volume of copy’ casualty, the copy amounting to just 170 words.


Conclusion

In all cases we can see that the sites affected by Fred did seem to fit the patterns suggested by the gurus and by other research, in that they mostly had very thin copy or ‘hid’ the copy below the fold on the page.
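For anyone wanting to repeat the word count checks used above, a minimal sketch follows. The URL is a placeholder, the 300-word threshold is my own illustrative cut-off (not a figure published by Google), and the requests and beautifulsoup4 libraries are assumed to be installed.

```python
# A minimal sketch of the thin-content check used throughout this research:
# fetch a page and count the words in its visible text. The URL is a
# placeholder and the 300-word threshold is illustrative, not Google's.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def visible_word_count(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # strip content the visitor never sees
    return len(soup.get_text(separator=" ").split())

words = visible_word_count("https://example.com/some-page")
print(words, "words", "- a possible Fred target" if words < 300 else "")
```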

The next step is to see whether the pages we currently look after (SEO-wise) that also suffered a drop in rankings fit this pattern too.

Watch out for another report on this later in April.

How Should You Position Your Web Content?

We were approached by Tracy at UKWebhostreview.com and asked if we would like to feature an infographic on how to position web content on a site to get the very best effect. This has always been an important topic, BUT, after the Google Fred Update, anything that improves the User Experience is something that deserves serious consideration. So, we were more than happy to host this post and hope that you find it as useful as we have.

Guest Post from UKHostReview on Positioning Web Content

If you’re asking this question then you are already thinking a lot more deeply about your online marketing than a large proportion of website owners. People can often get caught up in getting a website set up quickly, or in concentrating on which web host to go for and other aspects of website building.

Infographic on how to place web content, supplied by UKwebhostreview.

When this happens, some of the other important considerations like content positioning can be neglected, which will result in a website that isn’t as effective as it should be. When we talk about website effectiveness, the key measure that most people will be interested in is driving increased customer sales. If you are setting up a business website then one of your main priorities should be to get the positioning right on your website. This can seriously be the determining factor in how many sales your business is making, so should be treated as a top priority for you.

If you’re not an expert in developing content or positioning content for maximum effect, then you will probably find this infographic from James at UKwebhostreview.com of great use. It lists the 25 features that every online business must have in 2017, so as you can probably tell from the title it is a very comprehensive list. It shows you exactly where to add your key features like call to action button or logo with tagline. You can also use the list of features to check that you have remembered to include every essential item of content that a good website requires.

Whatever stage of website set up you are at, whether you are only just beginning or you have had your website set up for some time, you should use these 25 features as a guideline for how to structure your website content to drive the best results.

6 Social Commerce Trends You Should Know About

It’s great to know that your own posts get found and read, and in some cases this leads to even better things: one reader told us about an article they had written that provided even more information on a topic we had been featuring, namely ‘Social Media’s power in the Ecommerce World’.


We checked out the article and found it very interesting, Social Media now being an area that no one can totally ignore. At the moment, Serendipity really only uses Social Media to boost the power of the links we create. BUT that is all set to change soon, as we plan to offer a level of Social Media marketing to our customers.

Anyway, read on, and if you want to see the full article on Social Media Marketing, click the link!

Just a few decades ago, advertising only showed up in a few channels, such as television, radio, and billboards. Companies who wanted to increase sales had to shell out a significant amount of cash to get their products in front of people and there was no way they could guarantee their ads would get traction.

Social media has completely revolutionized the way commerce happens. Now masses of people spend hours on social media platforms, consuming content in astronomical quantities. Companies that want to drive sales have to be innovative in their social media tactics.

Every year, these tactics change as social media platforms release new options. What trends can we expect for 2017? Here are 6 you need to know about.

It’s All About Those Videos

Have you noticed that video is everywhere?

Facebook, YouTube, Instagram, and Periscope have all released the option of live streaming video. Additionally, all these platforms offer the ability to create video ads. YouTube even has shoppable ads before videos. It also allows companies to create simple calls to action so that viewers can purchase their products.

Expect to see more companies tapping into the power of live, shoppable video even more in 2017.

Companies like QVC and the Home Shopping Network have long demonstrated that live video can generate huge amounts of sales. Now almost anyone can create live videos in which they demonstrate and sell products.

Because these videos can be so highly targeted, they represent a massive opportunity for advertisers.

If the live video trend continues, we should expect to see almost every company selling their products live on social platforms.

Cashing In On Those Impulses

Marketers have long tried to tap into impulsive buying. Whether that’s encouraging consumers to call immediately or offering a limited-time discount, impulse buying has always been deeply integrated into the shopping experience.

However, impulse buying is increasing at a staggering rate with social networks.

Platforms like Instagram and Facebook allow consumers to make purchases without ever leaving the platform. And while not exactly a social platform, Amazon has one click ordering to make it all the easier to purchase without thinking.

Companies know that impulse shopping can drive a huge amount of revenue and are doing everything in their power to make it as simple as possible for customers to purchase without thinking.

In 2017 we should expect to see more and more companies implementing impulse buying options across social media platforms.

Pinterest, for example, isn’t just a place for posting recipes and interior decorating ideas. They now offer a “Buy Now” button which allows consumers to make immediate purchases from the platform.

Considering that a massive amount of Pinterest users visit the site for product-related ideas, it’s a huge opportunity for marketers.

Smart Scientific SEO Strategies for 2017

It’s been a fair few weeks since we managed to post anything on our blog, and frankly I’m amazed at how fast the year has gone so far, at the rate at which things seem to be changing, and at the amount of really useful software that has become available.

The post we’ve highlighted today (see below) comes from a series published by a well respected Web Design and SEO company called AimInternet. It is certainly a useful piece and highlights the fact that the information in Google’s Webmaster Tools (now called Google Search Console) is very useful. The main reason I say this is that Google (for reasons of privacy, they say…) stopped reporting the keyword phrases used by visitors to a site in Analytics. You can tell visitors come from Google, but not what search words they used. All very annoying when trying to work out which words are converting and which are resulting in a high bounce rate.

Google Search Console fills this gap, to a degree, in that it gives you a good idea of the phrases being used, the number of times a phrase has resulted in someone seeing a Google listing for the site, the Click Through Rate (very useful, this, as it gives you an idea of whether your Title and Meta Description are well tuned to get clicks), as well as the average position in Google. But it does not tell you which page searchers land on, or whether they stay or ‘bounce’.

You can start extrapolating the data to make some intelligent guesses about what is going on (there is software that will do this for you), but they are only guesses (you could always run an AdWords campaign to check, but that is another story).
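To make the idea concrete, here is a rough sketch of that kind of extrapolation: reading a query report exported from Search Console as CSV and flagging phrases that get plenty of impressions but few clicks, the classic sign of a Title and Meta Description that need tuning. The column names are assumptions; real exports vary.

```python
# A rough sketch, not a definitive tool: flag queries with plenty of
# impressions but a weak CTR in a CSV exported from Search Console.
# The column names ("Query", "Clicks", "Impressions", "Position") are
# assumptions; real exports vary.
import csv

def weak_queries(path, min_impressions=100, max_ctr=0.02):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions = int(row["Impressions"])
            ctr = int(row["Clicks"]) / impressions if impressions else 0.0
            if impressions >= min_impressions and ctr <= max_ctr:
                yield row["Query"], impressions, ctr, float(row["Position"])

for query, imps, ctr, pos in weak_queries("search_console_export.csv"):
    print(f"{query!r}: {imps} impressions, CTR {ctr:.1%}, avg position {pos:.1f}")
```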

But to get back to what the article is about.

Scientific Organic Search Strategy

In the article, AimInternet mention that they had increased the ‘number of keywords present’, by which I think they mean the number of different search phrases (or ‘Queries’ in Google Search Console speak) associated with a site. They made a big difference (something that we too pride ourselves on being able to achieve), increasing the number of associated phrases from 300 to 800. What this really means is that the ‘footprint’ of the site on Google has more than doubled, hence it is more likely to be seen and thus get a click! All very good.

The process by which they reached this point is covered in earlier posts, and no doubt they follow the same ‘scientific’ path as we do. If they do, they will first carry out research to find the words being used by people searching for their customers’ services and products. Then they will weave these into the site and construct content that supports the drive for rankings for the chosen target phrases.

What they ‘might’ not do is check on the sites that currently have the best positions for these target phrases and then ‘Reverse Engineer’ them. By following that path you ‘know’ the words that Google likes to see and can thus use them in the content. This system also gives you a list of all the similar words and phrases that should be used, which avoids keyword stuffing and gets the ‘message’ across to Google in the way that we know it likes.
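As a sketch of what that ‘reverse engineering’ might look like at its very simplest, the snippet below pulls the visible text from a handful of top-ranking pages (placeholder URLs) and keeps the terms they all share. Real tooling weighs phrases, synonyms and prominence, but the principle is the same.

```python
# A minimal sketch of 'reverse engineering' the top pages for a term:
# collect the words that all of them share. The URLs are placeholders.
# Requires: pip install requests beautifulsoup4
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

STOPWORDS = {"the", "and", "for", "that", "with", "you", "are", "this", "from"}

def page_terms(url: str) -> set[str]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    words = re.findall(r"[a-z]{3,}", soup.get_text(" ").lower())
    return set(words) - STOPWORDS

top_pages = ["https://example.com/a", "https://example.org/b", "https://example.net/c"]
counts = Counter()
for url in top_pages:
    counts.update(page_terms(url))

# Terms found on every top page are the ones Google presumably 'wants to see'.
shared = [term for term, n in counts.items() if n == len(top_pages)]
print(sorted(shared)[:30])
```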

Add some links (which themselves have to be intelligently added – there is software that helps with that now too) and the site WILL, like Eagle, be associated with more query phrases, get better rankings and thus more traffic.

But the trick is in carrying out each of these phases in a controlled scientific manner…

One very interesting point that Aim made is that once you have a list of the phrases that Google associates with a site, you should build on this and write content (about these phrases) that will make the site that bit more interesting and helpful. This will not only cement your position with Google but will no doubt improve the rankings for the site and, more importantly, give your readers more reasons to come back for more, and even, hopefully, buy from you.

They also make the point that visitors don’t always come in through the front door (the home page), so you should make your interior pages interesting too. This is not really new though, in that most of the pages on a site should be doing their best to engage with visitors by providing useful content, each page targeting a different set of keyphrases.

So a very interesting article.

To read the whole post on A Smart Organic Search Strategy please click the link

How We Use A Smart Organic Search Strategy To Get Our Clients On The First Page Of Google

This week we expand on looking at how to get your website on the first page of Google by using a smart organic search strategy.

In our last blog, we looked at the importance of getting on the first page of Google. And, we examined how our methods of using local marketing tools are driving traffic to the homepage – and producing fantastic results – for a client of ours. This week, we’ll expand on part of that methodology – using an organic search strategy to drive traffic to particular product pages or blog pages which then link through to specific product pages. We also do this via Adwords, although this is something we’ll look at in more detail in following blogs.

What Is An Organic Search Strategy?

In brief, an organic search strategy consists of finely keyworded product pages or blogs, which get picked up by Google each time one is published on a website. At this point, you might be thinking “I’ve already got all the information about the products or services I offer on one page of my site so I’ve nailed it, right?” or “I make rubber plugs, why the heck do I need a blog about those, who is going to read it?!”.

OK, so you might not be totally wrong about the last point (but hey, you never know, there might just be a rubber plug enthusiast out there who would LOVE to read your blog about them!).

Getting back to business…

Creating separate product pages on your site and posting blogs is all part of your organic search strategy. Simply, doing so creates more pages on your website containing the relevant keywords that you want your website to be found for, which Google can then index. The more relevant and unique pages and content you have on your site, the more shots on target you have at being shown on the first page of Google.

The important things to note here are relevant and unique. Google is smart and will penalise your site if you post up a load of duplicate pages and content. The same goes if you keyword stuff (make your content unintelligible by jamming in too many keyword phrases) your posts and pages.

We won’t go into it here but recommend that you take some time to familiarise yourself with good content practice. That includes following referencing protocols if you are using content from another site. For example, you might choose to do a blog post which rounds up the “5 best things about rubber plugs” and which uses information from other websites. That’s absolutely fine, but just remember to acknowledge and reference your sources correctly.

Why Do This?

How many pages are currently on your website? Probably not that many. So, if you currently have one page that discusses your 10 different products, by separating them out into individual pages you just added 10 extra pages to your site virtually overnight. You’ll be able to expand the content around each product, and so the mentions of the relevant keywords, too. Whereas on the original page you may have only listed the types of products you sell, you can now go into more detail about each one on its own page. This naturally allows for an articulate way of including more of your desired keywords on your site – avoiding the extreme no-no practice of keyword stuffing.

Google likes new and relevant content. Each page becomes a new way for traffic to come to your site. Of course, once the core pages of your site are done it’s likely that you won’t be updating those that often. Which is why, as part of any organic search strategy, we advise our clients to do regular blogging. And, in the case of blogging, the more regularly you post, the better.

Employing an organic search strategy such as this might mean that traffic enters your website not via the traditional route of arriving at the homepage. Instead it might enter on a product page or a blog post page written around a specific topic, which then links to a product page. Typically, we notice that customers will land on one of the product pages of our client’s websites, because of the organic search that we’ve set up for the client.

If you’re in the pressed parts trade you might do a search in Google for “copper plating”. Google will take into account your location (it gets this information from your settings) and present to you the most relevant results. Let’s say you’re Midlands based, as is EC Williams.

As a result of this search, people enter EC Williams’ site on the Copper Plating product page. Once on the page, you are presented with all of the information you need about “copper plating”, along with some important trust points about the company. Our analysis shows that from this entry point people then navigate to other pages on the site. From this example in particular, we can see that “zinc plating” is the next most popular page. Once on the website, this alternative page is easily found in the navigation bar, under “Plating Services”. From our research, most people stay on the “zinc plating” page, as they’ve found what they want. But if they want more depth, they’ll go on to “zinc nickel plating”.

The point of this is that once on the EC Williams’ website, the customer is presented with everything they need to make a purchasing decision. And, if you were that person looking for a company who were experts in the field of coating pressed-parts, then, bingo – you just found them.

Straight away, serious buying customers get a snapshot of relevant information once they are on the site. Because of the trade they’re in (pressed parts), they become interested in making an enquiry straight away. We’ve measured this extensively on EC Williams’ site, plus many others’, and know that it works. You need to make it easy for your customers to find information on your site, and this method works by doing just that. Everything has to be there for the user so that they’re not having to look for things too much.

How Organic Search Strategy Works

Most people will find you through a long-tail keyword search. These are keywords that tend to be more specific. Your website content should be driven by the keywords that your SEO advisor gives you. They need to advise your outsourced content providers of these keywords so that they can write content around them.

Take a look at www.eagleplastics.co.uk. They are another client of ours. Again, you can see that similar to www.ecwilliams.co.uk, everything a customer requires is there easy to find on the homepage, above the fold.

From an SEO perspective, when we started working with Eagle Plastics, the number of keywords we had to work with was much smaller than it is now. The site was receiving much less traffic than it does today, which meant that there were nowhere near as many clicks or impressions being recorded. This impacted the number of keywords being presented to us by Google. At the time we were only getting about 300 keywords presented, yet a year or so on, Google is now presenting 800 keywords.

This is as a result of the organic search strategy we have implemented, like that we discussed earlier. Traffic gets signposted to the Eagle Plastics website all based around these 800 keywords. And, now we have more of those, we can start creating content based on different keywords and keyword phrases.

Through testing the blogs, we are able to determine which keyword phrases are the most successful by analysing which ones get the most impressions.

On Eagle Plastics, “High Impact Polystyrene” is a key term for them. We know that this keyword phrase works well for them so we use it regularly in their blog headlines, in the h2 sub-headers and throughout the blog text. Of course though, we ensure we use it professionally and never keyword stuff.

As a result of this organic search strategy, we are providing more content to Google. This is recognised by them and results in Google starting to suggest more keywords which are relevant. We then create content based around these suggested keywords and their variations. As we post regular content which uses those keywords, Google views this as quality content and so provides us with even more relevant keywords. We then use these to continue to push the search and content strategy. The result is more traffic. But more than that, in getting more traffic, Google rewards you for quality content. And so it continues…

As little as five years ago, most searches were conducted using two keywords. Today people use an average of five words per search term. What was once a keyword search for “plugs” is now a more unique phrase like “the best luxury rubber plugs”. As you can see, the one-word keyword has become a keyword phrase made up of multiple words. Searches are now more unique, and these long-tail keyword phrases more specific.

Ultimately, it’s important to remember that every keyword search represents an intent by someone to find some information out. Long-tail keywords help you to better address that user intent by creating unique tailored content.

Statistics show that of 3 billion searches a day, 20% are unique. That’s a heck of a lot of unique searches – and to get displayed on the first page of Google, you need a successful organic search strategy to be found amongst all of that noise.

SEO Ho Ho – Search Engine Optimisation in 2016 – Xmas Message

The year is nearly at an end and Xmas has been and gone, but there is still a lot of cheer in the air and pleasant memories of all the festivities to boot, amongst them our company Xmas card (see the image below), which went down very well with our customers.

Image: our ‘SEO Ho Ho’ Xmas card.

But there are other reasons to be grateful about 2016, in that in my view Google has made some really good moves to make the results fairer and more accurate, the latest Penguin update really sorting things out.

This has been somewhat of a relief to SOM, as we have been ‘preaching’ what we call ‘proper’, ‘scientific’ SEO. What we mean by this is that we research the words that people are searching for in a market area, then find the words that Google ‘wants to see’ for these phrases so that they can be incorporated into the copy. Then we add some relevant links (with a natural anchor text and source type mix) and, hey presto, things start to happen.

The best part of this is that it is all totally ‘Google legal’ and can never, in our view, be subject to any penalties that Google may dream up at some time. We can say this as all we are trying to do is make sure that any site we optimise offers some of the best information there is on a given subject, and of course we make sure that there are enough links to the site’s pages so that Google thinks the same. We call this link building programme ‘priming the pump’, as once the site gets traffic the links will start building organically. Link building is still required in many cases, but perhaps only because others are trying to get their sites’ rankings higher too…

As to the blog post we have included below, we certainly agree about the rise of AI and believe that Google searchers have for some time been ‘rats in the Google maze’, in that Google has been analysing what we click on and what sites we like, thus getting closer and closer to its goal of truly understanding the real intent behind a given search term.

The other interesting thing raised here is the increased importance being given to mobile search, not really surprising when you realise that people are accessing the web using mobile devices more and more.

For 2017 we see it as more of the same, Google getting cleverer and cleverer at telling the good sites (the ones that deserve rankings) from the ones that don’t, all of which means you just have to ‘do SEO properly’ or suffer the consequences…

To see the full article on SEO in 2016 and some predictions for 2017 please click the link.

What we’ve learned about SEO in 2016

Since the inception of the search engine, SEO has been an important, yet often misunderstood industry. For some, these three little letters bring massive pain and frustration. For others, SEO has saved their business. One thing is for sure: having a clear and strategic search strategy is what often separates those who succeed from those who don’t.

As we wrap up 2016, let’s take a look at how the industry has grown and shifted over the past year, and then look ahead to 2017.

A growing industry

It was only a few years ago when the internet was pummeled with thousands of “SEO is Dead” posts. Well, here we are, and the industry is still as alive as ever. SEO’s reputation has grown over the past few years, due in great part to the awesome work of the real pros out there. Today, the industry is worth more than $65 billion. Companies large and small are seeing how a good search strategy has the power to change their business.

As search engines and users continue to evolve, SEO is no longer just an added service brought to you by freelance web designers. With the amount of data, knowledge, tools and experience out there, SEO has become a power industry all on its own.

Over the course of the year, my agency alone has earned a number of new contracts from other agencies that are no longer able to provide their own search efforts. A large divide between those that can deliver SEO and those that can’t is beginning to open up across the board.

The rise of AI

Artificial intelligence (AI) is now prevalent in many of our lives. Google, IBM, Amazon and Apple are very active in developing and using Artificial Narrow Intelligence (ANI). ANI can be used to automate repetitive tasks, like looking up product details, shipping dates and order histories and performing countless other customer requests.

The consumer is becoming more and more comfortable with this technology and has even grown to trust its results. Sundar Pichai, Google CEO, announced during his Google I/O keynote that 20 percent of queries on its mobile app and on Android devices are voice searches.

RankBrain, Google’s machine-learning artificial intelligence system, is now among the top three ranking signals for Google’s search algorithm. Why? Google handles more than 3.5 billion searches per day, and 16 to 20 percent of those are unique queries that have never been searched before. To handle this, the team at Google has harnessed the power of machine learning to help deliver better results.

While we can’t “control” RankBrain, what we can do is learn more about how Google is using it and then help the tool by creating good content that earns shares and links, building connections with others in our niche or related niches, and building trust in very targeted topics.

We are still in the beginning stages of this technology, but as more and more homes become equipped with smart tools like Amazon Echo and Google Home, we can be sure that these tech giants will use the knowledge they gain from voice search to power their AI technology.

The “Google Dance”

Every so often, Google likes to surprise us with a major algorithm update that has a significant impact on search results — some years we get one, and other years we get a little more.

While they do make nearly 500 tweaks to the algorithm each year, some are big enough to garner more attention. Let’s look back at four of 2016’s most memorable updates.

Mobile-friendly algorithm boost

A little under a year after “Mobilegeddon,” an event marked by the launch of Google’s mobile-friendly ranking algorithm, the search giant announced that it would soon be increasing the effects of this algorithm to further benefit mobile-friendly sites on mobile search. That boost rolled out on May 12, 2016, though the impact was not nearly as significant as when the mobile-friendly ranking algorithm initially launched.

Penguin 4.0

While this ended up being a two-phase rollout, Penguin 4.0 made its entrance on September 23, 2016. This has been considered the “gentler” Penguin algorithm, which devalues bad links instead of penalizing sites. The second phase of Penguin 4.0 was the recovery period, in which sites impacted by previous Penguin updates began to finally see a recovery — assuming steps were taken to help clean up their link profiles.

“Possum”

While this update was never confirmed by Google, the local SEO community noted a major shake-up in local pack and Google Maps results in early September 2016.

Fellow Search Engine Land columnist Joy Hawkins noted that this was quite possibly the largest update seen in the local SEO world since Pigeon was released in 2014. Based on her findings, she believes the update’s goal was “to diversify the local results and also prevent spam from ranking as well.”

Divided index

As mobile search continues to account for more and more of the global share of search queries, Google is increasingly taking steps to become a mobile-first company. In November, Google announced that it was experimenting with using a mobile-first index, meaning that the mobile version of a website would be considered the “default” version for ranking purposes instead of the desktop version:

“To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results.”

The time to say goodbye to 2016 is fast approaching, and I am truly excited to see what 2017 has in store for the world of SEO!

The Importance Of Hiring The Right SEO Firm

As website owners, our desire is to see our website reach the top spot in the search engines and receive a lot of traffic that converts. You can have the best content in the world, but without search engine optimization you will not reach that goal. It is a rather depressing scenario, as SEO seems so simple.

However, if you have ever tried to play around with your own site, you realize it is anything but a simple task. There are so many nuances and algorithms to take into account. Learning how to do SEO and then implementing those techniques on a site would be a full-time job for many of us. Thankfully, there are people who do this for a living, and many of them do it well. This article is going to shed some light on how to find the best SEO services.

Solid Portfolio

One of the most important aspects in finding an SEO firm is going to be the quality of their work. They should be able to offer up a full portfolio of sites that they have been able to rank for several keywords and phrases over the years.

It is essential to take note of the competitive nature of the keywords they ranked for. Are they easy phrases like “best fried chicken dinner in Louisville KY”? Or something that would take skill to rank for, like “best credit cards”? Anyone can rank for the first phrase, but real skill and expertise are needed for a term as competitive as the second.

Guarantee

Not too long ago the internet could have been compared to the wild west, and some SEO professionals were the proverbial train robbers. They would charge companies large amounts of money for really no work at all. This was due to many companies not understanding search engine optimization the way they do today.

Now most SEO companies will offer a guarantee on the work they do and will not expect a blank cheque in advance. This is advantageous for smaller companies with a limited budget. If the ranking is not completed within a specified time you can either get your money back or allow more time for the individual to rank the keyword.

The Secret Sauce

One final ingredient to keep in mind when looking for the best SEO company is to find out how they plan to rank your site. You will not get any specifics; you just want to make sure that only ethical, white-hat methods are being used. If an SEO agency were to use underhanded methods to rank your site, it may be penalized down the road. When this happens you will either have to pay a good deal of money to get the site back in the rankings, or simply begin a new one.

It can be a very time-consuming process, so make sure they are doing things that will not harm your site.

If it was easy to rank a site, everyone would be doing it! However, it is a difficult task that is best left to the professionals to handle. Let them rank your site, while you reap the long-term rewards.


Like any worthwhile business investment, selecting a Search Engine Optimization (SEO) agency requires time for careful consideration, and this is doubly true if your business relies heavily on online search for brand discovery.

The sheer number and variety of SEO firms to choose from is enough to give anyone pause.

During this process of intensive research and analysis for service procurement, a number of facets may not be as upfront as looking up an About Us page or researching an agency on LinkedIn.

Yet these same facets are crucial to return on investment; you don’t want to kick yourself for not knowing about them before committing to a potentially long-term relationship.


Culture of Transparency and Communication

You’ll want to ensure that the SEO vendor you partner with embraces the same values of transparency and effective communication you expect between in-house teams and/or employees, and for the same reasons, really.

Transparency affords businesses better relationships, synergy, engagement and solutions. Your SEO agency needs to meet the same standards your internal people do on a regular basis.

Some things to consider:

  • Which key performance indicators (KPIs) will be available to you on-demand? Are they the right ones for performance tracking?
  • Can you request any data relevant to your relationship at any given time with good reason?
  • What about communication times? Some changes in long-term strategies like link building obviously require time to take root, but when you do need to shift tactics immediately, can your SEO agency turn on a dime?
  • How can you ensure you’re getting the truth and not a dressed up version of events to make things look good?

Secondary and Tertiary Competencies

While most SEO agencies might list secondary and tertiary competencies in their packages, always make sure to ask.

SEO on its own is strictly limited to traffic, not conversion. It’s a means to the bottom-line, which means it functions in concert with relevant channels within search (e.g. pay-per-click ads) as well as efforts indirectly related to it.

Your SEO agency needs to be at least competent enough in coordinating and communicating with the other moving parts of your digital marketing machine to guarantee that their efforts won’t exist in a bubble, and that your campaigns are not sitting in disparate silos, failing to work together toward a single goal.

Tech Stack

You’d be surprised how many people brush off the importance of tech stack compatibility when looking for partners across the many channels of digital marketing. There are a few simple questions that can help you determine if your SEO agency of choice has the right tech stack for your operation:

  • Are they experts in your current tech? If you’re running on WordPress, as are more than 70 million sites on the web, can your SEO agency work with that, or are they better with a more technical CMS like Drupal or Joomla? Also, it’s one thing to be an expert in a certain tech stack or build, and another to just be “handy” in it.
  • Can they help you migrate to a new one, if necessary? This might seem contradictory to the above, but technology is constantly shifting. E-commerce portal Bluefly, for instance, recently found itself on the wrong end of tech adoption when the e-commerce platform it originally signed up for, among the most popular of the past decade, couldn’t support what it wanted to do on mobile. It ultimately had to switch providers.
  • APIs, APIs, APIs: The tech world is badly fragmented, and your SEO agency needs to ensure it either has the right application programming interfaces (APIs) or the capacity to support them from third parties.

Scale Potential

Your partner’s tech stack is relevant to this factor: scale potential refers to how big your partner can help you get before becoming too small for your operation.

It’s a simple truth that different SEO firms have different clientele targets. Some cater to small and mid-sized businesses (SMBs), others focus on enterprise. While the ones that focus on SMBs can offer unique insight to enterprise-level clients, they may have neither the manpower nor the tech infrastructure to support enterprise-level SEO.

Read more: http://www.business.com/seo-marketing/7-things-to-keep-in-mind-when-choosing-an-seo-agency/


95% of websites are HURTING their Own Google Rankings

We have checked hundreds of websites over the years and the sad fact is that 95% of them are actually doing things that will make it harder (or impossible) to get rankings on Google.


Is Your Site One of the 95%?

The question that you (as a business website owner) might well be asking is: is MY site one of the 95%? Of course, you may not be bothered, thinking that your site’s ‘job’ is just to ‘be there’ when someone wants to check on you. But that is really a waste; your site could be doing so much more than just sitting back, waiting for the occasional visitor…

Brochure Sites

Brochure sites are sites that are just meant to act, well, as an online brochure, a means to impart information about a business to anyone who is interested. They are often visited only by people who, having heard about a company (maybe they met someone at a networking event?), want a bit more information before contacting them for a quote etc.

A Wasted Marketing Opportunity?

This is a good way of using the power of the Internet (it saves on a lot of brochure printing for a start), BUT is it also a wasted opportunity? The thing is, here you have a website full of (hopefully) interesting stuff about your business, the services that you offer and ‘what makes you special’, and yet no great effort is being made to get more people to read it all. This must be a wasted opportunity, as any one of those visitors (that the site is not getting) could be a potential customer…

So What Are These Sites Doing Wrong?

The fact is that there are many ways that business sites are ‘getting it wrong’ when it comes to getting Google to ‘like’ their pages, and thus give them a prominent position for a given search term. Some of the mistakes are quite basic too, and could easily be fixed with a few clicks (and a little bit of thought).

Some Examples of the Mistakes Sites Make

The Title Tag

You may not notice this one (although Google always does), as it is a bit hidden, but if you take a look at the top of your Internet Browser window, you will see the ‘Title’ information for the page you are looking at. In many cases you will see words like ‘Home’ or ‘About Us’. Whilst not incorrect (as you would indeed be looking at the Home or About Us page), they are not really very informative to the very ‘person’ you most want to impress, and that of course is Google.

Think about it: would not a phrase like ‘IT Support Services | Computer Repairs’ ‘tell’ Google a bit more than the word ‘Home’? It really is a no-brainer, and so very easy to fix…
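If you want to check this across a whole site, a few lines of script will do it. A rough sketch follows: the URLs are placeholders, the 70-character ceiling reflects the common 65-70 character guideline (noted again later in this document), and the requests and beautifulsoup4 libraries are assumed.

```python
# A quick sketch for auditing Title tags across a site: flag titles that
# are missing, generic, or over budget. URLs are placeholders.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

GENERIC = {"home", "about us", "contact us", "blog"}

def audit_title(url: str) -> str:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.string or "").strip() if soup.title else ""
    if not title:
        return "MISSING title"
    if title.lower() in GENERIC:
        return f"generic title: {title!r}"
    if len(title) > 70:
        return f"too long ({len(title)} chars): {title!r}"
    return f"OK: {title!r}"

for url in ["https://example.com/", "https://example.com/about"]:
    print(url, "->", audit_title(url))
```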

The Meta Description

When you look at a page you don’t see this at all (not even at the top of the Browser), it only being visible in Google’s search results, under the Title and the URL of a site. This might make you think that it is worthless from an SEO point of view, but you would be wrong. It is true that the words in the Description do not have a lot of clout SEO-wise, but if you leave the field empty, or use the same text on many pages, you run the risk of making the site appear ‘lazy’ as far as Google is concerned, and that ‘black mark’ could make all the difference when Google has to decide which site to list for a phrase you want to be found for.

Again, a few clicks on the keyboard can make the problem go away.

The Elevator Speech

Another thing you should bear in mind is that a good Description can make all the difference when it comes to getting that all-important click from the Google search results. Think of this 160-character text block as your ‘elevator speech’ and create one that would make someone just have to click through to your site, as it is only then that you get a chance to start the dialogue that could result in a sale or enquiry.
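A companion sketch to the Title audit above: flag Descriptions that are empty, longer than that 160-character budget, or duplicated across pages. The URLs again are placeholders.

```python
# A companion sketch for Meta Descriptions: report tags that are empty,
# over ~160 characters, or duplicated across pages. URLs are placeholders.
# Requires: pip install requests beautifulsoup4
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def get_description(url: str) -> str:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return (tag.get("content") or "").strip() if tag else ""

pages = ["https://example.com/", "https://example.com/services"]
seen = defaultdict(list)
for url in pages:
    desc = get_description(url)
    if not desc:
        print(f"{url}: EMPTY description")
    elif len(desc) > 160:
        print(f"{url}: description too long ({len(desc)} chars)")
    seen[desc].append(url)

for desc, urls in seen.items():
    if desc and len(urls) > 1:
        print(f"Duplicate description used on {len(urls)} pages: {urls}")
```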

The Header Tags

This is another of those things that you will probably not have noticed (and yes, you guessed it, Google is looking at this too), other than that the text might look a bit bigger. But why is the correct use of Header tags important? To explain this I need to give you a bit of a history lesson, starting with the way documents are constructed. This goes back to the days when newspapers were laid out using lead type, when the editors had to be able to tell the people laying out the type which bits were important, that is, which words (like the headlines) needed to be big. This was all done using a ‘Header Tag’ number ranging from 1 to 6 (or something similar).

This rule set was reused when the code that describes how a page is displayed on word processors and screens was written, it again being used to control how words would be displayed. This in turn fed through to the language that controls printers and, most lately, to how web pages are rendered by Browsers, this of course being HTML.

The Advent of CSS Styles

In the early days of the Internet there were in fact only a few ways you could control how big the words on a page were, these Header tags being one of them. Today of course you can control the font, size and colour of the text on your webpages using CSS styles, but the importance of the Header tag lives on, as Google still uses these to work out which words on a web page it should take more notice of, something that is vitally important when trying to get your page to the top of the results.

A Problem With Web Designers

It must be said that most sites do use these Header tags, but the problem is that they are often used incorrectly, the majority of web designers still using them to control the size of text, often compounding the issue by applying them to such terms as ‘Home’, ‘Contact Us’ or ‘Blog’. Highlighting words like these to Google is useless; far better to use the tags to point out to Google the words that you want to be found for, like ‘IT Support Prices’ or ‘Best Anti Virus Software’.

Putting this right is a little harder than both of the above, but it is still not that big a job, and it makes your site that bit better in Google’s eyes and thus that bit more likely to get a good listing in their results.
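To see which words your own pages are highlighting to Google, you can simply list the header tags and their text. A minimal sketch, with a placeholder URL and the same assumed libraries as above:

```python
# A minimal sketch: list every H1-H6 on a page to see which words the site
# is highlighting to Google. The URL is a placeholder.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def list_headers(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for level in range(1, 7):
        for tag in soup.find_all(f"h{level}"):
            # Navigation labels like 'Home' or 'Blog' appearing here suggest
            # the tags are being used for styling rather than meaning.
            print(f"h{level}: {tag.get_text(strip=True)}")

list_headers("https://example.com/")
```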

Links – The Popularity Voting System of the Internet

Whilst the majority of the power that links bestow comes from links to a site from other sites (so-called ‘backlinks’, as they link back to you), the links FROM a webpage to other sites and the INTERNAL links within a site are also important. The first tells Google that you are a part of the community that makes up your market place (as well as pointing visitors at some other valuable resources, which Google likes to see), whilst the second type helps Google understand what each of your pages is about, as well as helping people move about your site. As Google rates sites that offer the best ‘user experience’ higher than others, such internal links can only help.
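A quick way to spot pages with little such signposting is to count internal versus external links. Another small sketch, with a placeholder URL:

```python
# A small sketch: count internal versus external links on a page, a quick
# way to spot pages with no internal signposting. The URL is a placeholder.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def link_counts(url: str) -> tuple[int, int]:
    site = urlparse(url).netloc
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    internal = external = 0
    for a in soup.find_all("a", href=True):
        target = urlparse(urljoin(url, a["href"])).netloc
        if target == site:
            internal += 1
        elif target:  # skip fragment and mailto-style links (empty netloc)
            external += 1
    return internal, external

inside, outside = link_counts("https://example.com/some-page")
print(f"{inside} internal links, {outside} external links")
```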

Incoming Links

Whilst the links to a site cannot be put right by making changes to the site itself, they are a vital part of the ‘battle’ to get a site listed on Google, accounting for about 40% of the marks that Google allocates when deciding which site to list for which term. However, the fact is that the majority of sites either don’t have any (or enough) links, or have the wrong sort. Both of these can really hinder a site’s chances of getting a first page (or any) ranking. Fixing them can take a long time and a lot of work though, and has to be done very CAREFULLY.


SEMANTIC SEO and the Words on the Page

Semantic SEO is all about making sure that Google understands what a site is all about, thereby ensuring that its ‘meaning’ is fully comprehended. This is easier to do than you might think, the major thing to get right being to make sure you use the right words on the page. The right words, of course, are the words that Google wants to see. The good news is that Google will tell you what these words are; all you have to do is ask in the right way, this being done by ‘Reverse Engineering’ the top pages on Google…

Writing the Right Copy

Armed with these words and phrases, and a good understanding of the subject (it helps if you are a genuine expert), you can then write the right copy, adding some images and, if you can, audio and video components as you go. Sprinkle in some internal and external links at the same time and you have gone a long, long way toward cracking this particular nut.


Polishing the Spitfire

You may not believe it, but it is said that back in World War 2 they used to polish the photo reconnaissance Spitfires (as well as painting them pink so that they were harder to spot in the dawn or dusk skies) just so that they could gain a few mph, something that could make all the difference, life or death in this instance, when being chased by enemy fighters.

If you follow the guidance above and fix any of the items mentioned, it will in effect polish your website a little, perhaps gaining just enough extra speed to get your site onto Page 1 of Google and thus get the extra traffic that could make all the difference to your business.


Need Help With the Polishing?

However, if you need help with the polishing, even if it’s just some assistance in finding out what bits to polish the hardest, please do give us a call. We are here to help and offer a lot of free advice and assistance.

WHAT IS SCIENTIFIC SEO?

First a bit of history about Search Engine Optimisation

SEO can trace its history way back to 1994, when the early pioneers discovered that they could use the Internet to drive traffic to their sites and hence sell their goods. As this idea became more accepted, people started competing with each other for traffic, and that meant they had to ‘convince’ the Search Engine of the day to list their site for appropriate terms.

The Search Engine of the Day has changed over the years, Alta Vista, Ask Jeeves and Yahoo all being top dog at some time. However, today the big player is Google, and thus that is the engine everyone wants to get listings on, which of course means you have to understand the rules.


The Rules of The Old SEO

The rules that the Search Engines use have altered drastically over the years, as they have become more and more sophisticated. At the start it was easy to ‘trick’ the Engines: all you needed to do was stuff the pages with your keywords and get some links to the site (Google’s first stab at ranking was based on something called PageRank, which is basically all about the number of links to a site – and not much else).

These ‘old’ rules however had one big problem, in that the SEO professionals of the day kept finding ways around them and thus the Engines had to keep taking steps to close these ‘holes’ in their rule sets.

This process escalated over the years, especially since 2010, until Google decided that enough was enough and settled on a whole new approach, one that could not be tricked and relied on one thing: perceived quality.


The New SEO and Perceived Quality

Today, with the advent of something called ‘SEMANTIC SEO’ (the meaning of a site, what it is really all about), things are a lot different, it being all about the quality of the content of a site.

But Why use the term Perceived Quality?

I use this term as I believe there are limits to what Google can do, in that its computer algorithms cannot ‘really’ decide what is genuinely ‘quality’ content and what is not. Also, as mentioned above, links had, and still have, a vital role to play in how Google decides which site to list for what. But it cannot always tell whether these links are ‘real’ or have been created, thus in all cases Google looks at a page/site and decides (using its rule sets) whether it is quality or not.

This is why I say it is the quality that Google perceives in a site that is important. So how can you convince Google that your content is good enough to get a top ranking?

The Rules of the NEW SEO in Detail

Many changes have taken place in the world of SEO since 1994, but all of them are based on four things, one of which has only recently come to the fore.

The Four Things SEO is and was Based Upon

Site Construction

The way a site is built is important, as if it is constructed in the wrong way then Google cannot (or may just not want to be bothered to) find all the pages in a site. Also, if the site is built in such a way that it is very slow, or is not mobile friendly, Google will downgrade it in various ways.

One thing that does not cause so much of a problem today is that of the ‘Code to Text’ ratio (the amount of code that is used to build a site versus the number of words visible to the visitor). In the old days, too much ‘construction code’ was an issue, but today, with the advent of WordPress and the like, Google has been ‘forced’ to ignore this area, virtually all sites being very code heavy.
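For the curious, the ratio is easy to estimate. A rough sketch (placeholder URL; treat the figure as indicative only, given the point just made):

```python
# A rough sketch of the classic 'Code to Text' ratio: visible text as a
# share of the raw HTML. The URL is a placeholder; treat the number as
# indicative only, since (as noted above) Google largely ignores this now.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def code_to_text_ratio(url: str) -> float:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = soup.get_text(separator=" ", strip=True)
    return len(text) / max(len(html), 1)

ratio = code_to_text_ratio("https://example.com/")
print(f"Visible text makes up {ratio:.0%} of the page source")
```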

You MUST, however, ensure that the site can easily be navigated, a failure in that department being very serious indeed. Plus, you should also use a fair number of internal links (not just the navigation) to highlight to Google what each page is about.

Words, Pictures and Videos

This is the area most affected by the new SEMANTIC SEO, it being vitally important to use all the ‘right’ words on a page. Gone are the days of just stuffing a page with the words you want to be found for. Today you need to understand what words Google wants to see and then make sure you include them in the copy, also making sure that you include pictures and, where possible, audio and video content on the page.

Reverse Engineering is the Key

This is where reverse engineering can help, the idea being that if you know what words are being used on the top pages (for a given term), then by including them (using correct grammar of course, as this is also checked) you must be getting closer to the perfect page.

Links

In the early days of SEO, links were vitally important; in fact they could, all by themselves, get a page listed. However, today things have changed a lot. Links are still important, accounting for some 40% of the reason a site gets a rank, but they are not as all-powerful as they used to be.

Google is Watching You

Besides not being as important as they used to be, the links to a site are now carefully checked by Google. Their aim? To make sure that the links to a site are ‘natural’ and not all built by an SEO company (although they know, of course, that the practice goes on all the time).

This checking is carried out by Google, the process being labelled ‘Penguin’. Basically this checks a site’s linking structure to see if it complies with the ‘rules’ and hence looks natural. Here the number of links using the domain or URL of the site as the anchor text (the bit we humans click on) is checked, as are the number using ‘money words’ (the terms that a site wants to be found for) and the ‘noise’ links, like ‘see this site’ or ‘click here’. If the balance is not right, or the links seem to have been created too fast, then a site can be heavily penalised.

This means that a site’s links have to be built very carefully over time and not all in a rush.
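As an illustration of the kind of balance check meant here, the sketch below classifies a list of anchor texts into brand/URL, ‘money’ and ‘noise’ groups and prints the mix. The category lists, example anchors and any thresholds you might apply to the output are my own assumptions, not Google’s published rules.

```python
# A sketch of an anchor text 'mix' check: classify each backlink's anchor
# as brand/URL, money, or noise, then print the proportions. The category
# lists and example anchors are illustrative assumptions, not Google's rules.
from collections import Counter

MONEY_WORDS = {"welding supplies", "metal fabricators", "it support"}
NOISE = {"click here", "see this site", "this site", "here", "website"}

def classify(anchor: str, domain: str) -> str:
    text = anchor.lower().strip()
    if domain in text:
        return "brand/URL"
    if text in NOISE:
        return "noise"
    if any(term in text for term in MONEY_WORDS):
        return "money"
    return "other"

anchors = ["example.com", "click here", "welding supplies", "Example Ltd",
           "see this site", "https://example.com/page", "welding supplies uk"]
mix = Counter(classify(a, "example.com") for a in anchors)
total = sum(mix.values())
for category, count in mix.most_common():
    print(f"{category}: {count}/{total} ({count/total:.0%})")
```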

Social Media

This is very new in SEO terms, and the amount of ‘power’ that social media chit-chat, comments on Facebook and Twitter, provides is not fully understood. In my view, the importance of Social Media is more to do with other marketing channels, but nevertheless, obtaining links via things like ‘Social Bookmarks’ can be useful.

Putting it All Together – Scientific SEO

So, what does all this mean? Basically, it means that you must:


  1. Find the words you want your site to be found for – KEYWORD RESEARCH
  2. Find the words you need to include in the copy of the page(s) using Reverse Engineering – CONTENT RESEARCH
  3. Build the links to the site, CAREFULLY
  4. Get some Social Media comments going if you can (more important for sites selling direct to the public than for B2B sites)
  5. Monitor the progress and make changes to improve matters further (a simple monitoring sketch follows below)
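On point 5, the monitoring need not be complicated. The sketch below keeps a short history of positions per phrase and reports the latest movement; the figures are invented, and a real version would read its positions from whatever rank-tracking tool you use.

```python
# A minimal monitoring sketch: keep a history of positions per phrase and
# report the latest movement. The figures are invented sample data.
rank_history = {
    "chemical companies": [8, 7, 5],   # oldest first
    "lqf fruit": [3, 3, 6],
}

for phrase, ranks in rank_history.items():
    change = ranks[-2] - ranks[-1]  # positive means the site moved up
    trend = "up" if change > 0 else "down" if change < 0 else "steady"
    print(f"{phrase!r}: now #{ranks[-1]}, {trend} {abs(change)} since last check")
```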


I hope this helps you understand how the matter of SEO has to be approached today.

What Google Wants…

So What Does Google Want?

If the full answer to this question were understood, you can bet that just about every site that wanted top rankings would change the way it looked and worked in pretty short order, the prize, that top place on the first page of Google, being worth a lot of money…

But of course, Google won’t tell anyone just what they want; instead they give out information about some of the things they want to see and, as importantly, don’t want to see. Whilst the knowledge that is imparted is useful, it only gives us a part of the picture.


[Image: Google’s RankBrain]

A Vital Point – Google Often Ignores Its Own Rules

The biggest problem, from my point of view (as an SEO Professional), is that Google not only don’t tell you the rules, they also don’t keep to the ones you do know about. This makes applying any scientific approach to the process difficult; it’s just like trying to find the boiling point of water when someone is altering the air pressure all the time. One time water will boil at 100°C, whilst at another it will boil at 90°C…

You can see this ‘not following their own rules’ phenomenon all the time (if you know what to look for), sites that break the rules still enjoying top ranking positions. This does make life difficult, but it does not invalidate the data you can obtain by checking a site’s linking or page structure, as it is more than possible (especially when it comes to links) that Google have yet to impose some form of penalty, the site then potentially losing the rankings it currently has.

Some of the Known SEO Rules

Keyword Stuffing

In the early days of the web, it was quite easy to trick the Search Engines into providing a first page rank simply by using the target words over and over again… Things have moved on now though, and if you try this trick today you will (more than likely) get worse rankings, not better ones.

Status – Avoid…

Use of Title Tag

The Title tag is not shown on the page itself, but in the Browser Window, and is used by Google to ‘understand’ what topic the page in question is all about. It also, importantly, forms the phrase that is seen when a site is listed by a Search Engine, so is something that needs to be carefully chosen.

Status – Use Wisely (best to keep to 65–70 characters)
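If you want to check this quickly, the sketch below pulls a page’s Title tag and warns when it runs past the guideline. The URL is a placeholder, and the ~70 character ceiling simply encodes the guideline above.

```python
# A quick Title tag check against the 65-70 character guideline.
# The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://www.example.com/", timeout=10).text,
                     "html.parser")
title = (soup.title.string or "").strip() if soup.title else ""

print(f"Title ({len(title)} characters): {title!r}")
if not title or len(title) > 70:
    print("Consider rewriting: aim for a unique title of about 65-70 characters.")
```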

Meta Description Tag

Like the Title tag, the words in this area are not shown on the actual page; instead they are used in the Search Engine listings and are, to all intents and purposes, an ‘elevator speech’. Their effect on SEO is very limited, except when the same text is used on lots of pages. In such instances, it is believed that they may have a negative effect.

Status – Ensure that every page has a unique ‘elevator speech’ of 165–170 characters.
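The same sort of quick check works for descriptions. This sketch fetches a few pages (the URLs are placeholders), measures each Meta Description against the 165–170 character guideline and flags any that are duplicated across pages.

```python
# A duplicate 'elevator speech' check across a few pages (placeholders).
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

pages = ["https://www.example.com/", "https://www.example.com/services/"]

by_text = defaultdict(list)
for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = (tag.get("content") or "").strip() if tag else ""
    by_text[desc].append(url)
    if not 165 <= len(desc) <= 170:
        print(f"{url}: description is {len(desc)} characters (target 165-170)")

for desc, urls in by_text.items():
    if desc and len(urls) > 1:
        print("Same description reused on:", urls)
```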

Header Tags

These tags have a long history, their use dating back to the days when newspapers were printed using lead type in blocks. More recently, they formed part of the PostScript language that allowed computers to communicate with printers. They were then subsumed into HTML and, at the very start of the Internet, were the only way of creating bigger text on the screen. There are 6 Header styles, from H1 (the most important) to H6 (the least).

Google have stated in the past that they use the text within these tags (<H1>the text</H1>) as pointers to what the page is about, but now that CSS styles are used to control the size of the text on pages, there is some debate as to whether Google also treat any BIG text as important.

Status – Use, but only for important phrases (not for Navigation) and only have one H1 tag.
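The ‘only one H1’ part of that advice is easy to verify. This small sketch counts the header tags on a page; the URL is, as before, a placeholder.

```python
# Count the header tags on a page; a page should normally carry one H1.
# The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://www.example.com/", timeout=10).text,
                     "html.parser")

for level in range(1, 7):
    found = soup.find_all(f"h{level}")
    if found:
        print(f"h{level}: {len(found)} on the page")

if len(soup.find_all("h1")) != 1:
    print("Warning: aim for exactly one H1 tag per page.")
```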

Word Count

There is evidence that the top pages for many search terms are ones that have over 1,000 words of copy, although this ‘requirement’ can go up and down depending on the level of competition. The most important factor here is to use the ‘right words’ on the page (these being best found by reverse engineering the top sites for any term) and to use as many of them as you can. Size is important here for two reasons. The first is (as explained above) that Google likes lots of words (words are its food, after all), but there is another, equally important reason to have a lot of text.

This second reason is based on the fact that obtaining traffic for ‘long tail searches’ can be great for business, such search terms (normally 4 words or longer) often being used by people who are nearer the end of the buying process and thus more likely to convert.

Status – Try to create pages that are 1,000 words or longer which contain relevant words and terms
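Again, this is simple to measure. The sketch below strips the code from a page and counts the words a visitor would actually read; the URL is a placeholder, and exactly which tags count as boilerplate (here navigation and footer) is my own assumption.

```python
# Count the words a visitor would actually read on a page (placeholder URL).
import re

import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://www.example.com/", timeout=10).text,
                     "html.parser")
for tag in soup(["script", "style", "nav", "footer"]):
    tag.decompose()  # assume navigation and footer are not 'copy'

words = re.findall(r"\w+", soup.get_text())
print(f"{len(words)} words of visible copy (the guideline above suggests 1,000+)")
```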

Tabs and Accordions (Copy Triggered by User Interaction)

One of the reasons that pages are often too light on copy is that the site owner (and the designer) rightly points out that a page that looks like a ‘wall of text’ is likely to be off-putting to viewers, increasing bounce rates and reducing conversion rates.

There is a way of placing text on the page so that Google can read it but that ‘hides’ it from viewers until they actually want to see it. There are various methods of doing this, but in every case it is a user action that causes the text to be made visible. This process is not treated as hiding text (in the old days people used white text on a white background, would you believe), something that Google frown upon and which could get a site banned, but it is a practice that Google have reportedly said they are not altogether happy about.

I find this stance of Google’s somewhat strange, as they also want sites to offer the best possible ‘user experience’; it makes me feel that Google want to have their cake and eat it too. But as I don’t believe they are actively penalising sites that use this in their interface, it seems the best way of providing Google with the words it needs whilst giving users the best way of assimilating the site’s message.

Status – Use with caution.

Internal Links and Links From Your Site

The power of links to a site is well understood, the right type and number enhancing the possibility of a site getting better rankings. However, it is not so widely understood that the links from a site also have their place. They are important because they ‘prove’ to Google that the site is a part of the wider community (in that market area), as well as potentially helping users locate other relevant information.

Internal links also have a role in that they allow users to move through a site in an easier way than just using the navigation system. Used carefully these link types can really assist in improving both the ‘user experience’ and Search Engine rankings.

Status – Do implement links to relevant sites, the more powerful the better. Also, consider what internal links you could place on your pages.


Links To Your Site

Links to a site are still very powerful, accounting for at least 50% of the reason that a site is selected by Google for a ranking, and they form a VITAL part of any plan to get better Search Engine listings.

There are however some important factors to bear in mind…

  • Ensure that the links come from a wide number of locations / sites
  • Make sure that no more than 25% of the anchor text used contains ‘money phrases’
  • Check that the proportion of ‘other phrases’ is high, at least 30-40%
  • Remember that a site is more than just a home page, links to internal pages also being needed
  • Plus, when building links, make sure that you don’t build too many too quickly (a rough ratio check is sketched below)
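To show what such a ratio check might look like, here is a toy example that measures the share of ‘money phrase’ anchors in a link profile against the 25% guideline above. The anchor texts and the money-phrase list are invented; real data would come from your backlink tool’s export.

```python
# A toy anchor-text ratio check against the ~25% 'money phrase' guideline.
# The anchors and money_phrases are invented sample data.
anchors = ["click here", "example.com", "chemical companies",
           "see this site", "Example Ltd", "chemical companies"]

money_phrases = {"chemical companies"}  # terms the site wants to rank for

money = sum(a in money_phrases for a in anchors)
share = money / len(anchors)
print(f"'Money' anchors: {money} of {len(anchors)} ({share:.0%})")
if share > 0.25:
    print("Over the ~25% guideline; the profile may look manipulated.")
```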


There is more to SEO of course, so please do see our site for more details and assistance.

What is the aim of Search Engine Optimisation?

The aim of Search Engine Optimisation is, obviously, getting traffic, the right sort of traffic, from the Search Engines. This is achieved by making a website more ‘attractive’ to Google, Bing etc., so that for certain phrases the site is listed when someone searches for them.

[Image: SEO – the old and the new]

The process of making a site appeal to the Engines is well understood, the rules dictating where you place the words that are important to the site’s SEO being ones that Google, for example, are happy to share. Some of the things that you should NOT do have also been shared, but the content of a site is only half the story, the power of links still being one that cannot be ignored.

It is this latter point that is to a degree strange, it being an effect of the very start of Google, when its PageRank algorithm powered the way by which sites were graded. PageRank was based on the idea that sites that had lots of links MUST be good (otherwise, why would people take the time to create the links?). It was not just the number of links that counted though; the PageRank system looked at the page that held the link and at what pages linked to it, then checked the links to that page, and so on. I am not sure as to the ‘depth’ that Google went to here, but it was quite deep and in the beginning worked quite well.
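For the curious, the core of the original PageRank idea fits in a few lines: each page repeatedly passes its score along its outgoing links, so a link from an important page is worth more than one from an unimportant page. The three-page ‘web’ below is invented purely for illustration, and the 0.85 damping factor is the value suggested in the original PageRank paper.

```python
# A minimal PageRank sketch: scores flow along links until they settle,
# so a link from an important page is worth more. The three-page 'web'
# is invented; 0.85 is the damping factor from the original paper.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}  # page -> pages it links to

damping = 0.85
rank = {page: 1 / len(links) for page in links}

for _ in range(50):  # power iteration
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outgoing in links.items():
        for target in outgoing:
            new_rank[target] += damping * rank[page] / len(outgoing)
    rank = new_rank

print({page: round(score, 3) for page, score in rank.items()})
```

Run it and page C, which is linked to from both of the other pages, comes out on top, which is exactly the effect the early link builders set out to exploit.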

The Start of the SEO ‘Battle’

As soon as people became aware of how the PageRank system worked, and bearing in mind the pot of gold that this form of marketing seemed to offer, companies sprang up offering services designed to create links, thus ‘fooling’ Google into thinking that a site was more popular than it really was.

Besides the links, Google also (at this early stage) took only a cursory look at the words on the pages, being heavily influenced by the simple inclusion of a phrase. Hence the start of pages that were ‘stuffed’ with strings of words, no real effort being made to make the page appeal to anyone other than the Search Engines themselves.

Once this ‘war’ started, Google began to fight back, their systems starting to spot and penalise sites that stuffed their pages with the words they wanted to be found for, while also checking the linking structures of sites in greater and greater detail, both of course with the idea of stamping out the ‘cheating’ that was going on.

Like most wars, both sides got cleverer and cleverer, one thinking of ways to get around the checks and rules that were created, the other trying to combat the attempts, one of the results being the birth of two of Google’s animals, Panda and Penguin.

Google’s Penguin – The Link Checker

As mentioned above, at the start it was links that mattered more than anything else, it being said by some that they could get a blank HTML page ranked if they created enough links to it. Google of course tightened its rule sets to try to counter such practices, in the end deciding to run periodic checks on the links to a site, the rule set being named ‘Penguin’.

Penguin’s aim is to ensure that the linking structure looks ‘normal’ (that is, one that has not been manipulated too much), and there are many checks that we know it runs (and many that I suspect we do not), these including the type of sites the links come from and the words used as the ‘anchor text’ (the bit you click on). Failure to keep your linking structure looking ‘normal’ could result in an automatic penalty, one that could cause a site to lose rankings and potentially to be removed from the listings entirely.

However, the real change is not so much the checking of links, but the way that Google evaluate sites in an overall manner.

Google’s Panda

Besides links, it is the power of the content that Google measures. In the beginning, it was quite easy to ‘fool’ Google by simply including the words you wanted to be listed for; the quality of the site was not important. Of course Google, who wanted to make money from advertising, could not allow these poor quality sites to dominate its rankings, as that would cause people to switch to another Engine, and with Google competing against the likes of Yahoo and Ask Jeeves this was important…

What Google needed was a system by which they could ensure that the pages they listed first were relevant and offered the information or service which people needed and wanted. Poor quality sites with thin or copied content were not wanted…

Thus the Panda rule set was born, its job being to sniff out sites and pages that were of poor quality, this including sites that were not updated frequently enough, or that seemed not ‘bothered’ enough even to create proper Meta Descriptions and Titles; in short, sites that appeared to be ‘lazy’.

Panda also checked for copied and duplicated content, as well as looking for pages that were ‘thin’ on words (fewer than 250), at the same time giving points to sites that included videos and images as well as links to interesting and relevant sites.
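Two of those checks are simple enough to sketch: flagging ‘thin’ pages (fewer than 250 words) and estimating how much two pages overlap using word ‘shingles’ (short overlapping word sequences, a standard duplicate-detection technique, though not necessarily the one Google uses). The two text snippets are invented stand-ins for real page copy.

```python
# Two toy Panda-style checks: flag 'thin' pages (fewer than 250 words)
# and estimate overlap between two pages using word 'shingles'.
# The text snippets are invented stand-ins for real page copy.
def shingles(text, size=4):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

page_a = "frozen fruit is individually quick frozen to preserve flavour " * 20
page_b = "frozen fruit is individually quick frozen to preserve flavour and colour " * 30

for name, text in (("page_a", page_a), ("page_b", page_b)):
    if len(text.split()) < 250:
        print(f"{name}: thin content, only {len(text.split())} words")

jaccard = len(shingles(page_a) & shingles(page_b)) / len(shingles(page_a) | shingles(page_b))
print(f"Shingle overlap: {jaccard:.0%} (high values suggest duplicated copy)")
```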

Then Came (or Comes) the Semantic Web

Both Penguin and Panda (in their various forms – they kept being altered to make them tougher to fool) were attempts by Google to ensure that the sites they listed were the very best (for any given search phrase), but they were not perfect and time after time the SEO community found ways around them.

Google could see this was going to be a never-ending battle, so they turned their attention to creating a set of rules that could not be fooled, this rule set looking not just at pages and links, but at the overall ‘meaning’ of the site (this is what Semantic SEO is all about): what it was really all about, what problems it was trying to solve and what services it provided.

The Current State of Play

Once Google have perfected this rule set, it is pretty certain that only the best sites and pages will appear at the top of its rankings. BUT, and it is still a BIG BUT, they are far from achieving this at the moment; you only have to look at the top sites to see that in many cases there are far better sites that should be occupying those coveted first page positions.

This of course is extremely annoying for any site owner who KNOWS their site deserves better treatment, but until Google (for one) really do implement this strategy in full, all you can do is your best to promote your site, and that means carrying out work both on and off the page.

SEO – What You Need To Do

There are two main areas that need to be handled in the right manner: one is making sure that the pages are full of useful content, the other is making sure that the site has enough links that Google ‘believes in it’ enough to rank it. This is, as you can imagine, quite a wide topic, but one that we cover in great detail.

So, please see the other pages of this site for more details on what Search Engine Optimisation is all about and the services that we offer at Serendipity Online Marketing Ltd.