Search Engine Optimisation and Link Building in 2018

Let’s face it, SEO is considered by many to be a ‘black art’ and by others to be a waste of time, whilst those who do agree that SEO is worthwhile will endlessly debate what is good and what is bad, which tactics are ‘white’ and which are ‘black’.

“If you ask six SEO experts a question you will probably get seven answers….”

Then again, if you ask Google about SEO, they appear to disagree with the whole concept (wanting their results to be natural and not manipulated), whilst at the same time knowing that without someone to help website owners ‘understand’ how to set up their sites so that Google can read them properly, they (Google) would be lost.

The Death of Link Building Announced Again (and Again and Again)

When it comes to the thorny topic of Link Building, we see Google denouncing the process because it is not natural, whilst at the same time being desperate for some help in deciding which sites to list and which ones not to bother with. Like it or not, Google needs links.

BUT, not all links are equal, and there is definitely a way of earning Google’s displeasure when it comes to building them. Do it the wrong way and your site is doomed; that is one of the known Google rules, and it is put into play all the time.

Turning to the Experts

This is why many businesses turn to the experts. Here they can rest easy, knowing their site will be built in a way that Google can read, and that the content will be created to suit the Search Engines and human readers alike. They will also know that the links built to their site will be created in such a way that it will not be penalised by Google. They will then expect their site to get better rankings and more traffic. In many cases this is exactly what happens, but in some it does not, and sometimes it is impossible for anyone to discover just what has gone wrong; what appears to work in one area fails to do so in another…

Either way, you can be pretty sure that the website owner will not really be aware of what is going on, nor of the many ways of creating a ‘buzz’ and the boost of (relevant) traffic that all website owners want.

The Key to Top Class Traffic

If you own a business which has a website, you will, I am sure, have been inundated with telephone calls and emails promising you top rankings on Google, sometimes for little cost. You will also have seen countless bits of software that will boost your site, often, they say, at the touch of a button….

Some of these claims will be made by bona fide companies, and some of the software – particularly research software – can be useful, but what few of them will tell you is that it is the CONTENT of the site that will win the day, both for getting traffic and for converting your visitors into customers.

The True Power of Content

So why is content so very important? This may seem a strange question, but many site owners do not give it much attention. They spend a great deal of time discussing format and presentation, but often give scant regard to the content the pages are to hold. Doing things this way is simply not going to work: there is nothing for Google to get its ‘teeth’ into, so the rankings are poor, and when (and if) a potential customer arrives, there is nothing to ‘make’ them want to buy, or at least take the relationship any further.

The correct way of approaching any market place (and the keyword market place of the web is no different) is to see what people want. When it comes to Google, this means finding out which phrases people use in your market sector that relate to your products, and thus to the pages of your site that will be selling those products.

Reverse Engineering and Latent Semantic Indexing

Then you can start writing content that uses those phrases (and similar words, using a technique called LSI, or Latent Semantic Indexing – this being vital as Google gets cleverer and cleverer). You can even reverse engineer the top sites in Google for a given phrase, thereby TELLING you what words to use.
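For the technically minded, here is a rough Python sketch of the reverse engineering idea. It assumes you have already collected the URLs of the top ranking pages by hand (the ones below are placeholders), and it uses a simple shared-term count as a crude stand-in for true LSI – enough to show the principle:

# Crude 'reverse engineering' sketch: count the most common terms on pages
# that already rank well for a phrase. A simple term-frequency count stands
# in for true Latent Semantic Indexing; the URLs are placeholders.
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

TOP_RANKING_URLS = [
    "https://example.com/top-result-1",  # placeholder
    "https://example.com/top-result-2",  # placeholder
]
STOPWORDS = {"the", "and", "for", "that", "with", "this", "are", "you"}

def page_terms(url):
    """Fetch a page and return its visible words, lower-cased."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    return [w for w in re.findall(r"[a-z]+", text.lower())
            if len(w) > 2 and w not in STOPWORDS]

counts = Counter()
for url in TOP_RANKING_URLS:
    counts.update(page_terms(url))

# The most frequent shared terms suggest the vocabulary Google expects to see.
for term, n in counts.most_common(20):
    print(f"{term}: {n}")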

Pages written this way will not only give Google what it wants, but will also give the visitor the information that will enable them to decide if your product is for them or if you can help them solve the problem that drove them to search for help in the first place.
This content ought to include images, videos, flow charts and anything else that will help them to make a good decision (which hopefully means doing business with you).

The OTHER reason for TOP QUALITY CONTENT

Saying all this, content has another VITAL job to do in the battle for traffic and sales. Having good content will mean that others will link to the site and mention it in their Social Media postings, after all they will have good reason to, they will have something WORTH SHARING.

[Image: ‘Content is King’ internet concept]

But it is not always easy getting people to notice how good your copy is. The whole thing is a bit of a ‘chicken and egg’ situation. After all, your fantastic copy can’t get links until someone finds it and reads it, and without rankings or some form of Social Media chatter, no one will ever know it is there.

Priming the Pump (and keeping the pressure up too)

This is where SEO, Paid Search and Social Media come into play. By using all or some of these systems, website owners can start the ball rolling so that people can see just how good they, and their all-important copy, are.

Great Copy Required

This is where the need for top-rate copy comes in. Even though the SEO and Social Media work above will bring in the visits, if there is nothing there to grip the audience, all the time and effort will have been wasted: no sales will be made and, perhaps as importantly, no one will find anything worth coming back for, mentioning or linking to.

Without these mentions and links, Google will not get the signals it wants in order to give the site progressively better rankings, and thus more effort is needed to keep things going. If, however, there is something to ‘write home about’, then the links will come in, your site / product will be mentioned on Social Media, and what is more, you will get repeat visits.

Are you Getting Returning Visitors?

This is just one area where Google Analytics can help. Having lots of New Visitors is good, but a low percentage of Returning Visitors indicates that your site is not delivering and people are not coming back for more.
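The sum itself is trivial, but worth keeping an eye on. A tiny sketch in Python, with made-up session counts and an assumed 25% benchmark (pick a figure that suits your own market):

# Work out the returning-visitor share from Analytics session counts.
# Both figures and the 25% benchmark are made-up assumptions.
new_sessions = 4_200
returning_sessions = 800

total = new_sessions + returning_sessions
returning_share = returning_sessions / total * 100
print(f"Returning visitors: {returning_share:.1f}% of {total} sessions")

if returning_share < 25:  # assumed benchmark, not a Google figure
    print("Low returning-visitor share - take a long hard look at the content.")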

 

[Graph: are you getting a high enough level of returning visitors?]

If your site is one with a poor level of returning visitors, then a long hard look at the content is a must…

The Different Processes of SEO

The term Search Engine Optimisation covers a host of things, usually divided (basically) into Technical, On Page, Off Page and Social Media.

Technical SEO

This covers the way a site is built: how fast it is, how easily it can be read and navigated, as well as topics like Rich Snippets and Schema. Some of this is easy to do, some of it a bit more difficult, and not all website developers or SEO professionals cover all of these areas.

On Page SEO

Here we are talking about the words on the pages and the placement of the ‘target keywords’ in the important places on a page, all with a view to ensuring that Google finds what it needs in the ‘appropriate places’ on a site’s pages. Of course this involves the copy / content of a site, but the copy itself is not SEO. SEO is how you make sure that the copy is found, not the actual copy itself (except that SEO will help you find out what to talk about in the first place, via the Keyword / Market Research phase).

Off Page SEO

This area covers the issue of Links (and to a degree Social Media). It is these ‘signals’ that attract Google’s attention and that will get the rankings and ‘seed’ traffic needed. Creating links also helps to keep the pot boiling whilst the site builds up its momentum.

Social Media

This is included here because, even though it is nothing to do with SEO per se, it is important when considering the process of getting the site, brand and product noticed and talked about in a way that will enhance the site in Google’s eyes and extend its reach beyond the Search Engine results.

The Basic SEO process

In all cases it is necessary to carry out the keyword research so that you can target the phrases that are relevant to your market place AND are being used today.

The site must then be built the right way (the Technical SEO bit), and the copy created. This should be of a high quality, though it does not have to reach the standard needed for ‘Top Notch SEO’, for reasons that will become apparent later. Things such as internal linking should of course be carried out, but basically this is a ‘quick’ method of SEO.

Then the link building starts. The links are built in the right way and at the right speed, using techniques like ‘Power Link Structures’. Social Media signals are also created using this method. In some cases the pump is also primed by actually driving a small amount of traffic to some of the articles and posts that form a part of the linking structure.

Guest Posts Are Used in the quick method too

Guest posts are included in the ‘quick’ method too, of course, but they are used differently. As you will see later on, the ‘proper way’ of placing Guest Posts is to find a top Influencer site, chat to them and get them to accept the post (or pay a lot of money for the privilege). However, this process is a LOT more expensive than just placing an article on a relevant site, so for those clients with limited budgets this is the way we go. Basically, these Guest Posts are ‘link vehicles’, and as long as they are well written (no article spinning here at SOM) and contain links that are not going to trigger a Penguin penalty, they do help; we have many examples that prove the point.

Carrying out SEO in this manner DOES work and is the way the majority of SEO companies work.

The Top Quality SEO process

If you talk to those SEO professionals who practise only the whitest of white SEO, they will say this is the only way, everything else being a waste of time. I disagree with that, but there is no doubt that this process is superior and offers a greater chance of success. BUT it is a lot harder, and thus more expensive in time and money.

This process includes all the On Page SEO that Basic SEO requires, including things such as having explanatory ‘Category Pages’ for Ecommerce sites. These are needed because most sites of this type have lots of product pages that (a) often use the same words as a host of other sites and (b) are often far too short. These Category Pages allow the owner to present the products they sell, with links of course to the product pages themselves. Such pages can be much better at getting rankings, and their use should therefore be seriously considered at all levels of SEO.

Power Pages

Remember, this whole process is based on having TOP QUALITY content on your site. Such pages are often called ‘Power Pages’, their contents varying from ‘how to do something’ guides to a great infographic – anything that would be interesting to visitors and has not been done before (or at least not done sufficiently well).

Text-based Power Pages need to be around 2,000 words long and contain images, videos and links to other authoritative content on the web, PLUS of course links to the areas of your site that you want people to see and to the pages that will result in conversions and sales. Infographics can stand alone, but having some words on the page as well can help in my opinion (just as having a transcript of the words used in a video can).
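As a rough and ready way of checking a draft Power Page against these targets, something like the Python sketch below can help. The URL is a placeholder, and the thresholds are this article’s rules of thumb, not anything laid down by Google:

# Rough audit of a draft Power Page: word count (~2,000 target), images,
# embedded video and outbound links. The URL is a placeholder.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def audit_power_page(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    words = len(soup.get_text(" ", strip=True).split())
    images = len(soup.find_all("img"))
    videos = len(soup.find_all(["video", "iframe"]))  # embeds count too
    host = urlparse(url).netloc
    outbound = [a["href"] for a in soup.find_all("a", href=True)
                if urlparse(a["href"]).netloc not in ("", host)]
    print(f"words={words} (target ~2000), images={images}, "
          f"videos={videos}, outbound links={len(outbound)}")

audit_power_page("https://example.com/power-page")  # placeholder URL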

What to Create the Power Page About

The Keyword Research for the site would of course have been carried out first, so the target phrases are known and understood. Using these words, the bulk of the site, (the ‘normal’ pages) will be written and optimised, this including interlinking relevant pages (Google likes this).

Power pages, however, have a different mission. Their job is to get noticed BIG TIME, to become a fount of knowledge and a ‘go to’ source of information on a particular subject (relevant to the products and services of the hosting website). With this in mind it is easy to see that the very first thing you have to know is what subject to write about.

Research into Trends (or try to start one yourself)

This is where checking on trending posts and web pages can be a great help, as it allows you to see what people have become interested in over a period of time (which you can set). You can then have a look at these posts / pages and use them as the basis of your own work, safe in the knowledge that people are INTERESTED in the topic.

[Screenshot: finding trending topics]

Of course you can also plough your own furrow, choosing a topic that is relevant to your market place, for instance ‘What is the History of Plastering’ or ‘How to Choose the Right Lawnmower’. There are countless topics to choose from. Besides Kudani, you could also use Buzzsumo.

Writing the Page

Either way, you can start your research into what to talk about, and that will mean looking not only at the trending sites, but also at all the top INFLUENCER sites, in this instance the ones that are at the top of the Search Engines’ results for some top terms.

All the while it is vital to make sure that the page will be ‘entertaining’ and fulfil one of its main purposes, that of being WORTH SHARING.

Note, for a great definition of what an Influencer is, click this link.

Supporting Guest Posts

One important part of this SEO process is to make sure that there are links to the Power Page from trusted sites, but as it can take some time to get an INFLUENCER to mention the page or allow a Guest Post on their site, the first thing that needs to be done is to place a well-written ‘taster’ post on a high Domain Authority site.

Thus one post (perhaps more) is written and placed on some relevant sites. In most instances this means paying a ‘publishing fee’. Here I must state that there are some SEOs who think that placing a post on a site known to take money for the privilege is worthless. However, when you know that high-profile sites like the Huffington Post take money for Guest Posts, you can see that their argument holds little water.

Converting the Influencers

This starts at the website level, where selected sites are contacted with a view to them mentioning the Power Page or hosting a Guest Post. The post need not contain a DoFollow link, as we are after traffic as much as link juice, but if they will allow a DoFollow link then all the better.

It is best if these influencers have been contacted and nurtured for some time before you make a request to place a Guest Post on their site (this also being the case with Social Media Influencers).
Hopefully one of the sites you contact will allow the publication and thus provide you with a link and the potential for a lot of relevant traffic.

Create a Press Release

Press Releases are a well-known way of creating a ‘buzz’ in a manner of which Google approves. All the syndicated copies of a release are the same, of course, but as the links are always NoFollow this does not matter. Google, it is said, really loves press releases, so one pointing at your Power Page, telling everyone how interesting it is and why they should not miss it, is a good idea.

Posting on your Own Social Media Channels

Presuming you have some Social Media accounts, now is the time to start posting about the Power Page (although maybe you have been trailing its arrival for a few weeks already – another neat trick). Remember that you will have to post again and again here, Social Media posts being, for the most part, short-lived: they are soon replaced by the next tweet and scroll off people’s screens. This makes choosing the right time to post important too.

Contacting the Social Media Influencers

Now is the time to start contacting the Social Media Influencers. There are various ways the leaders in a field can be found and, once they are found, the ‘nurturing process’ needs to be continued – ideally the SEO Agency in question will have commenced it some time before the Power Page is posted.

The idea here is to mention that they may be interested in the Power Page’s contents, perhaps also mentioning the Guest Posts that have already been posted and the Press Release. All of this with the aim of getting them to ‘add their weight’ to the campaign.
This is important, as if they can be convinced to mention the power page on their Social Media accounts, the ripples will build and build, all resulting in more traffic and higher rankings.

Monitor and Interact

Hopefully you will have had some comments on your Social Media channels and on the Guest Posts (where the sites allow). It is VITAL that you monitor these and respond as that will only strengthen the whole campaign.

In Conclusion

So there we have it, a brief summary of what SEO is, and how the two main types differ. Hopefully you can see the differences between the two approaches and can understand why SEO carried out ‘by the book’ is such a long, complicated and thus expensive process.

The good news for businesses with shallower pockets is that ‘Basic SEO’ does work in most markets; you just have to approach any highly competitive areas in a cleverer manner, and not charge headlong in after top rankings for highly competitive keyword phrases.

Tips for Protecting Your Website from Hackers that Use SEO Keywords for Spreading Malware

There’s a growing trend for hackers to spread their malware by infecting websites that rank well in the search engines for certain keywords, using those rankings to attract lots of visitors who then become victims of ‘drive-by downloads’.

If you want to protect your website from those hackers, then you will need to be proactive with your security. Many websites are breached not because a hacker specifically targeted them, but simply because they were vulnerable – the hackers get a list of websites that rank well for given keywords, then use software to see if those websites are vulnerable to generalised attacks. If they find a site that is vulnerable, then they’ll ‘break in’ and infect it with their malware.

WordPress and Magento are two of the most popular platforms for business websites – they are used for blogs and content sites, and for online stores. Because they are so popular, they get a lot of attention from malicious developers and users, who know that if they can find a security hole they can exploit it for financial gain – or just for fun. WordPress, in particular, is quite an ‘open’ platform in that anyone can just develop plug-ins and themes for it and distribute them without them having to undergo extensive checks. This means that there are a lot of plug-ins out there that are not well written, and that are riddled with potential holes for hackers to exploit.

If you run WordPress, Magento – or any other online content management system or store platform for that matter – then you should look at ways of securing it. For WordPress, that means installing plug-ins to block repeated failed login attempts, renaming the admin account, keeping the main WordPress installation up to date, keeping your plug-ins up to date, and removing any plug-ins and themes that you are not using. You should also delete the installation directory once you are satisfied that the installation was successful.

The same goes for Magento. It’s important that you remove the installation directory, change the admin path, and rename all the users to something hard to guess. Keep the platform itself patched up to date, and keep all your extensions patched as well. This will go a long way towards ensuring that the platform runs well and is secure.
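If you want a quick way of spotting the most obvious leftovers from the outside, the Python sketch below probes a few typical installer paths. The domain and the paths are illustrative assumptions (installs vary between platforms and versions), so adjust the list to match your own set-up:

# Check that common installation leftovers no longer respond with HTTP 200.
# The domain and paths are illustrative assumptions - adjust for your set-up.
import requests

SITE = "https://example.com"  # placeholder domain
LEFTOVER_PATHS = [
    "/wp-admin/install.php",  # WordPress installer script
    "/downloader/",           # Magento 1 Connect Manager
    "/setup/",                # Magento 2 setup directory
]

for path in LEFTOVER_PATHS:
    status = requests.get(SITE + path, timeout=10,
                          allow_redirects=False).status_code
    flag = "EXPOSED?" if status == 200 else "ok"
    print(f"{SITE + path}: HTTP {status} [{flag}]")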

If your site does get hacked, the first that you know of it might be when a malware warning pops up when you visit the site, or when you see a warning against it in the search engines. If that happens, your first priority should be removing the malware, then fixing the exploit that caused it. Once your site is clean – and not likely to be immediately re-infected – then you can look at telling the search engines that it’s fixed and asking them to remove the warnings against your site.

Removing malware is an involved process that takes some technical knowledge – and in some cases the vulnerability is something that only the web host can fix, not an end user. So, for this reason, it’s a good idea to hire a web developer to look at your site for you – or a security expert. It can be quite expensive to fix these issues, so try to prevent them from happening in the first place! Following best practices from day one is much easier than trying to fix the issues with your site after an infection has cropped up, and takes less knowledge too.

 

The security protocol used to protect the vast majority of wifi connections has been broken, potentially exposing wireless internet traffic to malicious eavesdroppers and attacks, according to the researcher who discovered the weakness.

Mathy Vanhoef, a security expert at Belgian university KU Leuven, discovered the weakness in the wireless security protocol WPA2, and published details of the flaw on Monday morning.

“Attackers can use this novel attack technique to read information that was previously assumed to be safely encrypted,” Vanhoef’s report said. “This can be abused to steal sensitive information such as credit card numbers, passwords, chat messages, emails, photos and so on.”

Vanhoef emphasised that the attack works against all modern protected wifi networks. Depending on the network configuration, it is also possible to inject and manipulate data. “For example, an attacker might be able to inject ransomware or other malware into websites.”

The vulnerability affects a number of operating systems and devices, the report said, including Android, Linux, Apple, Windows, OpenBSD, MediaTek, Linksys and others.

“If your device supports wifi, it is most likely affected,” Vanhoef wrote. “In general, any data or information that the victim transmits can be decrypted … Additionally, depending on the device being used and the network setup, it is also possible to decrypt data sent towards the victim (e.g. the content of a website).”

Vanhoef gave the weakness the codename Krack, short for Key Reinstallation AttaCK.

Britain’s National Cyber Security Centre said in a statement it was examining the vulnerability. “Research has been published today into potential global weaknesses to wifi systems. The attacker would have to be physically close to the target and the potential weaknesses would not compromise connections to secure websites, such as banking services or online shopping.

“We are examining the research and will be providing guidance if required. Internet security is a key NCSC priority and we continuously update our advice on issues such as wifi safety, device management and browser security.”

The United States Computer Emergency Readiness Team (Cert) issued a warning on Sunday in response to the vulnerability.

“The impact of exploiting these vulnerabilities includes decryption, packet replay, TCP connection hijacking, HTTP content injection and others,” the alert says, detailing a number of potential attacks. It adds that, since the vulnerability is in the protocol itself, rather than any specific device or software, “most or all correct implementations of the standard will be affected”.

The development is significant because the compromised security protocol is the most secure in general use to encrypt wifi connections. Older security standards have been broken in the past, but on those occasions a successor was available and in widespread use.

Crucially, the attack is unlikely to affect the security of information sent over the network that is protected in addition to the standard WPA2 encryption. This means connections to secure websites are still safe, as are other encrypted connections such as virtual private networks (VPN) and SSH communications.

However, insecure connections to websites – those which do not display a padlock icon in the address bar, indicating their support for HTTPS – should be considered public, and viewable to any other user on the network, until the vulnerability is fixed.

Equally, home internet connections will remain difficult to fully secure for quite some time. Many wireless routers are infrequently if ever updated, meaning that they will continue to communicate in an insecure manner. However, Vanhoef says, if the fix is installed on a phone or computer, that device will still be able to communicate with an insecure router. That means even users with an unpatched router should still fix as many devices as they can, to ensure security on other networks.

Read more: https://www.theguardian.com/technology/2017/oct/16/wpa2-wifi-security-vulnerable-hacking-us-government-warns

 

Research into the FRED Google update, confirming why sites lost rankings.

The Fred Update by Google caused quite a ripple in the SEO world, with many sites losing ranks, and hence traffic – by up to 90% in some cases. I have been doing quite a bit of digging and have asked some Gurus some pointed questions about what has happened and why.

The overall thoughts on the matter are that Google penalised sites that had poor content, or ones that were simply there to make money and not give anything back to the visitor in the form of useful data or information.

User Experience is Another Factor

Other thoughts on the matter were more to do with the User Experience that a page gives its visitors. Here the sites said to have been hit included those that placed the copy below the fold of the screen or, in some cases, had very slow load times.

However, in some cases sites were hit that were not just ‘out to make money’, but that seem to have been ‘lumped in’ with those that do because of the lack of content on their page.

Having a Lot of Links Did Not Save Sites

There was also talk that FRED checked the quality of the links to sites too. This may turn out to be the case; further research is needed on this matter. However, what we can say is that sites that fell foul of FRED’s On Page quality checks were not saved by having a lot of links. Instead, their positions were taken by sites with inferior linking profiles, at both Page and Domain level.

This research only covers 9 sites, so it can hardly be said to be definitive, but the evidence so far is pretty consistent. The next step is further research into the sites that were affected but did not fit the profile of sites that ‘should have been affected’ by Fred. More on the ‘efficiency’ of Fred later.

The FRED Data

In each case, the sites that held a first page rank for a given term before Fred were compared with the sites that hold the first page now (for that term). The sites that had lost their first page rank (having held a position of 7 or better pre-Fred) were then checked, with a view to seeing what could have caused them to lose their rank and whether this fitted the profile of sites that Fred ‘should have hit’.

The phrases checked covered a range of topics, ranging from ‘iqf fruit’ to ‘chemical companies’, so should be diverse enough to give some firm data.
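For those who like to see the mechanics, the comparison itself is simple enough to sketch in Python. The sites and positions below are hypothetical; the point is the method of diffing the pre and post Fred first pages:

# Compare first-page results for a phrase before and after Fred and list
# the sites that held position 7 or better but then dropped off.
# All of the data below is hypothetical.
pre_fred = {
    "site-a.example": 3,
    "site-b.example": 5,
    "site-c.example": 7,
    "site-d.example": 9,
}
post_fred = {
    "site-a.example": 4,
    "site-d.example": 2,
}

for site, pos in sorted(pre_fred.items(), key=lambda kv: kv[1]):
    if pos <= 7 and site not in post_fred:
        print(f"{site} (was #{pos}) lost its first-page rank - check its copy")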

Search Phrase ‘iqf fruit’

Before and After FRED

[Image: the Google results for ‘iqf fruit’ before and after FRED]

Here two sites lost their first page rank:-

[Screenshot of the first site: not enough text for FRED]

This site had lost its rank of 5, and when checked, we saw that the actual page shown when you clicked the link was https://www.thespruce.com/what-does-iqf-mean-995719 – a page not even on the stated domain, something that is sure to annoy Google to start with. Furthermore, the page had very thin content and seemed to be there only to provide a place for Google Ads and other advertisements. Being a prime target for Fred, it is not surprising to see that it was hit.

[Screenshot of the second site, fruitbycrops: content too thin]

Again a site with very thin content: just 155 words, with an advert at the very top. Again, a prime target for Fred.

 

Search Phrase ‘chemical companies’

Before and After FRED

 

[Image: the results for ‘chemical companies’ before and after FRED]

Again two sites affected:-

[Screenshot of the first site]

This is a big website with a lot of links – some 222,000 to the domain (although only 3 to the page in question). The reason it lost its ranks seems to come down to the fact that the page was just not relevant enough, the topic being covered by just one short item on the page.

[Screenshot: an example of a penalised site. Was it penalised because its copy was not ‘good enough’? That seems the most likely explanation.]

Another page that held just a small amount of what I would call ‘filler text’, not really ‘saying anything’, at least in my view, the total length being just 251 words. Again, a prime target for the Fred update.

 

Search Phrase ‘welding supplies uk’

[Image: the Google results for ‘welding supplies uk’, pre and post the Fred update]

Two sites here:-

[Screenshot: the Weldingshop site, one of many hit by the Fred update]

This site is not that bad in reality, although some may think it a bit old fashioned, and it is certainly not as bad as many that do hold onto first page ranks. The most likely cause of the page’s loss of rankings is that the main copy is only 340 words long, which leads me to believe that the length of the copy was considered below the ‘satisfactory’ level laid down in the Google Quality Guidelines.

[Screenshot: too little copy, and what there is sits below the fold – possible reasons for the site being hit by FRED]

This page lost a rank of 7, the amount of copy again being the likely cause of the drop: only 270 words are on the page, and the copy also sits below the fold, a factor that Google stated (back in 2012) degrades the value of any copy.

Search Phrase ‘metal fabricators’

[Image: the Google results for ‘metal fabricators’ pre and post the FRED update]

Three sites had lost their ranks for this phrase:-

[Screenshot: too few words for FRED? Another site hit, more than likely due to the small amount of copy]

Yet another page that lost its ranks, apparently down to the lack of content, the copy amounting to just 154 words.

[Screenshot: a page with over 600 words, but with the text below the fold – a possible reason for a Fred hit]

This site had a rank of 4 before Fred, and does have a fair number of words, over 600 in all. However, 90% of it is below the fold on the screen and this looks to be the reason for the drop.

[Screenshot: yet another site hit by Google’s Fred]

This page lost its 6th position, it again being a ‘low volume of copy’ casualty, the length of copy amounting to just 170 words.

 

Conclusion

In all cases we can see that the sites affected by Fred did seem to fit the patterns suggested by the Gurus and by other research, in that they mostly had very thin copy or ‘hid’ their copy below the fold of the page.

The next step is to see whether the pages we currently look after, SEO-wise, that also suffered a drop in rankings fit this pattern too.

Watch out for another report on this later in April.

How Should You Position Your Web Content?

We were approached by Tracy at UKWebhostreview.com and asked if we would like to feature an infographic on how to position web content on a site to get the very best effect. This has always been an important topic, BUT, after the Google Fred Update, anything that improves the User Experience is something that deserves serious consideration. So, we were more than happy to host this post and hope that you find it as useful as we have.

Guest Post from UKwebhostreview on Positioning Web Content

If you’re asking this question then you are already thinking a lot more deeply about your online marketing than a large proportion of website owners. People can often get caught up in getting a website set up quickly, or concentrate on which web host to go for and the other aspects involved in website building.

[Infographic on how to position web content, supplied by UKwebhostreview]

When this happens, some of the other important considerations like content positioning can be neglected, which will result in a website that isn’t as effective as it should be. When we talk about website effectiveness, the key measure that most people will be interested in is driving increased customer sales. If you are setting up a business website then one of your main priorities should be to get the positioning right on your website. This can seriously be the determining factor in how many sales your business is making, so should be treated as a top priority for you.

If you’re not an expert in developing content or positioning content for maximum effect, then you will probably find this infographic from James at UKwebhostreview.com of great use. It lists the 25 features that every online business must have in 2017, so as you can probably tell from the title it is a very comprehensive list. It shows you exactly where to add your key features like call to action button or logo with tagline. You can also use the list of features to check that you have remembered to include every essential item of content that a good website requires.

Whatever stage of website set up you are at, whether you are only just beginning or you have had your website set up for some time, you should use these 25 features as a guideline for how to structure your website content to drive the best results.

95% of websites are HURTING their Own Google Rankings

We have checked hundreds of websites over the years and the sad fact is that 95% of them are actually doing things that will make it harder (or impossible) to get rankings on Google.


Is Your Site One of the 95%?

The question that you (as a business website owner) might well be asking is: ‘Is MY site one of the 95%?’ Of course, you may not be bothered, thinking that your site’s ‘job’ is just to ‘be there’ when someone wants to check up on you. But that is really a waste; your site could be doing so much more than just sitting back, waiting for the occasional visitor…

Brochure Sites

Brochure sites are sites that are just meant to act, well, as an online brochure: a means of imparting information about a business to anyone who is interested. They are often visited by people who, having heard about a company (maybe they met someone at a networking event?), want a bit more information before they contact them for a quote etc.

A Wasted Marketing Opportunity?

This is a good way of using the power of the Internet (it saves on a lot of brochure printing for a start), BUT is it also a wasted opportunity? The thing is, here you have a website full of (hopefully) interesting stuff about your business, the services that you offer and ‘what makes you special’, and yet no great effort is being made to get more people to read it all. That must be a wasted opportunity, as any one of those visitors (that the site is not getting) could be a potential customer…

So What Are These Sites Doing Wrong?

The fact is that there are many ways business sites are ‘getting it wrong’ when it comes to getting Google to ‘like’ them, and thus give their pages a prominent position for a given search term. Some of these are quite basic mistakes too, and could easily be fixed with a few clicks (and a little bit of thought).

Some Examples of the Mistakes Sites Make

The Title Tag

You may not notice this one (although Google always does), as it is a bit hidden, but if you take a look at the top of your Internet Browser window, you will see the ‘Title’ information for the page you are looking at. In many cases you will see words like ‘Home’ or ‘About Us’. Whilst not incorrect (you would indeed be looking at the Home or About Us page), they are not very informative to the very ‘person’ you really want to impress, and that of course is Google.

Think about it: would not a phrase like ‘IT Support Services | Computer Repairs’ ‘tell’ Google a bit more than the word ‘Home’? It really is a no-brainer, and so very easy to fix….
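Checking this takes seconds. Here is a minimal Python sketch (the URL is a placeholder) that fetches a page and flags the generic titles mentioned above:

# Fetch a page and flag a missing or generic Title tag. URL is a placeholder.
import requests
from bs4 import BeautifulSoup

GENERIC_TITLES = {"home", "about us", "contact us", "blog"}

url = "https://example.com/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
title = soup.title.string.strip() if soup.title and soup.title.string else ""

print(f"Title: {title!r}")
if not title or title.lower() in GENERIC_TITLES:
    print("Generic or missing title - tell Google what the page is about.")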

The Meta Description

When you look at a page you don’t even see this (not even at the top of the Browser); it is only visible in Google’s search results, under the Title and URL of a site. This might make you think that it is worthless from an SEO point of view, but you would be wrong. It is true that the words in the Description do not carry a lot of clout SEO-wise, but if you leave the field empty or use the same text on many pages, you run the risk of making the site appear ‘lazy’ to Google, and that ‘black mark’ could make all the difference when Google has to decide which site to list for a phrase you want to be found for.

Again, a few clicks on the keyboard can make the problem go away.

The Elevator Speech

Another thing you should bear in mind is that a good Description can make all the difference when it comes to getting that all-important click from the Google search results. Think of this 160-character text block as your ‘elevator speech’ and create one that would make someone just have to click through to your site, as it is only then that you get the chance to start the dialogue that could result in a sale or enquiry.
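A quick way of auditing Descriptions across a handful of pages – flagging missing, over-long and duplicated ones – is sketched below in Python. The URLs are placeholders, and the 160-character limit is taken as a working assumption:

# Audit meta descriptions: flag missing, over-long (>160 chars) and
# duplicated ones. The page list is a placeholder.
import requests
from bs4 import BeautifulSoup

PAGES = ["https://example.com/", "https://example.com/services"]  # placeholders

seen = {}
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = (tag.get("content") or "").strip() if tag else ""
    if not desc:
        print(f"{url}: MISSING description")
    elif len(desc) > 160:
        print(f"{url}: description too long ({len(desc)} chars)")
    elif desc in seen:
        print(f"{url}: DUPLICATE of {seen[desc]}")
    seen.setdefault(desc, url)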

The Header Tags

This is another of those things that you will probably not have noticed (and yes, you guessed it, Google is looking at this too), other, that is, than the text looking a bit bigger. But why is the correct use of Header tags important? To explain this I need to give you a bit of a history lesson, starting with the way documents are constructed. This goes back to the time when newspapers were laid out using lead type: the editors had to be able to let the people laying out the type know which bits were the most important, that is, which words (like the headlines) needed to be big. This was all done using a ‘Header Tag’ number ranging from 1 to 6 (or something similar).

This rule set was reused when the code that describes how a page is displayed on word processors and screens was written, again being used to control how words would be displayed. This in turn fed through to the languages that control printers and, most lately, to the language that controls how web pages are rendered by Browsers, this of course being HTML.

The Advent of CSS Styles

In the early days of the Internet there were in fact only a few ways you could control how big the words on a page were, these Header tags being one of them. Today of course you can control the font, size and colour of the text on your web pages using CSS Styles, but the importance of the Header tag lives on, as Google still uses these tags to work out which words on a web page it should take more notice of – something that is vitally important when trying to get your page to the top of the results.

A Problem With Web Designers

It must be said that most sites do use these Header tags, but the problem is they are often used incorrectly, the majority of web designers still using them to control the size of text, often compounding the issue by applying them to such terms as ‘Home’, ‘Contact Us’ or ‘Blog’. Highlighting words like these to Google is useless; it is far better to use the tags to point out to Google the words that you want to be found for, like ‘IT Support Prices’ or ‘Best Anti Virus Software’.

Putting this right is a little harder than both of the above, but it is still not that big a job and makes your site that bit better in Google’s eyes and thus that bit more likely to get a good listing in their results.
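To see exactly which words your pages are highlighting to Google, a small Python sketch like this one (placeholder URL again) lists every Header tag so that generic labels stand out:

# List every h1-h6 on a page so generic header text stands out.
import requests
from bs4 import BeautifulSoup

GENERIC = {"home", "contact us", "blog"}

url = "https://example.com/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    text = tag.get_text(" ", strip=True)
    note = "  <- generic, wasted on Google" if text.lower() in GENERIC else ""
    print(f"{tag.name}: {text}{note}")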

Links – The Popularity Voting System of the Internet

Whilst the majority of the power that links bestow comes from links to a site from other sites (so-called ‘backlinks’, as they link back to you), the links FROM a web page to other sites and the INTERNAL links within a site are also important. The first tells Google that you are a part of the community that makes up your market place (as well as pointing it at some other valuable resources, which Google likes to see), whilst the second type helps Google understand what each of your pages is about, as well as helping people move about your site. As Google rates sites that offer the best ‘user experience’ higher than others, such internal links can only help.

Incoming Links

Whilst the links to a site cannot be put right by making changes to the site itself, they are a vital part of the ‘battle’ to get a site listed on Google, accounting for about 40% of the marks that Google allocates when deciding which site to list for which term. However, the fact is that the majority of sites either don’t have any (or enough) links, or have the wrong sort. Both of these can really hinder a site’s chances of getting a first page (or any) ranking. Fixing them can take a long time and a lot of work though, and has to be done very CAREFULLY.

 

SEMANTIC SEO and the Words on the Page

Semantic SEO is all about making sure that Google understands what a site is all about, thereby ensuring that its ‘meaning’ is fully comprehended. This is easier to do than you might think, the major thing to get right being to use the right words on the page. The right words, of course, are the words that Google wants to see. The good news is that Google will tell you what these words are; all you have to do is ask in the right way, this being done by ‘Reverse Engineering’ the top pages on Google…

Writing the Right Copy

Armed with these words and phrases, and a good understanding of the subject (it helps if you are a genuine expert), you can then write the right copy, adding some images and, if you can, audio and video components as you go. Sprinkle in some internal and external links at the same time and you have gone a long, long way towards cracking this particular nut.

 

Polishing the Spitfire

You may not believe it, but it is said that back in World War 2 they used to polish the photo-reconnaissance Spitfires (as well as painting them pink so that they were harder to spot in the dawn or dusk skies) just to gain a few mph – something that could make all the difference, life or death in this instance, when being chased by enemy fighters.

If you follow the guidance above and fix any of the items mentioned, it will in effect polish your website a little, perhaps gaining just enough extra speed to get your site onto Page 1 of Google, and thus the extra traffic that could make all the difference to your business.

 

Need Help With the Polishing?

However, if you need help with the polishing, even if it’s just some assistance in finding out what bits to polish the hardest, please do give us a call. We are here to help and offer a lot of free advice and assistance.

WHAT IS SCIENTIFIC SEO?

First a bit of history about Search Engine Optimisation

SEO can trace its history way back to 1994, when the early pioneers discovered that they could use the Internet to drive traffic to their sites and hence sell their goods. As the idea became more accepted, people started competing with each other for traffic, and that meant they had to ‘convince’ the Search Engine of the day to list their site for the appropriate terms.

The Search Engine of the Day has changed over the years, Alta Vista, Ask Jeeves and Yahoo all having been top dog at some time. Today, however, the big player is Google, and thus that is the engine everyone wants listings on – which of course means you have to understand the rules.

 

The Rules of The Old SEO

The rules that the Search Engines use have altered drastically over the years as they have become more and more sophisticated. At the start, it was easy to ‘trick’ the Engines: all you needed to do was stuff the pages with your keywords and get some links to the site (Google’s first stab at ranking was based on something called PageRank, which is basically all about the number of links to a site – and not much else).

These ‘old’ rules however had one big problem, in that the SEO professionals of the day kept finding ways around them and thus the Engines had to keep taking steps to close these ‘holes’ in their rule sets.

This process escalated over the years, especially after 2010, until Google decided that enough was enough and settled on a whole new approach, one that could not be tricked and relied on one thing: perceived quality.

 

The New SEO and Perceived Quality

Today, with the advent of something called ‘SEMANTIC SEO’ (to do with the meaning of a site – what it is really all about), things are a lot different, it now being all about the quality of the content of a site.

But Why use the term Perceived Quality?

I use this term as I believe that there are limits to what Google can do, in that its computer algorithms cannot ‘really’ decide what is genuine quality content and what is not. Also, as mentioned above, links had, and still have, a vital role to play in how Google decides which site to list for what, but it cannot always tell if those links are ‘real’ or have been manufactured. Thus in all cases Google looks at a page / site and decides (using its rule sets) whether it is quality or not.

This is why I say it is the quality that Google perceives in a site that is important. So how can you convince Google that your content is good enough to get a top ranking?

The Rules of the NEW SEO in Detail

Many changes have taken place in the world of SEO since 1994, but all of them come back to four things, one of which has only recently come to the fore.

The Four Things SEO is and was Based Upon

Site Construction

The way a site is built is important: if it is constructed in the wrong way then Google cannot (or may just not be bothered to) find all the pages in a site. Also, if the site is built in such a way that it is very slow, or is not mobile friendly, Google will downgrade it in various ways.

One thing that does not cause so much of a problem today is that of the ‘Code to Text’ ratio (the amount of code that is used to build a site versus the number of words visible to the visitor). In the old days, too much ‘construction code’ was an issue, but today, with the advent of WordPress and the like, Google has been ‘forced’ to ignore this area, virtually all sites being very code heavy.

You MUST however ensure that the site can easily be navigated, a failure in that department being very serious indeed. Plus you should also use a fair number of internal links (not just the navigation) to highlight to Google what each page is about.

Words, Pictures and Videos

This is the area most affected by the new SEMANTIC SEO, it being vitally important to use all the ‘right’ words in a page. Gone are the days of just stuffing a page with the words you want to be found for. Today you need to understand what words Google wants to see and then make sure you include them in the copy, also making sure that you include pictures and where possible audio and video content on the page.

Reverse Engineering is the Key

This is where reverse engineering can help, the idea being that if you know what words are being used on the top pages (for a given term) then by including them (using correct grammar of course, as this is also checked) you must be getting closer to the perfect page.

Links

In the early days of SEO, links were vitally important; in fact they could, all by themselves, get a page listed. Today, however, things have changed a lot. Links are still important, accounting for some 40% of the reason a site gets a rank, but they are not as all-powerful as they used to be.

Google is Watching You

Besides not being as important as they used to be, the links to a site are now carefully checked by Google. Their aim? To make sure that the links to a site are ‘natural’ and not all built by an SEO company (although they know, of course, that the practice goes on all the time).

This checking is carried out by a Google process labelled ‘Penguin’. Basically, this checks a site’s linking structure to see if it complies with the ‘rules’ and hence looks natural. Here the number of links using the domain or URL of the site as the anchor text (the bit we humans click on) is checked, as are the number using ‘money words’ (the terms a site wants to be found for) and the ‘noise’ links, like ‘see this site’ or ‘click here’. If the balance is not right, or the links seem to have been created too fast, a site can be heavily penalised.

This means that a site’s links have to be built very carefully over time and not all in a rush.
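To give a feel for the sort of check involved, here is a small Python sketch that classifies a sample list of anchor texts into brand / URL, ‘money’ and ‘noise’ groups and reports the balance. The example anchors and ‘money’ terms are invented, and the notion of a ‘safe’ ratio is an assumption – Google publishes no numbers:

# Classify anchor texts into brand/URL, 'money' and 'noise' groups and
# report the balance. Sample data and term lists are invented.
from collections import Counter

MONEY_TERMS = {"it support prices", "best anti virus software"}  # hypothetical
NOISE_TERMS = {"click here", "see this site", "this website"}

anchors = ["example.com", "click here", "it support prices",
           "https://example.com", "Example Ltd", "see this site"]

def classify(anchor):
    a = anchor.lower()
    if a in MONEY_TERMS:
        return "money"
    if a in NOISE_TERMS:
        return "noise"
    return "brand/url"

counts = Counter(classify(a) for a in anchors)
total = sum(counts.values())
for kind, n in counts.most_common():
    print(f"{kind}: {n}/{total} ({n / total:.0%})")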

Social Media

This is very new in SEO terms and the amount of ‘power’ that social media chit chat, comments on Facebook and Twitter provide is not fully understood. In my view, the importance of Social Media is more to do with other marketing channels, but nevertheless, obtaining links via things like ‘Social Bookmarks’ can be useful.

Putting it All Together – Scientific SEO

So, what does all this mean? Basically, it means that you must:

 

  1. Find the words you want your site to be found for – KEYWORD RESEARCH
  2. Find the words you need to include in the copy of the page(s) using Reverse Engineering – CONTENT RESEARCH
  3. Build the links to the site, CAREFULLY
  4. Get some Social Media comments going if you can (more important for sites selling direct to the public than for B2B sites)
  5. Monitor the progress and make changes to improve matters further

 

 

I hope this helps you understand how the matter of SEO has to be approached today.

Is Your Online Presence Failing to Sell? Here Are 4 Reasons Why

There is an old saying that ‘you can lead a horse to water but you cannot make it drink’, and never has it been more accurate than when talking about web traffic…

[Image: getting the horse to drink is key in the Internet world]

From an SEO or Social Media point of view, getting traffic to a site is the first big goal, but it has to be the right sort of traffic and then the site must do its job and get them to engage, taking a ‘sip’ if not a big gulp.

The Engagement Process

A part of this ‘engagement process’ is of course down to design: the site has to appeal (very quickly) to the visitor, or risk losing them in those vital first seconds.

The next thing, of course, is the content of the page. Is it what the customer wants? Your Bounce Rates will tell you (and Google too, if the visitors come from a search), so the pages need to be constantly reviewed, just in case you are not doing things the way that your customers want, these after all being the final arbiters…

The site’s content, and the way it approaches its customers, is therefore key. It does not matter how many potential customers (horses) you deliver to a site if the ‘water’ does not look good and tasty.

Getting on the Customers Shortlist

But what is ‘tasty’? A very good question, and one whose answer will change depending on what the site is about and where in the buying cycle your customer is. The article below covers this in one of its points, saying in effect that those who are just starting their quest are looking for very general data, and thus don’t want the full nine yards on your product / service, just an initial description. If you get on their shortlist, they will be back…

Besides the issue of good, ‘useful’ content, there is the matter of Re-Engagement. This is another topic, and one that we will come back to in the future, but it is important: just because a visitor does not buy today does not mean that they might not buy tomorrow, so keeping in contact and reminding them that you are there waiting to serve them is a good idea.

For the full article on Why your site is not converting, please click the link.

The cheese moved. The buying process has changed. Technology to support and further that change continues to grow and evolve. Communicating through the vast array of digital channels (website, SEM, social, email) is no longer an option. It is a must-do.

The online presence of your business must attract and convert prospects. It must engage with leads through a variety of channels as users travel through a longer and more complex buying cycle.

How we market and communicate online has come a long way from static, brochure-like internet pages and “spray-and-pray” email blasts. Unfortunately, for many, online marketing is still failing to reach its full potential.

Pointing a finger at the underlying technology would be easy, e.g., marketing automation, content management systems or any of the tools and solutions laid out in Scott Brinker’s Marketing Technology Super Graphic.

It’s easy to say the technology is failing, so the marketing effort is failing. But the reality is more complex. Here are four of the biggest reasons why your online presence is failing to drive sales.

Engagement For The Wrong Reasons

Using engagement tactics that are not aligned with business goals is a huge waste of time and money. Too often, I see engagement for engagement’s sake. This results in leads stagnated in the buying cycle and low-volume sales funnels.

Having a high number of Twitter followers or a successful content syndication program is great, but that is not success.

CMOs are being judged on sales. And following your brand or downloading an asset is not a sale.

Social followers matter. They are your advocates. They can extend your message.

But focusing on the number of followers and not their engagement and conversion ratios results in negative ROI for the money spent to generate them. It also takes the marketing eye off the important goal of a sale.

Weak Commitment To Prospects

Generating new leads through content syndication or SEM is the start of the buyer’s journey. Most leads are not ready to buy at that point.

So not using retargeting or nurture programs to bring them back for further communication is a waste of the money spent to find them to begin with.

When they fail to travel along the pipeline because they are left to rot somewhere between the marketing and sales organization, it reflects negatively on the organization.

Lack Of Good Content

Everywhere I look, the numbers show an increase in content marketing spend and usage. Businesses are spending millions to have content developed — to tell their story, engage with their prospects, and help convert their leads through the buying cycle.

And yet much of what is used to attract and engage leads is sales enablement content. It’s all about features and functions. Or it’s focused on selling something, rather than trying to educate.

Take, for example, the content used in top-of-funnel nurture programs. More times than I care to remember, I’ve seen programs use 45-minute product webinars or 20-page product briefs.

Breakdown happens when leads don’t engage, and the prospects in the funnel dry up. This is because leads in the early buying cycle don’t want to know everything about the product, and they don’t want to be sold to. Rather, they want to know what the options are and what to consider as they do their research.

Marketing Teams Are Not Living In The Now

Stagnating means not going to where your customers are by using better ways and different channels to communicate with them online. It results in low communication. And poor communication results in low sales.

Semantic SEO and Google, the (not so) Blind Man

In some of my previous posts, and when discussing SEO with my clients, I’ve often alluded to Google being like a blind man in a department store. I used this analogy as, without some help, both the man and Google could easily get lost and not be sure that they were in the right place.

In the case of the blind man, this would result in him leaving the store without making a purchase (perhaps never to return); in Google’s case it could mean that they will not understand what the site is really all about. This could be catastrophic as far as getting rankings for just about anything is concerned.

Leaving signposts on your web pages


Of course, in a store you have Braille signs, but what is the equivalent on a website? The answer is of course the Meta Title, Description and Header tags of the pages. Using these to inform Google about the content of the pages is a great first step; even though it’s very much part of the ‘Old SEO’ it’s still vital today.

Google ‘the not so blind man’ and old and new SEO

Even with all of its power and the new SEO practices that it’s forcing us all to follow, Google is still like a blind man in that it needs help to ensure that it gets the right end of the proverbial stick. There is however a huge difference between Google of old and the one that is evolving before our eyes.

If you’re of a certain age, you may remember the TV series Kung Fu. In it, David Carradine starred as a Shaolin monk (Kwai Chang Caine) who, through the training he received, became a martial arts expert. However, it’s not David that’s interesting here, but his mentor, Master Po. Po was totally blind, yet he could ‘see’ everything, pointing out the grasshopper at the feet of the young Kwai Chang – something the latter, even with his perfect vision, had missed.

Today, Google is like Master Po: it can’t see everything, but it can see a lot, and everything it does see is taken into account when deciding which sites to rank for which terms. It’s vitally important to understand how Google is planning (and, to some degree, has already started) to use this enormous amount of data, because this is the big difference between old and new SEO.

Old SEO equals keyword matching

To be fair, old SEO was more than simply matching a keyword phrase to the ‘best’ sites for that term; even the old systems had 200 or so ‘factors’ that were taken into account. But in the end, it was mostly to do with how well the ‘signposts’ you placed on a site (be they in the Titles, Headers or copy, not to mention all those links) matched the keyword phrase; that’s what really counted.
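
As a toy illustration of that world – emphatically not Google’s actual algorithm, just the general idea – here’s a sketch that scores pages by how often an exact phrase appears in their signposts, with the Title ‘shouting’ loudest:

```python
# Toy pages: each one is just its 'signposts' as plain text.
pages = [
    {"title": "Blue Widgets for Sale", "headers": "Buy blue widgets", "copy": "Our blue widgets ..."},
    {"title": "Widget News", "headers": "Industry news", "copy": "All about widgets."},
]

def keyword_score(page, phrase):
    """Old-school scoring: count exact-phrase occurrences, weighted by signpost."""
    phrase = phrase.lower()
    weights = {"title": 5, "headers": 3, "copy": 1}  # made-up weights
    return sum(w * page.get(field, "").lower().count(phrase)
               for field, w in weights.items())

ranked = sorted(pages, key=lambda p: keyword_score(p, "blue widgets"), reverse=True)
print(ranked[0]["title"])  # the page that 'shouts' the phrase loudest wins
```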

This of course led to gaming of the system. SEO companies would alter the pages of a site to SHOUT the target keywords at Google, and create thousands of links to reinforce the message. Pages without any real merit reached the top of the listings, and Google brought out more and more rules to try to combat the situation. It was a time of new trick after new trick, with each one eventually being found out and the gains it had brought removed. Even so, the approach worked, and to some degree it still does.

The days of Old SEO are numbered

Google, it seems, concluded that it wasn’t going to continue with this ‘arms race’. Instead, it would change the game entirely. In my view, it didn’t do this out of spite; I believe Google just wanted to ensure that it would always be able to pick the best sites for any phrase and never be tricked again.

This was no mean task, but Google has a plan based on the fact that, instead of just matching keywords to sites, they will (try to) look beyond the words to the meaning of the search phrase – in other words, what you or I, as searchers, are really looking for.

This was one of the reasons for the introduction of the Hummingbird update (technically this was more like changing the engine than replacing a part of it, but let’s call it an update for simplicity). In doing so, Google wanted to be better able to understand what people wanted when they used the new Voice Search feature on smartphones. (By the way, according to the experts, the reason for this is that people express things differently when speaking, compared to when they write them down.)

The reason it’s called Semantic SEO

This leads nicely to the reason the whole process is called Semantic SEO. ‘Semantic’ derives from Greek and relates to meaning: Google is trying to work out the intent behind a search phrase – what the searcher really means – hence the name.

Google does more than just try to work out what the real user intent behind a search phrase is. In order to come up with matches in its database of sites, it must also understand the real meaning of any page. To do this, it must work out what the content is trying to say; that is, how it can help, inform and entertain.

It is thus vital to understand what message you are trying to put across with any content. You can read more advice on this in the next post.

But how does Semantic SEO work?

This is the big question for anyone who wants the best rankings possible for any relevant search phrases. But it’s here that we hit the first real change. You see, even though keywords still have their importance, they’re not the be-all and end-all that they used to be. That’s because Google no longer relies on simple keyword matching.

So, if Google isn’t using the words on pages to decide what it should list, what is it using? This is where it gets tricky to explain; basically, Google will look at the information, the real meaning of a page and the site it is part of, and the purpose behind its creation. It will also look at what others say about it (and on it in the case of comments) before deciding if this matches the meaning behind the search phrase.
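
Nobody outside Google knows exactly how this is done, but the flavour can be sketched. Instead of counting exact words, texts are mapped to ‘meaning’ vectors and compared; the hand-made CONCEPTS groups below are purely illustrative stand-ins for what a real semantic model would learn:

```python
import numpy as np

# Hand-made 'concept' groups -- illustrative stand-ins for what a real
# semantic model learns from billions of documents.
CONCEPTS = [
    {"cheap", "affordable", "budget", "inexpensive"},
    {"hotel", "hotels", "accommodation", "stay"},
    {"paris"},
]

def embed(text):
    # One dimension per concept: does any word from that group appear?
    words = set(text.lower().split())
    return np.array([1.0 if words & group else 0.0 for group in CONCEPTS])

def similarity(a, b):
    va, vb = embed(a), embed(b)
    denom = np.linalg.norm(va) * np.linalg.norm(vb)
    return float(va @ vb / denom) if denom else 0.0

# The page never uses the word 'cheap', yet the meanings still match:
print(similarity("cheap paris hotels", "affordable places to stay in paris"))  # 1.0
print(similarity("cheap paris hotels", "blue widgets for sale"))               # 0.0
```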

Being found when you’re not even being searched for

This is what Serendipitous Search is all about. It’s another huge change from the old SEO, because Google is now more of an ‘answer engine’ that suggests sites it thinks might be useful – even when they don’t include the keywords being searched for.

The more your site answers the questions and needs of your potential customers, the more traffic Google will give you.


Semantic SEO and the feedback loop

This is another very interesting (and potentially scary) thing about the new Google. Not only does it look at the words on pages, their meaning, links to and from a page, and social media comments (as well as who made them); it also looks at the data gleaned from the billions of searches it handles every day, and sees how each one went.

This means that every time a site is listed, Google can tell how popular that listing was from its CTR (click-through rate). It has used this methodology for years with pay-per-click (AdWords): adverts with the best CTR are charged less than those with a low CTR. With organic listings there is of course no payment, but if a site’s Title and Description don’t get people to click on the link, Google will eventually notice and simply stop giving that page a listing for that term. You can imagine that, if this happens too often, a whole site could just disappear from the rankings. So beware, and do check the CTR in your Webmaster Tools.
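
If you export that search data as a CSV, a few lines of Python can flag the pages whose CTR needs attention. The ‘page’, ‘impressions’ and ‘clicks’ column names, the file name and the 1% threshold below are all assumptions – adjust them to match your own export:

```python
import csv

CTR_THRESHOLD = 0.01  # 1% -- an illustrative cut-off, not an official figure

def flag_low_ctr(path):
    """Print every page whose organic CTR falls below the threshold."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            impressions = int(row["impressions"])
            if impressions == 0:
                continue  # never shown, so no CTR to judge
            ctr = int(row["clicks"]) / impressions
            if ctr < CTR_THRESHOLD:
                print(f"{row['page']}: CTR {ctr:.2%} -- rewrite the Title/Description?")

flag_low_ctr("search_console_export.csv")  # placeholder file name
```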

There’s more too. You see, a site could well have a really great Title and wonderful descriptive text causing all who see it to click through. You might think that’s good news, but if the site doesn’t live up to the visitor’s expectations and they click back to Google to try again, Google will notice this – and conclude that, for that term at least, the site doesn’t deliver the goods. As with poor CTR, this could eventually lead to the site not being listed at all.

Google will also use the feedback process to ‘learn’ what people want to see in the first place, which helps it understand what the meaning of the search was really likely to be about. This allows Google to make its best guess about what sites it should list for any term, and then just sit back and wait to see how people react. If they click on a site and don’t bounce, then they’ve got it right. But if they bounce they haven’t, so Google ‘learns’ with every decision searchers make. What’s more, it will never forget and will keep updating its knowledge all the time. Spooky, eh?

The above process is made even more powerful by the fact that, just as Google can deduce what a page or a site is about (and therefore what answers and information it gives), when it really does satisfy a user it can then deduce the original intent. This is yet another part of the great feedback loop.

Semantic SEO and gaming the system

As we’ve seen, it’s the copy – how well the message and meaning of a site are put across to Google and to visitors – that really counts in the end: the former to get a listing in the first place; the latter, in effect, to keep it.

There is, of course, more to convincing Google than the copy, though I think this will take the lion’s share. Inbound and outbound linking, the social media signal and the level of interaction (including sharing) are also major factors.

Although it may be possible to game the system by creating a bigger social signal than the site really deserves, the experts’ view is that this will be more and more difficult, with Google looking at each person who comments or Likes, then deciding if they’re real or not. If they are one of the millions of fake profiles set up in the past, they will count for nothing, and may even damage a site.

Thus, under Google’s intense scrutiny, creating huge amounts of social signal may become as hard and as unproductive as creating thousands of worthless links…

This doesn’t mean that a small quantity of such links and signals is useless. Both can ‘prime the pump’ a little, so that the real power of the site is allowed to shine through. If that is the case, a small amount of gaming (or old-fashioned SEO work) still looks as if it will be worthwhile.

However, if the page or site in question doesn’t really deserve a high ranking, it will eventually be denied one when people tell Google that it’s no good via low CTRs and high bounce rates. The whole process therefore depends on having a site that answers visitors’ needs – and that means high-quality, useful content delivered via words, pictures and video.

The new Semantic SEO

So what will the new SEO process look like? In my view it will still start with the keyword phrase. After all, this is the start of the process and can’t be ignored. The next stage is to work out which words are likely to be used by someone who has the intent to interact with your site in the way you’d want. This could be to buy something, or simply to understand that you could help them with their problem or needs.

Once you’ve decided on these words, you can reverse engineer the Google results to see what sorts of words it likes to see.

Combine this data with the questions that are being asked, and the problems that your site solves, and you have the recipe for a perfect page: one that answers people’s needs and uses the words Google expects to see. Interestingly, the latter neatly covers the area of LSI (Latent Semantic Indexing) – without all the effort.
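
A simple way to start that reverse engineering is to gather the visible text of the pages already ranking for your phrase and count which words keep appearing. A rough sketch (the example texts are placeholders for real scraped copy, and the stop-word list is deliberately tiny):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "for", "is", "on", "with", "that"}

def common_terms(texts, top_n=15):
    """Count the words Google is used to seeing on pages that already rank."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

top_ranking_texts = [
    "Placeholder text of the first page that currently ranks for your phrase ...",
    "Placeholder text of the second page ...",
]
for word, count in common_terms(top_ranking_texts):
    print(word, count)
```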

Once this page is created, and you’ve placed all the standard ‘blind man signposts’ on it, you can proceed to getting it noticed via old-fashioned links and social media.

As you can see, the above includes some old SEO practices, this being for the simple reason that they’re still as relevant and required as they were several years ago.

The biggest change and the greatest challenges are to understand what you should write about and post on a site, and how you can generate the necessary signal on Social Media. I’ll cover this in my next post.

30% of retailers see SEO agencies as “expensive” and unable to deliver

It’s always interesting to see what comments are made about SEO; it is, after all, a difficult task and one where you simply cannot promise good results. Any SEO company will try its best of course (within the budget it is allowed), but with Google ‘making things up as they go along’ (or at least that is how it often feels) it can be very difficult to get the results the customer wants.

That said, there is also the issue of what the customer actually wants, and the question ‘can the website deliver?’ even when traffic is delivered. This is a point taken up by some of those commenting on the article below, who point out that some companies simply believe they deserve a first-page position and to ‘sell’, whilst not really understanding what Google and their customers want.

The former point is one that really demands understanding, as a good site in Google’s eyes is one that does not just ‘sell’, but provides good and useful content to visitors. It is also worth pointing out that it is indeed foolish to openly trick Google, but that does not mean that some fancy SEO footwork cannot bring some good results…

Please read the item below and if you want to see the full article on SEO Agencies and Results click on the link:-

Nearly a third (32 per cent) of UK retailers see organic search (SEO) agencies as “expensive” and unable to offer them clear results while 15 per cent find them disappointing, according to research from OneHydra.

Of the 200 retail e-commerce managers and marketing directors questioned, the majority (82 per cent) said search marketing is an integral part of their business model, but on average less than 20 per cent of their SEO requirements were met in the last 12 months.

It was found that most work with an agency in some capacity, either in partnership with their own in-house team (37 per cent) or exclusively (37 per cent). Only a quarter handle SEO in-house.

David Freeman, head of SEO at Havas Media, said he often hears businesses and marketing teams discuss the difficulties of getting SEO projects implemented.

“However, we must consider that development teams normally have a continuous stream of work to implement and changes for the greater good of SEO performance won’t get implemented by default,” he said.

“SEO teams need to understand the way their clients/internal teams operate and accompany SEO recommendations with a clear commercial case.”

Freeman explained this commercial case should allow work to be prioritised accordingly and alleviate some of this wastage. However, he added that it is “vital” that SEO teams understand the capabilities and limitations of the content management systems to ensure that the recommended changes are feasible to start with.

Meanwhile 17 per cent of respondents stated that the lack of a “strong business case” was the key reason for being unable to implement changes, but by far the biggest barrier for companies was “technological resource and capacity”, a problem which 71 per cent of respondents cited.

Andrew Girdwood, head of media innovation at DigitasLBi, said it is a shame that so many sites are still built without SEO in mind.

“An approach that blends media savvy, like SEO, with brilliant design and build capabilities should not be seen as a luxury but as a necessity for brands. The approach helps save money in the long term.”

Regarding spend, the research found that a quarter of retailers could be wasting more than £100,000 a year on failed SEO procedures, many of which don’t even make it through the IT department.

However, Oscar Romero, head of search strategy and product at Starcom MediaVest, said this figure is difficult to quantify due to changes implemented by Google.

“Since Google began restricting visibility of keyword data in 2011 under the (not provided) label, businesses have been denied fundamental information about how their sites were performing in organic search. As Google further increases online security measures, the proportion of keyword data being labelled as (not provided) is reaching close to 100 per cent. With this lack of visibility on keyword-level performance, businesses are faced with the significant challenge of how to assess ROI and justify future investment in SEO.

“Over time the industry has developed a variety of alternative methods to define SEO metrics and targets, including analysis of rankings and landing page performance.

“However this is not able to replace the level of data previously available to search marketers and renders the ability to assess the performance of SEO in isolation very challenging.”

Making SEO work with responsive design

With the rise in mobile devices being used to browse the internet, website owners have had to make changes to their sites so that visitors can view them regardless of the device they are using. The most common method of adapting websites is responsive web design, which allows a site to be viewed on virtually any size of screen without losing scale or clarity – a huge advantage for site owners whose sites are being accessed from mobile phones.

Responsive sites still need to be properly optimised, with the same high-quality content, keywords and links, so that they rank well and do not fall foul of the Google algorithms. Keeping up with all the changes can be difficult, as there seem to be new ones every other week these days. Here are some tips on SEO for responsive web design:

Responsive design is obviously a big deal; such a big deal that Mashable has hailed 2013 as “the year of responsive design.” Most web professionals understand this — responsive design is changing the way that the Internet looks, feels, and works.

There’s something less obvious going on, though. Responsive design also changes SEO. When we look beyond the CSS of responsive design, we see a major shift in search practices that is exerting an impact on both mobile and desktop searches.

What are the SEO issues brought about by the advent of responsive design? Here are five.

1.    Google likes responsive design, meaning that search results will likely favor sites that employ responsive best practices.

While we hesitate to declare baldly that Google is in love with RWD, we can identify a strong affinity for RWD best practices. After Google’s blog post about Responsive Design, SEO Round Table published an article outlining the reasons why Google likes responsive design. The three reasons — non-duplicated content, no canonical URL issues, and no redirect problems — are all part of a strong SEO arsenal.

When Google flinches, everyone jumps. So it is with responsive design. Since Google actually wrote the Mobile Playbook, it only makes sense to give them due respect for their mobile and responsive proclivities. As algorithms continue to be tweaked throughout 2013 and beyond, we will probably see more and more nods to sites that successfully employ responsive design.

If Google prefers responsive design, that’s a huge game changer for search.

2.    Mobile users crave a good experience, and responsive sites deliver optimal site quality for mobile users.

That point above is a bit convoluted. Nonetheless, it’s an important point for SEO. Here’s how it works.

More and more users are mobile. Your website is now receiving more mobile visitors than ever before. Trust me; check the analytics. All those mobile users need a good experience. The better their experience, the better your SEO.
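
Tying this back to the three reasons given earlier (non-duplicated content, no canonical URL issues, no redirect problems), a quick sanity check is to fetch a page as a desktop browser and as a mobile browser and confirm that both receive the same URL and the same canonical tag. A minimal sketch, assuming the requests and beautifulsoup4 packages, with example.com as a placeholder:

```python
import requests
from bs4 import BeautifulSoup

AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X)",
}

def canonical_for(url, user_agent):
    """Return the final URL after redirects and the page's canonical link, if any."""
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return resp.url, tag.get("href") if tag else None

results = {name: canonical_for("https://www.example.com/", ua) for name, ua in AGENTS.items()}
if results["desktop"] == results["mobile"]:
    print("OK: one URL and one canonical, whatever the device")
else:
    print("Warning: desktop and mobile see different URLs/canonicals:", results)
```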

Further information

Why responsive web design matters for your SEO

Matt Cutts talks responsive design impact on SEO

How common are SEO problems with responsive web design?