Combined Arms is as necessary in SEO as it is in warfare

Any army general or historian will tell you that to win a battle you need to use all the different types of offensive weaponry available. This has been true across the ages, from the time of 1066 (and earlier) to today.

This is called ‘Combined Arms’: the process whereby an army uses the three main types of fighting unit together in order to win. In the past these were the infantry, the cavalry and the artillery. Over time, of course, this has changed, the role of the cavalry changing the most, with tanks replacing the horse.

The three arms of SEO

You need to use the three arms of SEO just as you do when at war.

But regardless of the changes, all three are still needed: the artillery to subdue the enemy before the main attack, and the cavalry, after performing the important job of reconnaissance, to pin the enemy in place, thus allowing the infantry to advance and take the objective.

Of course, much has changed since the days of Napoleon, but the principles remain basically the same today, all three arms still being needed (with air power taking over a large part of the role of artillery, of course).

I understand all of this not because I have ever fought in a war (or am an acclaimed war historian) but because of my hobby: wargaming. I play games that cover battles fought by the Vikings, the Persians and the Crusaders, as well as the Napoleonic period. My favourite, though, is World War Two. Here I have ‘fought’ in the deserts of Egypt, on the steppes of Russia and in Normandy, all of which has been immensely enjoyable. In each battle I have learnt the importance of using the three different arms together, something that lies at the heart of the hobby.

But what has this got to do with Search Engine Optimisation?

Well, to put it quite simply, there is no ‘magic bullet’ when it comes to getting the best possible rankings on Google (other search engines are available). Instead, you have to make sure that the three areas of Technical Site Build, Content and Linking Structure are all properly attended to. If you want success, it is vitally important that these three areas work together, just like the three ‘arms’ on the battlefield.

The reason for this is that Google looks at all three areas, giving ‘marks’ for each. To fail on any of them risks losing the chance of a top ranking. The reasons, and the details of each, are covered below:

Technical Site Build

This is an area that is often forgotten, but it is vital: if the site is not built to allow Google to find all the pages easily, it will fail at the very first step. Besides this, it is also vital that users can move around the site easily, with navigation that is easy to understand and use.

Perhaps the greatest area, however, is site speed, Google now more than ever (with the advent of the mobile-first index) looking for sites that download in the shortest times possible. If a site is built in a manner that slows page delivery down, or is placed on a server that is overloaded, Google will downgrade it severely, preventing it from gaining the high positions that it might otherwise deserve.
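One rough, do-it-yourself check on this front is simply to count the external resources a page pulls in, since every extra stylesheet, script and image adds to delivery time. The sketch below is a minimal illustration using only Python’s standard library; the sample page, and any thresholds you might judge the counts against, are my own assumptions, not anything Google publishes.

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Counts the external resources referenced by a page."""
    def __init__(self):
        super().__init__()
        self.scripts = 0
        self.stylesheets = 0
        self.images = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.scripts += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.stylesheets += 1
        elif tag == "img":
            self.images += 1

def resource_summary(html: str) -> dict:
    """Returns a rough 'page weight' summary for a lump of HTML."""
    counter = ResourceCounter()
    counter.feed(html)
    return {"scripts": counter.scripts,
            "stylesheets": counter.stylesheets,
            "images": counter.images,
            "html_bytes": len(html.encode("utf-8"))}

sample = """<html><head>
<link rel="stylesheet" href="main.css">
<script src="app.js"></script>
</head><body><img src="hero.jpg" alt="40 inch Smart TV"></body></html>"""
print(resource_summary(sample))
```

The fewer resources (and bytes) a page needs before it can render, the faster it arrives; a report like this makes it easy to spot templates that have quietly accumulated a dozen scripts.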

Content

The reasons that people visit sites are many, but in all cases they are looking for something: maybe the answer to a problem, or a particular product or service. If the pages of a site do not provide those answers, or do not give enough detail on a product or service, they will fail to meet the needs of any visitor. And hence, as Google’s aim is to list only pages that are ‘worth the time of their users’, they will fail this important test.

It is therefore necessary to ensure that the pages of a site meet the needs of the visitor. This means not only having enough text on the page, but also including pictures and, where possible, video content (this being another example of ‘combined arms’).

There is, by the way, a ‘hidden’ advantage to having lots of text on a page (as long as it is laid out in a manner that allows it to be easily absorbed – ‘walls of text’ are not a good idea, whitespace being important). This is all to do with what are known as ‘long tail keywords’ and the capturing of such searches on Google.

Long Tail Keyword Search Phrases

It is well understood that users try different types of search phrase when they are looking for a product or service. For example, when looking for a TV they may search for ‘large screen TV’, only to find that the number of search results is so large that it is impossible to know where to start.

In such cases it is normal for the search phrase to be changed with a view to getting a better list of sites to check. Perhaps the phrase will be altered to ’40 inch Smart TV’ at this stage.

Further pages and terms will be used until the searcher finds the model that they want. This is the ‘buying stage’ of the search ‘lifecycle’ and is therefore the most important. A term used here could be ‘Sony 40EXDB Smart TV in black’. In such cases, it is vital to make sure that your website is in a position to capture such a query.

There are many examples of long tail keywords, and in many cases they are the best ones to capture, as they are often used towards the end of a search for a product, at the very time the searcher is ready to purchase.

This is just what using a lot of text on a page can do for any website owner, allowing them in effect to put more hooks in the ‘water’ of the internet. More hooks lead to more fish being caught, which translates to more visitors and, hopefully, more sales.
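To make the ‘more hooks’ idea concrete, here is a small sketch that pulls candidate long tail phrases (three- to five-word runs) out of a block of page copy and counts them. It is just a word-window count in plain Python, not any official keyword tool, and the sample copy is invented for illustration.

```python
import re
from collections import Counter

def long_tail_phrases(text: str, min_words: int = 3, max_words: int = 5) -> Counter:
    """Counts every 3-to-5 word phrase appearing in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrases = Counter()
    for n in range(min_words, max_words + 1):
        for i in range(len(words) - n + 1):
            phrases[" ".join(words[i:i + n])] += 1
    return phrases

copy = ("Our Sony 40 inch Smart TV range includes the "
        "Sony 40 inch Smart TV in black and silver finishes.")
# The phrases that repeat are the 'hooks' this copy puts in the water.
for phrase, count in long_tail_phrases(copy).most_common(3):
    print(count, phrase)
```

Each distinct phrase is, in effect, one more query the page could be found for; more text (laid out well) means more phrases.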

The overlap with Technical SEO

There is also an overlap with the area of technical search engine optimisation to consider here: ensuring that the important areas on a page are used to best effect. These range from the Title of a page (the most important real estate a page has) to the Header tags (the H1 being the most important, and best used only once). Besides these, the other attribute tags like Bold, Italic and List should not be overlooked, all of them being places where part of the content of a page can be highlighted.
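These on-page elements can be checked mechanically. Below is a minimal sketch using Python’s standard-library HTML parser that flags a missing Title, more than one H1, and images without ALT text; the sample page and the exact checks are my own illustrative assumptions, not Google’s rules.

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the on-page signals we want to check."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.h1_count = 0
        self.imgs_without_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.imgs_without_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html: str) -> list:
    """Returns a list of on-page problems found in the HTML."""
    a = OnPageAudit()
    a.feed(html)
    problems = []
    if not a.title.strip():
        problems.append("missing <title>")
    if a.h1_count != 1:
        problems.append(f"expected one <h1>, found {a.h1_count}")
    if a.imgs_without_alt:
        problems.append(f"{a.imgs_without_alt} image(s) without ALT text")
    return problems

page = ("<html><head><title>40 inch Smart TVs</title></head>"
        "<body><h1>Smart TVs</h1><h1>Special Offers</h1>"
        "<img src='tv.jpg'></body></html>")
print(audit(page))
```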

Linking Structure

The third arm of the SEO trilogy is still very important today, but, it must be said, it is not as all-powerful as it was in the not-so-distant past (when it was said to be possible to get a blank page to position 1 on Google).

Today, it is still necessary to ensure that a domain, and the pages within the site, have a good number of links, the numbers needed varying greatly from market sector to market sector and from niche to niche, some being more hotly contested than others.

There was a big change in the recent past, however, a very big change, one that started with the introduction of the so-called ‘Penguin’ update. Google felt it had to make these changes to its algorithm because SEO professionals had started creating links in huge numbers to get the pages they wanted to the top of the SERPs.

Basically, this algorithm checks the links pointing to a site (it is no longer run every now and then, but is integrated into the rule set that is used every day), penalising those sites that have created too many ‘spammy’ links, or a linking structure that uses too high a percentage of ‘money keywords’ (the phrases that are thought to bring in the sales and enquiries).

All of this means that this part of the ‘combined arms’ team needs to be very carefully handled indeed. So carefully that at Serendipity Online Marketing we use specially designed software to handle the whole process, thus ensuring that we only build links that will enhance the standing of the sites we work on.
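Our software itself is proprietary, but the core Penguin-era check is easy to illustrate: what share of a site’s backlink anchor texts are exact-match ‘money keywords’? The sketch below shows the idea in a few lines of Python; the anchor texts are invented, and any ‘safe’ percentage threshold would be an assumption on your part rather than a figure Google has ever published.

```python
def money_anchor_share(anchors, money_keywords) -> float:
    """Fraction of backlink anchor texts that exactly match a money keyword."""
    money = {k.strip().lower() for k in money_keywords}
    if not anchors:
        return 0.0
    hits = sum(1 for a in anchors if a.strip().lower() in money)
    return hits / len(anchors)

# An invented backlink profile: two exact-match anchors out of five.
anchors = ["cheap smart tv", "Serendipity Online Marketing",
           "click here", "Cheap Smart TV", "www.example.com"]
share = money_anchor_share(anchors, ["cheap smart tv"])
print(f"{share:.0%} exact-match money anchors")
```

A profile dominated by exact-match commercial anchors is precisely the pattern Penguin was built to catch, which is why a check like this is worth running before building any new links.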

 

So that is it: to succeed in SEO you have to use all three ‘arms’, and use them correctly.

 

The Decline of the Insect and the High Street

For those of us who are lucky enough to live in, or visit, the countryside, there is no doubt that the number and variety of insects has diminished dramatically over the past years. For instance, I can remember having to scrape the hordes of dead bodies from my car windscreen and lights every few days in the 1970s, so high were the numbers of insects about.

Do we want to see insects like these disappear?

Nowadays, however, you hardly notice them, and even when walking the fields you see and hear so very few. Only the other day I was walking through a wheat field and hardly saw anything fly past, and certainly nothing crawling along the path.

Of course, this is all to do with our modern way of life and the desire to have unblemished fruit and vegetables (sometimes, when I view that perfect lettuce, I cannot imagine that anything ever walked upon it, let alone tried to eat it). This desire (as well as the need to grow as much as possible, both because of demand and the need to make more profit) has in turn led to a huge increase in ever more deadly and effective insecticides, which naturally (excuse the pun) has had a dramatic impact on insect numbers. Besides this, the insect also has to contend with climate change, the way that land is now farmed, and other causes of habitat loss.

Whilst this may seem not to matter, the long-term effects of reduced insect numbers could be catastrophic, with the food chain and the way the planet works being totally disrupted; “they are the little things that run the world”, according to one eminent biologist.

The High Street is in Decline Too

But insects are not the only thing in decline in the modern world; the other one, which has been making (the bigger) headlines, is the decline of the high street.

Can we save the high street?

Now, there are many reasons why the high street is suffering at the moment, the BBC listing six of them in a news article published in March 2018. These include the fact that people in the UK have less disposable income because pay growth has been lower than inflation. Another good reason is the recent hike in business rates. However, the area I am most interested in is the competition that the high street faces from online retailers.

The Rise and Rise of Internet Shopping

The Internet dominates our lives today: we use it to chat (on social media), to exchange information (using email) and to view entertainment programmes via the many streaming services. Plus, of course, more and more people are purchasing their goods online, as it is easier, simpler and in most cases cheaper than buying from a ‘bricks and mortar’ shop.

Being An Online Marketing Consultant

This makes me a little bit sad, as for the last 18 years, through my work as an Online Marketing Consultant (specialising in Search Engine Optimisation and AdWords), I have been contributing, in my own little way, to this decline. It does not make me want to stop, as I am after all fulfilling a need my customers have, but it does make me sad.

There is, however, as in the case of the decline in the number of insects, little that I can do to stop these changes, and here I can only hope that some great man or woman will step forward and take the necessary actions before it is too late.

When it comes to the insect population issue, the answer will not be easy to find, especially with the continuing increase in the human population of the world, but it is not a problem that can be ignored forever…

The same is the case with the decline of the high street, but here there are already a number of ideas being floated, these including reducing the level of business rates whilst at the same time increasing the level of taxes on online purchases, in an attempt to level the playing field a little when it comes to the costs of operating a retail business.

Others include making the shopping process more enjoyable by ‘engaging’ more with the potential customer and having other entertainments available, plus of course making parking, or the means of getting to the shops (via Park and Ride schemes), easier. Here, what amazes me is the lack of any form of transportation for getting heavy items back to the car park; it is, after all, not that easy to carry a bulky, heavy object back to your car. This is something else that drives people to shop online, where delivery to your home is built in.

One idea could be to have shops that hold no stock, where you can only look at the goods you are interested in, perhaps also trying on clothes, shoes and the like. Orders would be placed with the ‘shop’, with the goods being delivered the next day (as Amazon does today). The shop could be shared by a number of businesses if wanted, thereby keeping their costs down whilst, and this is the important bit, maintaining a presence on the high street.

Maybe it is an idea that will catch on, who knows, but like the issue with our insect friends, it is something that cannot be ignored forever.

 

Google’s New Quality Guidelines and What They Mean

Today, content and making a page ‘user friendly’ are more important than ever when it comes to the current ‘SEO rules’ that Google uses. This of course only covers on-page SEO (linking is another kettle of fish), but it is an area that covers a lot of ground.

With this in mind (and acknowledging that I do not know it all), I contacted an SEO guru to find out if I had all the bases covered. I’m glad to say that I had, but the reply I got back did highlight the fact that Google has just changed its ‘Quality Guidelines’.

But back to my question. I wanted to know more about how content and UX are graded, especially as, before a page is visited by a human (who could hence provide Google with data via the Chrome browser about time on page etc., if it wanted to take that into account), Google MUST have a means of calculating the ‘value’ of the content and how user friendly it is.

To me this is the ‘egg’ part of the chicken-and-egg story: a page not yet seen by Google or anyone else is analysed and given a ‘value’ rating. This is then used as a basis for any later search-related ranking procedure, pages with higher ‘value’ ratings being more likely to get a position at the top of the SERPs.

I listed the signals that I thought Google uses, these being:

  • Title of the page
  • Description of the page (not truly used for ranking, but a lot of poor ones can degrade an entire site’s quality, so I have been told)
  • Header tags on the pages (although these are not as powerful as they once were, and many a site breaks the ‘rules’ about using them and still ranks highly)
  • Bold, Italics, lists
  • Words used (more on this later)
  • Links out to relevant/useful sites (although I have seen comments from SEO professionals who say this is not a useful signal)
  • Embedded videos
  • Using images with descriptive file names and ALT text, and geotagging them for local SEO

Plus on the UX side

  • The speed of the page
  • Using whitespace
  • Not allowing too many adverts at the head of the page
  • Ensuring that the above-the-fold area is not just images (the use of carousels is said by some to be harmful, but they are used extensively and many sites still rank highly)

The Words Used on the Page

Here I pointed out that, as Google uses a computer program to analyse any page, it must rely on a lot of TRUE/FALSE checks, which leads on to the words used in the content. To me this is an important fact: it would take a committee of ‘experts’ viewing a page to tell whether it was truly good and useful (and they would surely disagree in many cases), and as that is just not how Google works (even with the power of RankBrain), it surely MUST be making its decisions at a far lower, more ‘mechanical’ level.

The problem of what words to use has been overcome by the use of LSI (Latent Semantic Indexing) and reverse engineering, and from the reply I got back, I would say that this is still the case today.

Of course, if you want to ‘get a message’ across to Google about what a page is all about, with some specific keyword phrases in mind, you just CANNOT stuff the page with those target words, this being a dangerous method nowadays.
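A quick way to keep yourself honest here is to measure how much of the copy the target phrase actually occupies. The sketch below computes a simple keyword-density figure; the sample copy is invented, and any ‘danger’ threshold you compare the figure against is your own assumption.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the words on the page taken up by the target phrase."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 100.0 * hits * len(phrase.split()) / len(words)

# Deliberately stuffed sample: the phrase appears three times in 44 words.
copy = " ".join(["smart tv deals"] * 3
                + ["plenty of other helpful text about screens"] * 5)
print(f"{keyword_density(copy, 'smart tv deals'):.1f}% of the copy is the target phrase")
```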

Google’s Quality Guideline Update

It must be said that these guidelines ARE NOT A PART of the SEO algorithm, but they are important, as they form part of the ‘feedback’ process that Google uses when evaluating its own SERPs listings…

The way it Works…

We know a fair bit about the way Google rates pages for any given term, and we also know that Google is constantly changing these rules. In the past, they had to keep changing the rules as SEO professionals were constantly ‘taking advantage’ of an anomaly in the algorithm, but today, with Google’s more holistic approach (also known as Semantic SEO), I believe that the changes they make are all about presenting the best possible results.

Google however has a problem here, as they need some way of checking that they are getting it right…

This is where their army of human evaluators comes in. They have been around for many years, of course, and were responsible for the rule set that Google uses to highlight sites whose general quality is low. Here, the sort of thing they found was that sites which use a lot of duplicate meta data or Titles, or have a lot of pages with ‘thin content’ (a low word count), tend to provide a poor user experience and are basically not worth Google’s time to include in the results.

In order to help these evaluators, Google provided them with an aide-mémoire listing all the things that should be checked on a site’s pages. We will cover this in more detail later in this post.

So, how does Google use the results of the human evaluators?

Of course, they don’t give us the full picture, but looking at it logically: if the human evaluators rate a page as being of the Highest quality AND that page is NOT listed in the results for a relevant term, then the algorithm may well need some work. The same would be the case if pages that were considered to be of Low quality WERE in the rankings.

So, even though you cannot directly affect that part of the ‘Quality Assessment’ which is not worked out by the set of computer rules that make up the Google algorithm, you can help Google get it right.

This is important, as if a human evaluator rates a page on your site (or a page like it) highly, this feedback process will eventually help ensure that your page gets the best possible rank…

The Google Quality Rules

There is a very detailed blog post on this, and you can also download the full details from Google if you want. But to help, the information below (taken from part of the post mentioned) will enable you to ensure that all of your pages are of the highest quality.

 

Page Quality Ratings

Overall Page Quality Rating

Google has completely rewritten this part of its guidelines, expanding this section from the very brief version it had before.

Old version:

The overall Page Quality rating scale offers five rating options: Lowest, Low, Medium, High, and Highest.

New version:

At a high level, here are the steps of Page Quality rating:

  1. Understand the true purpose of the page. Websites or pages without any beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. No further assessment is necessary.
  2. Otherwise, the PQ rating is based on how well the page achieves its purpose using the criteria outlined in the following sections on Lowest, Low, Medium, High, and Highest quality pages.

Here you can see that Google is putting the focus on the beneficial purpose of the page.

 

Page Quality Rating: Most Important Factors

Google’s changes to this section yet again put the focus on the purpose of the page, but they also bring in the ‘reputation of the creator’ of the content.

Here are the changes to this section, with the changes in italics:

Here are the most important factors to consider when selecting an overall Page Quality rating:

  • The Purpose of the Page
  • Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. Use your research on the additional factors below to inform your rating.
  • Main Content Quality and Amount: The rating should be based on the landing page of the task URL.
  • Website Information/information about who is responsible for the Main Content: Find information about the website as well as the creator of the MC.
  • Website Reputation/reputation about who is responsible for the Main Content: Links to help with reputation research will be provided.

 

 

Expertise, Authoritativeness and Trustworthiness (E-A-T)

Again, there are some significant changes here. First, the instances where Google referred to “high quality” have now been changed to “high E-A-T”.

Here we believe Google is directing its human evaluators to look beyond simple quality and consider other aspects that contribute to the value of that content.

So, Google has added this new part:

Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating.

For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important. Please consider:

  • The expertise of the creator of the MC.
  • The authoritativeness of the creator of the MC, the MC itself, and the website.
  • The trustworthiness of the creator of the MC, the MC itself, and the website.

Later in the section, they make some changes specific to the content creators in several key areas, including medical, news, science and financial sites.

Here are those changes, with the changes in italics:

  • High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis.
  • High E-A-T news articles should be produced with journalistic professionalism—they should contain factually accurate content presented in a way that helps users achieve a better understanding of events. High E-A-T news sources typically have published established editorial policies and robust review processes (example 1, example 2).
  • High E-A-T information pages on scientific topics should be produced by people or organizations with appropriate scientific expertise and represent well-established scientific consensus on issues where such consensus exists.
  • High E-A-T financial advice, legal advice, tax advice, etc., should come from trustworthy sources and be maintained and updated regularly.
  • High E-A-T advice pages on topics such as home remodeling (which can cost thousands of dollars and impact your living situation) or advice on parenting issues (which can impact the future happiness of a family) should also come from “expert” or experienced sources that users can trust.
  • High E-A-T pages on hobbies, such as photography or learning to play a guitar, also require expertise.

Here you can see that Google is putting a lot of stress on the content creators as well, this being all the more important for YMYL (Your Money or Your Life) sites.

 

High Quality Pages

Characteristics of High Quality Pages

Google has also expanded this section, with the first reference to the new title changes being mentioned, as well as more on the beneficial purpose of a page. Changes/additions are in italics.

High quality pages exist for almost any beneficial purpose, from giving information to making people laugh to expressing oneself artistically to purchasing products or services online.

What makes a High quality page? A High quality page should have a beneficial purpose and achieve that purpose well.  In addition, High quality pages have the following characteristics:

  • High level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  • A satisfying amount of high quality MC, including a descriptive or helpful title.
  • Satisfying website information and/or information about who is responsible for the website. If the page is primarily for shopping or includes financial transactions, then it should have satisfying customer service information.
  • Positive website reputation for a website that is responsible for the MC on the page. Positive reputation of the creator of the MC, if different from that of the website.

 

This is all very useful stuff, but hidden in the text is the interesting phrase ‘A satisfying amount of high quality MC, including a descriptive or helpful title’. This is important, as it highlights the fact that there is no set number of words, and that Titles need to be descriptive and relevant (clickbait Titles could well result in penalisation).

The Highest Quality Pages

Highest Quality Pages

Again, beneficial purpose is added as a requirement for a highest quality page.

They have also added “and quantity of MC” as a marker for the distinction between high and highest quality. This does raise a question about whether all content lengths are really considered equal in the eyes of Google. Both Gary Illyes and John Mueller have stated that you don’t need to write an essay for a piece of content that doesn’t need it, and that you should write only as much as you need in order to answer the question the title presents. But here, the quantity of the main content is something raters should specifically look for when deciding whether a page is Highest quality or only High quality.

And we see yet another reference to the need of having a “very positive reputation of the creator of the main content, if different from that of the website.”

But they have removed references to this on pages for stores or other financial transactions.

Here is the old version:

Highest pages are very satisfying pages that achieve their purpose very well. The distinction between High and Highest is based on the quality of MC as well as the level of EAT and reputation of the website.

What makes a page Highest quality? A Highest quality page may have the following characteristics:

  • Very high level of Expertise, highly Authoritative, and highly Trustworthy for the purpose of the page (EAT), including the EAT of the publisher and/or individual author for news articles and information pages on YMYL topics.
  • A satisfying amount of high quality MC.
  • Highly satisfying website information and/or information about who is responsible for the website; for stores and pages involving financial transactions, highly satisfying customer service reputation is very important.
  • Very positive website reputation for a website that is responsible for the MC on the page.

And the updated version:

Highest quality pages are created to serve a beneficial purpose and achieve their purpose very well. The distinction between High and Highest is based on the quality and quantity of MC, as well as the level of reputation and E-A-T.

What makes a page Highest quality? In addition to the attributes of a High quality page, a Highest quality page must have at least one of the following characteristics:

  • Very high level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  • A very satisfying amount of high or highest quality MC.
  • Very positive website reputation for a website that is responsible for the MC on the page. Very positive reputation of the creator of the MC, if different from that of the website.

 

 

And for Low Quality Pages…

This entire section on low quality pages has been updated.  Some was removed as it was replaced with something more concise, while other areas were expanded, particularly around reputation and beneficial content.

Low Quality Pages

The first paragraph has been updated completely.

This was removed:

Low quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well. These pages lack expertise or are not very trustworthy/authoritative for the purpose of the page.

And it was changed to this:

Low quality pages may have been intended to serve a beneficial purpose. However, Low quality pages do not achieve their purpose well because they are lacking in an important dimension, such as having an unsatisfying amount of MC, or because the creator of the MC lacks expertise for the purpose of the page.

Here is the reference to beneficial purpose once again.  But this time it also concedes that sometimes these pages were intended to serve a beneficial purpose but something on the page – or missing from it – means it is still low quality.

Google has removed the possibility that some pages that meet its “low quality pages” criteria might not be considered Low. Now, raters must always rate a page as Low – or Lowest – if one or more of the criteria applies.

Here is what the section used to be:

If a page has one of the following characteristics, the Low rating is usually appropriate:

  • The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking EAT.
  • The quality of the MC is low.
  • There is an unsatisfying amount of MC for the purpose of the page.
  • MC is present, but difficult to use due to distracting/disruptive/misleading Ads, other content/features, etc.
  • There is an unsatisfying amount of website information for the purpose of the website (no good reason for anonymity).
  • The website has a negative reputation.

And here is the new revised version:

If a page has one or more of the following characteristics, the Low rating applies:

  • An inadequate level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  • The quality of the MC is low.
  • There is an unsatisfying amount of MC for the purpose of the page.
  • The title of the MC is exaggerated or shocking.
  • The Ads or SC distracts from the MC.
  • There is an unsatisfying amount of website information or information about the creator of the MC for the purpose of the page (no good reason for anonymity).
  • A mildly negative reputation for a website or creator of the MC, based on extensive reputation research.

If a page has multiple Low quality attributes, a rating lower than Low may be appropriate.

Note that it no longer includes the reference that anonymity for some content might be appropriate.

Lacking Expertise, Authoritativeness, or Trustworthiness (E-A-T)

This section has been completely rewritten, and was formerly section 6.5.

Removed:

Some topics demand expertise for the content to be considered trustworthy. YMYL topics such as medical advice, legal advice, financial advice, etc. should come from authoritative sources in those fields, must be factually accurate, and must represent scientific/medical consensus within those fields where such consensus exists. Even everyday topics, such as recipes and house cleaning, should come from those with experience and everyday expertise in order for the page to be trustworthy.

You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.

Revised:

Low quality pages often lack an appropriate level of E-A-T for the purpose of the page. Here are some examples:

  • The creator of the MC does not have adequate expertise in the topic of the MC, e.g. a tax form instruction video made by someone with no clear expertise in tax preparation.
  • The website is not an authoritative source for the topic of the page, e.g. tax information on a cooking website.
  • The MC is not trustworthy, e.g. a shopping checkout page that has an insecure connection.

 

User Generated Content Guidelines

It also made some slight changes to the user generated content section of the guidelines, which now specifically includes references to social networking pages, video sharing sites, and wiki-type sites.

Old version:

User-generated websites span the Page Quality rating spectrum. Note that in some cases, contributors choose their own topics with no oversight and may have very poor writing skills or no expertise in the topic of the page. Contributors may be paid per article or word, and may even be eligible for bonuses based on the traffic to their pages. Depending on the topic, pages on these websites may not be trustworthy.

New version:

Note: Websites with user-generated content span the Page Quality rating spectrum. Please pay careful attention to websites that allow users to publish content with little oversight, such as social networking pages, video sharing websites, volunteer-created encyclopedias, article sharing websites, forums, etc. Depending on the topic, pages on these websites may lack E-A-T.

The user generated content section is noteworthy because Google isn't automatically discounting user generated content as Low or Lowest, but rather treating it as something that warrants further investigation before rating. There are plenty of examples of high quality user generated content, but it seems the majority is definitely lacking in quality and E-A-T.

It has also changed the notation at the end from “Important : Lacking appropriate EAT is sufficient reason to give a page a Low quality rating.” to “Important : The Low rating should be used if the page lacks appropriate E-A-T for its purpose.”  So Google now explicitly ties E-A-T to the purpose of the specific page.

 

Low Quality Main Content

This section has been significantly reduced, although some of it was incorporated into the new individual sections Google has added to the guidelines, so just because something is noted as removed here doesn’t mean it was removed entirely. We also get new guidance on clickbait-style titles versus actual content, which Google now wants its human evaluators to rate as Low.

They entirely removed this part which was an example used to illustrate types of low quality content, as well as the differentiation between professional websites and those from hobbyists:

One of the most important criteria in PQ rating is the quality of the MC, which is determined by how much time, effort, expertise, and talent/skill have gone into the creation of the page, and also informs the EAT of the page.

Consider this example: Most students have to write papers for high school or college. Many students take shortcuts to save time and effort by doing one or more of the following:

● Buying papers online or getting someone else to write for them.
● Including inaccurate information, such as making things up, stretching the truth, or creating a false sense of doubt about well-established facts.
● Writing quickly with no drafts or editing.
● Failing to cite sources, or making up sources where none exist.
● Filling the report with large pictures or other distracting content.
● Copying the entire report from an encyclopedia, or paraphrasing content by changing words or sentence structure here and there.
● Using commonly known facts, for example, “Argentina is a country. People live there. Argentina has borders.”
● Using a lot of words to communicate only basic ideas or facts, for example, “Pandas eat bamboo. Pandas eat a lot of bamboo. Bamboo is the best food for a Panda bear.”

 

Here Google points out that the content of some webpages is similarly created. So, where you find content like this, it should be rated Low quality if it was created without adequate time, effort, expertise, or talent/skill. Inaccurate or misleading information presented as fact is also a reason for a Low or even Lowest quality rating. Pages with low quality MC do not achieve their purpose well.

 

Keep in mind that we have very different standards for pages on large, professionally-produced business websites than we have for small amateur, hobbyist, or personal websites. The quality of MC we expect for a large online store is very different than what we might expect for a small local business website.

All Page Quality ratings should be made in the context of the purpose of the page and the type of website.

Important : Low quality MC is a sufficient reason to give a page a Low quality rating.

The much abbreviated version of this section has specifics on clickbait:

The quality of the MC is an important consideration for PQ rating. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well.

In addition, please examine the title on the page. The title of the page should describe the content.

Exaggerated or shocking titles can entice users to click on pages in search results. If pages do not live up to the exaggerated or shocking title or images, the experience leaves users feeling surprised and confused. Here is an example of a page with an exaggerated and shocking title: “Is the World about to End? Mysterious Sightings of 25ft Sea Serpents Prompt Panic!” as the title for an article about the unidentified remains of one small dead fish on a beach. Pages with exaggerated or shocking titles that do not describe the MC well should be rated Low.

Important : The Low rating should be used if the page has Low quality MC.

 

Unsatisfying Amount of Main Content

Here there is a small change, but it does make an evaluator aware that the amount of content must be judged against the purpose of the page.

Old version:

Important : An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.

New version:

Important : The Low rating should be used if the page has an unsatisfying amount of MC for the purpose of the page.

 

Lack of Purpose Pages

This is a very important area, Google stating that  “Some pages fail to achieve their purpose so profoundly that the purpose of the page cannot be determined. Such pages serve no real purpose for users.”

Pages that Fail to Achieve Their Purpose

This is another section that was reorganized and rewritten.  Here is the updated version:

Lowest E-A-T

One of the most important criteria of PQ rating is E-A-T. Expertise of the creator of the MC, and authoritativeness or trustworthiness of the page or website, is extremely important for a page to achieve its purpose well.

If the E-A-T of a page is low enough, users cannot or should not use the MC of the page. This is especially true of YMYL topics. If the page is highly inexpert, unauthoritative or untrustworthy, it fails to achieve its purpose.

Important : The Lowest rating should be used if the page is highly inexpert, unauthoritative, or untrustworthy.

No/Little Main Content

Pages exist to share their MC with users. The following pages should be rated Lowest because they fail to achieve their purpose:
● Pages with no MC.
● Pages with a bare minimum of MC that is unhelpful for the purpose of the page.

Lowest Quality Main Content

The Lowest rating applies to any page with Lowest Quality MC. Lowest quality MC is content created with such insufficient time, effort, expertise, talent, and/or skill that it fails to achieve its purpose. The Lowest rating should also apply to pages where users cannot benefit from the MC, for example:

● Informational pages with demonstrably inaccurate MC.
● The MC is so difficult to read, watch, or use, that it takes great effort to understand and use the page.
● Broken functionality of the page due to lack of skill in construction, poor design, or lack of maintenance.

Have high standards and think about how typical users in your locale would experience the MC on the page. A page may have value to the creator or participants in the discussion, but few to no general users who view it would benefit from the MC.

Copied Main Content

An interesting part they removed from the beginning of this section is the comment that “Every page needs Main Content.”

They also combined the two sections “Copied Main Content” and “More About Copied Content”, although the combined text is nearly identical to the originals.

They did remove the following:

If all or most of the MC on the page is copied, think about the purpose of the page. Why does the page exist? What value does the page have for users? Why should users look at the page with copied content instead of the original source?

That is a curious part to remove, since it was a valid way to determine whether the content has value despite being copied or syndicated.

Auto-Generated Main Content

This section was renamed from “Automatically-Generated Main Content”, perhaps to match industry lingo.

This section is primarily the same, but added “Another way to create MC with little to no time, effort, or expertise is to create pages (or even entire websites)” to the first paragraph.

 

 

 

Conclusion

There is a lot here as you can see, but for me the main point is that a page should be USEFUL and be WORTH READING.

Curiously though, the guidelines do not state that Copied Content is necessarily a bad thing. I read this as meaning that if a page uses content from another site, but then goes on to Add Value, that page should not be down-rated.

It also points out that there are no firm guidelines on the amount of content that should be considered too low. BUT it does state that the length of content can be used to help identify pages of the highest value…

I do hope that this information helps, and thanks again to Jennifer Slegg for her work.

 

 

 

Search Engine Optimisation and Link Building in 2018

Let’s face it, SEO is considered by many to be a ‘black art’, by others to be a waste of time, whilst those who do agree that SEO is worthwhile will endlessly debate about what is good and what is bad, what tactics are ‘white’ and what are ‘black’.

“If you ask six SEO experts a question you will probably get seven answers…”

Then again, if you turn to Google and ask about SEO, they appear to disagree with the whole concept (wanting their results to be natural and not manipulated). At the same time, they know that without someone to help all the website owners ‘understand’ how to set up their sites so that Google can read them properly, they (Google) would be lost.

The Death of Link Building Announced Again (and Again and Again)

When it comes to the thorny topic of Link Building, not only do we see Google denouncing the process because it is not natural, we also see them desperate for some help in deciding which sites to list and which ones not to bother with. Like it or not, Google needs links.

BUT, not all links are equal, and there is definitely a way of gaining Google’s displeasure when it comes to building them. Do it the wrong way and your site is doomed; that is one of the known Google rules, and it is put into play all the time.

Turning to the Experts

This is why many businesses turn to the experts. Here they can rest easy, knowing their site will be built in a way that Google can read, and that the content will be created to suit the Search Engines and human readers alike. They will also know that the links built to their site will be created in such a way that it will not be penalised by Google. They will then expect their site to get better rankings and more traffic. In many cases this is exactly what happens, but in some it does not, and sometimes it is impossible for anyone to discover just what has gone wrong; what appears to work in one area fails to do so in another…

Either way, you can be pretty sure that the website owner will not really be aware of what is going on, or that there are many ways of creating a ‘buzz’ and the boost of (relevant) traffic that all website owners want.

The Key to Top Class Traffic

If you own a business which has a website, you will, I am sure, have been inundated with telephone calls and emails promising you top rankings on Google, sometimes for little cost. You will also have seen countless bits of software that will boost your site, often, they say, at the touch of a button….

Some of these claims will be made by bona fide companies, and some of the software – particularly research software – can be useful. But what few of them will tell you is that it is the CONTENT of the site that will win the day, both for getting traffic and for converting your visitors to customers.

The True Power of Content

So why is content so very important? This may seem a strange question, but many site owners do not give it much attention. They spend a great deal of time discussing format and presentation, but often give scant regard to the content the pages are to hold. Doing things this way is simply not going to work: there is nothing for Google to get its ’teeth’ into, so the rankings are poor, and when (and if) a potential customer arrives, there is nothing to ‘make’ them want to buy, or at least to take the relationship any further.

The correct way of approaching any market place (and the keyword market place of the web is no different) is to see what people want. When it comes to Google, this means finding out what phrases people use in the market sector which relate to your products, and thus the pages of your site that will be selling those products.

Reverse Engineering and Latent Semantic Indexing

Then you can start writing content that uses those phrases and similar words, using a technique called LSI (Latent Semantic Indexing) – this being vital as Google gets cleverer and cleverer. You can even reverse engineer the top sites in Google for a given phrase, thereby TELLING you what words to use.
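As a rough illustration of this reverse-engineering idea, the sketch below counts which terms recur across several top-ranking pages for a phrase; terms shared by more than one competitor are candidates to work into your own copy. The page snippets, stopword list and lawnmower theme are all invented for the example – a real run would fetch and strip the actual ranking pages first.

```python
from collections import Counter
import re

# Hypothetical snippets of top-ranking pages for a target phrase --
# in practice you would fetch and strip the real pages first.
competitor_pages = [
    "Choosing a lawnmower: petrol mowers suit large lawns, electric mowers smaller gardens.",
    "Our lawnmower guide compares petrol and electric mowers, cutting width and grass collection.",
    "The best mowers for your lawn: blade width, grass box size and petrol versus electric power.",
]

# A tiny stopword list for the demo; a real one would be far longer.
STOPWORDS = {"a", "and", "the", "for", "your", "our", "versus", "suit"}

def related_terms(pages, top_n=5):
    """Count how many pages each term appears in; terms shared across
    several top-ranking pages are candidates for your own content."""
    doc_counts = Counter()
    for page in pages:
        words = set(re.findall(r"[a-z]+", page.lower())) - STOPWORDS
        doc_counts.update(words)
    return [term for term, n in doc_counts.most_common() if n > 1][:top_n]

print(related_terms(competitor_pages))
```

This is only a word-overlap heuristic, not true Latent Semantic Indexing (which involves matrix factorisation over a large corpus), but it captures the practical point: let the pages that already rank tell you which vocabulary to cover.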

Pages written this way will not only give Google what it wants, but will also give the visitor the information that will enable them to decide if your product is for them or if you can help them solve the problem that drove them to search for help in the first place.
This content ought to include images, videos, flow charts and anything else that will help them to make a good decision (which hopefully means doing business with you).

The OTHER reason for TOP QUALITY CONTENT

Saying all this, content has another VITAL job to do in the battle for traffic and sales. Having good content will mean that others will link to the site and mention it in their Social Media postings; after all, they will have good reason to: they will have something WORTH SHARING.

content is king internet concept

But it is not always easy getting people to notice how good your copy is. The whole thing is a bit of a ‘chicken and egg’ situation. After all, your fantastic copy can’t get links until someone finds it and reads it, and without rankings or some form of Social Media chatter, no one will ever know it is there…

Priming the Pump (and keeping the pressure up too)

This is where SEO, Paid Search and Social Media come into play. By using all or some of these channels, website owners can start the ball rolling so that people can see just how good they, and their all-important copy, are.

Great Copy Required

This is where the need for top rate copy comes in. Even though the SEO and Social Media work above will bring in the visits, if there is nothing there to grip the audience, all the time and effort will have been wasted: no sales will be made and, perhaps as importantly, no one will find anything worthwhile to come back for, mention or link to.

Without these mentions and links, Google will not get the signals it wants in order to give the site ever better rankings, and thus more effort is needed to keep things going. If, however, there is something to ‘write home about’, then the links will come in, your site / product will be mentioned on Social Media, and what is more, you will get repeat visits.

Are you Getting Returning Visitors?

This is just one area where Google Analytics can help. Just as having lots of New Visitors is good, a low percentage of Returning Visitors indicates that your site is not delivering and people are not coming back for more.
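The check itself is simple arithmetic, sketched below with made-up session numbers in place of real Google Analytics figures (the 25% threshold is an arbitrary illustration, not a Google benchmark):

```python
# Returning-visitor share: returning sessions as a percentage of all sessions.
def returning_share(new_visitors, returning_visitors):
    total = new_visitors + returning_visitors
    return 100.0 * returning_visitors / total if total else 0.0

# e.g. 1,800 new and 200 returning sessions in the reporting period:
share = returning_share(1800, 200)
print(f"Returning visitors: {share:.1f}%")
if share < 25:  # illustrative threshold only
    print("Time for a long hard look at the content")
```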

 

Are you getting a high enough level of returning visitors?

If your site is one with a poor level of returning visitors, then a long hard look at the contents is a must…

The Different Processes of SEO

The term Search Engine Optimisation covers a host of things, usually divided (basically) into Technical, On Page, Off Page and Social Media.

Technical SEO

This covers the way a site is built, how fast it is, and how easily it can be read and navigated, as well as topics like Rich Snippets and Schema. Some of this is easy to do, some of it a bit more difficult, and not all website developers or SEO professionals cover all of these areas.
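To make the Schema side concrete, the sketch below builds the kind of JSON-LD structured data block that rich snippets rely on, using the schema.org LocalBusiness type. All the business details are invented placeholders; a real page would carry its own values inside a script tag in the head.

```python
import json

# Invented example data for a schema.org LocalBusiness description.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plastering Ltd",
    "url": "https://www.example.com",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Exampletown",
        "postalCode": "EX1 2AB",
        "addressCountry": "GB",
    },
}

# Serialise it into the script tag that would sit in the page <head>:
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

Generating the markup programmatically like this (rather than hand-editing it per page) makes it easier to keep the structured data in step with the visible page content, which is what Google expects.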

On Page SEO

Here we are talking about the words on the pages and the placement of the ‘target keywords’ in the important places on the page, all with a view to ensuring that Google finds what it needs in the ‘appropriate places’ on a site’s pages. Of course this includes the copy / content of a site, but the copy itself is not SEO. SEO is how you make sure that the copy is found, not the actual copy (except that SEO will help you find out what to talk about in the first place, via the Keyword / Market Research phase).
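A simple way to audit that keyword placement is to parse a page and check whether the target phrase actually appears in the title and main heading. The sketch below does this with Python’s standard-library HTML parser; the page fragment and lawnmower keyword are made up for the example.

```python
from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Collects the text of the <title> and <h1> tags -- two of the
    'important places' that on-page SEO cares about."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.found = {"title": "", "h1": ""}

    def handle_starttag(self, tag, attrs):
        if tag in self.found:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.found[self._current] += data

def keyword_placement(html, keyword):
    """Return, per tag, whether the keyword appears in its text."""
    parser = OnPageCheck()
    parser.feed(html)
    return {tag: keyword.lower() in text.lower()
            for tag, text in parser.found.items()}

# Hypothetical page fragment for a lawnmower retailer:
page = "<html><head><title>Petrol Lawnmowers | Example Shop</title></head>" \
       "<body><h1>Choosing a petrol lawnmower</h1><p>...</p></body></html>"
print(keyword_placement(page, "petrol lawnmower"))
```

A fuller audit would also look at meta descriptions, subheadings and image alt text, but the pattern is the same.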

Off Page SEO

This area covers the issue of Links (and to a degree Social Media). It is these ‘signals’ that attract Google’s attention and that will get the rankings and ‘seed’ traffic needed. Creating links also helps to keep the pot boiling while the site builds up its momentum.

Social Media

This is included here because, even though it is nothing to do with SEO per se, it is important when considering the process of getting the site, brand and product noticed and talked about in a way that will enhance the site in Google’s eyes and extend its reach beyond the Search Engine results.

The Basic SEO process

In all cases it is necessary to carry out the keyword research so that you can target the phrases that are relevant to your market place AND are being used today.

The site must then be built the right way (the Technical SEO bit), and then the copy created. This should be of a high quality, but does not have to be as good as it does for ‘Top Notch SEO’, for reasons that will become apparent later. Things such as internal linking should of course be carried out, but basically this is a ‘quick’ method of SEO.

Then the link building starts. These are built in the right way at the right speed, using techniques like ‘Power Link Structures’. Social Media signals are also created using this method. In some cases the pump is also primed by actually creating a small amount of traffic to some of the articles and posts that form a part of the linking structure.

Guest Posts Are Used in the quick method too

Guest posts are included in the ‘quick’ method too, of course, but they are used differently. As you will see later on, the ‘proper way’ of placing Guest Posts is to find a top Influencer site, chat to them and get them to accept the post (or pay a lot of money for the privilege). However, this process is a LOT more expensive than just placing an article on a relevant site, so for those clients with limited budgets this is the way we go. Basically, these Guest Posts are ‘link vehicles’, and as long as they are well written (no article spinning here at SOM) and contain links that are not going to trigger a Penguin penalty, they do help; we have many examples that prove the point.

Carrying out SEO in this manner DOES work and is the way the majority of SEO companies work.

The Top Quality SEO process

If you talk to those SEO professionals who practise only the whitest of white SEO, they will say this is the only way, everything else being a waste of time. Well, I disagree with that, but there is no doubt that this process is superior and offers a greater chance of success. BUT it is a lot harder, and thus more expensive in time and money.

This process includes all the On Page SEO that Basic SEO requires, including things such as having explanatory ‘Category Pages’ for Ecommerce sites. These are needed because most sites of this type have lots of product pages that (a) often use the same words as a host of other sites and (b) are often far too short. These Category Pages allow the owner to present the products they sell, with links of course to the product pages themselves. Such pages can be much better at getting rankings, and their use should therefore be seriously considered for all levels of SEO.

Power Pages

Remember, this whole process is based on having TOP QUALITY content on your site. Such pages are often called ‘Power Pages’, their contents varying from ‘how to do something’ guides to a great infographic – anything that would be interesting to visitors and has not been done before (or at least not done sufficiently well).

Text based Power Pages need to be around 2,000 words long and contain images, videos and links to other authoritative content on the web, PLUS of course the areas in your site that you want people to see and the pages that will result in conversions and sales. Infographics can be just that, but having some words on the page as well can help in my opinion (just as having a transcript of the words used in a video can).

What to Create the Power Page About

The Keyword Research for the site would of course have been carried out first, so the target phrases are known and understood. Using these words, the bulk of the site, (the ‘normal’ pages) will be written and optimised, this including interlinking relevant pages (Google likes this).

Power pages, however, have a different mission. Their job is to get noticed BIG TIME, to become a fount of knowledge and a ‘go to’ source of information on a particular subject (relevant to the products and services of the hosting website). With this in mind it is easy to see that the very first thing you have to know is what subject to write about.

Research into Trends (or try to start one yourself)

This is where checking on trending posts and web pages can be a great help, as it allows you to see what people have become interested in over a period of time (which you can set). You can then have a look at these posts / pages and use them as a basis of your own works, all in the knowledge that people are INTERESTED in the topic.

finding trending topics

 

Of course you can also plough your own furrow, choosing a topic that is relevant to your market place, for instance ‘What is the History of Plastering’ or ‘How to Choose the Right Lawnmower’. There are countless topics to choose from. Besides Kudani, you could also use Buzzsumo.

Writing the Page

Either way, you can start your research into what to talk about, and that will mean looking not only at the trending sites, but also at all the top INFLUENCER sites, in this instance the ones that are at the top of the Search Engines’ results for some top terms.

All the while it is vital to make sure that the page will be ‘entertaining’ and fulfil one of its main purposes, that of being WORTH SHARING.

Note, for a great definition of what an Influencer is, click this link.

Supporting Guest Posts

One important part of this SEO process is to make sure that there are links to the Power Page from trusted sites. But as it can take some time to get an INFLUENCER to mention the page or allow a Guest Post on their site, the first thing that needs to be done is to place a well written ‘taster’ post on a high Domain Authority site.

Thus one post (perhaps more) is written and placed on some relevant sites. In most instances this means paying a ‘publishing fee’. Here I must state that there are some SEOs who think that placing a post on a site that is known to take money for the privilege is worthless. However, when you know that high profile sites like the Huffington Post take money for Guest Posts, you can see that their argument holds little water.

Converting the Influencers

This starts at the website level, where selected sites are contacted with a view to them mentioning the Power Page or accepting a Guest Post. The post need not contain a DO FOLLOW link, as we are after traffic as much as link juice, but if they will allow a FOLLOW link, then all the better.

It is best if these influencers have been contacted and nurtured for some time before you make a request to place a Guest Post on their site (this also being the case with Social Media Influencers).
Hopefully one of the sites you contact will allow the publication and thus provide you with a link and the potential for a lot of relevant traffic.

Create a Press Release

Press Releases are a well known way of creating a ‘buzz’ in a manner of which Google approves. All the syndicated copies of a release are the same of course, but as the links in them are always NO FOLLOW, this does not matter. Google, it is said, really loves press releases, so one pointing to your Power Page, explaining how interesting it is and why readers should not miss it, is a good idea.
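The FOLLOW / NO FOLLOW distinction referred to above lives in the `rel` attribute of each link. As a small illustration, the sketch below sorts a fragment of markup into followed and nofollow links using Python’s standard-library parser; the URLs and markup are invented for the example.

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Splits a page's links into followed and nofollow lists --
    the distinction the press-release discussion turns on."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

# Invented example markup, the kind a press release page might carry:
audit = LinkAudit()
audit.feed('<a href="https://example.com/power-page" rel="nofollow">Read it</a>'
           '<a href="https://example.com/about">About us</a>')
print(audit.followed, audit.nofollow)
```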

Posting on your Own Social Media Channels

Presuming you have some Social Media accounts, now is the time to start posting about the Power Page (although maybe you have been talking about its arrival for a few weeks already – another neat trick). Remember that you will have to post again and again here, Social Media posts being, for the most part, short lived; they are soon replaced by the next tweet and scroll off people’s screens. This makes choosing the right time to post important too.

Contacting the Social Media Influencers

Now is the time to start contacting the Social Media Influencers. There are various ways the leaders in a field can be found, and once found, the ‘nurturing process’ needs to be continued. As with the website influencers, this process needs to have been started some time before the Power Page is published.

The idea here is to mention that they may be interested in the Power Page’s contents, perhaps also mentioning the Guest Posts that have already been posted and the Press Release. All of this with the aim of getting them to ‘add their weight’ to the campaign.
This is important, as if they can be convinced to mention the power page on their Social Media accounts, the ripples will build and build, all resulting in more traffic and higher rankings.

Monitor and Interact

Hopefully you will have had some comments on your Social Media channels and on the Guest Posts (where the sites allow). It is VITAL that you monitor these and respond as that will only strengthen the whole campaign.

In Conclusion

So there we have it, a brief summary of what SEO is, and how the two main types differ. Hopefully you can see the differences between the two approaches and can understand why SEO carried out ‘by the book’ is such a long, complicated and thus expensive process.

The good news for businesses with shallower pockets is that ‘basic SEO’ does work in most markets; you just have to approach any highly competitive areas in a cleverer manner, and not charge headlong in trying to get top rankings for highly competitive keyword phrases.

Factors To Consider When Hiring A Search Engine Marketing Company

When people go online to find information, they almost always start by firing up their favourite search engine and typing in what they are looking for. The search engines, in turn, show a list of results that are related to the user’s query. These results are ordered by how relevant they are to the query, as well as a number of other factors. Search engine optimization (SEO) is the process of optimizing certain components of a website so that it will show up higher in the search results.

SEO is big business. In fact, there are companies that do nothing else but help businesses and website owners optimize their sites. It makes sense when you think about it. After all, if you are able to achieve a top listing with your site, you can get practically unlimited free traffic to your pages. This can dramatically boost your bottom line by allowing you to get more leads or sales.

Unfortunately, not all search engine marketing companies are created equal. The process of optimizing a website is rather complex. Even more important, however, is the fact that the variables that search engines look at when ranking websites are constantly changing. A good SEO agency needs to be up to date with all of the latest optimization techniques if they want to help their clients get results. This is especially important when you consider that certain optimization techniques that worked in the past such as link building can now get your site penalized in the search results.

Because of that, it is important to thoroughly vet any search engine marketing companies that you are planning on hiring to work on your website. A good place to start is by asking for a list of sites that they have worked on in the past. This can help you see firsthand how these sites are currently ranking in the search results. If the sites aren’t showing up on the first page for keywords that are related to the products or services that they offer, you should probably keep looking until you find a company that is more qualified.

You also need to be sure that the company you hire to work on your site provides excellent communication. They should not only keep you up to date with the changes that they are making to your site, but also with how your site’s ranking is changing over time. This can help you make sure that your money is being well spent.

Finally, you need to choose a company that is realistic about expectations. If a company promises to get your site to the number one spot in the search engines, you should choose a different search engine marketing company instead. It is not possible to know for sure whether or not a site will ever achieve a top listing. Instead, they should clearly outline their overall optimization strategy for you, explaining how each step that they take will benefit your site, rather than making promises that they can’t keep.

 

If your website doesn’t show up on the first page of search results on Google, Bing or Yahoo, your potential customers might not even know you exist. Better search engine visibility can be critical to boosting visits to your website, which can lead to increased brand awareness and higher sales and profits.

But what if you lack the time and technical expertise to improve your site’s search engine ranking? It might make sense to hire an experienced, reliable search engine optimization (SEO) consultant.

Here are 10 essential questions to ask when considering prospective SEO consultants:

1. May I have a list of current and past clients?
A reputable SEO consultant should be open to sharing a brief list of current and former clients and his or her contact information, says Vanessa Fox, author of Marketing in the Age of Google (Wiley, 2012) and founder of Nine By Blue, a Seattle-based SEO software provider.

These references can help you gauge how effective the candidate is, as well as verify that the person did indeed work on specific SEO campaigns. Clients may not provide specific analytics, Fox says, but they should be able to at least tell you if they saw a positive impact on their search ranking, especially in conversions and in gaining an audience, as a direct result of the consultant’s efforts.

2. How will you improve my search engine rankings?
Steer clear of SEO consultants who won’t freely discuss their methods in detail, cautions Rand Fishkin, founder of Moz, a Seattle-based internet marketing software company and co-author of The Art of SEO (O’Reilly, 2012). They should explain the strategies they would use to drive up your website’s search engine ranking, as well as estimate how long it could realistically take to achieve the SEO campaign goals you agree on.

Make sure the candidate’s proposal includes an initial technical review of your website to weed out any problems that could lower your search engine ranking, including broken links and error pages. Consultants also should provide “on page” optimization, a process to make your website as search engine friendly as possible. It involves improving your website’s URL and internal linking structure, along with developing web page titles, headings and tags.
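A minimal sketch of what part of such a technical “on page” review might check, using only Python’s standard library. The thresholds (roughly 60 characters for a title, exactly one h1) are common rules of thumb, not anything the article specifies:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the title, meta description and h1 tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Returns a list of basic on-page problems found in the HTML."""
    parser = OnPageAudit()
    parser.feed(html)
    problems = []
    if not parser.title:
        problems.append("missing <title>")
    elif len(parser.title) > 60:
        problems.append("<title> longer than ~60 characters")
    if not parser.meta_description:
        problems.append("missing meta description")
    if parser.h1_count != 1:
        problems.append("page should have exactly one <h1>")
    return problems
```

A real review would go much further (broken links, error pages, internal linking), but even a toy checker like this makes the consultant’s proposal concrete enough to discuss.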

Also, ask consultants if they provide “off page” SEO strategies to raise awareness of your content on other websites, often via blogs, social media platforms and press releases.

3. Do you adhere to search engines’ webmaster guidelines?
You want a consultant who strictly abides by Google’s publicly posted webmaster best practices, which specifically prohibit 12 common SEO tricks, including automatically generating spammy content and adding bogus hidden text and links. If a candidate doesn’t follow those guidelines, your website could be relegated to a dismally low search results ranking. Or, worse yet, Google could ban it from search results altogether.

Bing and Yahoo also post webmaster best practices that consultants should confirm they follow.

4. Can you guarantee my website will achieve a number-one ranking on Google, Bing and Yahoo?
If the candidate answers yes, Fox warns, “Turn and run in the other direction as fast as you can.” Although it’s impossible to guarantee a number-one ranking on any search engine, she says, some unethical SEO consultants do make such bogus guarantees.

Consider it a red flag if the candidate claims to have an insider relationship with Google or any other search engine that will get you priority search results rankings. Only Google, Bing and Yahoo can control how high or low websites appear in their search results.

5. Are you experienced at improving local search results?
Appearing in the top local search engine results is especially important to small brick-and-mortar businesses trying to attract nearby customers, Rand says. You’ll want a consultant who has expertise in local SEO techniques.

If your website is optimized for what’s known as “local SEO,” it should appear when someone nearby is searching for keywords that are relevant to your business. To achieve that, a consultant should add your business’s city and state to your website’s title tags and meta descriptions, and get your site listed on Bing, Google and Yahoo’s local listings, which are online directories of businesses that cater to a specific geographical area.
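As a rough illustration of that first step, here is how location-aware title tags and meta descriptions might be generated. The business name, service and wording are all made up for the example:

```python
def local_tags(business, service, city, region):
    """Builds a title tag and meta description that include the location,
    as a local-SEO consultant might for a brick-and-mortar business."""
    title = f"{business} | {service} in {city}, {region}"
    description = (
        f"{business} provides {service.lower()} services to customers "
        f"in and around {city}, {region}. Call us today for a free quote."
    )
    return title, description

title, desc = local_tags("Acme Plumbing", "Emergency Plumber", "Leeds", "West Yorkshire")
```

The point is simply that the city and region appear in both tags, so the page can match nearby searchers’ queries.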

6. Will you share with me all changes you make to my site?
Search engine optimization will most likely require a number of changes to your existing web page coding. It’s important to know exactly what adjustments the consultant plans to make and on how many web pages. If you would like the candidate to get your permission before accessing and altering your website code, be sure to say so.

 

Read more: https://www.entrepreneur.com/article/227229

Making The Most Of Your Search Marketing Strategy

Business owners know that in order to be successful, advertising and marketing are essential. It does not matter what goods or services you offer: if customers don’t know about you, your business will not thrive. Since we now live in a technological age, marketing is more important than ever before, and just having a website is not enough; you have to be found on the search engines so that you can grow your customer base. This means that you will need to target the niche keywords that potential customers are using to search for the services or goods that you supply.

Many businesses will try to save money and do a bit of search marketing themselves; they might take part in a beginner’s class or buy a book or DVD to give them a bit of insight into search marketing. What these people might not realise or understand is that this type of marketing takes in a number of different strategies. It is therefore advisable to engage a specialist SEO company who can undertake the work and make sure that it is done properly and within the rules of the search engines.

Doing search marketing takes time, effort and a certain level of skill in order to achieve results that will drive more quality traffic to the site. By hiring an internet marketing firm, you will be buying the services and expertise of people who really know what they are doing. It will save you time and money in the long run. Remember that marketing is just one aspect of running your business and you will be better served spending your valuable time on other areas where your own expertise can be harnessed.

Another benefit of using an online marketing firm is that you could well have an advantage over the competition, particularly if they are not doing much marketing or are doing it themselves. A new business or one that needs a bit of a boost will certainly make advances in their rankings by making use of expert services. There will be a range of services on offer and although you will be spending money on these, you will reap the rewards and start making money when the website starts converting traffic.

Remember that you also need to ensure that your website is accessible from mobile devices. This is important because it is a fact that more and more searches are being made from mobiles and you do not want to miss out on high levels of potential customers. Even if your site is just for information about your business, it is still important that you make sure that prospective clients can visit your website from their device. So, talk to us today about how we can help you with your search marketing campaign and what you would like to achieve.

 

 

Over the last few years, search engines such as Google, Bing, and even Apple have been upgrading their algorithms and machine learning processes to account for the end user’s experience. But since their algorithms are built upon the work completed by automated crawling bots (pieces of software that scour the internet), it has always been difficult for them to truly simulate the actions of a flesh-and-blood user. And it’s not feasible for them to create an algorithm based on the anecdotal feedback of an army of individual users submitting their findings.

Instead, the search engines have started to write logic and incorporate machine learning algorithms that, based on vast troves of user behavior metrics, estimate what a user’s experience on a website should be. Some of the criteria they now measure are site speed, mobile optimization, site structure, content, and dozens of other signals that give the algorithm an idea of whether or not search engine users are getting what they expect from a website.

So, what does this mean for companies, marketers, and website owners when it comes to their SEO?

Basically what I, and dozens of other SEO industry experts, have been writing about for years has now come to fruition. We’ve exited the era of search engine optimization (SEO), and have now entered the new age of search experience optimization (also… SEO).

And this is great news for anyone that performs digital marketing correctly. It means that “gaming” the system has become less and less viable, and that groups who rely on black hat techniques are seeing their efforts become less effective.

So, how should websites be optimized for the search engines now that user experience plays such a big role?

Ask Questions, Provide Answers

Previously, marketers used to obsess over ideas like keyword density, meta descriptions, and link profiles. They had everything down to percentages and numbers, and it all made sense when it was placed into an Excel sheet. But how on earth was a website built from data on an Excel sheet supposed to appeal to a human being?

That’s the problem the search engines set out to fix. And you need to accommodate the changes they’ve made.

Specifically, you need to think about your website visitors at every stage of your web design and marketing process. And this can be done easily with a series of question and answer audits you can ask yourself as you’re creating your marketing campaign.

For instance, if you’re designing a web page and you’re wondering how to make it appear in the Google search results, you should start by asking what your customers are typing into the search engine. This sounds rudimentary, but think it through for a moment. Previously marketers would optimize for terms such as “snow tires” or “weight loss products”. But search habits have become more semantic and people are no longer typing in general terms, but rather they’re asking questions.

Thus, the search term “snow tires” has evolved into, “what are the best snow tires for a 2008 Ford F150?”

And it’s the companies that are answering the questions for their customers that are starting to win in the search engine rankings. So, stop fretting over how many times you mention the keyword in the content you’re writing on the page, and instead start asking yourself what your customers need help with.

Embrace Mobile

If you’ve been living under a rock for the last 10 years, you may be shocked to hear that most people use smartphones and that smartphone searches now account for more search volume than desktop searches. However, if you’ve been living in the world with the rest of us, this isn’t too surprising. So, if everyone is using mobile devices to browse the web, shouldn’t you likewise be optimizing your site for mobile traffic?

Read more: http://www.forbes.com/sites/miketempleman/2016/02/16/seo-has-evolved-to-search-experience-optimization/2/#4930c0402bac

What is Unique Content and Why Is It Important?

When it comes to getting better rankings you will hear all SEOs saying, “you need good unique content as that is what Google wants now“. But what does this really mean?

Many have taken this to mean that all sites have to do is to create copy which is not duplicated somewhere else on the web and is also a good read, using proper English throughout.

Can Google Really Tell What is Good?

When you think about it, that should be (and maybe is) enough for Google, in that their systems, however powerful, cannot really deduce what is ‘good’ and ‘useful’ content; that is still really a job only a human can do.

So, the first step is to write some good copy that is not itself used anywhere else, and make sure it is over 1,000 words long (that seems to be the lowest level that Google ‘likes’). If you can, then include some images and, if possible, video on the page, and link out to a high power site (one in the same niche / market area) that provides some ‘back up’ to the page, a page that provides the facts and figures referred to being the best.
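The checklist above can be sketched as a small helper. This is only an illustration of the criteria as the article states them (the 1,000-word threshold is the article’s observation, not a documented Google rule), and it assumes you already know the page’s image count and outbound links:

```python
import re

def content_checklist(text, image_count, outbound_links):
    """Rough checks against the criteria above: length, media on the page,
    and at least one outbound link to a supporting site."""
    words = len(re.findall(r"\b\w+\b", text))
    return {
        "over_1000_words": words > 1000,
        "has_images": image_count > 0,
        "links_out": any("http" in url for url in outbound_links),
    }
```

A page failing any of these checks is not doomed, of course; the point is simply to review the basics before worrying about links in.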

You will then need some links into the page, but before we come to that, we need to work out why links are needed.

When Considering SEO You Have to Think Like a Computer

Here you have to start thinking like a computer, looking at things logically and asking yourself the question, “If I was a computer what would I look for if I was trying to decide if this page was any good or not?”

We have seen above that Google checks the words on the page, but that does not give it any idea about the real value of the content, only humans can do that, and as far as Google is concerned, this is signified by a page having been shared, liked, linked to or otherwise mentioned by others on the Web.

Finally, it seems that Google, being a lot cleverer than it used to be, now also checks to see how many visits a page / site gets, for the simple reason that a site / page cannot get lots of links or mentions if it has not been visited in the first place. Whilst this is not strictly true (a site can get a lot of links from a press release without a single visitor going to a page), it is an indication that all is as it should be and that the site / page has not been subject to (too much) manipulation in SEO terms.

As you can see, there is a lot more than first comes to mind when considering what you should add to your website…

To read more on the subject of what makes unique content please click the link.

Modern criteria for content

So let’s start by talking about our modern criteria for content, and I have a slide that I like to show a lot that kind of displays this, and many other folks in the field have as well. So if I’m going to be producing content, I need to meet these five criteria.

One of a kind

One of a kind is basically what we meant when we said old unique content, meaning that the engines have never seen those words and phrases and numbers and visuals and whatever in that order on a page on the web previously. It’s been written for the first time, produced and published for the first time. Therefore, it is one of a kind, doesn’t appear elsewhere.

Relevant

Relevant meaning it contains content that both searchers and engines interpret as on topic to that searcher’s query or their intent. Sometimes you can be on topic to the query, meaning you’ve used the words and the phrases that the searcher used, and not be on topic to their intent. What did they actually want to get out of the search? What question are they trying to answer? What information are you trying to get?

Helpful

This one’s pretty obvious. You should resolve the searcher’s query in a useful, efficient manner. That should be a page that does the job that they’re hoping that that content is going to do.

Uniquely valuable

This is the one we’re going to be talking about today, and what we mean here is provides information that’s unavailable or hard to get elsewhere — I’m going to dive into that a little bit more —

Great user experience

This means it’s easy and pleasurable to consume anywhere on any device.

You meet these criteria with your content and you’ve really got something when it comes to a content marketing strategy or when it comes to content you’re producing for SEO. This is a pretty solid checklist that I think you can rely on.

Unique value and you (and your website)

The challenge is this one. Uniquely valuable has been a really hard concept for people to wrap their heads around, and so let’s dig in a little more on what we mean when we say “unique value.”

So these are kind of the three common criteria that we mean when we say “unique value,” and I’m actually going to show some examples as well.

1) Massive upgrade in aggregation, accessibility and design

The first one is a massive upgrade versus what’s already available on the web in aggregation, accessibility, and/or design. Meaning you should have someone who views that content say, “Wow. You know, I’ve seen this material presented before, but never presented so well, never so understandable and accessible. I really like this resource because of how well aggregated, how accessible, how well designed this resource is.”

Good examples, there’s a blog post from the website Wait But Why on the Fermi Paradox, which is sort of a scientific astrophysics, “why are we alone in the universe” paradox concept, and they do a brilliant job of visualizing and explaining the paradox and all of the potential scenarios behind it. It’s so much fun to read. It’s so enjoyable. I’ve read about the Fermi Paradox many times and never been as entranced as I was as when I read this piece from Wait But Why. It really was that experience that says, “Wow, I’ve seen this before, but never like this.”

Another great site that does pure aggregation, but they provide incredible value is actually a search engine, a visual search engine that I love called Niice.co. Not particularly easy to spell, but you do searches for things like letter press or for emotional ideas, like anger, and you just find phenomenal visual content. It’s an aggregation of a bunch of different websites that show design and visual content in a search interface that’s accessible, that shows all the images in there, and you can scroll through them and it’s very nicely collected. It’s aggregated in the best way I’ve ever seen that information aggregated, therefore, providing unique value. Unfortunately, since it’s a search engine, it’s not actually going to be indexed by Google, but still tremendously good content marketing.

2) Information that is available nowhere else

Number two is information that’s available nowhere else. When I say “information,” I don’t mean content. I don’t mean words and phrases. I don’t mean it’s one-of-a-kind in that if I were to go copy and paste a sentence fragment or a paragraph and plug it into Google, that I wouldn’t find that sentence or that paragraph elsewhere. I mean unique information, information that, even if it were written about thousands of different ways, I couldn’t find it anywhere else on the web. You want your visitor to have experience of, “Wow, without this site I never would have found the answers I sought.” It’s not that, “Oh, this sentence is unique to all the other sentences that have been written about this topic.” It’s, “Ah-ha this information was never available until now.”

Some of my favorite examples of that — Walk Score. Walk Score is a site that took data that was out there and they basically put it together into a scoring function. So they said, “Hey, in this Ocean Beach neighborhood in San Diego, there are this many bars and restaurants, grocery stores, banks, pharmacies. The walkability of that neighborhood, therefore, based on the businesses and on the sidewalks and on the traffic and all these other things, the Walk Score out of 100 is therefore 74.” I don’t know what it actually is. Then you can compare and contrast that to, say, the Hillcrest neighborhood in San Diego, where the Walk Score is 88 because it has a far greater density of all those things that people, who are looking for walkability of neighborhoods, are seeking. If you’re moving somewhere, or you’re considering staying somewhere downtown, or an area to visit for vacation, this is amazing. What an incredible resource, and because of that Walk Score has become hugely popular and is part of many, many real estate websites and visitor and tourism focused websites and all that kind of stuff.

Another good example, blog posts that provide information that was previously unavailable anywhere else. In our industry I actually really like this example from Conductor. Conductor, as you might know, is an enterprise SEO software company, and they put together a phenomenal blog post comparing which portions of direct traffic are almost certainly actually organic, and they collected a bunch of anonymized data from their platform and assembled that so that we could all see, “Oh, yeah, look at that. Sixty percent of what’s getting counted as direct in a lot of these websites, at least on average, is probably coming from organic search or dark social and those kinds of things, and that credit should go to the marketers who acquire that traffic.” Fascinating stuff. Unique information, couldn’t find that elsewhere.

3) Content presented with a massively differentiated voice or style

The third and final one that I’ll talk about is content that’s presented with a massively differentiated voice or style. So this is not necessarily you’ve aggregated information that was previously unavailable or you’ve made it more accessible or you’ve designed it in a way to make it remarkable. It’s not necessarily information available nowhere else. It’s really more about the writer or the artist behind the content creation, and content creators, the great ones, have some artistry to their work. You’re trying to create in your visitors this impression of like, “I’ve seen stuff about this before, but never in a way that emotionally resonated with me like this does.” Think about the experience that you have of reading a phenomenal book about a topic versus just reading the Wikipedia entry.

The information might be the same, but there are miles of difference in the artistry behind it and the emotional resonance it can create.

Is Your Online Presence Failing to Sell?: Here Are 4 Reasons Why

There is an old saying that ‘you can bring a horse to water but you cannot make it drink’, and never has one been so accurate as when talking about web traffic…


From an SEO or Social Media point of view, getting traffic to a site is the first big goal, but it has to be the right sort of traffic and then the site must do its job and get them to engage, taking a ‘sip’ if not a big gulp.

The Engagement Process

A part of this ‘engagement process’ is of course down to design; it has to appeal (very quickly) to the browser, or risk losing them in those vital first seconds.

The next thing of course is the content of the page. Is it what the customer wants? Your bounce rates will tell you (and Google too, if visitors come from a search), so the content needs to be constantly reviewed, just in case you are not doing things the way that your customers want, they after all being the final arbiter…

The site’s content and the way it approaches its customers is therefore key. It does not matter how many potential customers (horses) you deliver to a site if the ‘water’ does not look good and tasty.

Getting on the Customers Shortlist

But what is ‘tasty’? A very good question, and one whose answer will change depending on what the site is about and where in the buying cycle your customer is. The article below covers this in one of its points, saying in effect that those who are just starting their quest are looking for very general data, and thus don’t want the full nine yards on your product / service, just an initial description. If you get on their shortlist, they will be back.

Besides the issue of good ‘useful’ content, there is the matter of re-engagement. This is another topic, and one that we will come back to in the future, but it is important: just because a visitor does not buy today does not mean that they might not buy tomorrow, so keeping in contact and reminding them that you are there waiting to serve them is a good idea.

For the full article on Why your site is not converting, please click the link.

The cheese moved. The buying process has changed. Technology to support and further that change continues to grow and evolve. Communicating through the vast array of digital channels (website, SEM, social, email) is no longer an option. It is a must-do.

The online presence of your business must attract and convert prospects. It must engage with leads through a variety of channels as users travel through a longer and more complex buying cycle.

How we market and communicate online has come a long way from static, brochure-like internet pages and “spray-and-pray” email blasts. Unfortunately, for many, online marketing is still failing to reach its full potential.

Pointing a finger at the underlying technology would be easy, e.g., marketing automation, content management systems or any of the tools and solutions laid out in Scott Brinker’s Marketing Technology Super Graphic.

It’s easy to say the technology is failing, so the marketing effort is failing. But the reality is more complex. Here are four of the biggest reasons why your online presence is failing to drive sales.

Engagement For The Wrong Reasons

Using engagement tactics that are not aligned with business goals is a huge waste of time and money. Too often, I see engagement for engagement’s sake. This results in leads stagnated in the buying cycle and low-volume sales funnels.

Having a high number of Twitter followers or a successful content syndication program is great, but that is not success.

CMOs are being judged on sales. And following your brand or downloading an asset is not a sale.

Social followers matter. They are your advocates. They can extend your message.

But focusing on the number of followers and not their engagement and conversion ratios results in negative ROI for the money spent to generate them. It also takes the marketing eye off the important goal of a sale.

Weak Commitment To Prospects

Generating new leads through content syndication or SEM is the start of the buyer’s journey. Most leads are not ready to buy at that point.

So not using retargeting or nurture programs to bring them back for further communication is a waste of the money spent to find them to begin with.

When they fail to travel along the pipeline because they are left to rot somewhere between the marketing and sales organization, it reflects negatively on the organization.

Lack Of Good Content

Everywhere I look, the numbers show an increase in content marketing spend and usage. Businesses are spending millions to have content developed — to tell their story, engage with their prospects, and help convert their leads through the buying cycle.

And yet much of what is used to attract and engage leads is sales enablement content. It’s all about features and functions. Or it’s focused on selling something, rather than trying to educate.

Take, for example, the content used in top-of-funnel nurture programs. More times than I care to remember, I’ve seen programs use 45-minute product webinars or 20-page product briefs.

Breakdown happens when leads don’t engage, and the prospects in the funnel dry up. This is because leads in the early buying cycle don’t want to know everything about the product, and they don’t want to be sold to. Rather, they want to know what the options are and what to consider as they do their research.

Marketing Teams Are Not Living In The Now

Stagnating means not going to where your customers are by using better ways and different channels to communicate with them online. It results in low communication. And poor communication results in low sales.

SEO in 2015 – What Has Changed and By How Much?

There is always a lot of talk about how SEO is changing all the time, and to a degree this is true. It is, however, more about changes in the quantity of each part of the ‘SEO recipe’ than the addition of new factors. That said, new factors do arise, and in 2015 three have been added to the list of factors that affect the rankings of a site, these being ‘Vertical Search’, ‘Direct Answers’ and ‘HTTPS’.

Positive SEO Factors

The changes in the recipe are reflected by ‘movements’, both positive and negative, in the amount that each item is likely to affect any rankings. There is no doubt some truth to much of this, but the fact of the matter is that Google only uses any of its own rules as a guide to how it will rate a site, and often seems to list sites that, based on the rules that we know, simply do not deserve that high page ranking.

It must also be said that the experiments / research that have led to the publication of the data and table below cannot be said to be totally scientific, for the simple reason that there are too many variables influencing the rankings gained. However, all that said, the data is useful and for my part seems to be following the path / trend of Semantic SEO.

Quality Content is Good for SEO

The first indication that this is the case comes in the very first part of the SEO Periodic Table, the ‘Cq’ (for Quality of the Content) being given a +3 factor, with the ‘symbol’ below it, for research into the keywords that you want to rank for, also being given a +3. The latter I feel really relates to the fact that this research leads on to the inclusion of the ‘right’ words on a page, words that are relevant to the search term targeted AND ones that Google ‘expects to see’.

The latter point is an interesting one, by the way, as it is all about the ‘reverse engineering’ of web pages. The process is simple enough. You decide upon the target phrase, then discover the top sites on Google for that phrase and what words are found on the majority of them. It stands to reason, then, that using the same words (as far as is possible and looks right) on a page will increase the possibility of that page being ranked for the target phrase.
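The “words found on the majority of top pages” step can be sketched in a few lines. This assumes you have already fetched the visible text of the top-ranking pages (the scraping itself is not shown); it simply computes document frequency and keeps the words that appear on more than half of the pages:

```python
from collections import Counter
import re

def words_on_majority(pages, threshold=0.5):
    """Given the visible text of the top-ranking pages for a phrase,
    return the words that appear on more than `threshold` of them."""
    doc_freq = Counter()
    for text in pages:
        # A set per page, so each page counts a word at most once.
        doc_freq.update(set(re.findall(r"[a-z']+", text.lower())))
    cutoff = len(pages) * threshold
    return {word for word, count in doc_freq.items() if count > cutoff}
```

In practice you would also strip stop words (“the”, “and”, etc.) and weight by prominence, but the core of the reverse-engineering idea is just this frequency count.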

This is all music to the copywriter’s ears, as now, besides the target words, they are also provided with a list of words to use, as well as an indication of which are the most important.

However, to get back to the SEO table, we can see that the words on the page actually have their own symbol, this carrying a +2.

Other factors are ‘old’ ones, but with their ‘power’ updated to take into account how much they appear to affect rankings today, in 2015. One that is especially interesting is the ‘Hd’ symbol (for the Meta Description). This is indeed a factor that has been around for many years now, but today it has another way of altering the rankings of its page.

This change stems from the ‘SEO feedback loop’ that Google is now suspected of running. Here the CTR (Click Through Rate) of all the links on a Google SERPs page is checked, the idea being that if a page is listed but only gets a low CTR, there must be something wrong with the way it is listed, and this of course is the owner’s fault.

The owner can of course monitor low CTR by checking in Webmaster Tools (now Google Search Console) and then make some changes (to the Meta Description) in order to boost the CTR. However, if they don’t, and the page’s CTR continues to be low, the page may lose its SERPs listing (or at least be demoted). Thus the need to make sure it is right becomes obvious, and this is why the power of the Meta Description has been increased.
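The monitoring step is easy to automate once you have the click and impression figures. The tuple format below mimics a Search Console export, and the 2% CTR floor and 1,000-impression minimum are arbitrary illustrative thresholds, not Google figures:

```python
def low_ctr_pages(report, min_impressions=1000, ctr_floor=0.02):
    """Flags pages whose click-through rate is below `ctr_floor`.
    `report` is a list of (page, clicks, impressions) tuples, as you might
    export from Google Search Console's search analytics report."""
    flagged = []
    for page, clicks, impressions in report:
        if impressions >= min_impressions and clicks / impressions < ctr_floor:
            flagged.append(page)
    return flagged
```

Pages that come back flagged are the ones whose meta descriptions (and titles) are worth rewriting first, since they are being shown but not clicked.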

Mobile Friendly Sites

Another change is that of the symbol ‘Am’ for mobile. This has increased by 2 points as well, and of course refers to the need for all sites to be Mobile Friendly.

On the other hand, we have those factors that have a negative effect on rankings, all the old favourites like paid links (Vp), keyword stuffing (Vs) and spammy links (Vl) being present.

As I say, it is the quantity of each component of the recipe that changes, not (for the most part at least) the actual components.

The article (and table) is a good read though and I’d recommend it, and thank the hardworking staff at SearchEngineLand for it too.

The latest version of Search Engine Land’s Periodic Table Of SEO Success Factors is now out. This is the third edition since we first launched our search engine optimization framework in 2011. Below, a rundown of what’s new and changed, as well as a reintroduction to the table.

The Table’s Goal & Philosophy

Our goal with the Periodic Table Of SEO is to help publishers focus on the fundamentals needed to achieve success with search engine optimization. This means it’s not about trying to list all 200 Google ranking factors or detail Google’s 10,000 subfactors. It’s not about trying to advise if keywords you want to rank for should go at the beginning of an HTML title tag or the end. It’s not about whether or not Facebook Likes are counted for ranking boosts.

Search Engine Land Periodic Table of SEO Success Factors

http://searchengineland.com/seotable/download-periodic-table-of-seo

Instead, the table is designed to broadly guide those new to or experienced with SEO into general areas of importance. Title tags are generally important. Think about making sure they’re descriptive. Social sharing is often generally seen as good for SEO. Aim for social shares, without worrying about the specific network.

If you want to understand more about the philosophy of the table, read our posts from when the table debuted in 2011 and when it was updated in 2013.

What The SEO Table Covers

There are two major classes of factors:

On-The-Page: factors that are largely within the control of publishers
Off-The-Page: factors that are often influenced by others or not directly tied to a publisher’s site
Within these two classes are seven categories of factors, which are:

Content – factors relating to the content and quality of your material
Architecture – factors about your overall site functionality
HTML – factors specific to web pages
Trust – factors related to how trustworthy & authoritative a site seems to be
Links – factors related to how links impact rankings
Personal – factors about how personalized search results influence rankings
Social – factors on how social recommendations impact rankings
Overall, there are 37 individual factors, which range from making use of descriptive HTML title tags to whether a site has success with visitor engagement. Here’s a close-up of the table, focusing on just the factors:

Mo Farah – Drug Testing – SMS Messaging and Newsjacking

You cannot have failed to notice all the ‘fuss’ about Mo Farah and drugs testing at the moment; it’s the sort of news that the press just love…


Image by Ronnie Macdonald via Flickr

Newsjacking

Such news items are, however, also a great time to do a spot of ‘Newsjacking’, where with a bit of creative thought you can ride the wave a little and promote a relevant business or product.

So, when I listened to the news and found out, to my surprise, that all athletes had to register where they would be for one hour each day for the next three months, AND BE SURE TO BE THERE, just in case an unannounced dope test was scheduled for then, I thought, WOW, this is just where one of the clients of SOM could really, really help out.

The client in question is FastSMS, the service they provide being SMS messaging. You will no doubt have received text messages from your doctor, your dentist, your garage, etc., reminding you of a future appointment (and if it has not happened yet, it soon will, as long as you have a mobile phone, that is).

SMS Messaging in Action

Thus, when I heard that these poor athletes have to remember to be somewhere (up to three months in advance), I thought, isn’t this a great application for SMS text messaging?

All you would need to do is have a mobile phone number associated with an athlete’s entry in the ‘Where I will be’ dope testing system, plus a little bit of ‘programming magic’ that sends them a text the day before saying where they have to be tomorrow, with another two hours before.
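To give a flavour of how simple that ‘programming magic’ could be, here is a minimal sketch in Python. It only works out *when* the two reminder texts should go out and *what* they should say; the function name, message wording, and the idea of a one-hour testing slot are my own illustrative assumptions, and actually sending the texts would of course be handed off to an SMS gateway such as the one FastSMS provides.

```python
from datetime import datetime, timedelta

def reminder_schedule(slot_start, location):
    """For an athlete's declared testing slot, return the two reminder
    texts as (send_time, message) pairs: one sent a day before the slot,
    one sent two hours before it. (Hypothetical helper, for illustration.)"""
    message = ("Whereabouts reminder: you must be at {loc} "
               "at {when}.".format(loc=location,
                                   when=slot_start.strftime("%H:%M on %d %b %Y")))
    return [
        (slot_start - timedelta(days=1), message),   # day-before reminder
        (slot_start - timedelta(hours=2), message),  # two-hours-before reminder
    ]

# Example: a 07:00 slot on 1 July
for send_at, text in reminder_schedule(datetime(2015, 7, 1, 7, 0), "home address"):
    print(send_at, "->", text)
```

A real system would simply loop over each athlete’s registered slots every day, queue these messages with the SMS provider, and the athlete would never need to remember anything themselves.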

JOB DONE, I reckon, no excuses and less hassle for all.

I have had a word with FastSMS and they are looking into this right now and will soon have a post up on their blog; they are even considering offering this service to the dope testing people at UKAD.

This is a good example of ‘Newsjacking’ in a positive manner, and it should be borne in mind by just about any business there is…

Mo Farah put his hopes of competing at London 2012 at risk by allegedly missing two drugs tests in the buildup to the Games and was warned by his coach Alberto Salazar that “they will hang you if you miss another”.

Under World Anti-Doping rules, a third missed test within the space of 12 months is the equivalent of a failed drugs test – and so would have left Farah, who went on to take 5,000m and 10,000m gold in London, facing a minimum of a two-year ban.

According to the Daily Mail, which has seen an email exchange between the UK Anti-Doping Agency and Farah’s representatives, Farah missed one test in 2010 and another in early 2011, shortly after he had joined Salazar’s training group in Oregon.

See the full article