Google’s New Tweet Section. What Will It Mean For You?

We have seen Tweets in the Google Search results before, so this is not new, but I suspect that the way they are being chosen and integrated is different this time around.

Tweets Showing in Google's SERPs Again
Public domain from pixabay

From a marketing perspective, though, all that matters is that it is happening, and that any business should therefore start thinking a bit more seriously about using social media and Twitter to better effect.

As I have mentioned previously, social media is important for SEO too, being an increasingly important factor in how Google chooses which sites to list. This makes Oscar Wilde’s comment “There is only one thing in life worse than being talked about, and that is not being talked about” even truer, if for different reasons…

The other thing to take from this interesting article is that, by the looks of it, the hashtag (#) could be being used to select which searches show tweets.

Google and Twitter have been teasing us with a new integration of tweets into Google search results for months, but this week they finally made the official announcement. Real-time tweets are now showing in Google search results on mobile devices with desktop integration to come soon.

The integration takes the form of a carousel that appears in search results, which lets you swipe sideways to see various tweets. It only appears on some searches, and it’s unclear how and when exactly Google decides to show them. The examples we’ve seen have been for Twitter profile searches, celebrity searches, and newsy/trendy topic searches.

The placement of the tweets in search results varies. I assume it’s based on how relevant Google feels those tweets are to a particular search. If the search is related to an event, perhaps Google will be more likely to show them toward the top while it’s actually happening. I’m only speculating.

Google isn’t saying much of anything about how it determines what tweets to show or how it shows them. It’s refusing to answer questions about this, and the blog posts from both Twitter and Google on the integration are pretty short and vague. It’s easy to understand why this would be the case. They don’t want people to game the system and abuse the feature.

It’s entirely possible that we’re only seeing the very beginning of what Google will ultimately do with its newfound tweet access. We spoke with Stone Temple Consulting’s Eric Enge about the new integration, and he believes Google will be doing a lot of experimenting and potentially evolving its use of the tweets.

Earlier this year, after Google’s deal with Twitter was announced, we had a conversation with Enge about some studies his company had conducted, including one that analyzed Google’s use of tweets at the time. There were a lot of interesting findings in those, which you can learn more about here. Now that the new integration is live, we wanted to see what Enge thought about it, and if he can see any validations or contradictions to what the study found. Here’s what he told us:

Right now the integration between Google and Twitter is quite light. Currently, it’s only visible from Smartphone devices. In addition, it’s clear that they are experimenting. For example, when you search on a name, such as “Taylor Swift”, you see tweets that she has put out there. Yet, the initial release showed tweets that mentioned her. This is typical of Google, where they experiment with different implementations to see what works best, before settling on one for the longer term. I expect this experimentation to continue.

What this means for visibility in the short term is not much at all. This process is in the very early stages. Think of this as Google proving that they can access, process, and leverage the data from the Twitter firehose. I’d expect more substantial integration sometime soon. The whole process may take months to play out.

What I’d love to see is Google do something involving personalization related to Twitter. For instance, if you share a link in a tweet, and then later search on a related topic, that particular article might rank higher in your search results. I have no way to know if they are getting enough info to implement something like this, but it would be a very cool feature for them to be able to add.

As you know, our two most well-known studies on Twitter evaluated how Google indexes tweets, and how to maximize Twitter engagement. The current integration tests between Google and Twitter don’t really feature anything that would dramatically change the conclusions of either of those studies. I think the real story is yet to come.

Frankly, I expect both studies to change. Twitter indexing could well skyrocket, as our indexing study showed indexation rates for Tweets of just over 7%. Imagine if this jumps to 50% or more. This could be a huge deal!

In addition, the simple act of rendering tweets in search results will create a new form of engagement: whether or not you get displayed. In particular, how timely you are with tweeting out news matters. If you are fast with this, your tweet will get far more attention than ever before.

Overall, I think this initial integration is big news because it’s the start of a process. I can’t wait to see how the rest of the story unfolds!

It does seem like Google may use hashtag searches as an indication of when users might want to see tweets. While not all hashtag searches yield Twitter results, those mainly related to things that are being talked about a lot at the time do.

Under Google’s previous Twitter deal, it had a real-time search feature, which included tweets in addition to content from other services. It would be cool if they could bring content from other sources like Instagram, Facebook, etc. into the carousel, at least for hashtag searches, as hashtags extend well beyond the Twitterverse these days. Either way, Google’s approach seems like all the more reason to include hashtags in tweets for visibility purposes. It is unclear, however, how often people are actually searching Google for hashtags.

Please see the full article at

‘How to’ Sites being targeted by Phantom Update

There are rumours, backed up with some hard facts, that Google has been targeting ‘how to’ and informational sites with its latest ‘Phantom’ update. This leads to the question: what does Google want?

Google's Phantom??
Public Domain from pixabay

What does Google Want?

I could understand it if only sites with ‘thin content’ had been hit, or pages that are there just to catch rankings for one thing and then sell another. BUT, if a page is a genuine one, giving information on, say, how to fix a tap washer, and then goes on to tell the reader that there is a really easy way to turn off the water in their home or office (and thus reduce the risk of an internal flood), how can that be considered wrong? The two are, after all, related topics.

Continued posts and videos from Google state that they want pages to become more and more helpful, with good, useful content, and how much more useful can you get than a ‘how to’ page? After all, a huge percentage of searches on the web are looking for just such help and information.

For my part, as an SEO professional, it looks like a totally silly update, brought about by I don’t know what; I just hope that ‘malice’ against any site trying to make money on the Web (like Google does) is not involved.

It would be really good for someone at Google to give an explanation, rather than denying the whole thing as they seem to be doing.

It really does make you think, what does Google want people to publish when you see things like this happening…

Google “Phantom” Update Rolling Out Targeting Informational, ‘How-To’ Content

NBC reports that an undisclosed, “phantom” algorithm update is rolling out in Google search right now which is primarily impacting publishers of “how-to” style content.

HubPages, a collection of more than 870,000 miniblogs containing informational content about wide varieties of topics, saw its Google search traffic drop 22 percent from one week to another on May 3.

Websites like eHow and WikiHow are other sites that have seen significant drops in traffic over the past couple of weeks.

Glenn Gabe, of New Jersey-based G-Squared Interactive, coined the phrase “Phantom” update because this update came without warning, and Google has yet to acknowledge its existence.

Through analyzing the data, Gabe has determined that this update is not related to Panda or Penguin. However, similar types of “thin” content targeted by Panda are also being targeted by this update.

Gabe points out that clickbait articles, sites with an abundance of supplementary information, pages of stacked videos, and pages that are difficult to navigate have all lost visibility in recent weeks.

This update is said to be “ruthless” in its approach — apparently having no problem punishing entire domains for a few instances of thin content.

“When you have a domain-level algorithm update or ranking change, it can impact the whole site… Pages that should be drawing well could also be pulled down in the results.”

Google has not formally commented on this update, although at SMX Sydney this week Gary Illyes, from Google’s Webmaster Trends team, alluded to there being a recent change that’s part of a core algorithm update.

Please see the full article for the update and for the comments and evidence that it is hitting some sites.

Google’s Penguin and Panda Updates Still Delayed

Anyone involved in the SEO world knows all about Google’s two most feared pets: Penguin, which searches for anything that looks a bit fishy (excuse the pun) in the link profiles of sites, and Panda, whose job it is to check for sites that have poor content or are not well set up (e.g. have the same meta data for all their pages).



The importance of these two animals rests in the manner of their operation. Basically, both are sets of rules (e.g. does a site have more than 70 characters in its Title, or do all, or most, of the links to a site use keywords), and it is the changes to these rules, and their application to the Google engine, that change the rankings.

In practice, that means a site that has good rankings today can lose them all the day after an update to Penguin or Panda. Of course, the opposite is also true, especially if a site had been ‘hit’ by an update and has cleaned up its act. In those cases, the site should regain all (or some) of the rankings it had lost.

It is the latter fact that causes the greatest concern, as the sites that were hit and have taken action have to wait until the update is run again, because until then Google does not check to see if they are now ‘good boys’ and are playing the game according to Google’s rules.

As these updates have not been run for around 7 months now, many site owners are now getting desperate.

Google has said that they were going to run them more frequently, and there was, I thought, some talk about them being built in to run in real time, but as yet there is no sign of the latter.

The biggest message to take away from all of this is to “NOT GET IT WRONG IN THE FIRST PLACE”, as then you won’t get hit by a penalty at all…

Anyway, if you are interested in all of this, this post and video will be of interest, I am sure:

Google: We Are Trying To Update The Data For Panda & Penguin Faster
It has been several months since we had either a Google Penguin or Panda refresh, Google says they are working on updating it faster – but they said that before.

It has been several months since we had either a Penguin or Panda algorithmic refresh from Google, and the natives, aka webmasters, are getting restless. As we covered, the algorithms may be real time, but those hit by one of these algorithmic penalties cannot recover until the underlying data is refreshed, and that data has not been refreshed in a relatively long time.

Google told us prior to the Penguin 3.0 release that they were working on updating the algorithm so it updates more frequently. Now, Google is telling us again, eight months later, that they are still working on making these two algorithms refresh faster.

John Mueller, Google’s webmaster trends analyst, said in a Google+ hangout, at about the 25-minute mark, “that is something we are definitely working on to kind of update that data again to make it a little bit faster,” in regards to having the data refresh more often for the Panda and Penguin algorithms.

Here is the transcript:

We are working on updates there. I don’t have any time frames at the moment but I know the team is working on that. I know it is frustrating, if you’ve worked a lot on your web site already to clean up these issues.

The same applies to Penguin as well. Where maybe you cleaned up a lot of web spam issues. And you are just waiting for things to kind of open up again.

And that is something we are definitely working on to kind of update that data again to make it a little bit faster.

The last official Panda update was Panda 4.1 on September 25, 2014 and the last official Penguin update was Penguin 3.0 on October 18, 2014. Each algorithm did have minor updates within a month or so after those launch dates, but since, there have been no real movements around those algorithms. Webmasters and publishers currently hurt by these algorithms are eager for a data refresh to see if their clean up efforts will resolve their ranking problems in Google.

Goodbye to Webmaster Tools (Well at least the name)…

There is nothing static about the Web these days; even the names of the things that we love change (when what they do stays just the same). This, it appears, is happening to good old ‘Webmaster Tools’, the new name being ‘Google Search Console’.

(Now Google Search Console)


One reason for this is that the term ‘Webmaster’ is really a bit of an anachronism from the days when you needed a webmaster to run your site for you. These days of course, with the advent of WordPress, just about anyone can be their own webmaster, so the change really does make good sense.

What really matters is the data that the tool provides, which has become all the more valuable since Google stopped providing ‘keyword’ data in Analytics. This means that the only way of getting such data is via the ‘Google Search Console’.

Another good set of data that is often overlooked is that of impressions and CTR, as together these give you a good idea of the ‘footprint’ that a site has and how good it is at getting clicked once it is shown. The latter is important because, if a site has a low CTR, it could in the future be shown less frequently and thus lose traffic.

I will be publishing more information on how to use Webmaster Tools, sorry, the ‘Google Search Console’ over the next weeks.

Please read the full article for all the news…


After nearly 10 years of being known as Google Webmaster Tools, the search giant has decided to rebrand one of its more popular web-based services: say hello to Google Search Console. This change comes as a result of user feedback — only a select number of users identify as “webmasters”, and Google wanted to establish a new name for Webmaster Tools that was more inclusive of its entire user base.

The post Google Rebrands Webmaster Tools as Google Search Console by @mattsouthern appeared first on Search Engine Journal.

via: Google Rebrands Webmaster Tools as Google Search Console by @mattsouthern


The Biggest Challenge That Semantic SEO Presents

In my last post on Semantic SEO, I mentioned that the greatest challenges are to understand what should be written about and posted on a site and how you can get the necessary signal generated on social media. Google has developed Semantic SEO because it uses factors that are very hard to game. By doing so it stops the ‘arms race’ between Google and the SEO community. I believe that Google made this decision because they were totally fed up with fighting what was a seemingly unending war.


Why is user interaction at the heart of Semantic SEO?

The very fact that user interaction is hard to game is why it’s at the heart of Semantic SEO; that’s very different to links (which underpinned PageRank, Google’s first rankings methodology), which have for many years been created to artificially boost sites’ rankings. Of course, Google fought back with its Penguin updates, but because links can be so easily created (or bought) in ways that Google can’t always detect, they are no longer given the power they once had.

User interaction is much harder to game because Google finds it much easier to tell whether the person causing the interaction is real or not, and whether their search patterns and comments reveal that they are interested in, or an expert in, a subject.


How do you generate interaction?

This is the key and is also the hardest thing to do as it requires that your content is not only found (so it can be commented on) but also (the IMPORTANT bit) that this content is relevant, interesting, entertaining, challenging, informative or any combination of these.

Clearly we have a bit of a chicken and egg situation here, at least for new sites. That’s because they have to be found before the content can be seen and before interactions can take place. True, sites can be listed in Google SERPs when they are new, as long as the page concerned contains the right sort of words and the site itself has enough links to get it on the radar. Thus to a degree, writing good content may be enough.

However, to speed up the process and start it in the right way, it’s a much better idea to start the ball rolling with links to these ‘useful’ pages and by creating some social media chatter. The latter is more relevant today than ever.


User interaction is stimulated by relevant content

We know that Google wants to see interaction taking place, but what should you (or your copywriter) write about? And what should you bear in mind when writing this content? All your content should do the following:

  • Promote your brand
  • Promote your company values
  • Make you stand out against competitors
  • Create a better understanding of your business
  • Help people understand the value of your products
  • Help people understand why they should buy from you

Your Posts should also do at least one of the following:

  • Answer a specific question
  • Create a challenge
  • Leave the reader feeling enriched
  • Help establish your company’s authority
  • Generate greater equity for your brand


What is a ‘good’ site?

While you are creating content you should consider that what you are trying to achieve is a ‘good’ site, the characteristics of which are as follows:

  • It contains lots of unique content.
  • It is solution-orientated, answering the questions (phrases) that people search with.
  • It contains sticky content that creates a desire to stay on the site or come back for more.
  • It invites interaction for further clarification or additional information.


The types of search phrases to consider writing for

When considering SEO (whether Semantic or not) you have to consider the search terms being used by people, as these show you what people want answers for, or information on. One thing that will help here is to understand that there are three main types of search query:

  • Informational – How do I? – What is?
  • Commercial – the price of, the specs of, the problems with advertising.
  • Transactional – where can I buy… a laptop or a Galaxy smartphone? Special offers; Used RVs.

Bearing in mind your type of site and what you’re trying to achieve, you should pick a type and then write with this in mind. As you do so, remember what a post should aim to do.


What will all this achieve?

The aim of all this hard work is to obtain traffic, whether via social media or Google. This traffic is generated with the promise of great content; I say it’s a promise because no-one can be absolutely sure of its value before they read it.

Even Google can’t be sure that content is good. Of course it can check the words used, the links to and from, and how well the article is structured. But it cannot really tell how good it is. This is where the user and user interaction comes in; Google’s thinking is that anything that’s talked about and commented on must be interesting enough to mention and thus be included in their listings.

So this is where Google lets the visitors rule on whether a site’s content is really good or not; this is demonstrated by the level and manner of interaction with a given page.


What sort of interaction is Google looking for?

In order of importance, Google seems to be looking for the following:


  1. Comments on a blog post (on the website).
  2. Responses to comments on a blog post (on a website).
  3. Comments about a website on social media.
  4. Responding to comments about a site on social media.
  5. Resharing the content of a website and adding a comment to the share.
  6. Resharing without adding any other content.
  7. Following websites.
  8. Liking the content of a site.
  9. Interacting with social posts on a site.


Make It Easy For Your Visitors

All these interactions will only occur if your copy gets seen, but many of them require changes to your site: for instance, if you want people to be able to leave comments, then you must provide a means by which they can do so. Another example is sharing pages: if you want it to be easy for visitors to share a page, then you should provide a Share button.

Social media buttons that link to your Twitter, Facebook, LinkedIn and Google+ accounts need to be provided too. Having them in place makes sharing easier and therefore more likely to succeed.
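As a minimal sketch of what this can look like, the snippet below uses the public share/intent URLs that Twitter, Facebook and LinkedIn provide; the page address and wording are placeholders that you would replace with your own:

```html
<!-- Basic share links; swap the example.com URL and text for your own page -->
<a href="https://twitter.com/intent/tweet?url=https%3A%2F%2Fwww.example.com%2Fpost&amp;text=Worth%20a%20read">
  Share on Twitter
</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fwww.example.com%2Fpost">
  Share on Facebook
</a>
<a href="https://www.linkedin.com/shareArticle?mini=true&amp;url=https%3A%2F%2Fwww.example.com%2Fpost">
  Share on LinkedIn
</a>
```

Each network also offers official button scripts (with share counts and styling), but plain links like these work without any JavaScript at all.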


The Overall Factors That You Need to Be Aware of

SEO has always had a lot of acronyms, abbreviations and buzzwords, the latest of the latter being the ‘Four Vs’: volume, velocity, variety and veracity. In reality, they’re not actually that new; each ‘V’ covers something that had a different name in the past. Nevertheless, they are important.


Volume

This is obvious enough and refers to the number of posts, Tweets or pages that you create. While there may not be a ‘do not exceed’ level, there is definitely a minimum for posts on a site, with one a week being the lowest recommended. You can of course do more than this and Tweet away too. The advantage of the latter is that you don’t have to point to, or mention, anything on your site. Instead, you just need to talk (or more specifically, write) about something relevant.


Velocity

Google has always checked on the number of social media mentions and links to sites. The main reason for this is that it has been looking out for anyone trying to game the system. This is still the case for links, so make sure you never create too many at once. The same applies to creating too many Tweets at once. Of course, others may create chatter about you, but don’t worry: as long as the reason for this increased chatter is valid (e.g. you have a great bit of copy), Google won’t penalise you.

In essence Google is looking for natural patterns, those going up and down and looking, well, ‘natural’…


Variety

This covers more than just the sources of your links or your social media following, both of which should be as varied (and therefore as natural) as possible. This is where you must make sure your links come from lots of different sites and areas (some of which must be relevant, but NOT all) and that they use different anchor (linking) text. The latter is all part of the Old SEO, but remains important.

Variety is also important at the very heart of Semantic SEO. It’s essential that your blogs, pages and social media signal is varied and covers more than just your services and products. This may sound strange, but let’s consider for a moment someone selling pet beds, dog leads and other accessories – but not pet food. If they only talk about their products they are very unlikely (never?) going to appear in a search about keeping your dog happy or your cat healthy. Consequently, they will miss the opportunity to be found by people who are not actually looking for their services or products at the time, but who could well be interested.

Because of this, they should aim to be considered as an expert in the field of pet products (in this example) so they get lots more traffic and exposure. This is called Serendipitous Search and is another core part of Semantic SEO as Google provides suggestions about the sites it thinks might be useful. It’s all very different to the simple keyword matching results that used to be used.

Variety in the type of content is also vital, so remember to include videos, images, infographics and possibly even podcasts when you can.


Veracity

This is another word for authority and trust. It was (and to a degree still will be) gauged by checking the links to a site to identify where they are from, as well as the outbound links from the site to authority sites. Linking to an authority source is important; for instance, if you quote any statistics or mention any research, it is vital to link to the sources, as this makes Google believe the post can be trusted.


Co-citations

Google has always been on the lookout for mentions (with no links used) of a site, brand or product that cause it to think the site must be considered an authority and is therefore worth ranking. As with anything in the SEO world, it’s vital that these co-citations aren’t overdone, but this is an area that you should consider.


Links

We understand that links are less important than they were, but that doesn’t mean this area can be ignored. Instead, the variety and authority of links to a site must be constantly reviewed.

User-generated content

As mentioned in my last post, the ‘blind man’ that used to be Google has metamorphosed into ‘Master Po’ and can see an awful lot. One of the things it can see is who has posted on your blog, or about you on social media – Google can even tell if posts are real or fake! While this stops gaming of the system, it also helps you: when you get a comment from a person who is seen to be an authority in an area, your page gets a boost.


My last two posts on Semantic SEO cover some important changes in the SEO world and what you should do when creating content. The next post will give you some ideas about gaining more social media traction and how to use ‘old fashioned SEO’ to help.

Twitter Rolling Out New Search Page

Social media has become so much a part of our lives today that any changes being made make big news. The latest is by Twitter, whose search page and its functions are being altered; hopefully this is what people have been saying they wanted to see.

The latter is an important point as today businesses are supposed to be letting their customers ‘help them’ to refine their products via the process of C2B.

It is also suspected that Google will soon once again be accessing Twitter’s results in real time, so any changes that may make Twitter better must be taken into account.


Twitter’s new search page puts filtering actions front and center.

The change is more than cosmetic, although the new look is certainly cleaner and makes it more obvious that you are on a search page within the social network. Filtering options — Accounts, Photos, Videos, etc. — previously in the left-rail have been moved to the top, below a bold bar that includes the search term. There’s also a “More options” heading that enables users to further refine searches, drilling down into news, tweets by people they follow, and tweets from people nearby, among other things. From the More options menu, users can also save searches, embed searches by accessing widgets, and call up Advanced search options.

Previously, users could choose between top tweets or all tweets about a search term; now the “all” heading has been retitled “Live,” no doubt to further emphasize Twitter’s real-time pedigree. As new tweets related to a search roll in, users are notified and can click to load the new results.

The default view is Top, which, along with highly engaged-with tweets, also displays accounts related to the search term and groups relevant photos in the results.

In a change from the original experiment, Twitter has added a Related searches module in the left-rail, above the “Who to follow” and trending topics sections.


For the full article please see

Semantic SEO and Google, the (not so) Blind Man

In some of my previous posts, and when discussing SEO with my clients, I’ve often alluded to Google being like a blind man in a department store. I used this analogy as, without some help, both the man and Google could easily get lost and not be sure that they were in the right place.

In the case of the blind man, this would result in him leaving the store without making a purchase (perhaps never to return); in Google’s case it could mean that they will not understand what the site is really all about. This could be catastrophic as far as getting rankings for just about anything is concerned.

Leaving signposts on your web pages


Of course, in a store you have Braille signs, but what is the equivalent on a website? The answer is of course the Meta Title, Description and Header tags of the pages. Using these to inform Google about the content of the pages is a great first step; even though it’s very much part of the ‘Old SEO’ it’s still vital today.
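As a simple illustration (the page and wording here are hypothetical, borrowing the tap-washer example from earlier), these ‘signposts’ sit in the page’s HTML like so:

```html
<head>
  <!-- The Title tag: the headline Google shows in its listings -->
  <title>Fixing a Dripping Tap: Replacing a Tap Washer</title>
  <!-- The Meta Description: the snippet shown under the title -->
  <meta name="description" content="A step-by-step guide to replacing a worn tap washer, including how to turn off your water supply first.">
</head>
<body>
  <!-- Header tags tell Google the topic of the page and its sections -->
  <h1>How to Replace a Tap Washer</h1>
  <h2>Step 1: Turn Off the Water</h2>
</body>
```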

Google ‘the not so blind man’ and old and new SEO

Even with all of its power and the new SEO practices that it’s forcing us all to follow, Google is still like a blind man in that it needs help to ensure that it gets the right end of the proverbial stick. There is however a huge difference between Google of old and the one that is evolving before our eyes.

If you’re of a certain age, you may remember the TV series Kung Fu. In it, David Carradine starred as a Shaolin monk (Kwai Chang Caine) who, through the training he received, became a martial arts expert. However, it’s not David that’s interesting here, but his mentor, Master Po. Po was totally blind, yet he could ‘see’ everything, pointing out the grasshopper at the feet of the young Kwai Chang – something the latter, even with his perfect vision, had missed.

Today, Google is like Master Po: it can’t see everything, but it can see a lot and all that it does see is taken into account when considering what site to rank for what. But it’s vitally important to understand how it is planning (and to some degree already is) to use this enormous amount of data. That’s because this is the big difference between old and new SEO.

Old SEO equals keyword matching

To be fair, old SEO was more than simply matching a keyword phrase to the ‘best’ sites for that term; even the old systems had 200 or so ‘factors’ that were taken into account. But in the end, it was mostly to do with how well the ‘signposts’ you placed on a site (be they in the Titles, Headers or copy, not to mention all those links) matched the keyword phrase; that’s what really counted.

This of course led to gaming of the system. SEO companies would alter the pages of a site to SHOUT the target keywords to Google, and they’d create thousands of links to reinforce the message. Pages without any real merit reached the top of the listings, and Google came out with more and more rules to try to combat the situation. It was a time of new trick after new trick, with each one being found out and the gains it had brought removed. But it worked, and to some degree still does.

The days of Old SEO are numbered

Google, it seems, concluded that it wasn’t going to continue with this ‘arms race’. Instead, it would change the game entirely. In my view, it didn’t do this out of spite; I believe Google just wanted to ensure that it would always be able to pick the best sites for any phrase and never be tricked again.

This was no mean task, but Google has a plan based on the fact that, instead of just matching keywords to sites, they will (try to) look beyond the words to the meaning of the search phrase – in other words, what you or I, as searchers, are really looking for.

This was one of the reasons for the introduction of the Hummingbird update (technically this was more like changing the engine than replacing a part of it, but let’s call it an update for simplicity). In doing so, Google wanted to be better able to understand what people wanted when they used the new Voice Search feature on smartphones. (By the way, according to the experts, the reason for this is that people express things differently when speaking, compared to when they write them down.)

The reason it’s called Semantic SEO

This leads nicely to the reason this whole process is called Semantic SEO. ‘Semantic’ comes from the Greek for ‘meaning’, and since Google is trying to work out the intent behind a search phrase – what it really means – the whole process has come to be called Semantic SEO.

Google does more than just try to work out what the real user intent behind a search phrase is. In order to come up with matches in its database of sites, it must also understand the real meaning of any page. To do this, it must work out what the content is trying to say; that is, how it can help, inform and entertain.

It is thus vital to understand what message you are trying to put across with any content. You can read more advice on this in the next post.

But how does Semantic SEO work?

This is the big question for anyone who wants the best rankings possible for any relevant search phrases. But it’s here that we hit the first real change. You see, even though keywords still have their importance, they’re not the be-all and end-all that they used to be. That’s because Google no longer relies on simple keyword matching.

So, if Google isn’t using the words on pages to decide what it should list, what is it using? This is where it gets tricky to explain. Basically, Google will look at the information – the real meaning of a page and of the site it is part of – and the purpose behind its creation. It will also look at what others say about it (and on it, in the case of comments) before deciding whether this matches the meaning behind the search phrase.

Being found when you’re not even being searched for

This is what Serendipitous Search is all about. It’s another huge change from the old SEO, because Google is now more of an ‘answer engine’, suggesting sites it thinks might be useful – even when they don’t include the keywords being searched for.

The more your site answers the questions and needs of your potential customers, the more traffic Google will give you.


Semantic SEO and the feedback loop


This is another very interesting (and potentially scary) thing about the new Google. It doesn’t just look at the words on pages, their meaning, links to and from a page, and social media comments (and who made them): Google also looks at the data it has gleaned from the billions of searches it handles every day, and sees how each one went.

This means that every time a site is listed, Google can tell how popular it was from its CTR (click-through rate). Google has used this methodology for years with pay-per-click (AdWords): adverts with the best CTR are charged less than those with a low CTR. With organic listings there is of course no payment, but if a site’s Title and Description don’t get people to click the link, Google will eventually notice and simply stop listing that page for that term. You can imagine that, if this happens too often, a whole site could just disappear from the rankings. So beware, and do check the CTR in your Webmaster Tools.
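As an illustration of the kind of CTR check worth running on your own Webmaster Tools export, here is a minimal sketch. The page data and the 1% ‘rewrite it’ threshold are invented for the example – substitute your own figures and judgement:

```python
# Sketch: flag pages whose search CTR looks suspiciously low.
# The data and the 1% threshold are hypothetical examples, not
# figures from Google - use your own Webmaster Tools export.

def ctr(clicks, impressions):
    """Click-through rate as a percentage."""
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

pages = [
    {"url": "/services", "impressions": 12000, "clicks": 480},
    {"url": "/old-news", "impressions": 9000, "clicks": 45},
]

for page in pages:
    rate = ctr(page["clicks"], page["impressions"])
    if rate < 1.0:  # hypothetical 'rewrite the Title/Description' threshold
        print(f"{page['url']}: CTR {rate:.2f}% - rewrite Title/Description")
```

A page that keeps failing a check like this is a page whose listing Google may eventually stop showing, so it is worth catching early.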

There’s more too. You see, a site could well have a really great Title and wonderful descriptive text causing all who see it to click through. You might think that’s good news, but if the site doesn’t live up to the visitor’s expectations and they click back to Google to try again, Google will notice this – and conclude that, for that term at least, the site doesn’t deliver the goods. As with poor CTR, this could eventually lead to the site not being listed at all.

Google will also use the feedback process to ‘learn’ what people want to see in the first place, which helps it understand what the meaning of the search was really likely to be about. This allows Google to make its best guess about what sites it should list for any term, and then just sit back and wait to see how people react. If they click on a site and don’t bounce, then they’ve got it right. But if they bounce they haven’t, so Google ‘learns’ with every decision searchers make. What’s more, it will never forget and will keep updating its knowledge all the time. Spooky, eh?


The above process is made even more powerful by the fact that, just as Google can deduce what a page or a site is about (and therefore what answers and information it gives), when it really does satisfy a user it can then deduce the original intent. This is yet another part of the great feedback loop.

Semantic SEO and gaming the system

As we’ve seen, it’s the copy – how well the message and meaning of a site are put across to Google and to any visitor – that really counts in the end: the former to get a listing in the first place; the latter, in effect, to keep it.

There is, of course, more to convincing Google than the copy, though I think this will take the lion’s share. Inbound and outbound linking, the social media signal and the level of interaction (including sharing) are also major factors.

Although it may be possible to game the system by creating a bigger social signal than the site really deserves, the experts’ view is that this will be more and more difficult, with Google looking at each person who comments or Likes, then deciding if they’re real or not. If they are one of the millions of fake profiles set up in the past, they will count for nothing, and may even damage a site.

Thus under the intense scrutiny of Google, it may be as hard and unproductive to create huge amounts of social signal as the process of creating thousands of worthless links…

This doesn’t mean that a small quantity of such links and signal are useless. Both can ‘prime the pump’ a little so the real power of the site is allowed to shine through. If this is the case, a small level of gaming (or old-fashioned SEO work) still looks as if it will be worthwhile.

However, if the page or site in question doesn’t really deserve a high ranking, it will eventually be denied one when people tell Google it’s no good via low CTRs and high bounce rates. The whole process therefore depends on having a site that answers visitors’ needs. And that means high-quality, useful content delivered via words, pictures and video.

The new Semantic SEO

So what will the new SEO process look like? In my view it will still start with the keyword phrase. After all, this is the start of the process and can’t be ignored. The next stage is to work out which words are likely to be used by someone who intends to interact with your site in the way you’d want. This could be to buy something, or simply to understand that you could help them with their problem or needs.

Once you’ve decided on these words, you can reverse engineer the Google results to see what sorts of words it likes to see.

Combine this data with the questions that are being asked, and the problems that your site solves, and you have the recipe for a perfect page that answers people’s needs and uses the words Google expects to see. Interestingly, the latter neatly covers the area of LSI (Latent Semantic Indexing) – without all the effort.
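A crude way of ‘reverse engineering’ that vocabulary is simply to count the words used in the visible copy of a few pages that already rank well for your target phrase. A minimal sketch – the sample copy and the stopword list below are invented for the demonstration:

```python
# Sketch: count the vocabulary of top-ranking pages' copy.
# Paste the visible text of a few high-ranking pages into
# copy_samples; the text and stopword list here are invented.
from collections import Counter
import re

copy_samples = [
    "Our oak furniture is hand made from sustainable oak.",
    "Solid oak tables, hand made to order by our furniture makers.",
]

stopwords = {"our", "is", "from", "to", "by", "the", "and", "a"}

words = []
for text in copy_samples:
    words += [w for w in re.findall(r"[a-z]+", text.lower())
              if w not in stopwords]

# The most common terms hint at the words Google 'expects to see'.
print(Counter(words).most_common(3))
```

Nothing here is clever – it is a frequency count, not true semantic analysis – but it gives you a concrete list of the terms the top performers have in common.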

Once this page is created, and you’ve placed all the standard ‘blind man signposts’ on it, you can proceed to getting it noticed via old-fashioned links and social media.

As you can see, the above includes some old SEO practices, this being for the simple reason that they’re still as relevant and required as they were several years ago.

The biggest change and the greatest challenges are to understand what you should write about and post on a site, and how you can generate the necessary signal on Social Media. I’ll cover this in my next post.

Beware: Web Design Can Directly Affect Semantic SEO

There is always something of a ‘battle’ between SEOs and designers, many of whom simply don’t bother with SEO at all. To be fair, this is not their job, and unless the customer has made it clear that they want to be found on the search engines, why should the designer take this area into account?


Web Design Can Affect SEO
Public Domain from pixabay


Of course, in my view, any person investing in a website would be classed as slightly mad if they did not want to be found on the Engines, but it does happen.

The issues for a site that wants to be found centre more on the content and ‘meaning’ of the site (Semantic SEO) than on the ‘mechanics’ of the page (also called ‘technical SEO’ – a part of the old SEO that is still valid and needed today). But design does still matter, especially if it increases the bounce rate: Google WILL notice this, and that could mean big problems with ranking.

Interesting article though, so please read on…

SEO is a process that requires ongoing education and learning. And while much of the focus is on building quality inbound links, identifying and optimizing for the right keywords and semantic search terms, and investing in quality content, you can’t ignore the obvious impact of web design on your site’s search rankings.

The Dilemma: Unique vs. Searchable

When designing or redesigning a website, most companies come face to face with a pretty significant dilemma. On the one hand, you want your site to be unique and engaging. On the other, it needs to be easily searchable by the major search engines in order to attract the right traffic. This is the heart of the SEO-web design relationship and something that you need to understand in order to help your website succeed in 2015 and beyond.

How Popular Web Design Styles Affect SEO

In order to speak to the masses, let’s start by analyzing a few of the most popular web design trends and how they impact SEO.

· Parallax design. One of the more popular web design styles this year is parallax design. This trend is defined by building an entire website on a single page. It usually has a very large background image with clean, crisp menus that drop down or appear when the user hovers over a designated area. While it’s visually appealing, Google and other search engines find it difficult to home in on specific meanings or themes. Furthermore, your site naturally has fewer pages that can rank – diminishing your potential reach. If you’re only trying to rank for a single search term, parallax design may be okay. However, if you have a lot of content and various products and services, you should probably pursue a different web design.

· Infinite scrolling sites. As you may assume, parallax design typically means longer load times. If you like the idea of parallax but don’t want to take a negative hit for longer page load times, you may consider incorporating infinite scrolling. This is the type of design sites like Twitter and Facebook use: it allows content to load as the user scrolls. Google seems to like scrolling sites and typically prefers them to standard parallax pages. You can see some good examples of infinite scroll by checking out these award-winning websites.

· Graphic-heavy. Because of the success of infographics and visual marketing content, many brands are attempting to develop graphic-heavy websites that essentially look like large infographics. While they may be visually appealing, you have to remember that Google and other search engines can’t read images (outside of alt-tags and accompanying text).
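For a graphic-heavy site, the one lifeline you can throw a search engine is the alt text on each image. A minimal sketch of an automated check using only Python’s standard library – the sample markup is invented:

```python
# Sketch: scan a page's HTML for images missing alt text - the
# main thing that lets a search engine 'read' a graphic.
# The sample markup below is invented for the demonstration.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

html = '<img src="infographic.png"><img src="logo.png" alt="Acme logo">'
checker = AltChecker()
checker.feed(html)
print("Images without alt text:", checker.missing)
```

Run something like this over your templates before launch and the ‘Google can’t read images’ problem is at least partly mitigated.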

· Responsive design. That leads us to responsive design – the ideal web design trend for SEO purposes. As you likely know, responsive design allows a web page to be viewed on any device, regardless of screen size. In terms of SEO, responsive design is valuable because it doesn’t require you to create a separate website for each device and helps maintain a consistent user experience (which lowers bounce rates and increases average time on site).

4 Things to Keep in Mind

(see the full post for info on this bit!)

via: Proceed With Caution: Web Design Can Directly Affect SEO


Google is Listening and Learning

Came across this post and just had to include it on the SOM site, as there is so much sense spoken both by Rouge and in the original post.

For me, the bit about Google listening and learning was the most important part, as I feel that not many people (inside or outside the SEO community) really understand the power of Google – not just its computing power, but also its ability to understand user behaviour and thus ‘give the searcher what they want’.

All of this means that the best SEO in the world will do nothing for any site if it delivers the wrong message. In short, the only way of ‘gaming the system’ now is to provide real worth to visitors on your web pages.

SEO - The Old and the NEW
Public Domain from pixabay

C2B Marketing = Listening to your Customers

This article is very interesting because it is all about businesses actually listening to their customers, with the idea of letting them shape future products. Whilst this cannot be said to be new (marketing has always been about finding out what people want and then giving it to them), it is fascinating to see that BIG businesses are really taking the idea on board.

Of course, this idea of giving people what they want goes further than just the products, it also affects the branding and the website, both of which are important areas to Rouge Media.

It is perhaps more interesting to website designers than to many other professions, especially those taking on board the concept of Semantic SEO. The connections are many, but perhaps the strongest is that Semantic SEO is all about understanding what your message, your brand and your products really are – in effect, you get to understand their ‘meaning’ in a way rarely reached before.

Understanding this meaning is vital, of course, but – and this is the tie-in – you therefore have to understand what your customers want and are expecting. If you get this wrong, your site will suffer from high bounce rates (people leaving as soon as they arrive) and/or low average page views and time spent on site.

Google is Listening and LEARNING

But there is more even than this, as perhaps the biggest company using C2B will be Google. This is because Google takes account of every click to a site and whether the person stays on that site or leaves. If too many bounce back to Google, then eventually Google will downgrade that site (at least for that search term).