Press Releases – How Can They Be Used for SEO etc

I have just returned from a workshop given by a PR company called PR2Go. It was their first and was all about what PR is and how it can be used to enhance a business's reputation.

Of course, to those versed in all this, it is easy to see why you should use PR, but to the rest of us it is a bit more difficult, at least in some cases. For my part, I can see that PR is a valuable SEO tool, as it allows me to create a press release and then submit it to one (or many) of the online press release sites that abound on the web.

If the release is half decent it will get accepted and, once accepted, will be there when Mr Google’s spiders come a-calling (something they do often with these sites). The result: a link back to the site, which is all to the good.

The presence of this release is good for another reason too, as it raises the profile of the business on the web when someone is looking for sites that ‘talk’ about it. Reviews and comments are often searched for when a customer is deciding whether they really want to be a customer or not…

However, the real reason for a Press Release must be to get into the press, either digitally or in print (or better still, both). Whilst getting some digital space is relatively easy, getting some space in print, it seems, is much harder.

So what are the things to bear in mind when trying to get a release printed?

This is a difficult question to answer, but I have some tips for you, from the experts at PR2Go:-

However, before I do that, the most IMPORTANT piece of information that I picked up at the workshop was this piece of gold:-

“The story that you want to tell will, more often than not, NOT be the story the journalist wants to tell…”

There is a basic reason for this and that is that most of the stories that we want to tell about our business are ‘yawn’, a bit boring to the journalists of this world.

This in turn means that you have to have an ‘angle’: something that, whilst still being about what you want it to be about (your business), is also about something that makes interesting news.

This can be quite difficult, but with some thought, and a little practice and guidance from people like those at PR2Go, it is achievable.

Tips to make it more likely for your PR to be accepted:-

Make it (perm any number from these 6):-

  1. About People
  2. Interesting
  3. Educational
  4. Informative
  5. Entertaining
  6. Tell them something NEW

Then, bearing in mind that your story might not be the one that anyone wants to tell, think of the ‘Hook’: the thing that makes something mildly interesting very interesting. An example of this is a shop refit. If the shop is in an old building, then you can use the ‘shop refit breathes new life into historic building’ angle. And if the story is about taking on new staff, and someone being employed has been out of work for a long time or has a disability, consider using that as a ‘good news’ angle.

So, if you want some PR and don’t really fancy sitting in a bath of beans for charity, have a chat with a PR expert or two and let them see what they can come up with. After all, ‘There’s a Story in there somewhere’, you just have to find it…

Of Old Dogs, New Tricks and SEO

It has long been said that you can’t teach an old dog new tricks, the saying, it seems, coming from the Bronze Age (see this excerpt from http://www.stuckon.co.uk/mythbusters-gives-hope-for-us-all-in-seo-3861.html):-

“One of the most hackneyed sayings in the English language is ‘You can’t teach an old dog new tricks.‘ It was crafted in the early Bronze Age to keep elders in their place, and it has tarnished our perceptions of what aged things are capable of ever since. Well, the US science experiment show Mythbusters has proved that you can indeed teach old dogs new tricks, and Google’s search results continue to show that old SEO can still rank.”

They go on to say:-

“The two hosts of Mythbusters, Jamie Hyneman and Adam Savage, have been testing cultural and Hollywood myths since 2003. In 2007, the intrepid pair decided to put the old saying to the test, taking on a pair of older canines from an animal shelter and each trying to teach their dog a series of tricks. Needless to say, the saying was thoroughly disproved.

The ranking test of old pages can be disproved with far less effort. Simply type a few keywords into Google. For an astonishing number of keywords, Google promotes surprisingly aged websites. For example, a search on ‘online marketing tips’ produces a first result from 2009 and a second from 2006. Unlike a search for Cheshire cheese, one would assume that fresh pages are constantly competing for these spots.”

For me this all makes perfect sense, pages holding onto their places either because:-

  1. There has been no real competition for the spot.
    or
  2. They were so good, or were on such a powerful site, that they could not be easily moved.

However, it is probably more accurate to say that ‘old dogs don’t forget their tricks’ here, as it is the fact that these pages (dogs) are still so good at doing their trick (getting a top ranking) that allows them to still be there.

The experiment with the pages is interesting in another way in that it does show pages can be knocked off the top spots where there is competition, the question here being “Are they better or is it just that they are newer?”

Happy SEOing

Hidden Text – When is it OK to use?

With the potential rewards for a top ranking for an important keyword being so high, it is not surprising that many site owners and SEO companies sometimes push the envelope a little.

When they do this they of course risk the wrath of Google (other Search Engines are available) and many have paid the ultimate price and been removed from the rankings altogether. Having seen the light (that hot one that the Google Engineers shine at you), many of these sites recant their sins, put it all right and hence regain some listings, the ones you could say they ‘deserve’.

I suppose you could say it is all a bit like drugs testing in the sports world, the continuing checks for anything ‘underhand’ keeping the Search Engine Results ‘playing field’ level, or at least as level as it can be.

One of the methods used in the early days (when the Internet was just knee high) was to use Hidden Text. This was soon spotted and stamped out (Google had big feet even then) but it is now back, and is being used legally by many sites. It could help your site too, so read on.

First, what is ‘Hidden Text’?

“Hidden text is a generally obsolete form of Black Hat SEO in which pages are filled with a large amount of text that is the same color as the background, rendering keywords invisible to the human eye but detectable to a search engine Crawler. Multiple Title Tags or HTML comments are alternative hidden text techniques. Hidden text is easily detectable by search engines and will result in Blacklisting or reduced Rank.”

As quoted in http://www.submitexpress.com/seodictionary.html#hiddentext

This is a very useful definition and, with the addition of ‘don’t use hidden text in <noscript> tags either’ (but do read on to see where you can/should do this), just about covers the field.

But you said it was being used legally – How?

As ever in the perverse world of the web, there are ways to use hidden text on sites, but only under certain circumstances.

Here’s what Google have to say on the matter:-

http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=66353

“Hiding text or links in your content can cause your site to be perceived as untrustworthy since it presents information to search engines differently than to visitors. Text (such as excessive keywords) can be hidden in several ways, including:-

  • Using white text on a white background
  • Including text behind an image
  • Using CSS to hide text
  • Setting the font size to 0

Hidden links are links that are intended to be crawled by Googlebot, but are unreadable to humans because:-

  • The link consists of hidden text (for example, the text color and background color are identical).
  • CSS has been used to make tiny hyperlinks, as little as one pixel high.
  • The link is hidden in a small character – for example, a hyphen in the middle of a paragraph.

If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages. When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site. Are any text or links there solely for search engines rather than visitors?”

The things not to do are quite clear here: don’t use colour to hide text and don’t use tiny link text. But what they don’t say you cannot do is use hidden text that is made visible by USER ACTION.
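To make the ‘user action’ idea concrete, here is a minimal sketch in plain Javascript (the function and element names are my own, purely for illustration): the text is present in the page’s HTML throughout, but it is only displayed once the visitor clicks something.

```javascript
// Minimal "reveal on user action" sketch. The hidden block starts
// with its display set to 'none', so the text is in the HTML for
// the crawler to read, but the reader only sees it after clicking.
function toggleVisibility(el) {
  el.style.display = (el.style.display === 'none') ? 'block' : 'none';
  return el.style.display;
}

// In a browser you would wire it to a click, for example:
//   document.getElementById('read-more').onclick = function () {
//     toggleVisibility(document.getElementById('more-text'));
//   };
```

Because the reveal depends on an obvious visitor action, this is the legitimate pattern described above, not the colour-matching trick.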

The penultimate sentence is also worth highlighting:-

“When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site”

The reason for its importance is that this is all about ACCESSIBILITY for readers who have to use screen readers and the like. What they are leading to here is that your site should be readable by all, and they have some tips for that too.

“Some tips on making your site accessible include:-

  • Images: Use the alt attribute to provide descriptive text. In addition, we recommend using a human-readable caption and descriptive text around the image.
  • Javascript: Place the same content from the Javascript in a noscript tag. If you use this method, ensure the contents are exactly the same as what is contained in the Javascript, and that this content is shown to visitors who do not have Javascript enabled in their browser.
  • Videos: Include descriptive text about the video in HTML. You might also consider providing transcripts.”

This is bad news for those in the SEO world who push the envelope with ALT tags, as it says that you should put meaningful descriptions in and not just keyword-stuff them (yes, it is still done by some), but GOOD news for those who want to use the power of Video to the fullest (as it gives them a reason / excuse for a transcript).

But it is the middle one that is really interesting, as it says that if you are using Javascript to hide words then you should also use them in the <noscript> tag so that a screen reader can see them. This is interesting as it says that hiding words with Javascript is OK; it just does not say when.

I have known that it is OK for a good while now, this information having been gleaned from various forums. See some examples below:-

http://groups.google.com/group/google_webmaster_help-tools/browse_thread/thread/2967a0d659cb8cf7/26b959936836e91b?pli=1

“If a user action (mouseover, click, etc) can cause the text to be unhidden then Google doesn’t have a problem with it. Not unlike the ‘more’ link on Google’s homepage or any other flyout menu application.”

And

http://www.webmasterworld.com/google/4073407.htm

“I’ve been using this kind of legitimately hidden text for years. In fact, every site that has a “hover menu” is using it. There’s definitely no problem, as long as an obvious user action reveals the hidden content.

How Google does the analysis technically, I’m not sure. But I’ve never even seen one false-positive penalty from this kind of thing. One thing we do know is that the human editorial evaluators – and there are thousands of them – are trained extensively in detecting deceptive ways of hiding text. So my guess is that Google relies on human input before handing out a hidden text penalty.”

But I must admit to not being able to find a Google announcement on the matter (someone will no doubt enlighten me here); suffice to say that the use of Hidden Text, if viewable by USER ACTION (and not therefore there just for the Search Engines), is OK.

The Advantages of Hidden Text

There are two main advantages to hidden text, one is obviously the SEO one as it allows you to legally place lots of text on the page, but perhaps the real winners are the designers and readers. The former wins as they don’t have to have a terrible ‘wordy’ site that looks awful and the reader wins as they can decide how much information they want to see and are not confronted by a wall of text.

The use of hidden text can therefore be a real winner, as it should, all things being equal, increase conversion rates for the site too.

But what does this hidden text look like in practice?

There are two main types here, one using ‘tabs’ or links that cause the page to scroll down (extra text being inserted in the page) and another that basically shows an entire new page of information (this being very useful for Flash sites – a post on this coming later).

[Image: hidden text – what it looks like]
As all the text for each tab is present in the HTML of the site, it is read by the Engines, and as it is made visible by user action AND is not there just for the Search Engines, it is OK.

So there you have it, HIDDEN text that is OK and that does help (not hinder or destroy) to gain higher (deserved) rankings.

More on the use of this methodology and Flash sites later.

Google’s Latest Ranking Signal…

As everyone knows, Google loves to ‘mix it’ and keep everyone (especially the SEO community) guessing. The details of the rules they use to decide which site comes first are of course ‘secret’, but the basics are known to many. One of them, especially for Local Search, is the number and type of reviews (see Hotpot), but they are now looking at a new way of picking out those sites on the web that ‘don’t belong there’ (their army of researchers not being enough, it seems).

Their latest weapon is a ‘Blocking Extension’ to their Chrome browser, whereby any user can block a site. The important issue here is that the act of blocking tells Google that you don’t like the site, and this of course can then be ‘looked into’ by their engineers….

I also presume that this will remove these sites from your personalised search results, but I have not tested this.

I am not sure what effect this will have on Google’s results, as the number of Chrome users is limited and not everyone will download the extension, but it is an interesting and perhaps worrying event: what happens if it is used in an unscrupulous manner to ‘devalue’ a competitor’s site?

As ever it is another interesting change…

HotPot is Coming !

Local and Mobile Search are VERY big in the USA, and I expect that they will make a big impact in the UK too soon.

One interesting and powerful aspect of Local Search (Google Places to you and me) is that of customer reviews, and here Google have come up with their very own ‘engine’ called Hotpot.

See what the news outlets are saying about Hotpot in the USA.

Plus Google’s own Hotpot blog makes an interesting read.

Content is King, But is CMS a King Killer?

Everyone understands that ‘Content is King’ on the web today. All the talk is about how the Engines only want to rate websites which contain interesting information, and this of course means, in part at least, the ‘words on the pages’. This information should also be seen to be changing on a regular basis.

A Vote for the CMS Then?

This would seem to be a vote for making it easy for website owners to add and amend their own content. This has of course been made easier by the huge growth in content management systems, these being available even on the cheapest of website hosting packages. Some of course are better than others and some are easier to use, however, there are real issues when it comes to the area of Search Engine Optimisation.

SEO Issues – Speed of Loading

There may be many more than the two I am going to cover below, but these are the two most important in my view. The first concerns the structure of the site itself, the way in which the pages are put together in HTML terms. Not all Content Management Systems (CMS) are bad of course, but if the pages have to make lots of ‘calls’ to the web server to ‘collect’ parts of the page, like CSS files or files containing Javascript code, then the page will load slowly. This could well cause the site to be ‘downgraded’ by Google (this engine penalises websites which load slowly, as it believes that they lessen the experience of web users) and that in turn can affect its chances of getting a good ranking.

Keyword Density

However, the biggest issue is that website owners can make any change they want to their site’s text. Whilst this sounds like a good idea, making such changes can play havoc with a carefully optimised site. Areas like keyword density can be thrown out of the desired range (2% to 4%), whilst the all-important heading text can be changed so much that the pages of the site no longer SHOUT the words they need to. If this happens, a page that had been enjoying a good ranking before it was changed by the owner could well drop down the rankings, and even off them. If rankings are important (not all site owners are bothered) then you can see that allowing users the ability to change their own content might not be as good an idea as it first seems.
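As a rough sketch of how you might keep an eye on this (the functions are my own illustration; the 2% to 4% band is the range mentioned above, and real SEO tools count phrases far more cleverly than this):

```javascript
// Rough keyword density check: how often a single keyword appears,
// as a percentage of all the words on the page.
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  if (words.length === 0) return 0;
  const hits = words.filter(w => w === keyword.toLowerCase()).length;
  return (hits / words.length) * 100;
}

// Is the density inside the desired 2% to 4% band?
function inDesiredRange(density) {
  return density >= 2 && density <= 4;
}
```

Run it over a page’s text before and after an owner’s edits and you can at least spot when the density has drifted out of the band.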

Headers SHOUT Keywords – Do you really want to Confuse the Listening Engines?

Problems are made even worse when the CMS has been set up to allow users to add headings whenever they want. Often these headings have been set up to use one of the old fashioned ‘header tags’, H1, H2 etc, which means that often the unknowing user ends up with a page full of header tags (which just ends up confusing things). Things can get even crazier when these headings end up in the wrong order, H1’s after H2’s and H3’s instead of before them, as this really makes things messy!
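A quick sketch of the kind of sanity check meant here (my own illustration): given the heading levels in the order they appear on a page, flag multiple H1s, a page that does not open with an H1, and headings that skip a level.

```javascript
// Heading sanity check sketch: takes heading levels in page order
// (e.g. [1, 2, 2, 3]) and reports the problems described above.
function headingProblems(levels) {
  const problems = [];
  if (levels.filter(l => l === 1).length > 1) {
    problems.push('more than one H1');
  }
  if (levels.length && levels[0] !== 1) {
    problems.push('page does not start with an H1');
  }
  for (let i = 1; i < levels.length; i++) {
    // An H3 straight after an H1, say, skips the H2 level entirely.
    if (levels[i] > levels[i - 1] + 1) {
      problems.push('H' + levels[i] + ' follows H' + levels[i - 1] + ', skipping a level');
    }
  }
  return problems;
}
```

A tidy page like [1, 2, 3, 2] comes back clean, while a muddled one collects a list of complaints.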

The CMS & SEO Solution

Of course there is a way around the problem, first the CMS needs to be built in the right way, that is to minimise complexity and thus page load times. Secondly and more importantly the website owner needs to be made fully aware of how to use the CMS in the first place. This will allow them to make the changes they need whilst maintaining the ‘Optimisation’ of the pages.

So all you Web Designer chaps and lasses out there, please have a think about how you can help your customers, they’ll love you for it in the end…

May the Fourth Be With You..

Sorry for the Star Wars pun (yes, another one on the bandwagon), but it is relevant to Search Marketing, as the Fourth Factor in SEO is becoming something that you cannot afford to ignore.

But hey, you say, the Fourth? What are the other three then?

Traditionally Search Engine Optimisation was about two areas, On Page optimisation and Off Page optimisation. The first dealt with the words on the pages and how they ‘talked’ to the Search Engines, whilst the second was all about the links to a site.

Then a third started being bandied about. I’m not totally certain just how valid this is, but logically it sounds like it could well be a factor in Google’s algorithm (you know, the one that they won’t publish) and that makes it one to bear in mind, at least in my opinion.

Activity – The Third Area of SEO

Basically, the third area is about ‘matching’ the visitor profile (numbers) with the linking profile (number of links), the argument being that if you have a site with lots of links and no visitors then it must (a) be pretty useless, or (b) have had all its links constructed (i.e. planted by an SEO company – shame on them…).

Either way, it is said that Google will consider the site to be of little interest and it will therefore ‘file’ it in the basement, which of course means it rarely, if ever, gets ranked on any results page that is worth having.

So, bear this in mind when doing your SEO. By the way, the best method of getting around this issue is to buy some AdWords clicks; that way you are sure to get some traffic and be placed on the Google radar for future reference. Another way, one that overlaps with the ‘Fourth’, is to start Tweeting and use the Google URL shortening tool in the Tweets. If you make the Tweet interesting enough you should get a click, which, as it uses the shortened URL, alerts Google to the visit to your site – job done!

But how about the Fourth One then?

The fourth issue is of course SOCIAL MEDIA. By now, anyone following the latest trends in Search Marketing will have picked up on the link between Twitter, Facebook and SERPs (Search Engine Results Pages).

I am pleased to say that I spotted the link some months before Google and Bing came clean about it all, and am now making very sure that this angle is totally covered, but it is still a new game and one that will take some time to understand in full.

So, if you are interested in getting the very best rankings you can, check out the Google announcement of 2010 and start making sure that you are using the FOURTH (area of SEO) to full effect.

Yoda Rules OK..

Activity – The Third Area of SEO

Search Engine Optimisation has traditionally been about two things, On Page optimisation and Off Page optimisation. The first dealt with the words on the pages and how they ‘talked’ to the Search Engines, whilst the second was all about the links to a site. According to some there is now a third area, at least as far as Google is concerned.

Activity – The Third SEO Area

This latest area concerns the issue of ‘activity’. This activity itself covers two areas, that of activity in the matter of visits to a site and the activity on a site, that is the level of changes happening on that site.

Don’t Let Your Site Go Stagnant

To be fair, the latter point has been one of the rules Google have been using for some time, as they have never liked stagnant sites that never seem to change. However, it is said that this whole area of activity now has a greater impact on the rankings for a site, so it is a matter that cannot be ignored.

At SOM we have been taking the whole matter very seriously for some months now, using both on page (e.g. the addition of news pages to sites) and off page techniques (e.g. using Twitter to generate visits, all the time ensuring that Google are aware of the visits by using the Google URL shortening tool).

Links Are Still Important

It must be said that these methodologies look to be working very well, so if you are serious about getting (and keeping) your Google rankings I would give serious consideration to the matter of generating measurable activity for / on your site. That is not to say that you can afford to take your eye off the other two areas, this particularly being the case with links. Here you must ensure that your site has links from a wide variety of site types and IP addresses, and that these links are pointed at pages other than the home page, these being called ‘deep links’ (aim for a deep link ratio of 50% – 80%).
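The deep link ratio mentioned above can be pictured like this (a sketch with made-up URLs; a real link audit would normalise URLs far more carefully):

```javascript
// Deep link ratio sketch: the percentage of inbound links that
// point somewhere other than the home page.
function deepLinkRatio(linkUrls, homeUrl) {
  if (linkUrls.length === 0) return 0;
  // Ignore trailing slashes and case when comparing against home.
  const norm = u => u.replace(/\/+$/, '').toLowerCase();
  const deep = linkUrls.filter(u => norm(u) !== norm(homeUrl)).length;
  return (deep / linkUrls.length) * 100;
}
```

With three of four inbound links pointing at inner pages, the ratio comes out at 75%, comfortably inside the 50% – 80% target.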

Make Your Anchor Text Count

There is another issue with links too, this being all to do with the so-called anchor text (the bit you click on). Make sure that the words here say what your site is all about and use the keywords you want to be listed for; this really makes a difference.

So go and check your on page and off page optimisation, but please too, also consider this new area of activity and of course, do not forget the fourth area, Social Media Activity!

Content Management Systems and SEO

I know this is a bit of a hobby horse of mine, but I do get very frustrated by many CMS systems, at least when it comes to carrying out Search Engine Optimisation for a site. There are many reasons for this; sometimes just a few things are ‘wrong’, in other cases the list is pretty endless.

The reason for these shortcomings is of course easy to explain, in that the CMS has been built so that it is easy to use and often also to a budget. Some open source CMS systems like Joomla have the advantage of lots of ‘plug ins’ / extensions that allow an optimiser to add the facilities that they need, but others do not and this leads to serious reductions in the number of SEO methodologies that can be used.

What is needed here is for the person commissioning the website to be fully aware of their needs, i.e. is SEO an issue or is it just a brochure site? Here the website developer needs to ask some questions and to point out what they are going to do SEO-wise and what the CMS that they are providing can and can’t do. If they have any doubt, they should, at least in my view, either not sell to that customer or use a CMS that will ‘fit the bill’, so to speak.

It must be said that many CMS systems do have work arounds for most SEO issues, but none are as flexible as bespoke systems. Just a few issues that many CMS systems have:-

  1. The web page ‘on page title’ or the Meta Title is associated with the Menu, so you cannot change these important items without ruining the Menu
  2. The Menu does not allow the full use of the No Follow tag (used for Page Rank Sculpting)
  3. Only parts of a page are accessible via the CMS
  4. You cannot add extra javascript commands
  5. You cannot alter the CSS

These are as I say just 5, there are many more.

So if you are about to purchase a new site using a CMS, please ask an expert on SEO to check that it will ‘do the business’ BEFORE you sign up for the deal.

Google Analytics & Searchandising

I am often asked questions about Google Analytics, usually because there is just so much data available that it simply ‘blows the mind’.

The trick here is to use Filters to place one set of data for say PPC visitors and another set for Organic visitors into different ‘pots’. You still keep the ‘big pot’ with all the data in it too, but having these subsets of data sure makes it easier to understand.
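The idea of the ‘pots’ can be sketched like this (the visit records and ‘medium’ values below are purely illustrative, not Analytics’ actual export format):

```javascript
// Illustration of the 'pots' idea: split visit records into PPC,
// organic and everything else, much as Analytics filters separate
// one traffic source from another.
function splitIntoPots(visits) {
  const pots = { ppc: [], organic: [], other: [] };
  for (const visit of visits) {
    if (visit.medium === 'cpc') pots.ppc.push(visit);
    else if (visit.medium === 'organic') pots.organic.push(visit);
    else pots.other.push(visit);
  }
  return pots;
}
```

Each pot can then be examined on its own, which is exactly why the subsets are so much easier to understand than the one ‘big pot’.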

However, this blog is not about reducing the amount of data in Analytics, it’s about INCREASING it, but for good reason.

We all know that the First Problem for all site owners is getting people to their site. As many have found, setting up a website and waiting for the people to visit in huge numbers normally has only one outcome, a long wait (and a website covered in electronic dust). No, for all but the luckiest of businesses, a lot of hard work is required to get those potential customers into their site.

But there is a second problem

Getting visitors in is all well and good, but unless you get them to do something (buy or enquire or sign up for a newsletter) all the effort of getting them there is wasted. This “something” will vary from one site to another, but in all cases you want to make it easy for people to find what they want on the site as that is bound to increase the likelihood of them making a purchase or whatever else you want them to do.

The answer is of course the “Search” box. You’ll find these on many sites and, when they are done in the right way, they are really useful. With just a few clicks you can input the term “what to feed rubber ducks on” and be provided with a list of all the pages on the site that are relevant, all without all that tiresome trawling through the navigation and having to scan every page.

Searchandising

The term “Searchandising” is in fact all about the science of implementing these search boxes and analysing the data they provide, and guess what, Mr Google can help with both. For a small fee you can have a Google Search facility on your site, this being backed by the same powerful search algorithms that Google use all the time. That beats the hell out of most so-called search facilities I have come across on more sites than I care to mention – talk about awful!

Google have spent years perfecting their search function, so when you use it you have one that allows for the fact that humans are, well, human. They just don’t do things the way a “sensible computer” would. For a start, they can’t spell in many cases, and they tend to write as they speak; in other words they may well type in “yelow rubber duks”. Now a system that expects you to type in just “rubber ducks” is going to get really confused here and will probably decide that there is nothing on the site that matches these requirements and come up with an empty search list. Result: the user goes elsewhere and that potential sale is lost. Not so with Google’s offering.

Listing the Results

With Google you get the results in a way you understand and know, these being rated by relevance, and from experience I can say that the small fee you have to pay each year is more than worth it.

The best bit is yet to come though

Search boxes are found on many sites, but the trick that is often missed is to check what people have actually typed in / are looking for. If you have this data you can use it to improve the way the site is laid out (e.g. which products are on the first page), but beyond that you can also glean a lot of information about visitor behaviour, as it tells you what they want, and that can’t be bad.

The data is also very useful for SEO purposes too, as there is a fair chance that any term typed into your internal search engine will be typed into Google and the like. Thus optimising your site for these terms is a great way of getting ahead of the competition.
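As a small sketch of what you might do with that data (my own illustration), tally the terms typed into the site’s search box and see which come out on top:

```javascript
// Count how often each term was typed into the site search box and
// return the n most popular, ready to feed into SEO keyword work.
function topSearchTerms(queries, n) {
  const counts = {};
  for (const q of queries) {
    const term = q.trim().toLowerCase();
    counts[term] = (counts[term] || 0) + 1;
  }
  return Object.entries(counts)
    .sort((a, b) => b[1] - a[1])
    .slice(0, n)
    .map(([term, count]) => ({ term, count }));
}
```

The most popular internal search terms are strong candidates for optimising the site’s pages, for the reason given above: people type the same things into Google.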

That best bit

Perhaps the very best bit is that Google Analytics incorporates any on site searches in its database, allowing you to see what people have been looking for, all from that one interface, no extra programming, just DATA.

So what should I do?

Unless you have lots of money to burn, I’d hop off to Google, buy the Search facility for your site and install the code pronto; it’s all very easy and if you can’t do it, your web designer can in just a few minutes. Then all you have to do is tell Analytics to incorporate the data (just a tick in the box being needed) and hey presto, you’ll have lots of lovely new data to play with.

OK you might not want the extra data, but it will help..