How To Choose SEO Plug-ins For WordPress

WordPress is one of the most widely used website creation tools out there. One of the reasons it is so popular is the vast number of plug-ins that you can use to enhance your website or its backend functions. When creating any website, you need to consider your content and how it will affect your SEO strategy. To help you sort out your SEO, there are many different WordPress plug-ins to choose from. The problem comes in choosing just one; to do this, you need to ensure that the plug-in has certain features and is user-friendly.

Is The Plug-in User-Friendly?

There is no point in installing a plug-in if you are going to have a hard time using it. This is why you need to see if it is user-friendly before you look at anything else. One way to determine this is to look at the screenshots that plug-in creators supply. These screenshots will generally show you a number of different areas of the plug-in, and you can then judge whether it looks easy to use.

However, for most plug-ins, you will have to wait until you have installed it to see if it is user-friendly. Fortunately, you can easily disable and delete the plug-in if you find it too cumbersome or hard to use. Most SEO plug-ins are actually very user-friendly as they are aimed at helping you and not confusing you.

When Was The Plug-in Last Updated?

SEO is constantly changing and you need to use an SEO plug-in that is up to date. Before you install the plug-in, you need to check when it was last updated. If the plug-in has not been updated in years, it will be out of date and will not help you meet current SEO standards. The best SEO plug-ins will be constantly updated and will not go more than a few months without an update, if that long.

When you are checking when the plug-in was last updated, you should also check whether it is compatible with your version of WordPress. WordPress updates frequently to ensure that you get the best platform for your website. If you are on the latest version of WordPress, the SEO plug-in may not have been tested against it yet, but you can find out the last version it is compatible with. That should be the version of WordPress directly preceding the one that you are using.

Helps You Target Keywords

One of the features that your SEO plug-in needs to have is the ability to help you target keywords. Most SEO plug-ins will let you specify the focus keyword for your content and then tell you the density of that keyword. This is important because excessively high keyword density is often penalized by the search engines.

The plug-in should not only look at the written content on your page when helping you with keywords. The alt text for your images should also be considered, and the plug-in should offer analysis of this too.
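To make the idea concrete, here is a minimal sketch of how a density figure can be computed (the tokenising regex and the sample text are illustrative assumptions, not any particular plug-in's method):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword phrase's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Count occurrences of the (possibly multi-word) keyword phrase.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

sample = "Our SEO plug-in guide explains how an SEO plug-in helps with SEO."
print(f"{keyword_density(sample, 'SEO'):.1f}%")  # ~21.4% here - far too dense
```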

Generating XML Sitemaps

Sitemaps are something that a lot of people overlook because they are focused on the actual content on their pages. However, XML sitemaps can help the search engines crawl your website and index your pages. It is recommended that you use an SEO plug-in that is able to generate these sitemaps for you.

It is also recommended that the plug-in be able to modify robots.txt files. These are basically instructions for the search bots telling them which pages to look at and which to ignore. Most people will not need to edit these files, but it is handy to have a plug-in that can help you with this.
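For illustration, a sitemap is just a small XML file listing your URLs. A rough sketch of generating one by hand (the page list is hypothetical; a real plug-in would pull it from the CMS database):

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical page list; a real plug-in would read this from the database.
pages = ["https://example.com/", "https://example.com/about/", "https://example.com/blog/"]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{escape(url)}</loc>\n"
    f"    <lastmod>{date.today():%Y-%m-%d}</lastmod>\n"
    "  </url>"
    for url in pages
)
with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    f.write(entries + "\n</urlset>\n")
```

Once the file exists, a single `Sitemap: https://example.com/sitemap.xml` line in robots.txt points the search bots at it.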

 

Today’s digitalized era has made it easier than ever for all kinds of businesses to have an online presence. According to one survey, eight in ten Americans now shop online. That is around 79% of U.S. consumers, compared to a meagre 22% back in 2000.

This is why it is a good time to start an online store and spread your wings further, if you haven’t already. Having a digital presence for your brand can help you reach more people, equating to even more sales and revenue for your business.

The first step towards building an online store, however, is to decide which type of platform best suits the nature of your business. One of the most common and user-friendly options is WordPress. Many e-commerce businesses are using WordPress to reach their customers. It is a highly customizable platform that suits almost every business owner’s needs, and it is light on the pocket: according to a recent survey, around 10% of ecommerce stores today are using WordPress to build their websites.

Once you are up and running on WordPress, the next task is to install relevant plugins that can make your e-store even more efficient. The website is going to represent your business, so you want it to be flawless to give the right impression to your customers. There are countless options to choose from to create a beautiful and functional online store.

With more than 40,000 plugins to choose from on WordPress, it can be a tad overwhelming to select the ones that will suit your business. In this article, we are going to share 7 of our favorite ecommerce WordPress plugins that will not only make your life easier, but will also make sure that lots of cash flows in.

1- WP e-Commerce

A popular choice among e-commerce website builders, WP e-Commerce is one of the leading e-commerce solutions and makes it easy for you to run your e-store. It is a free plugin which features a variety of coupon codes, discounts and free shipping options, single-page checkout, integration with all major e-commerce payment options and much more.

You can always upgrade the plugin to make it even more dynamic and add more features like extra payment gateways, slider carousels and additional shipping options. It is undoubtedly a must-have plugin for all e-commerce websites.

2- Hummingbird

If your website takes forever to load, chances are that your customers will leave. According to a recent study, the average visitor will not wait even 8 seconds for a website to load. In addition, Google recommends that a trusted website should load within 2 seconds, otherwise your search result rankings will suffer. This is why it is crucial to speed up your website.

One of the best ways to do this is to use caching, and Hummingbird is a plugin that jumps to your rescue and takes care of this aspect. It is an easy-to-use plugin that scans and analyzes your website, then scores it according to its current speed. If issues turn up, a couple of clicks can usually resolve them.

Hummingbird does not only do page caching; it also minifies, compresses and merges files so that your visitors can experience the best version of your website. It is a paid plugin, but worth all the money. You can try Hummingbird for free with a 14-day trial, and watch your website soar as fast as a hummingbird (pun intended).
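If you want a rough baseline before installing anything, a load-time check is easy to script. A minimal sketch using only the Python standard library (the URL is a placeholder, and this times the HTML only, not images, scripts or rendering, so treat it as a lower bound):

```python
import time
import urllib.request

url = "https://example.com/"  # hypothetical page to test
start = time.perf_counter()
with urllib.request.urlopen(url, timeout=10) as resp:
    body = resp.read()  # time-to-last-byte for the HTML document
elapsed = time.perf_counter() - start

print(f"Fetched {len(body)} bytes in {elapsed:.2f}s")
if elapsed > 2.0:
    print("Slower than the ~2s target mentioned above - caching is worth a look.")
```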

3- Defender

Being the most popular choice for creating websites, WordPress is also an easy target for hackers, so you should be prepared beforehand. It is important to tighten your website security from time to time, and Defender is an excellent plugin that will help you do this and more.

Defender is a plugin that searches your website for vulnerabilities and notifies you whenever it finds one. You can patch them instantly with a single click. You can also schedule clean-up sessions and stop worrying about threats.

You can try Defender for free with a 14-day trial and then buy the full version if you like it.

4- Google Analytics+

While running an e-commerce business, it is important to track your visitors and their visiting patterns. Google Analytics+ is an excellent plugin that gives you the opportunity to work on the areas where your website might be lagging behind.

All the information, including visits, page views, bounce rate, average visit duration and referrers, is presented directly on your admin dashboard. You can also analyze your customers’ shopping behaviors by demographics and interests by enabling the advanced settings.

It is a great plugin that helps you keep track of all the important statistics of your website, and the best part is that it has an easy-to-use interface.

Read More: https://www.besttechie.com/7-ecommerce-wordpress-plugins-make-online-store-better/

 

 

Tips for Protecting Your Website from Hackers that Use SEO Keywords for Spreading Malware

There’s a growing trend for hackers to spread their malware by infecting websites that rank well in the search engines for certain keywords, using those SEO keywords to attract lots of visitors who then become victims of ‘drive-by downloads’.

If you want to protect your website from those hackers, then you will need to be proactive with your security. Many websites are breached not because a hacker specifically targeted them, but simply because they were vulnerable – the hackers get a list of websites that rank well for given keywords, then use software to see if those websites are vulnerable to generalised attacks. If they find a site that is vulnerable, then they’ll ‘break in’ and infect it with their malware.

WordPress and Magento are two of the most popular platforms for business websites – they are used for blogs and content sites, and for online stores. Because they are so popular, they get a lot of attention from malicious developers and users, who know that if they can find a security hole they can exploit it for financial gain – or just for fun. WordPress, in particular, is quite an ‘open’ platform, in that anyone can develop plug-ins and themes for it and distribute them without extensive checks. This means that there are a lot of plug-ins out there that are not well written and are riddled with potential holes for hackers to exploit.

If you run WordPress, Magento – or any other online content management system or store platform for that matter – then you should look at ways of securing it. For WordPress, that means installing plug-ins to block repeated failed login attempts, renaming the admin account, keeping the WordPress core up to date, keeping plug-ins up to date, and removing any plug-ins and themes that you are not using. You should also delete the installation directory once you are satisfied that the installation was successful.

The same goes for Magento. It’s important that you remove the installation directory, change the admin path, and rename all the users to something hard to guess. Keep the platform itself patched up to date, and keep all your extensions patched as well. This will go a long way towards ensuring that the platform runs well and is secure.
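To illustrate the ‘block repeated failed login attempts’ idea mentioned above, here is a minimal sketch of the sliding-window lockout that such plug-ins implement for you (the thresholds and IP address are illustrative assumptions; in practice you would install an existing plug-in rather than roll your own):

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 600   # look at the last 10 minutes
MAX_FAILURES = 5       # lock out after 5 failed attempts in the window

failures = defaultdict(deque)  # ip -> timestamps of recent failed logins

def is_locked_out(ip, now=None):
    """True if this IP has used up its failure budget within the window."""
    now = now if now is not None else time.time()
    q = failures[ip]
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()  # drop failures that have aged out of the window
    return len(q) >= MAX_FAILURES

def record_failure(ip, now=None):
    failures[ip].append(now if now is not None else time.time())

# The sixth rapid failure from one address finds the door already locked.
for _ in range(6):
    if not is_locked_out("203.0.113.9"):
        record_failure("203.0.113.9")
print(is_locked_out("203.0.113.9"))  # True
```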

If your site does get hacked, the first you know of it might be when a malware warning pops up when you visit the site, or when you see a warning against it in the search engines. If that happens, your first priority should be removing the malware, then fixing the exploit that caused it. Once your site is clean – and not likely to be immediately re-infected – then you can look at telling the search engines that it’s fixed and asking them to remove the warnings against your site.

Removing malware is an involved process that takes some technical knowledge – and in some cases the vulnerability is something that only the web host can fix, not an end user. For this reason, it’s a good idea to hire a web developer or a security expert to look at your site for you. It can be quite expensive to fix these issues, so try to prevent them from happening in the first place! Following best practices from day one is much easier than trying to fix your site after an infection has cropped up, and takes less knowledge too.

 

The security protocol used to protect the vast majority of wifi connections has been broken, potentially exposing wireless internet traffic to malicious eavesdroppers and attacks, according to the researcher who discovered the weakness.

Mathy Vanhoef, a security expert at Belgian university KU Leuven, discovered the weakness in the wireless security protocol WPA2, and published details of the flaw on Monday morning.

“Attackers can use this novel attack technique to read information that was previously assumed to be safely encrypted,” Vanhoef’s report said. “This can be abused to steal sensitive information such as credit card numbers, passwords, chat messages, emails, photos and so on.”

Vanhoef emphasised that the attack works against all modern protected wifi networks. “Depending on the network configuration, it is also possible to inject and manipulate data. For example, an attacker might be able to inject ransomware or other malware into websites.”

The vulnerability affects a number of operating systems and devices, the report said, including Android, Linux, Apple, Windows, OpenBSD, MediaTek, Linksys and others.

“If your device supports wifi, it is most likely affected,” Vanhoef wrote. “In general, any data or information that the victim transmits can be decrypted … Additionally, depending on the device being used and the network setup, it is also possible to decrypt data sent towards the victim (e.g. the content of a website).”

Vanhoef gave the weakness the codename Krack, short for Key Reinstallation AttaCK.

Britain’s National Cyber Security Centre said in a statement it was examining the vulnerability. “Research has been published today into potential global weaknesses to wifi systems. The attacker would have to be physically close to the target and the potential weaknesses would not compromise connections to secure websites, such as banking services or online shopping.

“We are examining the research and will be providing guidance if required. Internet security is a key NCSC priority and we continuously update our advice on issues such as wifi safety, device management and browser security.”

The United States Computer Emergency Readiness Team (Cert) issued a warning on Sunday in response to the vulnerability.

“The impact of exploiting these vulnerabilities includes decryption, packet replay, TCP connection hijacking, HTTP content injection and others,” the alert says, detailing a number of potential attacks. It adds that, since the vulnerability is in the protocol itself, rather than any specific device or software, “most or all correct implementations of the standard will be affected”.

The development is significant because the compromised security protocol is the most secure in general use to encrypt wifi connections. Older security standards have been broken in the past, but on those occasions a successor was available and in widespread use.

Crucially, the attack is unlikely to affect the security of information sent over the network that is protected in addition to the standard WPA2 encryption. This means connections to secure websites are still safe, as are other encrypted connections such as virtual private networks (VPN) and SSH communications.

However, insecure connections to websites – those which do not display a padlock icon in the address bar, indicating their support for HTTPS – should be considered public, and viewable to any other user on the network, until the vulnerability is fixed.

Equally, home internet connections will remain difficult to fully secure for quite some time. Many wireless routers are infrequently if ever updated, meaning that they will continue to communicate in an insecure manner. However, Vanhoef says, if the fix is installed on a phone or computer, that device will still be able to communicate with an insecure router. That means even users with an unpatched router should still fix as many devices as they can, to ensure security on other networks.

Read more: https://www.theguardian.com/technology/2017/oct/16/wpa2-wifi-security-vulnerable-hacking-us-government-warns

 

Understanding Google Webmaster Guidelines

The Google webmaster guidelines are a collection of best-practice guidelines that will help Google to better understand your website, and help to ensure that your site ranks as well as possible without being mistaken for spam or otherwise suffering from penalties or ranking issues.

These guidelines can be used to help people to understand how their site should be structured, how it should look, and the content on it, as well as how link building should be carried out.

If you are hiring someone to build your website for you, then the guidelines can help to ensure that the site is structured properly, and can be a useful set of instructions for any web developer.

What’s in the Guidelines

Google Webmaster Guidelines include advice about content and about how a site should be structured. Google wants webmasters to ensure that there are no broken links, and that there are no issues with poorly written content. Pages that load slowly or that have broken navigation will frustrate users, and Google does not want to send people to pages like that, because it knows that to many users the ‘experience’ that Google offers includes the sites it sends people to – the search engine looks better if those sites perform well.
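Broken links, at least, are easy to check for yourself. A rough sketch of a one-page link checker using only the Python standard library (the starting URL is a placeholder):

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Pulls the href out of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:", "javascript:")):
                self.links.append(href)

page = "https://example.com/"  # hypothetical page to audit
with urllib.request.urlopen(page, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)
for link in collector.links:
    url = urljoin(page, link)  # resolve relative links against the page
    try:
        code = urllib.request.urlopen(url, timeout=10).getcode()
    except Exception as err:  # 4xx/5xx raise HTTPError; dead hosts raise URLError
        code = err
    print(code, url)
```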

Google also wants webmasters to have pages that load quickly, and that are easy to use on both desktop and mobile devices. This means that the site needs to be hosted not too far from where most of the traffic will come from, so that it loads promptly.

Google has requirements for sites to be accurate, useful and full of unique content. It frowns upon sites that contain thin content (e.g. sites made for nothing more than hosting ads) and sites that contain a lot of duplicate content. You will need to make sure that your site provides users with information that is useful, up to date, and written for your site. Copying content will not help you in the long run.

There are also some guidelines for link building. If you build up a lot of incoming links then you will most likely find that your site moves up in the rankings – as long as those links are relevant and high quality. Get a lot of spammy links, or get caught buying links from third parties, and you will lose some of that ranking. The reason for this is that Google wants to maintain the integrity of its index. If you are buying links, then you are not earning them through ‘votes’, and Google may conclude that your site is not actually worthy of those links.

This issue has led to something called ‘negative SEO’, where rival webmasters get people to link to another webmaster’s site from ‘bad neighbourhoods’ – for example, having a site that sells trainers linked to from a gambling website. This used to work, but it is now possible for webmasters to disavow links that they do not want to be associated with – you get none of the ‘useful’ link benefits, but you also don’t get penalized for those links existing.
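The disavow file itself is just a plain-text list that you upload through Google’s disavow links tool: one referring domain or URL per line, with # marking comments. A small illustrative sample (the domains here are made up):

```
# Links pointing at our site that we never asked for.
# Disavow an entire referring domain:
domain:casino-link-farm.example
# Or a single offending page:
http://spam-directory.example/trainers-links.html
```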

It is well worth taking the time to learn the Google guidelines. They will stand you in good stead when you are ready to promote your website, and they serve as a good framework for getting your site off to a smooth start, so that the SEO will be easy when you are finally ready to do it and you won’t fall into obscurity through no fault of your own.

 

When former head of web spam Matt Cutts was at Google, he spent a lot of time communicating with webmasters/site owners about updates. We knew what was coming, when it might be coming, and how severe it would possibly be.

If you woke up in the morning and your traffic had fallen off a proverbial cliff, you could go to Twitter and, based on what Cutts was posting, usually determine if Google had run an update. You could even tell how severe the rollout was, as Cutts would typically give you percentage of queries affected.

Although some believe Cutts was more about misinformation than information, when it came to updates, most would agree he was on point.

So if a site fell off that cliff, you could learn from Cutts what happened, what the update was named, and what it affected. This gave you starting points for what to review so that you could fix the site and bring it back into line with Google’s guidelines.

Why the help?

Cutts seemed to understand there was a need for the webmaster. After all, Google’s Search is not their product — the sites they return from that search are the product.

Without someone translating Google’s desires to site owners, those sites would likely not meet those guidelines very well. This would result in a poor experience for Google users. So, that transfer of knowledge between Google, SEOs and site owners was important. Without it, Google would be hard-pressed to find a plethora of sites that meet its needs.

Then, things changed. Matt Cutts left to go to the US Digital Service — and with his departure, that type of communication from Google ended, for the most part.

While Google will still let webmasters know about really big changes, like the mobile-first index, they’ve stopped communicating much detail about smaller updates. And the communication has not been in such an easily consumable format as Cutts tweeting update metrics.

In fact, very little is said today about smaller updates. It has gotten to the point where they stopped naming all but a very few of these changes.

Google communication in 2017

Right now, the Google spokespeople who primarily communicate with SEOs/webmasters are Gary Illyes and John Mueller. This is not a critique of them, as they communicate in the way Google has asked them to communicate.

Indeed, they have been very helpful over the past few years. Mueller holds Webmaster Central Office Hours Hangouts to help answer questions in long form. Illyes answers similar questions in short form on Twitter and attends conferences, where he participates in various AMA (Ask Me Anything) sessions with interviewers.

All this is helpful and appreciated… but unfortunately, it is not the same.

Highly specific information is difficult to find, and questioners are often met with more vagueness than specifics, which can at times feel frustrating. Google has become obtuse in how it communicates with digital marketers, and that seems to be directed by internal company processes and policies.

This lack of algorithmic specificity and update confirmation is how we wound up with Phantom.

Welcome, Phantom

Google has many algorithms, as any SEO knows. Some, like Penguin and Panda, have been rolled into Google’s core algorithm and run in (quasi-) real time, while others, like the interstitial penalty, still run, well, when they run.

Big updates such as Penguin have always been set apart from the day-to-day changes of Google. There are potentially thousands of tweaks to core algorithms that run every year and often multiple times a day.

However, day-to-day changes affect sites much differently than massive algorithm updates like Panda, Penguin, Pigeon, Pirate, Layout, Mobilegeddon, Interstitial, and on and on. One is a quiet rain, the other a typhoon. One is rarely noticed, the other can be highly destructive.

Now, Google is correct in that webmasters don’t need to know about these day-to-day changes unless someone dials an algorithm up or down too much. You might not ever even notice them. However, there are other algorithm updates that cause enough disruption in rankings for webmasters to wonder, “Hey Google, what happened?”

This was true for an algorithm update that became known as Phantom.

Phantom?

There was a mysterious update in 2013 that SEO expert Glenn Gabe named “Phantom.” While it seemed to be focused on quality, it was not related to Panda or Penguin. This was new, and it affected a large number of sites.

When “Phantom” ran, it was not a minor tweak. Sites, and the services that monitor sites, would show large-scale ranking changes of the kind that only seem to happen when there is a major algorithm update afoot.

Now, there was one occasion on which Google acknowledged Phantom existed. Aside from that, however, Google has not named it, acknowledged it, or even denied it when SEOs believed it ran. Over time, this string of unknown quality updates all became known as Phantom.

The word “Phantom” came from the idea that we didn’t know what it was; we just knew that some update that was not Panda caused mass fluctuations and was related to quality.

Not Panda quality updates

The changes introduced by Phantom were not one set of changes like Panda or Penguin, which typically target the same items. However, the changes were not completely disparate and had the following in common:

  • They were related to site quality.
  • They were not Panda.
  • They were all found in the Quality Raters Guide.

We don’t use the word “Phantom” anymore, but from 2013 to 2016, large-scale changes that were quality related and not Panda were commonly called Phantom. (It was easier than “that update no one admits exists, but all indicators tell us is there.”)

Read more: https://searchengineland.com/the-trouble-with-fred-283343

 

 

 

How To Create Quality Content For SEO Purposes

Creating content for SEO purposes is key to getting your website ranking as high as possible in the search engines. Not only do you need to produce and share quality content, but the content has to be both relevant and unique as well. Thus, you need to have strategies in place that will allow you to create such content on a regular basis. Below, we will go over some of the key tips that you can use to create quality content for SEO purposes.

Creating Quality Content:

1. Outsource Your Content Creation.

One of the best ways to consistently create high-quality content is to outsource it to a third party. Finding a good content creation company, or even an individual writer, to create custom content for you will allow you to achieve a level of consistency that you might not manage if you tried to create all of the content on your own. Creating content yourself can be difficult because it takes a lot of time and effort to do consistently. By outsourcing it, you should be able to save a lot of time throughout the entire process.

2. Do It Yourself.

Another good way to create quality content is to do it yourself, particularly if you have expert knowledge in your field. While it takes a lot of time and effort, doing it yourself ensures that you control consistency and quality each and every time. Outsourcing your content creation is a good idea for those looking to save time and scale the process, but some might feel uncomfortable with it at first. If you want to begin by doing it yourself, that wouldn’t necessarily be a bad idea.

3. Create Video Content.

One of the best ways to create high-quality, relevant, and consistent content is by creating videos. A lot of people enjoy watching videos on a daily basis, which is why YouTube is one of the most visited websites in the world. With that being said, creating videos might be easier to scale and less time consuming than creating purely written content. This is a good way not only to get more out of the content that you create, but also to save time and scale production so that you can publish much more consistently.

Overall, there are plenty of tips that you can use to create high-quality content consistently. By using some of our suggestions, you should be able to maximise your ability to craft high-quality, relevant content on a regular basis.

 

Late in 2015, Google confirmed what many of us had already suspected: mobile search had officially surpassed desktop worldwide.

Smartphones and tablets have completely disrupted and forever altered what was once a fairly linear buyer’s journey. These days, a consumer might drop into your funnel at any point, from any channel, and it might be after an unknown number of touch points across platforms and devices that you didn’t see happening.

They’re reading reviews, are exposed to organic and paid social, are searching for nearby answers for their immediate needs and more. Increasingly, consumers are doing all of these things from a mobile device.

Recent research at BrightEdge (my company) shows that 57 percent of all online traffic now comes from mobile and tablet. Pair this consumer insight with the knowledge that Google’s mobile-first algorithm is coming, and we marketers have some work to do.

In this column, I’ll share the results of our recent Google SERPs Mobile vs. Desktop research, and you’ll learn how to Google-proof your SEO and content marketing strategies to prepare for what’s next.

As the shift to mobile has picked up speed, we’ve discovered some new ways to determine what that actually means in terms of real, measurable impact on businesses.

One such insight gleaned from our recent research helps us assess the extent to which mobile matters to Google. We’ve been tracking Google’s experimentation with the mobile-first index since it was announced in 2016, and what we learned might surprise you.

We tracked SERP listing data for nearly 25 million keywords, and what we discovered is that 79 percent of listings have a different rank on mobile and desktop devices. For listings in positions 1-20, 47 percent had mobile and desktop rankings that were not the same.

Furthermore, we found that 35 percent of the time, the top-ranking URL of a domain for a given query is different on desktop than on mobile.

Preparing for mobile-first

Back in 2016, Google first announced their development of a mobile-first algorithm, a direct response to the rising use of mobile across its consumer base. Now, the search giant has begun experimenting with this new algorithm — a test that’s attracted the attention of marketers across all sectors.

It’s impossible to estimate the impact of such an algorithm, yet it’s safe to say you need to start preparing now. Brands that are still looking at their marketing strategy through a desktop view in a mobile-first world are likely to misunderstand the opportunities and threats affecting them (most likely on the mobile side, and in their largest channel — organic search — which accounts for 51 percent of traffic, on average).

But mobile-first isn’t mobile-only, either. Those who come out ahead through this upcoming mobile-first update will have separate strategies for mobile and desktop and will be tracking performance across both. Carlos Spallarossa, director of SEO for cosmetics giant L’Oréal (a client of my company), says:

“Mobile traffic is huge for us and our industry — above the 57 percent [this survey] is reporting. We are developing content with a mobile-first perspective to connect with our users with info, user advice, and reviews – especially when they are near a store where they can easily purchase.”

Winning in this rapidly evolving environment requires a keen understanding of user intent, how your customers use mobile and how your site appears on mobile devices.

Google interprets each user’s most likely intent through micro-moments, which impact how the SERP is constructed and the types of content that appear. For example, if the search engine believes the searcher wants to find a restaurant, the local 3-pack will appear. If the person seems to express an I-want-to-know micro-moment, then a Google Quick answer will appear. Google also varies the number and placement of videos and images on the SERP depending upon the likely intent.

Site developers and marketers must recognize how mobile users interact in these micro-moments and how their intent differs between mobile and desktop. Only then can you ensure that the content created matches both the intent and the device.

For example, a consumer searching for “how to contact KOA” has an “I want to know” query. On mobile, that person is more likely to click-to-call than to type out an email, which is the exact opposite of the desktop searcher.

Read more: http://searchengineland.com/mobile-desktop-seo-different-results-different-content-strategies-281643

Research into the FRED Google update, confirming why sites lost rankings.

The Fred update by Google caused quite a ripple in the SEO world, with many sites losing ranks, and hence traffic, by up to 90% in some cases. I have been doing quite a bit of digging and have asked some gurus some pointed questions about why and what has happened.

The overall thoughts on the matter are that Google penalised sites that had poor content, or ones that were simply there to make money and did not give anything back to the visitor in the form of useful data or information.

User Experience is Another Factor

Other thoughts on the matter were more to do with the user experience that a page gave its visitors. Here the sites said to have been hit included those that placed the copy below the fold of the screen or, in some cases, had very slow load times.

However, in some cases sites were hit that were not just ‘out to make money’, but seem to have been ‘lumped in’ with those that are because of the lack of content on their pages.

Having a Lot of Links Did Not Save Sites

There was also talk that Fred checked on the quality of the links to sites too. This may turn out to be the case; further research is needed on this matter. However, what we can say is that sites that fell foul of Fred’s on-page quality checks were not saved by having a lot of links. Instead, their positions were taken by sites that had inferior linking profiles, at both page and domain levels.

This research only covers 9 sites, so it can hardly be said to be definitive, but the evidence so far is pretty consistent. The next step is further research into the sites that were affected but did not fit the profile of sites that ‘should have been affected’ by Fred. More on the ‘efficiency’ of Fred later.

The FRED Data

In each case, the sites that held a first-page rank for a given term before Fred were compared with the sites that hold the first page now (for that term). The sites that had lost their first-page rank (they had to have held a position of 7 or better pre-Fred) were then checked, with a view to seeing what could have caused them to lose their rank and whether this fitted the profile of sites that Fred ‘should have hit’.

The phrases checked covered a range of topics, ranging from ‘iqf fruit’ to ‘chemical companies’, so they should be diverse enough to give some firm data.
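The comparison itself is simple enough to sketch in a few lines of Python; the result lists below are hypothetical placeholders rather than the actual data.

```python
# Compare the first-page results for one phrase before and after the update.
before = ["site-a.example", "site-b.example", "site-c.example", "site-d.example"]
after = ["site-c.example", "site-e.example", "site-a.example", "site-f.example"]

for old_pos, site in enumerate(before, start=1):
    if site not in after:
        print(f"{site}: held position {old_pos}, now off the first page")
    elif after.index(site) + 1 > old_pos:
        print(f"{site}: fell from {old_pos} to {after.index(site) + 1}")
```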

Search Phrase ‘iqf fruit’

Before and After FRED

[Image: Google search results pre and post the Fred update]

Here two sites lost their first page rank:-

1

[Image: screenshot of the site – not enough text for Fred]

This site had held a rank of 5, and when checked, we saw that the actual page shown when you clicked the link was https://www.thespruce.com/what-does-iqf-mean-995719, a page not even on the stated domain – something that is sure to annoy Google to start with. Furthermore, this page had very thin content and seemed to be there only to provide a place for Google Ads and other advertisements. Being a prime target for Fred, it is not surprising to see that it was hit.

2

[Image: the fruitbycrops site – content too thin]

Again a site with very thin content, just 155 words with an advert at the very top: another prime target for Fred.

 

Search Phrase ‘chemical companies’

Before and After FRED

[Image: Google results for the term before and after the Fred update]

Again, two sites were affected:

12

This is a big website with a lot of links, some 222,000 to the domain (although only 3 to the page in question). The reason it lost its ranks seems to be that the page was just not relevant enough to the term, the topic being just one short item on the page.

4

[Image: an example of a penalised site]

Was this site penalised because its copy was not ‘good enough’? That seems most likely.

This was another page that held just a small amount of what I would call ‘filler text’; it did not really ‘say anything’, at least in my view, the total length being just 251 words. Again, a prime target for the Fred update.

 

Search Phrase ‘welding supplies uk’

[Image: changes in the Google results pre and post the Fred update]

Two sites here:

11

[Image: the Weldingshop site, one of many hit by the Fred update]

This site is not that bad in reality, although some may think it a bit old-fashioned; it is certainly not as bad as many that do hold onto first-page ranks. The most likely cause of the page’s loss of rankings is that the main copy is only 340 words long. This leads me to conclude that the length of the copy is below the ‘satisfactory’ level laid down in the Google Quality Guidelines.

5

[Image: too little copy, and what there is below the fold – possible reasons for the site being hit by Fred]

This page lost a rank of 7, the amount of copy again being the likely cause of the drop: only 270 words on the page, and all of it below the fold, a factor that Google stated (back in 2012) degrades the value of any copy.

Search Phrase ‘metal fabricators’

[Image: Google results pre and post the Fred update]

Three sites had lost their ranks for this phrase:

6

[Image: another site hit by Fred, more than likely due to the small amount of copy – too few words for Fred?]

Yet another page that lost its ranks, apparently down to the lack of content, the copy amounting to just 154 words.

7

[Image: a page with over 600 words, but below the fold – a possible reason for a Fred hit]

This site had a rank of 4 before Fred and does have a fair number of words, over 600 in all. However, 90% of them are below the fold on the screen, and this looks to be the reason for the drop.

8

[Image: yet another site hit by Google’s Fred]

This page lost its 6th position, again a ‘low volume of copy’ casualty, the copy amounting to just 170 words.

 

Conclusion

In all cases we can see that the sites affected by Fred fit the patterns suggested by the gurus and by other research, in that they mostly had very thin copy or ‘hid’ the copy below the fold of the page.
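Spot-checking copy length of the kind counted above is easy to automate. A rough sketch using only the Python standard library (the URL is a placeholder, and the count is approximate, since it cannot tell what sits above or below the fold):

```python
import re
import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks, self._skip = [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

url = "https://example.com/"  # hypothetical page to spot-check
with urllib.request.urlopen(url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

extractor = TextExtractor()
extractor.feed(html)
words = re.findall(r"\w+", " ".join(extractor.chunks))
print(len(words), "visible words")  # the pages above were mostly in the 150-350 range
```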

The next step is to see whether the pages we currently look after, SEO-wise, that also suffered a drop in rankings fit this pattern.

Watch out for another report on this later in April.

How Should You Position Your Web Content?

We were approached by Tracy at UKWebhostreview.com and asked if we would like to feature an infographic on how to position web content on a site for the very best effect. This has always been an important topic, BUT, after the Google Fred update, anything that improves the user experience deserves serious consideration. So, we were more than happy to host this post and hope that you find it as useful as we have.

Guest Post from UKHostReview on Positioning Web Content

If you’re asking this question then you are already thinking a lot more deeply about your online marketing than a large proportion of website owners. People can often get caught up in getting a website set up quickly, or concentrate on which web host to go for and other aspects of website building.

[Infographic on how to place web content, supplied by UKwebhostreview]

When this happens, some of the other important considerations like content positioning can be neglected, which will result in a website that isn’t as effective as it should be. When we talk about website effectiveness, the key measure that most people will be interested in is driving increased customer sales. If you are setting up a business website then one of your main priorities should be to get the positioning right on your website. This can seriously be the determining factor in how many sales your business is making, so should be treated as a top priority for you.

If you’re not an expert in developing or positioning content for maximum effect, then you will probably find this infographic from James at UKwebhostreview.com of great use. It lists the 25 features that every online business must have in 2017, so as you can probably tell, it is a very comprehensive list. It shows you exactly where to add key features like your call-to-action button or logo with tagline. You can also use the list to check that you have remembered to include every essential item of content that a good website requires.

Whatever stage of website set up you are at, whether you are only just beginning or you have had your website set up for some time, you should use these 25 features as a guideline for how to structure your website content to drive the best results.

6 Social Commerce Trends You Should Know About

It’s great to know that your own posts get found and read, and in some cases this leads to even better things: one reader told us about an article they had written that provided even more information on a topic we had been featuring, namely social media’s power in the ecommerce world.


We checked out the article and found it very interesting, social media now being an area that no one can totally ignore. At the moment, Serendipity really only uses social media to boost the power of the links we create. BUT, that is all set to change soon, as we plan to offer a level of social media marketing to our customers.

Anyway, read on, and if you want to see the full article on Social Media Marketing, click the link!

Just a few decades ago, advertising only showed up in a few channels, such as television, radio, and billboards. Companies who wanted to increase sales had to shell out a significant amount of cash to get their products in front of people and there was no way they could guarantee their ads would get traction.

Social media has completely revolutionized the way commerce happens. Now masses of people spend hours on social media platforms, consuming content in astronomical quantities. Companies that want to drive sales have to be innovative in their social media tactics.

Every year, these tactics change as social media platforms release new options. What trends can we expect for 2017? Here are 6 you need to know about.

It’s All About Those Videos

Have you noticed that video is everywhere?

Facebook, YouTube, Instagram, and Periscope have all released live streaming video options. Additionally, all these platforms offer the ability to create video ads. YouTube now has shoppable ads before videos, and it allows companies to create simple calls to action so that viewers can purchase their products.

Expect to see more companies tapping into the power of live, shoppable video even more in 2017.

Companies like QVC and the Home Shopping Network have long demonstrated that live video can generate huge amounts of sales. Now almost anyone can create live videos in which they demonstrate and sell products.

Because these videos can be so highly targeted, they represent a massive opportunity for advertisers. If the live video trend continues, we should expect to see almost every company selling their products live on social platforms.

Cashing In On Those Impulses

Marketers have long tried to tap into impulsive buying. Whether that’s encouraging consumers to call immediately or offering a limited-time discount, impulse buying has always been deeply integrated into the shopping experience.

However, impulse buying is increasing at a staggering rate with social networks.

Platforms like Instagram and Facebook allow consumers to make purchases without ever leaving the platform. And while not exactly a social platform, Amazon has one-click ordering to make it all the easier to purchase without thinking.

Companies know that impulse shopping can drive a huge amount of revenue and are doing everything in their power to make it as simple as possible for customers to purchase without thinking.

In 2017 we should expect to see more and more companies implementing impulse buying options across social media platforms.

Pinterest, for example, isn’t just a place for posting recipes and interior decorating ideas. They now offer a “Buy Now” button which allows consumers to make immediate purchases from the platform.

Considering that a massive amount of Pinterest users visit the site for product-related ideas, it’s a huge opportunity for marketers.

Smart Scientific SEO Strategies for 2017

It’s been a fair few weeks since we managed to post anything on our blog, and frankly I’m amazed at how fast the year has gone so far, at the rate at which things seem to be changing, and at the amount of really useful software that has become available.

The post we’ve highlighted today (see below) comes from a series published by a well-respected web design and SEO company called AimInternet. It is certainly a useful piece and highlights the fact that the information in Google’s Webmaster Tools (now called Google Search Console) is very useful indeed. The main reason I say this is that Google (for reasons of privacy, they say…) stopped reporting in Analytics the keyword phrases used by visitors to a site. You can tell visitors come from Google, but not what search words they used. All very annoying when trying to work out which words are converting and which are resulting in a high bounce rate.

Google Search Console fills this gap to a degree, in that it gives you a good idea of the phrases being used, the number of times a phrase has resulted in someone seeing a Google listing for the site, the click-through rate (very useful, as it gives you an idea of whether your Title and Meta Description are well tuned to get clicks), as well as the average position in Google. But it does not tell you what page searchers land on, or whether they stay or ‘bounce’.
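As a rough illustration, once you export the queries report from Search Console as a CSV, you can flag the phrases whose Title and Meta Description are not earning the click. A minimal sketch (the filename, column names and thresholds are assumptions; exports vary):

```python
import csv

# Assumes a Search Console queries export saved as queries.csv with
# columns named Query, Clicks and Impressions (names vary by export).
with open("queries.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    clicks = int(row["Clicks"])
    impressions = int(row["Impressions"])
    ctr = 100.0 * clicks / impressions if impressions else 0.0
    # Lots of impressions but a weak CTR suggests the listing
    # isn't earning the click for that query.
    if impressions >= 100 and ctr < 1.0:
        print(f"{row['Query']}: {impressions} impressions, CTR {ctr:.1f}%")
```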

You can start extrapolating the data to make some intelligent guesses about what is going on (there is software that will do this for you), but they are only guesses (you could always run an AdWords campaign to check, but that is another story).

But to get back to what the article is about.

Scientific Organic Search Strategy

In the article, AimInternet mention that they had increased the ‘number of keywords present’, by which I think they mean the number of different search phrases (or ‘queries’ in Google Search Console speak) associated with a site. They made a big difference (something that we too pride ourselves on being able to achieve), increasing the number of associated phrases from 300 to 800. What this really means is that the ‘footprint’ of the site on Google has more than doubled, hence it is more likely to be seen and thus get a click! All very good.

The process by which they reached this point is covered in earlier posts, and no doubt they follow the same ‘scientific’ path as we do. If they do, they will first carry out research to find the words being used by people searching for their customers’ services and products. Then they will weave these into the site and construct content that supports the drive for rankings for the chosen target phrases.

What they ‘might’ not do is check the sites that currently have the best positions for these target phrases and then ‘reverse engineer’ them. By following that path you ‘know’ the words that Google likes to see and can thus use them in the content. This system also gives you a list of all the similar words and phrases that should be used, which avoids keyword stuffing and gets the ‘message’ across to Google in the way that we know it likes.

Add some links (which themselves have to be intelligently added – there is software that helps with that now too) and the site WILL, like Eagle, be associated with more query phrases, get better rankings and thus more traffic.

But the trick is in carrying out each of these phases in a controlled scientific manner…

One very interesting point that Aim made is that once you have a list of the phrases that Google associates with a site, you should build on this and write content (about these phrases) that will make the site that bit more interesting and helpful. This will not only cement your position with Google but will no doubt improve the rankings for the site and, more importantly, give your readers more reasons to come back for more, and even, hopefully, buy from you.

They also make the point that visitors don’t always come in through the front door (the home page), so you should make your interior pages interesting too. This is not really new, though: most of the pages on a site should be doing their best to engage viewers by providing useful content, each page targeting a different set of key phrases.

So a very interesting article.

To read the whole post on A Smart Organic Search Strategy please click the link

How We Use A Smart Organic Search Strategy To Get Our Clients On The First Page Of Google

This week we expand on looking at how to get your website on the first page of Google by using a smart organic search strategy.

In our last blog, we looked at the importance of getting on the first page of Google. And, we examined how our methods of using local marketing tools are driving traffic to the homepage – and producing fantastic results – for a client of ours. This week, we’ll expand on part of that methodology – using an organic search strategy to drive traffic to particular product pages or blog pages which then link through to specific product pages. We also do this via Adwords, although this is something we’ll look at in more detail in following blogs.

What Is An Organic Search Strategy?

In brief, an organic search strategy consists of finely keyworded product pages or blogs, which get picked up by Google each time one is published on a website. At this point, you might be thinking “I’ve already got all the information about the products or services I offer on one page of my site so I’ve nailed it, right?” or “I make rubber plugs, why the heck do I need a blog about those, who is going to read it?!”.

OK, so you might not be totally wrong about the last point (but hey, you never know, there might just be a rubber plug enthusiast out there who would LOVE to read your blog about them!).

Getting back to business…

Creating separate product pages on your site and posting blogs is all part of your organic search strategy. Simply, doing so creates more pages on your website containing the relevant keywords that you want your website to be found for, which Google can then index. The more relevant and unique pages and content you have on your site, the more shots on target you have at being shown on the first page of Google.

The important things to note here are relevant and unique. Google is smart and will penalise your site if you post up a load of duplicate pages and content. The same goes if you keyword stuff your posts and pages (make your content unintelligible by jamming in too many keyword phrases).

We won’t go into it here but recommend that you take some time to familiarise yourself with good content practice. That includes following referencing protocols if you are using content from another site. For example, you might choose to do a blog post which rounds up the “5 best things about rubber plugs” and which uses information from other websites. That’s absolutely fine, but just remember to acknowledge and reference your sources correctly.

Why Do This?

How many pages are currently on your website? Probably not that many? So, if you currently have one page that discusses your 10 different products, by separating them out into individual pages you just added 10 extra pages to your site virtually overnight. You’ll be able to expand the content around each product, and so the mentions of the relevant keyword, too. So, whereas on the original page, you may have only listed the type of products you sell, you can now go into more detail about each one on their own page. This naturally allows for an articulate way of including more of your desired keywords on your site – avoiding the extreme no-no practice of keyword stuffing.

Google likes new and relevant content. Each page becomes a new way for traffic to come to your site. Of course, once the core pages of your site are done, it’s likely that you won’t be updating them that often. This is why, as part of any organic search strategy, we advise our clients to blog regularly. And, in the case of blogging, the more regularly you post, the better.

Employing an organic search strategy such as this might mean that traffic enters your website not via the traditional route of arriving at the homepage. Instead it might enter on a product page or a blog post page written around a specific topic, which then links to a product page. Typically, we notice that customers will land on one of the product pages of our client’s websites, because of the organic search that we’ve set up for the client.

If you’re in the pressed parts trade you might do a search in Google for “copper plating”. Google will take into account your location (it gets this information from your settings) and present to you the most relevant results. Let’s say you’re Midlands based, as is EC Williams.

As a result of this search, people enter EC Williams’ site on the Copper Plating product page. Once on the page, you are presented with all of the information you need about “copper plating”, along with some important trust points about the company. Our analysis shows us that from this entry point people then navigate to other pages on the site. From this example in particular, we can see that “zinc plating” is the next most popular page, easily found in the navigation bar under “Plating Services”. From our research, most people stay on the “zinc plating” page, as they’ve found what they want. But if they want more depth, they’ll go on to “zinc nickel plating”.

The point of this is that once on the EC Williams’ website, the customer is presented with everything they need to make a purchasing decision. And, if you were that person looking for a company who were experts in the field of coating pressed-parts, then, bingo – you just found them.

Straight away, serious buying customers get a snapshot of relevant information once they are on the site. Because of the trade they’re in (pressed parts), they become interested in making an enquiry straight away. We’ve measured this extensively on EC Williams’ site plus many others’, and know that it works. You need to make it easy for your customers to find information on your site, and this method works by doing just that. Everything has to be there for the user so that they’re not having to look for things too much.

How Organic Search Strategy Works

Most people will find you through a long-tail keyword search. These are keywords that tend to be more specific. Your website content should be driven by the keywords that your SEO advisor gives you. They need to advise your outsourced content providers of these keywords so that they can write content around them.

Take a look at www.eagleplastics.co.uk, another client of ours. Again, you can see that, similar to www.ecwilliams.co.uk, everything a customer requires is easy to find on the homepage, above the fold.

From an SEO perspective, when we started working with Eagle Plastics, the number of keywords we had to work with was much smaller than it is now. The site was receiving much less traffic than it does today, which meant that there were nowhere near as many clicks or impressions being recorded. This impacted the number of keywords being presented to us by Google. At the time we were only getting about 300 keywords presented, yet a year or so on, Google is now presenting 800 keywords.

This is a result of the organic search strategy we have implemented, like the one discussed earlier. Traffic gets signposted to the Eagle Plastics website based around these 800 keywords. And now that we have more of them, we can start creating content based on different keywords and keyword phrases.

Through testing the blogs, we are able to determine which keyword phrases are the most successful by analysing which ones have the best impressions.

On Eagle Plastics, “High Impact Polystyrene” is a key term for them. We know that this keyword phrase works well for them so we use it regularly in their blog headlines, in the h2 sub-headers and throughout the blog text. Of course though, we ensure we use it professionally and never keyword stuff.

As a result of this organic search strategy, we are providing more content to Google. Google recognises this and starts to suggest more relevant keywords. We then create content based around these suggested keywords and their variations. As we post regular content using those keywords, Google views this as quality content and provides us with even more relevant keywords, which we use to continue to push the search and content strategy. The result is more traffic. But more than that, in getting more traffic, Google rewards you for quality content. And so it continues…

As little as five years ago, most searches were conducted using just two words. Today, people use an average of five words per search term. What was once a keyword search for “plugs” is now a more unique phrase like “the best luxury rubber plugs”. The one-word keyword has become a keyword phrase made up of multiple words; searches are now more unique, and these long-tail keyword phrases more specific.

Ultimately, it’s important to remember that every keyword search represents someone’s intent to find something out. Long-tail keywords help you to better address that user intent by creating unique, tailored content.

Statistics show that of the roughly 3 billion searches made each day, 20% are unique; that works out to around 600 million searches a day that have never been seen before. That’s a heck of a lot of unique searches, and to get displayed on the first page of Google, you need a successful organic search strategy to be found amongst all of that noise.
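
Both claims, query length and uniqueness, are easy to sanity-check against your own search data. Here is a minimal sketch assuming a plain-text file with one query per line; the filename is a placeholder, and “seen only once in this log” is used as a rough proxy for uniqueness:

    # Measure average query length and the share of one-off queries in a log.
    from collections import Counter

    with open("queries.txt") as f:  # hypothetical query log, one per line
        queries = [line.strip().lower() for line in f if line.strip()]

    counts = Counter(queries)
    avg_words = sum(len(q.split()) for q in queries) / len(queries)
    one_off_share = sum(1 for n in counts.values() if n == 1) / len(queries)

    print(f"Average words per query: {avg_words:.1f}")
    print(f"Share of queries seen only once: {one_off_share:.1%}")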

SEO Ho Ho – Search Engine Optimisation in 2016 – Xmas Message

The year is nearly at an end and Xmas has been and gone, but there is still a lot of cheer in the air and pleasant memories of all the festivities to boot, amongst them our company Xmas card (see the image below), which went down very well with our customers.

[Image: SEO Ho Ho Xmas card]

But there are other reasons to be grateful for 2016: in my view, Google has made some really good moves to make results fairer and more accurate, with the latest Penguin update really sorting things out.

This has been somewhat of a relief to SOM, as we have been ‘preaching’ what we call ‘Proper’, ‘Scientific’ SEO. What we mean by this is that we research the words that people are searching for in a market area and find the words that Google ‘wants to see’ for these phrases so that they can be incorporated into the copy. Then we add some relevant links (with a natural anchor text and source type mix) and, hey presto, things start to happen.

The best part of this is that it is all totally ‘Google legal’ and can never, in our view, be subject to any penalties that Google may dream up at some time. We can say this because all we are trying to do is make sure that any site we optimise offers some of the best information there is on a given subject, and of course we make sure that there are enough links to the site’s pages so that Google thinks the same. We call this link building programme ‘priming the pump’, as once the site gets traffic, the links will start building organically. Link building is still required in many cases, but perhaps only because others are trying to get their sites’ rankings higher too…

As to the blog post we have included below, we certainly agree about the rise of AI, and believe that Google searchers have for some time been ‘rats in the Google maze’: Google has been analysing what we click on and which sites we like, getting closer and closer to its goal of truly understanding the real intent behind a given search term.

The other interesting thing raised here is the increased importance being given to mobile search, which is not really surprising when you realise that people are accessing the web from mobile devices more and more.

For 2017 we see it as more of the same: Google getting cleverer and cleverer at separating the good sites (the ones that deserve rankings) from the ones that don’t, all of which means you just have to ‘Do SEO properly’ or suffer the consequences…

To see the full article on SEO in 2016 and some predictions for 2017, read on below.

What we’ve learned about SEO in 2016

Since the inception of the search engine, SEO has been an important, yet often misunderstood industry. For some, these three little letters bring massive pain and frustration. For others, SEO has saved their business. One thing is for sure: having a clear and strategic search strategy is what often separates those who succeed from those who don’t.

As we wrap up 2016, let’s take a look at how the industry has grown and shifted over the past year, and then look ahead to 2017.

A growing industry

It was only a few years ago that the internet was pummeled with thousands of “SEO is Dead” posts. Well, here we are, and the industry is still as alive as ever. SEO’s reputation has grown over the past few years, due in great part to the awesome work of the real pros out there. Today, the industry is worth more than $65 billion. Companies large and small are seeing how a good search strategy has the power to change their business.

As search engines and users continue to evolve, SEO is no longer just an added service brought to you by freelance web designers. With the amount of data, knowledge, tools and experience out there, SEO has become a powerful industry all on its own.

Over the course of the year, my agency alone has earned a number of new contracts from other agencies that are no longer able to provide their own search efforts. A large divide between those that can deliver SEO and those that can’t is beginning to open up across the board.

The rise of AI

Artificial intelligence (AI) is now prevalent in many of our lives. Google, IBM, Amazon and Apple are very active in developing and using Artificial Narrow Intelligence (ANI). ANI can be used to automate repetitive tasks, like looking up product details, shipping dates and order histories and performing countless other customer requests.

The consumer is becoming more and more comfortable with this technology and has even grown to trust its results. Sundar Pichai, Google CEO, announced during his Google I/O keynote that 20 percent of queries on its mobile app and on Android devices are voice searches.

RankBrain, Google’s machine-learning artificial intelligence system, is now among the top three ranking signals for Google’s search algorithm. Why? Google handles more than 3.5 billion searches per day, and 16 to 20 percent of those are unique queries that have never been searched before. To handle this, the team at Google has harnessed the power of machine learning to help deliver better results.

While we can’t “control” RankBrain, what we can do is learn more about how Google is using it and then help the tool by creating good content that earns shares and links, building connections with others in our niche or related niches, and building trust in very targeted topics.

We are still in the beginning stages of this technology, but as more and more homes become equipped with smart tools like Amazon Echo and Google Home, we can be sure that these tech giants will use the knowledge they gain from voice search to power their AI technology.

The “Google Dance”

Every so often, Google likes to surprise us with a major algorithm update that has a significant impact on search results — some years we get one, and other years we get a little more.

While they do make nearly 500 tweaks to the algorithm each year, some are big enough to garner more attention. Let’s look back at four of 2016’s most memorable updates.

Mobile-friendly algorithm boost

A little under a year after “Mobilegeddon,” an event marked by the launch of Google’s mobile-friendly ranking algorithm, the search giant announced that it would soon be increasing the effects of this algorithm to further benefit mobile-friendly sites on mobile search. That boost rolled out on May 12, 2016, though the impact was not nearly as significant as when the mobile-friendly ranking algorithm initially launched.

Penguin 4.0

While this ended up being a two-phase rollout, Penguin 4.0 made its entrance on September 23, 2016. This has been considered the “gentler” Penguin algorithm, which devalues bad links instead of penalizing sites. The second phase of Penguin 4.0 was the recovery period, in which sites impacted by previous Penguin updates began to finally see a recovery — assuming steps were taken to help clean up their link profiles.

“Possum”

While this update was never confirmed by Google, the local SEO community noted a major shake-up in local pack and Google Maps results in early September 2016.

Fellow Search Engine Land columnist Joy Hawkins noted that this was quite possibly the largest update seen in the local SEO world since Pigeon was released in 2014. Based on her findings, she believes the update’s goal was “to diversify the local results and also prevent spam from ranking as well.”

Divided index

As mobile search continues to account for more and more of the global share of search queries, Google is increasingly taking steps to become a mobile-first company. In November, Google announced that it was experimenting with using a mobile-first index, meaning that the mobile version of a website would be considered the “default” version for ranking purposes instead of the desktop version:

“To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results.”

The time to say goodbye to 2016 is fast approaching, and I am truly excited to see what 2017 has in store for the world of SEO!

The Importance Of Hiring The Right SEO Firm

As website owners, we all want to see our site reach the top spot in the search engines and receive plenty of traffic that converts. You can have the best content in the world, but without search engine optimization you will not reach that goal. It can be a rather depressing scenario, because SEO seems so simple on the surface.

However, if you have ever tried to play around with your own site, you realize it is anything but a simple task. There are so many nuances and algorithms to take into account. Learning how to do SEO and then implementing those techniques on a site would be a full-time job for many of us. Thankfully, there are people who do this for a living, and many of them do it well. This article is going to shed some light on how to find the best SEO services.

Solid Portfolio

One of the most important aspects in finding an SEO firm is the quality of their work. They should be able to offer up a full portfolio of sites that they have ranked for several keywords and phrases over the years.

It is essential to take note of how competitive the keywords they ranked for are. Are they easy phrases like “best fried chicken dinner in Louisville KY”? Or something that would take skill to rank for, like “best credit cards”? Anyone can rank for the first phrase, but real skill and expertise are needed for a term as competitive as the second.

Guarantee

Not too long ago, the internet could have been compared to the wild west, and some SEO professionals were the proverbial train robbers. They would charge companies large amounts of money for practically no work at all. This was possible because many companies did not understand search engine optimization as well as they do today.

Now most SEO companies will offer a guarantee on the work they do and will not expect a blank cheque in advance. This is advantageous for smaller companies with a limited budget. If the ranking is not achieved within a specified time, you can either get your money back or allow more time for the firm to rank the keyword.

The Secret Sauce

One final ingredient to keep in mind when looking for the best SEO company is to find out how they plan to rank your site. You will not get every specific, but you do want to make sure that only ethical, white-hat methods are being used. If an SEO agency were to use underhanded methods to rank your site, it may be penalized down the road. When this happens, you will either have to pay a good deal of money to get the site back in the rankings or simply begin a new one.

Recovering from a penalty can be a very time-consuming process, so make sure they are doing things that will not harm your site.

If it was easy to rank a site, everyone would be doing it! However, it is a difficult task that is best left to the professionals to handle. Let them rank your site, while you reap the long-term rewards.

 

Like any worthwhile business investment, selecting a Search Engine Optimization (SEO) agency requires time for careful consideration, and this is doubly true if your business relies heavily on online search for brand discovery.

The sheer number and variety of SEO firms to choose from is enough to give anyone pause.

During this process of intensive research and analysis, a number of important facets are not as easy to surface as looking up an About Us page or researching an agency on LinkedIn.

Yet these same facets are crucial to return on investment; you don’t want to kick yourself for not knowing about them before committing to a potentially long-term relationship.

 

Culture of Transparency and Communication

You’ll want to ensure that the SEO vendor you partner with embraces the same values of transparency and effective communication you expect between in-house teams and/or employees, and for the same reasons, really.

Transparency affords businesses better relationships, synergy, engagement and solutions. Your SEO agency needs to meet the same standards your internal people do on a regular basis.

Some things to consider:

  • Which key performance indicators (KPIs) will be available to you on demand? Are they the right ones for performance tracking?
  • Can you request any data relevant to your relationship at any given time, with good reason?
  • What about communication and turnaround times? Long-term strategies like link building obviously need time to take root, but sometimes you need to shift tactics immediately. Can your SEO agency turn on a dime in these cases?
  • How can you ensure you’re getting the truth, and not a dressed-up version of events designed to make things look good?

Secondary and Tertiary Competencies

While most SEO agencies might list secondary and tertiary competencies in their packages, always make sure to ask.

SEO on its own is strictly limited to traffic, not conversion. It’s a means to the bottom line, which means it functions in concert with related channels within search (e.g. pay-per-click ads) as well as efforts indirectly related to it.

Your SEO agency needs to be at least competent in coordinating and communicating with the other moving parts of your digital marketing machine, to guarantee that their efforts won’t exist in a bubble and that your campaigns don’t sit in disparate silos, failing to work toward a single goal.

Tech Stack

You’d be surprised how many people brush off the importance of tech stack compatibility when looking for partners across the many channels of digital marketing. There are a few simple questions that can help you determine if your SEO agency of choice has the right tech stack for your operation:

  • Are they experts in your current tech? If you’re running on WordPress, as are more than 70 million sites on the web, can your SEO agency work with that, or are they better with a more technical CMS like Drupal or Joomla? (A quick programmatic check is sketched after this list.) Also, it’s one thing to be an expert at a certain tech stack or build, and another to just be “handy” in it.
  • Can they help you migrate to a new one, if necessary? This might seem to contradict the point above, but technology is constantly shifting. E-commerce portal Bluefly, for instance, recently found itself on the wrong end of tech adoption when the e-commerce platform it originally signed up for, among the most popular of the past decade, couldn’t support what it wanted to do on mobile. It ultimately had to switch providers.
  • APIs, APIs, APIs: The tech world is badly fragmented, and your SEO agency needs to ensure it either has the right application programming interfaces (APIs) or the capacity to support them from third parties.
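
As one concrete illustration of this kind of competence, the sketch below checks that a WordPress site exposes its standard REST API, an integration point any WordPress-capable agency should be comfortable with. The site URL is a placeholder, and the third-party requests library is assumed to be installed:

    # Confirm a WordPress site exposes its standard REST API.
    # WordPress serves recent posts at /wp-json/wp/v2/posts by default.
    import requests

    site = "https://example.com"  # placeholder URL
    resp = requests.get(f"{site}/wp-json/wp/v2/posts",
                        params={"per_page": 5}, timeout=10)
    resp.raise_for_status()

    for post in resp.json():
        # Each post object carries a canonical link and a rendered title.
        print(post["link"], "-", post["title"]["rendered"])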

Scale Potential

Your partner’s tech stack is relevant to this factor: scale potential refers to how big your partner can help you get before they become too small for your operation.

It’s a simple truth that different SEO firms have different clientele targets. Some cater to small and mid-sized businesses (SMBs); others focus on enterprise. While the ones that focus on SMBs can offer unique insight to enterprise-level clients, they have neither the manpower nor the tech infrastructure to support enterprise-level SEO.

Read more: http://www.business.com/seo-marketing/7-things-to-keep-in-mind-when-choosing-an-seo-agency/