Shortly after Google began using links as a ranking factor, SEO strategists discovered that if a link’s text (the anchor text) matched the target keywords for the linked page, higher rankings could be attained in a much shorter timeframe. Since then, a large part of SEO has consisted of building links to websites in order to rank higher for target keyword phrases. Anyone engaging in SEO is, essentially, trying to “game” the system to varying degrees.
Using the classic example, if I had a website with a page focusing on “blue widgets”, and I had 100 links to that page from different websites with anchor texts such as “My website” and “mysite.com”, then that page would likely not rank very highly in Google for “blue widgets”. However, if 50 of those 100 links had the anchor text of “blue widgets”, then that page would be much more likely to rank on the first page of a Google search for “blue widgets”.
Google has always publicly said that website owners should create great content on their sites so that other people/websites would want to link to it. This is a great concept, but in reality things are much different. Large sites such as Amazon, Wikipedia, eBay, Zappos, Home Depot, Walmart, and others have a much easier time ranking due to their size. They have so much website content and so many customers that they naturally attract a higher number of links from other websites. That leaves smaller companies/websites in a situation where they must find a way to artificially build links in order to compete with these large sites. In essence, Google had created an environment in which a certain level of manipulation was required by anyone engaging in SEO. Just allowing things to “happen naturally” wasn’t going to result in high rankings for keyword phrases that had the potential to bring significant traffic to a website.
Fast forward to early 2011…
Google began a series of updates called “Panda”, whereby they began to lower rankings for websites they deemed to be “low quality”. This quality measurement was based on lack of original content, significant errors in website code, slow website speed, etc. Many sites found themselves scrambling in 2011 to correct such problems in order to get back in Google’s good graces. A large percentage of these issues came down to copied content or duplicate content. For example, many retailers who copied product descriptions word-for-word from a manufacturer’s website all of a sudden had their Google rankings drop significantly because there was very little original content on their sites.
This was a significant step by Google to “clean up” their search engine results pages. Most SEOs agreed that it went a long way to combat spam problems. However…
In early 2012, Google went a step further.
This time, they have changed the way they treat backlinks to a website in terms of their ability to rank. This new update seems to have two separate aspects:
1. Google seems to have penalized websites that went overboard with automated linking. For example, website owners that used automated tools to create hundreds or thousands of spammy blog comment links have now seen their rankings drop significantly. Similarly, website owners that participated in automated linking networks to build a high volume of links per month have also seen their rankings drop significantly. This was a good thing. But…
2. Google also seems to have put a filter in place that actually penalizes website pages based on the anchor text of the backlinks to that page. In other words, using the previous “blue widgets” example, if a large percentage of the backlinks to a website page about “blue widgets” had the link anchor text “blue widgets”, that page may no longer rank as highly.
In my opinion, Google has failed in this most recent update if their intention was to weed out spammers from their search results. If they have the computing power to determine “spammy backlinks”, then why not just completely devalue those links in terms of their influence on ranking? In choosing to penalize websites in certain cases of over-optimization, they have opened the door to what is termed “negative SEO”, or the ability to spam competitors’ websites with garbage links in the hopes of knocking them out of the top rankings. Of course, this would not affect the largest companies, as those sites already have millions of existing links.
As I said earlier, most SEOs figured out years ago that they couldn’t rank a small website for competitive keyword phrases without some manipulation of the anchor text of the backlinks to their sites. However, there are many sites that have been hit by this new penalty that have done absolutely nothing manipulative at all. Google won’t say what percentage of keyword-matched anchor text triggers the penalty. As usual, they are being intentionally secretive. The conspiracy theorists out there will tell you that Google simply wants more people to move over to Pay-Per-Click (a much more expensive option – and beneficial to Google’s bottom-line).
A big key to success in growing your organic search traffic is good website content – there’s no doubt that this is still priority #1 with Google. Beyond that, a smart link-building campaign will now include a higher percentage of “branded anchor text” links. Instead of overloading your pages with “blue widget” links, use anchor text such as “your company”, “yourwebsite.com”, etc.
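If you keep a list of your backlinks and their anchor texts (from your own records or a backlink tool export), you can get a rough read on your anchor-text mix with a few lines of code. This is just a sketch – the link data below is made up, and Google has never published a “safe” percentage for keyword-matched anchors:

```python
from collections import Counter

def anchor_text_profile(backlinks):
    """Return each anchor text's share of a list of (anchor_text, source_site) pairs."""
    counts = Counter(anchor for anchor, _ in backlinks)
    total = len(backlinks)
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical backlink data for a "blue widgets" page.
links = (
    [("blue widgets", f"site{i}.com") for i in range(50)]
    + [("mysite.com", f"site{i}.com") for i in range(50, 80)]
    + [("My Website", f"site{i}.com") for i in range(80, 100)]
)

for anchor, share in sorted(anchor_text_profile(links).items(), key=lambda kv: -kv[1]):
    print(f"{anchor}: {share:.0%}")
```

A profile like the one above – half the links using the exact target phrase – is the kind of pattern this update appears to target; shifting the balance toward branded anchors is the adjustment being suggested here.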
At Effect, our SEO process will continue to evolve based on our discussions with industry experts, observations of ranking movements, and our own research. This is definitely a bump in the road for many, but those who adjust quickly will continue to win with Google.