Don’t Fear the Penguin! Part 1
In May, Matt Cutts, Google's head of web-spam control, announced another Penguin update, and SEOs everywhere grew increasingly nervous. In recent years, Google has come down heavily on websites that break its rules for search, abruptly ending the visibility of many sites. But why is that so? Read on to find out more.
Don’t fear the Penguin! The answer to Google’s Updates.
Picture the scene: you're on a city-hopping tour of Britain, you find yourself lost in the streets of Durham, and you've got to make your next train. You have an old university pal, James Brown, who you know lives around here somewhere, and you're certain he'd help you get to the station – but you don't know his exact address, let alone his number. So you grab the phone book and start calling your way through the 43 J. Browns listed there.
“Who? Brown? No, love, that’s my ex-. What has he done now?” So he’s not living there…
The second call is equally unsuccessful, as are the third, fourth and fifth. By the time you're on the thirtieth call of the day (a very nice lady on the phone, by the way, who did initially take you for a conservatory salesman), you throw the book in the corner and give up all hope. It's annoying, isn't it, when you just don't get what you're looking for?
This is a problem Google knows about. And it's a problem Google has a solution for – good content.
Not another Penguin?! Why Google is cleaning up the search results
When Matt Cutts, Google's head of web-spam control, announced another Penguin update in May, speculation amongst bloggers and online experts reached fever pitch as everybody watched and waited to see what changes Google was preparing to make to the online world. With no one knowing exactly what effect those changes would have, a shiver ran down the collective spine of the online marketing world. In recent years, Google has come down heavily on websites that break its rules for search, abruptly ending the visibility of many sites. But why is that so?
Think of the phone book. When it gives you the wrong number, it’s useless. When you’re misdirected, when you don’t find what you need, and – worst case scenario – when you’re offended or plagued with bad recommendations, wouldn’t you get a little bit annoyed? It’s so much easier when you’ve got current, clear and easy-to-use information in a user-friendly format. Google thinks the same.
Over the last few years, Google has been working on changing its search algorithm. The main goal is to make sure the search results are optimised for you, the user. Only when users are satisfied with the results and service they get will they keep coming back to the market leader among search engines – the competition is always lurking out there somewhere, even if it's some distance away. In concrete terms, this means Google has spent recent years learning to better evaluate website content that informs users quickly, efficiently and easily. The flip side of the coin is that websites which don't live up to these expectations, which fail to meet users' needs, and which exist purely for self-promotion are moved down in the rankings.
The most important Google Updates – an Overview
The makers of the SISTRIX Toolbox have documented on their website all the Google updates the search-engine giant has carried out since 2002. Many of these updates were aimed at low-quality websites. Here's just a short selection:
– April 2003 Cassandra: Websites with hidden text and hidden links, as well as those engaging in large-scale linking between websites owned by the same webmaster, are punished.
– May 2003 Dominic: Change in the valuation of backlinks.
– November 2003 Florida/January 2004 Austin: Google punish SEO tactics such as keyword stuffing and similar spam techniques.
– January 2005 Nofollow attribute: Google, Yahoo and Microsoft introduce the nofollow attribute, which essentially allows the site owner to determine whether a given hyperlink is allowed to influence search engine rankings.
– October 2005 Jagger: A series of algorithm changes designed to, among other things, punish ‘bad’, ‘illegal’ and paid links.
– September 2009 Vince (Brand): The websites of ‘offline brands’ are strengthened in that they are matched to important and relevant keywords.
– April/May 2010 May Day: Google begin re-evaluating websites which ranked due to long-tail keywords (keywords with multiple terms). Websites with inappropriate or insufficient content lose their places on search engine results pages (SERP).
– February 2011 Panda: Google begin rolling out the Panda update in small steps, starting in the USA and gradually expanding worldwide. The filter punishes first and foremost websites with poor-quality content – more precisely, websites with a high bounce rate or few quality backlinks. Even today, Google continue to fine-tune the Panda update.
– November 2011 Freshness: The Freshness update rewards websites with relevant, up-to-date, focussed and interesting content.
– January 2012 Page Layout Algorithm: Taking into account the relationship between content and advertisement, Google begin punishing sites with too many adverts.
– March 2012 March 50-Pack: These updates bring in a new evaluation of high-quality websites and link texts.
– April 2012 Penguin: The aim of the Penguin update is to penalise websites which were too focussed on obtaining good search engine results through the use of various optimisation techniques.
– August 2012 DMCA Penalty: Google penalise websites frequently suspected of breaching copyright.
– September 2012 Exact-Match Domain: Websites with low-quality content whose names corresponded directly with a search term (e.g. hotels.co.uk) are set back in the search results.
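As an aside, the nofollow attribute introduced in January 2005 (see above) is simply a value placed on a link's rel attribute. A minimal sketch – the URL here is purely hypothetical:

```html
<!-- rel="nofollow" asks search engines not to pass ranking
     credit through this link, e.g. for paid or untrusted links. -->
<a href="https://example.com/sponsor" rel="nofollow">Our sponsor</a>
```

Links without the attribute are treated as normal editorial endorsements, which is exactly why Google expects paid links to carry it.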
In particular, the changes brought about by the Panda, Penguin and Freshness updates had a great influence on many websites and provoked a real shake-up in search results. In fact, Google continue to develop some of these updates even further: the Panda update, for example, has been followed by at least 24 further iterations. The most recent changes came with Penguin 2.0. These have had a somewhat smaller effect and, by all indications, amount to a Penguin 1.0-style update applied to the subpages of a website rather than just its homepage. Despite this, website owners remain ill at ease: Google has already announced further sweeping changes for this year.
No one can predict in advance exactly which techniques and practices Matt Cutts and his team will target with an update. To stay on the safe side and avoid losing visibility, website owners really have to orient themselves by Google's guidelines for good websites.
One thing stands out on the list of recommendations website owners can follow to make sure their site ranks well: Google itself has stated that a high-quality webpage is a determining factor in achieving a good place in the search results. Good content is the most reliable protection website owners have against the negative impact of algorithm changes.
In the next post, we’ll address what good content actually is and why it is the key to successful SEO.