This is a contribution by Scott Mclay, author of one of the most popular link building posts ever on SEOMoz, gamer and builder of fast Beetles. I’ve heard Scott talk at a few local SEO events we both attend in Scotland and he’s a clever and genuinely nice guy. You can follow him on Twitter here.
So here we have it: 2012 is coming to a close, and in the 8 years I have been around the SEO industry I can honestly say this has been one of the most turbulent years I have witnessed. During the year we saw what can only be described as a record number of official updates from Google, among them countless Panda & Penguin refreshes, along with smaller-impact updates such as the exact match domain update.
The number of algorithm changes made this year raises a few questions:
- Why did they do this?
- Did they achieve their aim?
Why Google did this
The end goal for anything Google does is to increase revenue and profits. Because of this, a few questions have been raised by the SEO community, mostly based on the viewpoint that Google is trying to phase out organic search and, in turn, SEO.
The truth is Google relies on organic search just as much as its PPC product for profits. Look at it from a user experience point of view: users expect to find what they are searching for, and if they don't get the experience they want from Google there is a good chance they will go elsewhere, leading to a drop in profits and a greater loss of overall search share to Bing.
Based on the above, the clear aim is to increase the overall search quality for the end user and start to claw back some of the users who have moved over to other search engines like Bing, which is actually doing a half-decent job in key keyword groups.
What did they achieve?
With every update Google releases, we see quite a few websites either getting kicked out of the index or dropping for core keywords. But since these algorithms are based on the overall profile of a website, large companies which users expect to see on the first page are being affected while a large percentage of spam websites continue to rank. Granted, these spam websites only rank for a few weeks, but the spammers just rinse and repeat with a different domain, which lowers the overall search quality.
The annoying thing about these spam websites is that they seem to follow the exact same patterns which, from an SEO point of view, are pretty easy to spot, so questions need to be asked about why Google can't seem to crack down on this kind of activity.
Combating the Spam
The tactics being used by these spammers seem to target a loophole in the algorithm designed to give priority in the rankings to brands or products which have just been released and generate a large number of natural links over a few days – a kind of buzz effect.
Look at it this way: if a new iPhone is coming out, there will be sufficient buzz around the product's landing page, which a user would expect to find through their searches. But in order to rank the page appropriately, a few of the subroutines around link quality would be bypassed until a later date – hence the two-week lag before these websites are removed from the index.
Google needs to find a way to process this kind of information more quickly, or potentially look for clear differences between a spammer's tactics and the more natural link acquisition created by buzz.
What does 2013 hold?
Let's be honest, 2013 is going to be more of the same – update after update – and we as SEOs are going to have to adapt to changes at a faster rate than ever. But Google is always going to look for five things when ranking a website for competitive terms:
- Clean and organised structure
- Content written for the user
- Social signals
- Brand signals
- A shitload of links
At the end of the day, if you base your onsite SEO campaign around a great website that takes the user's needs and wants into consideration, and build your external SEO strategy around relationship building, you will have nothing to worry about.
Merry Xmas and a happy new year!