What was the Panda Update?
14 months ago, Google released an algorithm update that would change the face of SEO forever. The plan was to improve the quality of search results by pushing low-quality sites further down the results page.
For years Google’s results had outstripped its competitors’, earning it a market share of over 65%. However, other search engines such as Yahoo and Bing were beginning to catch up, and webmasters had become wise to how they could manipulate Google’s existing PageRank algorithm – practices such as link selling were becoming widespread.
The Panda update was Google’s answer, and it gave them a whole new way to analyze websites. This new approach demanded better-quality results, giving less weight to the votes of confidence from other websites (links) and choosing instead to listen to what users thought about websites. Not only did it start to consider user opinion on the websites we visited, it did this without even having to ask us for a review!
The idea was to compare a wide variety of signals, such as time on site, bounce rate and social media likes/shares, which together gave Google a strong indicator of a site’s quality. If searchers were only spending 10 seconds on the website they were directed to, clearly it wasn’t what they were looking for. Similarly, if Google noticed that many users were reaching a site (and staying there) by searching for its brand name, the algorithm would take note and bump it up the rankings accordingly.
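To make the idea of combining signals concrete, here is a toy sketch in Python. Google has never published Panda’s actual formula, so the signals, weights and caps below are entirely invented for illustration – the point is simply that several engagement measures can be blended into one quality score.

```python
# Illustrative only: a hypothetical "quality score" combining engagement
# signals of the kind discussed above. Not Google's actual algorithm.

def quality_score(avg_time_on_site_s, bounce_rate, social_shares):
    """Blend engagement signals into a single score in [0, 1].

    Weights and caps are invented for illustration; higher is better.
    bounce_rate is expected as a fraction between 0 and 1.
    """
    time_signal = min(avg_time_on_site_s / 300.0, 1.0)  # cap credit at 5 minutes
    engagement = 1.0 - bounce_rate                      # fewer bounces = better
    share_signal = min(social_shares / 100.0, 1.0)      # cap credit at 100 shares
    return 0.5 * time_signal + 0.3 * engagement + 0.2 * share_signal

# A page users abandon after 10 seconds scores far below an engaging one.
thin_page = quality_score(avg_time_on_site_s=10, bounce_rate=0.9, social_shares=2)
rich_page = quality_score(avg_time_on_site_s=240, bounce_rate=0.3, social_shares=150)
```

The design point is the blending itself: no single signal decides the outcome, so a site can’t game its way up by inflating just one metric.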
Needless to say, when Panda went global in April 2011 it caused a stir. CNET reported a surge in the ranking of news and social networking sites, whilst advertising-heavy sites plummeted down the list, simply because the quality of their content just didn’t warrant a top spot. Panda started to consider the semantic nature of a website; not just how many links it received and how many keywords could be crammed into every orifice presented.
What has the new 3.4 update changed?
The original Panda update was probably the most dramatic change to SEO in the last decade. But, ever determined to deliver the best search results available, the boffins at Google have been toiling away in the background since then, working on an update that would encourage more high-quality sites, and together with the recent Venice “Freshness” update, level the SEO playing field for the future.
Now, it’s worth pointing out that search engines such as Bing and Google don’t really like old-fashioned SEO; in truth it’s just gaming the system – trying to trick them into giving our site a high ranking by looking at what ranks well, listening to what they say they want from websites, and optimizing accordingly. Under that old model, a site with weak content had no need to panic: a good link building campaign, well-optimized anchor text profiles, and keywords all over the show could make up for it. Some people would say this is a good thing – after all, Mr Pen’s Online Pen Store isn’t trying to provide fascinating information about pens for its users, it’s trying to sell them pens. However, this approach has left search engines open to exploitation, and when your company is worth 190 billion dollars you don’t tend to put up with it…
So how do you stop people doing too much old-fashioned SEO on a site? Simple. You penalize it, and that’s exactly what Google have done. Sites with great content and user value are going to get all the love, whilst ones with duplication or too much optimization are going to get the cold shoulder as Google dumps them to go and “find itself”.
I should mention at this point that the update doesn’t mean that SEO isn’t important any more – if anything, now is the time for site owners to review their strategy, particularly those who have been affected by the recent changes. What it does mean is that SEO needs to be done carefully; cowboy approaches just won’t work any more.
As ever, no one except Matt Cutts and the rest of Google’s spam squad actually knows what makes a site overly optimized. It’s generally believed that link exchanges and keyword stuffing are already picked up by the algorithm in use, so either they’re going to crank up the punishment they deal out to the guilty parties, or they have some new ideas about what an over-optimized site looks like. This presents a challenge for sites that have undergone a lot of search tuning, but it will level the playing field, allowing pages which have not been rigorously SEO’d to rank on the basis of their content rather than the old-fashioned signals – great news for webmasters who have only just set out on their online journey.

The latest Panda 3.4 update will also have a big effect on websites with a lot of duplicate content, because, after all, what good is a list of results that all say the same thing? Having unique and interesting content is what will put you at the top. Dealing with duplication already on your site can be done quite easily with some SEO work, but it’s providing the great content that keeps people engaged that will really put you in a competitive position.
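Spotting duplicate content of the kind mentioned above is a well-understood problem in its own right. Here is a toy Python sketch using word “shingles” and Jaccard similarity – a classic textbook technique, offered purely as an illustration of the concept, not as Google’s actual method. The example pages are made up.

```python
# Illustrative only: flagging near-duplicate pages with word "shingles"
# (overlapping k-word sequences) and Jaccard set similarity.

def shingles(text, k=3):
    """Return the set of k-word sequences appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 0 = disjoint, 1 = identical."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical pages: two near-duplicates and one distinct article.
page_a = "our pens are the finest pens money can buy today"
page_b = "our pens are the finest pens money can buy online"
page_c = "a practical guide to choosing a fountain pen nib width"

# Near-duplicates share most of their shingles; unrelated pages share few.
dup_score = jaccard(shingles(page_a), shingles(page_b))
distinct_score = jaccard(shingles(page_a), shingles(page_c))
```

The takeaway matches the article’s advice: lightly reworded copies of the same text still look almost identical to this kind of measure, so genuinely unique content is the only reliable fix.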
Exactly what is being punished, and how, isn’t entirely certain at this point, but what we do know is that it’s going to have an impact on sites that rely on their link popularity to get into the top listings for a search phrase, rather than on the quality of the user’s experience. One thing is for sure: Google are pulling our attention towards the one thing they’ve always wanted us to do – build sites for users, not Googlebots.
If you want to safeguard your online business from search engine updates, speak to me or the team about our SEO services.