Google Flux & SEO Strategy
In his post of 25th April 2016, Dr Pete Meyers discussed our ability to accurately predict the ‘Google Weather’.
It is commonly acknowledged amongst marketers that Google update their algorithm constantly, raising the question: how do we strategise Organic campaigns in an ever-changing landscape?
What is Google SERP Flux?
Google’s Search Engine Result Pages (SERPs) update frequently. Very frequently, in fact.
Outside of the notable named updates (Panda, Penguin, RankBrain, Pigeon, etc.), Google test and alter their algorithm more often than many realise. In 2012 alone, Google made 665 improvements to their Search algorithm. This rose to 890 updates in 2014, equating to an average of roughly 2.4 updates a day!
The result of these updates is ‘flux’: an ever-evolving environment in which no SERP position is ever truly owned by one website. Indeed, because Google ‘personalises’ search results based on each user’s history, flux is more prominent than ever.
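As a rough illustration of how flux can be measured, the sketch below computes the average absolute day-over-day change in rank across a tracked keyword set, the general approach behind ‘Google weather’ indices. The keywords and positions are entirely hypothetical.

```python
def flux_score(ranks_yesterday, ranks_today):
    """Average absolute rank change across a tracked keyword set.

    Both arguments map keyword -> SERP position (1 = top result).
    Keywords missing from either day are ignored. A higher score
    indicates a more turbulent day in the SERPs.
    """
    shared = ranks_yesterday.keys() & ranks_today.keys()
    if not shared:
        return 0.0
    total_change = sum(abs(ranks_today[k] - ranks_yesterday[k]) for k in shared)
    return total_change / len(shared)

# Hypothetical tracked positions for two consecutive days
yesterday = {"blue widgets": 3, "buy widgets online": 7, "widget reviews": 5}
today = {"blue widgets": 4, "buy widgets online": 7, "widget reviews": 2}

print(flux_score(yesterday, today))  # (1 + 0 + 3) / 3 ≈ 1.33
```

Tracked daily over a large, stable keyword set, spikes in a score like this suggest an algorithm update has rolled out, even before Google confirm anything.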
Monitoring Google Flux
The primary aim of our SEO team is to improve search engine visibility, and with it the average rank of target keywords over time. Monitoring Organic positioning is therefore essential to the role. Building a complete picture of the sources of inbound traffic requires a diverse set of tools, including dedicated rank tracking software, broader rank tracking solutions, and collated data from Google platforms.
- Monitoring primary keywords – Primary keywords, the priority search queries for a campaign, are usually those shown to provide strong transaction and revenue opportunities. The value of these queries means precision is essential when monitoring rankings; the difference between positions 1 and 2 can be significant. This necessitates a dedicated rank tracking solution: one which can provide accurate data for specific search engines, locations, devices, and SERP elements (such as the knowledge graph or answer box).
- Secondary & tertiary keywords – SEO campaigns benefit from a wide array of keywords; more than would ever be practical to track individually. Broader rank tracking solutions can provide generalised ranking data for larger keyword sets, with average rankings serving as an indicator of change. Additionally, understanding how a webpage’s visibility is affected by both long- and short-tail traffic can influence decisions on elements such as site hierarchy and semantic fields, leading to improved rankings and traffic.
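The ‘average rank as an indicator of change’ idea above can be sketched as follows; the keyword tiers and positions are hypothetical, and a real campaign would track these averages over time rather than at a single point.

```python
from statistics import mean

def average_rank_by_tier(rankings, tiers):
    """Average SERP position per keyword tier.

    rankings: keyword -> current position (1 = top result).
    tiers: keyword -> tier label; untiered keywords default to "tertiary".
    Comparing these averages period over period flags movement in
    keyword sets too large to monitor individually.
    """
    grouped = {}
    for keyword, position in rankings.items():
        grouped.setdefault(tiers.get(keyword, "tertiary"), []).append(position)
    return {tier: mean(positions) for tier, positions in grouped.items()}

# Hypothetical campaign keyword set
tiers = {
    "buy widgets": "primary",
    "widget price": "secondary",
    "what is a widget": "secondary",
}
rankings = {
    "buy widgets": 2,
    "widget price": 8,
    "what is a widget": 14,
    "widget history": 21,
}

print(average_rank_by_tier(rankings, tiers))
# e.g. {'primary': 2, 'secondary': 11, 'tertiary': 21}
```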
Predicting Flux & Influencing Change
Predicting flux is equal parts impossible and essential. The frequency with which Google update their algorithm means that whilst change is inevitable, we can never be 100% certain of the refinements made. Yet to influence change we need to consistently collate data and map the relative importance of ranking metrics. This necessitates an open-minded approach to search marketing. Thankfully, the SEO community promotes a shared dialogue around collated data, improving our ability to predict future requirements and strategise accordingly.
Take, for example, the Penguin update. On 22nd May 2013 Google rolled out Penguin 2.0, an updated version of their link-penalising algorithm. This created chaos within the Search Marketing industry as experts clamoured to learn precisely what the algorithm affected, and how, in both their short- and long-term positioning. The data collated by the industry, coupled with limited feedback from Google, led to a strengthened understanding of which links Google takes exception to. Fast forward three years and we find ourselves on the verge of the next (long-overdue) Penguin update. Its release will no doubt result in unprecedented flux across all SERPs, with the winners and losers determined by those who strategised best over the past 18 months (since Penguin 3.0 rolled out).
‘Influencing change’ is something of a misnomer; change will occur whether we act or not. The aim of an organic marketing campaign is to incite positive change, which requires not only protecting existing search engine visibility but also improving rankings for the search queries determined to be of importance.
Final thought – Providing Consistent Results
The vast number of metrics used to determine where a web page ranks for a given query means there is no single solution for any business. Whether acquiring new traffic or protecting existing traffic, Search strategies must be bespoke: tailored to the specific needs of the business and informed by competitor data. This could include improvements to technical requirements, consideration of duplicate content, improvements to semantic fields, or the development of stronger inbound authority. For more on our approach to Search Engine Marketing, visit our SEO services page.