Category: SEO

CandidSky nominated for “Best Use of Search – Finance” at the UK Search Awards 2018

The amazing work of the CandidSky team has been recognised at this year’s UK Search Awards, with a nomination for “Best Use of Search – Finance” for our work with Buddy Loans.

Now in their 8th year, the UK Search Awards are a celebration of expertise, talent and achievements within search marketing. The awards cover SEO, PPC and content marketing, attracting hundreds of entries each year.

CandidSky and Buddy Loans are up against nine other campaigns in the “Best Use of Search – Finance” category, including brands like Direct Line Life Insurance and GoCompare.

The winners will be announced at the UK Search Awards ceremony on the 29th November in London, and in the meantime, here’s a little more about why this cross-channel search campaign is award-worthy…

An award-worthy campaign

Buddy Loans are a guarantor lender with ambitious targets. They approached CandidSky to help them reach more potential customers and convert more of their website traffic.

The campaign CandidSky ran for Buddy Loans was a cross-channel campaign with SEO being a prominent area of focus.

Within six months of Buddy Loans working with CandidSky they:

  • Improved rankings for the term “Guarantor Loans” from position #12 to #3
  • Became the #1 ranked direct lender

And the campaign is still running! CandidSky’s upcoming plans for Buddy Loans will have them reaching more people and smashing their targets.

While the agency’s strongest roots are in SEO, CandidSky has evolved into a truly cross-discipline agency. Our work on the Buddy Loans campaign demonstrates our evolved approach at its best – with SEO, content, and paid media all helping to deliver amazing commercial results for the client – the end result being a 42% increase in funded loans.

Helping our clients grow will always be our greatest reward. But this nomination is an incredible acknowledgement of our forward-thinking approach to achieving those results.

Take a look at some more examples of the great work the CandidSky team are doing over on our work page.

Why should you care about mobile first? And what is it, anyway?


‘Mobile first’ simply means the design and development of a website that prioritises a user’s mobile experience over desktop. In a buying journey, potential customers move across multiple devices and channels when researching and looking for solutions. So your website needs to be responsive (scaling up and down to suit the device it is being viewed on) to provide the best possible user experience. This is also hugely important for SEO, as Google has explicitly said it favours websites that provide a positive user experience on different devices (including mobile).
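
To make ‘responsive’ a little more concrete, here is a minimal sketch of the two building blocks most responsive sites rely on: a viewport meta tag and a CSS media query. The breakpoint value and class name are illustrative assumptions, not a definitive implementation.

  <!-- In the page <head>: render at the device's width, not a fixed desktop width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* Default (mobile-first) layout: a single column */
    .products { display: block; }

    /* On wider screens, switch to a side-by-side layout */
    @media (min-width: 768px) {
      .products { display: flex; }
    }
  </style>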

What is one of the first things you do when you wake up? Check your phone? More people are shopping on their smartphones than ever, and the data shows that estimated annual spend on mobile nearly doubled between 2016 and 2017, from £13.5 billion (OC&C Strategy Consultants) to £27 billion (Centre for Retail Research), and is forecast to reach £43 billion by 2020.

However, according to PayPal only 18% of small businesses in the UK have a website that’s mobile responsive. This means a huge percentage of companies are missing out on a sizeable amount of traffic and potential leads from mobile.

What’s the impact on Search Engine Optimisation when your site is not responsive?

In March 2018, Google announced it had started to introduce mobile-first indexing, meaning Google will crawl, index and rank a page based on the mobile version of a webpage instead of the desktop version. This move recognises that over half of all searches are now performed on a mobile device. It doesn’t mean you won’t be ranked if you don’t have a mobile version of your website, but it is something to be aware of – if you don’t have a responsive website, you could be losing out to websites that do.

Responsive websites improve usability for those on mobiles, which in turn leads to better levels of engagement – a factor that Google is known to measure using metrics like time spent on page and bounce rate.

A further SEO benefit of responsive websites is that all your content is stored in a single place, rather than potentially duplicated across desktop and mobile-specific domains, which could negatively impact how you rank if not managed correctly.


Impact on conversions

Whilst one aim of search engine optimisation is to increase organic traffic and improve rankings, conversions are the bottom line that matters. Generally, mobile conversion rates are lower than on desktop, but they are catching up as ecommerce providers speed up the user registration and payment process. A cross-channel approach helps too: for example, an abandoned shopping cart email offering a discount to complete the purchase can push the customer over the line to payment.

Non-responsive websites can make it harder for users to interact with your site: tiny buttons that are too close together, and having to scroll and zoom in, out and across, make for a much more difficult experience. These problems simply don’t exist with a responsive design – a point Google’s Webmasters account made on Twitter on 14th June 2018.

Is my website mobile friendly?

Google very helpfully provides a tool called the Mobile-Friendly Test (https://search.google.com/test/mobile-friendly). This will run through your web page and diagnose any potential issues which need to be corrected.

If you don’t want to miss out on mobile traffic and need to optimise your website, speak to us today!

 

The search engines you are ignoring but your customers aren’t


As soon as you hear the term ‘search engine’, I would guess that Google immediately springs to mind. How often have you said ‘I’ll Google it’?

Have you ever said, ‘I’ll Yahoo it’?

However, it’d be a mistake to think that Google is the only search engine available to your customers when they’re researching solutions/products. We take a look at the others that may not have occurred to you, and how you can increase your visibility on these search engines to reach a wider audience.

YouTube

If it hadn’t occurred to you before, YouTube is the world’s 2nd largest search engine, with 1.5 billion monthly users. Far from being dominated solely by cat videos, YouTube is an increasingly popular marketing channel for companies to connect with and attract customers.

When uploading your videos to YouTube, it can be challenging to stand out amongst so much content. Think carefully about your title, thumbnail, and description, as these are all important in attracting your customers’ attention. Also don’t forget to promote your video on your own website and through social media channels. You will have spent a considerable amount of time and budget on video production, so don’t neglect to push it out on as many places as possible!

Bing

Bing is a search engine owned and operated by Microsoft, relaunched in 2009 to take on Google’s market share. Bing’s usage has increased by 5.2% over two years, so it is worthy of your attention, and it also offers additional services including Bing Maps and Bing Travel.

Similar to Google AdWords, Bing offers its own paid advertising platform with text ads available. Given that Google dominates the majority of paid search, competition for keywords on Bing is likely to be lower – making it a more budget-friendly option.

There are differences between how Google and Bing rank websites, so it is worth doing some research into how you can optimise your website for Bing too; Bing is much more straightforward when it comes to search engine optimisation than Google. For example, Bing is more reliant on targeted and exact-match keywords, whereas Google generally prioritises more comprehensive or related phrases. Bing also uses location tags to accurately place your business in local search results, which Google doesn’t. They’re easy to add to your site’s code and won’t negatively affect your Google rankings.
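
As a hedged illustration of those location tags, the geo meta tags below are the commonly cited convention for helping Bing place a business locally; the coordinates, place name and region code are placeholder values for a Manchester business, not a definitive setup.

  <!-- Placed in the <head>; all values are illustrative placeholders -->
  <meta name="geo.position" content="53.4808;-2.2426">
  <meta name="geo.placename" content="Manchester">
  <meta name="geo.region" content="GB-MAN">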

Therefore, when optimising for Bing, it might be more advantageous to use exact match keywords in your headings and metadata. Yet, be sure to do this sparingly or your position in Google could be adversely affected.


Yahoo

Launched in 1995, Yahoo was one of the first search engines on the market. In 2009, Yahoo announced that Bing would power its search results and handle its advertising. Therefore, by advertising and optimising for Bing, you should also increase your visibility on Yahoo.

Consequently, the same rules when it comes to search engine optimisation apply here. It is also worth noting that Yahoo/Bing prioritises the age of a website when it comes to rankings.

Twitter

Twitter, the quickfire social media network, also comes with a search function that should not be ignored or underestimated. With a growing number of monthly users, tipping 336 million at the beginning of 2018 (source: Statista), Twitter can be an outlet for the latest company news, product launches, and showcasing your company culture.

By using relevant hashtags, you can increase the visibility of your company and extend the reach of your content. Whilst social shares don’t directly affect rankings, they can have a positive knock-on effect for search engine optimisation.

Tools like Hootsuite can help you schedule tweets in advance, so you post with a consistent frequency.

Wrapping up

By focusing only on Google as a search engine, you are missing out on a huge audience. It is definitely worth investing marketing spend and time in the other search engines – putting your eggs in various baskets, so to speak.

Google frequently implements new algorithm changes which can have a detrimental effect on your rankings. Although a skilled agency can help you recover, exploring other search engines can help to spread the risk.

Next steps

If you wish to speak to our team about increasing your presence on other search engines, contact us today.

Take a look at our other blog posts to increase your digital knowledge.

 

SEO, UX & CRO: how do they interact for success?

SEO (Search Engine Optimisation), UX (User Experience) and CRO (Conversion Rate Optimisation) are too often treated as separate entities. But getting all three aspects of a marketing strategy nailed is key to increasing website traffic, engagement, conversions and retention… and who doesn’t want that?

With so much focus on high rankings and driving traffic to specific areas of a website, what often gets forgotten is keeping that traffic on the website and increasing its conversion.

From my own experience in previous positions, I‘ve been in the unfortunate situation where the design department has conveniently forgotten to bring the SEO team into a new web build discussion until it is ‘too late’ to make any design changes. One reason for this ‘selective amnesia’ is the belief that SEO and UX cannot work hand in hand, and that SEO recommendations will be detrimental to achieving a well-designed website. Well, unless your marketing strategy is to use word of mouth as your sole traffic-driving channel, your results are likely to reflect the lack of integration across departments.

Conversion Rate Optimisation (CRO), I feel, is a channel normally thrown in alongside new landing page designs, with multiple elements tested at once and little thought given to what is actually being tested. Yet it has enormous potential to make a real difference to website performance.

Most companies undertake an element of SEO marketing, therefore identifying where to gain that competitive advantage from a website perspective is even more crucial. This is where UX and CRO come in.

One way to think about how these three channels work together is:

SEO = art of driving traffic to the site

UX = keeping traffic engaged on the site

CRO = turning traffic into customers/taking a specific action

SEO satisfies search engines, UX and CRO satisfy people.

While the goals of each channel might be slightly different, they are part of the same customer journey, and all need to be successful in order to maximise a website’s potential. Imagine a website that focused solely on SEO: organic rankings would (hopefully) improve and plenty of relevant traffic would arrive on the website. However, if the level of UX doesn’t match the level of SEO, users will simply exit the website when served a confusing, poor user experience.

Crossover between SEO, UX & CRO

UX and CRO appear similar at face value; however, there are slight differences that demonstrate the value that can be achieved by focusing on both channels.

UX is intended to make your website easier to navigate and to take key actions on. CRO is intended to make the actions you want users to take – such as downloading a white paper, submitting an enquiry or joining a newsletter mailing list – more prominent and more frequently taken.

In addition, poor user experience metrics – such as low time on site, few pages per session and a high bounce rate – inform search engines that ranking the site highly would serve a poor user experience to their audience.

In fact, a positive user experience is becoming more and more important from an SEO perspective, with factors such as a site being secure, mobile-friendly and fast to load all positively impacting organic ranking positions.

Therefore, neglecting the work needed to ensure your website traffic is served a positive user journey and can convert easily has a detrimental impact on your SEO efforts.

CRO is likely seen as the least important of the three, based on the recognition and promotion it receives industry-wide – which is hard to comprehend when it is the channel that can most directly impact your revenue and ROI, and build brand loyalty through conversions.

A comprehensive CRO campaign should combine data-driven insights with user experience work, A/B testing, competitor analysis and in-depth user testing. It is an area of the marketing mix that should be consistently evaluated, and although best practice elements are involved, there is no one-size-fits-all approach to CRO; it should therefore be tailored to the customer’s behaviour, intent and objectives.

Getting out of the siloed mindset

Obviously, saying this is the easy part; the challenge is implementing it within a company’s internal processes. This should start with involving all departments at the beginning of each web build or project, to ensure SEO, UX and CRO are all factored into the design.

In a previous blog, I refer to the impact of siloed marketing channels and how cross-channel marketing is crucial to driving real value in a campaign. Well, this blog follows the same train of thought in the sense that design and channel teams working in a siloed fashion can lead to siloed outcomes.

If your company is undertaking a new website build, ensure that design isn’t the only thing driving the outcome – the wider strategy should play a role too. Creating a product that the client is happy with, and that drives quality traffic and conversions, is a win-win for everyone. Experimenting with new page layouts, for example, has the potential not just to benefit SEO performance, but to provide a more suitable page for your PPC traffic to land on, increasing ROI and decreasing costs.

Next steps

If combining SEO, UX and CRO is an area you want to explore further, please get in touch to discuss how we can best support your wider marketing objectives.

If you would like to work at CandidSky and develop your career prospects, take a look at our Careers website for available roles and find out what it is like to work here.

And finally, take a look at our other blog posts to see what else we have been up to.

9 common SEO mistakes to avoid

To say that search engine optimisation is a challenging task is a bit of an understatement. The algorithms for platforms such as Bing as well as Google are constantly changing and require trained minds to understand how best to adjust strategies accordingly.

Despite this, numerous business leaders hear about the benefits of SEO and view it as a simple profit-generating venture. Consequently, they make errors which have the opposite effect – costing them traffic and conversions.

The good news is that many of these common mistakes can be fixed, and agencies such as CandidSky are there to help businesses resolve them. Yet, if you’re wondering whether your SEO campaign is on the right track, I have detailed some mistakes which I’ve seen individuals make. I’ll explain each in more detail below, but in summary these are the 9 most common SEO mistakes to avoid:

  1. Preventing crawls from taking place
  2. Believing link building (or any strategy) is dead
  3. Investing in link earning, rather than intentional link building
  4. Excessively using keywords
  5. Neglecting mobile users
  6. Creating orphan pages
  7. Reusing content
  8. Neglecting local search
  9. Abandoning a strategy too soon

Preventing crawls from taking place

The robots.txt file is a vital part of any website and specifies pages you don’t want accessed by crawlers. When put like that, it’s tempting to prevent all robots from visiting your website. However, it’s worth remembering that search engines use crawlers to find relevant pages.

Consequently, if all robots are prevented from accessing your site, you won’t appear in search results. In this situation, it won’t matter how much effort you put into your SEO campaign – your website might as well be invisible.
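
As a quick sketch of how this happens in practice, compare these two alternative robots.txt files; the /admin/ path is a hypothetical example, not a recommendation for your site.

  # Alternative 1: blocks ALL crawlers from the ENTIRE site - nothing gets indexed
  User-agent: *
  Disallow: /

  # Alternative 2: blocks only a private area and leaves the rest crawlable
  User-agent: *
  Disallow: /admin/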

If you want to learn more about robots.txt files, take a look at this resource.

Believing link building is dead

To borrow a routinely used phrase, there’s a lot of “fake news” out there. This is also true in the SEO industry. It is partly spread by people who haven’t given a strategy the time it deserves, and who then post articles online claiming that what they were doing is ineffective.

Link building is one of the tactics most frequently described as dead, when it is still a viable – and essential – part of any SEO campaign. This dismissiveness might be due to how link building has evolved over the years, and I’ve linked to a post explaining some good strategies to use.

Regardless, if you see an article stating that a particular tactic is dead, don’t completely take it at face value. Gather multiple sources and trial several strategies before writing it off. Chances are, the author of that piece has just been doing it wrong.

Investing in link earning, not building

Link earning is a phrase which tends to differ in meaning depending on the person who uses it. While some definitions involve creating quality content and assuming people will read it, others favour some degree of promotion and getting automatic backlinks as a result.

Sadly, this strategy is not as effective as actual link building. When publishing content on your site, always invest suitable time in content promotion. In fact, Social Triggers states that, when generating content, 20% of your time should be spent on creation – 80% should be allocated to promotion.

Excessively using keywords

In the early days of SEO, using as many mentions of a keyword as possible greatly benefited your campaign. For example, webmasters wanting to rank highly for “cheap used cars” generally had to include that phrase more than their competitors to succeed. Eventually, it got to a stage when SEO-targeted content became challenging to read and looked like this:

Are you looking for cheap used cars? Great, we have a fantastic selection of cheap used cars in our online store. Browse our selection of cheap used cars and find a cheap used car to suit your needs. Speak to us today about finding a cheap used car.

Thankfully, this strategy does not work anymore. Yet excessive keyword use is still prevalent within some SEO campaigns. These days, relevance and variation are vital when targeting a particular phrase. Not only is this better for improving visibility, but – from a UX point of view – it helps the user as well.

Neglecting mobile users

The browsing habits of users have changed dramatically over the years. Whereas people were once limited to desktops, they can now access the internet through mobile. In fact, in 2016, the number of internet users on mobile and tablet exceeded desktop for the first time.

However, despite the importance of mobile optimisation, research conducted by PayPal this year suggested that just 18% of UK small businesses had a website which was correctly optimised for mobile.

Needless to say, by neglecting mobile users, companies are missing out on a substantial number of customers.

Creating orphan pages

When a webpage is not linked to from anywhere else on the site, this creates what we call an orphan page. Without these connections, the page becomes very challenging for both users and search engines to find – signalling to a search engine that it is not a high priority. Internal links are also vital for the flow of SEO value (gained from building an authoritative domain) around a website.

When adding a new page, it’s essential that it is well linked to, with a strong internal linking structure. This can include in-content links (links within the copy), navigational links (links within menu structures), or breadcrumbs – see the sketch below. Otherwise, even if this new resource contains amazing content, it will struggle to rank well.
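
As a minimal, hypothetical illustration of those link types (all URLs and page names are placeholders), a new page might be linked via a breadcrumb trail and an in-content link from a related page:

  <!-- Breadcrumb navigation leading down to the new page -->
  <nav aria-label="Breadcrumb">
    <ol>
      <li><a href="/">Home</a></li>
      <li><a href="/services/">Services</a></li>
      <li>New Service</li>
    </ol>
  </nav>

  <!-- An in-content link from an existing, related page -->
  <p>Read more about <a href="/services/new-service/">our new service</a>.</p>
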
Reusing content

When managing a large website, the idea of creating unique content for each and every page can seem daunting. Whether on purpose or by accident, it is not uncommon to see remarkably similar content across several pages of one website. Perhaps it is no surprise that a study published in 2015 revealed that almost 30% of websites suffered from duplicate content issues.

This creates a problem for crawlers: when faced with multiple identical pages, they can struggle to identify the original. As a result, this traditionally leads to decreased rankings and traffic.

Neglecting local search

A study conducted by Google in 2014 revealed that more than 60% of smartphone owners used their device to access local information in adverts. Moreover, the organisation revealed that half of consumers who performed a local search visited a store that same day.

Although this aspect of search is deeply important, some webmasters don’t rate local optimisation that highly. As a result, they won’t focus on their Google My Business listing, NAP data, or local schema – and they could be missing out on an audience base close to home.

Scrapping strategies quickly

One misconception with SEO is the amount of time it takes for results to manifest. All too often, companies will cease strategies after several weeks – accusing them of being ineffective. However, the effects of search engine optimisation can take several months to materialise and should continue to improve as time goes on.

When running an SEO campaign, perseverance and patience are two very important qualities to possess.

How to report SEO success to your boss


Marketing managers often have a difficult time conveying the value of their SEO efforts to their boss, and are often left wondering how to demonstrate the channel’s true value.

But worry not…

Throughout this article, we aim to provide detailed insights into how to report SEO success in a way which your superiors will not only care about, but emotionally engage with – resulting in you receiving the plaudits you deserve.

Understanding KPIs and objectives

KPIs (key performance indicators) are the bread and butter of any marketing campaign, not just SEO. Before starting any work, you need a thorough understanding of your organisation’s KPIs, and to align them with your digital strategy to help achieve the business’ wider objectives. An example KPI may be that you wish to achieve an X% increase in revenue by next year, or that the organic channel generates X number of conversions in a certain period of time.

If you have an in-depth understanding of your objectives, you can continually demonstrate how your SEO results are contributing to achieving them. This will undoubtedly engage your stakeholders, resulting in the personal recognition which we all love!

Conversions

Whether you’re an e-commerce business aiming to improve revenue, or you’re a lead generation company wanting to increase the number of enquiries, you’ll want to be referring to these in every report and meeting you have with your stakeholders.

Ultimately, stakeholders don’t care whether we have achieved a 100% increase in traffic year on year, or if we’ve witnessed a 10% uplift in share of voice. They simply don’t see the commercial benefit of these metrics. Instead, they care about how SEO is contributing towards the wider objectives, which is why it’s essential to constantly refer back to the first point in the report. You can ensure that you’re able to report these accurately by tracking everything in Google Analytics (form completions, phone calls, clicks for directions etc.) and explaining how results correlate with offline media.

Visibility

Whilst rankings aren’t the most commercially relevant metrics for a campaign, they do represent the progress which you’re making. As a result, it is important to include a section about visibility in your SEO report, which hopefully shows progress Year on Year!

Furthermore, if you are able to communicate continued progress for priority commercial keywords, you’re likely to also witness improvements in other metrics, such as conversions. Moreover, if you can establish a connection between a particular improvement in visibility and an improvement in sales or conversions, this is going to connect more with your stakeholders, making them care more than if you were to simply outline the data.

Work completed and expected outcomes

Whether you report to your boss on a weekly, monthly or quarterly basis, you will need to outline the work which you’ve completed over the period and the expected outcomes. An example could be that you’ve built X number of high quality links to a particular area of the website, which you expect to improve sitewide visibility and authority over the coming months.

By outlining the work you’ve completed and explaining the projected outcome, you’re adding value and quantifying your role, which further enhances your boss’ understanding of why you’ve spent a proportion of your working week on a task. Furthermore, if you have an attentive boss who has a particular interest in organic search, this could present an opportunity to educate them on SEO best practices and improve their understanding of the channel.

Finally, we always recommend adding a section at the bottom of a report explaining the tasks planned for the upcoming period, so that bosses feel ‘in the loop’ and know what is going to be delivered by the next meeting.

Make your reports easy to digest

Most directors and stakeholders don’t have time to read through paragraphs and paragraphs of text; it is therefore essential that every report you generate is well structured, aesthetically pleasing and easy to digest.

Every person is different, so it is important to understand what your boss values within your report, and what they understand. Once you know this, you can begin generating your report with graphs and well-structured content – we’ve found bullet points and prominent headings often make content easy to digest.

If you want to speed up your reporting, you can use reporting software such as Data Studio or Report Garden, which allows you to pull through real-time data from Google Analytics. This data can quickly be turned into graphs and charts, which will improve the value of your report to stakeholders, thus better communicating the value of the channel.

Next steps:

If you would like to speak to us more about SEO and how we can grow your business online, contact us today to arrange a call.

If you would like to work at CandidSky and grow your career prospects, take a look at our Careers website for available roles and find out what it is like to work here.

And finally, take a look at our other blog posts to see what else we have been up to.

 

Local SEO Guide – The Ultimate Checklist


Introduction

Whether you’re a small, medium or large business, it is essential that you understand Local SEO.

By ignoring it, you are not only losing out on local demand for your products or services, but also missing an opportunity for increased visibility in the SERPs.

Throughout this guide we will discuss key factors and techniques to ensure that you have a thorough understanding of Local SEO, which will help you generate more website traffic, enquiries, physical visits and boost your company’s reputation. We have also put together a comprehensive Local SEO Checklist to ensure that you can audit your website performance for local search and strategise accordingly.

So, what is Local SEO?

Local SEO isn’t all that different from traditional SEO; it simply focuses on generating exposure for searches with local intent.

By focusing on local searches, your aim is to appear in both the organic listings and the local pack, thereby gaining greater SERP real estate. For example, if you’re a local Chinese restaurant, you will want to appear in both the organic listings and the map pack whenever somebody enters a query such as ‘Chinese restaurant in…’, ‘Chinese restaurant near me’ etc.

If you aren’t running a Local SEO campaign, you’re missing out on over 56% of mobile searches, which is only set to increase further with advances in technology.

Let’s get started…

Whether you already have a thorough understanding of Local SEO factors or you’re a complete novice, this guide aims to help increase your knowledge.

Some of the factors are backed up with facts, whilst others are controversial and arise from influencer speculation. As with everything in this industry, nothing is set in stone, but let’s get started…

External Factors

  • Authority of inbound links to the domain: Links have always been the primary trust signal that Google uses, and local rankings are no different. The authority of inbound links is still the number one ranking factor. You can take a look at your backlink profile using Ahrefs, which is arguably the best software for backlink analysis.
  • Authority of inbound links to Google My Business landing page URL: Google places particular importance on the Google My Business landing page (usually the homepage), therefore the quality of inbound links to this URL should be prioritised. You should aim to gain authoritative links with a high domain authority (DA) for the best results.
  • Diversity of inbound links: Since Penguin, a domain’s link profile has had increased importance, with Google favouring diverse link profiles. The days of having thousands of guest posts or reciprocal links are long gone; your SEO strategy now needs to acquire links from multiple sources in a natural manner.
  • Inbound links from locally relevant domains: Many SEOs overlook the importance of links from locally relevant domains, meaning this is an area many local businesses can exploit – for example, an inbound link from a local church could be just as valuable as a high authority link when it comes to local visibility.
  • Topical keywords used in anchor text of inbound links: Whilst abusing keyword-rich anchor text will result in a penalty from Google and land you in a messy situation, it can still be extremely powerful if done correctly. Your link profile should include a mixture of keyword-rich, branded and generic anchors from a variety of sources.
  • Distance between address and search location: If your business’ physical address is in close proximity to that of the searcher, you’re more than likely to appear higher in the SERPs.
  • Business Address is in the city of search: Many businesses have struggled with Local SEO due to being located outside of a city’s borders. As a result of the Google Possum update, this has changed and many businesses outside of borders have seen an increase in local rankings, however, it still seems that Google favours those based within a specific location.
  • Branded search volume: A recent study from Tom Capper outlined that branded search volume has a higher correlation with rankings than domain authority. As a result, increased social media advertising and PR could have a positive effect on your Local SEO strategy.
  • WHOIS information: Ensuring that your WHOIS information is consistent across the web can give a slight ranking boost for both local and non-local searches. Research shows that if your information is inconsistent, or even worse hidden, your rankings could suffer. You can read more about this here.
  • Authority of third party reviews: Whilst acquiring reviews on Google should be the primary objective, third-party reviews on websites such as Yelp, Yell, and Trustpilot can also boost your local visibility. Third party reviews can also act as a trust signal, provide referral traffic, and be marked up using Schema to display rich snippets.

Google My Business Factors

  • Verified listing: Not only does verifying your Google My Business page allow you to manage your listing, but studies have also shown that Google gives a small ranking boost to businesses when their listing is verified.
  • Category associations: Correctly selecting a Google category can often be one of the deciding factors in whether a website shows in the local map pack; it really is that important. Your category should describe what your company is, not what it provides. Google recommends being as descriptive as possible while choosing as few categories as possible – if you genuinely feel that more than a couple of categories are necessary, it is likely that Google will agree.
  • Google My Business title: It is recommended that your product/service keyword is included in your GMB title, however, it is important that this doesn’t appear spammy. If you over-optimise it can have an adverse effect on click-through-rates and hinder your local SEO performance.
  • Age of listing: Ultimately, the more mature your Google My Business listing, the more Google trusts the company. As a result, Google is more likely to place you higher in the map pack and organic results for searches with local intent if your page is older than that of a competitor’s.
  • Quantity of Google reviews: Google wants users to use its review feature, and 90% of customers state that buying decisions are influenced by online reviews. Because of this, Google uses the number of native Google reviews a business has as a factor in its local algorithm.
  • Quality of Google reviews: “Quality over quantity” is definitely relevant when speaking about reviews. Users don’t want to see hundreds of 4-5 star reviews with no context, and neither does Google; Google will therefore prioritise lengthy, topical and contextual reviews. These carry greater weight when ranking a website and also appear above reviews that Google deems ‘low quality’.
  • Including a local area code: Google’s guidelines suggest that you should always include the local area code within your business listing. Tip: this should match your NAP on both your site and third-party websites.
  • Adding relevant photos: Google favours businesses with photos from both the company and (if applicable) customers. Not only this, but new users will also have a better experience when they land on your GMB profile, particularly if there are high-quality images rather than a screenshot from Google Maps. This only takes a few minutes if you already have images, and could provide a quick win for your local SEO campaign.
  • Having a visible address: By hiding your address in Google My Business, you’re instantly shooting yourself in the foot when it comes to Local SEO. The primary focus of Local SEO is to gain a local presence, however, if you’re hiding vital information Google is likely to rank a competitor that’s being transparent higher in the local listings.
  • Listing hours of operation: You don’t want potential customers arriving at your premises to find shutters down, so why would Google? By listing your hours of operation you are giving Google more information and providing a better customer experience.
  • Correctly placed map marker: A simple mistake can be costly and that’s definitely the case here. By incorrectly placing a map marker, you could be directing users to the completely wrong location, which could result in a poor user experience, loss of revenue and even poor reviews.
  • Add posts to GMB: An update to Google My Business in the last quarter of 2017 allows you to add posts to your profile; these can be used to promote events, promotions, products, and your general services. The rich content will be delivered to users when they find your business listing on Google and is an excellent way to increase CTR. It is unknown whether Google will give a ranking boost to businesses using posts, but even if they don’t, they can still be a great way to drive interactions.
  • Respond to negative reviews: Although it is not confirmed whether this is actually used as a local ranking factor, it should be imperative for business owners to interact with, and try to resolve, negative reviews. By doing this regularly, you are not only showing new users that you care about customer satisfaction, but also increasing the possibility of resolving the issue with that customer, resulting in the negative review being removed.
  • Duplicate listings: Just as duplicate content has been an issue in SEO for years, duplicate Google My Business listings pose the same issue. If you have two listings for the same location and business, Google will often see this as spam and filter out one of the listings. It is also worth noting that if you are at a shared location where two companies have the same category, Google will often filter out one of the company listings.
  • Legitimate business address: Using a fake business address is something that Google has been tackling aggressively due to it being exploited to gain an advantage in the local search results. They’re tackling this by doing random spot checks on registered addresses and if you’re found to be violating the terms and conditions, your listing could be removed. In the past, fraudsters have registered P.O boxes to receive a postcard from Google, before verifying and changing their location within that geographical area, however, this is much less likely to work today.

Site Factors

  • Domain Authority: As one of the leading contributing SEO ranking factors, domain authority isn’t something that should be overlooked when undertaking a local SEO campaign. Although competition is bound to be less than a national or international campaign, to dominate your local field you need a high DA and trust from Google.
  • Topical Keyword in the domain: The days of keyword-rich domains are over since Google tackled the growing issue in 2012; however, a study from Canirank found that keyword-rich domains still rank 11% higher than branded domains. The main reason is that Google sees the website as having higher relevancy, combined with inbound links naturally including the keyword when it forms part of your business name, e.g. https://hotels.com/.
  • NAP address is featured sitewide: If you are a company with fewer than 10 physical addresses, each address should be added to your website’s footer. This should match your NAP in Google My Business and on third-party websites for consistency, and be marked up using Schema – see the sketch after this list.
  • Phone number structure: If your website is optimised for conversions, it is highly likely that you will have a phone number present in the header, footer and on the contact page. Your phone number should include the local area code, be clickable on mobile, be marked up using structured data and match your other citations (again, see the sketch after this list).
  • Outbound link quality: The outbound links from your website should all be of a high quality. If Google deems your outbound links to be of a poor quality then you could see this harm both your local and non-local rankings.
  • Outbound link themes: Relevance is just as important as quality when it comes to outbound links. As with inbound links, each outbound link should be relevant to the page it sits on, whilst linking to local pages also sends local signals to Google.
  • Outbound link volume: Whilst outbound links are important, it is also imperative that you aren’t linking too much or too little. If there aren’t enough outbound links on a page, you could be losing out on opportunities, whilst if you have too many outbound links, Google could deem this spammy which could negatively affect organic performance.
  • Presence of malware: Malware on your website will seriously harm your organic performance and can result in a manual penalty from Google. If you have malware present on your website, you may as well say goodbye to both your local and non-local rankings until the issue is resolved.
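
Picking up the NAP and phone number bullets above, here is a minimal, hypothetical sketch of LocalBusiness markup in JSON-LD plus a clickable phone link; every name, address and number shown is a placeholder, not a recommendation.

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Restaurant",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Example Street",
      "addressLocality": "Manchester",
      "postalCode": "M1 1AA",
      "addressCountry": "GB"
    },
    "telephone": "+44 161 496 0000"
  }
  </script>

  <!-- Clickable on mobile, and consistent with the NAP above -->
  <a href="tel:+441614960000">0161 496 0000</a>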

On-page Factors

  • Length of content: On average, a page which ranks on the first page of Google contains 1,890 words, according to a study carried out using SEMrush data. It is recommended that you look over competitor websites to get a true gauge of what is required in your industry, and aim to better their content in terms of both length and quality. We always conduct a content audit at the beginning of our SEO campaigns to ensure there is sufficient copy and to identify low-hanging fruit.
  • Site-wide content relevance: If you’re a local business, the content on all your pages should be not only relevant to keywords you’re targeting, but also reference locality. This can be achieved through techniques such as in-content links to local websites and including references of your target areas within the copy where applicable.
  • Content relevance on GMB landing page: The content on your Google My Business landing page is imperative for success in the local map pack. The content on this page (usually the homepage) should be relevant to the category you’ve selected and be locally relevant for the best success.
  • Google My Business landing page title: Title tags (also known as meta titles) are one of the key elements of on-page SEO. As a result, you should ensure that the title tag for your Google My Business landing page is relevant both topically and locally. Tip: remember that your title tag will appear in the SERPs – if it appears spammy, it can harm CTR.
  • Header tag relevance: Header tags carry more weight than the rest of the text on a page, therefore it is important that all of these are optimised. For Local SEO it is best practice to try to include both your primary keyword and area in your H1 heading.
  • HTML NAP matches Google My Business: Your NAP (name, address, phone number) should be consistent across the web, but most important of all is that your Google My Business NAP matches the one that appears on your website. It is also recommended that you mark up all relevant information using Schema/JSON-LD to ensure that Google recognises the data.
  • Optimise for ‘near me’ searches: ‘Near me’ searches are growing rapidly as more people perform local searches on portable devices. As such, it is important that your website is optimised for these, which can be achieved by tailoring your content, ensuring that your GMB signals are perfect and managing citations/links.

Social Media Factors

  • Facebook page likes: Page likes act as an authoritative signal for Local SEO, with Google more likely to trust a business which has 4000 page likes, as opposed to one with just a few hundred. Not only this, but the larger your Facebook following, the more users you are reaching out to with every post, which can increase shares (another trust signal).
  • Age of Facebook page: Just as humans are more likely to trust an adult over a child, Google trusts a Facebook page more the more mature it is.
  • Facebook shares: Share-worthy content is a must for any successful campaign, and there has been speculation about the link between social shares and organic rankings dating back to 2010. Many tests have been carried out over the years and, although nobody is certain, a recent study by Larry Kim suggested that pages with more Facebook shares are more likely to rank in the top 10 results for both local and non-local searches. By attracting more Facebook shares, you are also increasing your potential audience and the possible attention of influencers, who could provide invaluable inbound links.
  • Facebook reviews: As mentioned earlier in the guide, Google uses reviews as a trust signal, and Facebook reviews act as one of the major third-party review signals. Although not as important as Google reviews, Facebook reviews are an important Local SEO ranking factor and can increase the chance of a conversion from new users. Recently, Google has also started displaying third-party review data alongside your Google My Business listing.
  • Twitter followers: Just like Facebook page likes, Twitter followers are a trust signal that Google uses when analysing a business’ online presence. The more you have, the more likely it is that Google will recognise you as a trustworthy local business.
  • Twitter engagement: Likes, retweets, mentions and replies are all metrics that business owners use to judge the success of a Twitter campaign, and the same is true of Google. Not only this, but if you have a tweet that performs particularly well, it can actually display in the SERPs as a structured snippet and can be great bait for generating inbound links.
  • Snapchat Geofilters: Although Snapchat provides no crawlable content, it is a great tool for local businesses to advertise and should be integrated into every local SEO campaign. Once you have created your Geofilter and it has been approved by Snapchat, you can expect buzz from other social media outlets; you should also reach out to local media outlets, with the aim of generating invaluable local inbound links to your website. For the best outreach results, it is also recommended that you have a landing page on your website that is relevant to what you’re promoting.
  • Social Links to and from your site: There is no point investing resources in your social channels if search engines can’t link them to your website, therefore it is recommended that you link to your social pages on every page of the website (usually within the header/footer) and link to your website from your social pages.
  • Topical bio: Your social media bios are a great opportunity to add both topical and locally relevant content. It is recommended that the content is unique and tailored to your audience on each channel, for example, your bio on Linkedin should be different to that on Twitter due to both of these having a different audience.
  • Consistent NAP references: You’re probably bored of reading about your NAP by now, but it really is that important for a successful Local SEO campaign. Social media pages provide further opportunities for unstructured NAP references, which should be consistent with the rest of your citations.
  • Interaction: Engage, engage, engage is what you hear from all social media experts, and this is also true if you want to drive a successful SEO campaign. By engaging with relevant users, you are much more likely to gain social shares, increasing the possibility of inbound links and improved rankings.
  • Apple Maps Connect: A bit off track with this one, but it’s something that is missed by a lot of local business owners. Although Apple Maps isn’t generally used for non-branded searches, it’s still a great way to capture local branded traffic. By listing your business using Apple Maps Connect, you can ensure firstly that your company will show in Apple Maps, and secondly that you can manage your listing.

Citation Factors

  • Consistency of Citations: Consistency is key when building citations and if you get it wrong, it could confuse Google and have an adverse effect on your local rankings. Initially, we would recommend building citations through directories such as Yell, Yelp, Foursquare and Thomson Local. By stepping into Google’s shoes, it’s easy to understand why they would be more confident displaying a website which has 40+ consistent citations over one which is sending mixed signals.
  • Quality/Authority of Structured Citations: When selecting where to place your structured citations, you should think about both the quality and the authority of the website you’re placing them on. A study from Search Engine Land showed that the most important aspects to consider are industry relevance, local relevance, domain authority and the number of competitors already listed.
  • Quality/Authority of Unstructured Citations: While structured citations are most important, you shouldn’t overlook the importance of unstructured citations. These are citations which come from sources such as newspaper articles, blog posts, government websites and industry associations. If you want to dominate the local pack in your niche, it is unstructured citations which can give you the competitive edge.
  • Citations from locally relevant domains: Where better to place your NAP than on a website that relates directly to your geographical area. These websites include local news outlets, blogs and community websites.

Technical Factors

  • Relevant landing pages: For the best results you need landing pages which are relevant both topically and locally. Without a relevant landing page for each of your products and services, it is increasingly difficult to send the signals to Google of what you do. If you are trying to target too many topics on one page, it is likely that you won’t succeed due to the optimisation being diluted.
  • Mobile-friendly website: Since the mobile-friendly and Mobilegeddon updates, Google has been pushing for every website to be designed for mobile. If your website is mobile friendly then Google will give you a small ranking boost and it is expected that Google will roll out the mobile-first index imminently, which will see websites that aren’t mobile-friendly drop for both local and non-local results.
  • Site architecture: High-quality internal linking is just as important as inbound links. Internal links pass authority from page to page, therefore if a page is deep in your link structure, Google will deem it to be a low priority which will affect rankings. It is recommended that no priority landing page is more than 3 clicks from the homepage, with the highest priority pages featuring in the primary navigation. Methods for improved internal linking include adding structured breadcrumbs, including in-content links and having a flat site-architecture.
  • Structured data markup: While not directly affecting rankings, structured data is something you can’t ignore if you want to increase your organic visibility. If you mark up information using structured data, it can be used by the search engines to generate rich snippets and gain you greater SERP real estate. For local businesses, the most popular schema markup includes address, opening hours, price range, reviews, and breadcrumbs – you can see a full list of structured data options on Schema.org.
  • Duplicate content: One of the biggest issues for SEO is duplicate content, which can single-handedly cause huge problems for a campaign. If your content isn’t unique and is a direct copy from elsewhere on your website or an external website, it is highly likely that performance will be less than impressive. Instead, you are likely to get a cannibalisation issue in which you have pages competing against each other for the same phrases, which will cause rankings to fluctuate. For many local businesses who don’t have a lot of services, this is likely to be less of an issue, but for a website with many pages, it can be harder to manage.
  • Canonical tags: A canonical is an HTML tag placed in the <head> section of a page to prevent duplicate content issues. Each page should include a canonical tag to ensure that the preferred URL is the one indexed by the search engines – either a self-referring canonical or a canonical pointing to the preferred URL (see the snippet after this list). For local retailers with an e-commerce website this is particularly important; however, it also matters if your website displays multiple versions of the same URL. For example, https://www.test.com/Men/clothing.php?sessionid=273749 is the same page as https://www.test.com/Men/clothing.php, so you would specify the canonical URL in the page’s source code.
  • URL structure: URLs are a touchy subject with webmasters; however, for local SEO the best structure is always a single domain or subdomain followed by a keyword-rich slug. Where possible, it is recommended that you avoid dynamic URLs, use fewer than 60 characters and try to match the URL with your H1 tag. For Local SEO, it is always preferable to use the topic of the page as the slug, followed by the area where applicable.
  • SSL certificate: HTTPS has been a (lightweight) factor in Google’s algorithm since 2014, meaning that websites using SSL encryption get a slight ranking boost. For local businesses fighting for less competitive keywords, switching to SSL could provide a boost that sees your rankings rise, and it should be fairly easy to implement with the help of a developer or hosting company.
  • Hreflang attribute: The hreflang attribute tells the search engines which language (and, optionally, region) a specific page targets, which helps them serve the page to users searching in that language. It is particularly important if you have versions of the same page serving different countries. For example, if you have two versions of a page targeting the Republic of Ireland and the UK, you would need hreflang tags to tell Google which page to serve in each location (see the snippet after this list). If not implemented correctly, you could experience cannibalisation issues, high bounce rates, and low conversion rates.
  • Server location/site speed: Google states that the third most important geo-targeting signal is server location. It is recommended that your server is located in the same country as the majority of your customers – this matters most if your server is slow and you have a long TTFB (time to first byte), which will increase bounce rate and cause usability issues. If you are a local business in a competitive industry, users aren’t going to wait around forever, so you want your site speed to be as quick as possible!
  • Country code top-level domains (ccTLDs): ccTLDs are generally utilised to indicate that a website is relevant to a particular country or region, for example, if your website is targeting the UK you should use the following ccTLD: .uk. Google says that this is the strongest signal you can send that your content is targeting a specific country, and as such is particularly important if you’re running a Local SEO campaign. If you correctly implement ccTLDs, hreflang attributes and have your server in your targeted country, you are sending all of the right signals to Google.
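
To ground the canonical and hreflang bullets above, here are two minimal, hypothetical snippets for a page’s <head>; the clothing URL reuses the example from the canonical bullet, while the /uk/ and /ie/ URLs are invented purely for illustration.

  <!-- On the session-ID variant of a page: point at the clean, preferred URL -->
  <link rel="canonical" href="https://www.test.com/Men/clothing.php">

  <!-- On each regional version of a page: tell Google which version serves which market -->
  <link rel="alternate" hreflang="en-gb" href="https://www.test.com/uk/mens-clothing/">
  <link rel="alternate" hreflang="en-ie" href="https://www.test.com/ie/mens-clothing/">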

Behaviour Factors

  • Click-through-rate (CTR): You can calculate CTR by dividing the number of clicks by the number of impressions – for example, 50 clicks from 1,000 impressions is a 5% CTR. This data is available to all webmasters through Google Search Console. Google uses this data to establish whether or not your website is relevant to the user intent, which can then feed into how it ranks you for a particular keyword. For example, if you are a local Chinese takeaway and your CTR is low for ‘Chinese takeaway near me’, it is likely that your rankings will decrease for that query. To increase CTR, ensure that your titles and meta descriptions include USPs and are written to stand out. You can also use schema markup to show structured snippets in the SERPs.
  • Time spent on page: Because Google uses time spent on a page to help define Ad Rank in AdWords, it is speculated that it uses the same metric for organic results. With that in mind, you should ensure that when a user lands on your website they have a positive experience and spend some time browsing the available content. You can increase the time a user spends on your page by adding visual content, creating engaging text and ensuring your landing page is relevant to the query.
  • Bounce rate: Bounce rate measures whether somebody stays on the page or leaves without navigating to another page (you can view it in Google Analytics). A high bounce rate usually points to slow page load times, usability issues or a page that isn't relevant to the query – and if Google suspects a usability issue, it won't want to send users to the site, resulting in lower rankings.
  • Number of clicks to call: Users can call a business directly from the map pack, and Google has access to all of that data! If your business receives many calls relative to impressions, Google is more likely to believe you are a trusted local business and rank your website higher as a result.
  • Directions to business clicks: Most local businesses want customers to visit their premises, and Google collects data on clicks for directions. Google has said the signals it gathers range from driving patterns and review counts to temporal signals and direction information, all of which can help determine the rankings of your local business in the SERPs. For example, if Google can see that people are willing to travel further to your business than to one of your competitors, it signals a popular, trusted business.

Time to conclude…

So what are you waiting for? Grab a coffee and begin to put your Local SEO knowledge to use!

It’s easy to identify where you can improve your local presence by working through our Local SEO checklist – it will help you identify areas for improvement and plan how to blast the local competition out of the water.

We hope you enjoyed our comprehensive local SEO guide. If you have any questions, please feel free to get in touch.

Frequently asked SEO questions

Why have my rankings dropped?

Rankings are the bread and butter of any SEO campaign, therefore monitoring your rankings on a daily basis is essential.

But what do you do if your visibility has decreased overnight and why might this happen?

All is explained below:

The most likely reason a particular keyword has dropped is natural fluctuation in organic search. Hardly a day goes by without some ranking flux, and you will usually see your rankings recover in the coming days. If there is a significant drop, however, you may have an underlying issue on your site that needs further investigation.

Secondly, there may have been a quality update by Google – these happen on an almost daily basis. Because the digital landscape changes constantly, it is essential to stay up to date with the latest Google updates and ensure you’re working within Google’s guidelines.

Another possibility is that the ranked page has lost some authority due to a loss of inbound links. Monitoring your backlink profile is something that you should be doing on a regular basis, and doing this religiously will ensure that you spot a problem before it affects your visibility. Our favourite tool for monitoring backlinks is Ahrefs.

If you see a site-wide drop in rankings, you may have been hit by a manual or algorithmic Google penalty. The difference between the two is that a manual penalty is applied by a Google employee, whilst an algorithmic penalty is automatic and usually caused by a Google update. A telltale sign of a penalty is a sharp drop in your Google organic traffic whilst rankings remain stable on other search engines.

If you have been hit by a Google penalty, you can check this in Google Search Console by visiting ‘Manual actions’ – this will give you more information on why you’ve been penalised, helping you devise a solution. Note that only manual penalties appear here; algorithmic drops won’t be listed.

Finally, there could be technical issues on your site hindering your SEO efforts. Technical SEO is often one of the most overlooked aspects of a campaign, yet it is an area where you can reap the highest rewards. We recommend conducting regular technical audits to ensure that your website is in good health – something we do for each of our clients.


How do we gain visibility for featured snippets?

It’s now been three years since Google launched Featured Snippets in the SERPs; however, many digital marketers have failed to incorporate them into their SEO campaigns. But why would you not want to rank above everybody else in the organic results? It doesn’t make sense to let a competitor claim that SERP real estate.

Firstly, you need to understand the opportunities available in your niche, and whether featured snippets are worth your investment. For example, if you’re a local brick and mortar business, we would instead recommend focusing on the organic and map pack results, as these would reap the highest ROI. On the other hand, if you’re in a market where there are a lot of questions, featured snippets will present a huge opportunity.

Once you’ve identified that featured snippets are relevant to your niche, you should begin conducting keyword research and competitor analysis to identify where featured snippets exist and whether you can create content that’s better than what already ranks. Alternatively, you can use SEMrush to identify opportunities through their featured snippet analysis tool, or, like us, use STAT to identify which of your tracked keywords feature an “answer box”.

Once you have identified opportunities within your niche, it’s time to create content focused on acquiring position #0. This content will need to clearly answer the question, have positive user engagement signals and have clean code that is easily digestible for Google.

Does page load speed impact my organic performance?

The answer is simple – page load times absolutely affect your organic performance. Google has confirmed on its Webmaster Central Blog that site speed is used in its algorithm, and we have seen multiple instances across our SEO campaigns where improving site speed has resulted in improved organic visibility.

Ensuring that pages on your site load quickly is also essential from a user experience perspective. A slower page load speed tends to cause a higher bounce rate, less time spent on a page and ultimately, fewer conversions. We recently wrote an in-depth blog post on how to improve your page load speed, which is definitely worth a read!


How often should I publish informational content?

Publishing content is something that we touched on in our recent SEO myths article; however, a common question is how often a business should publish informational content.

The truth is, there is no definitive number of resources you should be publishing in a specific time frame. It is more important to focus on creating informational content that will provide value to the user, capture customers at different stages of the sales funnel and gain exposure in the SERPs.

Creating one piece of exceptional content is much more valuable than 100 resources that provide no value to users or the search engines.

Next steps:

If you are interested in working with CandidSky and seeing how we can help to grow your business online, contact us today to arrange a call.

If you would like to work at CandidSky and grow your career prospects, take a look at our Careers website for available roles and find out what it is like to work here.

And finally, take a look at our other blog posts to see what else we have been up to.

SEO myths – revealing the false claims

Over the years, it’s safe to say that we’ve heard and rebuffed hundreds of SEO myths. Often these come from new clients who have unfortunately been given out-of-date or plainly inaccurate information in the past. But why is there so much false information being shared?

The truth is that the digital landscape is constantly evolving, so something that worked in the past may not work today. It’s essential to remain up-to-date, relevant and ahead of Google’s updates to truly succeed in SEO – by listening to false SEO myths, you could be harming your organic performance.

Some of the common myths we’ve heard (in no particular order) are discussed below:

1) Create amazing content and outrank your competition

Since Panda was introduced in 2011, there has been a frenzy about creating high quality content. We’ve even heard people stating that your website will be penalised if your content is deemed low quality by Google. This is definitely FALSE.

Whilst creating high quality content is important, it isn’t enough by itself. A recent study from Brian Dean explains how creating amazing content is just the beginning of your journey.

Once you have created your content it is time to promote. Here are a few ideas:

  • Social media – whether you already have a loyal social following, or you promote your content through targeted ads, social media shouldn’t be overlooked. If you can get your content shared by influencers, you could earn valuable links.
  • Influencer marketing – the Kardashians are living proof that influencer marketing is a huge industry and one that shouldn’t be ignored. Whilst it’s unlikely that you’ll get somebody that big to engage with your content, there are influencers in every niche that you can get to promote your content. By interacting with influencers you are increasing your reach, gaining loyal customers and increasing the authority of your website.
  • Build backlinks – Google may advise against intentionally building backlinks, however, your content won’t reach its full potential if it can’t attract inbound links from authoritative and relevant sites. On the other hand, link building is a time-consuming and highly skilled task, therefore if you get it wrong you could be in trouble. For that reason, we recommend using a reputable SEO agency (such as ourselves) to build high quality links.

The final step is crucial…

Once you’ve created an amazing piece of content and increased the page’s authority, you will need to monitor progress through a rank tracker (our favourite is STAT), Analytics tool (Google Analytics) and backlink monitor (Ahrefs).

These tools will help you to improve your content and monitor the progress of all your hard work!

2) Meta tags aren’t important for SEO

Meta tags are snippets that appear in the <head> section of a page’s code and are generally used to communicate information about the topic of a page to the search engines.

The most common meta tags are:

  • Title tag – the title of your page; this appears at the top of your browser and in the SERPs.
  • Meta description – a brief summary (up to around 160 characters) of the page’s content.
  • Meta keywords – a list of keywords that you deem relevant to the page.
  • Meta robots – instructions that tell search engine crawlers what to do when they land on your page. All four tags are illustrated in the example below.
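To illustrate how these sit together (all values are invented placeholders), here is what they might look like in a page’s <head>:

```html
<head>
  <!-- Title tag: shown in the browser tab and as the headline in the SERPs -->
  <title>Artisan Coffee in Manchester | Example Brand</title>
  <!-- Meta description: the snippet shown beneath the title in the SERPs -->
  <meta name="description" content="Award-winning coffee roasted in-house. Visit one of our three Manchester stores today.">
  <!-- Meta keywords: ignored by Google, as discussed below -->
  <meta name="keywords" content="coffee shop manchester, artisan coffee">
  <!-- Meta robots: allow indexing and link following (the default behaviour) -->
  <meta name="robots" content="index, follow">
</head>
```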

When we hear this statement, we assume people are referring to meta descriptions and meta keywords, as it is common knowledge that the title tag is one of the most important on-page elements, whilst meta robots plays a clear technical role in controlling crawlers.

We will start with the meta description…

Whilst meta descriptions don’t directly contribute to your organic rankings, they are an essential element for driving click-throughs from the search engine results pages (SERPs). Many recent studies also suggest that pages with a high click-through rate (CTR) receive a ranking boost from Google, increasing your organic visibility over time.

Your meta descriptions should include a brief description of the page, USPs (unique selling points), CTAs (calls to action) and a semantically linked keyword (this will be emphasised in the SERPs).

Now onto meta keywords…

For SEO newbies: you won’t remember the days when Google used meta keywords in its core algorithm – support was dropped because webmasters spammed the feature by inputting hundreds of variations of the same keyword!

As of 2017, Google ignores meta keywords, but this doesn’t mean they’re dead. Other search engines such as Bing and Yahoo may still use meta keywords in their algorithms, so adding a few relevant phrases to each page can increase your organic footprint.

Third-party directories can also use them to determine how to categorise pages on your site.

3) The more pages on a site, the better your rankings will be

Quality over quantity is definitely true when it comes to the number of pages on your website. A common misconception is that you need hundreds, or even thousands, of pages to be deemed a high quality site – however, this is just another common SEO myth!

Less is more 99% of the time.

We recently worked with a client on restructuring their informational content, which resulted in removing hundreds of pages! Although it wasn’t an idea that was initially welcomed, it has resulted in fewer, but higher quality pages, less cannibalisation (two pages competing against each other) and increased engagement.

The moral of the story: concentrate on generating high-quality pages that deliver value to the user, rather than a mass of low-quality pages created purely for SEO purposes.

4) Your website will get penalised for duplicate content

Whilst duplicate content can hinder your organic performance, it certainly won’t result in a penalty from Google, despite what you may have been told. This was confirmed back in 2013 by Google’s head of search spam, Matt Cutts, who announced that it’s nothing to stress about unless you have spammy duplicate content.

A much more likely outcome of having duplicate content on your website is cannibalising pages, which could result in high ranking fluctuations, or a page not ranking due to being deemed low quality.

If you’re facing a duplicate content issue, feel free to get in touch and speak to an expert!

5) Nofollow links have no value

Any digital marketer worth their salt knows that nofollow links are an essential component of an SEO campaign. Whilst they do not pass authority from one website to another, there are huge benefits to acquiring nofollow links (an example of the markup follows this list), including:

  • Having a healthy link profile: if you are building links, you need to have a healthy mixture of both follow and nofollow inbound links in order to avoid possible penalties from Google. If your link profile looks unnatural to an SEO, it is also going to appear unnatural to Google which could land you in serious trouble.
  • Referrals: many major publications only use nofollow links; however, these links can still pass huge value to your website through referral traffic. As if an influx of referral traffic isn’t enough, being featured on well-regarded websites also strengthens your brand image.
  • Linkbait: you can use nofollow links as bait to attract dofollow links to your site, which will pass through that all-important link juice. This is a completely natural way to acquire links!
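For reference, the markup itself is just a standard anchor with a rel attribute (the URL and anchor text are placeholders):

```html
<a href="https://www.example.com/" rel="nofollow">Example anchor text</a>
```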

6) Running a PPC campaign will improve your organic results

It’s surprising how often this one is discussed, despite no reputable source ever claiming it’s true.

AdWords and organic are completely separate entities that don’t affect each other’s performance, and for most industries, we would recommend running an SEO and PPC campaign alongside one another for the best results.

So if you were thinking of spending thousands on AdWords in an attempt to increase your organic rankings, please resist and instead speak to an expert about the benefits of both mediums!

7) Your website platform makes a difference to your organic performance

As a digital agency, we have experience with a range of Content Management Systems (CMS), the most popular being WordPress and Magento. A common SEO myth is that you need a website built on a certain platform to perform well in the search engines – but the platform itself is not what matters.

Providing that you have access to the code and a skilled web developer, you can ensure that your website is built for maximum organic performance. Our only advice would be to choose your developer wisely and avoid website builders such as Wix and Divi, as these will almost certainly hold back your SEO efforts regardless of budget.

8) You need X% keyword density to rank for a particular search term

This myth is a blast from the past, but fortunately, the times of aiming for 3-5% keyword density are long gone!

As Google’s algorithms have developed and become more sophisticated, they’re able to better understand natural language and deliver a better experience for the user. This has ultimately resulted in the death of keyword stuffing and if you or your SEO agency are still discussing this concept, it’s definitely time for a rethink!

Instead of aiming for a particular keyword density, you should instead be creating content that is relevant, high quality and uses semantic keyword targeting.

Wrapping things up…

These are just a selection of SEO myths that we’ve been questioned about recently, however, there are hundreds more we could’ve touched on.

If you want to discuss any information you’ve received, feel free to get in touch with our SEO team, join us for a coffee or have a chat with me.

If you would like to work at CandidSky and grow your career prospects, take a look at our Careers website for available roles and find out what it is like to work here.

And finally, take a look at our other blog posts to see what else we have been up to.

Earning links – the good and the bad

We are all aware that a website needs links to rank well, but the big question is: what makes a good link, and what makes a bad one? Google’s guidelines will tell you that links need to be earned naturally, which is true to an extent; below, we guide you through the dos and don’ts of acquiring links.

The Good

When analysing a company’s link profile, the first thing to assess is whether it looks natural. Is there a sudden spike in referring domains? A spurt of links from one IP? Does it look like a PBN (private blog network)? These are all things that need to be considered, and if your link profile does look unnatural, we recommend speaking to an SEO expert.

It’s true that building links the right way takes more time and skill, but the rewards are far greater. Get it wrong, and it can often be time-consuming and costly to fix.

Create Visual Assets
Everyone worth their salt knows that visual assets are great link bait. The primary reason: every time somebody shares your infographic, image or diagram, you get a link back – which isn’t the case when you present the same information as plain text.

Visuals aren’t only beneficial for link building, though – they are proven to increase the time a user spends on a particular page (which case studies suggest Google uses as a ranking signal), and they can be used across your social channels to drive engagement, shares and reactions.

Create The Ultimate Guide
If you are an expert in your industry, there’s no reason why you can’t create the ultimate guide! This means that you essentially create a great resource that tells the user everything there is to know about a topic relevant to your niche.

Once you have chosen a topic to create the ultimate resource, it’s time to generate something that’s engaging, packed with information and most importantly, share-worthy.

Despite the myth, creating amazing content isn’t enough alone. You need to promote that content and get it seen by the right people, which is where good old fashioned outreach comes into play.

Guest Blogging
The days of churning out poorly written, keyword-rich content and placing it on a third-party website are long gone; however, done correctly, guest blogging can still be an effective method for boosting your website’s rankings and authority.

In today’s environment, it is imperative that these articles are written to provide value to the user and not solely for the acquisition of links. If you are guest blogging for the sole intention of creating links, it is almost guaranteed that this technique won’t work and could potentially harm your rankings.

Because Google penalises sites that use guest posting purely as a link building technique, we recommend planning cautiously and using it to boost your brand awareness and reach a wider audience in your niche. Do this well and you will see inbound links start to flow in as a result!

Niche Directories
Going back a few years, directories were king when it came to link building. Unfortunately for many marketers, this technique was buried when Google rolled out Penguin and many sites that were listed in thousands of directories were penalised.

Despite this, niche directories can be an extremely valuable link source. These aren’t traditional-looking directories that list multiple websites under a specific category; instead they could be networking groups, industry-specific associations or trade organisations’ websites.

Here are a few things to consider:

  • Does the directory accept anybody that pays a fee? Avoid.
  • Does the directory publish content from the source that can be indexed? Avoid.
  • Does the directory regularly assess its outbound links to ensure all sites are still live? Consider.
  • Does the directory have a low spam score? Consider.
  • Does the directory feature a lot of keyword-rich anchor text? Avoid.

The Bad

Link building is an art that requires practice, skill, patience and most importantly, time. If you use the wrong techniques then you can expect to have wasted a lot of resources and could even incur a penalty from Google.

Below are some of the link building techniques that you should avoid at all costs.

Private Blog Networks
Since Penguin arrived, we have seen Google penalise private blog networks (PBNs) and the sites they link to. Back in 2014, Google started sending out thousands of manual action notifications to webmasters participating in these schemes, and it continues to do so to this day.

Many bloggers have reported using PBNs for short-term gains, however, if you are serious about promoting your business’ online presence in the long term, you should steer clear. It’s easy for Google to recognise these networks and once they do, you will be hit with an instant penalty which will cause serious headaches.

Comment & Forum Spam
Some business owners may consider posting a link to their website on forums and comment sections of blogs, however, from an SEO perspective, this technique is going to do a lot more harm than good.

Not only does it damage the perception and value of the brand, but this technique is also likely to pass on no benefit in terms of links, due to most forums and comment sections utilising “rel=nofollow” for all outbound links.

By using this technique you are playing with fire – there is little benefit to be had, whilst you risk a link-based penalty.

Automated Link Schemes
Link networks continue to be a problem despite Google aggressively tackling them for a number of years. Despite the high chance of a penalty, many unknowing business owners sign up to automated services that result in thousands of unrelated, spammy incoming links – and we can assure you that it isn’t a simple process to rectify!

Ads that promise things such as “Boost your rankings quickly” or “Get 1000 backlinks for $50” are the most obvious signs that you’re signing up to an automated link network; however, many dud SEO agencies also use them for quick results. In reality, we often see pages penalised or, even worse, the website de-indexed entirely, disappearing from the SERPs until the links are disavowed.

In order to avoid automated link schemes, it is important to consult with a knowledgeable SEO expert.

Sitewide Links
The best practice for link building is to place the anchor text naturally within a body of relevant text. A link placed in a site’s footer, header or sidebar is automatically given less weight by Google, because webmasters have spammed these locations in the past.

In general, Google’s algorithm will either completely ignore inbound sitewide links or pass very little weight through them. Google’s guidelines recommend that sitewide links appearing as a result of things like advertisements and royalties are nofollowed.

Wrapping things up…
Links are still the most important ranking factor in Google’s algorithm, however, if you get it wrong you could be in serious trouble and face a penalty.

We would love to speak to you regarding your digital strategy and offer our expert advice. If you would like to discuss your requirements, feel free to drop us a message or pop in for a coffee and discuss all things digital.

Yoast SEO auto-applying ‘disallow all’ to robots.txt


Yoast is the most common SEO plugin for WordPress, with 40 million downloads to date. The plugin allows you to customise your meta tags, have complete control over breadcrumbs, ensure your content is readable and much more.

One of the advanced features available on Yoast SEO allows you to generate a robots.txt and sitemap.xml file. Both of these files provide guidance to bots on how you want them to crawl your website, however, as they’re often auto-generated, technical issues can arise.

This morning, we recognised a crawling issue on multiple websites that use WordPress, in which Yoast had auto-applied a disallow rule to the robots.txt file (see screenshot below). The result is that search bots are blocked from crawling every page on the website, which poses obvious problems from an organic perspective. Luckily, we spotted this early, meaning none of the websites we manage have suffered any lasting damage – but it could’ve been a totally different story had this been missed, and organic visibility could’ve dropped off completely.

Yoast Robots.txt example

We have investigated possible reasons for this occurring on some WordPress websites but not others; however, we were unable to pinpoint the source of the issue or recognise any patterns.

How do I check if my site has been affected?

The easiest way to check whether your website has been affected is to use the robots.txt Tester in Google Search Console – this tool allows you to check for errors and test whether specific bots are able to crawl a specific page on your website. For this purpose, set the test to check whether Googlebot can crawl your homepage.

If Google can’t crawl my website, what do I do?

If the test results come back negative, you will need to take immediate action; fortunately, it should be a straightforward fix if you follow the steps below:

  1. Login to the CMS of the affected website
  2. Navigate to Yoast & choose ‘Edit files’ or ‘Tools’ (depending on your version of Yoast SEO)
  3. If your version of Yoast uses ‘Edit files’, you will be able to update your robots.txt from the page that opens, however, if your version of Yoast displays ‘Tools’, you will need to click ‘File Editor’ in the next window.
  4. Now you will need to edit the robots.txt file – you can either copy the template shown after these steps or use one of the many tools available online to generate the content of the file.
  5. Hit ‘save changes to robots.txt’.
  6. Your robots.txt file is now updated on the server, however, there is one more step – you will now need to go back to the robots.txt Tester in Google Search Console and submit your new robots.txt file.
  7. As always, test that your file has updated and that Google recognises this by refreshing the page a few minutes later.
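For step 4, here is a minimal, WordPress-friendly robots.txt template you could adapt (the sitemap URL is a placeholder – swap in your own domain and sitemap location):

```
# Allow all bots to crawl everything except the WordPress admin area
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap_index.xml
```

Note the difference from the faulty file: an empty or narrowly scoped Disallow rule permits crawling, whereas “Disallow: /” blocks the entire site.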

Almost half of the WordPress websites we manage had disallow rules auto-applied by Yoast SEO, and we believe this could be a widespread issue affecting many websites – so if you have any friends, colleagues or people in your network running a website on WordPress, let them know and be sure to share this article to help others.

Can you help?

We have created a PDF to act as a single resource for investigation by Yoast. At this time we do not have enough of a pattern to pinpoint the cause of this issue, but if you experience the same, please share it here.

Need for Speed: Why a fast website matters more than ever

When it comes to website optimisation, ensuring your site speed is up to scratch makes a huge difference.

It doesn’t really matter what performance metric you’re basing success on – whether it’s goal conversions, revenue, search rankings, whatever. If you have a slow website, then any and all of these are going to be harder to achieve.

The average mobile website takes around 20 seconds to load on a 3G connection, yet research has shown that roughly 54% of users will bounce if a page takes longer than 3 seconds to load – a gap that can have a devastating effect on your visitors and potential conversions.

A faster-loading website simply delivers a better user experience. With internet connections becoming faster and faster, we have come to expect a fast experience on all our devices; yet while our devices get stronger and quicker, the vast majority of websites are struggling to keep up and are being left in the dust. A fast page load time is quickly becoming an imperative part of a good user experience, ensuring the correct content is shown to the user in as short a time as possible, keeping them happy and engaged.

Fortunately, there are quite a few ways in which we can optimise for speed in order to make sure your site is running as fast as possible. We’ve decided to mention a few of them below.

Minimise your CSS and JavaScript usage

One of the most common areas for obvious streamlining optimisations tends to be under the hood, looking at things such as your site’s CSS (Cascading Style Sheets) and JavaScript code.

By carrying out some “minification” (the process of stripping unnecessary characters from your code), some essential yet quick wins can be achieved with regards to overall site speed.
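As a simple before-and-after sketch (the class name is invented), minification strips whitespace, comments and other characters the browser doesn’t need, without changing behaviour:

```css
/* Before minification: readable, but full of redundant characters */
.hero-banner {
    margin: 0 auto;
    padding: 16px;
    color: #ffffff;
}

/* After minification: identical result, fewer bytes over the wire */
.hero-banner{margin:0 auto;padding:16px;color:#fff}
```

In practice, this is best left to an automated build tool or plugin rather than done by hand.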

Minimise your redirects

When migrating from one site or page to another, the most common way of passing along users and authority from one to the other is with the use of permanent (301) redirects. However, there is a downside to this.

Having too many redirects can stop your site loading as quickly as it should, as the server has an additional step to complete before users land on the correct page. While redirects tend to be a necessary evil, it is best to use them only when truly needed.

Choose your hosting provider carefully

To make sure your content loads as quickly as possible, it’s also important to choose your site’s host carefully. Keep in mind that bandwidth and traffic demands on your site tend to change over time, so ensuring that your hosting provider will adapt to your site’s changing needs is very important.

Strip down for Google AMP

The Accelerated Mobile Pages (AMP) Project is Google’s answer to Facebook’s Instant Articles. Like Instant Articles, AMP is a service which reformats your content for faster display on mobile devices.

Keeping your site fast

As you can see, there are a number of ways in which site speed can be addressed and remediated. Remember to keep in mind that a slow loading website can have negative repercussions on traffic, as well as costing you conversion opportunities and potential revenue.

On your marks… get set… optimise!

May the (SEO) force be with you

Earlier this year, I made the move to agency life at CandidSky from an in-house marketing role.

I came as a junior to help support and assist the rest of the team. In the months since, I’m proud of how far I’ve come, picking up new skills, advancing in my role and now running my own campaigns. I figured it would be nice to round up my five top tips for those in a similar position to myself all those months back; anyone thinking about, or just starting, a career in SEO.

But WAIT! With the new Star Wars film, Rogue One, set to be released this month and causing a disturbance in the force around the CandidSky office, it would be a missed opportunity not to do a themed post. With this in mind, strap yourself in for a mix of arguably the worst Star Wars based puns and tenuous links the Galaxy has ever seen…


Find yourself a Jedi Master

Much like Luke finding Yoda in the swamplands of Dagobah, I’ve been lucky enough to find three exceptionally talented SEO masters at CandidSky – my very own Jedi Council! Having the opportunity to call on the skills and experiences of all three has helped me to go from an SEO padawan to running several successful campaigns in a matter of months.

If you’re already at an agency this should be easy enough, but for those without that resource available, don’t be afraid to ask any questions you have on the likes of Reddit or Moz, both of which have active search communities.

Harness the power of the (analytics) force

Google Analytics is a powerful tool; it can tell you almost everything you need to know about a website’s performance (the ‘who’, ‘what’, ‘where’ and ‘when’). Unfortunately, it can’t tell you the ‘why’ – that’s where you come in. Anyone can pull data from GA, but as an SEO you should be able to connect the data you see in GA with the context of the client’s campaign.

Take some time to learn as much as you can about data analytics (after all, demand for people skilled in GA is around 1.5x the supply, making you more valuable). I recommend starting with Web Analytics 2.0 by Avinash Kaushik, the Millennium Falcon of analytics books. At the very least, make sure you study for and take Google’s Individual Qualification to demonstrate your analytics powers.

Read!

Darth Vader and Kylo Ren can both read minds to get the information they want; fortunately, you only need to read articles. Much easier. Plus, there are literally Death Stars full of great content for beginner SEOs out there, which you should be gobbling up quicker than Jabba the Hutt at an all-you-can-eat buffet. For starters, try ViperChill’s infrequent but really in-depth blog posts, or Avinash Kaushik’s easy-to-read yet super-informative emails.

Pro tip for Slack users: Set up an integration to pull in the newest articles from the top industry sites. You’ll never miss anything ever again, and, even better, you’ll always have a feed of articles at hand for when you want to take a quick break.

Get under the hood

Learn how to avoid looking for technical issues in the Alderaan-places (sorry), and get under the hood of a website! I can’t recommend doing a series of comprehensive technical SEO audits in your spare time highly enough – pick a varied mix of five websites and get stuck in. You’ll learn far more about technical SEO this way, and then when it comes to auditing a client’s site, you’ll have likely already come across the majority of any potential issues.

And finally…

Resist the temptation of the Dark Side

Spun articles, paid backlinks, spam comments, keyword stuffing, cloaking; just a selection of the types of techniques upon which the foundations of black hat SEO sit. Learn about them, understand how they work, maybe even flirt with them a little in a test arena, but never give in to them – all you’ll end up doing is getting a hefty penalty and probably losing a client. Ultimately, it’s best to leave black hat SEO where it belongs… in a galaxy far, far away.

Five top tips for Google Analytics beginners

Google Analytics is a powerful website performance analysis tool for webmasters.

So powerful, in fact, that it provides approximately 80 unique reports as standard, with the potential to create hundreds more custom reports.

But all that data can be overwhelming if you don’t spend much time getting to grips with it. That’s why we’ve put together five top tips for understanding your Google Analytics account better…

1. Don’t focus on your total sessions

If you’re a local business that can only provide a service or deliver to a limited area, or you only have physical brick and mortar stores in a particular area (e.g. the North West) then pay less attention to your total number of sessions. Instead, place more emphasis on the number of sessions coming from people in the local area using the Location report.

After all, what good is it if you’re receiving tonnes of traffic from London but have no way of providing a service to these people? It’s more important to measure (and improve) the number of users in your local service delivery area who are finding and interacting with your business online.

Pro tip: if you don’t already, then it’s worth checking out your Google My Business account for the data Google provides on how users have interacted with your knowledge graph, including the number of calls and driving direction requests.

2. Always consider context

Despite being described fantastically well by Avinash Kaushik as a “sexy metric to measure the number of people who came to your website, saw it, puked, and left”¹, bounce rate alone is not the be all and end all.

Bounce rates in GA’s Landing Pages report are great at pointing out which pages on your website require some attention, but there is a misconception that high bounce rates are inherently bad. Not at all. In fact, a high bounce rate can indicate that a user came to your site, got exactly what they wanted straight away and left. This means you’ve satisfied the user’s original intent, kudos!

This is why it’s important to always consider context when interpreting data from Google Analytics. GA is great at providing all sorts of quantitative data, but it’s up to you to explain why the data looks the way it does. So next time you open the Landing Pages report, don’t immediately start to worry if your top landing pages have high bounce rates. Consider instead what questions a user landing on each page may have, and whether those questions are answered straight away. Herein lies the context, and it’s vital for separating pages which are underperforming from those which provide all the information a user needs without having to navigate the rest of the site.

3. There are two ways to calculate ‘conversion rate’

At its core, conversion rate is a really simple, useful and actionable metric which shows how well a website is persuading users to complete a target objective. Though be aware that conversion rates in Google Analytics are calculated based on the total number of sessions by default:

(Total Conversions/Total Sessions) × 100 = Conversion Rate (%)

It’s important to note that every user can have multiple sessions, which means the above equation is fine to use if you’re an e-commerce platform that could reasonably expect a user to make a purchase every time they visit the site, e.g. clothing retailers or supermarkets.

However, let’s say you sell beds. Buying the perfect bed requires a lot of research, so a user may browse your site numerous times before they settle on a bed they like and decide to purchase, creating multiple sessions in the process. In this instance, it doesn’t make sense to then calculate your conversion rate based on the total number of sessions. You’ll get a much more accurate reading of how well your site is converting by using the total number of unique users instead (referred to as ‘New Users’ in GA).

(Total Conversions/Total New Users) × 100 = Conversion Rate (%)
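To make the difference concrete with some hypothetical numbers: a site with 100 conversions, 10,000 sessions and 4,000 new users has a session-based conversion rate of (100/10,000) × 100 = 1%, but a user-based rate of (100/4,000) × 100 = 2.5% – the same performance, two very different readings.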

GA doesn’t provide the option to change how its metrics are calculated, so unfortunately anyone wanting to work out Conversion Rate using the second equation will have to do it manually. But, if this is the best fit for your business then it’s certainly worth doing.

4. Segment by device

The number of Google searches performed on mobile devices surpassed the number on desktop some time ago now². Furthermore, Google’s Gary Illyes recently announced that a mobile-only index is coming, and will likely be launched within months³. So if you aren’t already optimising your site for mobile users then now really is the time to start. And an important part of any optimisation process is the measurement of existing behaviour.

So how do you analyse the behaviour of mobile users on your website? Start by segmenting them. Fortunately, Google Analytics comes ready-loaded with an advanced segment for mobile traffic. Apply the segment, then spend time running through each report in detail. Often, you’ll find mobile users interact with your site completely differently to desktop users, and this will help you identify where improvements can be made.

Perhaps the data shows that mobile users don’t convert as well as desktop users, so are you sure that contact form is working properly on mobile? Maybe they don’t spend much time on a page where you’d expect otherwise, is that video content loading correctly? You get the point.

Final thought on this one – advanced segments are the most powerful part of Google Analytics. We’ve only focused on how they can be used for mobile optimisation here, but don’t let that stop you from using other segments to really dig into user behaviour and website performance.

5. Keep it simple

Each report and metric in Google Analytics serves some purpose. There’s no doubt that some are more important and provide more insight than others, but nonetheless I’ve yet to come across a completely useless piece of data provided by the tool.

With this in mind, it’s a good idea to try and cut through this jungle of data to find the most important bits for you and your business, which really depends on your aims and objectives. The best advice I can give you is that for whichever data or metrics you choose to focus on, make sure they’re simple to understand and, most importantly, actionable – Ben Yoskovitz⁴ makes a great point that if a metric isn’t actionable, it won’t change the way you behave, so move on and measure something else.

Final thought

Getting comfortable with Google Analytics should be a key part of any marketer’s role. It doesn’t matter if you’re the CMO, a head of department or an office assistant: being able to analyse and interpret website data is a massively useful skill. How do you justify all that spend on an SEO or PPC campaign to the board? Who are your audience, and do they match up with who you thought they were? Where do you start with your mobile marketing optimisation? These are just a small sample of the kinds of questions a marketer needs Google Analytics to answer.

If you haven’t spent much time with GA in the past, we hope our tips give you a few pointers to start with. Ultimately though, there’s no better substitute for diving into the data head first and seeing what you find.

Good luck!

References

1. ‘Web Analytics 2.0 – The art of online accountability & science of customer centricity’, Avinash Kaushik, 2009.
2. ‘Building for the next moment’, Jerry Dischler, Google AdWords Blog, May 2015.
3. ‘Within months, Google to divide its index, giving mobile users better & fresher content’, Barry Schwartz, Search Engine Land, October 2016.
4. ‘Measuring what matters: How to pick a good metric’, Ben Yoskovitz, OnStartups.com, March 2013.

5 signs you’re working with a dud SEO agency

We need to talk about SEO.

Five years ago, no other three-letter acronym was likely to have caused more confusion for businesses looking to grow online. Back then the key question was simple: what does it do and why do we need it?

We’ve since witnessed search engine optimisation evolve from a little known buzz-phrase into LinkedIn’s ninth-most valuable workplace skill – oh yes, it’s official.

Those three letters are now an essential consideration for all modern businesses. But as SEO becomes more and more in-demand, new challenges arise. So many of our current clients initially contacted us because they were deeply unhappy with the service provided by their former agency.

These days, the key SEO question businesses need to ask themselves may have changed, but it’s just as simple…

What are the signs you are working with a dud SEO agency?


1. Managing expectations

‘We can rank you on page one of Google for your highest volume term in a week for only £100!’

Are alarm bells ringing? They should be.

Consider the competition within your niche. If ranking for your most important search query was as cheap as £100 and could be achieved within a week, wouldn’t everybody be doing it? Hell, we would.

Unfortunately, your competition is working as hard as you are to improve their visibility to your desired customer base, and it takes time, effort, and ingenuity to earn the right to rank first.

SEO has evolved significantly over the past five years and, with over 200 ranking factors to optimise for, it’s unlikely such small time frames and budgets will make much of a dent – which ties directly into our next big clue…

2. Price

In his article, ‘5 reasons to avoid affordable SEO services‘, Nathan Gotch hit the nail on the head in describing precisely why you get what you pay for when it comes to SEO.

The key message is affordability: the price of the campaign should be set against a projected, tangible ROI that falls in line with the proposed strategy. This leads to an ever-growing campaign where profits are reinvested back into the channel to provide compounding returns.

The high-quality assets that Google loves cannot be developed on a shoestring budget and, in the long run, you’ll likely end up wasting money that could have been invested into a more viable campaign (PPC, for example).

3. Strategy

We’ve seen it many times before. Frankly, it doesn’t take long to do. Your dud agency logs into Google Adwords, adds a few keywords related to your business and pulls off the top ten highest volume ones, telling you that if you rank for these you will receive boundless traffic and growth. Unfortunately, this isn’t the case, is it?

An SEO strategy should be aligned with the growth of your business, with organic visibility reflecting those areas of growth. Important questions to ask are: ‘Does your SEO agency truly understand what you do?’ and ‘Do they appreciate where growth will come from?’

Understanding the intent behind your priority keyword set is of utmost importance. For example, as a recruitment consultant in Manchester would you benefit more from ranking for ‘jobs in Manchester’, or ‘recruitment consultants in Manchester’?

AdWords would tell you that there are significantly more searches for the former, but the latter provides a direct route to those who pay for your services.

Keyword research is one of the most crucial elements of a campaign, setting the waypoint for future success. Do your priority keywords truly reflect opportunities for business growth, or are they misaligned, or even vanity terms? Is your strategy aligned with your business development?

4. Fixed-term contracts

Controversial, but thought-provoking. If your agency is as good as they say they are, why do they need a fixed term contract? Surely strong, consistent results would lead to your continued business?

I say controversial because there are cases in which a fixed-term contract may benefit both parties, such as providing confidence in the partnership.

For reference, CandidSky don’t have fixed-term SEO contracts… and to date have not needed them (if you feel otherwise, please convince us in the comments section).

5. Results

This is the big one.

Does your SEO agency attain real results for your business?

Over how long a time period are these achieved?

How are they communicated?

When it comes to results, we believe firmly in a realist approach (which ties in to the expectation management discussed earlier). Significant results will not be seen overnight, but there will always be indicators of growth. How does your agency communicate these growth indicators, and can you foresee a future in which your investment leads to compounding ROI?

Bonus entry – Claims and Terminology

If you ever receive an email like this, run. Find sanctuary.

Screenshot: a typical spam SEO outreach email

3 most common technical SEO fails

When it comes to SEO, you’ve probably heard all about the importance of having engaging content well optimised for keywords.

But the technical side is often overlooked. It encompasses all the work done behind the scenes to ensure a website is in tip-top shape when a search engine crawls and indexes it, which arguably makes technical SEO even more important than creating and optimising content. After all, what good is having great content if there’s a structural problem preventing that content from ranking well?

As an SEO team, we find new and complex technical problems all the time – mostly from an audit of a new client’s site. Very few sites are perfect – we often see glaring errors from the biggest brands. When a brand is so big that users flock to the site in their hundreds of thousands, a small dip in traffic is hardly noticed, but that small dip could be a symptom of a bigger, underlying issue. In this blog we highlight the main technical issues that hamper search performance of the biggest brands right down to start-ups.

1. Crazy canonicalisation

When you have content that can be accessed from different URLs, for example with dynamic breadcrumbs, canonical tags should be used to tell a search engine which URL is the preferred version of the page to index.

Yet incorrectly set up canonicalisation is probably the most common technical SEO issue we come across – everything from canonical URLs missing the domain name to target URLs that don’t actually exist. When canonicalisation is set up incorrectly, it can lead to big duplicate content problems.
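As a quick sketch (the domain and paths are placeholders), a correct canonical tag is an absolute URL – including the domain – placed in the <head> of every variation of the page:

```html
<!-- On /shoes/?sort=price, /shoes/?page=2 and any other variation of the page -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```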


2. Dastardly duplicate content

This is a big one. There is sometimes a slight misunderstanding of what constitutes duplicate content. Fortunately, our Search Director, Simon, has written a great SEO’s guide to duplicate content to clear up any confusion. Simply put, duplicate content occurs when a webpage can render at multiple URLs, which can be detrimental to performance in search engines.

A great example of this is the BBC homepage, which renders at both http://www.bbc.co.uk and https://www.bbc.co.uk. The only difference between these two URLs is the security protocol, HTTP versus HTTPS. Note that this only happens on the BBC homepage; all deeper pages are managed with 301 redirects from the HTTPS version to the HTTP version.

For any other site, having a homepage that renders at both HTTP and HTTPS protocols without any canonicalisation would be a huge problem. Fortunately for the BBC, the likelihood is that all organic traffic to the homepage is as a result of branded search queries. Their homepage doesn’t need to rank for any non-brand keywords and as such this isn’t a problem.


3. Ruffled redirects

Typically speaking, inbound links are still the biggest contributor to a website’s ability to rank well in the search engines. Therefore, if a page expires or is removed, it is important to retain any inbound link equity by redirecting it to a relevant page elsewhere on your site.

We’ve taken clients on previously with a huge amount of lost link equity due to broken inbound links. Correcting this simple error with a series of 301 redirects has taken their rankings from the deepest, darkest depths of the search engines and straight into the top ten.
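As a sketch, assuming an Apache server (the paths are placeholders), a single 301 redirect in the site’s .htaccess file can be as simple as:

```apache
# Permanently redirect an expired page to its closest live equivalent,
# retaining the inbound link equity it has earned
Redirect 301 /old-product-page/ https://www.example.com/new-product-page/
```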


Final thoughts

Getting your technical SEO right is the key to ensuring that the great content on your site can be found online. One of the best ways to keep your site healthy is to set up a regular checking process, so that you can spot any new issues caused by changes and additions made to the site. This is especially important for businesses without a truly household-recognised brand, who are reliant on strong performance for non-brand keywords to drive traffic to their site.

If you’re worried about the technical health of your site, our team prides itself on finding and fixing issues in our technical SEO audits.

Benefits of a cohesive SEO & PPC strategy

Many people approach SEO and PPC as completely separate strategies.

Investing in SEO can produce excellent results for your brand and bottom line; when we couple our partners’ SEO strategies with their PPC campaigns, we always see an upturn in results.

Search marketing is about relationships

While SEO and PPC are different in many ways, it can be helpful to think of them as a joint approach to search, like Yin and Yang. In this blog we discuss a two-pronged approach where PPC and SEO work together which, if deployed, will result in better performing search campaigns.

Online visibility

The most obvious benefit of combining the two strategies is the added exposure you get on search result pages. A common mistake people make is to turn the paid advertising tap off when organic search is performing well. It’s important to remember that the top four results of most search engine results pages (SERPs) are reserved for PPC ads. Being prominent in both organic and paid listings greatly increases the likelihood of winning clicks, and ultimately more traffic and conversions. More real estate means more clicks. 


One of our partners, BrightHR, dominating Google with paid and organic results

Sharing keyword data

Simultaneously running SEO and PPC campaigns gives you double the data to collect and analyse. By running PPC ad campaigns we can trial different keyword combinations and ad copy to reveal which perform best; by assessing the overall conversion rate of the adverts, we can then steer our partners’ SEO campaigns down the most profitable route. This optimises the whole organic approach in terms of time and budget spend. When we launch SEO and PPC campaigns together, we always see quicker positive organic ranking gains because of the shared keyword data.

Social media data

The emergence of highly targeted social adverts has changed the way we view social media. Advertising platforms on popular sites such as Facebook, LinkedIn and YouTube are able to serve up highly targeted ads to increasingly specific groups of people. Using granular campaign focus, it is now possible to target very specific individuals within certain age groups, genders, geo-locations and interests. Data collected from such campaigns can prove a very valuable source of information to help steer and refine overall SEO strategy, maintaining consistency across all digital channels.

Final thought

SEO and PPC complement each other in a way that allows digital marketers to vastly improve the impact of search campaigns. A strong online presence is both useful and vital to a successful business. Let your SERP brand dominance begin!

Website Localisation: SEO for businesses big and small


Website localisation, from an SEO perspective, refers to the process of communicating to search engines where in the world your business operates.

Whether you are a large retailer looking to branch out into international markets, or the village patisserie hoping to bring people in your area into your store, localising your business online can greatly assist potential customers in finding your services.

In this post, we will review some of the basic steps businesses should take in sharing their location with search engines before moving on to some more complex opportunities for international targeting.

Getting started…

Let’s start small.

Imagine we are an independent chain of coffee shops with three stores in the Manchester area. Our aim is to gain search engine visibility to anyone searching for ‘coffee shops in Manchester’. We know that Google shows local map listings for this query, and would like to be shown within the map-pack.


Onsite & offsite

To communicate to search engines that we have a local presence in Manchester, we will need to perform a number of onsite and offsite tasks, including:

  • Add Local Business schema for every store location published on our own website – see the sketch after this list
  • Ensure our site is localised to “en-GB” (written in English for a British audience)
  • Register all three locations as local businesses in Google Maps (done via business.google.com)
  • Register all three locations as local businesses in Bing local (top tip – ensure you type your addresses identically everywhere you submit them)
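As a minimal sketch of the first task (the business details are invented placeholders), Local Business schema in JSON-LD might look like this on a store page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "CafeOrCoffeeShop",
  "name": "Example Coffee Co – Northern Quarter",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Manchester",
    "postalCode": "M1 1AA",
    "addressCountry": "GB"
  },
  "telephone": "+44 161 000 0000",
  "url": "https://www.example-coffee.co.uk/northern-quarter/"
}
</script>
```

Each of the three stores would get its own block with its own (identically formatted) address.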

Optimisation

Next, we can begin the process of local SEO. There are a number of metrics which contribute to how our business ‘ranks’ in the local area, including data aggregation, location, and reviews; to influence them we can:

  • Confirm our location on international business directories (yelp.com, yell.com, etc)
  • Confirm our location on local business directories (eg, directory.manchestereveningnews.co.uk)
  • Gain (positive) Google reviews of our coffee shops

Other considerations include:

  • Ensuring our website is responsive – 57% of users say they won’t recommend a business with a poorly designed mobile website
  • Creating useful, informative content – for example, we may create local pages for each store and ask staff what their favourite coffee blend is, à la Hotel Chocolat
  • Assessing other content elements such as title tags, page copy, site hierarchy and internal linking

Ultimately, how our business performs locally depends on how well our competitors relay their own information to search engines compared with our own efforts, as well as on the physical location of the searcher.


Business is expanding

Our coffee shop was a hit.

Business is booming, and we have received requests from international investors who would like to take our brand into their territories. Before we pop the champagne, let’s take a moment to consider how this could affect our website’s visibility in search engines by asking a few key questions:

  • Would international investors wish to open their own local versions of the brand website?
  • How will this affect our own data in terms of cannibalisation and duplication?
  • How will they manage their localised content for local languages?

 

Whilst there is not necessarily any one approach to answering these questions, they are important to consider. As a minimum we always recommend following these steps:

  • Setting up a sub-directory or sub-domain for international site versions
    • Caution should be taken when adding local addresses to alternative site variations; ensure the address is identical to that used on the local site
  • Rel=”alternate” hreflang markup should be used to identify similar content served to different jurisdictions (see the sketch after this list)
  • Copy should ideally be written in the local tongue – from experience, language plugins rarely provide accurate/suitable copy
  • If planning to migrate a website to an internationally branded version, put a robust SEO site migration procedure in place
  • Follow these steps for each local business – our coffee shop in Paris would need to be subject to the same efforts placed into our Manchester store
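On the hreflang point above, a minimal sketch (placeholder URLs, assuming a UK site with a French sub-directory) would see each page variant reference every variant, including itself, in its <head>:

```html
<link rel="alternate" hreflang="en-gb" href="https://www.examplecoffee.co.uk/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.examplecoffee.co.uk/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.examplecoffee.co.uk/" />
```

The x-default entry tells search engines which version to serve when no language variant matches.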

Final thought

Localisation of a business can have a significant impact on its ability to compete with local competitors. Improved visibility online can be a major contributor to real-world footfall, particularly as mobile search continues to grow. Is your business set up to compete in local markets? Get in touch with our SEO team if you’d like to learn more about website localisation.

Mobile deep linking: the relationship between web pages and apps

What is deep linking?

The concept of deep linking isn’t new. As early as 2005 it was used to describe the practice of providing search engines with a view of web pages beyond the homepage of a website, something that seems only natural today. At the time it actually sparked debate in the US courts about whether or not linking to deep content (and search engines returning these results) was a breach of the website’s copyright. Oh if they could only see us now.

Deep linking, in the current context, refers to links which point to content deeper than web pages – apps in particular.

The rise of mobile

Over the past decade, mobile commerce has grown, and mobile apps have become a key component of most mobile strategies. However, mobile apps and the web as we know it are two separate entities; you use one or the other at any one time, and the journey between them can be awkward and clunky.

Unlike most content on the web, content in apps isn’t publicly accessible. Normal links don’t work because apps are device-specific and once downloaded to a device, there’s no consistent way to find and share items. This lack of access prevents sharing between users and creates poor user experiences.

Deep linking as a solution

Deep linking is the beginning of a union between typical web behavior and native apps, allowing businesses to send people directly to a specific page of an app using a standardised form of deep link. This could be from:

  • Display advertising
  • Emails
  • Social platforms
  • Text messages
  • QR codes
  • Webpages
  • Other apps

Importantly, a deep link can direct users to an app or a webpage. The mechanics behind it are quite simple: the user is routed to the best destination. If I click a deep link on desktop, I’ll be routed to the web page (HTTP/S URL). If I’m on mobile, I’ll be routed to the app (via its URL scheme) if it’s installed on my device, and, if the app isn’t installed on my phone, I’ll be led to install it.
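As an illustration, one common convention for connecting a web page to its app equivalent is Facebook’s App Links meta tags – a sketch with an invented app and URL scheme, not a prescription:

```html
<!-- In the <head> of the web page: declare the app equivalents of this content -->
<meta property="al:ios:url" content="exampleapp://article/123">
<meta property="al:ios:app_store_id" content="123456789">
<meta property="al:ios:app_name" content="Example App">
<meta property="al:android:url" content="exampleapp://article/123">
<meta property="al:android:package" content="com.example.app">
<meta property="al:android:app_name" content="Example App">
<meta property="al:web:url" content="https://www.example.com/article/123">
```

Platforms that support App Links read these tags and route the click to the app when it’s installed, falling back to the web URL when it isn’t.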

How does this affect search?

Whilst there are huge benefits beyond search – especially the ability of users to direct other users to specific app pages – there are also significant benefits from a search marketing perspective. Apps typically provide a better user experience on mobile. They remove the browser bar and other clutter, and encourage users to remain in the app, increasing engagement. By providing deep links to web pages which have equivalent app pages, businesses can send users directly into apps from the search results.

 

A search for “Pokemon go” on mobile directs me straight to the download page

The Reddit result for “Pokemon go” directs me straight to that discussion in the Reddit app

 

Final thought

Deep linking is the beginning of a seamless experience between traditional web pages and deep app pages. As the relationship between mobile apps and traditional web pages matures, we’ll continue to see changes in how they interact with one another. But, for the time being, deep linking is a must for any business with a mobile app.

SEO best practice: Site Speed

Site speed is important.

As SEOs, we’ve been aware of Google using speed as a ranking factor since 2010. And it makes sense – Google’s aim is to “organise the world’s information and make it universally accessible and useful”. If a person clicks through to a website that takes a long time to load, the experience is poor – neither accessible nor useful.

The stats to back this up are pretty conclusive – 40% of users abandon web pages that take more than 3 seconds to load! Read on to find out how websites can serve their content to users at super-fast speeds to improve user experience and benefit SEO.

AMPs

Accelerated mobile pages (AMPs) are a form of ultra-fast loading mobile pages designed purely for speed and readability. The long and the short of it is that if you are a publisher, e.g. a news site, you really need to be supplying your content in AMP form.

Why? Well, aside from the increased speed providing a better user experience, AMPs take up a serious chunk of mobile-SERP real estate, as seen in the screenshot below. This pushes organic search results below the fold, which will have a big impact on click through rates. So if you’re a topical news site that isn’t publishing your content on AMP pages, you can expect to see a significant drop in traffic.

 

AMP results occupying the top of a mobile search results page

 

It’s worth noting that AMPs aren’t for every site. Not right now, at least. The technology has been developed with content publishers in mind, which means there is not much opportunity for use on ecommerce and lead generation sites. Nonetheless, eBay recently announced that they have implemented AMP in an effort to provide an ultra-fast browsing experience for mobile users. And although Google currently has no functionality to support or promote ecommerce AMP pages in the SERPs, it’s likely that support will extend to more page types in the future.
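For reference, an AMP page follows a strict skeleton – this is an abbreviated sketch (the mandatory amp-boilerplate CSS and its noscript fallback are elided for brevity):

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <link rel="canonical" href="https://www.example.com/article.html">
  <style amp-boilerplate>/* mandatory AMP boilerplate CSS elided */</style>
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <title>Example article</title>
</head>
<body>
  <h1>Example article headline</h1>
  <!-- images use the amp-img component rather than a plain <img> tag -->
  <amp-img src="hero.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```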

Pre-* commands

Whilst AMPs are more geared towards information publishers, all types of sites can make use of pre-* commands.

In short, pre-* commands can tell a browser to perform an action before it has been requested by the user. This works on the premise that a webmaster can predict with a reasonable degree of certainty what action a user on their site will perform next, and therefore, what resources they will need. This sounds difficult, but examining user journeys in the Behaviour reports of Google Analytics actually makes this quite simple.

Prefetch

Prefetch is based around anticipating what resources a user will likely need. For example, a user may currently be on Page A, but it’s likely they will navigate to Page B next. With this in mind, prefetch can be used to tell the browser to download some of the critical resources contained on Page B and store them in the cache. This means some of the work is already done by the time a user clicks through to Page B, which increases the page load speed.
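In practice this is a one-line hint in the <head> of Page A – a sketch with placeholder paths:

```html
<!-- On Page A: ask the browser to fetch and cache Page B's critical resources during idle time -->
<link rel="prefetch" href="/page-b/hero-image.jpg">
<link rel="prefetch" href="/page-b/styles.css">
```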

Prerender

Where prefetch tells a browser to ready individual resources, prerender goes the whole way by rendering a full page in a hidden tab. If a user then clicks from Page A to Page B, the hidden tab is instantly substituted in. As such, it is a much more intense and resource heavy command than prefetch, so be careful – if a user doesn’t click on the link to the prerendered page then you have wasted resources unnecessarily.
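The markup is equally simple – again a sketch with a placeholder URL:

```html
<!-- On Page A: render Page B in full in a hidden tab, ready to be swapped in instantly -->
<link rel="prerender" href="https://www.example.com/page-b/">
```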

Final thought

Faster page load speeds will increase your site’s ability to rank higher in search results and provide optimal user experience. Speed is a huge factor and can have massive implications for your site’s ability to convert users into sales and/or leads.

If you’re interested in further information on how AMPs, prefetch, and prerender commands can be applied to your site, feel free to contact me at samsheppard@candidsky.com

What to do when a search engine ranks the wrong page on your website

So you’ve spent hours crafting a page on your website to make sure it’s full of useful content…

It does a great job of targeting a particular keyword and will really help solve a problem for the visitor.

However, despite your efforts, Google has frustratingly picked out a different page on your website to rank for this keyword. It’s a fairly common problem that can be easily rectified with a structured, methodical approach.

Cannibalisation

This is usually due to ‘cannibalisation’ – don’t worry, it’s not as scary as it sounds. If you’re seeing a significant degree of fluctuation in daily rankings for a given keyword, it is worth investigating which URLs the search engines have been ranking each day. For example, ‘Page A’ and ‘Page B’ both contain engaging and highly relevant content on the same topic, and so Google can’t decide which should rank over the other. It chooses to serve ‘Page A’ one day, and then ‘Page B’ the next. Ultimately, neither page will rank as highly as it should.

More complex issues

Things get a little more difficult when you start experiencing more than just daily changes. This is actually something we found on our own site recently. We’ve always ranked well for a particular keyword, ‘SEO Manchester’, but at the beginning of April it suddenly dropped 65 positions and remained there.

Our keyword rank drops by 65 places – alarm bells start ringing at CandidSky HQ.

We immediately saw that Google had started ranking a different page to the one it had previously ranked for this keyword. After a few days, it was clear that this wasn’t just cannibalisation between two pages. Instead, Google had decided that the new page was the best one to return in its results. The previous page was no longer deemed relevant.

The problem we had was that the keyword had commercial intent (i.e. a searcher was looking for a particular service) and the newly ranked page was part of an informational blog series. So, although the blog was very useful, it wasn’t what a searcher was looking for. Meanwhile our commercially-driven service page, which had previously ranked, was the most relevant page to serve the searcher’s intent.

Finding the solution

Once an issue has been identified, we can start putting steps in place to solve it. As SEOs, when we find a problem the root cause is rarely clear. We need to hypothesise a number of reasons why the issue has arisen. In doing so, we create an action list to work through to determine which of the reasons identified is the most likely root cause.

This allows us to address each item, monitor its impact and follow a simple ‘did it fix it?’ process. If ‘no’, we move to the next item in the list and follow the same systematic approach. If ‘yes’, we’ve solved it. Due to the complex nature of SEO and the fact that the rules change every day, the ‘did it fix it?’ process tends to be the best approach for identifying most SEO issues.

In the end, we managed to get our service page ranking again – in fact, it now ranks even higher than before!

a graph to show our keyword rankings return to normal

 

Why a content audit can do wonders for SEO and website migration

Content auditing is our bread and butter.

We’ve been in the industry a long time now and conducted a content audit or two, so we know the usual business scenario well. Your website is in desperate need of a revamp, yet the path to get it there seems fraught with danger and complication.

First you need to minimise any risks faced throughout the design, construction and site migration stages to ensure a smooth transition. Content, and certainly a content audit, is often an afterthought.


Know your purpose

Fast forward and let’s say you’ve just landed yourself a sleek new site at the cutting edge of design and user experience – one that really makes your brand sing. But after all that hard work, nothing wrecks the performance of your new platform faster than populating it with the same dated, ungainly content from your previous site.

Remember that the real purpose for a new site isn’t just to look nice and pretty; this is your most important piece of marketing collateral when it comes to achieving higher levels of traffic, higher user engagement and higher conversions.

Markets, customer trends and company objectives are constantly changing, which is why top businesses conduct a comprehensive audit of their site every year or so to ensure their expert content, communications and brand messaging are up to date and firing on all cylinders.

The real business cost of poor content

Gripping visitors from the moment they land on the site is enough to affirm the need for a content audit, true. But what businesses don’t tend to consider is that poorly formed content is the equivalent of poison for search engine rankings.

What do we mean by poor content?

  • No clear purpose for web pages
  • Key messages worded and presented weakly
  • Brand USPs not reinforced
  • Inconsistent formatting
  • Frequent typos

All of this has a huge impact on a website’s visibility, search rankings and, of course, user experience. These faults don’t inspire trust, whether that’s from potential customers or search engines.

Strong content = strong website

Powerful, carefully crafted content that is search engine friendly has an unparalleled influence on website performance.

In the digital age, in-depth online market and keyword research is so important to lay the foundation for consistent messaging across all pages. After that it’s about fine-tuning tone of voice and presentation with a specific target audience in mind. You should be speaking to people’s hearts as well as their heads.

Your next content audit – time for a scale and polish?

Like your next dentist appointment, a full-scale audit probably isn’t the most alluring of duties but that doesn’t make it any less essential.

Not only does this approach cement brand vision and values on every page and grip visitors from the moment they land on the site, it ultimately forms a rich tapestry of archived content for Google and other search engines to trawl and validate. 

This is what makes a website stand out in the search results and opens the door to more and more potential customers. Make sure to greet them with killer content that’s always on-brand and puts your expertise firmly in the spotlight.

Google Flux & SEO Strategy

In his post on 25th April 2016, Dr Pete Meyers discussed our ability to accurately predict the ‘Google Weather’.

 

It is commonly acknowledged amongst marketers that Google update their algorithm constantly, raising the question: ‘How do we strategise organic campaigns in an ever-changing landscape?’

 

What is Google SERP Flux?

Google’s Search Engine Result Pages (SERPs) update frequently. Very frequently, in fact.

Outside of the notable updates (Panda, Penguin, Rankbrain, Pigeon, Adwords, etc), Google test and alter their algorithm more than many likely realise. In 2012 alone, Google made 665 improvements to their Search algorithm. This rose to 890 updates in 2014 (equating to an average of 2.5 updates a day!).

The result of these updates is ‘flux’; an ever-evolving environment in which no SERP position is ever truly owned by one website. In fact, due to how Google ‘personalises’ search results based on each user’s history, flux is more prominent than ever.

 


We monitor rankings for thousands of keywords every day

Monitoring Google Flux

Improving search engine visibility – and with it, the average rank of keywords over time – is the primary aim of our SEO team. Naturally, monitoring organic positioning is essential to the role. Building a complete picture of the sources of inbound traffic requires a diverse set of tools, including dedicated rank tracking software, prospective rank tracking solutions, and collated data from Google platforms.

  • Monitoring primary keywords – Primary keywords, the priority search queries for a campaign, are usually represented by those shown to provide strong transaction and revenue opportunities. The value of these queries dictates that precision is essential when monitoring rankings; the difference between positions 1 and 2 can be significant. This necessitates a dedicated rank tracking solution; one which can provide accurate data from specific search engines, locations, devices, and SERP elements (such as knowledge graph or answers box).
  • Secondary & tertiary keywords – SEO campaigns benefit from a wide array of keywords; more than would ever be practical to track individually. Prospective rank tracking solutions can provide generalised ranking data for larger keyword sets, offering average rankings as an indicator of change. Additionally, understanding how a webpage’s visibility is affected by both long and short tail traffic can influence decisions on elements such as hierarchy and semantic fields, leading to enhanced rankings and traffic.

 

Identifying and remediating issues is crucial to providing consistent growth, as shown in this Google Analytics screenshot following the first Penguin update

Predicting Flux & Influencing Change

Predicting Flux is equal parts impossible and essential. The frequency at which Google update their algorithm means that whilst change is inevitable, we can never be 100% certain of the refinements made. Yet to influence change we need to consistently collate data to map the importance of influencing metrics. This necessitates an open-minded approach to search marketing. Thankfully, the SEO community promotes a shared dialogue on collated data, assisting our ability to predict future requirements and ability to strategise accordingly.

Take, for example, the Penguin update. On 22nd May 2013 Google rolled out Penguin 2.0, an updated version of their link-penalising algorithm. This created chaos within the search marketing industry as experts clamoured to learn precisely what the algorithm affected and how it would impact their short and long-term positioning. The data collated by the industry, coupled with limited feedback from Google, led to a strengthened understanding of which links Google takes exception to. Fast forward 3 years and we find ourselves on the verge of the next (way-overdue) Penguin update. The release of this update will no doubt result in unprecedented flux across all SERPs, with the winners and losers determined by those who strategised best over the past 18 months (since Penguin 3.0 rolled out).

‘Influencing change’ is a misnomer; change will occur whether we act or not. The aim of an organic marketing campaign is to incite positive change, which requires not only the protection of existing search engine visibility but also improved rankings for the search queries determined to be of importance.

Final thought – Providing Consistent Results

The vast number of metrics used to determine where a web page ranks for a given query means that there is no one solution for any business. Whether acquiring new traffic or protecting existing traffic, search strategies must be bespoke: tailored to the specific needs of the business and inclusive of competitor data. This could include improvements to technical requirements, consideration of duplicate content, improvements to semantic fields, or the development of improved inbound authority. For more on our approach to Search Engine Marketing visit our SEO services page.

 

How to: Perform a website SEO health check

Many digital marketers know a lot about SEO, but struggle to understand the big picture.

One of the most important macro skills we develop at CandidSky is the ability to interpret how healthy a website is from an SEO perspective. One of the best ways to determine the health of your website is to perform a technical check-up. It will tell you what you need to do to improve your SEO and how your competitors are doing. Continue reading for a few hints and tips on what to look for.

Check your meta data

The most important item of metadata on your site is the page title tag. Each page on the site must have a unique and relevant title tag, which explains to search engines what the focus of the page is. Additionally, the inclusion of a meta description for each page improves user experience, as well as increasing the likelihood that a user will click through to your site.
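As a quick sketch (a hypothetical product page; the copy is invented), well-formed metadata looks like this:

```html
<head>
  <!-- unique, descriptive title tag: primary keyword plus brand -->
  <title>Pocket Sprung Mattresses | Example Store</title>
  <!-- meta description: a concise pitch that earns the click -->
  <meta name="description" content="Compare our range of pocket sprung mattresses, with free next-day delivery and a 60-night trial.">
</head>
```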

 


 

Check your sitemap.xml and robots.txt files

Make sure your site has a sitemap and a robots.txt file. Sitemaps are a sign of an organised and easily indexable site, which is good for SEO. A site with good SEO will also have a robots.txt file with no major disallows. It’s important that the robots.txt file exists and that it isn’t blocking crawlers from important areas of your site.
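For example, a healthy robots.txt might look something like this (a sketch – the domain and disallowed path are placeholders):

```
# Allow all crawlers everywhere except the admin area,
# and point them at the XML sitemap
User-agent: *
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
```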


 

Check your site speed or load time

Google favours sites with good load times; if your site is slow, then SEO will suffer. By using a free tool such as Pingdom, you can test your site speed, as well as getting an indication of how you are performing versus your competitors. If your site loads within 1–3 seconds, it’s categorised as a fast-loading site, so thumbs up.


 

Check your on-site content

A site needs continual, updated content in order to rank well. It’s very difficult to sustain good SEO rankings without consistently relevant content output.

Healthy content marketing is an essential ingredient of first-class SEO. A consistent flow of fresh, well-written content – whether on a company blog or simply on product and category pages – will prompt search engines to crawl your website regularly. This dramatically increases the likelihood of it being more prominent in organic search listings.

 

Final thought

Much like ‘a healthy body means a healthy mind’, ‘an optimised site means optimal rankings’. Follow our quick health check steps, maintain each stage and your organic rankings will improve.

 

Why SEO checklists fail: A takeaway from BrightonSEO

Last Friday a few of the CandidSky team took a trip to (not so sunny) Brighton to take part in BrightonSEO – the UK’s largest SEO conference – to listen to a host of presentations ranging from technical SEO to content, site speed, usability and psychology, and we weren’t disappointed.

We attended some great talks, but there was one in particular that resonated with me: “Ranking Factors Reloaded – Why Content Is The Key To Success” by Marcus Tober from Search Metrics. The reason it struck a chord with me is that it involved some insightful research data which lined up with my own perception of how SEO has been changing over the past few years. In this article I’m going to explain how the industry is changing and what we can do to make the most of it.

The checklist approach

All too often SEO is seen as a checklist exercise where a site needs to meet a number of absolute criteria, and the growing number of SEO tools only add to this perception.

  1. Keyword in title Check!
  2. Keyword in heading Check!
  3. Over 1,000 words Check!
  4. More links/ link quality pointing to my page than my competitors Check!

Now why am I ranking on the fourth page?!

It’s awfully convenient to think SEO is that simple.

What Marcus’ presentation showed most elegantly was two things:

  • That there are no absolutes that allow a checklist approach to be effective
  • That every decision needs to be made against the context of your users, industry and competition

So let’s look at the above checklist – more specifically, item number 4, “More links pointing to my page than competitors”. Over the course of the presentation, Marcus pulled up research data from an analysis of rankings in 3 verticals, and the results were particularly interesting.

For the ecommerce sector, the Search Metrics team found that there was a strong correlation between referring domains (links) and rankings – as we would expect using a classic SEO checklist.

Here you can see the number of referring domains on the Y axis and average rank across the X axis.

Ecommerce referring domain link analysis

Next came the surprise. In the healthcare market there was NO correlation between the number of referring domains and average ranking.

health referring domain link analysis

Finally, there was a NEGATIVE correlation between the number of referring domains and average rank in the Finance industry. 

Finance referring domain link analysis

Marcus went on to run through a number of other metrics from the “classic” SEO checklist. And the results were the same. In some sectors (particularly ecommerce) classic SEO metrics correlated with rankings, but in other, more informational sectors they didn’t apply at all. 

Why we shouldn’t rely on traditional metrics

Simply put, some pages provide a better result for users than others, irrespective of the “classic” SEO ranking factors such as keyword targeting and links. It all comes down to satisfying user intent. This is more pronounced in sectors where people are looking for information rather than products or services, which is why we see such a stark difference between health and finance (informational) and ecommerce (transactional).

I wrote recently about two new usability measurements confirmed as part of Google’s algorithm: clickthrough rate and dwell time. These are the metrics search engines can use to measure content engagement and how well user intent is being satisfied, but how do we improve them to get ahead of the competition and bypass traditional ranking metrics? Focus on the user.

Over the course of his presentation Marcus pulled up an example of two search results for the query “Costume Ideas”. One result (below left) fell behind with traditional ranking metrics, whilst the second (below right) would appear to be a much more competitive result based on its keyword targeting and inbound links.

The two “Costume Ideas” search results compared

In this case, the first result ranked 1st, whilst the second ranked 14th. Traditional SEO can’t account for this, but focusing on the user can. The first result contained a 60-image gallery, links to other relevant content and high social engagement. It provided inspiration for someone clearly searching for it when they typed in “Costume Ideas”. The second result contained a selection of just 14 costumes, and is quite clearly aiming to sell a product rather than provide inspiration.

It’s true that the difference here will be measured in metrics like dwell time and clickthrough rate, but what marketers need to focus on is offering the best result for the user intent in question by understanding the desire behind it.

Rather than subscribing to a classic SEO checklist, we need to:

  • Optimise for users, not search engines
  • Think topics and answers, not adgroups and keywords 
  • Benchmark against your industry, not SEO “norms”

SEO isn’t complicated any more – it’s complex

“What’s the difference?” I hear you ask. A complicated process like sending a rocket to the moon is difficult, but once you know the series of steps which need to be taken to reach your goal you can repeat them precisely, much like an SEO checklist. A complex process, on the other hand, will vary each and every time. A great example of this is building a skyscraper; it depends where you’re building it, what regulations you need to meet, the plot you’re building on and a whole host of other factors which will vary each time you build a new one. As you can imagine, a complex task requires high levels of skill and awareness, and the classic checklist approach – although successful sometimes – simply can’t guarantee the same result every time. 

Credit to Marcus and the team at Search Metrics for bringing this to the fore. You can access the full presentation from BrightonSEO here. 

Everything you need to know about Google’s new paid advert placement

Google have announced they will be removing the paid product listings on the right-hand side of search engine results pages.

Instead, those paid product advertisements will be replaced by four adverts above the organic search results. This is significant news for many, especially for SEOs who realise that their sites have less presence on the page, as well as for PPC management executives who now need to tailor their approach to the change in the AdWords bidding landscape.

Highly commercial searches

Google has said that the changes will mainly affect highly commercial search terms. These are the searches users make when they demonstrate a strong intent to make a purchase or carry out a transaction.

These searches are usually at the end of a user’s purchasing journey, so not necessarily in the research phase. Using the below example, ‘pocket sprung mattress’ shows the four adverts above the organic search results.

 


These also tend to be the keywords with the highest possible commercial value, meaning that securing a position in the top four ads would be highly beneficial and lucrative for businesses. Achieving a higher quality score would also ensure a lower cost per click.

According to a Google spokesperson, “We’ve been testing this layout for a long time, so some people might see it on a very small number of commercial queries. We’ll continue to make tweaks, but this is designed for highly commercial queries where the layout is able to provide more relevant results for people searching and better performance for advertisers.”

Despite the spotlight being on the four main ads at the top of the search results, this does actually mean fewer ads overall. With the removal of the sidebar ads, the total number of possible ads on a page comes down from 11 to seven.

For example, searching for ‘pocket sprung mattress’ shows seven ads with the new Google AdWords layout; but with the old sidebar layout, there are 11 ads. This also means that competition for the fewer ad positions will become much tighter.

What is replacing side ad placements?

Moving forward, the space previously allocated to the sidebar ads will be home to more prominent product listing ads, as well as increasingly prominent knowledge graph panels with additional information.


A theory behind Google’s removal of the paid sidebar ads is to create a consistent appearance across desktop and mobile devices, since mobile does not run any sidebar ads. Another reason could be the historically poor click-through rate (CTR) of sidebar ads. Due to the more prominent nature of sidebar information such as knowledge graphs, sidebar ad placements have become more or less redundant and, from Google’s perspective, removing them feels like a natural progression.

With the latest news from Google concerning the way paid ads are displayed, there is a direct impact on where organic listings sit on the page. With four ads at the top instead of three, the top organic listings occupy less real estate above the fold and are pushed down the page. Ensuring strong rankings for commercial search terms has always been a measurable KPI for SEOs, and less prominent listings cement the need for a stronger organic search presence.

Final thought

With fewer paid ads on the page, this can be considered an opportunity for strong organic listings to receive a higher CTR, especially with additional benefits such as review rating stars, catchy meta descriptions, etc.

Highly relevant commercial search terms tend to display the most relevant pages that would enable users to carry out a purchase or transaction and, with well-optimised organic listings becoming more prominent in the wake of fewer paid sidebar listings, this could mean additional clicks and ultimately increased traffic and conversions.

You can connect with Firas on LinkedIn.

Google Brotli: The benefits of switching to a HTTPS domain

Google recently announced plans to upgrade their Chrome browser with Brotli, a much faster compression algorithm.

But what impact does this have on the average website?

Brotli, which compresses data up to 26% more effectively than current algorithms, will only benefit domains with a HTTPS connection. So, has the time come for digital businesses to make the switch to HTTPS?

An introduction to HTTPS

HTTPS, or HTTP over SSL/TLS, presents the user with a more secure version of the web. HTTPS encrypts and decrypts web pages, making user data harder for hackers to access and generally ensuring the web is a safer place to be.

In addition to improved web security and speed, HTTPS status provides a lightweight ranking preference in search engines. With that in mind, HTTPS migration may seem like the obvious choice for all website owners.

To recap, the benefits include:

  • A quicker website
  • Improved security
  • Increased trust from Google, leading to improved rankings

It’s important to remember, however, that migration isn’t without its limitations.


Migration Considerations

HTTPS migration means every URL on your site – i.e. every page – must be restructured; more specifically, every HTTP becomes HTTPS.

For example, your page ‘http://www.example.com/awesome-webpage’ would be stored in a new destination, ‘https://www.example.com/awesome-webpage’.

Whilst this change may seem minor, without proper consideration problems will arise. When we migrate a site we always weigh up the benefits against the following implications:

1. Google will need to recrawl and reindex your entire domain

This may not be an issue if your site is only ten pages in size, but an e-commerce store with tens of thousands of pages will take some time to relocate their entire inventory over to the updated URL.

I recently wrote about website migration and anticipated timescales, check it out for more information. During any migration, you should also expect to observe a decrease in search engine rankings as Google decides which variant it should rank.

2. Site wide loss of inbound equity

Every link that points to your domain contributes to its ‘authority’, the development of which leads to increased trust from Google that your domain is respected. Part of the HTTPS migration process is the redirection of expired http URLs to their live https equivalents.

It’s widely accepted that links travelling through a redirect lose around 10% of their equity, meaning a HTTPS migration could result in a reduction of existing inbound equity.

3. Short-term pain, long-term gain

Every site is expected to be HTTPS compliant in time, and this levels the playing field somewhat.

The decision is whether to adopt the new system early and begin developing authority to a new HTTPS domain now, or to wait it out capitalising on short-term gains but playing catch-up later.


HTTPS migration checklist

Migrating to HTTPS can be a hefty project, and requires an experienced web development team to implement. In addition to the inclusion of the SSL (Secure Socket Layer), a complete migration includes:

  • 301 redirects for any expiring HTTP URLs to HTTPS (see the sketch after this list)
  • Updating external plugins to ensure they are HTTPS compliant
  • Updating third-party ad code to support HTTPS
  • Registering the HTTPS version in Webmaster Tools, ensuring correct preferences
  • Updating Google Analytics to ensure correct tracking
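On the first item, here is a minimal sketch of a site-wide HTTP-to-HTTPS 301 redirect on an Apache server (assuming mod_rewrite is enabled – your stack may need a different approach):

```apache
# .htaccess – 301 redirect every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```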

Is it time to switch to HTTPS?

The loss of organic visibility in the short-term is obviously something all businesses must assess to understand its full impact.

The long-term benefits of migration will complement all domains in the future; however, over a long enough timeline it’s hard to imagine why any website wouldn’t eventually make the switch to HTTPS.

How to: Use WordPress visual editor like a pro

Most WordPress users spend the majority of their time in WordPress visual editor writing posts. Learn how to become a power user with me and make your life easier.

WordPress visual editor lets us create content in a WYSIWYG (what you see is what you get) environment. It’s easy to use, but as a power user you can increase your productivity greatly. Read on for some hints and tips on making the visual editor work for you. Remember, repetition breeds excellence, so take your learnings and practise.

1. Don’t forget the kitchen sink…

By default the Visual Editor only shows a single row of buttons in the editor. Kitchen Sink is the last button in this row. Clicking on it will display another row of buttons with more commands that will make your editing life a lot easier and more exciting.


2. Learn some simple keyboard shortcuts

Many beginner level users often find it difficult to figure out how to create paragraphs and line breaks in WordPress. To add a new paragraph in the visual editor you need to press the Enter key. WordPress will add a new paragraph with double line spacing.

If you would like to just enter a line break with single line spacing, then you need to press Shift + Enter keys together.

3. Rearrange the post editor screen

The post editor screen in WordPress doesn’t just house a post editor. It houses other sections that control many other things, such as categories, author and custom fields. When writing posts, you probably only use a few sections; the rest can be somewhat redundant. Luckily WordPress lets you ‘show’ and ‘hide’ items from the post edit screen. You can even rearrange the layout to suit your preference.

A clutter-free writing area helps focus your efforts, so do what you need to do to make the editor more conducive to how you work.

4. Use distraction free or full screen writing mode


There are times when you need to focus purely on writing without any user interface distractions. WordPress comes with a full screen writing mode which offers distraction free editing. Just click on the ‘fullscreen’ button and your post edit screen will de-clutter itself automatically.

5. Change the font size in WordPress visual editor

Typography plays a very important role in web content. Use different font sizes to grab your audience’s attention or highlight different sections in a lengthy article.

By default, the WordPress visual editor allows you to change text between paragraph and heading styles, each with its own font size.


6. Switch to the HTML editor

While the visual editor is great for ease of use, keyboard shortcuts, and visual appearance, don’t underestimate the power of the text editor. Sometimes, when you are struggling with image alignment or the editor won’t let you start a new line after adding an image, switching to the HTML editor is the best solution. Just click on the ‘Text’ tab on the editor and you can easily manipulate your content to force the layout you desire.
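For instance – a sketch using WordPress’s standard alignment classes and an invented image – the Text tab lets you clear a stubborn float directly:

```html
<!-- An image floated left with WordPress's alignleft class -->
<img class="alignleft" src="team-photo.jpg" alt="The team" width="300" height="200">
<!-- Clearing the float forces the next paragraph to start below the image -->
<div style="clear: both;"></div>
<p>This paragraph now begins on a new line beneath the image.</p>
```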


More info

If you would like any more info about WordPress, SEO, search or digital marketing in general email me at firas@candidsky.com or reach out on LinkedIn.

How user behaviour affects organic search

SEO is about building links and writing content. Right?

Page titles and descriptions are used to comply with search engine guidelines and help a page rank. At most they need to be optimised for keywords and, even then, you have more important things to do. Get your primary keyword in there and move on to the next thing on your to-do list. Right?

Wrong. This data is much more important than you think because it affects how your site appears in the Search Engine Results Page (SERPs). Until now you’ve been thinking about how search engines interact with your web pages. What you should be thinking about is how users interact with your listings.

Why? Because how people interact with the search engine results page will affect your site’s positioning in search results.

You might have experienced personalised search, where your results are tailored to your historical behaviour. You might even know how important the appearance of your results is to get as many clicks as possible. What you probably don’t know is that how you interact with the SERP affects how it is displayed for others or – more importantly – how searchers interact with listings will affect results for other users.

I’m talking about two key metrics that Google uses to optimise the results page:

  • Click-through rate
  • Dwell time

Read on to find out how these user behaviours affect results, and how you can harness them to improve rankings without painstakingly producing content or investing time in acquiring new links.

But first, let’s become a search engine.

As a search engine you aim to provide the best possible answer for searchers. That means providing the best result in the highest position for a given search.

Pretend you’re Google. A searcher comes to you and asks you to provide 10 resources about “tying shoelaces”. You can use the data you have to find out which 10 websites receive the most links, and perhaps which have the best content about tying shoelaces. Now you’ve got a list of 10 sites that all appear to provide the answer, but which is the best? Furthermore, does the searcher even want to know how to tie shoelaces? Maybe they are trying to find out when shoelaces were invented? Or perhaps the dangers of tying shoelaces on the move?

If you take your list of 10 sites about shoelace tying and show them to 100 people who want to know about “tying shoelaces” surely they can answer the question for you? But how can you measure it?

Click-through rate

You’ve probably heard of click-through rate (CTR) before, especially in advertising. It represents how many people who saw a search result actually clicked it. Until recently it’s been an informative metric; it tells us how enticing your listing was compared with others on the page. No more. Now it has impact. Real impact.

If 90 of your “test subjects” click the listing in position 3, that tells you that the site in position 3 has the best promise of answering your searcher’s question. Maybe the site in position 2 was actually about the history of shoelaces. Perhaps the site in position 4 was about the benefits of Velcro. By looking at which results your participants click they are telling you which they think is the best result. You’d be crazy not to bump this up to position 1, right?

But what if promises are all that listing offered? By monitoring click-through rate you can see which listing was the most enticing, but how can you tell whether the page fulfilled its promise?

Dwell time


The answer is simple. If you’re out shopping for a new pair of shoes and accidentally walk into a hat shop what do you do? You turn around and walk back out (unless you’re a sucker for a good bonnet).

So can we use the number of people who walk out of the shop as a metric? Not quite. Customers who bought a hat will eventually walk out too. What we need to look at instead is how long people spent in the shop. Anyone who left immediately wasn’t interested in what the shop had to offer. Anyone who was interested stuck around a bit longer. Perhaps they had a browse. Maybe they even bought a nice flat cap.

This is known as dwell time; how long a user stayed on the website in question before returning to the results page (or the high-street in this metaphor).

By looking at (a) how many people click a certain result, and (b) how long they stayed on that site, search engines can optimise their listings to provide exactly what their customers are looking for. This becomes even more important as we use more colloquial (conversational) searches, or perhaps voice search.

Search engines aren’t people. They might not know exactly what you mean. But, by looking at how people interact with the options they provide they can crowdsource the understanding process.

How can you use this newly acquired knowledge? Simple.

  • Get more people to click your listing by making it as enticing and relevant to the search as possible. Try offers, promotions, promises and seduction, or some of these tips.
  • Encourage them to stay on your site longer. Provide further reading, a game, other resources, links to things they might be interested in. Maybe one of those pop-ups that stops you going back (please don’t do that).

As search engines progress and search queries change over time it’s certain that Google, in particular, will pay increasing attention to how people interact with search results. Don’t wait for your competitors to learn about CTR and Dwell time; get started now and give yourself a competitive advantage.

Get in touch

Connect with me on LinkedIn or Twitter, or learn more about our search engine optimisation services here.

Google’s Algorithms: What are they?

Whether you’re new to SEO or long enough in the tooth to have mastered its every nook and cranny, there’s a search phenomenon spearheaded by Google that you should always keep up to date with.

These are developments and updates of search algorithms, a calculated step-by-step procedure used by Google to ensure that results returned by specific search queries are as fresh and relevant as possible.

Google Search Algorithms

Google’s commitment to returning only the most relevant and helpful results in search engine results has driven them to program and implement search algorithms. These are used to identify websites that deserve a slot on its coveted first page.

As far as search engines are concerned, the algorithms carried out by Google identify and select the top results according to specific ranking factors.

Search algorithms are programmed, run, and tweaked by engineers. Those that pass a series of tests are hooked up to search spiders/bots. The bots trace and follow the links found on every website, rating their relevancy and usefulness to the user.

So far, the most prolific search algorithms launched by Google are Panda, Penguin, and their latest iteration, Hummingbird.

Google Panda

Rolled out in 2011 and still very much alive today, Google Panda targets sites that publish low quality content and deals with them accordingly. This signalled the beginning of the notion “content is king” as it played such a central role in Panda’s analysis and evaluation of a website. On its initial release, Panda searched Google’s index for sites that had duplicate copy, advertising spam, and techniques such as black-hat SEO.

The point came where it wasn’t enough to simply rewrite “poor content”. SEOs had to make sure that all content was not just original, but contributed tangible value to a user’s search.

Google Penguin

After identifying and dealing with websites with poor content, it was Google’s time to focus its energy on other questionable SEO practices.

In 2012, the search giant rolled out Google Penguin to regulate websites that were considered spammy and/or overly-optimised. The Penguin algorithm targeted websites that utilised keyword stuffing, and again those still practising black-hat SEO.

Furthermore, websites that generated high quality links ethically were given a positive performance reward.

Penguin 1.0 and its first series of updates focused on sites that made use of manipulated links to generate traffic, homing in on websites with links irrelevant to their industry/sector. Recently, Google refreshed Penguin and started digging deeper into websites for spam and dubious link building tactics, such as buying links to attract natural traffic.

Google Hummingbird

Taking flight just in time for Google’s 15th anniversary, Hummingbird aimed to be the most precise algorithm ever released.

One of the most significant features of Hummingbird was that Google refreshed their search engine, as well as their usual update of its index. It’s important to note they also retained their existing search algorithms, Panda and Penguin.

Said to be affecting around 90% of all searches made, Google’s Hummingbird aims to deliver results differently. For instance, by using Google Voice, you can look up the best cinema to watch movies in. Hummingbird ensures Google understands you are looking for ‘the best cinema to watch movies in’ rather than a more generic search for ‘movies’. If you key in another query, Hummingbird will associate this question with the previous one making searches more intuitive and intelligent. 

Conversational search is at the core of Hummingbird’s algorithm change. In the last couple of years, we’ve seen the positive impact of the Panda, Penguin and Hummingbird algorithm changes on SEO. A primary focus of Google’s search objective is to improve a user’s experience by helping them find the correct information as quickly as possible.

These days, people search the web in a conversational way. No matter what your sector is there are conversational keywords that will allow a user to find information on your product or service quickly. Site owners and content writers need to align keywords and content to best match the way people talk and therefore search for information.

Final thoughts

These updates, especially Panda, reinforce Google’s commitment to its users. Although there is still a lot of work to do to perfect search results for a specific query, the results on the first page of Google are far more accurate, relevant and informative with the power of its updated algorithms than they have ever been.

Google, unless otherwise stated, doesn’t take down algorithms. They keep on updating and improving them to ensure the delivery of the most relevant answers to users’ questions, in the fastest time possible. As such, marketers can’t afford to be complacent, even if they’re currently benefiting from high rankings and generous traffic.

If you would like to find out more about search engine algorithms and SEO, email firas@candidsky.com or call us on 0161 956 8963

 

The SEO’s Guide to Website Migration: 2019 Edition


Note: This guide was originally written in 2015, but it is updated every year to reflect relevant new trends and changes in the industry.

This guide is your website migration checklist; your map to ensuring your migration has the lowest possible impact on your search rankings.

Website migrations typically leave site owners wanting to know the potential impact in advance. This is understandable because, where SEO is concerned, website migration projects rarely go to plan. Expectation management is essential, both in anticipating the impact on your business and in developing a strategy to mitigate downtime.

In my experience each site is unique in its remit and comes with its own set of challenges, creating a need to approach each project with a strategy that aims to tick all the proverbial boxes. Our guide breaks down into three key areas:

  • Site Migration Risks
  • Site Migration Types
  • Site Migration Checklist

Site Migration Risks

Take search rankings out of the equation and a site migration should be an exciting time for any company. The development of a new website or asset that aims to provide your customers with an improved browsing experience is one we frequently encourage, particularly if it provides a responsive solution. From an SEO perspective, however, a site migration can be very risky and potentially comes at the expense of lost traffic and revenue if not managed properly. Following are some of the most common risks involved with the relaunch of a website.

Domain transfer 

One of the riskier forms of migration, a Domain Transfer sees ranking potential and rankings gradually transferred across to the new website as redirects are detected. We have found this process to take 4-12 weeks for full completion, depending upon the size of the site. 

Structural changes

A site’s hierarchy helps to dictate vertical and horizontal linking, and should be considered to protect authority flow through the site. Failure to do so can result in lost rankings and poor communication with bots.

Doorway pages are but one consideration in this. A valuable approach can be to assess which pages on your existing site provide the most traffic and conversions, before ensuring they are given the same priority on your new site.

Content changes

If current content no longer exists or is radically changed to omit target keywords, rankings may decline.

It is prudent to ensure traffic-providing content on your current site is given the same or improved exposure on your new site. Similarly, is it possible your existing site suffers from cannibalisation or doorway pages? Can you make similar pages more succinct by combining them into a single asset? Are there opportunities for improvement?

URL restructuring (same domain)

A URL restructure can lead to both temporary and permanently lost rankings.

Temporary drops in ranking can be attributed to search engines recrawling and reindexing your site; provided a robust redirect process is in place, downtime can be mitigated or reduced to 1-4 weeks (depending on the size of the site).

Permanent drops in ranking could occur if the URL structure is not as SEO friendly as your previous structure. This can include the addition of files, the removal of keywords, the lengthening of the overall URL, or a poor 301 redirect process.

Site Migration Types

There are a number of types of migration, each of which comes with its own considerations and requirements.

Existing Domain Migration

An existing domain migration is one in which your root domain remains the same. This often includes a redesign, CMS update, or URL/hierarchy restructure. Of the three types of migration available an existing domain migration is one that should have the least impact on search performance (the expected migration time is 1-14 days depending upon site size), but can be dangerous if not carried out correctly.

Risks include technical changes, content alterations, URL restructuring, and hierarchy changes.

Time to migrate: 2-4 weeks

New Domain Migration

A new domain migration is one in which your domain changes. Perhaps you are moving to a new TLD (.com to .org), or combining a number of sites into a single asset? Either way, a new domain migration comes with the highest risk of any migration type – not least as the anticipated search impact could last up to 3 months – and must be handled with care.

This is due in part to the passing of link equity from your previous domain, as highlighted in the image below.

link equity flow with a new domain migration

Risks include an increased time to crawl and reindex the site, technical changes, content alterations, URL restructuring, and hierarchy changes.

Time to Migrate: 4-12 weeks

This is highlighted further in the following images, which show impressions data for a live client over the 90-day period after migration (domain size: circa 200 pages).

.co.uk visibility over time

The new version of the site witnessed a gradual increase in impressions, eventually matching the prior visibility of the old site 8 weeks after migration.

.org visibility over time

The old site experienced a rapid decline in impressions immediately following migration and still receives some visibility 90 days later. This created a period of around 6 weeks in which total domain visibility for the brand was reduced.

website migration visibility over 90 days

HTTP – HTTPS Migration

HTTPS migration is becoming a frequent culprit in reduced search rankings, not least because many site owners carry one out without realising it is a migration at all. An HTTPS migration is one in which your website remains the same, but has an SSL certificate applied to it. This in effect changes every URL on your website from one beginning with HTTP to one beginning with HTTPS. The difference is subtle, but the impact significant.

Google has stated that making the switch to HTTPS can positively influence your search rankings, giving many webmasters enough motivation to make the switch. At the core of this transition, however, is a complete URL restructure – one that can have a significant impact on rankings (as discovered by Moz).

Update: considerations for HTTP-HTTPS migrations in 2018 include:

  • Migrations appear to be occurring much quicker than the timings initially observed in 2015, with one particular example being remapped within a few days. The time for a complete migration will still vary based on the size of the domain, available crawl budget, etc.
  • As more websites commit to the HTTPS protocol, Google is able to give it more weight as a ranking signal. Last year, Google’s Gary Illyes stated that 34% of Google’s indexed results are HTTPS, allowing them to dial up its weighting in the algorithm. This could provide additional motivation for webmasters to make the switch.
  • Google’s John Mueller shared a Google+ post which answers 13 frequently asked HTTP-HTTPS questions.

Risks include URL restructuring.

Time to migrate: 1-2 weeks

Moz’s migration to HTTPS

Combining Domains

Combining domains is the process of consolidating your assets in a single location. This may include the migration of multiple domains to a completely new one, or the migration of smaller domains to your main website. As previously discussed, porting to a new domain will result in a longer migration, and appropriate consideration should be given to the impact on your business. You may also want to reconsider migrating some websites: Google has been prolific with webspam penalties over the past few years, and caution must be taken if redirecting a site with a poor backlink profile.

Benefits include consolidated Domain Authority, which could lead to improved rankings. Risks include penalties if combining domains with a poor link profile.

Time to migrate: 2-4 weeks

Site Migration Checklist

  • Rank Monitoring
  • Technical & Content Audit
  • Redirect file
  • GA Audit
  • Offsite Asset Audit
  • Sitemap Management

Rank Monitoring

Rank monitoring is an essential initial step in the website migration process, as it gives you visibility of any loss or gain in search exposure. It is very likely that many hundreds of keywords provide your site with traffic every month, and whilst monitoring all of them is not essential, tracking top-performing or high-volume keywords can provide confidence that your campaign is moving in the right direction.

I would advise monitoring keyword rankings for a number of weeks prior to migration. This provides insight on ranking trends and fluctuations, allowing you to establish which keywords are consistent contributors to your site’s exposure. It’s a good idea to set these as priority keywords to provide quick access to their performance.

You should be monitoring a number of metrics;

  • Keyword
  • Ranking position
  • Ranking URL[s]

By monitoring keywords in this way you are able to assess which keywords have been positively or negatively affected by your migration, providing insight on where opportunities for optimisation may lie. Ultimately, monitoring your keyword rankings affords confidence that your migration has either been a success or where it needs remediation.

Selecting Keywords to track

We typically utilise a number of tools to pull data on suitable keywords, including Google Analytics and Webmaster Tools (now Search Console). You may also wish to use the AdWords Keyword Planner for more inspiration on finding relevant keywords, combining all potential keywords before entering them into your rank tracking software.

Tracking Tools

You will likely need to subscribe to a suitable tool to assist you in monitoring rankings. There are a number of options you may consider. CandidSky uses its proprietary RankTracker, but good options include AWR (Advanced Web Ranking) and SEO PowerSuite’s Rank Tracker, both of which come with instructions on how to manage your campaign.

Technical & Content Audit

The best migration procedure in the world is fruitless if the site being migrated to is of a lesser technical quality than your existing asset. To better separate correlation from causation, we must also assess technical and content components that may contribute to a decrease in rankings.

Before launching your new site it is advisable to host your new assets on a test server to provide analysis and information on potential pitfalls.

**Top Tip** When hosting a test site ensure you have a password set on your test domain and robots set to “noindex” to prevent search engines from indexing your content prior to launch.
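
As a minimal sketch of that pre-launch check – the staging URL here is a hypothetical placeholder, and the body check is deliberately naive – the following Python script confirms a test server either demands a password or serves a noindex directive:

import urllib.request
import urllib.error

def check_staging_protection(url):
    # Confirm a staging site is hidden from search engines before launch
    req = urllib.request.Request(url, headers={"User-Agent": "migration-check"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="ignore").lower()
            header = resp.headers.get("X-Robots-Tag", "").lower()
            # attribute order in meta tags varies, so we just look for the keyword
            if "noindex" in header or "noindex" in body:
                print(f"OK: {url} serves a noindex directive")
            else:
                print(f"WARNING: {url} is publicly crawlable with no noindex")
    except urllib.error.HTTPError as e:
        if e.code in (401, 403):
            print(f"OK: {url} is password protected (HTTP {e.code})")
        else:
            print(f"HTTP {e.code} returned for {url}")

check_staging_protection("https://staging.example.com/")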

Site speed

Google has said relatively little about precisely how a site’s load speed affects its search rankings. The evidence that this metric contributes to a site’s organic visibility is, however, overwhelming – to the point where Google provides developer tools to assist webmasters in identifying page load speed opportunities.

Though not technically a necessary part of a migration, moving to a site that is significantly slower than your current asset could limit your recovery.

  • Test your current site’s page load speed (Pre-Launch)
  • Test your new site’s page load speed on a test server (Pre-Launch) – a quick timing sketch follows this list
  • Ensure your new site is quicker to load than the site it replaces (Pre-Launch)
  • Test your new site’s page load speed (Post-Launch)
  • Ideally your page load speed should be less than 5 seconds, with 2 being a good target
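
For a rough comparison, the sketch below times how long each server takes to return its raw HTML. The URLs are hypothetical, and this measures only the HTML download (not scripts, CSS, images, or rendering), so treat the result as a floor rather than a true page load figure:

import time
import urllib.request

def average_html_time(url, runs=3):
    # Time the raw HTML download only; assets and rendering are excluded
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as resp:
            resp.read()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

current = average_html_time("https://www.example.com/")     # existing site
candidate = average_html_time("https://test.example.com/")  # test server
print(f"Current site: {current:.2f}s, new site: {candidate:.2f}s")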

Site Hierarchy

Hierarchy is seemingly becoming a more significant contributing factor in recent years, particularly as cannibalisation increases in line with Google’s ever-growing index.

Crucially, hierarchy communicates the priority and importance of landing pages in line with potential counterparts, and dictates which should rank higher in search. This can be communicated to bots in a number of ways, from navigation linking (vertical) to your internal link structure (horizontal).

Hierarchy Considerations

  • Does your navigation allow access to top service pages or product categories?
  • Does cannibalisation occur within your hierarchy?
  • Are product pages available within a maximum of 3 clicks?
  • Is your internal linking strategy designed to allow authority flow?

Duplicate content management

Duplicate content is a major factor in search visibility. So much so that it warranted a post of its own. Read the full article here.

Content Management

Your website currently ranks where it does based upon many metrics, from Domain Authority (the management of which is considered in our redirect process), through to page-level data like copy, page titles, headers, alt tags, etc. Of all the elements that could go wrong through your site migration, content management is likely the area that could pose the largest impact if dealt with incorrectly.

To this end, we aren’t looking to optimise your website to rank better – at least not at this stage. Migration should be considered damage limitation; you may well want to come back post-launch and further optimise your content to rank better for desired keywords, but for now the aim is to ensure your high-ranking pages remain in a form similar to their current one. Considerations may include;

  • The migration of comments (particularly if they add value to the original piece).
  • Page titles and header tags (H1s, H2s, etc)
  • Internal links should be updated to their new location (see the following Broken Links section for more info).

Broken Links

We have already spoken of the importance of internal linking in communicating hierarchy, so it is surprising that, in the excitement of launching a new site, so many webmasters neglect to update the URLs their links point to.

The situation? A user visits your site and clicks a contextual (in content) link to a page you have told them will aid them.

The issue? The link still points to the old page, leading them to a 404 error, and probably a small cup of frustration.

The solution? Many assume the best solution would be reliance on 301 redirects. This would, after all, redirect a user to the desired content. The drawback is the link authority lost through the redirect. The best solution, therefore, is a database lookup to change all contextual links to their new location.

The process? This actually has considerations for both pre- and post-launch. Pre-launch, it is beneficial to use the redirects file (discussed below) to identify donor and redirect pages, using a database lookup function to update them en masse. The pre-launch redirect procedure will never catch every URL, so this process should also be followed post-launch with any freshly identified 404 errors.
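
As a minimal sketch of that lookup – assuming the redirect file is a two-column CSV of old and new URLs, and that you can export and re-import your page HTML (the file names here are hypothetical) – the substitution could be scripted like this:

import csv

def load_redirect_map(path):
    # Column 1: old (donor) URL, column 2: its new location
    with open(path, newline="") as f:
        return {old: new for old, new in csv.reader(f)}

def rewrite_links(html, redirect_map):
    # Point contextual links straight at their new location,
    # avoiding the link authority lost through a 301 hop
    for old_url, new_url in redirect_map.items():
        html = html.replace(f'href="{old_url}"', f'href="{new_url}"')
    return html

redirects = load_redirect_map("redirect_map.csv")
with open("page.html") as f:
    print(rewrite_links(f.read(), redirects))

On a MySQL-backed CMS, the equivalent is a single UPDATE … REPLACE() query per donor URL against the content table.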

Redirect file

The redirect process is possibly the most crucial element of a site migration and is effectively responsible for ensuring search engines know where your content has moved to.

Redirects are crucial in:

  • Ensuring link equity is passed to the new site/page
  • Ensuring Bots understand the nature of the redirect (permanent vs temporary)
  • Allowing for deeper crawling of a site following launch.

There are three types of redirect we consider when migrating a domain;

Page to page redirects

Page to page redirects are crucial if your site pulls traffic from a wide and diverse collection of URLs. This may include redirecting product pages to their counterparts on the new site, category pages to their new locations, and informational content to the most appropriate page. Considerations for this must also include links to each page, to ensure anchors are managed and potential cannibalisation contained.

Page to parent redirects

The launch of a new site can offer an opportunity to clean up your product offering. With this you may opt to redirect expired products, or even those that do not pull traffic, to a parent or category page. In addition to decreasing the number of pages involved in your recrawl, this approach can also manage authority flow to pages with better ranking potential.

Page to root redirects

A page to root redirect is one in which all pages of a site are redirected to a single location, usually the root, or homepage. There are instances in which page to page redirects offer too little benefit for the effort involved, making a page to root redirect the most practical solution – for example, when the aim is simply to pass domain authority while combining sites.

Our Very Thorough Page to Page Redirect process

  • Pull a full list of available URLs for your existing site with a crawl agent. My preference is with Screaming Frog, but alternatives include Xenu, Integrity, and Website Auditor.
  • Import this data into Excel and use some wizardry to remove URLs with parameters, and duplicates.
  • Run this list through an HTTP response checker (such as httpstatus.io, or Scrapebox) to provide a complete list of status codes. Move any 404 pages to a new tab, keeping any 301 and 200 status pages.
  • Now run a crawl on the test server to source a list of the new site’s URL structure, before using a VLOOKUP function to identify which URLs are changing, and which are remaining the same.
  • Take your list of expiring/donor URLs (those which will have a new file path) and manually map their new location on the new site (you may be able to use footprints to automate some).
  • We now have a complete list of expiring URLs and their new location. The next step will be determining which of the already expired pages (those with a 404 status code) have value, and should be redirected.
  • Take your 404 pages and run them through a link-monitoring tool (options include Ahrefs, Majestic, and OSE – it’s advisable to use them all to ensure thoroughness). This will provide insight on which 404 pages should be redirected to a suitable page. Any 404 pages that have no inbound links can be discarded.
  • Import the suitable 404 pages into your redirect list, and add a suitable redirect to a page that would benefit from the link equity and anchor.
  • With your donor URLs and redirect URLs complete, use the concatenate function in Excel to generate the required redirect code for your server (a scripted alternative is sketched below).
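
If you’d rather script that final step than concatenate in Excel, here’s a minimal sketch – assuming an Apache server, and a hypothetical redirect_map.csv holding the donor path in column 1 and the destination URL in column 2:

import csv

# Each row becomes one Apache directive: Redirect 301 /old-path https://...
with open("redirect_map.csv", newline="") as src, \
     open("redirects.conf", "w") as out:
    for old_path, new_url in csv.reader(src):
        out.write(f"Redirect 301 {old_path} {new_url}\n")

The generated directives can be included in your .htaccess or server config; on Nginx the equivalent would be a set of return 301 rules.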

GA Audit

An often-overlooked element is the migration/updating of Google Analytics, the neglect of which ultimately limits your ability to effectively track site usage through the migration. A complete GA audit will be covered in a future post, but below we’ve shared the main considerations for a migration.

**Top Tip** I tend to advise updating the name on the existing GA account as opposed to creating a new one to work with, as this allows for direct comparisons over time.

Considerations include

  • Updating of Goals
  • Updating of Ecommerce
  • Ensure tracking codes are on the new site
  • It is also beneficial to create a property set in Search Console to assist in monitoring the exposure of combined sites through the migration period and beyond. Learn more here.

Offsite Asset Audit

There are two elements to an offsite asset audit. These centre around either assets over which you have direct control, or assets owned by third parties. An audit of each of these areas can ensure your inbound link equity remains as high as possible.

Owned Assets

Your business likely owns a number of offsite assets, potentially including Google My Business pages, Facebook pages, Twitter accounts… the list goes on. These are pages you can log in to yourself and update for data congruency. Where possible it is advisable to update any owned assets with the details of your new domain, to ensure any authority from them passes straight over, and not through a redirect.

Third Party Assets

There are also likely a ton of third-party assets linking to your domain. Whilst you could certainly use a link tracking tool such as Ahrefs, Majestic, or Moz/OSE to find links, it may not be practical or possible to locate and contact the owner of each to request updates (though you would be rewarded with more link equity if you chose to do so).

One update we would advise, however, is to any directory listings you may have submitted data to. As Google aims to collate information across citations, data congruency is an important element of local SEO, and having different domains listed across profiles could lead to inconsistencies and a loss of trust. It is advisable to update any existing profiles as opposed to creating new ones.

Sitemap Management

The element most responsible for a delay in the updating of search rankings following migration is the rate at which search engines crawl and re-index the site. To this end it is recommended that multiple sitemaps be uploaded to your domain, to ensure crawlers have prompt access to as many URLs as possible (a minimal generation sketch follows the list below).

Recommended Sitemaps;

  • Expired URL sitemap (providing access to removed URLs to promote the crawling of redirects)
  • New URL Sitemap (providing access to new URLs to promote the crawling of the new hierarchy)
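
As a minimal sketch of generating those two files – the URL lists here are hypothetical placeholders; in practice the expired list comes from your redirect file’s donor column, and the new list from a crawl of the launched site:

from xml.sax.saxutils import escape

def write_sitemap(urls, filename):
    # Write a minimal XML sitemap for a list of absolute URLs
    with open(filename, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

expired_urls = ["https://example.com/old-page"]  # donor URLs from the redirect file
new_urls = ["https://example.com/new-page"]      # URLs crawled from the new site

write_sitemap(expired_urls, "sitemap-expired.xml")
write_sitemap(new_urls, "sitemap-new.xml")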

Sitemap Procedure

  • Ensure both maps are accessible on the migrated site
  • Submit both to GWT
  • Fetch the URL of each within GWT, requesting ‘All’ linked pages are crawled.

Conclusion

Far from being a simple task, a website migration managed correctly can result in a positive lift in search exposure once the new site has been re-crawled and re-indexed. By following the pre- and post-launch procedures you can shield your website from potential negative impact of migration, paving the way to a stronger future for your website and business.

Should you require further information on website migrations, or would like to hire us for a migration project, please either comment below or contact our search team.

Pre-launch Process

  • Set up rank monitoring
  • Assess the new site for content and technical compliance
  • Prepare a 301 redirect file
  • Take a copy of your site’s existing sitemap

Post-launch Process

  • Immediately following launch, run a site: search to pull a complete list of all indexed URLs. Run these URLs through an HTTP response checker (see the sketch after this list) and clean up any missing or incorrect redirects.
  • Use GWT to determine any missed redirects and add to the original redirect file using the process above.
  • Update Google Analytics Tracking/Goals etc
  • Request a crawl of your old sitemap
  • Request a crawl of your new sitemap
  • Confirm site migration in GWT (external domain migration only)
  • Create GWT property set to allow assessment of combined asset visibility
  • Update offsite assets
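
A minimal sketch of that response check – the URLs are hypothetical, and the status logic deliberately simple (anything other than a 200 on the new structure or a 301 on the old one deserves a look):

import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface 301/302 codes instead of following them

def status_of(url):
    # Return the HTTP status code for a URL without following redirects
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

for url in ["https://example.com/old-page", "https://example.com/new-page"]:
    code = status_of(url)
    flag = "" if code in (200, 301) else "  <- needs attention"
    print(f"{code}  {url}{flag}")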

For more details on our approach to website migration, or to share any questions or feedback, please contact us on our social channels, or at info@candidsky.com. Alternatively, contact me directly at anthony@candidsky.com.

Breadcrumbs: Remove Anxiety From Your Customers’ Website Experience

BREADCRUMBS – VITAL NAVIGATION AID, OR UNNECESSARY GIMMICK?

Often, people say there’s no point to breadcrumbs and we don’t need them. I disagree: I think they improve user experience and reduce anxiety.

In this article I’ll be focussing on the following areas:

  • What are breadcrumbs?
  • Breadcrumbs In The Real World: TK Maxx Vs Topman
  • Top 4 reasons why we need Breadcrumbs
  • 6 things to remember when using breadcrumbs

What are Breadcrumbs?

Breadcrumbs are a form of secondary navigation. Their primary function is to help customers understand where they are on a website and easily jump to a previous page in the hierarchy. Essentially, if a customer reaches a page they don’t want to be on, they can easily find their way back. Customers may use the ‘back’ button in their browser, which is fine; however, breadcrumbs provide an additional option.

Breadcrumbs in the real world: TK Maxx vs Topman

TK Maxx – User Experience

Recently I went shopping at TK Maxx in Manchester. I was in a rush and had 10 minutes to find a t-shirt before I was due to meet some friends for lunch.

Around 15 minutes had passed when I eventually found what I was looking for. I proceeded to the checkout and paid; however, on trying to exit the shop I completely lost my bearings, as there were no obvious exit signs.

It wasn’t immediately apparent at the time, but without those signs I felt a little anxious and frustrated; after all I was now in serious danger of not being on time to meet my friends.

I had to consciously think about where I needed to go to exit the shop rather than being guided out of the shop. Maybe this is a clever tactic to encourage customers to browse for longer, but to me it left a negative association with the brand.

Topman – User Experience

Compare this to my experience at Topman one week later. Again I was under similar time constraints.

I found the item I was searching for, paid and left the shop without even thinking about it. Everything was well signposted and the shop had a systematic flow, leaving a positive association with the brand.

In site navigation, breadcrumbs play the same role as the signposts in Topman, eradicating that feeling of anxiety and frustration.

Despite this, breadcrumbs are often omitted from sites. A common sticking point is that website designers aren’t sure if it’s worth the effort as they can take time to implement properly and it’s perceived that they only serve to clutter the page.

Top 4 reasons why we need Breadcrumbs

Breadcrumbs help customers establish where they are located on a website, and they allow customers to read information quickly. Below are the four key reasons why breadcrumbs can benefit your website and, more importantly, your customer.

  1. Reduces customer anxiety

Breadcrumbs can reduce customer doubt and anxiety about what they can expect, aiding a positive brand experience. When you achieve this, a customer will affiliate your brand with a friendly, usable and pleasant experience.

  2. Fewer customers leaving the website

If a customer reaches a product page that they’re not interested in, they will either leave or go back to the category page. Breadcrumbs encourage the customer to start again, rather than leaving your website altogether. Breadcrumbs can also be used internally, such as on an intranet, where they can significantly increase productivity and time-efficiency.

  3. Impacts SEO rankings

From an SEO point of view, breadcrumbs are considered best practice. Placing them high up on the page helps search engines crawl the site.

From a usability point of view, customers can navigate a breadcrumb instead of hitting the back button. Google is moving towards usability metrics, which can aid a website’s ranking in SERPs. Having been around for decades, breadcrumbs show no signs of going anywhere: Google is replacing the URL within its search results with the site name and breadcrumb navigation path.

  4. Can you think of a good reason not to use them?

I research usability topics on a daily basis, and not once have I been convinced against using them. Here’s my rationale:

  • Breadcrumbs never cause any problems when conducting user testing.
  • They don’t take up much real estate on a page.
  • Breadcrumbs have, by and large, not changed over many years, so customers are not distracted by them. In fact, their presence can signal that a website is systematic and well set up.

6 things to remember when using breadcrumbs

Commonly, breadcrumbs appear in a horizontal line showing the trail from the highest-level page (home) to the current page the customer is on. Below are my recommendations for best practice, with help from some industry leaders:

  1. Show the full customer journey to provide context to their location
  2. Remember to use your home page to anchor a breadcrumb
  3. Breadcrumbs are best located below the navigation and above the page title
  4. Clearly show where the customer currently is on the website
  5. Do not use breadcrumbs on the homepage, as that’s always considered the starting point
  6. Ensure your breadcrumbs follow SEO best practice guidelines (a markup sketch follows this list)
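
On that last point, search engines can be told about your breadcrumb trail explicitly using schema.org BreadcrumbList markup. As a minimal sketch – the trail itself is hypothetical – this builds the JSON-LD you would embed in a <script type="application/ld+json"> tag:

import json

def breadcrumb_jsonld(trail):
    # Build schema.org BreadcrumbList markup from (name, url) pairs
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ]
    }, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Mens", "https://example.com/mens/"),
    ("Shoes", "https://example.com/mens/shoes/"),
]))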

What’s your opinion?

Breadcrumbs have never found the spotlight due to their permanent status of ‘secondary to the navigation bar’. However, they can be a powerful element to improve user experience. Breadcrumbs aid customer acquisition (SEO) and customer usability (CRO).

Should you have breadcrumbs on your website? For the majority of websites, yes. They do no harm, and I believe that if you can ease the customer journey a little, you should. If you walk into a shop and an assistant smiles, that is a lot nicer than a frown. Ultimately, this won’t affect your purchase decision, but it contributes to a pleasant experience.

I’d love to hear your opinion – find me on Twitter or LinkedIn.

SEO and Content Marketing: Similar but different


Since the growth of content marketing, there’s been plenty of talk about SEO and how it’s been “replaced”. But has it?

It is true that strong expertise in content marketing is proving invaluable, but that doesn’t mean SEO has been rendered obsolete. On the contrary – the two disciplines should work together, and only when they are in harmony can they be truly effective.

Key Differences Between Content Marketing and SEO

SEO and content marketing overlap constantly, yet there are some differences:

  • Aspects of SEO are more technical. This includes basic elements such as correct URLs, title and ALT tags, sitemaps etc; right through to architecture and internal link structure: the stuff that underpins your content marketing strategy.
  • Content marketing is broader and isn’t necessarily confined to SEO goals. For example, a publisher should produce excellent content first and foremost as a way of attracting and retaining an audience.   

How Content and SEO Can Work Together

There is a common misconception that the main purpose of SEO is to create good content for search engines to crawl and index.

Even though there are countless articles online that go on about how “content is king”, there is definitely more to it than that. If the more technical on-site SEO groundwork isn’t laid out properly, your content efforts will be wasted.

For example, if your site has a penalty, or Google isn’t indexing pages properly, then you’re going to need some search engine optimisation expertise, great content or not. This is why SEO experts and content teams need to work together.

SEO strategies change very often, partly because Google changes all the time; what worked a few years ago may now be irrelevant. In the past it was all about keywords and where they were placed on and off your page. Of course, the technicalities of SEO such as keyword research, alt tags, URL structure, meta descriptions, link building, and header tags still matter. But now it’s all about how much value your content provides to people. Google has made changes to ensure that the content ranking on its first page is more valuable and relevant to its users’ search intent.

Creating Original, Quality Content  

While creating a constant flow of thin content for Google’s crawlers to index may have worked a few years ago, it isn’t nearly as effective nowadays. Furthermore, weak content created simply to make up the numbers will not work from a content marketing perspective, since it will become harder to engage with and retain your readers.

On the other hand, from an SEO perspective, creating original and engaging content will set you apart from your competitors, as well as give search engines some fresh content to index that can’t be found elsewhere. This also helps with achieving content marketing goals, since you are more likely to attract a higher calibre of visitor with well-written and targeted content.

A good way of making sure that what you’re writing is suitable is to put yourself in the place of your target audience and ask whether they will genuinely find what they are reading useful. Does it provide good value and useful advice?

Evergreen Content and SEO

Evergreen content is SEO content that remains relevant and continues to be of value to readers over time.

Creating evergreen content is considered a great strategy for a number of SEO performance benchmarks, such as improved search exposure and increased keyword quantity and quality. Having that calibre of useful content also increases the value of your website.

A fresh piece of news or a new article will often rank quite well within Google SERPs while it’s fresh, but will then quickly fade away once there is more pertinent news to take its place.

On the other hand, a more useful and engaging article that contains valuable reader advice and a level of insight will attract a higher level of traffic and user engagement. Those are the standards by which Google assesses how relevant a piece of content is, and such an article is thus more likely to do well over long periods of time.

Content Marketing Tips

Tip #1: Know Your Audience

Whenever you are creating content, you need to be thinking about who your potential audience is.

  • Who do you want to be reading your content?
  • Who would find it useful?
  • Who is it meant for?

By asking yourself these questions, the approach you have to take suddenly becomes clearer. You will then be able to truly understand who your audience is, and be sure that the kind of content you are creating is suitable and effective.

Tip #2: Brainstorm Your Ideas

Content creation tends to take a different approach to an entirely SEO-focused way of doing things. Keyword research – the obvious starting point for SEO projects – comes second to knowing what the content is actually about. The type of content being produced, and the kind of ideas the pages need to focus on, will guide the keyword research process, so that the two work hand in hand.

Tip #3: Making Content Digestible

When looking at how search engines view content, think of them as a child learning how to read for the first time. When writing content that is search engine optimised, the content itself is important, but just as important is the layout.

Just like a child’s attention would be grabbed by imagery, so would a search engine’s. Adding images to your content will help it become easier to digest; make sure to include descriptive alt tags to help search engines understand them.

Breaking up your content with bullet points and lists will also make it easier to read and digest. Creating content that is easier to skim makes it easier to share and promote, which in turn has a positive effect on SEO.

Top Tip – Make sure your keywords are present in your headers, and try to use various synonyms to help with increased exposure throughout your content.

Tip #4: Update and Re-promote Older Content

Sometimes fresh ideas and concepts for content can be thin on the ground, so look at older content that performed well when it was first published for inspiration.

More often than not, older pieces of content just need a refresh in the form of a new headline, image or layout restructure. Using older pieces of content to link back to newer, fresher pieces (and vice versa) can have a positive effect, as internal linking helps to disperse the flow of link juice.

Example of Content Marketing

A recent example shows how great content at the right time can achieve amazing results: the Favrify infographic looking at the effects of the Scottish drink Irn Bru on an Englishman. The idea was taken from a similar infographic on the effects of drinking Coke.

By utilising this idea and putting a different twist on it, this particular piece of content brought in over 100,000 visits to the website, as well as being shared over 7,000 times on Facebook.

Favrify Irn Bru Infographic

Ahrefs links

Links and social shares generated through the infographic

Conclusion

When researched and done correctly, content marketing is an excellent way to build relationships with customers, as well as setting yourself apart from your competitors.

A well implemented content marketing strategy evokes trust and authority in your relevant market or sector, and positions your brand to be the go to resource for your specialty.

The crucial concept behind an effective content marketing plan is to tell stories that people find and will continue to find both useful and relevant.

Keep in mind, though: SEO and content marketing are not the same, nor are they mutually exclusive – you need to combine both for the best results.

Google’s New Maps Box Improves Impact but Decreases Choice

More than any other search engine, Google regularly updates the appearance of its results page. We’ve seen hundreds of different SERP features, including brand search boxes, maps boxes, knowledge graph entries and rich snippets. This is for two main reasons:

  • To improve the quality of results for users
  • To ultimately improve performance of paid channels

Local search, especially on mobiles, is a key component of Google’s success and in the last month a new results page has been rolled out in the UK. This latest version is much closer to the current appearance of local mobile results.

New local SERPs

1. A centered map

With the previous iteration, the map was located in the right-hand column, below any paid search ads that weren’t appearing at the top of the page. You could interpret this as the least valuable area of the search engine results page. Now, though, the map has been pulled into the centre of the page, directly above the local listings. Not only does this look more aesthetically pleasing, it captures more of the user’s attention as they scroll the page.

2. More negative space

Negative (or white) space isn’t wasted space! It emphasises important elements, focusing attention on calls-to-action rather than cluttering the interface.

3. Clearer calls to action

Calls to action have become clearer, more defined, and more clickable. No doubt this will improve the impact of your local listings. It’s reminiscent of the mobile interface which is more optimised for user experience than the previous desktop version was.

plumbers mobile

4. Fewer options

The previous iteration of the maps listing showed anywhere between 4 and 6 listings per map snippet. Now, however, this is reduced to just 3. Great news for brands featuring in the top spots, less so for those lingering in positions 4+.

Why increase impact and reduce choice?

You’ll need to be in Google’s ‘circle of trust’ to be sure, but I suspect this is a precursor to paid local listings. Reducing the options for users increases the value of positioning in the top 3 results, whilst improving the impact of these listings makes paying for them more commercially viable.

Moz’s Dr Pete recently tweeted about new paid local listings appearing in Google’s SERPs:

google-adwords-paid-local

Keen to stay ahead of the game, we contacted Adwords to ask about a possible rollout of new paid local results. Here’s what they said:

SF bay testing

With advertising representing the majority of Google’s revenue, the company is constantly looking for ways to increase the performance of paid channels, and the introduction of paid local listings may provide an invaluable opportunity for local service providers. What we may see are sponsored listings appearing in the top 3 results, whilst free listings are concealed behind the “show more” button at the bottom of the snippet. Of course, we’ll have to wait and see.

Questions? Thoughts? Tweet us @CandidSky.

How Website Speed Optimisation Impacts Marketing Performance

80s style loading illustration

As the digital marketing space becomes increasingly competitive, everyone is looking for techniques that could give them the edge. One aspect of digital marketing is often overlooked: website speed. In this blog, I explore some of the reasons for this common oversight, and the impact it can have on user experience, search engine rankings and, ultimately, revenue.

A Brief History

When the internet was first gaining popularity in the late 80s, limited computer processing power and extremely slow internet speeds meant that website load speed was constantly measured and optimised. Thanks to innovations in technology, we now live in a world of quad-core processors and fibre optic broadband. This, coupled with fierce competition to produce ever more innovative marketing campaigns, means website designers and developers are often forced to trade off website performance against increased aesthetics and functionality.

It’s true that internet speeds are increasing across the board; however, last year Cisco reported that 62% of mobile connections were still limited to 2G speeds (typically loading a web page in around 6-8 seconds). The issue is that people making do with a slow mobile connection can be forced to download the same ‘rich media’ website that wifi users get, but on a connection a fraction of the speed. The result is that load times are excessive, ultimately leading those users to give up and go elsewhere.

What’s the problem?

In the mid 90s, during the ‘age of dial up’, slow was the norm, so internet users expected to have time to make a brew while waiting for their download to finish. If a web page took slightly longer than others to load, it could be forgiven. However today, we are living in the ‘age of instant’, and we don’t like to wait. We have come to expect film streams to begin immediately, photos to appear instantly and downloads to arrive within seconds. Web pages are no different, and it’s now the norm for visitors to become frustrated if a site takes longer than a moment to load.

What’s the impact?

Following a study in 2009, web performance specialists Akamai found that “47% of consumers expect a web page to load in two seconds or less”. They also reported that “shoppers often become distracted when made to wait for a page to load. 14% will begin shopping at another site, and 23% will stop shopping and walk away from their computer”.

Interestingly, the study compared results to a previous study in 2006, where consumers expected a load time of 4 seconds. As technology continues to improve and the bar for internet speeds rises, we expect the impatience trend to continue, with even lower load times expected as the norm.

So, we understand it’s important to manage the downside, but what about the upside?

Back in 2010 Google announced that website speed would, for the first time, be used as a search engine ranking factor. Respected search gurus Moz put the algorithm to the test, reporting that faster back-end performance (e.g. faster servers, databases and application code) does in fact directly impact search engine rankings.

And although no direct ranking impact could be found for faster front-end load times (e.g. more efficient HTML, CSS and JS), they can have an effect too. “A decade of research from usability experts has taught us that faster websites are more enjoyable to use, have more visitors, who visit more pages, for longer periods of time, who come back more often, and who are more likely to purchase products or click ads. Ultimately these happier users are more likely to promote the site through sharing and linking, which directly contributes to better search engine rankings”.

What can be done?

The next blog in this series will give you a flavour of the methods and the tools that we use at CandidSky for on-site optimisation to improve search engine rankings.

Enter your email address here and we’ll let you know when the next blog entry is released. Your email address will never be used for spam or passed on to any 3rd party.

Tips for Improving Organic Click-through Rate

More often than not, SEO focus revolves around positioning and rankings. It’s easy to forget that your objective isn’t to get the highest ranking possible – it’s to capture as many users as possible who are carrying out a search. Ranking improvements are typically the best way to go about this, with click-through rate (CTR) increasing dramatically the higher you’re placed on the page. However, in high-competition niches making that jump can be a huge task, and significant gains in traffic can be made by optimising your listing for users rather than search engines. Here are some tips for improving your click-through rate from a search engine results page.

Get attention with aggregate review ratings

If you’ve spent some time staring at search engine results, you know how much difference a splash of colour (and a superb trust indicator) can make to an organic listing. Much like the now-defunct Authorship snippets, aggregate review ratings help build credibility and attract users to your listing.

Aggregate Review Rating Example

Provide more information on your products

Advertisers love Google Shopping. Why? It provides users with more information about a product before they click, so if it’s not what they are looking for (perhaps it’s too expensive) people don’t click, and advertisers avoid paying to attract someone who ultimately won’t want to buy their product. To an extent, the same is true of product markup, which shows in organic results. Yes, if it’s not what they were looking for perhaps they won’t click the listing, but presenting this data can increase click-through rates from the right people. The best effects are produced when combined with aggregate review ratings.

gluegunsdirect.com snippet example

There are many more types of rich snippets depending on your page type. You can find out more here: Schema.org Guide For Beginners.

Use your meta description wisely

It’s ad copy after all, right? Whilst the page title is a key factor in search engine rankings and shouldn’t be changed without thinking about the potential impact, we should apply typical ad copy methodology to meta descriptions. Use this space to craft a persuasive message; provide detail, describe benefits, and include a call-to-action. Are you running seasonal offers? Do you have a sale on? How about FREE DELIVERY? This is one of the rare exceptions where the use of block capitals can work in your favour to make your listing stand out. You can even use icons to make your description more interesting. In the example below we’re encouraging people to call straight from the SERPs:

CS meta icon

As it’s a big topic, here’s some further reading: How To Write Meta Descriptions for Maximum Clicks

Be careful with your meta titles

Meta titles can be tricky. They are believed to be one of the most important on-page relevance indicators, and are also the primary headline for your ad. Write a headline for users, but ensure your most valuable search terms are included, and remember to Capitalise The First Letter Of Each Word. You should also be very careful not to write titles which are too long or keyword-stuffed, as there’s a good chance the title will be redacted in the SERPs, presenting a somewhat unappealing title to users. Here’s an example from Littlewoods.com…

The version in the SERPs:

Littlewoods beds SERPs

The actual meta title:

Littlewoods beds meta title

The “credit & finance” availability is important to many shoppers but due to redaction searchers aren’t able to see it – no doubt resulting in more clicks for competitors who have been able to show the information.
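Google truncates titles past a pixel limit of roughly 600px – about 60 characters as a rule of thumb. As a minimal sketch (the character threshold is an approximation, and the example titles are illustrative rather than Littlewoods’ real markup), you could flag at-risk titles like this:

def flag_redaction_risk(titles, max_chars=60):
    # ~60 characters approximates Google's ~600px title display limit
    for title in titles:
        status = "OK" if len(title) <= max_chars else f"RISK ({len(title)} chars)"
        print(f"{status}: {title}")

flag_redaction_risk([
    "Beds | Double & King Size Beds | Example.com",
    "Beds | Shop Double, Single & King Size Bed Frames With Credit & Finance | Example.com",
])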

Find the low hanging fruit

To find the most rewarding areas for improvement you’ll want to access search engines’ webmaster tools, such as Google’s Search Console. Access the “Search Analytics” report (shown below) and select clicks, impressions, CTR, and position.

Console serp optimisation

You’re looking for search terms with an average position in the top 10 (anything lower and the returns won’t be high), low CTR, and high impressions. In the example below you can see that the listing for “Tempur Mattress” generated 11,000 impressions, with an average position of 6.8 and a CTR of 2.6% – quite low in relative terms. This is a prime candidate for SERP optimisation to improve CTR.

SERP CTR Example
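
If you export the report to CSV, that filter is easy to script. A minimal sketch – the file name and column headings (query, clicks, impressions, ctr, position) are assumptions about your export format:

import csv

with open("search_analytics.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Top-10 positions with plenty of impressions but a weak CTR
candidates = [
    r for r in rows
    if float(r["position"]) <= 10
    and int(r["impressions"]) >= 1000
    and float(r["ctr"].rstrip("%")) < 3.0
]

for r in sorted(candidates, key=lambda r: int(r["impressions"]), reverse=True):
    print(r["query"], r["impressions"], r["ctr"], r["position"])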

If you’re trying to increase organic traffic to your site use these tips to properly optimise what you already have; it tends to provide a superb return compared with the time taken to make improvements. 

Thoughts? Questions? Tweet @sifryer or @CandidSky. Alternatively, learn more about our SEO services here.

 

An in-depth guide to duplicate content

Two sheep standing together

Albert Einstein - Duplicate Content

Unmanaged duplicate content is, in my opinion, one of the most detrimental search engine optimisation issues for a website, with the potential to significantly impact your SERP rankings and organic performance.

If you’ve been involved in Digital Marketing for a while you’ve most likely heard of “Duplicate Content”, perhaps from internal SEO teams, content marketers or partner agencies. You may have also listened to an explanation and feel you have a basic grasp of what it involves.

Over the past few years I have read, watched and heard a plethora of different explanations of duplicate content; from SEO forums to social media posts, and even blog articles from professional agencies. There are many cases – particularly since 2013 – where sites have been launched with issues which have never been identified, and as a result have never met their potential. I can’t help feeling that a lot of people (including professional optimisers) don’t quite understand what duplicate content is and how it can impact your online presence.

Given the potential impact it’s surprising that there’s so much misinformation about what it is and how to resolve it. In this post I’ll be explaining:

What duplicate content is (and isn’t)

“Your website seems to contain large amounts of duplicate content”

“But we wrote all the content ourselves!?”

The first hurdle to get past is language; more often than not people associate duplicate content with plagiarism. This is not the case.

There are two categories of duplicate content; on-site and off-site. Parallels can be drawn between off-site duplicate content issues and plagiarism, although this isn’t typically a technical issue you can control.

The associated causes, impacts, and solutions for each type are entirely different, and believe me, on-site duplicate content is far worse. It’s this category which I will be addressing in this guide.

According to my definition (I may have read it somewhere, or made it up), “On-site duplicate content” is a technical SEO problem, caused by the way a website is engineered. It occurs when a specific webpage renders at multiple different URLs. It is not content which has been stolen, reused, or taken from other places of the web or your website.

duplicate content google bot

So you know, almost every CMS driven website produces duplicate content – the question is whether or not it’s being managed properly. 

The simplest example is your homepage. A homepage might show up when you type example.com or www.example.com. In this case the same content is being rendered at two different URLs, meaning that one of them is a duplicate.

Now, it’s only a problem if search engines are able to crawl the duplicates. That said, never underestimate a Googlebot’s ability to find stuff. They usually have a helping hand like an incorrectly configured sitemap or CMS link. When Google is sending you over 50% of your online customers it’s worth taking precautions.

I am googlebot

So why worry about it?

Don’t worry, but do be aware of it. Google’s index is based entirely on URLs. When the exact same page renders at 2 different URLs there’s no clear indication as to which is the correct page. As a result, neither page ranks as well as it should.

In addition, back in May 2012, amongst a raft of other updates, Google included harsher penalties for duplicate content as part of their Panda 3.4 update. I was fortunate enough to work on a site at the time that was heavily penalised following the update, and quickly learned how to deal with duplicate content penalties.

It’s worth mentioning at this point that, unlike Penguin’s link penalties, duplicate content penalties  can be removed very quickly by taking the right steps. In my experience you do not need to wait for a Panda refresh. 

Signs of duplicate content

There are a number of instances in which duplicate content can crop up, but it most commonly occurs around the time of a Panda update, following the launch of a new website, or during development changes to a site where management of duplicate content has been implemented incorrectly (or not at all). You’ll see rankings and traffic start to slide, although the impact will depend on the severity of the problem.

If you’ve got a solid grasp of duplicate content you’ll be able to find it by carrying out manual checks on a site, but for a quick spot-check you can carry out a site search in Google (site:yourdomain.com). If you see the following message on the last page of the search results there’s a chance that duplicate content is afoot. You’ll need to investigate further to be certain.

duplicate content warning

How duplicate content occurs

Homepage duplicates

As I mentioned at the start, one of the most common instances of duplicate content on every website is duplication between the www subdomain and the non-www root domain.

For example:

  • www.example.com
  • example.com

Depending on your server, you’ll find that the homepage could also render at:

  • example.com/index.php (Linux servers)
  • www.example.com/index.php (Linux servers)
  • example.com/home.aspx (Windows servers)
  • www.example.com/home.aspx (Windows servers)

This is the simplest, most noticeable instance of duplicate content, and for the most part people are aware of it.

This type of duplication usually occurs throughout a website, so if your site renders at www.example.com and example.com, it probably renders at www.example.com/category and example.com/category too. This means that the duplicates are sitewide, and will have a significant impact on organic performance.

Solutions

  • 301 (permanent) redirect
  • Canonical link element

Sub-folders, sub-categories, and child pages

Most websites use some form of categories and sub-categories to help users find information. Categories are often the most important areas of an ecommerce site, as they intuitively target refined, specific search terms. For example, if I sell Widgets at Widgets.com, and a potential customer wants to buy “Blue Widgets”, more often than not it will be the category page for “Blue Widgets” returned as a result. The same applies to any site which categorises content into sub-folders and child pages.

Let’s say I have the category structure as follows:

example.com/category/sub-category

Here the user has probably navigated to the first category, and then into one of its sub-categories. Many systems will allow this sub-category to render at example.com/sub-category without the parent category included in the URL. This sub-category now renders the same content at multiple URLs; one which includes the parent category, and one which doesn’t.

The same applies to child pages which could render at example.com/category/product and example.com/product. This might occur on a non-ecommerce site as example.com/services/service-name and example.com/service-name.

Solution

  • 301 (permanent) redirect
  • Canonical link element

Pagination

In some cases the contents of a category page may be broken into several pages; 1, 2, and 3, for example. We refer to this as a ‘paginated series’.

Using the previous example, here’s what page 1 will normally look like:

example.com/category

Page 2 might then be accessed at:

example.com/category/?p=2

Precisely how the pagination is reflected in the URL will depend on the setup of the site. In this instance we’re still in the same category, but on the second page. Search engines may well interpret the subsequent pages as duplicates of page 1. 

Solution

  • rel=“next” and rel=“previous” link elements
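
As a minimal sketch of those elements – assuming the ?p=N URL pattern from the example above, and that page 1 lives at the clean category URL – the head links for any page in the series could be generated like so:

def pagination_links(base_url, page, last_page):
    # Emit the rel="prev"/rel="next" head elements for one page in a series
    links = []
    if page > 1:
        prev = base_url if page == 2 else f"{base_url}?p={page - 1}"
        links.append(f'<link rel="prev" href="{prev}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base_url}?p={page + 1}">')
    return "\n".join(links)

# page 2 of 3 points back to the clean category URL and on to page 3
print(pagination_links("https://example.com/category", page=2, last_page=3))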

Parameters

Most websites affix a parameter to a URL based on certain conditions, such as the use of a filter or a ‘sort by’ function, or for a variety of other purposes. A common cause is the use of “breadcrumbs”, which help users navigate a site. Breadcrumbs represent the path the user has taken to a specific page, and are usually clickable for navigation purposes.

Breadcrumbs are specific to the user, and are driven by session parameters which are sometimes visible in the page URL.

For example:

example.com/category/sub-category/product/?Path=312&214

Here “Path” refers to the route the user took, and the numbers represent specific categories. In this example the user has accessed category 312, followed by category 214. This might generate breadcrumbs that look like this:

home -> category -> sub-category -> product

Now we’re still on the same product page as identified in the URL, except with URL parameters that create the breadcrumbs.

The exact same content renders on this page, but it can be accessed using a variety of different URLs. This problem is exacerbated by the number of different routes a user could take, increasing the number of duplicates considerably.

Solution

  • Canonical link element

Capitalisation & trailing slashes

Some platforms tend to ignore letter cases in URLs, allowing a page to render irrespective of capitalisation. If the page is accessible at URLs that contain upper case letters as well as ones using only lower case letters you’re probably going to have some problems. For example:

example.com/category

example.com/Category

The same applies to trailing slashes (/) in URLs:

example.com/category

example.com/category/ 

Solution

  • 301 (permanent) redirect
  • Canonical link element

Random CMS Junk

Obviously this is not a technical term. Not all websites operate on the latest, most up to date CMS platform. Many are outdated, bespoke, and quite frankly not in a good condition for SEO purposes.

The quality of a bespoke CMS, for example, is directly related to the knowledge and ability of the development team that built it. A slight gap in technical SEO knowledge can result in a site that outputs a large amount of dynamic duplicate content.

Looking for this is quite simple: conduct a site search in Google using “site:example.com”. Look for indexed URLs containing question marks, path parameters, or “index.php?”. Assuming you have SEO-friendly URLs, these are most likely to be unmanaged duplicates of canonical pages.

Solution

  • Canonical link element

Localisation & Translation

There are two ways to tailor content for an audience. Localisation is when content is provided in the same language, but information is tweaked for each audience to account for linguistic differences. These variants might exist on a subdomain (us.example.com) or a subfolder (example.com/us).

Where an equivalent page exists for another locale (such as uk.example.com or example.com/uk), content should be localised for 2 reasons:

  • to ensure the right content ranks for the right audience
  • to ensure that similar content is not considered a duplicate

The same applies to translation, except the difference is in the language – for example, en.example.com or example.com/en.

What’s important is that search engines don’t perceive these pages as unmanaged duplicates, or as different pages; they are the same page, tailored for a different audience.

Solution

  • I’ll be covering this in a later post 🙂

Other instances of duplicate content

Duplicate content can arise in a number of other ways. Once you understand what it is, you can identify and resolve duplicate issues. Remember “duplicate content occurs when the same page renders at multiple URLs”.

How to manage duplicate content

First of all, duplicate content is not a bad thing – almost every website outputs duplicate content. The problem is when this duplicate content is not managed using 301 redirects, robots directives, canonical link elements, or alternate link elements.

301 (permanent) redirects

Until the introduction of the canonical link element, 301 redirects were the best way to manage duplicate content. However, redirects and link elements work differently.

Once a 301 redirect is applied to a duplicate, a user will no longer be able to access it, and will be redirected to (all being well) the correct, canonical version. The problem is that duplicates often exist precisely for users. To use the example of path parameters: breadcrumbs provide great usability for visitors. If the URLs including path parameters are redirected, breadcrumbs will no longer work correctly, detracting from the website’s navigation.

A 301 should only be applied to pages which offer no extra value to a user, such as the root domain and subdomain (www.example.com and example.com). In doing so, roughly 90% of the authority of the donor page passes to the target page (provided the redirect is maintained), consolidating your link equity.
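
In practice this rule lives in your web server configuration, but the logic is simple enough to sketch. A minimal, illustrative WSGI app – the choice of www as the canonical host is an assumption; yours may be the non-www version – that 301s bare-domain requests to their www equivalent:

from wsgiref.simple_server import make_server

def redirect_to_www(environ, start_response):
    # 301 any bare-domain request to its www equivalent
    host = environ.get("HTTP_HOST", "").split(":")[0]
    if host and not host.startswith("www."):
        location = f"https://www.{host}{environ.get('PATH_INFO', '/')}"
        start_response("301 Moved Permanently", [("Location", location)])
        return [b""]
    # otherwise serve the canonical page as normal
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"canonical host"]

if __name__ == "__main__":
    make_server("", 8000, redirect_to_www).serve_forever()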

Canonical link elements 

The canonical link element deals with duplicate content in the same way as a 301 redirect, with one exception; users can still access the page. Therefore this is the most effective way to manage duplicates without running the risk of detracting from the user experience. 

A canonical link element looks like this:

<link rel="canonical" href="http://example.com">

It points to the canonical (correct) version of the web page on which it is found. The beauty of the canonical link element is that it can be applied site-wide, ensuring protection against duplicate content issues, irrespective of whether there’s a problem or not.

The canonical version of the page should have a self-referring canonical link element – one that points to itself. Any duplicates of this page will then have a canonical link element pointing to the canonical version.

Like a 301 redirect, the canonical link element passes roughly 90-95% of link equity to the target page. Canonical link elements work across domains too. So, if for some reason your site is rendering on a second domain, the canonical link elements will still point back to the original, preventing duplicate issues.
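
Pulling the earlier duplicate patterns together, here’s a minimal sketch of how a template might derive that element for any variant a page renders at. The www canonical host is an assumption, and a real site would need to special-case parameters (such as pagination) that genuinely change the content:

from urllib.parse import urlsplit

CANONICAL_HOST = "www.example.com"  # assumption: www is the chosen canonical host

def canonical_url(requested_url):
    # Collapse case, trailing-slash and parameter variants to one URL
    parts = urlsplit(requested_url)
    path = parts.path.lower().rstrip("/") or "/"
    if path != "/":
        path += "/"
    return f"https://{CANONICAL_HOST}{path}"  # query string deliberately dropped

for variant in [
    "http://example.com/Category",
    "https://www.example.com/category/?Path=312&214",
]:
    print(f'<link rel="canonical" href="{canonical_url(variant)}">')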

A Final Tip

There are some nuances to getting the most out of a canonical link element, starting with choosing the canonical version. The version set as the canonical is the one that will rank in search engines, so we want to use the version with the best possible chance of ranking well.

For example, I might have a product page which renders at example.com/mens-shoes/black-shoes and also at example.com/black-shoes. If someone were to search for “men’s black shoes”, which do you think has the best chance of ranking? Where the category or subcategory contains valuable search terms, it may be worth setting the canonical version as the one which includes them in the URL.

You may have noticed the appearance of “structured breadcrumbs” sometime in 2013 – or maybe not. Traditionally, when a webpage appears in the SERPs, the URL of the page is displayed below the page title.

Newlook structured breadcrumbs

With the right code in place, it’s now possible to show the actual site architecture, based on breadcrumbs.

Johnlewis structured breadcrumbs

Referring to my previous example of categories, sub-categories, and child pages: for these beautifully structured elements to show, the sub-category’s canonical version MUST include the parent categories in the URL, so that the canonical version carries the correct breadcrumb trail.
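
For illustration, here’s a minimal sketch of breadcrumb markup using schema.org’s BreadcrumbList format in JSON-LD – one of the formats search engines now support – reusing my shoes example from earlier (the URLs are illustrative):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Mens Shoes", "item": "http://example.com/mens-shoes" },
    { "@type": "ListItem", "position": 2, "name": "Black Shoes", "item": "http://example.com/mens-shoes/black-shoes" }
  ]
}
</script>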

Robots.txt (Kidding!)

Neither duplicate content nor indexation should be managed using the robots.txt file. A disallow entry in robots.txt blocks crawling at the root domain level – it does not control indexing – so it’s very common for pages disallowed in robots.txt to remain indexed when Googlebot, or another crawler, discovers them through links elsewhere. Once a disallowed page is indexed it will stay in the index irrespective of the content of your robots.txt file, and the disallow will also prevent crawlers from picking up canonical link elements on the pages in question. Take a look below:

Robots.txt example

If you insist on managing duplicate content by controlling indexation, you’re better off using the “noindex” meta directive at the page level – a much more reliable solution. However, this will not pass link authority to canonical pages in the way a canonical link element or 301 redirect would.
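
The directive itself is a single tag in the <head> of the page you want kept out of the index:

<meta name="robots" content="noindex">

Note that a crawler has to be able to reach the page to see this tag, which is another reason not to combine it with a robots.txt disallow.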

Right…any questions?

At 2,400 words there’s an awful lot more that I’d like to write on the subject, and perhaps I will. If after reading this you still don’t know what duplicate content is, feel free to ask for help in the comments below.

 

Customer Acquisition: SEO vs Social Media

As we all know, social media has boomed somewhat in the last few years. More and more businesses – both B2B and B2C – have turned to social media as a new way to acquire customers, and many are decreasing investment in more traditional online marketing, such as search engine optimisation, in favour of platforms like Facebook, Twitter and Pinterest in an effort to reach their target audience. But how effective has it been?

A survey carried out in late 2012 by The Creative Group reported that 62% of marketers were planning on increasing investment in Facebook and LinkedIn marketing.

The question is – are they just buying in to the hype?

As initially reported by Barry Adams, in 2012 Forrester produced a study that analysed 77,000 online transactions over a two-week period. They found that only 1% of those transactions had any influence from a social media channel.

1%

Forrester also reported that “Forty-eight percent of consumers reported that social media posts are a great way to discover new products, brands, trends, or retailers, but less than 1% of transactions could be traced back to social.”

This represents a startling gap between what marketers and consumers believe is driving sales and the reality of the situation.

In July 2013 a study by Custora also showed that the real drivers of online transactions were organic search, pay-per-click and email marketing, by a long shot.

acquisition growth

It’s easy to see why people have turned to social media. It offers huge reach, and with recent developments in organic SEO presenting new challenges for the industry, marketing on Facebook may seem like an easier option. However, the statistics don’t lie.

The main problem I see with marketing through social channels is a question of user intent. Social platforms aren’t the online high street; that position is reserved for search engines for the foreseeable future. We don’t rush to Facebook or Twitter when we want to buy something… Selling to someone at the moment they’re actively looking to buy is a far better strategy.

Every Business is Different

Remember, though, that there is no one-size-fits-all approach to marketing online. Many businesses have had great success through social media, and there are some platforms which stand out as drivers of revenue in the social world.

Pinterest, for example, is the third most popular social networking site, and it has a large number of users in a medium-to-high income bracket who are looking for products for their homes, weddings and wardrobes. It’s estimated that 69% of Pinterest users have found a product via the site that they have gone on to buy – compared to just 24% of Facebook users who have done the same. It also draws more referral traffic than Google+, LinkedIn and YouTube combined. You can read Emily’s post on using Pinterest for ecommerce for some tips on getting started.

Similarly, LinkedIn has proven to be highly successful for building high-value B2B relationships, with 65% of B2B companies reporting that they have acquired a customer through the platform.

So what’s the answer?

For now, search marketing is beating social media to a pulp on the customer acquisition front, although purchasing habits may change over time. And despite that poor acquisition performance, social media has become the cornerstone of customer service. Furthermore, we can’t assume that this is the case in every instance; you’ll need to understand your market and audience in order to establish the best route for your marketing.

Marketing Analytics: How to measure your SEO

Online marketing should start and end with analytics. Unlike traditional forms of advertising, every visitor, every action, and every penny spent can be accounted for and analysed. 

This is the first of a four-part series on how to measure your online marketing efforts. We’ll post links to the later articles as we release them.

  1.  How to measure your SEO
  2.  How to measure your PPC
  3.  How to measure your blogging
  4.  How to measure your social media marketing

 

How to measure your SEO

If you’re engaging in organic SEO, it’s crucial that you measure your success on a regular basis. In order to analyse your SEO efforts you’ll need to understand five key metrics:

  • Keyword performance and rankings
  • Total traffic from organic search
  • Branded vs. non-branded search traffic
  • Inbound links
  • Conversion rates from organic search

 

Keyword performance and rankings

“How well your web pages rank for your targeted search terms in the appropriate search engines.”

You should be measuring the performance of your keywords on a regular basis. This will help you understand any increases or decreases in organic search traffic in a given period. You should monitor where you’re ranking for your targeted terms, and see if they are generating traffic for your website. If you are ranking in the top 3 but aren’t receiving much traffic, perhaps you should revise your strategy to target better keywords. Be aware that rankings vary depending on personalisation and the location you’re searching from – never accept them at face value.

How to measure it:

You can monitor your keyword rankings with a vast number of different tools. Here are our recommendations:

Fat Rank 

This is a great free tool which lets you type a search term into a Chrome extension to find out how well the website you’re on is ranking. It’s great for spot-checks if you’re only targeting a few search terms, or if you’re checking out competitors.

Authority Labs

Authority Labs offers a brilliant rank-tracking system which starts at $49 a month. Great if you want to track a list of keywords without any other tool functionality.

We Use:

AnalyticsSEO

Analytics SEO is a comprehensive SEO toolset which we use to manage campaigns. If you’re looking for a rank tracker that comes as part of a diverse toolset, this is the choice for you.

 

Total traffic from organic search

“The number of unique visitors who enter your site from a search engine results page (SERP).”

With a good SEO strategy that makes the most of both ‘head’ and ‘long-tail’ keywords, organic search should make up around 50% of your total traffic, although this varies from business to business. If the amount of traffic you’re receiving from search is considerably lower than this, look for opportunities to target more keywords, improve the rankings of your current keywords, or create new content.

How to measure it:

Google Analytics is your one-stop shop for on-site analytics. In the ‘Audience Overview’ section, select ‘Advanced Segments’ in the top right of the page, then select ‘Non-Paid Search Traffic’.

 

Branded vs. non branded search traffic

“The percentage of search visits which come from brand related search terms (such as your company or product) versus non-branded search terms related to your industry or services.”

The importance of branded vs non-branded traffic is overlooked by even the largest of organisations. Understanding the difference between a branded search term and a non-branded search term tells you what kind of visitor each visit represents.

A branded search term represents someone who already knows your business. They’re looking for you, and the branded search is how they find you. Typically branded search traffic will increase as a result of advertising. For example, running a TV advert would lead to people then searching for your brand name. It’s a great way to measure the effectiveness of any marketing which increases brand awareness.

A non-branded search term, on the other hand, is much more valuable in my opinion. This represents someone looking for something (ideally a service or product you provide). They know what they want, but not where to get it from – your goal is to make sure that they choose you by ensuring that you rank as high as possible for these search terms. Every click you gain from a non-branded term is a customer that your competitors aren’t getting!

How to measure it:

We use enterprise level tools to carry this out efficiently across a range of clients, but this can be achieved with a little bit of manual work.

Head over to Google Analytics and download a .CSV of the search terms which brought visitors to your site over a given period. You can then use the filter function in Excel or Google Sheets to filter out your branded or non-branded search terms. If you need some help here, feel free to ask!

 

Inbound Links

“The quantity and quality of hyperlinks pointing from other websites to yours.”

Because inbound links are a major factor in how search engines rank your website, you should aim to gradually increase the number of inbound links over time. Target authoritative websites in your industry, as these will pass the most powerful links.

Unfortunately, link value doesn’t come immediately. First of all search engines need to find the link that points to your site, then they need to trust it. This can take a few days or a few months, depending on the link.

How to measure it:

Open Site Explorer

OSE from SEOmoz is one of the best link analysis tools available. You can use it with an SEOmoz Pro account, which will also give you access to their cracking set of SEO tools for $99 a month.

Ahrefs

Ahrefs provides much the same function as Open Site Explorer, but if you don’t want the rest of SEOmoz’s tools you can use Ahrefs for a lower $79 per month.

Google Webmaster Tools

It’s free, yay! The interface is poor, some of the data is patchy, and it’s not exactly quick and easy, but with a little spreadsheet management you can use WMT to monitor your inbound links.

 

Conversion rates from organic search

“The percentage of visitors who arrived at your site via organic search and completed a desired action, such as requesting contact or buying a product.”

At the top level you want to know your overall conversion rate from organic search – for example, 25 enquiries from 1,000 organic visits is a 2.5% conversion rate – but you should also dig deeper to track conversion rates by keyword and by landing page.

Knowing how well each keyword or landing page converts will help you fine-tune your strategy, allowing you to focus on areas that convert well, and cut out or improve those that don’t. This is the single most important part of SEO – after all, you want conversions, not visitors.

How to measure it:

Google Analytics

This warrants a new blog post to explain how. I’ll add the link in here when I get round to it. Watch this space!

What Local SEO Means For Small Businesses

In this post I’d like to present some points on what local search means for small and medium-sized businesses, and why you as a local business should love local SEO.

1) It’s easy (well, easier)

Local SEO is low-hanging fruit for any small business that wants to take its first steps online. Many are slightly intimidated by online marketing, as the investment needed to enter the race can seem quite high at times, but because local search results feature only listings with a permanent local address, the level of competition is instantly cut down, making it possible to start generating interest online without breaking the bank.

2) Trust

Most consumers view local results in Google as more authentic than traditional results. This is because of the map, address verification, images, videos, and customer reviews that come with a small business page. Trust is one of the most valuable commodities online, and this is one way that small businesses can earn it without having to fight tooth and claw.

3) Real estate

Local listings receive much more ‘real estate’ in search results (the amount of the page which is dedicated to a particular result). Have a look at the image and you’ll see what I mean. In layman’s terms, businesses with well-placed local rankings get seen more.

local search real estate

4) Mobile

Mobile search is growing every single month. The smartphone is the new place to find local businesses and services, and local map listings are excellent for new businesses that want to be found by people on the go. What’s more, mobile searchers have a much higher intent to purchase than the average desktop user.

5) Conversions

Prospects searching with local intent (e.g. “hairdresser in Manchester”) convert at a much higher rate than those searching for national terms. Geographic proximity makes it much more likely that a local searcher will click through to your site and then visit your business.

 

This is the fourth blog in our mini-series on local search. Take a look at the other posts in the series below. Enjoy!

Part 1 – A history of local search

Part 2 – How local search works

Part 3 – The importance of local search: A case study

Part 4 – What local SEO means for small businesses