
Yoast SEO auto-applying ‘disallow all’ to robots.txt


Henry
Henry is one of CandidSky's SEO Executives. He has a keen interest in digital marketing as a whole, with a particular enthusiasm for organic search.

August 11, 2017

3 minute read

Yoast is the most popular SEO plugin for WordPress, with 40 million downloads to date. The plugin allows you to customise your meta tags, take complete control over breadcrumbs, check that your content is readable, and much more.

One of the advanced features available in Yoast SEO allows you to generate a robots.txt and a sitemap.xml file. Both of these files provide guidance to bots on how you want them to crawl your website. However, as they're often auto-generated, technical issues can arise.
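
To illustrate, a healthy robots.txt for a WordPress site looks something like the sketch below. The rules and the sitemap URL are illustrative assumptions rather than a one-size-fits-all recommendation:

    # Rules for all crawlers
    User-agent: *
    # Keep bots out of the WordPress admin area (illustrative rule)
    Disallow: /wp-admin/
    # Re-allow admin-ajax.php, which some front-end features rely on
    Allow: /wp-admin/admin-ajax.php

    # Tell crawlers where the XML sitemap lives (placeholder URL)
    Sitemap: https://www.example.com/sitemap.xml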

This morning, we recognised a crawling issue on multiple websites that use WordPress: Yoast had auto-applied a disallow rule to the robots.txt file (see the screenshot below). The result is that search bots are blocked from crawling every page on the website, which poses obvious problems from an organic perspective. Luckily, we spotted this early, meaning that none of the websites we manage have been affected. Had this been missed, however, it could have been a totally different story, and organic visibility could have dropped off completely.

[Screenshot: Yoast robots.txt example]
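
In case the screenshot doesn't load, a "disallow all" robots.txt of the kind described above amounts to just two directives, which together block every crawler from every URL:

    # Applies to every crawler
    User-agent: *
    # A bare "/" matches every path on the site, so nothing can be crawled
    Disallow: /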

Whilst we have investigated possible reasons for this occurring on some WordPress websites but not others, we were unable to pinpoint the source of the issue or recognise any patterns.

How do I check if my site has been affected?

The easiest way to check whether your website has been affected is the robots.txt Tester in Google Search Console. This tool allows you to check for any errors and to test whether specific bots are able to crawl a specific page on your website. For this purpose, we would set the test to check whether Googlebot can crawl your homepage.
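
If you would rather test from a script, Python's standard library includes a robots.txt parser that can answer the same question. This is a minimal sketch, assuming your site lives at www.example.com; swap in your own domain:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt (replace with your own domain)
    parser = RobotFileParser("https://www.example.com/robots.txt")
    parser.read()

    # The same question the Search Console tester answers:
    # is Googlebot allowed to crawl the homepage?
    if parser.can_fetch("Googlebot", "https://www.example.com/"):
        print("Googlebot can crawl the homepage.")
    else:
        print("Googlebot is blocked - check your robots.txt immediately.")

Bear in mind that Python follows the original robots.txt specification, so edge cases may be interpreted slightly differently to Google's own parser; treat this as a quick sanity check rather than a replacement for the Search Console tool.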

If Google can’t crawl my website, what do I do?

If the test results come back negative, you will need to take immediate action. Fortunately, it should be a quick fix if you follow the simple steps outlined below:

  1. Log in to the CMS of the affected website
  2. Navigate to Yoast & choose ‘Edit files’ or ‘Tools’ (depending on your version of Yoast SEO)
  3. If your version of Yoast uses ‘Edit files’, you will be able to update your robots.txt from the page that opens. If your version displays ‘Tools’ instead, you will need to click ‘File Editor’ in the next window.
  4. Now you will need to edit the robots.txt file – you can either copy the template below or use one of the many tools available online to generate the content of the file.
  5. Hit ‘save changes to robots.txt’.
  6. Your robots.txt file is now updated on the server; however, there is one more step. Go back to the robots.txt Tester in Google Search Console and submit your new robots.txt file.
  7. As always, you should test that your file has updated and that Google recognises this by refreshing the page a few minutes later.
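
As a stand-in for the template mentioned in step 4, the simplest safe robots.txt is one that disallows nothing. The sitemap line is optional, and the URL below is a placeholder for your own:

    # Allow every crawler to access the whole site
    User-agent: *
    # An empty Disallow value blocks nothing
    Disallow:

    # Optional: point crawlers at your XML sitemap (placeholder URL)
    Sitemap: https://www.example.com/sitemap.xml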

Almost half of the WordPress websites we manage had disallow rules auto-applied by Yoast SEO, and we believe this could be a widespread issue affecting many websites. So if you have any friends, colleagues, or people in your network who you know run a website on WordPress, let them know and share this article to help others.

Can you help?

We have put together a PDF as a single resource for Yoast to investigate. At this time, we do not have enough of a pattern to pinpoint the cause of this issue, but if you experience the same, please share here.
