Yoast is the most popular SEO plugin for WordPress, with over 40 million downloads to date. The plugin lets you customise your meta tags, take complete control over breadcrumbs, check that your content is readable, and much more.
One of the advanced features in Yoast SEO lets you generate robots.txt and sitemap.xml files. Both files tell bots how you want them to crawl your website; however, because they are often auto-generated, technical issues can arise.
This morning, we identified a crawling issue on multiple WordPress websites: Yoast had auto-applied a disallow rule to the robots.txt file (see screenshot below), with the result that search bots were blocked from crawling every page on the website – an obvious problem from an organic search perspective. Luckily, we spotted this early, meaning none of the websites we manage have been affected; however, it could have been a very different story had this been missed, and organic visibility could have dropped off completely.
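In case the screenshot is not visible here, a blanket disallow rule of this kind typically takes the following form:

```text
User-agent: *
Disallow: /
```

The lone `/` tells every bot matched by `User-agent: *` that the entire site is off-limits, which is why crawling – and, in time, organic visibility – stops completely.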
We have investigated possible reasons why this occurs on some WordPress websites but not others; however, we were unable to pinpoint the source of the issue or identify any patterns.
How do I check if my site has been affected?
The easiest way to check whether your website has been affected is the robots.txt Tester in Google Search Console. This tool lets you check for errors and test whether specific bots can crawl a given page on your website. For this purpose, set the test to check whether Googlebot can crawl your homepage.
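Alongside the Search Console test, you can run a quick local sanity check with Python's standard-library `urllib.robotparser` – a rough sketch, assuming you paste in the contents of your live robots.txt file:

```python
from urllib.robotparser import RobotFileParser

def googlebot_can_crawl(robots_txt: str, url: str) -> bool:
    """Return True if the given robots.txt content allows Googlebot to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())  # feed the file's lines to the parser
    return parser.can_fetch("Googlebot", url)

# A blanket disallow, like the one described above, blocks everything:
blocking = "User-agent: *\nDisallow: /"
print(googlebot_can_crawl(blocking, "https://example.com/"))  # False
```

If this returns False for your homepage, take action as described in the steps below.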
If Google can’t crawl my website, what do I do?
If the test comes back negative, you will need to take immediate action; fortunately, the fix is straightforward if you follow the steps outlined below:
- Log in to the CMS of the affected website
- Navigate to Yoast & choose ‘Edit files’ or ‘Tools’ (depending on your version of Yoast SEO)
- If your version of Yoast shows ‘Edit files’, you can update your robots.txt directly from the page that opens; if it shows ‘Tools’, click ‘File Editor’ in the next window.
- Edit the robots.txt file – you can either copy the template below or use one of the many online tools that generate the file’s content.
- Hit ‘save changes to robots.txt’.
- Your robots.txt file is now updated on the server, but there is one more step – go back to the robots.txt Tester in Google Search Console and submit your new robots.txt file.
- Finally, refresh the page a few minutes later to confirm that the file has updated and that Google recognises the change.
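The kind of template referred to in the steps above is a minimal, permissive robots.txt along these lines – `example.com` is a placeholder for your own domain, and the Sitemap line is optional:

```text
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line permits all bots to crawl everything, which reverses the blanket block described earlier.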
Almost half of the WordPress websites we manage were affected by Yoast SEO auto-applying disallow rules, and we believe this could be a widespread issue affecting many websites – so if you have friends, colleagues or people in your network running a website on WordPress, let them know, and be sure to share this article to help others.
Can you help?
We have compiled a PDF as a single resource for Yoast to investigate. At this time we do not have enough of a pattern to pinpoint the cause of this issue, but if you experience the same, please share here.