
Googlebot warns of access issues

Last week Google sent out a mass of emails to every site registered in Google Webmaster Tools that is currently blocking CSS or JavaScript (JS). This isn't a new Google algorithm update, but if you are reading this you may well have received one of these messages.

If you have received this warning message, your robots.txt file may contain directives that block the search engines from crawling certain folders. Each site is different, and there may be legitimate reasons (e.g. site security) why you don't want these folders to be seen by the search engines.
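For illustration, a robots.txt like the following (the folder names are hypothetical) is the kind of configuration that triggers this warning, because it stops Googlebot fetching the CSS and JavaScript it needs to render your pages:

```
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/
```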

To help Google understand your site's content, allow all of your site's assets, such as CSS and JavaScript files, to be crawled.
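A minimal way to comply, again using hypothetical folder names, is to remove the blocking Disallow lines or to explicitly allow the asset folders for Googlebot:

```
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
```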

To comply with this guideline you can start by using the Fetch as Google tool in Search Console. This will show you the page assets on your site that Googlebot cannot crawl, and you can use the robots.txt Tester to debug directives.
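If you want to check a directive outside the Search Console, Python's standard `urllib.robotparser` module can simulate the same allow/disallow decision. A small sketch, using a hypothetical robots.txt and example URLs:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks an entire asset folder
robots_txt = """\
User-agent: *
Disallow: /assets/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# A stylesheet under the blocked folder cannot be crawled...
print(parser.can_fetch("Googlebot", "https://example.com/assets/style.css"))  # False

# ...while an ordinary page can
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

This only mirrors the standard robots.txt rules; the robots.txt Tester in Search Console remains the authoritative check for how Googlebot itself interprets your file.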

The ‘Fetch as Google’ tool enables you to test how Google crawls or renders a URL on your site. In Google’s words: “You can use Fetch as Google to see whether Googlebot can access a page on your site, how it renders the page and whether any page resources (such as images or scripts) are blocked to Googlebot. This tool simulates a crawl and render execution as done in Google’s normal crawling and rendering process and is useful for debugging crawl issues on your site.”

This guideline is not new, but Google has now started issuing warnings about it, which means it is well worth investigating; if ignored, it may cause issues with your Google rankings. Similarly, if your competitors are implementing these changes on their sites and you are not, your site could be left behind with a lower ranking.

Letting Google access CSS and JS is listed under the technical guidelines for webmasters, which can be found here: