
How To Fix: ‘Googlebot cannot access CSS and JS files’


    Google warns webmasters worldwide that blocking CSS or JavaScript may cause sub-optimal rankings

    This morning in Australia, many prudent, SEO-savvy webmasters woke to the news that Googlebot could not access the CSS (Cascading Style Sheets) and JS (JavaScript) files on their site.


    Google has warned that, left unattended, the issue could affect your rankings:

    “Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly, so blocking access to these assets can result in sub-optimal rankings.”

    So how can you run a technical SEO audit and fix this to make Google happy again? Here are the three steps needed to identify, troubleshoot and resolve the issue.

    1. Identify the blocked resources

    You need to isolate exactly what is causing the rendering issue. Within Google’s Search Console (the old Webmaster Tools) there is a “Fetch” and a “Fetch & Render” function for both desktop and mobile.


    Go into each and hit “Fetch & Render” so that Googlebot can access your site and bring back both how it looks to you and how it looks to Google. This is where you can spot the discrepancies that triggered the error message.


    Once you have fetched and rendered, click on the page path on the left (denoted in blue) to see how the page looks to Google.


    If you then scroll down, you will see a list of the resources Google could not access.


    There you have your culprits!
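
    If you would like a second opinion outside Search Console, the short sketch below lists a page’s CSS and JS files and tests each one against your robots.txt using nothing but Python’s standard library. It is illustrative only: https://www.example.com/ stands in for your homepage, and Python’s robots.txt parser is simpler than Googlebot’s.

        # A minimal sketch, not an official Google tool: list a page's CSS/JS
        # assets and test each one against the site's robots.txt.
        from html.parser import HTMLParser
        from urllib import robotparser
        from urllib.parse import urljoin, urlparse
        from urllib.request import urlopen

        PAGE = "https://www.example.com/"  # placeholder - use your homepage

        class AssetCollector(HTMLParser):
            """Collect URLs from <script src=...> and <link rel="stylesheet"> tags."""
            def __init__(self):
                super().__init__()
                self.assets = []

            def handle_starttag(self, tag, attrs):
                attrs = dict(attrs)
                if tag == "script" and attrs.get("src"):
                    self.assets.append(urljoin(PAGE, attrs["src"]))
                elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
                    self.assets.append(urljoin(PAGE, attrs["href"]))

        collector = AssetCollector()
        collector.feed(urlopen(PAGE).read().decode("utf-8", errors="replace"))

        # Check each asset as "Googlebot" against the site's own robots.txt.
        # Caveats: third-party assets are governed by their hosts' robots.txt,
        # and Python's parser ignores Google's * and $ wildcards, so treat the
        # output as a rough guide - Search Console is the source of truth.
        root = urlparse(PAGE)
        rp = robotparser.RobotFileParser()
        rp.set_url(root.scheme + "://" + root.netloc + "/robots.txt")
        rp.read()

        for asset in collector.assets:
            print("BLOCKED" if not rp.can_fetch("Googlebot", asset) else "OK     ", asset)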

    2. Modify robots.txt to allow access

    So how do you allow access to them? Modify (or have your web developer modify) your robots.txt file, and check any plugins you may be using to ensure you have all the latest updates. Bear in mind that enabling access to all JS and CSS content may make your site more vulnerable, so you may need to speak to an SEO professional to make sure this is done correctly.
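
    As an illustration only (your own file will differ), a typical WordPress-style robots.txt that triggers this warning, and one common way to open the assets back up, might look like this:

        # Before (illustrative): the directories holding theme CSS and JS are blocked
        User-agent: *
        Disallow: /wp-includes/
        Disallow: /wp-content/plugins/

        # After: either delete the Disallow lines above, or add explicit
        # Allow rules for stylesheets and scripts. Googlebot honours the
        # * and $ wildcards used here, though some other crawlers do not.
        User-agent: Googlebot
        Allow: /*.css$
        Allow: /*.js$

    The most conservative change is simply to stop disallowing the specific directories that hold your theme’s assets, rather than opening up the whole site.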

    You can test your robots.txt live in Search Console to check that the errors have been addressed.
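
    If you have edited the file locally, you can also run a rough sanity check before uploading it. The sketch below uses Python’s built-in urllib.robotparser; the file name and asset paths are placeholders, and this parser ignores Google’s * and $ wildcards, so the Search Console tester remains the source of truth:

        # A rough local check of an edited robots.txt - a sketch, not the
        # Search Console tester. File name and asset paths are placeholders.
        from urllib import robotparser

        rp = robotparser.RobotFileParser()
        with open("robots.txt") as f:      # your edited local copy
            rp.parse(f.read().splitlines())

        # Probe a few asset URLs the way Googlebot would request them.
        for path in ("/wp-includes/js/jquery.js",
                     "/wp-content/themes/mytheme/style.css"):
            url = "https://www.example.com" + path
            verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
            print(path, "->", verdict)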


    3. Resubmit your updated website to Google’s index to validate the change


    Make sure you fetch both the desktop and mobile versions of your site, then hit “Submit to index”. If you receive a “Complete” message, you are all good!

    Confused?

    It’s a lot to take in, let alone attempt yourself. Contact White Chalk Road for help with implementing these changes on your website. When SEO rankings are potentially under threat, it is wise to trust quality professionals to guide you through the right course of action to keep Google happy with your site – and you happy with your business!
