How 3rd party CDNs can block GoogleBot

11th March 2015

You shouldn’t block Googlebot’s access to your CSS or JS files.

We’ve known that for a long time, but more recently we’ve had reminders from around the community, and from Google themselves, because of algorithm updates and proposed changes to the way mobile sites might be ranked in mobile search.

Some of us might be assuming that CDNs serve our content in precisely the way we’d like. Not necessarily so, at least not for search engines.

I’m guessing the timing of Google Webmaster Tools’ new feature, the Blocked Resources report, isn’t a coincidence.

We need to check what, if any, resources we’ve blocked in case the restriction is unintentional.
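Alongside the report, you can check a CDN’s robots.txt yourself. Here’s a minimal sketch using Python’s standard-library `urllib.robotparser`; the domain, paths and robots.txt contents are hypothetical, purely to illustrate the kind of rule that trips Googlebot up:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the sort a CDN might serve,
# shutting Googlebot out of everything under /assets/.
robots_txt = """\
User-agent: Googlebot
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() returns True if the named user agent may crawl the URL.
css_ok = parser.can_fetch("Googlebot", "https://cdn.example.com/assets/style.css")
js_ok = parser.can_fetch("Googlebot", "https://cdn.example.com/js/app.js")

print(css_ok)  # the CSS file sits under the disallowed path
print(js_ok)   # the JS file is outside it, so it stays crawlable
```

In practice you’d fetch the live robots.txt from the CDN host (e.g. with `RobotFileParser.set_url()` and `read()`) and test the exact URLs your pages reference.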

Here’s how the tool looks:


And immediately it’s clear we’ve been serving something from a CDN that’s blocking Googlebot.

The files in question:

These are, of course, easily updated to local files (or to a CDN of your choice, but we’ll get to why I chose local in a moment):

I’ve chosen to host those two files locally because we’re running on WPEngine, whose service automatically serves JS and CSS via their CDN. So now in our page you’ll see:

So that’s a terrific little nudge from Google to remind us to check the accessibility of our externally hosted files. Pretty cool.


  1. Hi Richard,

    This is a great piece.

    I’m curious to know: have you seen any SEO benefit from using a CDN?


  2. I can’t say there’s a direct benefit, but if you look at your page speed you should see a big difference, and that directly improves user experience. Google say page speed is a ranking factor, so don’t leave it on the table as an unexplored option.

    This is an older post but discusses page speed from the perspective of the potential business benefit:


