Googlebot Crawler Slows if Website Loads Slowly

We have known for some time (since April 2010) that a slow site is bad for SEO, something Matt Cutts has also discussed.  Whilst this ranking factor is said to be fairly minor in influence, Google has not clarified exactly at what speed your site might be affected.  Research has been done on the matter, and it indicated that Time to First Byte influenced rankings, but the total load time of the page did not.

Fast forward to this week, and John Mueller has stated that Googlebot significantly slows down the rate at which it crawls your site if your website takes longer than 2 seconds to load.  The revelation came in response to a question on the Google Webmaster Central Help Forum.

We’re seeing an extremely high response-time for requests made to your site (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we’ll crawl from your site, and you’re seeing that in Fetch as Google as well. My recommendation would be to make sure that your server is fast & responsive across the board. As our systems see a reduced response-time, they’ll automatically ramp crawling back up (which gives you more room to use Fetch as Google too). 

John Mueller

It is unclear which metric Google is referring to, but our guess is either time to first byte or average page download time.  The reason it might be worth paying attention to the latter is that it is recorded in the Webmaster Tools crawl stats.  See an example graph of this below:

Webmaster Tools page download time

We think this gives an accurate view of what Google is seeing in relation to your site (although with Google you can never be certain which metrics they are taking into account), and it can be accessed from within your Webmaster Tools dashboard.  We have not seen any issues with the sample above, so we think that if you keep the average time spent downloading a page under 2000 ms (2 seconds) you won't have many issues.  The great thing about using this particular metric is that you can see your page download speed history, and it reflects an average across all the pages crawled rather than a reading for individual pages.
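If you want to spot-check these figures yourself between Webmaster Tools updates, a short script will do.  The sketch below, written in Python with the requests library, approximates time to first byte as the time until the response headers arrive, and measures total download time by reading the full body.  The URL is a placeholder, and the 2000 ms threshold is simply the figure discussed above:

    import time
    import requests

    def measure_page_speed(url, threshold_ms=2000):
        """Print an approximate TTFB and the total download time for a URL."""
        start = time.perf_counter()
        # stream=True makes requests return as soon as the response headers
        # arrive, so this first reading approximates time to first byte
        response = requests.get(url, stream=True, timeout=10)
        ttfb_ms = (time.perf_counter() - start) * 1000

        # reading the full body gives the total page download time
        _ = response.content
        total_ms = (time.perf_counter() - start) * 1000

        print(url)
        print(f"  Time to first byte (approx): {ttfb_ms:.0f} ms")
        print(f"  Total download time:         {total_ms:.0f} ms")
        if total_ms > threshold_ms:
            print(f"  Over the {threshold_ms} ms mark discussed above")

    measure_page_speed("https://www.example.com/")

Run it against a handful of your most important URLs rather than a single page, since the Webmaster Tools graph reflects an average across everything crawled.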

A high crawl rate is important, as Google will only pick up changes to your site if it sees them.  Such changes could be new posts, new comments, or a change in site structure (new tags etc.) that could influence link juice flow through the site.  Other benefits include the freshness boost a site gets when new content is posted.  As such, if your site is being crawled regularly it can significantly help with your website's visibility in Google.

What can you do to speed up your site?

Well, our first recommendation is to choose a fast web host that is specially configured for speed, and our second is to optimize your site for speed (this is a great guide for WordPress).
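As a starting point, two of the most common speed optimizations, gzip compression and browser caching, can be verified with a quick look at the response headers.  The sketch below is a minimal example in the same vein as the one above; the URL is a placeholder, and the headers inspected are standard HTTP:

    import requests

    def check_speed_headers(url):
        """Report whether gzip compression and caching headers are enabled."""
        response = requests.get(
            url,
            headers={"Accept-Encoding": "gzip, deflate"},
            timeout=10,
        )
        # Content-Encoding: gzip means the server compresses responses;
        # Cache-Control tells browsers how long they may cache the page
        encoding = response.headers.get("Content-Encoding", "none")
        cache = response.headers.get("Cache-Control", "not set")
        print(url)
        print(f"  Content-Encoding: {encoding}")
        print(f"  Cache-Control:    {cache}")

    check_speed_headers("https://www.example.com/")

If compression is off or caching headers are missing, enabling them (via your host's control panel or a caching plugin) is usually one of the cheapest speed wins available.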

In addition, you may wish to consider optimizing your site for Googlebot.  There are a number of things you should think about, and we have highlighted them in this article (coming soon).
