Googlebot Crawler Slows if Website Loads Slowly


Googlebot significantly slows down the rate at which it crawls your website if your website takes longer than 2 seconds to load, according to John Mueller at Google.

We have known for some time (since April 2010) that a slow website is bad for SEO, something Matt Cutts has also partly explained. While this ranking factor is said to be fairly minor in influence, Google has never clarified at exactly what speed a website is affected. Research on the matter indicated that time to first byte influenced rankings, but that the total load time of the page did not.

Fast forward to this week, and John Mueller has stated that Googlebot significantly slows down the rate at which it crawls your website if your website takes longer than 2 seconds to load. The revelation came in response to a question on the Webmaster Central Help Forum.

We're seeing an extremely high response-time for requests made to your website (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we'll crawl from your website, and you're seeing that in Fetch as Google as well. My recommendation would be to make sure that your server is fast & responsive across the board. As our systems see a reduced response-time, they'll automatically ramp crawling back up (which gives you more room to use Fetch as Google too).

It is unclear which metric he is referring to, but our guess is either time to first byte or average page download time. The latter may be worth paying attention to because it is recorded in the Webmaster Tools crawl stats. See an example graph of this below:

[Image: Webmaster Tools page download time graph]

We think this gives an accurate view of what Google sees for your website (although with Google you can never be certain which metrics are taken into account), and it can be accessed from your Webmaster Tools dashboard. The sample above shows no issues, so we expect that if you keep the average time spent downloading a page under 2,000 ms (2 seconds), you won't run into many problems. The great thing about this particular metric is that you can see your page download speed history, and it reflects an average across all crawled pages rather than individual ones.
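If you want a rough, do-it-yourself check of the two candidate metrics discussed above, the sketch below times a single request using only the Python standard library. This is an illustration, not how Google measures anything: the 2,000 ms threshold simply mirrors the ~2 second figure from John Mueller's reply, and the URL is a placeholder you would swap for your own pages.

```python
# Rough sketch: measure time to first byte (TTFB) and total download
# time for one URL, using only the Python standard library.
# Assumption: the 2000 ms threshold below echoes the ~2 s figure from
# the forum reply; it is not an officially published limit.
import time
import urllib.request


def measure_load(url, timeout=10):
    """Return (ttfb_ms, total_ms) for a single GET request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)                       # first byte arrives -> TTFB
        ttfb = time.perf_counter() - start
        resp.read()                        # drain the rest of the body
        total = time.perf_counter() - start
    return ttfb * 1000, total * 1000


# Example usage (requires network access; URL is illustrative):
#   ttfb_ms, total_ms = measure_load("https://example.com/")
#   if total_ms > 2000:
#       print("Average download time above 2 s may throttle crawling")
```

Averaging this over a sample of your URLs gets you closer to what the Webmaster Tools graph shows, since that chart reflects all crawled pages rather than a single fetch.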

A high crawl rate is important, as Google will only pick up changes to your website once it sees them. Such changes could be new posts, new comments, or a change in site structure (new tags, etc.) that influences how link juice flows through the website. Other benefits include the freshness boost a website gets when new content is posted. If your website is crawled regularly, that can significantly help its visibility in Google.

What can you do to speed up your website?

Well, our first recommendation is to choose a fast web host that is specifically configured for speed, and our second is to optimize your website itself (this is a great guide for WordPress).

In addition, you may wish to consider optimizing your website for Googlebot. There are a number of things to think about, and we have highlighted them in this article (coming soon).

Need help choosing a hosting provider?
Check out our top user-rated host: SiteGround