The Google Webmasters community has recently posted a video answering questions about how you can control the crawl rate of Googlebot.
Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Googlebot is Google’s web crawling bot (sometimes also called a “spider”). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.
Crawl rate refers to the speed of Googlebot’s requests during the crawl process. It doesn’t affect how often Google crawls your site or how deeply it crawls your URL structure. If you want Google to crawl new or updated content on your site, use ‘Fetch as Google’ instead.
If Googlebot is crawling your site too often and slowing down your server, you can change the crawl rate (the speed of Googlebot’s requests) for sites that are at the root level, for example, www.example.com. Changing the crawl rate can cause problems (for example, Googlebot will not be able to crawl faster than the custom rate you set), so don’t do this unless you are noticing specific problems caused by Googlebot accessing your servers too often.
You can’t change the crawl rate for sites that are not at the root level – for example, www.example.com/folder.
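Before lowering the crawl rate, it helps to confirm that Googlebot is actually responsible for the load. A minimal sketch of that check, in Python, is below: it parses access-log lines (the sample lines and log format are illustrative assumptions, not from the original post) and counts requests per minute from user agents claiming to be Googlebot.

```python
import re
from collections import Counter

# Sample access-log lines in Common Log Format. In practice you would read
# these from your web server's access log instead of a hard-coded string.
SAMPLE_LOG = """\
66.249.66.1 - - [10/Mar/2014:10:01:02 +0000] "GET /page1 HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Mar/2014:10:01:07 +0000] "GET /page2 HTTP/1.1" 200 2345 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/Mar/2014:10:01:09 +0000] "GET /page1 HTTP/1.1" 200 1234 "-" "Mozilla/5.0"
"""

def googlebot_hits_per_minute(log_text):
    """Count requests whose user agent claims to be Googlebot, per minute."""
    counts = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        # Pull the bracketed timestamp, e.g. "10/Mar/2014:10:01:02 +0000".
        match = re.search(r"\[([^\]]+)\]", line)
        if match:
            minute = match.group(1)[:17]  # truncate to "10/Mar/2014:10:01"
            counts[minute] += 1
    return counts

counts = googlebot_hits_per_minute(SAMPLE_LOG)
print(counts)
```

If the per-minute counts are modest, the server slowdown is probably coming from somewhere else, and changing the crawl rate won’t help. Note that the user-agent string can be spoofed, so a high count is a hint, not proof.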
To change the crawl rate:
- On the Webmaster Tools Home page, click the site you want.
- Click the gear icon, then click Site Settings.
- In the Crawl rate section, select the option you want.
The new crawl rate will be valid for 90 days.
Watch the video to learn more about Google’s crawling, crawl rate, and how it can be changed.