Google says crawl demand and crawl rate together make up Googlebot’s crawl budget for your website.
Gary Illyes from Google has written a blog post named What Crawl Budget Means for Googlebot. In it, he explains what crawl budget is, how crawl rate limits work, what crawl demand is, and what factors impact a site’s crawl budget.
First, Gary explained that for most sites, crawl budget is not something they need to worry about. For very large sites, though, it becomes worth examining. “Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is more important for bigger sites, or those that auto-generate pages based on URL parameters,” Gary said.
Here is a short summary of what was published, but I recommend reading the full post.
- Crawl rate limit is designed to keep Googlebot from crawling your pages so much or so fast that it hurts your server.
- Crawl demand is how much Google wants to crawl your pages. This is based on how popular your pages are and how stale their content is in the Google index.
- Crawl budget is “taking crawl rate and crawl demand together.” Google defines “crawl budget as the number of URLs Googlebot can and wants to crawl.”
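The relationship the bullets describe can be sketched as a simple cap: the budget can never exceed what the server can handle (rate limit) or what Google actually wants to fetch (demand). This is a minimal conceptual illustration only, not Google’s actual formula; the function and parameter names are hypothetical.

```python
# Conceptual sketch of crawl budget as described in the post:
# "the number of URLs Googlebot can and wants to crawl."
# Not Google's real algorithm; names here are illustrative only.

def crawl_budget(crawl_rate_limit: int, crawl_demand: int) -> int:
    """Budget is bounded by both the rate limit (what the server can
    tolerate) and demand (what Google wants to crawl)."""
    return min(crawl_rate_limit, crawl_demand)

# A fast server hosting low-demand pages is limited by demand:
print(crawl_budget(crawl_rate_limit=1000, crawl_demand=50))   # 50
# A popular site on a strained server is limited by the rate limit:
print(crawl_budget(crawl_rate_limit=200, crawl_demand=5000))  # 200
```

The takeaway matches Gary’s point: raising server capacity alone does not increase crawling if demand is low, and high demand cannot be fully served if the server cannot keep up.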