What is Crawl Delay?
The crawl-delay directive tells a crawler bot how long to wait between successive requests to your website.
Crawling consumes server resources. Heavy crawling can slow the server down or even overload it and take it offline.
So, if you notice that crawlers are using more server resources than you would like, you can set a crawl delay by adding the following rule to your robots.txt file:
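A typical crawl-delay rule, applying a 10-second delay to all crawlers, looks like this:

```txt
User-agent: *
Crawl-delay: 10
```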
In the rule above, 10 is the delay in seconds; you can change it to any value you like.
However, whether a crawler bot honors the crawl-delay rule is up to its creators. Google and Yandex have stated that they ignore it, while Bing respects it.
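As an illustrative sketch, Python's standard urllib.robotparser module understands the crawl-delay directive, so you can check what delay a well-behaved bot would see for a given user agent (the robots.txt content and the "bingbot" user agent below are just example values):

```python
from urllib import robotparser

# Example robots.txt content: a 10-second delay applied to all crawlers.
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# crawl_delay() returns the delay (in seconds) that applies to the
# given user agent, or None if robots.txt sets no matching rule.
delay = parser.crawl_delay("bingbot")
print(delay)  # prints 10 for this file
```

Because the rule is declared under `User-agent: *`, the same delay is reported for any user agent you query.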