

What is Crawl Delay?

The crawl-delay directive specifies how quickly a bot may request pages from your website.

Crawling uses up a server’s resources. This could slow down the server or even overload it and cause it to go offline.

So, if you notice that crawlers are using up more server resources than you would like, you can set a crawl delay by adding the following rule to your robots.txt file:

Crawl-delay: 10

In the rule above, 10 is a placeholder for the delay in seconds between successive requests. You can change it to any value you choose.
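In a robots.txt file, directives apply to the user agent named in the group above them, so a crawl-delay line is normally placed inside a user-agent group. A minimal sketch (the bot name, the 10-second value, and the `/admin/` path are illustrative, not from the original):

```
# Ask Bing's crawler to wait 10 seconds between requests
User-agent: bingbot
Crawl-delay: 10

# Rules for all other crawlers (no delay specified)
User-agent: *
Disallow: /admin/
```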

However, whether a crawler honors the crawl-delay rule is up to its creators. Google and Yandex have clarified that they do not obey it, while Bing does.
