Crawl-delay – SEO WIKI

Crawl-delay is an obsolete directive for the robots.txt file. It limited how fast search robots could crawl a site's pages by specifying the minimum period of time (in seconds) a robot had to wait between finishing the download of one page and starting the download of the next. Throttling the crawl rate in this way reduced the load on the server and lowered the risk of the site crashing while search bots crawled it.

Currently, neither Google nor Yandex takes this directive into account when indexing sites. Google ignores Crawl-delay and instead slows down or pauses crawling automatically when it detects crawl errors or slow server responses. Yandex replaced Crawl-delay with a dedicated Crawl speed tool in the Indexing section of Yandex.Webmaster.
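For crawlers that still honor the directive, the syntax looked like the following sketch; the 10-second value and the wildcard user agent are illustrative, not a recommendation:

```
# Apply to all robots that read this file
User-agent: *
# Wait at least 10 seconds between page requests
Crawl-delay: 10
```

Since Google and Yandex ignore the line, it has no effect on their bots and only adds noise to the robots.txt file for sites that target those search engines.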