Robots.txt file directives

Directives are commands for search engine robots written in the robots.txt file. If such a file exists, robots crawl the pages of the site in accordance with the directives specified in it. With their help, an SEO specialist can open or block access to certain sections, set the interval at which files are downloaded, and so on.

Each directive must start on a new line. The directive name is followed by a colon and then by its value, such as the path from the site root to the file or directory the rule applies to.
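
A minimal sketch of this format (the path here is hypothetical and only illustrates the syntax):

User-agent: *
Disallow: /admin/

Here Disallow is the directive name, the colon separates it from its value, and /admin/ is the path from the site root that the rule applies to.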

For example, the Allow directive in robots.txt explicitly opens specific sections or pages to robots, while Disallow closes them. The User-agent directive determines which search engine bots the rules that follow apply to. The Sitemap directive in robots.txt gives the address of the site map, as shown in the sketch below.
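
Putting these directives together, a hypothetical robots.txt might look like this (the section names and the sitemap URL are examples only, not recommendations for a real site):

User-agent: Googlebot
Disallow: /search/
Allow: /search/help/

User-agent: *
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml

In this sketch, the first group of rules applies only to Googlebot: the /search/ section is closed to it, but /search/help/ is explicitly reopened with Allow. The second group applies to all other robots, and the Sitemap directive points to the full URL of the site map.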