Disallow directive in robots.txt

Disallow is one of the key directives of the robots.txt file. It tells search engine robots not to crawl the folders or files listed immediately after it. For example, the line "Disallow: /page2/" in robots.txt means that search robots will not crawl (and as a rule will not index) anything under /page2/. To hide the entire site from search robots, write "Disallow: /" in robots.txt.
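A minimal robots.txt sketch illustrating these rules (the paths and bot name are hypothetical examples, not requirements):

```
# Rules for all search robots
User-agent: *
Disallow: /page2/
Disallow: /admin/

# Block one specific (hypothetical) bot from the whole site
User-agent: BadBot
Disallow: /
```

Each User-agent group applies to the named robot; "Disallow: /" blocks every path, while an empty Disallow value would allow everything.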

The purpose of the Disallow directive is to save crawl resources by keeping robots away from pages that are not important for promoting a resource, such as folders with system files or sensitive information.
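To check how a crawler would interpret such rules, Python's standard `urllib.robotparser` module can be used as a quick sketch; the rules and URLs below are hypothetical:

```python
from urllib import robotparser

# Hypothetical robots.txt contents
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /page2/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A disallowed path may not be fetched by any robot ("*")
print(rp.can_fetch("*", "https://example.com/admin/secret.html"))  # False
# Paths not listed under Disallow remain crawlable
print(rp.can_fetch("*", "https://example.com/blog/post.html"))     # True
```

This mirrors what well-behaved crawlers do before requesting a page: match the URL path against the Disallow entries for their User-agent group.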