Why do we have to write robots.txt in a web application? [closed]

It lists URLs that you don't want bots to hit, so you can keep pages out of search indexes and stop CPU-intensive scripts from being repeatedly hit by automated processes. Note that it is advisory only: well-behaved crawlers honor it, but it is not a form of access control.

The syntax is described at robotstxt.org and is basically a series of:

User-agent: $PATTERN

each followed by any number of

Disallow: $PATH_PREFIX
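As a quick sanity check, you can parse a robots.txt with Python's standard-library `urllib.robotparser` and see which URLs a given crawler is allowed to fetch. The user agent `BadBot` and the paths below are hypothetical examples, not anything from the original question.

```python
import urllib.robotparser

# A small example robots.txt: all bots are kept out of /admin/ and one
# expensive script; a hypothetical "BadBot" is disallowed everywhere.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/expensive-report

User-agent: BadBot
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/index.html"))       # True
print(rp.can_fetch("*", "https://example.com/admin/users"))      # False
print(rp.can_fetch("BadBot", "https://example.com/index.html"))  # False
```

In a real site the file must be served at the web root (e.g. `https://example.com/robots.txt`) for crawlers to find it.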
