The robots.txt file resides in the root directory of a website and controls the behavior of web crawlers.
Some web spiders crawl your website for the purpose of generating backlink reports, getting keyword rankings, etc. Such bots waste server bandwidth. A robots.txt file allows you to block bots that don't belong to any of the major search engines.
A robots.txt file can prevent search engines from accessing or indexing certain pages on the site. It can also help search bots find the location of the site's sitemap.
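For example, a simple robots.txt file that keeps all bots out of one directory and points crawlers to the sitemap might look like this (example.com and /private/ are placeholders for your own domain and directory):

User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml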
Like other SEO tools we've built for you, the Robots Txt Generator is easy to use. To generate the robots.txt file data, enter the location of your website's sitemap file. If you want to prevent search bots from indexing your entire site, choose the "Yes" option that appears under the "Block Entire Website" label.
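If you choose "Yes", the generated file will contain a rule like the following, which tells all bots to stay away from every page:

User-agent: *
Disallow: /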
Now, select the bot from the list that appears under the "Bots" label and enter the name of the directory you want the bot to crawl or ignore in the text field below the "Allow Directory" or "Disallow Directory" label. The directory name should begin and end with "/", e.g., /dir/.
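For instance, a rule set that lets Googlebot crawl a blog directory but keeps it out of an admin directory would look like this (the directory names here are only placeholders):

User-agent: Googlebot
Allow: /blog/
Disallow: /admin/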
Specify a crawl delay if you want bots to crawl your pages more slowly. You can ignore this field if your website is hosted on a fast server. Now, click the Generate button.
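A crawl delay rule looks like the example below; it asks supporting crawlers (e.g., Bingbot) to wait 10 seconds between requests. Keep in mind that this directive is non-standard and some crawlers, including Googlebot, ignore it:

User-agent: Bingbot
Crawl-delay: 10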
Note: Our tool allows you to specify different allow/disallow directory and crawl-delay rules for different search bots.
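A generated file with per-bot rules might therefore look like this (again, the directory names are placeholders):

User-agent: Googlebot
Disallow: /drafts/

User-agent: Bingbot
Disallow: /drafts/
Disallow: /images/
Crawl-delay: 5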
Copy the robots.txt data displayed by the tool, go through it carefully, create a new text file called robots.txt, and paste the copied rules into it. Now, upload this file to the root directory of the website. To make sure that everything is OK, test the file with the robots.txt tester in Google Search Console.
Note: Use the tool at your own risk. If the generated syntax is incorrect, please report it to us.