Robots.txt Generator


The generator offers the following options:

  • Default rule for all robots (allow or disallow)
  • Crawl-Delay
  • Sitemap URL (leave blank if you don't have one)
  • Per-robot rules for individual crawlers: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
  • Restricted directories (each path is relative to the root and must contain a trailing slash "/")

Once the text has been generated, create a robots.txt file in your site's root directory and paste the generated text into it.


About Robots.txt Generator

Robots.txt is a plain-text file that any website hosted on the internet can provide. When a search engine starts crawling a website, the first thing it looks for is a file named robots.txt, which lives in the root of the domain. Once the spiders have located the file, they read which links the administrator has blocked or disallowed. If you use a robots.txt file, search engines such as Google and Yahoo can tell which links you don't want indexed. In that sense, robots.txt can be considered the opposite of a sitemap.
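For reference, a minimal robots.txt file might look like the following sketch (the blocked path and the sitemap URL are illustrative placeholders, not values every site should use):

    # Rules for every crawler
    User-agent: *
    # Keep crawlers out of the admin area
    Disallow: /admin/

    # Optional: point crawlers at the sitemap
    Sitemap: https://www.example.com/sitemap.xml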

To generate a robots.txt file, you can use our robots.txt generator, a free online tool that helps you create a robots.txt file for your website. If you have already created a robots.txt file and want to improve it, you can upload it and modify it with the tool. Using the tool is straightforward: paths you don't want search engines to index go in the Disallow section, while paths you do want indexed stay in the Allow section. To change an existing robots.txt file, select the Remove Directive option. You can also create different rules for particular search engines by selecting them from the drop-down list.
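As a sketch of what per-robot rules in a generated file can look like (Googlebot and Baiduspider are the user-agent tokens those engines publish; the blocked paths are hypothetical):

    # Google's crawler may fetch everything
    User-agent: Googlebot
    Disallow:

    # Baidu's crawler is kept out of /private/
    User-agent: Baiduspider
    Disallow: /private/

    # Every other robot is kept out of /temp/
    User-agent: *
    Disallow: /temp/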

If you are creating a robots.txt file for the first time and are wondering what to disallow, you can exclude the items below; a starter file covering them follows the list.

  • The login page of your website.
  • The contact page of your website.
  • Internal site structure (admin areas, scripts).
  • The privacy page.
  • Any media files you don't want to appear in search results.
  • Any image folders you don't want to appear in search results.
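A starter file covering those exclusions could look like this (every path is a hypothetical example; substitute the paths your site actually uses):

    User-agent: *
    # Login and contact pages
    Disallow: /login/
    Disallow: /contact/
    # Internal structure (admin area, scripts)
    Disallow: /admin/
    Disallow: /scripts/
    # Privacy page
    Disallow: /privacy/
    # Media and image folders you don't want in search results
    Disallow: /media/
    Disallow: /images/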

Tips for Optimizing the Robots.txt File

The tips below will help you optimize your robots.txt file.

  • If you are testing new syntax in your robots.txt file, add it at the bottom of the file. Search engines read robots.txt from top to bottom, so if the syntax you added turns out to be incorrect, the directives above it will still be read and applied.
  • Wildcard directives let you write simple statements that disallow every URL matching a pattern. Only some search engines support wildcards, so we suggest adding them at the bottom of the robots.txt file as well (see the sketch after this list).
  • Don't use the robots.txt file to list everything you want indexed. Its purpose is to name the paths you don't want search engines to index, so use robots.txt for disallow directives only.

We hope that the next time you create a robots.txt file, you keep all of the above tips in mind.


