Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must contain a trailing slash "/".
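
For example, to restrict a directory named images (a placeholder name here), enter /images/ with the trailing slash; the generator then emits a matching Disallow rule:

    Disallow: /images/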
 
Now, create a 'robots.txt' file in your site's root directory, then copy the text generated above and paste it into that file.
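
For illustration, a generated file might look like the sketch below (the sitemap URL and directory names are placeholders):

    # Allow all robots to crawl everything except two directories
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

    Sitemap: https://www.example.com/sitemap.xml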


About Robots.txt Generator

Create a Robots.txt File Online:

Robots.txt Generator is an easy online tool that builds a valid robots.txt file for your website. You can copy and fine-tune robots.txt files from other sites or create your own from scratch. Each time search engine spiders crawl your website, they usually start by looking up your robots.txt file, which must be placed in the root folder of your domain (you can upload it through your hosting control panel). If a spider finds that file, it reads it to see which pages are blocked from being crawled and indexed. All of these blocking rules can be generated with this free robots.txt generator. Keep in mind that the pages listed in your sitemap may differ from the rules in your robots.txt file.
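
For instance, a crawler will request the file from the domain root, e.g. https://www.example.com/robots.txt (example.com is a placeholder). A minimal file that blocks one directory for all crawlers looks like this:

    User-agent: *
    Disallow: /private/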

How to Use the Robots.txt Generator:

  1. Select whether you want to allow or disallow all robots to crawl your site.
  2. Set the Crawl-Delay time.
  3. Enter your sitemap URL, if you have one.
  4. Allow or refuse the search robots of each individual search engine.
  5. List the restricted directories, each with a trailing '/' (slash), that you want to exclude from indexing (a sample of the resulting file follows this list).
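
As a sketch, following these steps with all robots allowed by default, a 10-second crawl delay, a sitemap, Baidu refused, and /admin/ restricted (all of these values are placeholders) would produce a file along these lines:

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/

    # Refuse Baidu's crawler entirely
    User-agent: Baiduspider
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml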

When using the robots.txt generator, to see a side-by-side comparison of how your site currently handles search bots versus how the proposed new robots.txt will work, type or paste your site's domain URL or a page on your site into the text box, and then click the Create Robots.txt button. To change a generic Disallow directive into an Allow directive for a custom user agent, create a new Allow directive for that specific user agent covering the content; the matching Disallow directive is then removed for that custom user agent. Finally, use Website Reviewer to confirm that your site's robots.txt file is working correctly, and fix any errors it reports.
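
As a sketch of that Allow/Disallow interplay (the bot name and paths here are hypothetical), the Allow rule for the custom user agent overrides the generic Disallow for that bot only:

    # Every bot is blocked from /downloads/ ...
    User-agent: *
    Disallow: /downloads/

    # ...except ExampleBot, which is allowed to fetch one file there
    User-agent: ExampleBot
    Allow: /downloads/whitepaper.pdf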