Tuesday, September 17, 2013

The Easiest Way to Create a Robots.txt File


First of all, you should understand the importance of the robots.txt file.

A robots.txt file tells web crawlers which pages the website owner has restricted from crawling. When Google's crawler, Googlebot, visits your website, the first thing it does is check the robots.txt file to see whether any pages are listed there. If so, it skips those pages and crawls the rest of the site. That makes it a very important file to upload.
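As a quick illustration of how that check works, Python's standard library includes a robots.txt parser. This is just a sketch: the rule set and the example.com URLs are made up for demonstration.

from urllib.robotparser import RobotFileParser

# A sample robots.txt body (hypothetical rules, for illustration only).
rules = """\
User-agent: *
Disallow: /private_file.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler asks this question before fetching each URL.
print(parser.can_fetch("Googlebot", "https://example.com/private_file.html"))  # False: blocked
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True: allowed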

To create a robots.txt file, just open Notepad (or any plain-text editor), write one of the rule sets below, save the file as robots.txt, and upload it to the root directory of your website via FTP.
Write this rule for all robots to crawl all pages of your website:

                     User-agent: *
                     Disallow:

Write this rule to block all robots from crawling any page of your website:

                     User-agent: *
                     Disallow: /

Write this rule for all robots to crawl all pages except one specific page:

                     User-agent: *
                     Disallow: /private_file.html

Write this rule to block all robots from entering these four directories:

                     User-agent: *
                     Disallow: /cgi-bin/
                     Disallow: /images/
                     Disallow: /admin/
                     Disallow: /users/
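Before uploading, you can double-check a rule set with the same standard-library parser. A minimal sketch, again using a made-up example.com domain and sample paths:

from urllib.robotparser import RobotFileParser

# The four-directory rule set from above.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /admin/
Disallow: /users/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Only the /blog/ path should print as allowed.
for path in ("/admin/settings.html", "/images/logo.png", "/blog/post.html"):
    allowed = parser.can_fetch("*", "https://example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")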
