What Is Robots.txt and How to Use It

Robots.txt is used to instruct search engine robots which parts of your site they may crawl and index. Most sites have folders and files that search engine robots do not need to visit, so creating a robots.txt file can help with your SEO.

A robots.txt file is a simple plain text file that can be created with Notepad and must be placed in the root directory of the website,
for example: https://www.webmastersun.com/robots.txt
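
If you are curious what a live robots.txt file looks like, you can fetch one and print it. Here is a minimal sketch using Python's built-in urllib.request module; the URL is the example above, and any public site's /robots.txt address works:

import urllib.request

# Fetch and print a site's robots.txt (any public /robots.txt URL works)
with urllib.request.urlopen("https://www.webmastersun.com/robots.txt") as response:
    print(response.read().decode("utf-8"))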

User-agent: * means that all search robots, such as those from Google, Yahoo, and MSN, should follow the rules below when crawling your site.

Disallow: /your-folder/ tells search engines that they should not “rummage” through any files on the website whose paths begin with /your-folder/.

Instructions for creating a robots.txt file
To create a robots.txt file, open Notepad, type in your directives, save the file as robots.txt, and upload it to the root directory of your website.

The three most basic directives for a robots.txt file are as follows:

User-agent: *
Allow: /folder-name/
Disallow: /folder-name/

User-agent: specifies which search engine robot the rules apply to (* means all of them).
Allow: permits the robot to crawl the specified folder or page.
Disallow: blocks the robot from crawling the specified folder or page.
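
Putting these together, a minimal robots.txt might look like the following; the folder names here are placeholders, so substitute the folders you actually want to block:

User-agent: *
Disallow: /admin/
Disallow: /tmp/

Anything not matched by a Disallow line remains crawlable by default.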

Some things to avoid when using robots.txt

Do not put notes or comments in the robots.txt file; they can confuse some search engine spiders.
Do not leave any white space at the beginning of a command line. The directives should start flush left, like this:

User-agent: *
Disallow: /folder-name/

Do not change the order of the command lines. The User-agent line must come before the Disallow lines, so do not write:

Disallow: /foldername/
User-agent: *

Do not list more than one folder on a single Disallow line, like this:

User-agent: *
Disallow: /foldername1/foldername2/foldername3/

It should instead be written like this:

User-agent: *
Disallow: /foldername1/
Disallow: /foldername2/
Disallow: /foldername3/
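
If you want to double-check your rules before uploading the file, here is a minimal sketch using Python's built-in urllib.robotparser module; the folder names and the example.com URLs are placeholders:

from urllib import robotparser

# The corrected rules from the example above
rules = """User-agent: *
Disallow: /foldername1/
Disallow: /foldername2/
Disallow: /foldername3/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch() reports whether a given user agent may crawl a URL
print(rp.can_fetch("*", "https://www.example.com/foldername1/page.html"))  # False: blocked
print(rp.can_fetch("*", "https://www.example.com/blog/page.html"))         # True: allowed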

I hope this will help you create a robots.txt file suitable for your site!
