
What is robots.txt and how does it work on a website?


Robots.txt is a plain-text file that tells search-engine crawlers how to access a website. The crawlers themselves go by many names: slurps, web crawlers, spiders, robots, bots, and so on. The robots.txt file is uploaded to the website's root directory, where crawlers look for it before visiting the site's pages.

Many new websites are published on the web every day, so robots and search-engine crawlers constantly scour the web, collecting data from public sites and posts. The collected data is used for many different purposes by many different people and companies.

The robots.txt file is used to give web robots and spiders instructions about the site. Website authors can use robots.txt to keep co-operating web robots from accessing all or part of a website that they want to keep private. Search engines use web crawlers to index the web and provide relevant search results.
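As an illustration, a hypothetical robots.txt might look like this (the paths and the Googlebot rule are made-up examples, not from any particular site):

```text
# Apply these rules to every crawler
User-agent: *
# Keep co-operating robots out of these private areas
Disallow: /private/
Disallow: /admin/

# Give Google's crawler its own rule set
User-agent: Googlebot
Disallow: /admin/
```

Each `User-agent` line names a crawler (`*` means all of them), and the `Disallow` lines beneath it list the paths that crawler is asked not to visit.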

When we publish a website and search engines index it, the site becomes readable by everyone. Search-engine robots go around websites and collect information and data from every page they are allowed to crawl.

That information then keeps appearing in search results whenever someone searches for it. For this reason, without any restrictions, anyone could see a site's personal information, and the site would lose its privacy.

If we do not want to give away personal information, we can note which parts of the site should stay out of search results. Website owners can control this privacy (if they want) through robots.txt: a co-operating crawler will not index the posts or pages that robots.txt disallows.

The robots.txt file must always be in plain-text format, and it always lives in the site's root folder. If you would like to see yours, visit www.yourwebsitename.com/robots.txt.
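To sketch how a co-operating crawler reads these rules, Python's standard urllib.robotparser module can parse a robots.txt file and answer whether a given URL may be fetched. The file content and URLs below are made-up examples for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (normally fetched from the site root)
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler asks before fetching each URL
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://www.example.com/public/page.html"))   # True
```

Note that robots.txt is advisory: it only keeps out crawlers that choose to check it, so it is not a security mechanism for truly sensitive pages.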

But it is vital to note that if you want your website to rank, your robots.txt should permit search-engine crawlers to enter your site and collect its information.
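For example, the most crawl-friendly robots.txt simply allows every crawler everywhere (this is a generic template, not taken from any specific site):

```text
User-agent: *
Disallow:
```

An empty Disallow line blocks nothing, so all co-operating crawlers may index the whole site.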

I hope everybody now understands what robots.txt is and how it works on a website.



