robots.txt

Robots.txt is a special text file that web developers use to help search engines index their website correctly.

Most websites have directories and pages that should not be indexed by search engines: for example, printable versions of pages, registration and authentication pages, administrative areas, and various technical folders.

In addition, webmasters may want to give search engines extra indexing information, such as the location of the sitemap.xml file.

All these tasks are handled by the robots.txt file. It is a plain text file in a specific format that you place in the root directory of your website so that web crawlers know how to index its contents. The full specification of the file format can be found on the Google Developers portal.
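As an illustration, a minimal robots.txt might look like the sketch below. The directory names and sitemap URL are placeholders, not part of any standard; adjust them to match your own site:

```
# Rules apply to all crawlers
User-agent: *

# Keep technical and duplicate-content areas out of the index
# (example paths; replace with your site's actual directories)
Disallow: /admin/
Disallow: /print/

# Point crawlers to the sitemap (absolute URL required)
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved crawlers follow it, but it is not an access-control mechanism, so sensitive pages still need proper authentication.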

To learn more, read about Robots.txt in our blog.
