
Use of robots.txt file?

It basically lets bots like Google and Bing know whether they can crawl your website and which parts of the website should not be crawled.
 
Robots.txt is a simple text file kept in the root of a website that tells search engines' bots what to look at and what not to look at inside the site.

It's not mandatory to use a robots.txt file, but it is highly recommended by SEOs and search engines to manage how your site is crawled and indexed.
Example:
User-agent: *    # * means all search bots
Disallow: /      # / blocks the whole site; a specific directory can be listed instead so the bot won't check inside that directory
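
For example, a well-behaved crawler can check those rules programmatically before fetching anything. Here is a minimal Python sketch using the standard library's urllib.robotparser; example.com and the MyCrawler agent name are just placeholders:

from urllib import robotparser

# Point the parser at the site's robots.txt (example.com is a placeholder)
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # downloads and parses the file

# Ask whether a given user agent may fetch a given URL
if rp.can_fetch("MyCrawler", "https://example.com/some/page.html"):
    print("Allowed to crawl")
else:
    print("Disallowed by robots.txt")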
 
It's worth pointing out that while it's supposed to tell bots whether or not they should crawl your site, some of them just ignore it and crawl anyway!

But it's good practice to have one, and most of the big players do honor it.
 
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. It is widely recommended by SEOs and search engines as a way to control what gets crawled.
Example: it looks like this:
User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
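
Just to illustrate how those per-agent rules play out, here is a rough Python sketch that feeds the exact lines above into urllib.robotparser (the example.com URLs are made up):

from urllib import robotparser

rp = robotparser.RobotFileParser()
# Parse the rules from the post directly; no network access needed
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /images/",
    "Disallow: /temp/",
    "Disallow: /cgi-bin/",
])

# Googlebot is blocked from the listed directories...
print(rp.can_fetch("Googlebot", "https://example.com/images/logo.png"))  # False
# ...but may still crawl everything else on the site
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))       # True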
 
Robots.txt is a text file containing instructions for search engine robots: it lists which webpages are allowed and which are disallowed for crawling.
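
If you want to see that allow/disallow split in action, here is a small sketch (the paths are invented for the example). Note that Python's urllib.robotparser applies the first matching rule, so the Allow line goes before the broader Disallow:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /private/public/",  # open up one subfolder...
    "Disallow: /private/",      # ...while blocking the rest of the directory
])

print(rp.can_fetch("*", "https://example.com/private/secret.html"))       # False
print(rp.can_fetch("*", "https://example.com/private/public/page.html"))  # True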
 
