Robots.txt Tips For Dealing With Bots
Jan 15, 2010, 03:02
[ Thanks to Andrew Weber for this link. ]
"The robots.txt file provides crawling instructions to web robots using the Robots Exclusion Protocol. When a web robot visits your site, it checks this file, robots.txt, to discover any directories or pages you want excluded from the robot's index on the search engine. This file shapes how search engines crawl and index a site, and so can affect SEO and rankings.
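As a rough sketch of how a crawler interprets these rules, Python's standard-library `urllib.robotparser` can parse a robots.txt and answer whether a given URL may be fetched. The robots.txt contents and the example.com URLs below are hypothetical, purely for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that excludes one directory for all robots
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# URLs under /private/ are excluded; everything else may be crawled
print(parser.can_fetch("*", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))         # True
```

A well-behaved crawler performs exactly this check before requesting each page; note that robots.txt is advisory, so misbehaving bots can simply ignore it.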