Linux Today: Linux News On Internet Time.


Dynamic Robots.txt Rules for Apache Security

Feb 06, 2009, 19:33
(Other stories by Ken Coar)



"This one is rather more complex, so let's go through it field by field, and follow that with some examples. The value of the 'pattern' field is key; it is to this value that aspects of the client request are compared to see if a particular rule matches or not. It might contain a string or a regular expression; how it is interpreted is controlled by the 'mode' field.
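The article's actual rule table isn't reproduced here, so the following sketch assumes a rule is a dict with hypothetical 'pattern' and 'mode' keys, matching the two interpretation modes the paragraph describes (literal string versus regular expression):

```python
import re

def rule_matches(rule, value):
    """Return True if `value` matches the rule's 'pattern'.

    The 'mode' field controls how 'pattern' is interpreted:
    'regex' means a regular-expression search, anything else
    means a literal string comparison. Key names are illustrative.
    """
    if rule["mode"] == "regex":
        return re.search(rule["pattern"], value) is not None
    return rule["pattern"] == value  # literal comparison

literal_rule = {"pattern": "BadBot/1.0", "mode": "literal"}
regex_rule = {"pattern": r"^Mozilla/.*Firefox", "mode": "regex"}

print(rule_matches(literal_rule, "BadBot/1.0"))               # True
print(rule_matches(regex_rule, "Mozilla/5.0 Firefox/3.0"))    # True
```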

"Ordinarily, the value of the user-agent line of the robots.txt stanza will come directly from the request's 'User-agent' header field. The 'alias' field in the table provides a means to override this. For instance, the rule may actually have matched Firefox, but you can say that it matched Opera.
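The alias behaviour described above can be sketched as follows; the 'alias' key and the treatment of an empty alias as "no override" are assumptions, not the article's exact schema:

```python
def user_agent_for_stanza(rule, request_user_agent):
    """Choose the User-agent value for the generated robots.txt stanza.

    Ordinarily the request's own User-Agent header is echoed back;
    a non-empty 'alias' field in the matching rule overrides it.
    """
    return rule.get("alias") or request_user_agent

# A rule that matched Firefox but reports the stanza as Opera:
rule = {"pattern": "Firefox", "mode": "literal", "alias": "Opera"}
print(user_agent_for_stanza(rule, "Firefox"))  # Opera

# Without an alias, the request's own User-Agent is used:
rule_no_alias = {"pattern": "Firefox", "mode": "literal", "alias": ""}
print(user_agent_for_stanza(rule_no_alias, "Firefox"))  # Firefox
```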

"The awkwardly named 'field' field specifies which aspect of the request is to be matched against the pattern. I have found use only for the user-agent and the IP address, but there is no reason others might not be used."
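A minimal sketch of how the 'field' field might select the request aspect to compare, covering only the two uses the article mentions; the field names and the shape of the request dict are hypothetical:

```python
def request_aspect(rule, request):
    """Return the aspect of `request` named by the rule's 'field' field.

    Only 'user-agent' and 'ip' are handled here, mirroring the two
    cases the article describes; other aspects could be added the
    same way.
    """
    if rule["field"] == "user-agent":
        return request["user_agent"]
    if rule["field"] == "ip":
        return request["remote_ip"]
    raise ValueError("unsupported field: %r" % rule["field"])

request = {"user_agent": "BadBot/1.0", "remote_ip": "192.0.2.7"}
print(request_aspect({"field": "ip"}, request))          # 192.0.2.7
print(request_aspect({"field": "user-agent"}, request))  # BadBot/1.0
```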

