The robots.txt syntax looks correct, so the problem is probably that the User-agent identifier doesn't match the crawler that is hitting you. Try "*" for the User-agent to see whether the traffic really is coming from a web crawler; if it is, Google and other legitimate crawlers will stop. The "*" is a wildcard that matches all user-agents.
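
For example, a minimal robots.txt that tells every compliant crawler to stay out of the entire site looks like this (note that the file must be served from the root of the site, i.e. reachable as /robots.txt):

User-agent: *
Disallow: /

If the entries in your logs stop after that change, you'll know a well-behaved crawler was the source.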


>>> <[log in to unmask]> 5/29/2007 12:49 PM >>>
...
We've tried; perhaps we're doing it wrong. We put the following in
inetsrv\wwwroot:

User-agent: googlebot
Disallow: /

Questions:
 Am I doing the right thing?
 Will this stop the messages in the LISTSERV logs, or will they continue
 to show?