Date: Wed, 26 Oct 2005 17:35:23 -0500
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Type: text/plain; charset="US-ASCII"
Thanks. I'll try that.
Chris Simon
ITS Helpdesk
[log in to unmask]
-----Original Message-----
From: LISTSERV site administrators' forum
[mailto:[log in to unmask]] On Behalf Of Rich Greenberg
Sent: Wednesday, October 26, 2005 5:34 PM
To: [log in to unmask]
Subject: Re: Google problems
On Wed, Oct 26, 2005 at 04:08:06PM -0500, Chris Simon wrote:
} Has anyone seen a problem with their server in regards to a Google
} crawler? The crawler has been going through our archives about twice a
} week causing high CPU usage for an hour or so. It has been happening on
} off-hours so it hasn't had a huge impact, but I'm curious to see if
} anyone else has experienced this.
Put a "robots.txt" file in the archive directory (or the web root). Most
spiders will honor it. You can get the format by googling.
I think the following is a "keep out of everything" version:
User-agent: *
Disallow: /
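If only Google's crawler is the problem, the rules can also be scoped to a
single user agent and path instead of blocking everything. A sketch (the
/archives/ path below is hypothetical; substitute whatever path your
LISTSERV web interface actually serves the archives from):

User-agent: Googlebot
Disallow: /archives/

Other well-behaved crawlers would be unaffected by a Googlebot-only rule.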
--
Rich Greenberg  N6LRT  Marietta, GA, USA  richgr atsign panix.com  +1 770 321 6507
Eastern time zone.  I speak for myself & my dogs only.  VM'er since CP-67
Canines: Val, Red & Shasta (RIP), Red, husky    Owner: Chinook-L
Atlanta Siberian Husky Rescue.  www.panix.com/~richgr/  Asst Owner: Sibernet-L