Date: Wed, 27 Dec 2006 09:26:29 -0500
Organization: ISS Enterprise Systems
Aye, but this was part of a list clean-up, and we had lists that had NEVER
been used within the log window. Believe me, I'll be celebrating when I see
our current Listserv box drop off the top-20 page. FWIW, we're talking
about 6 or 7 GB of logs.
Bill Brown wrote:
>> Once upon a time, I had to do something similar. I moved the logs to our
>> research computing machine and wrote a multithreaded Perl script to do
>> the scanning and analysis. Using 6 of the 8 processors, the job took ~4
>> days to run, and that was with only about 10 months of logs. Hope your
>> site doesn't see too much volume, and with 2 years of logs to search, I
>> hope you have some CPU cycles to spare.
>>
>
> How big were the log files you were looking at?
>
> Since the goal of the original request was to see the last post in each
> list, could you have just done "tail -50" (or whatever number would make
> sense) on each log and processed the results of that rather than looking
> at the beginning of the log, since that info would be superseded by
> subsequent lines in the log?
>
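The tail-based approach above could be sketched roughly like this (a hypothetical helper, assuming one log file per list and that each post carries a "Date:" line, so the last such line near the end of the file is the most recent post):

```shell
# last_post: print the most recent "Date:" line found near the end of a
# log file, reading only the final 50 lines instead of the whole file.
last_post() {
    tail -n 50 "$1" | grep -i '^date:' | tail -n 1
}

# Hypothetical usage -- loop over one log per list (path is made up):
# for log in /path/to/listserv/logs/*.log; do
#     printf '%s: %s\n' "$(basename "$log")" "$(last_post "$log")"
# done
```

Reading only the tail of each file keeps the work proportional to the number of lists rather than the 6-7 GB of log data, which is the point of the suggestion.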
--
Christopher Wilson
Information Systems Coordinator
ISS Enterprise Systems
The George Washington University
202-994-3878
[log in to unmask]