Sender:       Revised LISTSERV forum <LSTSRV-L@DEARN>
Subject:
From:         "A. Harry Williams" <HARRY@MARIST>
Date:         Wed, 20 Apr 88 15:02:52 EDT
In-Reply-To:  Message of Wed, 13 Apr 88 12:58:04 SET from <REICHETZ@AWIIMC11>
Reply-To:     Revised LISTSERV forum <LSTSRV-L@DEARN>

>Since the LISTS Database already exists on some LISTSERVs I'd like to have
>numbers on the resources (esp. DASD space) needed to house it. If they are
>reasonable I'd also take it.
>Christian
>P.S.:
>I got complaints from several users saying "I've read the MEMO but it doesn't
>tell me the name of any Database to start with". I must agree that this point
>is not very clear in the description.
>What if your first 'SELECT *' with no additional parms would give you a list
>of available databases ? (Even if you're smart enough to use the name of a
>list you'd never find BITEARN or LISTS)
><CR
Ross, Eric, and I have recently been talking a little about the LISTS
database, since all three of us run it, and several questions have come up
about it. I don't have enough experience with it yet to give any performance
numbers; however, I can give the amount of DASD space required.
As of today there are 734 files in the database on my server. They
are named $nnnnnnn LDBASE, and they account for the bulk of the space required.
I have tried several different blocksizes on 3380s to see how much
space is required. Using a 20-cylinder 3380, I copied all the $* files
to that disk from my A-disk. Here are the numbers I got for each
blocksize:
                    4K      2K      1K
#blocks           3000    5400    9300
#blocks used       780     871    1505
% used              26      16      16
Largest file         2       3       6
Distribution     14-2B   14-3B    3-6B
                720-1B   33-2B   11-5B
                        687-1B    5-4B
                                 28-3B
                                270-2B
                                417-1B
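The tradeoff the table shows (smaller blocks waste less space per file, but each file needs more of them) can be sketched with a little arithmetic. This is just an illustration with made-up file sizes, not a reconstruction of the actual 734 LDBASE files; the raw block counts here also ignore any per-file filesystem overhead, which is presumably why they would not match the "#blocks used" row above exactly.

```python
import math

def blocks_needed(file_sizes_bytes, blocksize):
    """Return (total blocks, bytes lost to internal fragmentation).

    Each file occupies ceil(size / blocksize) whole blocks, so the last
    block of every file is partially wasted.
    """
    blocks = sum(math.ceil(s / blocksize) for s in file_sizes_bytes)
    wasted = blocks * blocksize - sum(file_sizes_bytes)
    return blocks, wasted

# Hypothetical example: three small files, compared at 4K, 2K, and 1K.
sizes = [500, 1500, 3000]
for bs in (4096, 2048, 1024):
    blocks, wasted = blocks_needed(sizes, bs)
    print(f"{bs:5d}-byte blocks: {blocks} blocks, {wasted} bytes wasted")
```

Running this shows the same pattern as the table: the 1K disk needs the most blocks but wastes the least space inside them.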
The largest files dealt with I-KERMIT or IBMPC-L and were generally
90-110 lines long. Updating time doesn't seem to be much more than
the normal update for the network-wide lists: all servers get the material;
those of you without the LISTS database just throw it
away. If I get a chance to gather some performance numbers, I'll
post them. As it stands, Eric had recommended a 2K blocksize, and from
the above, I tend to agree with him.
Harry