On Aug 4, 11:42, Dave Gomberg wrote:
} Subject: Re: Re SCIFAQ-L errors

Your last message showed a complete lack of understanding of the situation. This one is a bit better.

} Posting of duplicates: I guess I would argue that deliberate posting of
} duplicates is a waste of bandwidth and therefore should not be encouraged.

Reposting FAQs on a periodic basis is standard practice on USENET, and in many cases it works quite well at reducing USENET bandwidth. I post to USENET, not to a LISTSERV. Why is a LISTSERV imposing its own policies on USENET postings?

What is particularly annoying is that a listserv named "SCIFAQ-L", which exists solely to remail sci.* FAQs, isn't cognizant of the standard FAQ practice of periodic reposting, and beats sci.* FAQ posters around the brain for simply doing what they've always done, and will continue to do. The whole intent behind SCIFAQ-L was to forward these articles, which are often unchanged from issue to issue by their very nature. Thus, the very creation of SCIFAQ-L encouraged the FAQ-posting "culture".

[Una, lest you misunderstand - we consider the FAQs valuable; otherwise we wouldn't do them. We encourage their further distribution; otherwise, again, we wouldn't do them. But it really doesn't help to have to pay per-byte communications costs for bogus bounces, as I do.]

} If the poster cannot even be bothered to date the posting,

The USENET software dates it for me. I don't send FAQs to LISTSERVs; I send them to USENET and conform to USENET practice and standards. Why should I program around LISTSERV, then? Should I also have to program around Waffle with its 30K limit? Must I lie in my Sender/From/Reply-To: headers to get PostalUnion to send its bogus "you're not allowed to post" bounces into /dev/null? Was I supposed to program around the server that cut my FAQ up into 6 pieces and reposted them under someone else's name? Nonsense. The software should be fixed.
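The fix on the gateway side is simple enough to sketch. Here is a minimal, purely illustrative duplicate filter - hypothetical Python, not LISTSERV's actual code, and the `window_size` parameter is my invention - that keeps checksums of recently seen article bodies in a sliding window and silently drops repeats, logging them locally instead of bouncing them back at the poster:

```python
import hashlib
from collections import OrderedDict

class DuplicateFilter:
    """Drop articles whose body checksum was seen recently.

    Hypothetical sketch: window_size loosely approximates a
    "checksum window"; this is not LISTSERV's real algorithm.
    """

    def __init__(self, window_size=100):
        self.window_size = window_size
        self.seen = OrderedDict()  # body checksum -> message-id of first copy

    def accept(self, message_id, body):
        """Return True to gateway the article, False to drop it quietly."""
        digest = hashlib.md5(body.encode()).hexdigest()
        if digest in self.seen:
            # Duplicate within the window: log it, do NOT bounce it
            # back to the originator.
            print("dropped duplicate %s (same body as %s)"
                  % (message_id, self.seen[digest]))
            return False
        self.seen[digest] = message_id
        if len(self.seen) > self.window_size:
            self.seen.popitem(last=False)  # forget the oldest checksum
        return True
```

Under this scheme a periodic FAQ repost whose body is unchanged simply disappears into the error log, and once its checksum ages out of the window the next repost passes through again - no mail ever goes back at the poster.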
} I don't think Eric (or anyone else) should make a serious effort to
} facilitate double postings.

We're not asking him to facilitate double postings, because we *aren't* double posting. Nor do we expect him to special-case FAQs so that they're transferred in spite of violating a list owner's duplicate checking - we really don't care what LISTSERV does with duplicate postings within the LISTSERV's checksum window, as long as it doesn't bounce them back at the originator. All we're asking is that if they're gatewaying USENET postings, and if they must impose additional restrictions, they conform to gatewaying conventions and don't bug us. If the only way to gateway USENET is to discard *all* duplicates regardless of source, with only an error log, so be it.

} If somebody's faqs are so important, they should be archived
} at the recipient, or rerequested when needed.

My 5 FAQs are archived for anonymous FTP, and they include instructions on how to retrieve new copies. But I don't provide a FAQ-mailing service, and not everybody has FTP/WWW/Gopher/Prospero or whatever. I wrote the FAQs, keep them up to date, and maintain the roboposter - plus several distributed and supported medium-to-major software packages and a mailing list[*]. That is enough. Why should I have to program around busted news gateways? What will it be next week?

[*] And indeed, I use LISTSERV as a list exploder to keep my costs down. For that I am both greatly indebted to Bill Gruber for his assistance, and *quite* appreciative of a *really* excellent piece of software called LISTSERV. An *extremely* good job, Eric. But it has this one major annoyance that makes it quite unfriendly in the multi-media electronic world.
--
Chris Lewis; [log in to unmask]; Phone: Canada 613 832-0541
Psroff 3.0 info: [log in to unmask]
Ferret list: [log in to unmask]