PREFACE
=======
Again I found a little tool that needed to be fixed. But I
totally rewrote it, and now it works 100% for me.
Hopefully for you as well.
No copyright this time. Just public domain.
March 1995, goldt@math.tu-berlin.de

New maintainer:
July 1995, boby@pixi.com

LOCATION
========
Site1	     = sunsite.unc.edu
Path1        = /pub/Linux/system/Mail/news
File1        = suck-2.3B.tar.gz

Site2        = tsx-11.mit.edu
Path2        = /pub/linux/sources/sbin
File2        = suck-2.3B.tar.gz

INTRODUCTION
============

READ README.NEWS FIRST!

This package contains the following software for copying news from an
NNTP server to your local machine, and copying replies back up to
an NNTP server.

suck		Pulls a small newsfeed from an NNTP server, avoiding the
			NEWNEWS command.

lpost		Feeds articles fetched by suck to the local server.

rpost		Posts article(s) to an NNTP server (like inews).

get.news	A script that ties all of this together.

put.news	A script that calls rpost (see below).

NOTE:
Suck will not work with obsolete NNTP servers that cannot handle
the XHDR command. inn-1.4sec comes with its own NNTP server, which
handles XHDR; cnews.CR-G has no NNTP server of its own, so it depends
on which NNTP server you connect to cnews. To get debugging output in
suck.debug, edit the Makefile, add -DDEBUG1 -DDEBUG2 to OPTS, and
recompile suck.


SUCK
====
Suck works in two modes.

1. stdout mode
	%suck
or	%suck myhost.com

	Suck grabs news from an NNTP server and sends the articles to
stdout. Suck accepts the name of an NNTP server as an argument; if
you don't give one, it uses the environment variable NNTPSERVER.
You can redirect the articles to a file or, better, compress them on
the fly, like "suck server.domain | gzip -9 > output.gz". What you do
with the articles is then up to you. Maybe the output is already on
your local machine because you used a SLIP line, or maybe you still
have to transfer it to your local machine.

2. batch mode
	%suck -b[ir] batchfile
or	%suck myhost.com -b[ir] batchfile

	Suck will grab news from an NNTP server and store the articles
in files, one for each article.  The location of the files
is based on the defines in both.h.  Once it is done downloading
the articles, it builds a batch file which can be processed
by either innxmit or rnews.
	-bi - build a batch file for innxmit.  The articles are
		left intact, and a batch file is built with a
		one-per-line listing of the full path of each
		article.  Then innxmit can be called:
	e.g.: innxmit localhost batchfile

	-br - build a batch file for rnews.  The articles are
		concatenated together, separated by the "#! rnews size"
		article separator.  This can then be fed to
		rnews:
	e.g.: rnews -S localhost batchfile
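To make the -br batch format concrete, here is a small sketch that builds a similar batch by hand from two made-up article files (file names and contents are illustrative only; suck builds this file for you):

```shell
#!/bin/sh
# Sketch of the "#! rnews <size>" batch format that -br produces.
# The two sample articles below are fabricated for illustration.
printf 'Subject: test one\n\nbody one\n' > article.1
printf 'Subject: test two\n\nbody two\n' > article.2

batch=batchfile
: > "$batch"
for art in article.1 article.2; do
    size=`wc -c < "$art" | tr -d ' '`    # article length in bytes
    echo "#! rnews $size" >> "$batch"    # rnews article separator
    cat "$art" >> "$batch"
done
```

Each article in the batch is preceded by its own size line, which is how rnews knows where one article ends and the next begins.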
	

The sucknewsrc file
-------------------
	Suck looks for a file called sucknewsrc to see which articles you
want and which you have already received. The format of sucknewsrc is
very simple. It consists of one line for each newsgroup.  Each line
contains two fields, separated by a space.  The first field is the name
of the group.  The second field is the highest article number that was
in the group when that group was last downloaded.

	To add a new newsgroup, just stick it in sucknewsrc with a
highest article number of 0.  When suck finds a highest article number
of -1, it does not suck any news from that group.  When suck is
finished, it creates the file suck.tmp, which contains the new
sucknewsrc with the updated article numbers.
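Putting this together, a sucknewsrc might look like the following (the group names and article numbers are made up): one group last fetched at article 1243, one newly added group, and one group marked so that suck skips it.

```
comp.os.linux.announce 1243
alt.test 0
rec.humor -1
```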

	So now you get the picture. To get a newsfeed automatically, even
without support from the news administration, you could write a little
shell script for the remote site that:

a) copies suck.tmp to sucknewsrc.
b) sucks news and redirects the output to a file.
c) transfers the file to your local host if it isn't already there.
d) logs you out.

	Suck uses a variety of temporary and working files during
operation.  You don't need to worry about them, because they are
recreated every time you run suck.  Suck does not itself clean up any
of these temporary files when it is done.

suck.restart - This file is created to allow a restart in case of
	interruption.  It will restart with the last incomplete
	article downloaded.
suck.tmp     - sucknewsrc with the updated article counts
suck.index   - list of articles the NNTP server says we haven't 
		received
suck.sorted  - suck.index deduped so we don't download articles twice,
		e.g. when an article is cross-posted to multiple groups.
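Since suck leaves these files lying around, a wrapper script could install the updated sucknewsrc and clean up after a successful run. A sketch (the touch line only fabricates the files a real run would leave behind):

```shell
#!/bin/sh
# Sketch: housekeeping after a successful suck run.
# The touch line stands in for the files a real run would create.
touch suck.tmp suck.restart suck.index suck.sorted
mv suck.tmp sucknewsrc                       # install updated article counts
rm -f suck.restart suck.index suck.sorted    # remove the leftovers
```

Only do this after a clean run; suck.restart is what lets an interrupted run pick up where it left off.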

LPOST
=====
NOTE - use the batch function of suck and avoid lpost; it is
not the most efficient way of doing things.  Unless I get a lot
of demand, it will be going away in the future.

 lpost (local post) reads stdin and pipes the articles to rnews.  You
need rnews, which is typically found in /usr/lib/news. If it is not
there, you should create a symbolic link to rnews in that directory
(while you are at it, you could create a symbolic link for inews as
well), like ln -s bin/input/rnews rnews.  A typical way to use lpost
is "zcat output.gz | lpost".  This could take a while, so be patient.
If you want to see what lpost is doing, invoke it with the option -v.

RPOST
=====
rpost (remote post) posts articles from the local machine to a remote
NNTP server (the opposite of suck).  It works in two modes.

1. stdin mode
	 rpost reads one article from stdin and sends it to your
NNTP server. The article must have a header of at least two lines,
namely Newsgroups: and Subject:, and a body (the message). Header and
body have to be separated by a blank line.  Rpost does not change the
message in any way.
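For example, a minimal article for stdin mode could be composed like this (newsgroup, subject, and body are made up):

```shell
#!/bin/sh
# Sketch: a minimal article for rpost's stdin mode.
# Two header lines, a blank separator line, then the body.
cat > article <<'EOF'
Newsgroups: alt.test
Subject: rpost stdin-mode example

This is the body of the message.
EOF
# It would then be posted with something like:  rpost < article
```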

2. batch mode
	rpost -b batchfile -p prefix -f filter filter_arg1 ...

	-b batchfile
	
	A listing of the articles to be posted: one article per
line, with each line being the path to an article's file.  On my
system, I just point it at the out.going file produced by inn:

	e.g.: -b /usr/spool/news/out.going/pixi

	-p prefix
	If the batchfile for the above articles does not contain
	full paths, but rather partial paths, this parameter must
	be specified.  This is useful when the out.going file is
	generated automatically by another program.  Inn lists the
	paths in the out.going file relative to its base directory
	/usr/spool/news.  In that case I just use
	-p /usr/spool/news

	-f filter $$o=<outfile> $$i filter_arg2 ...
	In many cases, each article must be massaged before the
	remote NNTP server will accept it.  This option lets you do
	that.  There are two required parameters:

	$$i - When rpost is called, it will replace $$i with the file
		name of the message to be massaged.
	$$o=<outfile>	- <outfile> is the name of the file produced by
		your filter that will get uploaded to the remote NNTP server.
	arg2 ... - any additional args needed can be specified.

	$$o can be specified anywhere on the command line.

	This sounds confusing, so let me try to clarify it a bit.  Say
your NNTP server doesn't like messages with NNTP-Posting-Host filled in.
I could write a one-line filter script that calls sed to delete this
header from my messages.

------------myscr
#!/bin/sh
#sample filter script
sed -e "/^NNTP-Posting-Host/d" $1 > /tmp/FILTERED_MSG
------------end myscr script 

Then I call rpost like this:

	rpost -b /usr/spool/news/out.going/pixi -f myscr $$o=/tmp/FILTERED_MSG $$i

Then, before each message is uploaded, myscr is called with the file
name of the message to be cleaned.  It puts the cleaned-up message in
/tmp/FILTERED_MSG, which is then uploaded to your NNTP server.
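To see the filter in action without rpost, you can run the same sed command over a made-up article by hand (/tmp/raw_msg is just a stand-in for the file name rpost substitutes for $$i):

```shell
#!/bin/sh
# Sketch: run the myscr sed filter over a sample message.
cat > /tmp/raw_msg <<'EOF'
Newsgroups: alt.test
NNTP-Posting-Host: somehost.example.com
Subject: filter test

body text
EOF
# The same command myscr uses: drop the NNTP-Posting-Host header.
sed -e "/^NNTP-Posting-Host/d" /tmp/raw_msg > /tmp/FILTERED_MSG
```

The resulting /tmp/FILTERED_MSG keeps the rest of the article intact; only the offending header line is gone.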


CONCLUSION
==========

 The suck package is useful if you have a dialup connection to a shell
or SLIP account and want to reduce your phone bill and online time.
Rumor has it that some dudes use suck for other reasons, like
archiving, an extra newsfeed, or sending articles by mail.
 With all these little tools together you almost have the
functionality of a newsreader, but without the advantage of comfort,
and without the disadvantage of online time.

Hey, isn't that the UNIX philosophy? Keep it simple.
 OK, one last fortune cookie I always enjoy reading:
"Emacs is a nice operating system, but I prefer UNIX."

