Hi, everyone,
Some time ago I posted a question about Solaris programs that could
retrieve some WWW pages so that I could browse them offline. The most
interesting solution I have found is webcopy, a Perl script that can
retrieve pages along with the pages they link to. I have downloaded it
but haven't had time to use it yet. Its home page can be found at
http://www.inf.utfsm.cl. Thanks to Martin Li <Martin.Li@unsw.edu.au> and
Jochen Bern <bern@TI.Uni-Trier.DE>.
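For the curious, the idea behind such a tool can be sketched in a few
lines of Perl. The following is only an illustration of the "fetch a
page and the pages it links to" approach, not webcopy itself; it assumes
the libwww-perl modules (LWP::Simple, HTML::LinkExtor) are installed,
and the starting URL is made up:

  #!/usr/bin/perl
  # Rough sketch of the "fetch a page plus the pages it links to" idea;
  # not webcopy itself.  The starting URL below is hypothetical.
  use strict;
  use LWP::Simple;
  use HTML::LinkExtor;

  my $start = 'http://www.foo.edu/docs/';

  my $html = get($start);
  defined $html or die "can't fetch $start\n";

  # Collect every <a href=...> URL on the page; passing $start as the
  # base makes HTML::LinkExtor hand back relative links as absolute.
  my @links;
  my $parser = HTML::LinkExtor->new(
      sub {
          my ($tag, %attr) = @_;
          push @links, $attr{href} if $tag eq 'a' and defined $attr{href};
      },
      $start
  );
  $parser->parse($html);

  # Save the starting page and every page it links to under this tree.
  getstore($start, 'index.html');
  my $n = 0;
  foreach my $url (@links) {
      next unless $url =~ /^\Q$start\E/;     # stay under .../docs/
      getstore($url, sprintf('page%03d.html', $n++));
  }
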
Detlev Habicht <habicht@ims.uni-hannover.de> suggested looking at
VisualWeb (http://www.web-factory.de/visualweb.html), but there is no
Unix version of it yet.
Several people suggested writing a customized Perl script to do this.
Raju Krishnamurthy <raju@ecologic.net> suggested writing such a script
using the libwww-perl library. Mark `Hex' Hershberger
<mah@eecs.tulane.edu> suggested using the LWP library and gave an
interesting example:
> You could write a perl script using the lwp library. For example, I like
> to read comics online, but it is a real waste of time if I have to go to
> every site and wait for them to download. So, I've written a script that
> downloads the comics every day (in a cron job) and creates a web page with
> them all on it for me to read. I can let you have a copy if you want. It
> will give you an idea of what you want to do.
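As a rough illustration of that approach (this is not Mark's actual
script), a cron-driven fetcher built on LWP::Simple might look like
this; every URL and file name below is invented:

  #!/usr/bin/perl
  # Rough sketch of the cron-job idea quoted above; every URL and path
  # here is invented for illustration.
  use strict;
  use LWP::Simple;

  # Hypothetical list of comic images to collect each morning.
  my @comics = (
      'http://www.example.com/strips/today1.gif',
      'http://www.example.com/strips/today2.gif',
  );

  my $dir = "$ENV{HOME}/comics";             # assumed to exist already
  open(OUT, "> $dir/index.html") or die "can't write index.html: $!\n";
  print OUT "<html><body>\n";

  my $n = 0;
  foreach my $url (@comics) {
      my $file = "strip$n.gif";
      $n++;
      # mirror() only re-downloads when the remote copy has changed.
      mirror($url, "$dir/$file");
      print OUT qq(<img src="$file"><br>\n);
  }

  print OUT "</body></html>\n";
  close(OUT);

  # A crontab entry could then run this every morning, e.g.:
  #   0 6 * * * /home/you/bin/getcomics.pl
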
My original question was:
> Hi, everyone,
>
> I'm looking for a program which can retrieve several WWW pages (e.g.
> all pages under http://www.foo.edu/docs/), so that I can browse them
> offline and/or even print them. I've heard about programs like that for
> Windows, but none for Solaris (or Unix in general). Does anyone know of
> a program like this?
Many thanks to all who took the time to respond!
Regards,
Fernando Frota Redigolo <fernando@larc.usp.br>
---------------------------------------------------------------------------
Systems Administrator
Laboratorio de Arquitetura e Redes de Computadores LARC - EPUSP