[Air-l] advice requested regarding web archiving programs

jeremy hunsinger jhuns at vt.edu
Sat Mar 9 21:32:23 PST 2002


Similar tools have been around for ages, but I guess it depends to a
great extent on your platform.
Another possibility for Windows is HTTrack; I used that for some time,
and it's pretty simple and saves a complete local copy.  To people on
campus I usually suggest Adobe Acrobat 4.0+'s web capture facility,
which does similar things to what Ulla describes below.  It works on
Windows and Apple products.

For unix-like systems (Apple OS X, Linux, BSD, etc.) there is a wide
variety of tools that can do just about anything one can imagine, from
the simplest tools like wget and mirror.pl that just make copies, to
systems that store the material in databases and can do nearly anything
one desires.  To start, I'd suggest wget: the wget -m command will
mirror the precise url you give it to whatever directory you want.
Combined with a shell script and cron, you can take periodic snapshots
of the website at times when the site is less likely to have traffic,
and keep them for a historical comparison or similarly minded study.
You could do just about anything in the end, though.
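As a rough illustration of that kind of periodic snapshot, here's a
minimal sketch in python -- the url, archive directory, and file layout
are just placeholders, and it assumes wget is already installed on the
system:

    import datetime
    import subprocess

    # placeholder site and archive location -- substitute your own
    SITE_URL = "http://www.example.com/"
    ARCHIVE_ROOT = "/home/me/site-snapshots"

    def snapshot():
        # date-stamped directory so each run is kept as a separate snapshot
        stamp = datetime.date.today().isoformat()
        target_dir = ARCHIVE_ROOT + "/" + stamp
        # wget -m mirrors the site recursively; -P sets the download directory
        subprocess.run(["wget", "-m", "-P", target_dir, SITE_URL], check=True)

    if __name__ == "__main__":
        snapshot()

Cron would then call the script at a quiet hour, say once a night, and
over time you accumulate dated copies of the site to compare.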



On Saturday, March 9, 2002, at 08:32 PM, Bunz, Ulla K wrote:

> Nicky,
> For my dissertation research I used WebCopier, and was quite pleased with it.
> You can download it for free at www.maximumsoft.com, and you get a trial
> period of a month or so. After that, some of the functions cease working. I
> never bought it, just downloaded what I needed within that time. But even if
> you have to buy it, it's very cheap. If I remember correctly it costs less
> than $50. Someone else in our department bought it based on my
> recommendation and is happy with it.
jeremy hunsinger
jhuns at vt.edu
on the ibook
www.cddc.vt.edu
www.cddc.vt.edu/jeremy
www.dromocracy.com




