[Air-l] archiving websites for later analysis

Miasma miasma at earthlink.net
Thu Sep 26 15:59:00 PDT 2002


>Message: 2
>Date: Thu, 26 Sep 2002 14:44:13 +0200
>From: Frank Schaap <architext at fragment.nl>
>To: air-l at aoir.org
>Subject: [Air-l] archiving websites for later analysis
>Reply-To: air-l at aoir.org
>
>I know this has been discussed here before, but I can't seem to find that
>discussion anymore.
>
>I'm analysing a relatively limited number of homepages and I'm looking for a
>way to archive them to be able to later go back to the state I found them in
>for analysis purposes.
>
>Using the "save as" function of, for instance, IE isn't sufficient. Since
>personal homepages usually aren't all that big, I want to archive the whole
>site, including underlying pages.
>
>I have found WebCopier <http://www.maximumsoft.com/> and that seems to work
>quite okay, but it still has some issues, for instance with iframes. It also
>changes the directory structure of the site, and sometimes it's important to
>see how someone structures their site.
>
>So, in other words, does anyone have any other recommendations? Departmental
>and IT policies mean that I'm looking for something that I can run locally on
>my own machine...
>
>TIA
>
>Frank.
>--
>Fragments Blog:         http://fragment.nl/
>Cyberculture Resources: http://fragment.nl/resources/


Apologies if someone already gave this solution, as I'm on digest.

Internet Explorer 5 and above offers a number of ways to save sites: a 
plain HTML save for purists who want all the markup and code, and a Web 
Archive feature which will save a site and all its links down to 
whatever link depth you specify.

It will also preserve framesets, I believe. One of the drawbacks of 
the Acrobat method is that the PDF archive, although stable, won't 
preserve framesets, or at least not in the last version I had.

Chris
-- 

Books just wanna be FREE! See what I mean at:
http://bookcrossing.com/friend/Miasma
