[Air-l] archiving websites for later analysis

Jillana Enteen jillana at rcnchicago.com
Thu Sep 26 06:27:20 PDT 2002


Frank,

I use Adobe Acrobat. If you have the full writing version, not the free Reader,
you can save web pages exactly as they are, including their directories, and
you can decide exactly how many links deep to go. EndNote also provides this
feature, with detailed cataloguing available so you can recall the pages later.
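For what it's worth, a command-line option that runs locally and keeps the site's directory structure intact is GNU wget's mirroring mode. A minimal sketch (the URL and the depth of 3 are placeholders to adjust per site):

```shell
# Mirror a site for offline analysis, preserving its directory layout on disk.
# --mirror          : recursive retrieval with timestamping
# --level=3         : cap recursion at 3 links deep (placeholder value)
# --page-requisites : also fetch the images/CSS needed to render each page
# --convert-links   : rewrite links so the saved copy can be browsed offline
# --no-parent       : never ascend above the starting directory
wget --mirror --level=3 --page-requisites --convert-links --no-parent \
     http://www.example.com/~homepage/
```

Because wget writes files into a hostname/path hierarchy on disk, the way the author structured their site stays visible in the local copy.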

Jillana Enteen
jillana at rcnchicago.com
http://www.rcnchicago.com/~jillana
,¸¸,ø¤º°`°º¤ø,¸,¸¸,ø¤º°`°º¤ø,¸

----- Original Message -----
From: "Frank Schaap" <architext at fragment.nl>
To: <air-l at aoir.org>
Sent: Thursday, September 26, 2002 7:44 AM
Subject: [Air-l] archiving websites for later analysis


> I know this has been discussed here before, but I can't seem to find that
> discussion anymore.
>
> I'm analysing a relatively limited number of homepages and I'm looking for
> a way to archive them, to be able to later go back to the state I found
> them in for analysis purposes.
>
> Using the "save as" function of, for instance, IE isn't sufficient. Since
> personal homepages usually aren't all that big, I want to archive the
> whole site, including underlying pages.
>
> I have found WebCopier <http://www.maximumsoft.com/> and that seems to
> work quite okay, but it still has some issues, for instance with iframes.
> It also converts the directory structure of the site, and sometimes it's
> important to see how someone structures their site.
>
> So, in other words, does anyone have any other recommendations? Dept. and
> IT policies mean that I'm looking for something I can run locally on my
> own machine...
>
> TIA
>
> Frank.
> --
> Fragments Blog:         http://fragment.nl/
> Cyberculture Resources: http://fragment.nl/resources/
>
> _______________________________________________
> Air-l mailing list
> Air-l at aoir.org
> http://www.aoir.org/mailman/listinfo/air-l
>
