[Air-l] Software to capture content

Jennifer Stromer-Galley jstromer at albany.edu
Wed Mar 1 09:16:48 PST 2006


In the past I've used Teleport Pro, and it's a good archiving tool. You can set it
up to grab single pages, or to grab a "top" page and the links off of it to a
specified depth (for example, NYTimes.com/index.html plus all the links off that
page, plus all the links off those pages, i.e., three levels deep). It grabs a
single page, or one level deep of a site, very quickly, but it struggles with
really big websites. It also doesn't always grab images (that depends on how the
website's file system is set up), though it usually grabs the HTML formatting
fairly well.

Here's a link: http://www.tenmax.com/teleport/pro/home.htm.

The best part is that Teleport Pro is very cheap: $40.00 or so.
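
If you'd rather script the capture yourself, the same kind of depth-limited crawl
is easy to rough out in Python. This is just a sketch, not what Teleport Pro does
internally; the start URL, depth, and output directory are placeholders, and it
uses the third-party requests and BeautifulSoup libraries:

import os
import urllib.parse
import requests
from bs4 import BeautifulSoup

def crawl(url, depth=3, out_dir="archive", seen=None):
    """Save a page, then follow same-site links up to `depth` levels deep."""
    if seen is None:
        seen = set()
    if depth < 0 or url in seen:
        return
    seen.add(url)
    try:
        resp = requests.get(url, timeout=30)
    except requests.RequestException:
        return
    # Save the raw HTML under a filename derived from the URL.
    os.makedirs(out_dir, exist_ok=True)
    fname = urllib.parse.quote(url, safe="") + ".html"
    with open(os.path.join(out_dir, fname), "w", encoding="utf-8") as f:
        f.write(resp.text)
    # Follow links on this page, one level shallower.
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urllib.parse.urljoin(url, a["href"])
        if urllib.parse.urlparse(link).netloc == urllib.parse.urlparse(url).netloc:
            crawl(link, depth - 1, out_dir, seen)

# Example: grab a front page and everything three levels deep on the same site.
crawl("http://www.nytimes.com/index.html", depth=3)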

Good luck,
~Jenny


> Hi everyone,
> I was wondering if any of you know about software to capture website content –
> specifically, to capture online news outlets (CNN, The Washington Post, The
> New York Times) as well as blog-type news.
> We are about to engage in research involving content coding of these sites and
> were wondering if anybody has information on costs (any free out there?), ease
> of use, effectiveness in capturing content, time needed to capture content at
> a point in time, time needed to capture 24-hour content, and any other
> pertinent information that you may want to share.
> Thanks in advance to ya all! Eulàlia Puig Abril



-- 
Assistant Professor
Department of Communication, SS 340
University at Albany, SUNY
1400 Washington Ave.
Albany, NY 12222
518-442-4873
jstromer at albany.edu
http://www.albany.edu/~jstromer



