[Air-l] Software to capture content

Charlie Balch charlie at balch.org
Wed Mar 1 08:06:00 PST 2006


For capturing site content, look into RSS, which takes some programming
skill to work with, and into site-ripper applications such as
http://www.httrack.com (free). Most news sites have RSS feeds. I don't know
how you plan to code your information, but coding programs like Atlas.ti do
have free trial versions.
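A rough sketch of the RSS route, using only Python's standard library (just
an illustration, not a finished tool; the feed address below is a
placeholder, so substitute the outlet's actual RSS URL):

import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.com/rss.xml"  # placeholder; use the outlet's real feed URL

# Fetch the raw RSS XML.
with urllib.request.urlopen(FEED_URL) as response:
    xml_data = response.read()

# Standard RSS 2.0 lists each story as an <item> element under <channel>.
root = ET.fromstring(xml_data)
for item in root.iter("item"):
    title = item.findtext("title", default="")
    link = item.findtext("link", default="")
    pub_date = item.findtext("pubDate", default="")
    print(pub_date, "|", title, "|", link)

From there you could fetch each linked story and save the HTML for coding,
or hand the whole site to a ripper like HTTrack instead.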
Charlie

-----Original Message-----
From: air-l-bounces at listserv.aoir.org
[mailto:air-l-bounces at listserv.aoir.org] On Behalf Of Eulalia Puig Abril
Sent: Wednesday, March 01, 2006 9:00 AM
To: air-l at listserv.aoir.org
Subject: [Air-l] Software to capture content

Hi everyone,
I was wondering if any of you know of software to capture website content
– specifically, to capture online news outlets (CNN, The Washington Post,
The New York Times) as well as blog-type news sites.
We are about to begin a research project that involves content coding these
sites, and we were wondering whether anybody has information on costs (is
anything free out there?), ease of use, effectiveness in capturing content,
time needed to capture content at a single point in time, time needed to
capture 24 hours of content, and any other pertinent information you may
want to share.
Thanks in advance to ya all! Eulàlia Puig Abril





