[Air-l] Software to capture content

Vidar Falkenberg vidar at readability.dk
Thu Mar 2 00:58:56 PST 2006


I would like to recommend a piece of software called WebSite-Watcher from  
www.aignes.com  - I've used it to monitor and archive changes from online  
newspapers. The archiving is done with Local Website Archive from the same  
company.
There is a free trial, and the price is reasonable.

I've found it easy to use and adjust to my needs, and the developer is  
really helpful and active on the site forum.
In my case, checking and archiving content for a list of 1000+ bookmarks took  
just under a minute.

HTTrack, mentioned earlier, is also effective, especially for archiving  
complete websites.
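For HTTrack specifically, a minimal command-line invocation might look like the
sketch below (example.com, the output directory, and the depth limit are
placeholders, not something from the original post; exact behavior is worth
checking against `httrack --help`):

```shell
# Hypothetical sketch: mirror a news site into a local archive with HTTrack.
# example.com and ./news-archive are placeholders.
httrack "https://www.example.com/" \
    -O ./news-archive \          # output directory for the mirror
    "+*.example.com/*" \         # filter: stay within the site's own domain
    -r3                          # limit recursion to 3 link levels
```

Re-running HTTrack against the same output directory refreshes the mirror,
which is one way to approximate the periodic monitoring that WebSite-Watcher
does automatically.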


Good luck,

Vidar Falkenberg

On 01.03.2006 at 16:00:25, Eulalia Puig Abril <epabril at wisc.edu> wrote:

> Hi everyone,
> I was wondering if any of you know about software to capture website  
> content – specifically, to capture online news outlets (CNN, The  
> Washington Post, The New York Times…) as well as blog-type news sites.
> We are about to engage in a research project involving content coding of  
> these sites and were wondering if anybody has information on costs (any  
> free options out there?), ease of use, effectiveness in capturing content,  
> time needed to capture content at a point in time, time needed to capture  
> 24-hour content, and any other pertinent information that you may want  
> to share.
> Thanks in advance to ya all! Eulàlia Puig Abril
> _______________________________________________
> The air-l at listserv.aoir.org mailing list
> is provided by the Association of Internet Researchers http://aoir.org
> Subscribe, change options or unsubscribe at:  
> http://listserv.aoir.org/listinfo.cgi/air-l-aoir.org
>
> Join the Association of Internet Researchers:
> http://www.aoir.org/
