[Air-l] SW to store webpages

Thomas Koenig T.Koenig at lboro.ac.uk
Sun Jun 5 13:54:14 PDT 2005


elijah wright wrote:

>
>> Wget is another free possibility, but why make things difficult, when 
>> a free Windows based program such as HTTrack exists? No commands to 
>> learn! No tedious installation routines!
>
>
> because someone might want the additional flexibility to make their 
> research better?
>
> isn't that the real goal, here?

No, not really. As is often the case, there is a trade-off between 
flexibility (read: complexity) and parsimony (read: ease of use). 
Maximum complexity is not always the best solution. At the moment, I 
cannot see how the alleged greater flexibility of wget would improve 
research. If I want to capture an entire website, then HTTrack seems to 
do the job. It seems to do it even more completely than wget (the 
sources are in non-English languages only, I'm afraid!):

http://linuxfr.org/~blackshack/2889.html
http://lists.bxlug.be/pipermail/linux-bruxelles/2002-September/005497.html
http://groups-beta.google.com/group/de.comp.lang.php.misc/msg/76f85bf9a2bef551
(http://tinyurl.com/bqwqp)

It's thus faster than wget and, unlike wget, HTTrack can retrieve some 
broken links and, more importantly, it captures some dynamic URLs 
(though not all of them).
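For the record, HTTrack is not GUI-only: it also ships a command-line build (httrack). A minimal sketch of a whole-site capture, with a placeholder example.org URL and an output directory I made up; the command is printed rather than executed here, so nothing is actually downloaded:

```shell
# Placeholder target URL (not a real research site).
URL="http://www.example.org/"

# Build the HTTrack invocation as a string for inspection.
# -O sets the output directory, the "+..." filter keeps the crawl
# inside the site, and -v turns on verbose output.
CMD="httrack \"$URL\" -O ./mirror \"+*.example.org/*\" -v"

echo "$CMD"
# To actually run the capture, uncomment:
# eval "$CMD"
```

The point of printing first is mundane but useful for research: you can log the exact invocation alongside the captured data, so the capture is reproducible later.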

Some even claim that HTTrack is more powerful (flexible?) than wget:

http://lists.gulp.linux.it/pipermail/gulp/2004-May/002747.html

If you want to capture an entire website, HTTrack thus seems both better 
suited and easier to operate. What are some examples of the greater 
flexibility of wget?
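For concreteness, here is the sort of fine-grained control people usually have in mind when they cite wget's flexibility. A hedged sketch only: the URL is a placeholder and the politeness settings are invented, though every flag is standard GNU wget. Again the command is printed rather than run:

```shell
# Placeholder target URL.
URL="http://www.example.org/"

# --mirror          : recursive retrieval with timestamping (the basic capture)
# --convert-links   : rewrite links so the copy is browsable offline
# --html-extension  : save server-generated pages (foo.php?id=1) with an
#                     .html suffix for local viewing
# --wait/--random-wait : pause between requests, to be polite to the server
# --reject          : skip file types irrelevant to the research question
CMD="wget --mirror --convert-links --html-extension \
  --wait=1 --random-wait --reject='*.zip,*.iso' \
  --user-agent='research-crawler' $URL"

echo "$CMD"
# To actually run the capture, uncomment:
# eval "$CMD"
```

Whether any of this improves the resulting data over a GUI capture is, of course, exactly the question under discussion.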

BTW: There are GUIs for wget:

http://kmago.sourceforge.net/index.htm
http://www.jensroesner.de/wgetgui/

And finally, yet another reportedly good, albeit commercial, application is:

http://www.tenmax.com/teleport/pro/home.htm

I haven't tried it yet, so I must rely on reviews here, which say that 
this is the best option.

Thomas

-- 
thomas koenig, ph.d.
http://www.lboro.ac.uk/research/mmethods/staff/thomas/
