[Air-l] Re: Hyperlink analysis tools

Stephanie Hendrick stephanie.hendrick at engelska.umu.se
Mon Dec 6 02:26:19 PST 2004

I've put all my link analysis software on the web, including a web crawler
and a link analyser. It integrates into Pajek for network diagrams and
produces loads of statistics about counts of links between sites. The
software is free for academic research and anyone is welcome to use it.

http://socscibot.wlv.ac.uk
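
If you haven't used Pajek before, the .net files it reads are plain text.
Here is a minimal sketch, not SocSciBot's actual code, of writing a
directed, weighted link graph (counts of links between sites) in that
format; the site names and counts are invented for illustration:

# A minimal sketch, not SocSciBot's actual export code: write counts of
# links between sites as a Pajek .net file. Site names and counts are
# invented for illustration.

link_counts = {
    ("siteA.example", "siteB.example"): 5,  # 5 links from A to B
    ("siteB.example", "siteC.example"): 1,
}

sites = sorted({site for pair in link_counts for site in pair})
index = {site: i + 1 for i, site in enumerate(sites)}  # Pajek numbers vertices from 1

with open("links.net", "w") as f:
    f.write(f"*Vertices {len(sites)}\n")
    for site in sites:
        f.write(f'{index[site]} "{site}"\n')
    f.write("*Arcs\n")
    for (src, dst), count in link_counts.items():
        f.write(f"{index[src]} {index[dst]} {count}\n")  # arc weight = link count

Pajek can then open links.net directly and draw the network diagram.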

A related link analysis web site is at:

http://linkanalysis.wlv.ac.uk

And a preprint of what I hope will be a useful overview paper is now also
available online:

Thelwall, M. (2005, to appear). Interpreting social science link analysis
research: A theoretical framework. Journal of the American Society for
Information Science and Technology.

http://www.scit.wlv.ac.uk/%7Ecm1993/papers/Interpreting_SSLAR.pdf

Mike Thelwall

Mike,

I am very curious about your crawling software. I am a researcher at Umeå
University studying discourse structure in weblog networks, and I recently
finished a paper on weblog networks with Lilia Efimova
(https://doc.telin.nl/dscgi/ds.py/Get/File-46041). While writing this paper,
we quickly discovered that the URL crawler we were using would not be
adequate for a larger sample, especially as it only crawls the first page of
each site. To compile the sample we wanted, we had to manually collect the
archive URLs into a text file and run them through the crawler. Is it
possible to define which parts of a site you want crawled in your program?
Do you see possibilities for using it on a blog?
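
To make the question concrete, here is a rough sketch of the kind of
restriction I mean (my own illustration, not based on your program): a
crawler that stays on one blog and only follows links into its archive
pages. The /archive/ pattern is just a guess at a typical blog layout.

# A rough sketch of URL-pattern-restricted crawling, not SocSciBot's
# interface. It stays on the starting host and only follows links whose
# path matches ALLOWED; the /archive/ pattern is a hypothetical example.
import re
import urllib.request
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser

ALLOWED = re.compile(r"/archives?/")  # hypothetical archive-page pattern

class LinkParser(HTMLParser):
    """Collect the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, max_pages=50):
    seen, queue = set(), [start]
    host = urlparse(start).netloc
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to download
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            parsed = urlparse(absolute)
            # stay on the same host and only descend into allowed sections
            if parsed.netloc == host and ALLOWED.search(parsed.path):
                queue.append(absolute)
    return seen

Something like crawl("http://example-blog.net/") would then visit only the
archive pages, which is what we ended up doing by hand.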
