[Air-L] Tweet Secondary Media Uptake

Astvansh, Vivek astvansh at iu.edu
Sun Aug 5 08:13:23 PDT 2018


1. I GUESS you can search for the tweet URL if you have the news media articles in a markup-language format (XML, HTML, etc.); a rough sketch of that is at the end of this message. However, if you use Factiva or LexisNexis Academic, you will download text files (TXT, DOCX, PDF), not markup-language files.


2. Although they differ in MINOR ways, I GUESS Factiva and LexisNexis Academic are substitutes. They are also far more acceptable sources of media data than a general web search.


3. I like your use of the label "secondary media." You seem to consider tweets by users as primary media and the appearance of those tweets in other media (as evidence for a narrative) as secondary media.


I hope others respond so that we go beyond my GUESSES.
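To make point 1 concrete, here is a rough, untested Python sketch of what searching markup-format articles for an embedded tweet could look like. It assumes you have already saved one HTML file per article locally; the function names and directory layout are my own placeholders, not part of any existing tool.

    # Rough sketch (my assumptions: one saved .html file per article; a tweet is
    # identified by its status URL, e.g. https://twitter.com/user/status/12345).
    import re
    from pathlib import Path

    # Matches links to a specific tweet and captures the numeric tweet ID.
    TWEET_URL = re.compile(r"https?://(?:www\.)?twitter\.com/\w+/status/(\d+)")

    def tweet_ids_in_article(html_path):
        """Return the set of tweet IDs linked or embedded in one article's HTML."""
        html = html_path.read_text(encoding="utf-8", errors="ignore")
        return set(TWEET_URL.findall(html))

    def articles_embedding(tweet_id, article_dir):
        """List the saved articles that embed or link to the given tweet."""
        return [p for p in Path(article_dir).glob("*.html")
                if tweet_id in tweet_ids_in_article(p)]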

________________________________
From: LIAM MONNINGER <lmonninger at ucla.edu>
Sent: Sunday, August 5, 2018 10:56 AM
To: Astvansh, Vivek
Subject: Re: [Air-L] Tweet Secondary Media Uptake

Thanks for the quick response Vivek!

Yeah, that's essentially what I'd like to do. I was thinking there might be a way to trace where a tweet is embedded via its URL. But your method should certainly work for what I'm trying to accomplish!

A couple of questions...

1) In your experience, is there a major difference between the results returned by Factiva vs. those returned by LexisNexis?

2) Methodologically, can I rely on the number of search results as a figure for my secondary media audience? I.e., are the databases' own search engines viable for this kind of research?

Thanks,
Liam Monninger

On Sun, Aug 5, 2018 at 7:28 AM, Astvansh, Vivek <astvansh at iu.edu<mailto:astvansh at iu.edu>> wrote:
Thank you for asking, Liam. Let me paraphrase my understanding of what you want. Given a topic (a hashtag, or just a word or phrase), you'd first collect all the tweets on that topic in a given date range. For each tweet in your sample, you next want to obtain data on news media articles that refer to the focal tweet. Did I understand you correctly AND completely? If not, please correct me.

If yes: you can obtain Twitter data from vendors such as Crimson Hexagon or GNIP. Next, you can write a program that searches Factiva or LexisNexis Academic for each tweet by text, username, etc.
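A very rough sketch of that last step, assuming you have exported the Factiva or LexisNexis results as plain-text files and have your tweets in a CSV with tweet_id, username, and text columns (those file and column names are just my placeholders, not Factiva or LexisNexis conventions):

    # Hypothetical sketch: count, per tweet, how many exported articles quote the
    # tweet's text or mention its author's handle.
    import csv
    from pathlib import Path

    def snippet(text, length=60):
        """A distinctive, quote-length fragment of the tweet to look for."""
        return " ".join(text.split())[:length].lower()

    def count_uptake(tweets_csv, export_dir):
        articles = [p.read_text(encoding="utf-8", errors="ignore").lower()
                    for p in Path(export_dir).glob("*.txt")]
        counts = {}
        with open(tweets_csv, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                frag = snippet(row["text"])
                handle = "@" + row["username"].lower()
                counts[row["tweet_id"]] = sum(1 for a in articles
                                              if frag in a or handle in a)
        return counts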
________________________________________
From: Air-L <air-l-bounces at listserv.aoir.org<mailto:air-l-bounces at listserv.aoir.org>> on behalf of LIAM MONNINGER <lmonninger at ucla.edu<mailto:lmonninger at ucla.edu>>
Sent: Sunday, August 5, 2018 10:24 AM
To: air-l at listserv.aoir.org<mailto:air-l at listserv.aoir.org>
Subject: [Air-L] Tweet Secondary Media Uptake

Hi everyone,

My name is Liam Monninger. I'm an undergraduate researcher at UCLA, working
on a project that seeks to understand tweets as commitment devices in
international politics.

To better understand the audience of a given tweet, I would like to be able
to see and tabulate the secondary media in which said tweet is embedded. I was
thinking, at the very least, I could just make a list of major publications
worldwide and manually check articles to see if a certain tweet is
embedded. But it would be nice to have a more sophisticated method.

Any ideas? Is there perhaps a useful API?

Thanks,
Liam Monninger
_______________________________________________
The Air-L at listserv.aoir.org<mailto:Air-L at listserv.aoir.org> mailing list
is provided by the Association of Internet Researchers http://aoir.org
Subscribe, change options or unsubscribe at: http://listserv.aoir.org/listinfo.cgi/air-l-aoir.org

Join the Association of Internet Researchers:
http://www.aoir.org/