[Air-L] ICYMI: Publication of two new reports from Data & Society's Media Manipulation initiative + livestream discussion happening today

Monica Bulger monicabulger at gmail.com
Wed Feb 28 08:26:56 PST 2018


Dear colleagues,

This past year has been a sometimes overwhelming whirl of news events and commentary about the origins of fake news and ways to counter it. ICYMI, I’m delighted to share two reports that the Data & Society Media Manipulation <https://datasociety.net/research/media-manipulation/> initiative recently published, and to remind you that “Real Talk about Fake News <https://datasociety.net/events/databite-no-107-real-talk-about-fake-news-nabiha-syed-in-conversation-with-joan-donovan/>” with Nabiha Syed in conversation with Claire Wardle and Joan Donovan is livestreaming today <https://datasociety.net/events/databite-no-107-real-talk-about-fake-news-nabiha-syed-in-conversation-with-joan-donovan/> at 4 p.m. Eastern.

The Promises, Challenges, and Futures of Media Literacy <https://datasociety.net/output/the-promises-challenges-and-futures-of-media-literacy/>, which I co-authored with Patrick Davison, aims to make sense of the complexity of media literacy as a proposed solution to fake news. We examine the research base for media literacy across four themes: what media literacy is, how media literacy can help, how media literacy can fail, and the future of media literacy.

Our key findings are:
(1) Media literacy places responsibility for discerning the quality and accuracy of information on the individual. This works in a world that assumes either good actors or that quality information can be identified through careful evaluation. In an environment where the mechanisms for serving information are unclear and traditional markers of quality and trust have been circumvented, individual skills are insufficient.

(2) Media literacy is not a panacea: it cannot be considered a standalone solution to fake news and misinformation, but rather one frame within a complex media and information environment. It must be considered in combination with strategies for addressing how state-sponsored disinformation efforts and tech platforms influence the information we see and how we interact with it.

Dead Reckoning: Navigating Content Moderation After “Fake News” <https://datasociety.net/output/dead-reckoning/>, authored by Robyn Caplan, Lauren Hanson, and Joan Donovan, clarifies current uses of the term “fake news” and analyzes four specific strategies of intervention.

Key findings:
(1) Moderating “fake news” well requires understanding the context of both the article and its source. Current automated technologies and artificial intelligence (AI) are not advanced enough to address this issue, which requires human-led interventions.
(2) Third-party fact-checking and media literacy organizations are expected to close the gap between platforms and the public interest, but they are currently under-resourced to meet this challenge.

Please feel free to share widely!
Monica
----
Monica Bulger, Ph.D.
Research Affiliate <http://www.datasociety.net/>, Data & Society Research Institute
36 West 20th Street
New York, NY 10011
@literacyonline