[Air-L] New Social Media and Democracy Research Facebook Group - AOIR

kalev leetaru kalev.leetaru5 at gmail.com
Wed Apr 18 06:58:16 PDT 2018


 Charles, the issue is not whether there are good players working on the
initiative; the issue is that, for all the talk about ethics, for all the
initiatives and grants and discussions, ethics remains a second-class
citizen that is always "TBD" after all the exciting "what can we do with
all this data" discussion has happened. For all the luminaries and
prestigious organizations and funders involved, and all the talk about what
will be possible under the new initiative, why is it that absolutely every
single last question about ethics is TBD?

Replication, publication review, funding, process - all of that has at
least basic detail sketched out in the new initiative. Everything that has
to do with how researchers get their hands on all this amazing data has at
least very basic outlines around it. Ethics? That's all TBD, to be figured
out later.

That's the real problem here - it's that for all the great talk about
ethics, where is any of it in this new initiative? Most importantly, it's
not enough to blame Facebook in this case. Where was the academic
leadership from those crafting the new initiative, from their institutions,
and from SSRC to lay out at the very outset a set of basic principles,
codified into the new initiative, showing that it at least acknowledges and
takes these issues seriously? Putting into writing right up front that
certain kinds of activities are too ethically fraught to permit? Or, if the
belief is that all ethics are situational and even the most fraught
questions should be weighed against societal benefit, that belief should be
codified, along with guiding principles. If the latter, then why didn't the
initiative include a set of basic "ethical principles" that don't constrain
what can be done, but at the very least set out the ethical vision of the
initiative, reassure the public whose data is being used, and codify in
writing from the outset the behaviors that will require the greatest
attention (much as IRB guidance lists specific areas of concern without
absolutely prohibiting them)?

For all the copious discussion of replication in the initiative's
whitepaper, which came from academia, not Facebook, there is no discussion
of the ethical implications of preserving all that data, or of the conflict
between replication and users' right to delete their content. The words
"consent" and "permission" appear exactly once in the entire academic
whitepaper, and then only to note that Facebook's permission is required.
All of the discussion is about the tremendous value of this treasure trove
of data to academics, with precious little mention of the implications for
privacy and the right of individuals to control access to their
information. For all the academic criticism of Facebook performing research
on its users without their knowledge or informed consent (for example, its
emotions study), now we have academia lining up to perform its own research
without the knowledge or informed consent of Facebook's users.

That's the problem here - for all academia's talk, all the working groups
and conferences and grants and initiatives and condemnations, at the end of
the day little has changed, and in 2018 we have a new data initiative that
was not built by Facebook and released to the world with zero academic
input, but rather quite the opposite. This was built by academia - by, as
you note, first-rate luminaries, with top organizations like SSRC and major
funders all involved - and yet absolutely every single question regarding
ethics, user protections, informed consent, user control over their PII,
etc., is TBD. Not even a basic vision of ethical principles is presented.

We can't keep blaming this on Facebook. Here is an initiative that came out
of academia, involving the key luminaries of the field, and yet the
entirety of its ethical underpinnings is "TBD": focus on outcomes first,
then worry about ethics down the road. Academia had an opportunity in this
initiative to build an ethics-first data access program that demonstrates
to companies like Facebook what it looks like when ethical considerations
drive the effort, rather than "wow, what can we do with all of this data" -
and it didn't. For all the time and effort spent on the replication
workflow, where was the corresponding effort on ethics? Even just an
announcement in the whitepaper that a special ethical advisory board,
consisting of X, Y, and Z organizations and ethical luminaries A, B, and C,
had already been launched and was drafting the initial ethical guidelines
for the initiative - a set of basic written "ethical principles" codified
into the initiative's charter - would have gone a long way towards at least
signaling that ethics was viewed as important to the initiative.

As I wrote last year summarizing the data ethics landscape (
https://www.forbes.com/sites/kalevleetaru/2017/10/16/is-it-too-late-for-big-data-ethics/),
for all the talk about ethics, I'm not sure academia really views ethics as
anything more than something to tack onto the end of a project. After all,
large funders like Gates have committed to open access but have said they
are uninterested in open ethics. NSF doesn't make even basic non-sensitive
IRB information available upon request, and nearly every institution I've
asked has refused to make even the most basic details available about their
large data-driven projects, including simply confirming whether a project
underwent IRB review at all. Here was an opportunity to change that, and
instead we have an initiative that entrenches that ethics-last approach.

And Sonia, I absolutely hear you regarding the issue of children's data
being scooped up in this. In fact, as I note in my piece, even if
protections were put in place to try to filter out minors' accounts
(including those of minors who falsify their age), the fact that people's
private photographs, Messenger chats, and posts will likely contain copious
PII about their children - including biometrics, medical issues,
psychological development, etc. - means children will factor heavily into
the datasets being analyzed, without additional "TBD" precautions.


Kalev






On Wed, Apr 18, 2018 at 3:46 AM, Livingstone,S <S.Livingstone at lse.ac.uk>
wrote:

> Some of the data being harvested by researchers through the techniques
> below are from children. I still hope to hear explicit recognition of and
> attention to this fact in discussions of researcher ethics. Best, Sonia
>
> > On 18 Apr 2018, at 06:32, Charles M. Ess <c.m.ess at media.uio.no> wrote:
> >
> > In a word, yes.
> >
> > To be sure, there are wonderful potentials and promising opportunities
> here.  And the academic players involved, from my perspective, are clearly
> first rate.
> > But given the platform's rather dreadful record on privacy matters (to
> put it charitably) and the comparatively weaker privacy protections and
> culture of the U.S. more broadly, it is more than reasonable to voice these
> concerns at the very outset.
> >
> > Thanks for an excellent list, kalev.  I very much hope that those
> involved in this new initiative will be able to take these - and related -
> concerns on board from the outset.
> >
> > best,
> > - charles ess
> > Co-chair, AoIR Ethics Working Group
> >
> >> On 17/04/2018 18:00, kalev leetaru wrote:
> >> With respect to the Facebook/SSRC initiative, it will be very
> interesting
> >> to see how the ethical landscape of such unprecedented researcher
> access to
> >> Facebook plays out. In my interview with SSRC about the new effort,
> Alondra
> >> noted that SSRC has not ruled out academic researchers being permitted
> to
> >> access private posts, photographs, videos, Messenger chats and other
> >> private communications and that the question of how to robustly
> anonymize
> >> such content for the initiative (given that even blurring faces in a
> >> photograph cannot prevent the identities of the depicted individuals
> from
> >> being reconstructed in all cases, video anonymization is even more
> >> difficult and even anonymized text chats can still reveal considerable
> PII)
> >> is still unknown.
> >> As the first studies emerge from the initiative that make heavy use of
> >> private content that Facebook's two billion users thought was private, it
> >> is unclear whether public reaction will yield any changes to this
> program
> >> or lead to its discontinuation in its present form.
> >> There is also the question of how replication will be managed on a
> platform
> >> as fluid as Facebook, especially given that activists may engage in
> >> systematic deletions when papers are published that expose and document
> >> certain of their communicative patterns. Either replication must accept
> >> that key content may be systematically deleted in some cases (especially
> >> for controversial high profile studies that are the kind that might
> provoke
> >> such a response) or Facebook will have to permanently archive user data,
> >> removing the right of users to delete or control access to their
> content,
> >> which creates an uncertain legal landscape.
> >> SSRC also did not rule out permitting researchers to actively manipulate
> >> production interfaces and algorithms for actual users during an active
> >> election (ie, an algorithmic change that would test whether favoring
> >> certain content had an impact on voter behavior), nor did it rule out
> >> permitting researchers in one country to perform such active
> >> experimentation during another country's elections.
> >> With the first contested election where papers are published in a
> >> foreign country documenting active modification of Facebook during the
> >> election and claiming that some of those changes altered voter behavior,
> >> there will be an increased likelihood of governmental intervention in
> >> this initiative, especially in an era when candidates and campaigns
> >> increasingly blame Facebook for unexpected voter outcomes.
> >> There is also the issue that while proposals will be directly funded by
> a
> >> small set of funding agencies, it is extremely likely that research labs
> >> with substantial DOD funding (whether DARPA, IARPA, NRL, ARL, or foreign
> >> counterparts globally) will submit proposals stemming from that work.
> For
> >> example, DOD-funded academic work in the social sciences has routinely
> >> cited private Facebook Messenger and post data as being critical to
> >> increasing the accuracy of their models. One can easily imagine labs
> >> participating in IARPA or DARPA programs submitting proposals to the
> SSRC
> >> initiative that extend that IARPA/DARPA work to private Facebook user
> data.
> >> SSRC said that no restrictions on this are presently codified into the
> >> initiative.
> >> So, while I know this initiative has received a lot of fawning press and
> >> academic reaction, it's important that we not lose sight of the ethical
> >> components of the initiative, especially given that nearly the entirety
> of
> >> the ethical underpinnings are, in the words of SSRC, "TBD":
> >> https://www.forbes.com/sites/kalevleetaru/2018/04/12/is-facebooks-new-academic-initiative-even-more-frightening-than-its-own-research/
> >> Kalev
> >>> On Tue, Apr 17, 2018 at 11:09 AM, Steven Clift <slc at publicus.net>
> wrote:
> >>> In recent weeks, you've probably heard <http://po.st/chronphilsocmedia
> >
> >>> about Facebook's connection <http://po.st/zucksocmediademo> to an
> >>> independent
> >>> foundation funded <http://po.st/socdemknight> (multiple foundations
> >>> <http://po.st/socdemhewlett>) effort to research the impact
> >>> <http://po.st/socdemssrc> of social media on elections and democracy.
> >>> That's great. There is even a related conference this week
> >>> <http://po.st/socdemconference> with an amazing list of academic
> speakers.
> >>>
> >>> I am interested in opening up digital channels of communication to help
> >>> researchers hear from practitioners to help generate more actionable
> >>> research related to social media and democracy.
> >>>
> >>> Whether you are a democracy/civic engagement practitioner, a political
> >>> campaign operative, or an activist seeking to influence people or your
> >>> government, what is happening *now* with social media and democracy
> that
> >>> needs solid research? If you care about useful research, this group is
> for
> >>> you.
> >>>
> >>> Perhaps you are within government, having used social media to help win
> >>> an election, or you are in media looking for trends or digital options
> >>> to boost journalism's role in the future of democracy - then this *off
> >>> the record* group on social media and democracy research is for you too.
> >>>
> >>> (If you know practitioners active in the digital
> democracy/politics/media
> >>> space who also care about the big picture as well as "winning" with the
> >>> latest tactic or tool, please pass this invite along.)
> >>>
> >>> Researchers invited!
> >>>
> >>> This is about creating an effective digital feedback loop. I am
> interested
> >>> in how researchers can more effectively engage and access digital
> >>> practitioners across all the major sectors of democracy - government,
> >>> media, campaigns and elections, advocacy and legislative bodies, and
> more.
> >>> Since Facebook is the main target of a wave of research funding, let's
> >>> build a digital bridge between research and practice that people
> actually
> >>> use on that platform. So, if you do research in this space, please
> apply to
> >>> join us. Once we reach 100 charter members, we will switch to a
> >>> member-referral process for joining (which will be crucial to ensure
> >>> participation versus free riding).
> >>>
> >>> So, if you are interested in joining, you *must* answer the join
> request
> >>> survey questions before your application will be approved:
> >>> http://facebook.com/groups/socialmediademocracy
> >>>
> >>> Thanks,
> >>> Steven Clift
> >>>
> >>> P.S. If you just prefer light reading and not participating actively,
> >>> everyone is welcome to join my 7500+ member Civic Technology and Open
> >>> Government Facebook Group. It is a very active group with daily posts:
> >>> http://facebook.com/groups/opengovgroup
> >>>
> >>>
> >>> Steven Clift  -  Executive Director, E-Democracy.org
> >>>    clift at e-democracy.org  -  +1 612 234 7072
> >>>    http://twitter.com/democracy
> >>>
> >>> Join in: http://facebook.com/groups/opengovgroup
> >>> Digital engagement for your org via E-Democracy:
> >>>    http://po.st/engageclift
> >>> _______________________________________________
> >>> The Air-L at listserv.aoir.org mailing list
> >>> is provided by the Association of Internet Researchers http://aoir.org
> >>> Subscribe, change options or unsubscribe at: http://listserv.aoir.org/listinfo.cgi/air-l-aoir.org
> >>>
> >>> Join the Association of Internet Researchers:
> >>> http://www.aoir.org/
> >>>
> >
> > --
> > Professor in Media Studies
> > Department of Media and Communication
> > University of Oslo
> > <http://www.hf.uio.no/imk/english/people/aca/charlees/index.html>
> >
> > Postboks 1093
> > Blindern 0317
> > Oslo, Norway
> > c.m.ess at media.uio.no
>


