[Air-L] FB statement on banning NYU researchers for scraping

K Eckert stine.eckert at wayne.edu
Wed Aug 11 07:40:17 PDT 2021


This is the link, I think

https://www.nytimes.com/2021/08/10/opinion/facebook-misinformation.html
Opinion | Facebook Shuts Down Researchers Looking Into Misinformation - The New York Times



Stine Eckert, Ph.D. (she/her)

Associate Professor

Department of Communication

Wayne State University

Detroit, MI 48201


@stineeckert

http://stineeckert.com/

https://s.wayne.edu/nsf-advance/


Reflections on Feminist Communication and Media Scholarship: Theory, Method, Impact. <https://www.routledge.com/Reflections-on-Feminist-Communication-and-Media-Scholarship-Theory-Method/Eckert-Bachmann/p/book/9780367609832>

Stine Eckert and Ingrid Bachmann (eds.), Routledge



________________________________
From: Air-L <air-l-bounces at listserv.aoir.org> on behalf of Joly MacFie <joly at punkcast.com>
Sent: Wednesday, August 11, 2021 2:08 AM
To: aoir list <air-l at aoir.org>
Subject: Re: [Air-L] FB statement on banning NYU researchers for scraping


And, here is the response, via NYT.

https://www.nytimes.com/2021/08/10/theater/international-puppet-fringe-festival-nyc.html

By Laura Edelson and Damon McCoy

We learned last week that Facebook had disabled our Facebook accounts and
our access to data that we have been using to study how misinformation
spreads on the company’s platform.

We were informed of this in an automated email. In a statement, Facebook
says we used “unauthorized means to access and collect data” and that it
shut us out to comply with an order from the Federal Trade Commission to
respect the privacy of its users.

This is deeply misleading. We collect identifying information only about
Facebook’s advertisers. We believe that Facebook is using privacy as a
pretext to squelch research that it considers inconvenient. Notably, the
acting director of the F.T.C.’s consumer protection bureau told Facebook
last week that the “insinuation” that the agency’s order required the
disabling of our accounts was “inaccurate.”

“The F.T.C. is committed to protecting the privacy of people, and efforts
to shield targeted advertising practices from scrutiny run counter to that
mission,” the acting director, Samuel Levine, wrote to Mark Zuckerberg,
Facebook’s founder and chief executive.

Our team at N.Y.U.’s Center for Cybersecurity has been studying Facebook’s
platform for three years. Last year, we deployed a browser extension we
developed called Ad Observer that allows users to voluntarily share
information with us about ads that Facebook shows them. It is this tool
that has raised the ire of Facebook and that it pointed to when it disabled
our accounts.

In the course of our overall research, we’ve been able to demonstrate that
extreme, unreliable news sources get more “engagement” — that is, user
interaction — on Facebook, at the expense of accurate posts and reporting.
What’s more, our work shows that the archive of political ads that Facebook
makes available to researchers is missing more than 100,000 ads.

There is still a lot of important research we want to do. When Facebook
shut down our accounts, we had just begun studies intended to determine
whether the platform is contributing to vaccine hesitancy and sowing
distrust in elections. We were also trying to figure out what role the
platform may have played leading up to the Capitol assault on Jan. 6.

We are privacy and cybersecurity researchers whose careers are built on
protecting users. That’s why we’ve been so careful to make sure that our Ad
Observer tool collects only limited and anonymous information from the
users who agreed to participate in our research. And it is also why we made
the tool’s source code public so that Facebook and others can verify that
it does what we say it does.

We strongly believe we are not violating Facebook’s terms of service, as
the company contends. But even if we had been, Facebook could have
authorized our research. As Facebook declared in announcing the disabling
of our accounts, “We’ll continue to provide ways for responsible
researchers to conduct studies that are in the public interest while
protecting the security of our platform and the privacy of people who use
it.”

Our research is responsible and in the public interest. We’ve protected the
privacy of our volunteers. Essentially, our ad tool collects the ads our
volunteers see on their Facebook accounts, plus information provided by
Facebook about when and why they were shown the ads and who paid for them.
These ads are seen by the specific audience the advertiser targets.
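
To make the shape of that data concrete, here is a minimal, hypothetical TypeScript sketch of the kind of anonymized record a browser extension like Ad Observer could collect and submit for each ad a consenting volunteer sees. The field names and the endpoint URL are illustrative assumptions only, not the actual Ad Observer schema or code; the tool's real source code is public, as noted above, and is the place to check what it actually does.

// Hypothetical sketch, not the actual Ad Observer implementation.
// Each record describes one ad a consenting volunteer saw; it contains
// no fields that identify the volunteer.
interface ObservedAd {
  adText: string;             // visible ad content
  paidForBy: string | null;   // "Paid for by" disclosure, if present
  targetingInfo: string[];    // lines from "Why am I seeing this ad?"
  observedAt: string;         // ISO timestamp of when the ad was shown
}

// Submit one record to a research collection endpoint.
// The URL below is a placeholder, not a real endpoint.
async function reportAd(ad: ObservedAd): Promise<void> {
  await fetch("https://example-research-collector.org/api/ads", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(ad),
  });
}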

This tool provides a way to see what entities are trying to influence the
public, and how they’re doing it. We think that’s important to democracy.
Yet Facebook has denied us important access to continue to do much of our
work.

One of the odd things about this dispute is that while Facebook has barred
us from research tools available to users and other academic researchers,
it has not blocked our Ad Observer browser extension, either by technical or legal
means. It is still operational, and we are still collecting data from
volunteers.

Still, by shutting us off from its own research tools, Facebook is making
our work harder. This is unfortunate. Facebook isn’t protecting privacy.
It’s not even protecting its advertisers. It’s protecting itself from
scrutiny and accountability.

The company suggests the Ad Observer is unnecessary, that researchers can
study its platform with tools the company provides. But the data Facebook
makes available is woefully inadequate, as the gaps we’ve found in its
political ad archive prove. If we were to rely on Facebook, we simply could
not study the spread of misinformation on topics ranging from elections to
the Capitol riot to Covid-19 vaccines.

By blocking us from its platform, Facebook sent us a message: It wants to
stop us from examining how it operates.

We have a message for Facebook: The public deserves more transparency about
the systems the company uses to sell the public’s attention to advertisers
and the algorithms it employs to promote content. We will keep working to
ensure the public gets that transparency.

-----------------------
Laura Edelson is a Ph.D. candidate in computer science at New York
University’s Tandon School of Engineering, where Damon McCoy is an
associate professor of computer science and engineering. They are
affiliated with the nonpartisan research group Cybersecurity for Democracy.

On Wed, Aug 4, 2021 at 7:36 PM Joly MacFie <joly at punkcast.com> wrote:

> [if anyone already mentioned this, I missed it]
>
>
> https://about.fb.com/news/2021/08/research-cannot-be-the-justification-for-compromising-peoples-privacy/
>
> Research Cannot Be the Justification for Compromising People’s Privacy
> August 3, 2021
> By Mike Clark, Product Management Director
>
> For months, we’ve attempted to work with New York University to provide
> three of their researchers the precise access they’ve asked for in a
> privacy-protected way. Today, we disabled the accounts, apps, Pages and
> platform access associated with NYU’s Ad Observatory Project and its
> operators after our repeated attempts to bring their research into
> compliance with our Terms. NYU’s Ad Observatory project studied political
> ads using unauthorized means to access and collect data from Facebook, in
> violation of our Terms of Service. We took these actions to stop
> unauthorized scraping and protect people’s privacy in line with our privacy
> program under the FTC Order.
>
> The researchers gathered data by creating a browser extension that was
> programmed to evade our detection systems and scrape data such as
> usernames, ads, links to user profiles and “Why am I seeing this ad?”
> information, some of which is not publicly-viewable on Facebook. The
> extension also collected data about Facebook users who did not install it
> or consent to the collection. The researchers had previously archived this
> information in a now offline, publicly-available database.
>
> We offer researchers <https://research.fb.com/> a number of privacy-protective
> methods
> <https://about.fb.com/news/2021/01/increasing-transparency-around-us-2020-elections-ads/> to
> collect and analyze data. We welcome research that holds us accountable,
> and doesn’t compromise the security of our platform or the privacy of the
> people who use it. That’s why we created tools like the Ad Library and
> launched initiatives like Data for Good <https://dataforgood.fb.com/> and Facebook
> Open Research & Transparency (FORT <https://fort.fb.com/>) — to provide
> privacy-protected APIs and data sets for the academic community.
>
> We told the researchers a year ago, in summer of 2020, that their Ad
> Observatory extension would violate our Terms even before they launched the
> tool. In October, we sent them a formal letter notifying them of the
> violation of our Terms of Service and granted them 45 days to comply with
> our request to stop scraping data from our website. The deadline ended on
> November 30, long after Election Day. We continued to engage with the
> researchers on addressing our privacy concerns and offered them ways to
> obtain data that did not violate our Terms.
>
> Earlier this year, we invited researchers, including the ones from NYU, to
> safely access US 2020 Elections ad targeting data
> <https://research.fb.com/blog/2021/02/introducing-new-election-related-ad-data-sets-for-researchers/> through
> FORT’s Researcher Platform. This offered the Ad Observatory researchers a
> more comprehensive data set than the one they created by scraping data on
> Facebook. The researchers had the opportunity to use the data set, which is
> designed to be privacy-protective, instead of relying on scraping, but they
> declined.
>
> We made it clear in a series of posts
> <https://about.fb.com/news/2021/04/how-we-combat-scraping/> earlier this
> year that we take unauthorized data scraping seriously, and when we find
> instances of scraping we investigate and take action to protect our
> platform. While the Ad Observatory project may be well-intentioned, the
> ongoing and continued violations of protections against scraping cannot be
> ignored and should be remediated.
>
> Collecting data via scraping is an industry-wide problem that jeopardizes
> people’s privacy, and we’ve been clear about our public position
> <https://about.fb.com/news/2021/04/how-we-combat-scraping/> on this as
> recently as April. The researchers knowingly violated our Terms against
> scraping — which we went to great lengths to explain to them over the past
> year. Today’s action doesn’t change our commitment to providing more
> transparency
> <https://about.fb.com/news/2021/01/increasing-transparency-around-us-2020-elections-ads/> around
> ads on Facebook or our ongoing collaborations with academia. We’ll continue
> to provide ways for responsible researchers to conduct studies that are in
> the public interest while protecting the security of our platform and the
> privacy of people who use it.
>
> --
> --------------------------------------
> Joly MacFie  +12185659365
> --------------------------------------
> -
>


--
--------------------------------------
Joly MacFie  +12185659365
--------------------------------------
-
_______________________________________________
The Air-L at listserv.aoir.org mailing list
is provided by the Association of Internet Researchers http://aoir.org
Subscribe, change options or unsubscribe at: http://listserv.aoir.org/listinfo.cgi/air-l-aoir.org

Join the Association of Internet Researchers:
http://www.aoir.org/


