[Air-L] Bloomberg on Facebook Research Project

Joly MacFie joly at punkcast.com
Thu Jun 14 23:03:43 PDT 2018


https://www.bloomberg.com/news/articles/2018-06-14/if-you-re-a-facebook-user-you-re-also-a-research-subject

If You’re A Facebook User, You’re Also a Research Subject

The social network is careful about academic collaborations, but chooses
projects that comport with its business goals.

By Karen Weise and Sarah Frier
June 14, 2018

The professor was incredulous. David Craig had been studying the rise of
entertainment on social media for several years when a Facebook Inc.
employee he didn’t know emailed him last December, asking about his
research. “I thought I was being pumped,” Craig said. The company flew him
to Menlo Park and offered him $25,000 to fund his ongoing projects, with no
obligation to do anything in return. This was definitely not normal, but
after checking with his school, the University of Southern California,
Craig took the gift. “Hell, yes, it was generous to get an out-of-the-blue
offer
to support our work, with no strings,” he said. “It’s not all so black and
white that they are villains.”

Other academics got these gifts, too. One, who said she had $25,000
deposited in her research account recently without signing a single
document, spoke to a reporter hoping the journalist could help
explain it. Another professor said one of his former students got an
unsolicited monetary offer from Facebook, and he had to assure the
recipient it wasn’t a scam. The professor surmised that Facebook uses the
gifts as a low-cost way to build connections that could lead to closer
collaboration later. He also thinks Facebook “happily lives in the
ambiguity” of the unusual arrangement. If researchers truly understood that
the funding has no strings, “people would feel less obligated to interact
with them,” he said.

The free gifts are just one of the little-known and complicated ways
Facebook works with academic researchers. For scholars, Facebook’s 2.2
billion users offer an irresistible way to investigate how human nature
may play out on, and be shaped by, the social network. For
Facebook, the motivations to work with outside academics are far thornier,
and it’s Facebook that decides who gets access to its data to examine its
impact on society.

“Just from a business standpoint, people won’t want to be on Facebook if
Facebook is not positive for them in their lives,” said Rob Sherman,
Facebook’s deputy chief privacy officer. “We also have a broader
responsibility to make sure that we’re having the right impact on society.”

The company’s long been conflicted about how to work with social
scientists, and now runs several programs, each reflecting the contorted
relationship Facebook has with external scrutiny. The collaborations have
become even more complicated in the aftermath of the Cambridge Analytica
scandal, which was set off by revelations that a professor who once
collaborated with Facebook’s in-house researchers used data collected
separately to influence elections.

“Historically the focus of our research has been on product development, on
doing things that help us understand how people are using Facebook and
build improvements to Facebook,” Sherman said. Facebook’s heard more from
academics and nonprofits recently who say “because of the expertise that we
have, and the data that Facebook stores, we have an opportunity to
contribute to generalizable knowledge and to answer some of these broader
social questions,” he said. “So you’ve seen us begin to invest more heavily
in social science research and in answering some of these questions.”


Facebook has a corporate culture that reveres research. The company builds
its product based on internal data on user behavior, surveys and focus
groups. More than a hundred Ph.D.-level researchers work on Facebook’s
in-house core data science team, and employees say the information that
points to growth has had more of an impact on the company’s direction than
Chief Executive Officer Mark Zuckerberg’s ideas.

Facebook is far more hesitant to work with outsiders; it risks unflattering
findings, leaks of proprietary information, and privacy breaches. But
Facebook likes it when external research proves that Facebook is great. And
in the fierce talent wars of Silicon Valley, working with professors can
make it easier to recruit their students.

It can also improve the bottom line. In 2016, when Facebook changed the
“like” button into a set of emojis that better captured user expression—and
feelings for advertisers—it did so with the help of Dacher Keltner, a
psychology professor at the University of California, Berkeley, who’s an
expert in compassion and emotions. Keltner’s Greater Good Science Center
continues to work closely with the company. And this January, Facebook made
research the centerpiece of a major change to its news feed algorithm. In
studies published with academics at several universities, Facebook found
that people who used social media actively—commenting on friends’ posts,
setting up events—were likely to see a positive impact on mental health,
while those who used it passively might feel depressed. In response,
Facebook declared it would spend more time encouraging “meaningful
interaction.” Of
course, the more people engage with Facebook, the more data it collects for
advertisers.

The company has stopped short of pursuing deeper research on the
potentially negative fallout of its power. According to its public database of
published research, Facebook’s written more than 180 public papers about
artificial intelligence but just one study about elections, based on an
experiment Facebook ran on 61 million users to mobilize voters in the 2010
congressional midterms. Facebook’s Sherman said, “We’ve
certainly been doing a lot of work over the past couple of months,
particularly to expand the areas where we’re looking.”

Facebook’s first peer-reviewed papers with outside scholars were published
in 2009, and almost a decade into producing academic work, it still wavers
over how to structure the arrangements. It has given out smaller
unrestricted gifts, but those don’t come with access to Facebook’s data, at
least initially. The company is more restrictive about who can
mine or survey its users. It looks for research projects that dovetail with
its business goals.

Some academics cycle through one-year fellowships while pursuing doctoral
degrees, and others get paid for consulting projects, which never get
published.

When Facebook does provide data to researchers, it retains the right to
veto or edit the paper before publication. None of the professors Bloomberg
spoke with knew of cases in which Facebook prohibited a publication, though
many said the arrangement inevitably leads academics to propose
investigations less likely to be challenged. “Researchers focus on things
that don’t create a moral hazard,” said Dean Eckles, a former Facebook data
scientist now at the MIT Sloan School of Management. Without a guaranteed
right to publish, Eckles said, researchers inevitably shy away from
potentially critical work. That means some of the most pressing societal
questions may go unexamined.

Facebook also almost always pairs outsiders with in-house researchers. This
ensures scholars have a partner who’s intimately familiar with Facebook’s
vast data, but some who’ve worked with Facebook say this also creates a
selection bias about what gets studied. “Stuff still comes out, but only
the immensely positive, happy stories—the goody-goody research that they
could show off,” said one social scientist who worked as a researcher at
Facebook. For example, he pointed out that the company’s published widely
on issues related to well-being, or what makes people feel good and
fulfilled, which is positive for Facebook’s public image and product. “The
question is: ‘What’s not coming out?’” he said.

Facebook argues its body of work on well-being does have broad importance.
“Because we are a social product that has large distribution within
society, it is both about societal issues as well as the product,” said
David Ginsberg, Facebook’s director of research.

Other social networks have smaller research ambitions, but have tried more
open approaches. This spring, Twitter Inc. asked for proposals to measure
the health of conversations on its platform, and Microsoft Corp.’s LinkedIn
is running a multi-year program to have researchers use its data to
understand how to improve the economic opportunities of workers. Facebook
has issued public calls for technical research but, until the past few
months, hadn’t done so for the social sciences. Yet it has solicited work in
that area, albeit quietly: last summer, one scholarly association asked for
discretion when sharing information about a Facebook pilot project to study
tech’s impact in developing economies. Its email read, “Facebook is not
widely publicizing the program.”

In 2014, the prestigious Proceedings of the National Academy of Sciences
published a massive study, co-authored by two Facebook researchers and an
outside academic, that found emotions were “contagious” online, with people
who saw sad posts more likely to make sad posts themselves. The catch: the
results came from an experiment run on 689,003 Facebook users, in which
researchers secretly tweaked Facebook’s news feed algorithm to show some
users cheerier content than others. People were angry, protesting that they
didn’t give Facebook permission to manipulate their emotions.

The company first said users had consented to such studies by agreeing to
its terms of service, then eventually apologized. While the academic journal
didn’t retract the paper, it issued an “Editorial Expression of Concern.”

To get federal research funding, universities must run testing on human
subjects through what’s known as an institutional review board (IRB), a
panel that includes at least one outside expert, approves the ethics of the
study, and ensures subjects provide informed consent. Companies don’t have
to run research through IRBs. The emotional-contagion study fell through the
cracks.

The outcry profoundly changed Facebook’s research operations, creating a
review process that was more formal and cautious. It set up a pseudo-IRB of
its own, which doesn’t include an outside expert but does have policy and
PR staff. Facebook also created a new public database of its published
research, which lists more than 470 papers. But that database now has a
notable omission—a December 2015 paper two Facebook employees co-wrote with
Aleksandr Kogan, the professor at the heart of the Cambridge Analytica
scandal. Facebook said it believes the study was inadvertently never posted
and is working to ensure other papers aren’t left off in the future.



In March, Gary King, a Harvard University political science professor, met
with some Facebook executives about trying to get the company to share more
data with academics. It wasn’t the first time he’d made his case, but he
left the meeting with no commitment.

A few days later, the Cambridge Analytica scandal broke, and soon Facebook
was on the phone with King. Maybe it was time to cooperate, at least to
understand what happens in elections. Since then, King and a Stanford
University law professor have developed a complicated new structure to give
more researchers access to Facebook’s data on the elections and let
scholars publish whatever they find. The resulting structure is baroque,
involving a new “commission” of scholars Facebook will help pick, an
outside academic council that will award research projects, and seven
independent U.S. foundations to fund the work. “Negotiating this was kind
of like the Arab-Israel peace treaty, but with a lot more partners,” King
said.

The new effort, which has yet to propose its first research project, is the
most open approach Facebook’s taken yet. “We hope that will be a model that
replicates not just within Facebook but across the industry,” Facebook’s
Ginsberg said. “It’s a way to make data available for social science
research in a way that means that it’s both independent and maintains
privacy.”

But the new approach will also face an uphill battle to prove its
credibility. The research project came together under the company’s public
relations and policy team, not its research group of Ph.D.s
trained in ethics and research design. More than 200 scholars from the
Association of Internet Researchers, a global group of interdisciplinary
academics, have signed a letter saying the effort is too limited in the
questions it’s asking and that it risks replicating what sociologists
call the “Matthew effect,” where only scholars from elite universities—like
Harvard and Stanford—get an inside track.

“Facebook’s new initiative is set up in such a way that it will select
projects that address known problems in an area known to be problematic,”
the academics wrote. The research effort, the letter said, also won’t let
the world—or Facebook, for that matter—get ahead of the next big problem.



--
---------------------------------------------------------------
Joly MacFie  218 565 9365 Skype:punkcast
---------------------------------------------------------------


