[Air-L] Digital Methods Summer School 2018 -- Call for Participation

Fernando van der Vlist fernando.vandervlist at gmail.com
Mon Mar 19 09:34:24 PDT 2018


Dear all,

The Digital Methods Initiative (DMI) will host its 12th annual Digital
Methods Summer School from July 2-13, 2018 at the University of Amsterdam,
the Netherlands. Below please find the call for participation.

This year’s theme is: "Retraining the machine: Addressing algorithmic
bias". The deadline for application is May 4, 2018. More information is
available at https://bit.ly/dmi18-ss-call or email to
summerschool at digitalmethods.net .

Best regards,

Fernando van der Vlist
Research Associate, Collaborative Research Centre "Media of Cooperation",
University of Siegen
Research Associate, Digital Methods Initiative, University of Amsterdam
Lecturer, New Media and Digital Culture, University of Amsterdam

--

# CALL FOR PARTICIPATION
# DIGITAL METHODS SUMMER SCHOOL 2018
# JULY 2-13, 2018
# UNIVERSITY OF AMSTERDAM

# RETRAINING THE MACHINE
# ADDRESSING ALGORITHMIC BIAS

--

## DIGITAL METHODS SUMMER SCHOOL

This year's Digital Methods Summer School is dedicated to approaches to
studying so-called machine bias. Discussions have focused on how to hold
algorithms accountable for discrimination in the results they output, as in
the notorious cases of query results for 'professional hair' (white women's
hairdos) and 'unprofessional hair' (black women's hairdos). Recently, it was
found that search engine image results for 'pregnancy' and 'unwanted
pregnancy' are similarly divided, with the pregnancy queries returning
white-skinned women (mainly bellies, privileging the baby over the woman),
while 'unwanted pregnancy' returns women of diverse ethnicities. These are
new variations on classic, and still urgent, search engine critiques (once
known as 'googlearchies'), which questioned
the hierarchies built into rankings, asking who is being authorised by the
engine to provide the information. That work moves forward at the Summer
School, building on examinations of the volatility of engine results, as in
the Issue Dramaturg project, which put on display the drama of websites
rising and falling in their rankings after algorithmic updates meant to
fight spam but having unintended epistemological consequences. More
recently, Facebook newsfeeds have been the object of critique for their
privileging and burying mechanisms, however much they -- like the engine
returns preceding them -- are not easily captured and documented. Saving
engine results has been against the terms of service; making derivative
works out of engine results also breaks the user contract. Saving, or
recording, social media (newsfeed) rolls seems even less practicable given
how feeds are even more personalised, presumably resisting generalisable
findings. User surveys pointing out unexpected newsfeed results have led to
calls for 'algorithmic auditing', a precursor to machine bias critique. As
reported in the technical press, querying social media ad interfaces shows
highly segmented audiences (including racist ones, such as publics to
target for 'Jew haters', among other keyword audiences available for sale).

These ad interface results could be repurposed to show which population
segments (as defined by the platforms) are driving the content choices
reflected in the results served. How large are these discriminatory
segments? Capturing, auditing, or repurposing results are diagnostic
practices, identifying under which circumstances machines could or ought to
be retrained. The larger question, however, concerns how to retrain the
machine. One approach lies in query design -- fashioning queries so as to
're-bias' the results. Others concern corpus development: in stock
photography, for example, efforts have been made to reimagine ('re-image')
women, as in the well-known case of Getty Images' 'Lean In Collection',
however much the images are often used out of context, as has been found.
Yet another approach concerns training and maturing research accounts to
trigger controlled algorithmic responses.

The Digital Methods Summer School is interested not only in contributing to
interpretations of celebrated cases of algorithmic or machine bias, but also
in providing diagnostic, query-related, research-account, and
corpus-building research practices that seek to address the matter more
conceptually.

Expanding the case study collection is also of interest; age discrimination
in Facebook ad interfaces (an American theme) is a recent, telling case of
in-built rather than organic machine bias, but the international landscape
may contribute more to bias detection, as is the aim of the Summer School.
On Twitter, feminist bots strive to keep the #metoo space serious now that
the spam has arrived. Which other practices of remaining on topic may be
found, and how may their successes and complications be characterised?
There is also the question of the
ramifications of conceptual contributions to re-biasing for big data
science. Which practical contributions could be made to big data critique?

## APPLICATIONS: KEY DATES

To apply for the Digital Methods Summer School 2018, please use the
University of Amsterdam Summer School form. If that form is not working,
please send (i) a one-page letter explaining how digital methods training
would benefit your current work, (ii) a CV (with full postal
address), (iii) a copy of your passport (details page only), (iv) a
headshot photo, and (v) a 100-word bio (to be included in the Summer School
welcome package). Mark your application 'DMI Training Certificate Program,'
and send to summerschool at digitalmethods.net.

* 4 May: Deadline for applications.
* 7 May: Notifications. Accepted participants will receive a welcome
package in mid-June, which includes a reader, a day-to-day schedule, and a
face book of all participants.
* 18 June: Deadline for Summer School fee payments. Participants must send
proof of payment by this date.

The Summer School costs EUR 895 and is open to PhD candidates and motivated
scholars, as well as to research master's students and advanced master's
students. Data journalists, artists, and research professionals
are also welcome to apply. Accepted applicants will be informed of the bank
transfer details upon notice of acceptance to the Summer School on 7 May.
Note: University of Amsterdam students are exempt from tuition and should
state on the application form (under tuition fee remarks) that they wish to
apply for a fee waiver. Please also provide your student number.

Any questions may be addressed to the Summer School coordinators, Esther
Weltevrede and Fernando van der Vlist: summerschool at digitalmethods.net.
Informal queries may be sent to this email address as well.


