[Air-L] new materials for teaching/thinking about AI, Algorithms and Accountability

Christian Sandvig csandvig at umich.edu
Fri Dec 1 08:53:10 PST 2017


Dear AoIR colleagues,

Following up on my September emails, a number of recordings and resources
from the workshop are now online. Two of them had wide participation from
the Internet, so thank you to the Internet and the air-l mailing list for
that.

Some of the things available now:

The Compressed Auditing Algorithms Film Festival
 - a ~45-minute program of short films and excerpts investigating
depictions of algorithms in popular media and education (all videos online)

The Computer Says No: The Bad News About Online Discrimination in
Algorithmic Systems
 - a recording of a 90-minute interactive roundtable intended for a general
audience

How to Watch Them Watching You: Researching Social Media, Online Platforms,
and Algorithmic Systems From the Outside
 - a recording of a 90-minute interactive roundtable intended for Internet
researchers

Just Google Me*
 - an alternative business card that you can hand out instead of your own,
if you want to

Algorithms Dérive
 - an app-based, self-guided, collaboratively produced tour intended to
promote reflection about algorithms

The Top 10 Kinds of Fairness
 - A 20-minute talk promoting new research directions for the FAT
(Fairness, Accountability, Transparency) technical community. It argues for
an expansive definition of fairness that goes beyond “statistical” fairness
or “comparative” fairness, and for automated monitoring for fairness as a
research problem.

All are at...

http://auditingalgorithms.science/


...with more things on the way.

I hope this is helpful,
Christian




---------- Forwarded message ----------
From: Christian Sandvig <csandvig at umich.edu>
Date: Thu, Sep 28, 2017 at 10:41 AM
Subject: Reminder: Starts TODAY: Live-streams of THE COMPUTER SAYS NO + HOW
TO WATCH THEM WATCHING YOU (4pm EDT)
To: air-l at listserv.aoir.org



Dear AoIR colleagues,

A reminder that the first event (of two) starts today at 4 p.m. Eastern
Daylight Time (UTC/GMT -4 hours). Hope to "see" you "there" via the
question tool or on Twitter! Feel free to forward.

Best,
Christian


---------- Forwarded message ----------
From: Christian Sandvig <csandvig at umich.edu>
Date: Thu, Sep 21, 2017 at 6:38 PM
Subject: Mark Your Calendar: Live-streams of THE COMPUTER SAYS NO + HOW TO
WATCH THEM WATCHING YOU (9/28-9/29)
To: air-l at listserv.aoir.org



Hello AoIR colleagues,

We'll be hosting a workshop with some familiar AoIR people next week. I'm
writing to this list because we will be live-streaming two of our public
events on YouTube and I am wondering if you would like to "attend."  We
will be taking questions from the Internet via Twitter and the BKC question
tool. Even if you are not in the right time zone, I hope this will be of
interest. I dare Europe to stay awake for the first one, and Australians to
stay awake for the second one.

Live-Streamed Events:

THE COMPUTER SAYS NO:
The Bad News About Online Discrimination in Algorithmic Systems
Thursday, September 28, 2017
4:00-5:30 p.m. Eastern Daylight Time (UTC/GMT -4 hours)
http://auditingalgorithms.science/?p=53

HOW TO WATCH THEM WATCHING YOU:
Researching Social Media, Online Platforms, and Algorithmic Systems From
the Outside
Friday, September 29, 2017
10-11:30 a.m. Eastern Daylight Time (UTC/GMT -4 hours)
http://auditingalgorithms.science/?p=64

The overall occasion is a workshop entitled "Auditing Algorithms: Adding
Accountability to Automated Authority," explained here:
http://auditingalgorithms.science/  Its goal is a white paper about this
area of research, which I expect to email this list about again in the
future. We are also working on some weirder participatory online activities
-- if you like these topics, you'll want to jump in on them. More on that
after these events. For more context, here is a related quote:

"The equations of big-data algorithms have permeated almost every aspect of
our lives. A massive industry has grown up to comb and combine huge data
sets — documenting, for example, Internet habits — to generate profiles of
individuals. These often target advertising, but also inform decisions on
credit, insurance and more. They help to control the news or adverts we
see, and whether we get hired or fired. They can determine whether
surveillance and law-enforcement agencies flag us as likely activists or
dissidents — or potential security or criminal threats….Largely absent from
the widespread use of such algorithms are the rules and safeguards that
govern almost every other aspect of life in a democracy. There is an
asymmetry in algorithmic power and accountability…Fortunately, a strong
movement for greater algorithmic accountability is now under way.
Researchers hope to find ways to audit for bias….Society needs to discuss
in earnest how to rid software and machines of human bugs."

–Unsigned Editorial, Nature (2016)


Hope you'll be "there",
Christian
(on behalf of the co-organizers)


--
http://www-personal.umich.edu/~csandvig/


