[Air-L] Ethical Tech: Bias, Algorithms and Social Justice (event @King's College London)

Feldman, Zeena zeena.feldman at kcl.ac.uk
Tue Jan 21 02:28:35 PST 2020


Dear friends and colleagues,

For those in London on Wednesday, 5 February, please join us for a discussion on 'Ethical Tech: Bias, Algorithms and Social Justice' with Dr Shauna Concannon (CRASSH, University of Cambridge) and Dr Clara Crivellaro (Newcastle University). The event is free and you can register at https://www.eventbrite.co.uk/e/ethical-tech-bias-algorithms-and-social-justice-tickets-89999411663.

Where: King’s College London (The Strand, King’s Building, Room K2.31, Nash Lecture Theatre, London WC2R 2LS)
When: Wednesday, 5 February, 17:30 to 19:00 (followed by a wine reception)

Details:

Ethical Tech: Bias, Algorithms and Social Justice
with Dr Shauna Concannon and Dr Clara Crivellaro (chaired by Dr Zeena Feldman, KCL)

'Beginning with Bias: Unpacking the Complexities of Developing Ethical AI Systems'
Dr Shauna Concannon (CRASSH, University of Cambridge)
Language-based artificially intelligent systems are typically trained on linguistic data generated by people; consequently, they routinely reinforce existing social biases. The European Commission’s guidance on trustworthy AI instructs that socially constructed biases must be removed from the datasets used to train AI systems. However, the biases inherent in linguistic data are vast and varied, and producing a balanced and unbiased dataset may be practically impossible. Furthermore, debiasing as a concept is opaque, and what it means in practice for language-based systems is unclear. The goal of bias removal is to ensure systems operate fairly, but it also involves instigating a conscientious shift from how language is actually used and represented (in the relevant dataset) to how it should be. In this talk I will explore the ethical and societal implications of artificially intelligent language-related technologies. Drawing on critical and feminist theoretical perspectives, and on insights from practical efforts to mitigate bias in neural language systems, I will reflect on the complexities and considerations involved in developing artificially intelligent systems that are ethical and just.

'Exploring fairness and justice in digital innovation'
Dr Clara Crivellaro (Newcastle University)
In this talk, I will explore how a social justice perspective can help us engage with the structures and dominant world views that underlie much digital innovation and the replication of social issues. I will first position the work within current discourses on digital innovation, ethics and equity in order to outline critical lines of inquiry. Then, drawing on a range of projects and activities, I will discuss ways in which we can open up possibilities and proactively begin to create the conditions for technologies to support social innovation and social justice aspirations and goals.

Registration at https://www.eventbrite.co.uk/e/ethical-tech-bias-algorithms-and-social-justice-tickets-89999411663.


All best wishes,
Zeena


Dr Zeena Feldman
Lecturer in Digital Culture
King’s College London
Department of Digital Humanities
The Strand
Chesham Building, 0.03
London WC2R 2LS

zeena.feldman at kcl.ac.uk

https://www.kcl.ac.uk/people/dr-zeena-feldman

New book: Digital Food Cultures (with Deborah Lupton) – available for pre-order now https://www.routledge.com/Digital-Food-Cultures-1st-Edition/Lupton-Feldman/p/book/9781138392595

The Quitting Social Media project: https://quittingsocialmedia.wordpress.com

Art & the Politics of Visibility (IB Tauris/Bloomsbury, 2017): https://www.bloomsbury.com/uk/art-and-the-politics-of-visibility-9781780769066


