[Air-L] WEBINAR 13th October - Facial Recognition: the Big Picture

Justine Gangneux justinegangneux at gmail.com
Wed Sep 23 00:39:47 PDT 2020


Dear AoIR colleagues,

I hope that this email finds you well. See below information about an event
which will be of interest to some of you.

Kind regards,

Justine

----------------------------------------------------------------------------------------------------------
Tuesday, October 13, 2020, 5.30 pm (BST)

FACIAL RECOGNITION – THE BIG PICTURE
https://www.openrightsgroup.org/events/org-glasgow-presents-facial-recognition-the-big-picture/


Recently, activists and civil rights organisations such as Liberty
<https://www.libertyhumanrights.org.uk/wp-content/uploads/2020/02/LIBERTYS-BRIEFING-ON-FACIAL-RECOGNITION-November-2019-CURRENT.pdf>,
Open Rights Group
<https://bigbrotherwatch.org.uk/wp-content/uploads/2019/11/Big-Brother-Watch-and-Open-Rights-Group-Joint-Submission-to-the-Scottish-Justice-Sub-Committee-on-Policing-inquiry-into-Facial-Recognition-November-2019.pdf> and
WebRoots
<https://webrootsdemocracy.files.wordpress.com/2020/08/unmasking-facial-recognition-webroots-democracy.pdf>
have gathered and published evidence of the ways in which automated Facial
Recognition technologies risk infringing human rights. Following on from
their efforts, we have started to see real pushback against the trial
and use of these technologies by law enforcement in public spaces. For
example, their use by the police in Wales was ruled unlawful and deemed a
violation of human rights by the Court of Appeal of England and
Wales in August 2020 <https://www.bbc.co.uk/news/uk-wales-53734716>; Police
Scotland has abandoned its plans to deploy Facial Recognition
technologies because of the ways in which they discriminate based on gender and race
<https://digitalpublications.parliament.scot/Committees/Report/JSP/2020/2/11/Facial-recognition--how-policing-in-Scotland-makes-use-of-this-technology>;
US cities
<https://cities-today.com/should-cities-ban-facial-recognition-technology/>
such as Boston, San Francisco, and Portland have banned these technologies on
the same basis. These debates have become even more salient in the context
of the recent pro-democracy demonstrations in Hong Kong
<https://www.nytimes.com/2019/07/26/technology/hong-kong-protests-facial-recognition-surveillance.html>,
and the Black Lives Matter campaign for racial justice in the US
<https://www.wired.com/story/defending-black-lives-means-banning-facial-recognition/>.

It is therefore more important than ever to continue and broaden a societal
conversation on the implications of using Facial Recognition technologies,
the ways in which these technologies are deployed and regulated in
different contexts, the risks that they pose, and how they can reinforce
existing inequalities.

Join us online for a discussion which will tackle these questions from a
historical, legal, techno-social, and human rights perspectives.

SPEAKERS

Benedetta Catanzariti is a PhD candidate in Science, Technology and
Innovation Studies at the University of Edinburgh, researching the
relationship between surveillance, AI and society. Her academic background
is in philosophy and she is particularly interested in the way technology
shapes our identity and contributes to reinforcing or, alternatively,
dismantling social inequalities. She is currently looking at the design of
the classification techniques underpinning the development and use of
automated facial and affect recognition systems.

@techno_katt

Areeq Chowdhury is the founder and director of WebRoots Democracy
<https://webrootsdemocracy.org/>, a think tank advocating for progressive
and inclusive technology policy. He has worked at the Foreign
and Commonwealth Office; the Department for Digital, Culture, Media and
Sport; London City Hall; the UK Parliament; KPMG; and Future Advocacy. He
has also provided commentary on technology policy issues for a range of
media outlets including Al Jazeera, the BBC, and Sky News.

@AreeqChowdhury

Lachlan D. Urquhart is a Lecturer in Technology Law at the University of
Edinburgh. He is also a core member of the Centre for Data, Culture and
Society <https://www.cdcs.ed.ac.uk/> and Director of the eLL.M in
Information Technology Law. Lachlan is currently working on a major research
project
<https://www.law.ed.ac.uk/news-events/news/emotional-ai-research-team-awarded-esrc-co-funded-project>
entitled ‘Emotional AI in Cities: Cross Cultural Lessons from UK and Japan
on Designing for An Ethical Life’ which examines the socio-technical,
governance and cultural dimensions of affect sensing technologies in urban
life. His work sits at the intersection of computer science, information
technology law, and computer ethics, and focuses on the technical,
sociological, and interactional implications of living with interactive
computing.

@mooseabyte

REGISTER HERE: https://register.gotowebinar.com/register/532653320182875150


