[Air-L] CfP Content Moderation and Its Discontents

Jacob Johanssen johanssenjacob at gmail.com
Tue Mar 28 02:30:44 PDT 2023


*Content Moderation and Its Discontents*

*An International Conference, St. Mary’s University (London, UK), 23 June
2023*

*Call for Papers*



In recent years, content moderation on social media has become a contested
topic. As online extremism, harassment and hate speech have increased on
major platforms, community guidelines and AI-based detection and moderation
policies have been introduced to tighten regulation. Yet such measures often
fall short or are actively circumvented, as Elon Musk’s efforts to undo what
he regards as excessive restriction of “free speech” on Twitter demonstrate.
Some spaces on the internet, such as Reddit or 4chan, are also notorious for
their lax or nonexistent moderation policies. In times of culture wars, many
have claimed an erosion of free speech in a climate of “cancel culture” in
which it has allegedly become impossible to share particular viewpoints or
ideas. Equally, many have disputed the very existence of cancel culture and
have pointed out that many instances of deplatforming or cancellation are
mere shifts in discourse whereby voices that have been marginalised are now
claiming their rightful place on the discursive stage.



This conference takes the complex relationship between the facilitation and
moderation of content, particularly content that can broadly be labelled
“problematic”, as its starting point. Opinions among politicians, tech
experts, academics and others differ on what types of content platforms
should flag or remove, and on whether overly restrictive regulation
constitutes censorship. A blanket ban on “problematic” online content is
unrealistic and perhaps counterproductive. While illegal content or posts
that constitute hate speech, such as death threats, should surely be
removed, content moderation becomes more complex when it comes to
“problematic” content such as images of self-harm, “pro-Ana” communities or
the open discussion of mental health problems like suicidality or
depression.

This interdisciplinary conference invites papers that deal with the
complexity of content moderation and associated topics from different
perspectives.



We invite critical papers that can address but are not limited to:

- The dynamics and functions of content moderation (flagging, up- and
downvoting, removing, etc.)
- Content moderators, labour and trauma
- Practices of in/visibility: shadow banning, soft blocking and other forms
of disengagement
- Self- and collective regulation (e.g. in fora)
- Content moderation, regulation and the law
- Culture wars and cancel culture
- Polarisation, filter bubbles and echo chambers
- Problematic content and mental health communities
- Shitstorms and online shaming
- Psychodynamics of social media and platforms



This conference is organised by Jacob Johanssen (St. Mary’s University),
Daniela Nadj (St. Mary’s University) and Susanne Benzel (Sigmund Freud
Institute, University of Frankfurt) as part of the research project *Mapping
Online Mental Illness Communities: History, Representation and Questions of
Regulation*. Lunch and refreshments will be provided. The conference will be
a hybrid event: in-person attendance is encouraged and virtual participation
is possible.



Please submit an abstract of 250-500 words by *08 May* to the organisers:
jacob.johanssen at stmarys.ac.uk, daniela.nadj at stmarys.ac.uk and
benzel at sigmund-freud-institut.de
