New Publication: Navigating the gray areas of content moderation


We are excited to announce the release of the first study from Andrea Stockinger's dissertation project. Co-authored with Svenja Schäfer and Sophie Lecheler, the study examines the challenges of online content moderation and the impact of AI-based moderation tools.

Full Article.

Professional content moderators are responsible for limiting the negative effects of online discussions on news platforms and social media. However, little is known about how they adapt to platform and company moderation strategies when reviewing and handling uncivil comments. Using qualitative interviews (N = 18), this study examines which types of comments professional moderators classify as actionable, which (automated) strategies they use to moderate them, and how these perceptions and strategies differ between organizations, platforms, and individuals. Our results show that moderators divide content requiring intervention into clearly problematic and “gray area” comments. They (automatically) delete clear cases but use interactive or motivational moderation techniques for “gray areas.” While moderators crave more advanced technologies, they deem them incapable of addressing context-heavy comments. These findings highlight the need for nuanced regulations, emphasize the crucial role of moderators in shaping public discourse, and offer practical implications for (semi-)automated content moderation strategies.