Partner: The Greens/EFA in the European Parliament
Content moderation is a common practice across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, communities and forums. These platforms abound in content created and shared organically by users, including text, images, videos, posts, tweets and reviews. Major social media platforms rely on teams of content moderators to ensure that objectionable content, such as explicit images, pornography and spam, is removed before it reaches a large audience. Moderation can occur either before or after content is posted to a platform. The content governance models of large international online platforms focus on the deletion of content, which poses numerous challenges for freedom of expression and the digital rights of users. Smaller platforms, however, use many alternative content governance models that can lead to better outcomes.
This study therefore investigates alternative approaches to content moderation and the content governance models of community-led platforms. In doing so, it could help make some of these innovations better known and ensure that they are meaningfully considered in the debate on the EU Digital Services Act (DSA).
Keywords: Content Moderation, Alternative Content Governance Models, Online Platforms
This report will be available for download soon.