Pieter Wolters is an associate professor at the Radboud Business Law Institute and Radboud's interdisciplinary research hub on digitalization and society (iHub). He studied law at Radboud University, the University of North Carolina at Chapel Hill and the University of Michigan. He obtained his PhD in law in 2013. His research focusses on the private law aspects of digitalisation, data protection and cybersecurity. His recent research covers the online formation of companies, data sharing and data transfers, the role of cybersecurity in hybrid conflicts, content moderation and the European legal framework for cybersecurity.
Content moderation by online platforms and a resilient democracy
Online platforms can be a boon to democracy. They facilitate the dissemination of information, especially for individuals, minorities and small organisations that would otherwise lack an effective platform. At the same time, online platforms can also have negative effects. They can increase polarisation and facilitate the dissemination of disinformation and hate speech. For a long time, online platforms had hardly any legal responsibility to address these risks.
The increasing relevance of online platforms has led to a departure from this position. The European Union has adopted several instruments that require online platforms to take more responsibility. These include the general Digital Services Act (DSA), but also more specific instruments such as the Strengthened Code of Practice on Disinformation, the Code of Conduct on Countering Illegal Hate Speech Online, the Regulation on the transparency and targeting of political advertising and the European Media Freedom Act. These instruments contain relatively vague ‘content moderation rules’ that have to be interpreted and elaborated upon in guidelines, best practices, recommendations, policies and structured dialogues.
The result is a complicated multi-level legal framework of content moderation rules. Although the various rules have been analysed in isolation, a thorough and up-to-date overview is lacking. This contribution provides such an overview of the European legal framework. Furthermore, it assesses the effectiveness of this framework. The framework is considered effective if it prevents the negative effects of online platforms without unduly impairing their positive effects.
This contribution shows that the effectiveness of the general rules of the DSA is limited. Their general character, extensive safeguards for fundamental rights and focus on illegal content make them unsuitable for preventing the dissemination of content that undermines the resilience of democracy. In contrast, the more specific instruments facilitate more targeted rules that protect democracy against the dissemination of undesirable content without unduly affecting the necessary rule-of-law preconditions. At the same time, even these specific rules still require extensive interpretation and elaboration.