Using Terms and Conditions to apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal, 2023

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platformʼs terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.

Content moderation, Digital Services Act, Freedom of expression, Online platforms, Platform regulation, Terms and conditions

Bibtex

@article{nokey,
  title = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation},
  author = {Quintais, J. and Appelman, N. and Fahy, R.},
  doi = {10.1017/glj.2023.53},
  year = {2023},
  date = {2023-07-11},
  journal = {German Law Journal},
  abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platformʼs terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.},
  keywords = {Content moderation, Digital Services Act, Freedom of expression, Online platforms, Platform regulation, Terms and conditions},
}

Copyright Content Moderation in the EU: Conclusions and Recommendations

Quintais, J., Katzenbach, C., Schwemer, S., Dergacheva, D., Riis, T., Mezei, P. & Harkai, I.
2023

Abstract

This report is a deliverable in the reCreating Europe project. The report describes and summarizes the results of our research on the mapping of the EU legal framework and intermediaries’ practices on copyright content moderation and removal. In particular, this report summarizes the results of our previous deliverables and tasks, namely: (1) our Final Report on mapping of EU legal framework and intermediaries’ practices on copyright content moderation and removal; and (2) our Final Evaluation and Measuring Report - impact of moderation practices and technologies on access and diversity. Our previous reports contain a detailed description of the legal and empirical methodology underpinning our research and findings. This report focuses on bringing together these findings in a concise format and advancing policy recommendations.

Content moderation, Copyright, Digital Services Act, Digital Single Market, Intermediaries, Online platforms, Terms and conditions

Bibtex

@report{nokey,
  title = {Copyright Content Moderation in the EU: Conclusions and Recommendations},
  author = {Quintais, J. and Katzenbach, C. and Schwemer, S. and Dergacheva, D. and Riis, T. and Mezei, P. and Harkai, I.},
  url = {https://www.ivir.nl/publications/copyright-content-moderation-in-the-eu-conclusions-and-recommendations/ssrn-id4403423/},
  year = {2023},
  date = {2023-03-30},
  abstract = {This report is a deliverable in the reCreating Europe project. The report describes and summarizes the results of our research on the mapping of the EU legal framework and intermediaries’ practices on copyright content moderation and removal. In particular, this report summarizes the results of our previous deliverables and tasks, namely: (1) our Final Report on mapping of EU legal framework and intermediaries’ practices on copyright content moderation and removal; and (2) our Final Evaluation and Measuring Report - impact of moderation practices and technologies on access and diversity. Our previous reports contain a detailed description of the legal and empirical methodology underpinning our research and findings. This report focuses on bringing together these findings in a concise format and advancing policy recommendations.},
  keywords = {Content moderation, Copyright, Digital Services Act, Digital Single Market, Intermediaries, Online platforms, Terms and conditions},
}

Impact of content moderation practices and technologies on access and diversity

Schwemer, S., Katzenbach, C., Dergacheva, D., Riis, T. & Quintais, J.
2023

Abstract

This Report presents the results of research carried out as part of Work Package 6 “Intermediaries: Copyright Content Moderation and Removal at Scale in the Digital Single Market: What Impact on Access to Culture?” of the project “ReCreating Europe”, particularly on Tasks 6.3 (Evaluating Legal Frameworks on the Different Levels (EU vs. national, public vs. private)) and 6.4 (Measuring the impact of moderation practices and technologies on access and diversity). This work centers on a normative analysis of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity.

Content moderation, Copyright, Digital Services Act, Digital Single Market, Intermediaries, Online platforms, Terms and conditions

Bibtex

@report{nokey,
  title = {Impact of content moderation practices and technologies on access and diversity},
  author = {Schwemer, S. and Katzenbach, C. and Dergacheva, D. and Riis, T. and Quintais, J.},
  url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4380345},
  year = {2023},
  date = {2023-03-23},
  abstract = {This Report presents the results of research carried out as part of Work Package 6 “Intermediaries: Copyright Content Moderation and Removal at Scale in the Digital Single Market: What Impact on Access to Culture?” of the project “ReCreating Europe”, particularly on Tasks 6.3 (Evaluating Legal Frameworks on the Different Levels (EU vs. national, public vs. private)) and 6.4 (Measuring the impact of moderation practices and technologies on access and diversity). This work centers on a normative analysis of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity.},
  keywords = {Content moderation, Copyright, Digital Services Act, Digital Single Market, Intermediaries, Online platforms, Terms and conditions},
}

Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.

Content moderation, Digital Services Act, Freedom of expression, Online platforms, Platform regulation, Terms and conditions

Bibtex

@article{nokey,
  title = {Using Terms and Conditions to Apply Fundamental Rights to Content Moderation},
  author = {Quintais, J. and Appelman, N. and Fahy, R.},
  url = {https://osf.io/f2n7m/},
  year = {2022},
  date = {2022-11-25},
  journal = {German Law Journal (forthcoming)},
  abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.},
  keywords = {Content moderation, Digital Services Act, Freedom of expression, Online platforms, Platform regulation, Terms and conditions},
}

Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?

Appelman, N., Quintais, J. & Fahy, R.
Verfassungsblog, 2021

Digital Services Act, DSA, Fundamental rights, Online platforms, Terms and conditions

Bibtex

@online{Appelman2021,
  title = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?},
  author = {Appelman, N. and Quintais, J. and Fahy, R.},
  url = {https://verfassungsblog.de/power-dsa-dma-06/},
  doi = {10.17176/20210901-233103-0},
  year = {2021},
  date = {2021-09-01},
  keywords = {Digital Services Act, DSA, Fundamental rights, Online platforms, Terms and conditions},
}