Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal, 2023

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@Article{nokey, title = {Using Terms and Conditions to Apply Fundamental Rights to Content Moderation}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, doi = {https://doi.org/10.1017/glj.2023.53}, year = {2023}, date = {2023-07-11}, journal = {German Law Journal}, abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.}, keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions}, }

How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications

Quintais, J., De Gregorio, G. & Magalhães, J.C.
Computer Law & Security Review, vol. 48, 2023

Abstract

Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions are partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.

CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service

Bibtex

@Article{nokey, title = {How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications}, author = {Quintais, J. and De Gregorio, G. and Magalhães, J.C.}, url = {https://www.ivir.nl/publications/how-platforms-govern-users-copyright-protected-content-exploring-the-power-of-private-ordering-and-its-implications/computer_law_and_security_review_2023/}, doi = {https://doi.org/10.1016/j.clsr.2023.105792}, year = {2023}, date = {2023-02-24}, journal = {Computer Law & Security Review}, volume = {48}, abstract = {Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions are partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.}, keywords = {CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service}, }

Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@Article{nokey, title = {Using Terms and Conditions to Apply Fundamental Rights to Content Moderation}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://osf.io/f2n7m/}, year = {2022}, date = {2022-11-25}, journal = {German Law Journal (forthcoming)}, abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.}, keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions}, }

Should We Regulate Digital Platforms? A New Framework for Evaluating Policy Options

Nooren, P., Gorp, N. van, van Eijk, N. & Fahy, R.
Policy & Internet, pp. 264-301, 2018

Abstract

The economic and societal impact of digital platforms raises a number of questions for policymakers, including whether existing regulatory approaches and instruments are sufficient to promote and safeguard public interests. This article develops a practical framework that provides structure and guidance to policymakers who design policies for the digital economy. The framework differs from other approaches in taking the digital business models of platforms as the starting point for the analysis. The framework consists of three pillars, namely determining a platform's characteristics, relating these to public interests, and formulating policy options. The framework then invokes a return‐path analysis for assessing how the interventions affect the business model, whether it has the desired effect on public interests, and ensuring it has no undesired side‐effects on public interests. The framework puts forward two key messages for current discussions on digital platforms. First, one should look at the underlying characteristics of platforms rather than trying to understand digital platforms as a single category. Second, policymakers should explore existing rules and policy options, as they seem fit to deal with several characteristics of digital platforms in a time frame that matches the rapid development of platform technologies and business models.

business model analysis, competition policy, consumer protection, digital platforms, frontpage, platform regulation, public interests, Technologie en recht

Bibtex

@Article{Nooren2018, title = {Should We Regulate Digital Platforms? A New Framework for Evaluating Policy Options}, author = {Nooren, P. and Gorp, N. van and van Eijk, N. and Fahy, R.}, url = {https://www.ivir.nl/publicaties/download/Policy_and_Internet_2018.pdf}, doi = {https://doi.org/10.1002/poi3.177}, year = {2018}, date = {2018-09-11}, journal = {Policy & Internet}, pages = {264-301}, abstract = {The economic and societal impact of digital platforms raises a number of questions for policymakers, including whether existing regulatory approaches and instruments are sufficient to promote and safeguard public interests. This article develops a practical framework that provides structure and guidance to policymakers who design policies for the digital economy. The framework differs from other approaches in taking the digital business models of platforms as the starting point for the analysis. The framework consists of three pillars, namely determining a platform's characteristics, relating these to public interests, and formulating policy options. The framework then invokes a return‐path analysis for assessing how the interventions affect the business model, whether it has the desired effect on public interests, and ensuring it has no undesired side‐effects on public interests. The framework puts forward two key messages for current discussions on digital platforms. First, one should look at the underlying characteristics of platforms rather than trying to understand digital platforms as a single category. Second, policymakers should explore existing rules and policy options, as they seem fit to deal with several characteristics of digital platforms in a time frame that matches the rapid development of platform technologies and business models.}, keywords = {business model analysis, competition policy, consumer protection, digital platforms, frontpage, platform regulation, public interests, Technologie en recht}, }