Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act

Senftleben, M., Quintais, J. & Meiring, A.

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. 
Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content


An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation

Computer Law & Security Review, vol. 48, 2023

Abstract

This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility remedies often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotion’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since ranking power manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.

content curation, Content moderation, Digital Services Act (DSA), Online platforms, Transparency


Copyright Content Moderation in the EU: Conclusions and Recommendations

Quintais, J., Katzenbach, C., Schwemer, S., Dergacheva, D., Riis, T., Mezei, P. & Harkai, I.
2023

Abstract

This report is a deliverable in the reCreating Europe project. The report describes and summarizes the results of our research on the mapping of the EU legal framework and intermediaries’ practices on copyright content moderation and removal. In particular, this report summarizes the results of our previous deliverables and tasks, namely: (1) our Final Report on mapping of EU legal framework and intermediaries’ practices on copyright content moderation and removal; and (2) our Final Evaluation and Measuring Report - impact of moderation practices and technologies on access and diversity. Our previous reports contain a detailed description of the legal and empirical methodology underpinning our research and findings. This report focuses on bringing together these findings in a concise format and advancing policy recommendations.

Content moderation, Copyright, Digital Services Act (DSA), Digital Single Market, intermediaries, Online platforms, terms and conditions


Impact of content moderation practices and technologies on access and diversity

Schwemer, S., Katzenbach, C., Dergacheva, D., Riis, T. & Quintais, J.
2023

Abstract

This Report presents the results of research carried out as part of Work Package 6 “Intermediaries: Copyright Content Moderation and Removal at Scale in the Digital Single Market: What Impact on Access to Culture?” of the project “ReCreating Europe”, particularly on Task 6.3 (Evaluating Legal Frameworks on the Different Levels (EU vs. national, public vs. private)) and Task 6.4 (Measuring the impact of moderation practices and technologies on access and diversity). This work centers on a normative analysis of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity.

Content moderation, Copyright, Digital Services Act (DSA), Digital Single Market, intermediaries, Online platforms, terms and conditions


How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications

Quintais, J., De Gregorio, G. & Magalhães, J.C.
Computer Law & Security Review, vol. 48, 2023

Abstract

Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions is partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. 
This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions have shifted over the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.

CDSM Directive, Content moderation, Copyright, creators, Digital Services Act (DSA), online content, Online platforms, platform regulation, private ordering, terms of service


Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. 
But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.

Content moderation, Digital Services Act (DSA), Freedom of expression, Online platforms, platform regulation, terms and conditions


Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?

Content moderation, Digital Services Act (DSA), Fundamental rights


The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?

Quintais, J. & Schwemer, S.
European Journal of Risk Regulation, vol. 13, iss. : 2, pp: 191-217, 2022

Abstract

On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the e-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content-sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap, as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.

Art. 17 CDSM Directive, Content moderation, Copyright, Digital Services Act (DSA), Online platforms


Intermediary Liability and Trade Mark Infringement – Proliferation of Filter Obligations in Civil Law Jurisdictions?

1126, pp: 381-403

Abstract

The erosion of the safe harbour for hosting in the EU Directive on Copyright in the Digital Single Market (CDSM Directive) leads to a remarkable climate change in the field of EU copyright law and the civil law jurisdictions of continental EU Member States. Inevitably, it raises the question of potential repercussions on the safe harbour for hosting and filtering standards in trademark cases. Even though online marketplaces are explicitly exempted from the new copyright rules and the CDSM Directive is not intended to neutralize the safe harbour for hosting in trademark cases, the adoption of a more restrictive approach in copyright law may quicken the appetite of trademark proprietors for similar measures in trademark law. The extension of the new copyright approach to trademark cases, however, is unlikely to yield satisfactory results. Due to the different conceptual contours of trademark rights, a system mimicking the filtering obligations following from the CDSM Directive would give trademark proprietors excessive control over the use of their trademarks in the digital environment. Such an overbroad system of automated, algorithmic filtering would encroach upon the fundamental guarantees of freedom of expression and freedom of competition. It is likely to have a chilling effect on legitimate descriptive use of trademarks, comparative advertising, advertising by resellers, information about alternative offers in the marketplace, and use criticizing or commenting upon trademarked products. As a result, consumers would receive less diverse information on goods and services, and the free movement of goods and services in the internal market would be curtailed. The reliability of the internet as an independent source of trademark-related information would be put at risk. The analysis thus leads to the insight that a proliferation of the new filtering obligations in copyright law is undesirable and should be avoided.

algorithmic enforcement, confusion, Content moderation, descriptive use, dilution, exhaustion of trademark rights, filtering obligations, free movement of goods and services, freedom of commercial expression, freedom of competition, market transparency, parallel imports, platform economy, trademark law
