Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of international and European fundamental rights standards, thereby allowing it to fulfil its revolutionary potential.

Content moderation, Digital Services Act (DSA), Freedom of expression, Online platforms, platform regulation, terms and conditions

European Copyright Society – Comment on Copyright and the Digital Services Act Proposal

Peukert, A., Husovec, M., Kretschmer, M., Mezei, P. & Quintais, J.
IIC - International Review of Intellectual Property and Competition Law, vol. 53, iss. 3, pp. 358-376, 2022

Copyright, Digital Services Act (DSA), european copyright society, frontpage

European Copyright Society (ECS): Comment on Copyright and the Digital Services Act Proposal

Peukert, A., Husovec, M., Kretschmer, M., Mezei, P. & Quintais, J.
Kluwer Copyright Blog, 2022

Copyright, Digital Services Act (DSA), frontpage

From Risk to Reward? The DSA’s risk-based approach to disinformation

Pentney, K. & McGonagle, T.
'Unravelling the Digital Services Act package', M. Cappello (ed.), IRIS Special, Strasbourg: European Audiovisual Observatory, 2021, pp. 40-57

disinformation, Digital Services Act (DSA), frontpage, Media law

Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?

Digital Services Act (DSA), frontpage, Fundamental rights, Online platforms, terms and conditions

Platform ad archives in Article 30 DSA

DSA Observatory blog, 2021

Digital Services Act (DSA), frontpage, Platforms

Regulation of news recommenders in the Digital Services Act: empowering David against the Very Large Online Goliath

Helberger, N., Drunen, M. van, Vrijenhoek, S. & Möller, J.
Internet Policy Review, 2021

Digital Services Act (DSA), frontpage, Media law, news recommenders, Regulation

Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?

Content moderation, Digital Services Act (DSA), frontpage, Fundamental rights

The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?

Quintais, J. & Schwemer, S.
European Journal of Risk Regulation, vol. 13, iss. : 2, pp: 191-217, 2022

Abstract

On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the e-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.

Art. 17 CDSM Directive, Content moderation, Copyright, Digital Services Act (DSA), frontpage, Online platforms
