Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act

Senftleben, M., Quintais, J. & Meiring, A.

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take human rights into account when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis devotes particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. On this basis, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we ask whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, Derivative works, Discrimination, Freedom of expression, Human rights, Liability, User-generated content

SLAPPed by the GDPR: protecting public interest journalism in the face of GDPR-based strategic litigation against public participation

Journal of Media Law, vol. 14, iss. 2, pp. 378-405, 2022

Abstract

Strategic litigation against public participation is a threat to public interest journalism. Although typically a defamation claim underpins a SLAPP, the GDPR may serve as an alternative basis. This paper explores how public interest journalism is protected, and could be better protected, from abusive GDPR proceedings. The GDPR addresses the tension between data protection and freedom of expression by providing for a journalistic exemption. However, narrow national implementations of this provision leave the GDPR open for abuse. By analysing GDPR proceedings against newspaper Forbes Hungary, the paper illustrates how the GDPR can be instrumentalised as a SLAPP strategy. As European anti-SLAPP initiatives are fine-tuned, abusive GDPR proceedings need to be recognised as an emerging form of SLAPP. This requires more attention to inadequate engagement with European freedom of expression standards in national implementations of the GDPR, to data protection authorities’ role in facilitating SLAPPs, and to the chilling effects of GDPR sanctions.

Data protection, Freedom of expression, GDPR, Journalistic exemption, SLAPPs

Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But for the application and enforcement of T&Cs to have due regard to fundamental rights, Article 14 must be operationalized within the framework of international and European fundamental rights standards; only then can it fulfil its revolutionary potential.

Content moderation, Digital Services Act (DSA), Freedom of expression, Online platforms, Platform regulation, Terms and conditions

A Freedom of Expression Right to Register “Immoral” Trademarks and Trademarks Contrary to Public Order

IIC, vol. 52, iss. 7, pp. 893-914, 2021

Abstract

Recently, in a judgment on the “Fack Ju Göhte” case, the Court of Justice of the European Union (CJEU) acknowledged that freedom of expression must be taken into account when applying the absolute ground for refusal of trademark registration related to public policy or to accepted principles of morality. Even prior to this pronouncement by the CJEU, the European Court of Human Rights (ECtHR) had already confirmed that the refusal of trademark registration, as such, implicates the speech rights of trademark applicants. The European Union Intellectual Property Office (EUIPO), likewise, had admitted on a number of occasions that the trademark applicant seeking registration of an “immoral” trademark or a trademark contrary to public order has a right “to freely employ words and images in the signs it wishes to register as trademarks”. This article explains what the freedom of expression grounding of the rights of trademark applicants to so-called “immoral” trademarks and/or trademarks contrary to public order might mean for the future of these absolute grounds for refusal of trademark registration in Europe. It does so by reviewing, first, whether the wording and practical application of these grounds for refusal comply with the standards that can be derived from Art. 10 (freedom of expression) of the European Convention on Human Rights (ECHR). It then examines particularities of the free speech analysis with regard to religious, sexually obscene or otherwise “immoral” signs, as well as with regard to signs amounting to hate speech or other speech presumably dangerous to public order.

Freedom of expression, Trademark law

The Pelham Chronicles: Sampling, Copyright and Fundamental Rights

Journal of Intellectual Property Law & Practice, vol. 16, iss. 3, pp. 213-225, 2021

Abstract

On 29 July 2019 the Court of Justice of the European Union (CJEU or Court) rendered its long-awaited judgment in Pelham. This judgment was published together, but not jointly, with those on Spiegel Online and Funke Medien. A bit less than a year later, on 30 April 2020, the German Federal Court of Justice (Bundesgerichtshof or BGH), which had referred the cases to Luxembourg, rendered its judgments in all three cases. There are obvious parallels between these judgments, and their combined relevance for the interpretation of European copyright law in the light of EU fundamental rights cannot be overstated. This article focuses on Pelham, or the “Metall auf Metall” saga, as it is known in Germany. It analyses the relevant aspects and impact of Pelham in EU copyright law and examines how the BGH implemented the guidance provided by the CJEU. Where relevant, we draw parallels to Funke Medien and Spiegel Online. Pelham gave the Court the opportunity to define the scope of the related right of reproduction of phonogram producers in art. 2(c) of Directive 2001/29/EC (InfoSoc Directive). The question whether such right enjoys the same scope of protection as the reproduction right for authorial works had made its way through the German courts for a remarkable two decades. This saga included a constitutional complaint, which in 2016 answered the question in the affirmative. The BGH’s preliminary reference to the CJEU was particularly important because on the back of the reproduction question it sought to clarify issues with fundamental rights implications, in particular the scope of the quotation right or defence and its application to musical creativity in the form of sampling. This article proceeds as follows. After this introduction, we briefly revisit the Pelham saga in its journey through the German and European courts, providing the context to the underlying legal issues (2). We then turn to the interpretation of the scope of the reproduction and distribution rights for phonograms (3) before examining the CJEU’s assessment of the systematic nature of exceptions and limitations (E&Ls) (4). We then discuss the wider implications of Pelham for the role of fundamental rights in copyright law (5). We conclude with some doctrinal and practical observations on the wider implications of the “Metall auf Metall” saga (6).

Copyright, EU law, Freedom of expression, Fundamental rights, Funke Medien, Limitations and exceptions, Music sampling, Pelham, Spiegel Online

The freedom of expression perspectives on intellectual property in Europe

PhD thesis, CEIPI, Strasbourg.

Europe, Freedom of expression, Intellectual property

The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market

Abstract

EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act (‘DSA’), this finding implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.

Algorithmic enforcement, Censorship, Content moderation, Copyright, Defamation, Digital Services Act (DSA), Filtering, Freedom of expression, General monitoring, Hosting service, Injunctive relief, Intermediary liability, Notice and stay down, Notice and take down, Safe harbour, Trade mark, User-generated content

European Court of Human Rights rules that collateral website blocking violates freedom of expression

Journal of Intellectual Property Law & Practice, vol. 15, iss. 10, pp. 774-775, 2020

Freedom of expression, Human rights

Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society

Metzger, A., Senftleben, M., Derclaye, E., Dreier, T., Geiger, C., Griffiths, J., Hilty, R., Hugenholtz, P.B., Riis, T., Rognstad, O.A., Strowel, A.M., Synodinou, T. & Xalabarder, R.
2020

Abstract

The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It further discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – a factor that is often overlooked in the debate.

Algorithmic enforcement, Applicable law, Collective copyright management, Content hosting, Content moderation, Copyright contract law, EU copyright law, Filtering mechanisms, Freedom of expression, Licensing, Notice-and-takedown, Private international law, Transformative use, User-generated content

Implications of AI-driven tools in the media for freedom of expression

Background Paper to the Ministerial Conference "Artificial Intelligence – Intelligent Politics: Challenges and opportunities for media and democracy", Cyprus, 28-29 May 2020.

Artificial intelligence, Freedom of expression, Media law
