How the EU Outsources the Task of Human Rights Protection to Platforms and Users: The Case of UGC Monetization

Senftleben, M., Quintais, J. & Meiring, A.
Berkeley Technology Law Journal, vol. 38, iss. 3, pp. 933-1010, 2024

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. 
Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, proportionality, user-generated content

Bibtex

@article{nokey, title = {How the EU Outsources the Task of Human Rights Protection to Platforms and Users: The Case of UGC Monetization}, author = {Senftleben, M. and Quintais, J. and Meiring, A.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4421150}, year = {2024}, date = {2024-01-23}, journal = {Berkeley Technology Law Journal}, volume = {38}, issue = {3}, pages = {933-1010}, abstract = {With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. 
Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.}, keywords = {Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, proportionality, user-generated content}, }

The right to encryption: Privacy as preventing unlawful access

van Daalen, O.
Computer Law & Security Review, vol. 49, 2023

Abstract

Encryption technologies are a fundamental building block of modern digital infrastructure, but plans to curb these technologies continue to spring up. Even in the European Union, where their application is by now firmly embedded in legislation, lawmakers are again calling for measures which would impact these technologies. One of the most important arguments in this debate is human rights, most notably the rights to privacy and to freedom of expression. And although some authors have in the past explored how encryption technologies support human rights, this connection is not yet firmly grounded in an analysis of European human rights case law. This contribution aims to fill this gap, developing a framework for assessing restrictions of encryption technologies under the rights to privacy and freedom of expression as protected under the European Convention on Human Rights (the Convention) and the Charter of Fundamental Rights of the European Union (the Charter). In the first section, the relevant function of encryption technologies, restricting access to information (called confidentiality), is discussed. In the second section, an overview of some governmental policies and practices impacting these technologies is provided. This continues with a discussion of the case law on the rights to privacy, data protection and freedom of expression, arguing that these rights are not only about ensuring lawful access by governments to protected information, but also about preventing unlawful access by others. And because encryption technologies are an important technology to reduce the risk of this unlawful access, it is then proposed that this risk is central to the assessment of governance measures in the field of encryption technologies. 
The article concludes by recommending that states perform an in-depth assessment of this when proposing new measures, and that courts when reviewing them also place the risk of unlawful access central to the analysis of interference and proportionality.

communications confidentiality, encryption, Freedom of expression, Human rights, Privacy, unlawful access

Bibtex

@article{nokey, title = {The right to encryption: Privacy as preventing unlawful access}, author = {van Daalen, O.}, url = {https://www.sciencedirect.com/science/article/pii/S0267364923000146}, doi = {https://doi.org/10.1016/j.clsr.2023.105804}, year = {2023}, date = {2023-05-23}, journal = {Computer Law & Security Review}, volume = {49}, abstract = {Encryption technologies are a fundamental building block of modern digital infrastructure, but plans to curb these technologies continue to spring up. Even in the European Union, where their application is by now firmly embedded in legislation, lawmakers are again calling for measures which would impact these technologies. One of the most important arguments in this debate is human rights, most notably the rights to privacy and to freedom of expression. And although some authors have in the past explored how encryption technologies support human rights, this connection is not yet firmly grounded in an analysis of European human rights case law. This contribution aims to fill this gap, developing a framework for assessing restrictions of encryption technologies under the rights to privacy and freedom of expression as protected under the European Convention on Human Rights (the Convention) and the Charter of Fundamental Rights of the European Union (the Charter). In the first section, the relevant function of encryption technologies, restricting access to information (called confidentiality), is discussed. In the second section, an overview of some governmental policies and practices impacting these technologies is provided. This continues with a discussion of the case law on the rights to privacy, data protection and freedom of expression, arguing that these rights are not only about ensuring lawful access by governments to protected information, but also about preventing unlawful access by others. 
And because encryption technologies are an important technology to reduce the risk of this unlawful access, it is then proposed that this risk is central to the assessment of governance measures in the field of encryption technologies. The article concludes by recommending that states perform an in-depth assessment of this when proposing new measures, and that courts when reviewing them also place the risk of unlawful access central to the analysis of interference and proportionality.}, keywords = {communications confidentiality, encryption, Freedom of expression, Human rights, Privacy, unlawful access}, }

Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act

Senftleben, M., Quintais, J. & Meiring, A.

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. 
Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content

Bibtex

@online{nokey, title = {Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act}, author = {Senftleben, M. and Quintais, J. and Meiring, A.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4421150}, abstract = {With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. 
Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.}, keywords = {Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content}, }

Export control of cybersurveillance items in the new dual-use regulation: The challenges of applying human rights logic to export control

van Daalen, O., van Hoboken, J. & Rucz, M.
Computer Law & Security Review, vol. 48, 2023

Abstract

In 2021, the Recast Dual-Use Regulation entered into force. The regulation includes a heavily debated new provision on the export control of so-called cybersurveillance items. This provision departs from the traditional logic of export control rules in multiple ways. Most importantly, it positions human rights considerations as an important factor in the export control of a flexible range of technologies. This article explores the operation, implications and challenges of this new human rights-orientated approach to export control of digital surveillance technologies. Taking the definition of cybersurveillance items as a starting point of the analysis, the article draws on surveillance-related case law of the European Court of Human Rights and the Court of Justice of the European Union, to define the potential scope of application of the open-ended cybersurveillance concept of the Regulation. By exploring how this concept maps to technologies often connected with human rights infringements, such as facial recognition, location tracking and open-source intelligence, the article highlights the challenges of applying this new approach and underscores the need for its further development in practice.

cybersurveillance, Human rights, Regulation

Bibtex

@article{nokey, title = {Export control of cybersurveillance items in the new dual-use regulation: The challenges of applying human rights logic to export control}, author = {van Daalen, O. and van Hoboken, J. and Rucz, M.}, doi = {https://doi.org/10.1016/j.clsr.2022.105789}, year = {2023}, date = {2023-04-21}, journal = {Computer Law & Security Review}, volume = {48}, abstract = {In 2021, the Recast Dual-Use Regulation entered into force. The regulation includes a heavily debated new provision on the export control of so-called cybersurveillance items. This provision departs from the traditional logic of export control rules in multiple ways. Most importantly, it positions human rights considerations as an important factor in the export control of a flexible range of technologies. This article explores the operation, implications and challenges of this new human rights-orientated approach to export control of digital surveillance technologies. Taking the definition of cybersurveillance items as a starting point of the analysis, the article draws on surveillance-related case law of the European Court of Human Rights and the Court of Justice of the European Union, to define the potential scope of application of the open-ended cybersurveillance concept of the Regulation. By exploring how this concept maps to technologies often connected with human rights infringements, such as facial recognition, location tracking and open-source intelligence, the article highlights the challenges of applying this new approach and underscores the need for its further development in practice.}, keywords = {cybersurveillance, Human rights, Regulation}, }

Protection of Intellectual Property Rights per Protocol No. 1 of the Convention for the Protection of Human Rights and Fundamental Freedoms

Izyumenko, E.
GRUR International, vol. 72, iss. 3, pp. 323-324, 2023

Human rights, Intellectual property

Bibtex

@article{nokey, title = {Protection of Intellectual Property Rights per Protocol No. 1 of the Convention for the Protection of Human Rights and Fundamental Freedoms}, author = {Izyumenko, E.}, doi = {https://doi.org/10.1093/grurint/ikac144}, year = {2023}, date = {2023-02-02}, journal = {GRUR International}, volume = {72}, issue = {3}, pages = {323-324}, note = {Case note}, keywords = {Human rights, Intellectual property}, }

Europe’s Human Rights Court rules for the first time on a breach of a copyright holder’s right to property in a private dispute

Izyumenko, E.
Journal of Intellectual Property Law & Practice, vol. 17, iss. 11, pp. 896–898, 2022

Abstract

The European Court of Human Rights has recently ruled that the domestic courts’ failure to justify the grounds for dismissing the applicant’s copyright infringement claim in a private-party dispute concerning the unauthorized online reproduction of the applicant’s book breached the latter’s human right to property. Notably, the Court was not satisfied with the fact that the national courts had not persuasively explained their conclusions regarding the applicability in the applicant’s case of digital exhaustion and of copyright exceptions for libraries and private copying.

Copyright, Human rights

Bibtex

@article{nokey, title = {Europe’s Human Rights Court rules for the first time on a breach of a copyright holder’s right to property in a private dispute}, author = {Izyumenko, E.}, doi = {https://doi.org/10.1093/jiplp/jpac093}, year = {2022}, date = {2022-10-17}, journal = {Journal of Intellectual Property Law & Practice}, volume = {17}, issue = {11}, pages = {896–898}, abstract = {The European Court of Human Rights has recently ruled that the domestic courts’ failure to justify the grounds for dismissing the applicant’s copyright infringement claim in a private-party dispute concerning the unauthorized online reproduction of the applicant’s book breached the latter’s human right to property. Notably, the Court was not satisfied with the fact that the national courts had not persuasively explained their conclusions regarding the applicability in the applicant’s case of digital exhaustion and of copyright exceptions for libraries and private copying.}, keywords = {Copyright, Human rights}, }

Governing “European values” inside data flows: interdisciplinary perspectives

Irion, K., Kolk, A., Buri, M. & Milan, S.
Internet Policy Review, vol. 10, num. 3, 2021

Abstract

This editorial introduces ten research articles, which form part of this special issue, exploring the governance of “European values” inside data flows. Protecting fundamental human rights and critical public interests that undergird European societies in a global digital ecosystem poses complex challenges, especially because the United States and China are leading in novel technologies. We envision a research agenda calling upon different disciplines to further identify and understand European values that can adequately perform under conditions of transnational data flows.

Artificial intelligence, Data flows, Data governance, Digital connectivity, European Union, European values, Human rights, Internet governance, Personal data protection, Public policy, Societal values

Bibtex

@article{Irion2021e, title = {Governing “European values” inside data flows: interdisciplinary perspectives}, author = {Irion, K. and Kolk, A. and Buri, M. and Milan, S.}, url = {https://policyreview.info/european-values}, doi = {https://doi.org/10.14763/2021.3.1582}, year = {2021}, date = {2021-10-11}, journal = {Internet Policy Review}, volume = {10}, number = {3}, abstract = {This editorial introduces ten research articles, which form part of this special issue, exploring the governance of “European values” inside data flows. Protecting fundamental human rights and critical public interests that undergird European societies in a global digital ecosystem poses complex challenges, especially because the United States and China are leading in novel technologies. We envision a research agenda calling upon different disciplines to further identify and understand European values that can adequately perform under conditions of transnational data flows.}, keywords = {Artificial intelligence, Data flows, Data governance, Digital connectivity, European Union, European values, Human rights, Internet governance, Personal data protection, Public policy, Societal values}, }

Panta Rhei: A European Perspective on Ensuring a High Level of Protection of Human Rights in a World in Which Everything Flows

Irion, K.
Big Data and Global Trade Law, Cambridge University Press, 2021

Abstract

Human rights do remain valid currency in how we approach planetary-scale computation and accompanying data flows. Today’s system of human rights protection, however, is highly dependent on domestic legal institutions, which unravel faster than the reconstruction of fitting transnational governance institutions. The chapter takes a critical look at the construction of the data flow metaphor as a policy concept inside international trade law. Subsequently, it explores how the respect for human rights ties in with national constitutionalism that becomes increasingly challenged by the transnational dynamic of digital era transactions. Lastly, the chapter turns to international trade law and why its ambitions to govern cross-border data flows will likely not advance efforts to generate respect for human rights. In conclusion, the chapter advocates for a rebalancing act that recognizes human rights inside international trade law.

Artificial intelligence, EU law, Human rights, Transparency, WTO law

Bibtex

@incollection{Irion2021bb, title = {Panta Rhei: A European Perspective on Ensuring a High Level of Protection of Human Rights in a World in Which Everything Flows}, author = {Irion, K.}, booktitle = {Big Data and Global Trade Law}, publisher = {Cambridge University Press}, url = {https://www.cambridge.org/core/books/big-data-and-global-trade-law/panta-rhei/B0E5D7851240E0D2F4562B3C6DFF3011}, doi = {https://doi.org/10.1017/9781108919234.015}, year = {2021}, date = {2021-07-05}, abstract = {Human rights do remain valid currency in how we approach planetary-scale computation and accompanying data flows. Today’s system of human rights protection, however, is highly dependent on domestic legal institutions, which unravel faster than the reconstruction of fitting transnational governance institutions. The chapter takes a critical look at the construction of the data flow metaphor as a policy concept inside international trade law. Subsequently, it explores how the respect for human rights ties in with national constitutionalism that becomes increasingly challenged by the transnational dynamic of digital era transactions. Lastly, the chapter turns to international trade law and why its ambitions to govern cross-border data flows will likely not advance efforts to generate respect for human rights. In conclusion, the chapter advocates for a rebalancing act that recognizes human rights inside international trade law.}, keywords = {Artificial intelligence, EU law, Human rights, Transparency, WTO law}, }

Panta rhei: A European Perspective on Ensuring a High-Level of Protection of Digital Human Rights in a World in Which Everything Flows

Irion, K.
Amsterdam Law School Research Paper No. 2020, num. 38, 2020

Artificial intelligence, data flow, EU law, Human rights, WTO law

Bibtex

@article{Irion2020d, title = {Panta rhei: A European Perspective on Ensuring a High-Level of Protection of Digital Human Rights in a World in Which Everything Flows}, author = {Irion, K.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3638864}, year = {2020}, date = {2020-11-30}, journal = {Amsterdam Law School Research Paper No. 2020}, number = {38}, keywords = {Artificial intelligence, data flow, EU law, Human rights, WTO law}, }

European Court of Human Rights rules that collateral website blocking violates freedom of expression

Izyumenko, E.
Journal of Intellectual Property Law & Practice, vol. 15, iss. 10, pp. 774–775, 2020

Freedom of expression, Human rights

Bibtex

@article{nokey, title = {European Court of Human Rights rules that collateral website blocking violates freedom of expression}, author = {Izyumenko, E.}, doi = {https://doi.org/10.1093/jiplp/jpaa135}, year = {2020}, date = {2020-10-23}, journal = {Journal of Intellectual Property Law & Practice}, volume = {15}, issue = {10}, pages = {774–775}, keywords = {Freedom of expression, Human rights}, }
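The BibTeX records above all follow the same one-line `field = {value}` layout. As a minimal sketch of how such records could be read programmatically — assuming the corrected `@article{...}` syntax and flat (non-nested) brace values, and using a hypothetical `parse_entry` helper rather than any particular BibTeX library:

```python
import re

# Matches "name = {value}" pairs; assumes values contain no nested braces,
# which holds for the single-line records in this list.
FIELD_RE = re.compile(r"(\w+)\s*=\s*\{([^{}]*)\}")

def parse_entry(entry: str) -> dict:
    """Return a dict mapping lowercase field names to values for one entry."""
    return {name.lower(): value for name, value in FIELD_RE.findall(entry)}

entry = ('@article{nokey, title = {The right to encryption}, '
         'author = {van Daalen, O.}, year = {2023}, volume = {49}}')
fields = parse_entry(entry)
print(fields["title"], fields["year"])  # prints: The right to encryption 2023
```

This is only a convenience for skimming the list; a real bibliography pipeline would use a proper BibTeX parser that handles nested braces and multi-line entries.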