Generative AI, Copyright and the AI Act

Abstract

This paper examines the copyright-relevant rules of the recently published Artificial Intelligence (AI) Act for the EU copyright acquis. The aim of the paper is to provide a critical overview of the relationship between the AI Act and EU copyright law, while highlighting potential gray areas and blind spots for legal interpretation and future policy-making. The paper proceeds as follows. After a short introduction, Section 2 outlines the basic copyright issues of generative AI and the relevant copyright acquis rules that interface with the AI Act. It mentions potential copyright issues with the input or training stage, the model, and outputs. The AI Act rules are mostly relevant for the training of AI models, and the Regulation primarily interfaces with the text and data mining (TDM) exceptions in Articles 3 and 4 of the Copyright in the Digital Single Market Directive (CDSMD). Section 3 then briefly explains the AI Act’s structure and core definitions as they pertain to copyright law. Section 4 is the heart of the paper. It covers in some detail the interface between the AI Act and EU copyright law, namely: the clarification that TDM is involved in training AI models (4.1); the outline of the key copyright obligations in the AI Act (4.2); the obligation to put in place policies to respect copyright law, especially regarding TDM opt-outs (4.3); the projected extraterritorial effect of such obligations (4.4); the transparency obligations (4.5); how the AI Act envisions compliance with such obligations (4.6); and potential enforcement and remedies (4.7). Section 5 offers some concluding remarks, focusing on the inadequacy of the current regime to address one of its main concerns: the fair remuneration of authors and performers.

AI Act, Content moderation, Copyright, DSA, Generative AI, text and data mining, Transparency

Bibtex

@techreport{nokey, type = {Working paper}, title = {Generative AI, Copyright and the AI Act}, author = {Quintais, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4912701}, year = {2024}, date = {2024-08-01}, abstract = {This paper examines the copyright-relevant rules of the recently published Artificial Intelligence (AI) Act for the EU copyright acquis. The aim of the paper is to provide a critical overview of the relationship between the AI Act and EU copyright law, while highlighting potential gray areas and blind spots for legal interpretation and future policy-making. The paper proceeds as follows. After a short introduction, Section 2 outlines the basic copyright issues of generative AI and the relevant copyright acquis rules that interface with the AI Act. It mentions potential copyright issues with the input or training stage, the model, and outputs. The AI Act rules are mostly relevant for the training of AI models, and the Regulation primarily interfaces with the text and data mining (TDM) exceptions in Articles 3 and 4 of the Copyright in the Digital Single Market Directive (CDSMD). Section 3 then briefly explains the AI Act’s structure and core definitions as they pertain to copyright law. Section 4 is the heart of the paper. It covers in some detail the interface between the AI Act and EU copyright law, namely: the clarification that TDM is involved in training AI models (4.1); the outline of the key copyright obligations in the AI Act (4.2); the obligation to put in place policies to respect copyright law, especially regarding TDM opt-outs (4.3); the projected extraterritorial effect of such obligations (4.4); the transparency obligations (4.5); how the AI Act envisions compliance with such obligations (4.6); and potential enforcement and remedies (4.7). Section 5 offers some concluding remarks, focusing on the inadequacy of the current regime to address one of its main concerns: the fair remuneration of authors and performers.}, keywords = {AI Act, Content moderation, Copyright, DSA, Generative AI, text and data mining, Transparency}, }

How to design data access for researchers: A legal and software development perspective

Drunen, M. van & Noroozian, A.
Computer Law & Security Review, vol. 52, 2024

Abstract

Public scrutiny of platforms has been limited by a lack of transparency. In response, EU law increasingly requires platforms to provide data to researchers. The Digital Services Act and the proposed Regulation on the Transparency and Targeting of Political Advertising in particular require platforms to provide access to data through ad libraries and in response to data access requests. However, these obligations leave platforms considerable discretion to determine how access to data is provided. As the history of platforms’ self-regulated data access projects shows, the technical choices involved in designing data access significantly affect how researchers can use the provided data to scrutinise platforms. Ignoring the way data access is designed therefore creates a danger that platforms’ ability to limit research into their services simply shifts from controlling what data is available to researchers, to how data access is provided. This article explores how the Digital Services Act and proposed Political Advertising Regulation should be used to control the operationalisation of data access obligations that enable researchers to scrutinise platforms. It argues the operationalisation of data access regimes should not only be seen as a legal problem, but also as a software design problem. To that end it explores how software development principles may inform the operationalisation of data access obligations. The article closes by exploring the legal mechanisms available in the Digital Services Act and proposed Political Advertising Regulation to exercise control over the design of data access regimes, and makes five recommendations for ways in which these mechanisms should be used to enable research into platforms.

data access, DSA, Platforms, Transparency

Bibtex

@article{nokey, title = {How to design data access for researchers: A legal and software development perspective}, author = {Drunen, M. van and Noroozian, A.}, doi = {10.1016/j.clsr.2024.105946}, year = {2024}, date = {2024-04-01}, journal = {Computer Law \& Security Review}, volume = {52}, abstract = {Public scrutiny of platforms has been limited by a lack of transparency. In response, EU law increasingly requires platforms to provide data to researchers. The Digital Services Act and the proposed Regulation on the Transparency and Targeting of Political Advertising in particular require platforms to provide access to data through ad libraries and in response to data access requests. However, these obligations leave platforms considerable discretion to determine how access to data is provided. As the history of platforms’ self-regulated data access projects shows, the technical choices involved in designing data access significantly affect how researchers can use the provided data to scrutinise platforms. Ignoring the way data access is designed therefore creates a danger that platforms’ ability to limit research into their services simply shifts from controlling what data is available to researchers, to how data access is provided. This article explores how the Digital Services Act and proposed Political Advertising Regulation should be used to control the operationalisation of data access obligations that enable researchers to scrutinise platforms. It argues the operationalisation of data access regimes should not only be seen as a legal problem, but also as a software design problem. To that end it explores how software development principles may inform the operationalisation of data access obligations. The article closes by exploring the legal mechanisms available in the Digital Services Act and proposed Political Advertising Regulation to exercise control over the design of data access regimes, and makes five recommendations for ways in which these mechanisms should be used to enable research into platforms.}, keywords = {data access, DSA, Platforms, Transparency}, }

From the DMCA to the DSA: A Transatlantic Dialogue on Online Platform Regulation and Copyright

Verfassungsblog, 2024

Copyright, DMCA, DSA, Online platforms

Bibtex

@online{nokey, title = {From the DMCA to the DSA: A Transatlantic Dialogue on Online Platform Regulation and Copyright}, author = {Quintais, J.}, url = {https://verfassungsblog.de/from-the-dmca-to-the-dsa/?s=09}, year = {2024}, date = {2024-02-19}, journal = {Verfassungsblog}, keywords = {Copyright, DMCA, DSA, Online platforms}, }

Opinie: De DSA en desinformatie: meer dan censuur alleen [Opinion: The DSA and disinformation: more than censorship alone]

Mediaforum, no. 5, p. 157, 2023

censuur, DSA

Bibtex

@article{nokey, title = {Opinie: De DSA en desinformatie: meer dan censuur alleen}, author = {Leerssen, P.}, url = {https://www.ivir.nl/publications/opinie-de-dsa-en-desinformatie-meer-dan-censuur-alleen/opinie_mediaforum_2023_5/}, year = {2023}, date = {2023-11-17}, journal = {Mediaforum}, number = {5}, pages = {157}, keywords = {censuur, DSA}, }

An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation

Computer Law & Security Review, vol. 48, 2023

Abstract

This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility remedies often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotions’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since it manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.

content curation, Content moderation, DSA, Online platforms, Transparency

Bibtex

@article{nokey, title = {An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation}, author = {Leerssen, P.}, url = {https://www.ivir.nl/publications/comment-an-end-to-shadow-banning-transparency-rights-in-the-digital-services-act-between-content-moderation-and-curation/endtoshadowbanning/}, doi = {10.1016/j.clsr.2023.105790}, year = {2023}, date = {2023-04-11}, journal = {Computer Law \& Security Review}, volume = {48}, abstract = {This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility remedies often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotions’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since it manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.}, keywords = {content curation, Content moderation, DSA, Online platforms, Transparency}, }

Putting the DSA into Practice: Enforcement, Access to Justice and Global Implications

Verfassungsbooks, 2023, ISBN: 9783757517960

Abstract

The Digital Services Act was finally published in the Official Journal of the European Union on 27 October 2022. This publication marks the end of a years-long drafting and negotiation process, and opens a new chapter: that of its enforcement, practicable access to justice, and potential to set global precedents. The Act has been portrayed as Europe’s new “Digital Constitution”, which affirms the primacy of democratic rulemaking over the private transnational ordering mechanisms of Big Tech. With it, the European Union aims once again to set a global standard in the regulation of the digital environment. But will the Digital Services Act be able to live up to its expectations, and under what conditions?

big tech, DSA, enforcement

Bibtex

@book{nokey, title = {Putting the DSA into Practice: Enforcement, Access to Justice and Global Implications}, author = {van Hoboken, J. and Quintais, J. and Appelman, N. and Fahy, R. and Buri, I. and Straub, M.}, publisher = {Verfassungsbooks}, isbn = {9783757517960}, url = {https://verfassungsblog.de/wp-content/uploads/2023/02/vHoboken-et-al_Putting-the-DSA-into-Practice.pdf}, doi = {10.17176/20230208-093135-0}, year = {2023}, date = {2023-02-17}, abstract = {The Digital Services Act was finally published in the Official Journal of the European Union on 27 October 2022. This publication marks the end of a years-long drafting and negotiation process, and opens a new chapter: that of its enforcement, practicable access to justice, and potential to set global precedents. The Act has been portrayed as Europe’s new “Digital Constitution”, which affirms the primacy of democratic rulemaking over the private transnational ordering mechanisms of Big Tech. With it, the European Union aims once again to set a global standard in the regulation of the digital environment. But will the Digital Services Act be able to live up to its expectations, and under what conditions?}, keywords = {big tech, DSA, enforcement}, }

Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?

Digital services act, DSA, frontpage, Fundamental rights, Online platforms, terms and conditions

Bibtex

@online{Appelman2021, title = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?}, author = {Appelman, N. and Quintais, J. and Fahy, R.}, url = {https://verfassungsblog.de/power-dsa-dma-06/}, doi = {10.17176/20210901-233103-0}, year = {2021}, date = {2021-09-01}, keywords = {Digital services act, DSA, frontpage, Fundamental rights, Online platforms, terms and conditions}, }

Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?

Content moderation, Digital services act, DSA, frontpage, Fundamental rights

Bibtex

@online{Quintais2021f, title = {Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://dsa-observatory.eu/2021/05/31/article-12-dsa-will-platforms-be-required-to-apply-eu-fundamental-rights-in-content-moderation-decisions/}, year = {2021}, date = {2021-05-31}, keywords = {Content moderation, Digital services act, DSA, frontpage, Fundamental rights}, }