From the DMCA to the DSA: A Transatlantic Dialogue on Online Platform Regulation and Copyright

Verfassungsblog, 2024

Copyright, DMCA, DSA, Online platforms

Bibtex

@online{nokey, title = {From the DMCA to the DSA: A Transatlantic Dialogue on Online Platform Regulation and Copyright}, author = {Quintais, J.}, url = {https://verfassungsblog.de/from-the-dmca-to-the-dsa/?s=09}, year = {2024}, date = {2024-02-19}, journal = {Verfassungsblog}, keywords = {Copyright, DMCA, DSA, Online platforms}, }

Opinie: De DSA en desinformatie: meer dan censuur alleen [Opinion: The DSA and disinformation: more than just censorship]

Mediaforum, no. 5, p. 157, 2023

censuur, DSA

Bibtex

@article{nokey, title = {Opinie: De DSA en desinformatie: meer dan censuur alleen}, author = {Leerssen, P.}, url = {https://www.ivir.nl/publications/opinie-de-dsa-en-desinformatie-meer-dan-censuur-alleen/opinie_mediaforum_2023_5/}, year = {2023}, date = {2023-11-17}, journal = {Mediaforum}, number = {5}, pages = {157}, keywords = {censuur, DSA}, }

An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation

Computer Law & Security Review, vol. 48, 2023

Abstract

This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility techniques often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotions’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since it manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.

content curation, Content moderation, DSA, Online platforms, Transparency

Bibtex

@article{nokey, title = {An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation}, author = {Leerssen, P.}, url = {https://www.ivir.nl/publications/comment-an-end-to-shadow-banning-transparency-rights-in-the-digital-services-act-between-content-moderation-and-curation/endtoshadowbanning/}, doi = {https://doi.org/10.1016/j.clsr.2023.105790}, year = {2023}, date = {2023-04-11}, journal = {Computer Law & Security Review}, volume = {48}, abstract = {This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility techniques often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotions’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since it manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.}, keywords = {content curation, Content moderation, DSA, Online platforms, Transparency}, }

Putting the DSA into Practice: Enforcement, Access to Justice and Global Implications

Verfassungsbooks, 2023, ISBN: 9783757517960

Abstract

The Digital Services Act was finally published in the Official Journal of the European Union on 27 October 2022. This publication marks the end of a years-long drafting and negotiation process, and opens a new chapter: that of its enforcement, practicable access to justice, and potential to set global precedents. The Act has been portrayed as Europe’s new “Digital Constitution”, which affirms the primacy of democratic rulemaking over the private transnational ordering mechanisms of Big Tech. With it, the European Union aims once again to set a global standard in the regulation of the digital environment. But will the Digital Services Act be able to live up to its expectations, and under what conditions?

big tech, DSA, enforcement

Bibtex

@book{nokey, title = {Putting the DSA into Practice: Enforcement, Access to Justice and Global Implications}, author = {van Hoboken, J. and Quintais, J. and Appelman, N. and Fahy, R. and Buri, I. and Straub, M.}, url = {https://verfassungsblog.de/wp-content/uploads/2023/02/vHoboken-et-al_Putting-the-DSA-into-Practice.pdf}, doi = {https://doi.org/10.17176/20230208-093135-0}, year = {2023}, date = {2023-02-17}, publisher = {Verfassungsbooks}, isbn = {9783757517960}, abstract = {The Digital Services Act was finally published in the Official Journal of the European Union on 27 October 2022. This publication marks the end of a years-long drafting and negotiation process, and opens a new chapter: that of its enforcement, practicable access to justice, and potential to set global precedents. The Act has been portrayed as Europe’s new “Digital Constitution”, which affirms the primacy of democratic rulemaking over the private transnational ordering mechanisms of Big Tech. With it, the European Union aims once again to set a global standard in the regulation of the digital environment. But will the Digital Services Act be able to live up to its expectations, and under what conditions?}, keywords = {big tech, DSA, enforcement}, }

Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?

Digital services act, DSA, frontpage, Fundamental rights, Online platforms, terms and conditions

Bibtex

@online{Appelman2021, title = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?}, author = {Appelman, N. and Quintais, J. and Fahy, R.}, url = {https://verfassungsblog.de/power-dsa-dma-06/}, doi = {https://doi.org/10.17176/20210901-233103-0}, year = {2021}, date = {2021-09-01}, keywords = {Digital services act, DSA, frontpage, Fundamental rights, Online platforms, terms and conditions}, }

Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?

Content moderation, Digital services act, DSA, frontpage, Fundamental rights

Bibtex

@online{Quintais2021f, title = {Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://dsa-observatory.eu/2021/05/31/article-12-dsa-will-platforms-be-required-to-apply-eu-fundamental-rights-in-content-moderation-decisions/}, year = {2021}, date = {2021-05-31}, keywords = {Content moderation, Digital services act, DSA, frontpage, Fundamental rights}, }