‘Must-carry’, special treatment and freedom of expression on online platforms: a European story

Kuczerawy, A. & Quintais, J.
European Law Open, pp. 1-34, 2025

Abstract

This paper examines the evolution and implications of ‘must-carry’ obligations in the regulation of online platforms, with a focus on Europe. These obligations, which restrict platforms’ discretion to remove or deprioritise certain content, represent a novel regulatory response to the growing power of platforms in shaping public discourse. The analysis traces developments at EU and national levels. At the EU level, it considers rejected must-carry proposals during the drafting of the Digital Services Act (DSA) and the adoption of Article 18 of the European Media Freedom Act (EMFA), which grants privileges to recognised media service providers. At the national level, it examines Germany’s prohibition on content discrimination, the UK’s Online Safety Act, and Poland’s abandoned legislative proposal on freedom of expression online. Case law from courts in the Netherlands, Germany, Italy, and Poland further illustrates the emergence of judicially crafted duties resembling must-carry obligations. The paper argues that these measures are best understood as special treatment rules that privilege particular speakers, notably media organisations and politicians, by limiting platform autonomy in content moderation. While intended to safeguard pluralism and access to trustworthy information, such rules risk creating a two-tier system of expression in which established voices receive disproportionate protection while ordinary users remain vulnerable. Protections for politicians raise concerns about shielding powerful actors from justified moderation, whereas media privileges, though more defensible, remain limited in scope and potentially counterproductive, especially when exploited by outlets disseminating disinformation. The conclusion is that compelled inclusion and preferential treatment are unlikely to offer sustainable solutions to the structural imbalances between platforms, media providers, and politicians. More durable approaches should focus on strengthening journalism through financial and structural support, fostering innovation and local media, and prioritising user empowerment measures. Only systemic safeguards of this kind can effectively promote pluralism, accountability, and resilience in the digital public sphere.

Digital Services Act (DSA), European Media Freedom Act, Freedom of expression, must carry, platform regulation

Fundamental Rights in Out-of-Court Dispute Settlement under the Digital Services Act

Ruschemeier, H. & Quintais, J.
2025

Abstract

This paper argues that certified out-of-court dispute settlement (ODS) bodies under Article 21 of the Digital Services Act (DSA) should apply a structured fundamental rights review to platform content moderation, operationalised through the concept of case salience. Situating ODS within the DSA's broader regulatory architecture, particularly Articles 14(4), 17, and 20, the paper contends that Article 21 provides the procedural complement to Article 14(4)'s substantive duty to enforce terms of service "diligently, objectively and proportionately, with due regard to fundamental rights." Rather than extending the Charter of Fundamental Rights of the European Union (CFR) horizontally in a direct sense, ODS bodies give effect to Charter-conforming statutory obligations owed by platforms, interpreted in light of Article 52(1) CFR. Drawing on jurisprudence from the Court of Justice of the European Union (CJEU), the European Court of Human Rights (ECtHR), and national courts, the paper shows how freedom of expression and information interacts with countervailing rights, such as the freedom to conduct a business, privacy and data protection, and human dignity, in the context of online moderation. It proposes an intensity-of-review model: a deeper, merits-based proportionality analysis for high-impact cases (e.g. political speech, account suspensions, issues of systemic relevance), and a lighter, procedural-sufficiency check for routine disputes. The paper emphasises that ODS remains non-judicial and operates without prejudice to Article 47 CFR and the availability of national court remedies. Over time, reasoned ODS decisions could evolve into a body of soft law, enhancing consistency and transparency in platform accountability. Ultimately, ODS bodies under the DSA represent a novel experiment in multi-actor rights protection. Their success will depend on whether they can reconcile accessibility, efficiency, and rights-based rigour, ensuring that content moderation in Europe evolves in line with the constitutional values of the Charter.

Content moderation, Digital Services Act (DSA), Fundamental rights

Waiting for the DSA’s Big Enforcement Moment

DSA Observatory, 2025

Abstract

This blog post explores the issue of DSA enforcement by the European Commission, focusing on the law’s systemic risk management provisions. It first briefly sketches the Commission’s role in regulatory oversight of the systemic risk framework and then sums up enforcement efforts to date, also considering the role of geopolitics in the Commission’s enforcement calculus.

Digital Services Act (DSA)

Trading nuance for scale? Platform observability and content governance under the DSA

Papaevangelou, C. & Votta, F.
Internet Policy Review, vol. 14, iss. 3, 2025

Abstract

The Digital Services Act (DSA) marks a paradigmatic shift in platform governance, introducing mechanisms like the Statement of Reasons (SoRs) database to foster transparency and observability of platforms’ content moderation practices. This study investigates the DSA Transparency Database as a regulatory mechanism for enabling observability, focusing on the automation and territorial application of content moderation across the EU/EEA. By analysing 439 million SoRs from eight Very Large Online Platforms (VLOPs), we find that the vast majority of content moderation decisions are enforced automatically and uniformly across the EU/EEA. We also identify significant discrepancies in content moderation strategies across VLOPs, with TikTok, YouTube and X exhibiting the most distinct practices, which are further analysed in the paper. Our findings reveal a strong correlation between automation and the speed of content moderation, and between automation and the territorial scope of decisions. We also highlight several limitations of the database, notably the lack of language-specific data and inconsistencies in how SoRs are reported by VLOPs. We conclude that despite such shortcomings, the DSA and its Transparency Database may enable a wider constellation of stakeholders to participate in platform governance, paving the way for more meaningful platform observability.

Content moderation, Digital Services Act (DSA), platform governance, Transparency

The Regulation of Disinformation Under the Digital Services Act

Media and Communication, vol. 13, 2025

Abstract

This article critically examines the regulation of disinformation under the EU’s Digital Services Act (DSA). It begins by analysing how the DSA applies to disinformation, discussing, on the one hand, how the DSA facilitates the removal of illegal disinformation and, on the other, how it can protect users’ freedom of expression against the removal of certain content classified as disinformation. The article then moves to the DSA’s special risk-based rules, which apply to Very Large Online Platforms in relation to the mitigation of systemic risks relating to disinformation and are to be enforced by the European Commission. We analyse recent regulatory action by the Commission in tackling disinformation within its DSA competencies, and assess these actions from a fundamental rights perspective, focusing on freedom of expression as guaranteed under the EU Charter of Fundamental Rights and the European Convention on Human Rights.

Digital Services Act (DSA), disinformation, Freedom of expression, Online platforms

More Than Justifications: An Analysis of Information Needs in Explanations and Motivations to Disable Personalization

Resendez, V., Kieslich, K., Helberger, N. & Vreese, C.H. de
Journalism Studies, vol. 26, iss. 11, pp. 1304-1312, 2025

Abstract

There is consensus that algorithmic news recommenders should be explainable to inform news readers of potential risks. However, debates continue over which information users need and which stakeholders should access this information. As the debate continues, researchers also call for more control over algorithmic news recommender systems, for example, by turning off personalized recommendations. Despite this call, it is unclear to what extent news readers will use this feature. To add nuance to the discussion, we analyzed 586 responses to two open-ended questions: (i) which information needs contribute to trustworthiness perceptions of news recommendations, and (ii) whether people want the ability to turn off personalization. Our results indicate that most participants found knowing the sources of news items important for trusting a recommendation system. Additionally, more than half of the participants were inclined to disable personalization. The most common reasons to turn off personalization included concerns about bias or filter bubbles and a preference to consume generalized news. These findings suggest that news readers have different information needs for explanations when interacting with an algorithmic news recommender and that many news readers prefer to disable personalized news recommendations.

control, Digital Services Act (DSA), news recommenders, Personalisation, trust

Online Behavioural Advertising, Consumer Empowerment and Fair Competition: Are the DSA Transparency Obligations the Right Answer?

Izyumenko, E., Senftleben, M., Schutte, N., Smit, E.G., Noort, G. van & Velzen, L. van
Journal of European Consumer and Market Law (EuCML), vol. 14, iss. 2, pp. 46-59, 2025

Competition law, Consumer law, Digital Services Act (DSA), online behavioural advertising, Transparency

Generative AI, Copyright and the AI Act

Computer Law & Security Review, vol. 56, art. 106107, 2025

Abstract

This paper provides a critical analysis of the Artificial Intelligence (AI) Act's implications for the European Union (EU) copyright acquis, aiming to clarify the complex relationship between AI regulation and copyright law while identifying areas of legal ambiguity and gaps that may influence future policymaking. The discussion begins with an overview of fundamental copyright concerns related to generative AI, focusing on issues that arise during the input, model, and output stages, and how these concerns intersect with the text and data mining (TDM) exceptions under the Copyright in the Digital Single Market Directive (CDSMD). The paper then explores the AI Act's structure and key definitions relevant to copyright law. The core analysis addresses the AI Act's impact on copyright, including the role of TDM in AI model training, the copyright obligations imposed by the Act, requirements for respecting copyright law—particularly TDM opt-outs—and the extraterritorial implications of these provisions. It also examines transparency obligations, compliance mechanisms, and the enforcement framework. The paper further critiques the current regime's inadequacies, particularly concerning the fair remuneration of creators, and evaluates potential improvements such as collective licensing and bargaining. It also assesses legislative reform proposals, such as statutory licensing and AI output levies, and concludes with reflections on future directions for integrating AI governance with copyright protection.

AI Act, Content moderation, Copyright, Digital Services Act (DSA), Generative AI, Text and Data Mining (TDM), Transparency

“Must-carry”, Special Treatment and Freedom of Expression on Online Platforms: A European Story

Kuczerawy, A. & Quintais, J.
2024

Abstract

This paper examines the role of "must-carry" obligations in the regulation of online platforms, arguing that these obligations are better understood as special treatment rules rather than direct analogues of traditional broadcasting regulation. By analysing the development of such rules within the European Union, particularly through the Digital Services Act (DSA) and the European Media Freedom Act (EMFA), the paper explores how these provisions aim to safeguard freedom of expression, ensure access to trustworthy information, enhance media pluralism, and regulate platform behaviour. The analysis extends to national-level laws and court decisions in Germany, the Netherlands, the United Kingdom, and Poland, illustrating how these countries have grappled with similar challenges in applying and contextualising special treatment rules. Through a detailed examination of these frameworks, the paper critiques the risks of these rules, including their potential to entrench power imbalances, amplify state narratives, and complicate efforts to counter disinformation. Additionally, the paper highlights the broader implications of granting privileged status to legacy media and political actors, questioning whether such measures align with democratic principles and the rule of law. Ultimately, the paper argues that while these rules may offer a response to platform dominance, their implementation risks undermining the equality of speech and shifting the focus of freedom of expression toward a privilege for select groups.

Content moderation, Digital Services Act (DSA), EU law, European Media Freedom Act, must carry, platform regulation

Contesting personalized recommender systems: a cross-country analysis of user preferences

Starke, C., Metikoš, L., Helberger, N. & Vreese, C.H. de
Information, Communication & Society, 2024

Abstract

Very Large Online Platforms (VLOPs) such as Instagram, TikTok, and YouTube wield substantial influence over digital information flows using sophisticated algorithmic recommender systems (RS). As these systems curate personalized content, concerns have emerged about their propensity to amplify polarizing or inappropriate content, spread misinformation, and infringe on users’ privacy. To address these concerns, the European Union (EU) has recently introduced a new regulatory framework through the Digital Services Act (DSA). These policies are designed to bolster user agency by offering contestability mechanisms against personalized RS. As their effectiveness ultimately requires individual users to take specific actions, this empirical study investigates users’ intention to contest personalized RS. The results of a pre-registered survey across six countries – Brazil, Germany, Japan, South Korea, the UK, and the USA – involving 6,217 respondents yield key insights: (1) Approximately 20% of users would opt out of using personalized RS, (2) the intention for algorithmic contestation is associated with individual characteristics such as users’ attitudes towards and awareness of personalized RS as well as their privacy concerns, (3) German respondents are particularly inclined to contest personalized RS. We conclude that amending Art. 38 of the DSA may enhance its effectiveness in fostering accessible user contestation and algorithmic transparency.

Algorithmic contestation, Digital Services Act (DSA), Personalisation, recommender systems
