Dun & Bradstreet: A Pyrrhic Victory for the Contestation of AI under the GDPR

The Law, Ethics & Policy of AI Blog, 2025

Abstract

The CJEU’s ruling in Dun & Bradstreet clarifies how the GDPR’s ‘right to an explanation’ should enable individuals to contest AI-based decision-making. The Court holds that explanations must be understandable while respecting trade secrets and privacy concerns in a balanced manner. However, it excludes the disclosure of in-depth technical information and introduces a burdensome balancing procedure. These requirements both strengthen and weaken the ability of individuals to independently assess impactful AI systems, making the ruling a Pyrrhic victory for contestation.

AI, GDPR, Privacy

The Right to an Explanation in Practice: Insights from Case Law for the GDPR and the AI Act

Law, Innovation, and Technology, pp: 1-36, 2025

Abstract

The right to an explanation under the GDPR has been much discussed in legal-doctrinal scholarship. This paper expands upon this academic discourse by providing insights into the questions that the application of the right to an explanation has raised in legal practice. By looking at cases brought before various judicial bodies and data protection authorities across the European Union, we discuss questions regarding the scope, content, and balancing exercise of the right to an explanation. We argue, moreover, that these questions also raise important interpretative issues regarding the right to an explanation under the AI Act. Like the GDPR, the AI Act’s right to an explanation leaves many legal questions unanswered. The insights from the case law already established under the GDPR can therefore help clarify how the AI Act’s right to an explanation should be applied in practice.

AI Act, case law, GDPR, Privacy

Expert perspectives on GDPR compliance in the context of smart homes and vulnerable persons

Information & Communications Technology Law, 2023

Abstract

This article presents information gathered through 21 semi-structured interviews conducted with UK, EU, and international professionals in the field of General Data Protection Regulation (GDPR) compliance and technology design, with a focus on the smart home context and vulnerable people using smart products. These discussions offered insights into how the two communities (lawyers and technologists) view intricate practical data protection challenges in this specific setting. The variety of interviewees made it possible to compare different approaches to data protection compliance. The interviews provided answers to the following questions: when organisations develop and/or deploy smart devices that use personal data, do they take into consideration the needs of vulnerable groups of people to comply with the GDPR? What are the underlying issues linked to the practical data protection law challenges faced by organisations working on smart devices used by vulnerable persons? How do experts perceive data protection law-related problems in this context?

Data protection, GDPR, Internet of Things, smart devices

SLAPPed by the GDPR: protecting public interest journalism in the face of GDPR-based strategic litigation against public participation

Journal of Media Law, vol. 14, iss.: 2, pp: 378-405, 2022

Abstract

Strategic litigation against public participation (SLAPP) is a threat to public interest journalism. Although a SLAPP is typically underpinned by a defamation claim, the GDPR may serve as an alternative basis. This paper explores how public interest journalism is protected, and could be better protected, from abusive GDPR proceedings. The GDPR addresses the tension between data protection and freedom of expression by providing for a journalistic exemption. However, narrow national implementations of this provision leave the GDPR open to abuse. By analysing GDPR proceedings against Forbes Hungary, the paper illustrates how the GDPR can be instrumentalised as a SLAPP strategy. As European anti-SLAPP initiatives are fine-tuned, abusive GDPR proceedings need to be recognised as an emerging form of SLAPP, which requires more attention to inadequate engagement with European freedom of expression standards in national implementations of the GDPR, to data protection authorities’ role in facilitating SLAPPs, and to the chilling effects of GDPR sanctions.

Data protection, Freedom of expression, GDPR, journalistic exemption, SLAPPs

The Right to Lodge a Data Protection Complaint: Ok, But Then What? An empirical study of current practices under the GDPR

European Data Protection Scholars Network
2022

Abstract

This study examines Data Protection Authorities’ (DPAs’) current practices related to their obligation to facilitate the submission of complaints, paying special attention to the connection between this obligation and the right to an effective judicial remedy against DPAs. It combines legal analysis and the observation of DPA websites with insights obtained from the online public register of decisions adopted under the ‘one-stop-shop’ mechanism. This study was commissioned by Access Now.

Data Protection Authorities, GDPR, remedy, right to an effective remedy

A Matter of (Joint) Control? Virtual assistants and the General Data Protection Regulation

Computer Law & Security Review, vol. 45, 2022

Abstract

This article provides an overview and critical examination of the rules for determining who qualifies as controller or joint controller under the General Data Protection Regulation. Using Google Assistant – an artificial intelligence-driven virtual assistant – as a case study, we argue that these rules are overreaching and difficult to apply in the present-day information society and Internet of Things environments. First, as a consequence of recent developments in case law and supervisory guidance, these rules lead to a complex and ambiguous test to determine (joint) control. Second, due to advances in technological applications and business models, it is increasingly challenging to apply such rules to contemporary processing operations. In particular, as illustrated by the Google Assistant, individuals will likely be qualified as joint controllers, together with Google and also third-party developers, for at least the collection and possible transmission of other individuals’ personal data via the virtual assistant. Third, we identify follow-on issues relating to the apportionment of responsibilities between joint controllers and the effective and complete protection of data subjects. We conclude by questioning whether the framework for determining who qualifies as controller or joint controller is future-proof and normatively desirable.

GDPR, Privacy, Right to data protection

The General Data Protection Regulation through the lens of digital sovereignty

2022

Abstract

This short contribution presents and discusses the European Union’s (EU) General Data Protection Regulation (GDPR) through the lens of ‘digital sovereignty’. When high-ranking representatives of EU institutions endorsed digital sovereignty, this was interpreted as a signpost of a new-found assertiveness in EU digital policy. However, digital sovereignty is conceptually fuzzy and is used to animate a wide spectrum of geopolitical, normative, and industrial ambitions. In the context of the GDPR, it makes sense to operationalize digital sovereignty as the ability of rules to assert authority in a global and interdependent digital ecosystem. Conceived this way, I reflect on how the GDPR wields transnational capacity by design, in the form of safeguards against inbound and outbound circumvention.

Digital sovereignty, GDPR, transfer of personal data, transnational capacity

Personal data ordering in context: the interaction of meso-level data governance regimes with macro frameworks

Internet Policy Review, vol. 10, num: 3, 2021

Abstract

The technological infrastructures enabling the collection, processing, and trading of data have fuelled a rapid innovation of data governance models. We differentiate between macro-, meso-, and micro-level models, which correspond to major political blocs; societal-, industry-, or community-level systems; and individual approaches, respectively. We focus on meso-level models, which coalesce around: (1) organisations prioritising their own interests over the interests of other stakeholders; (2) organisations offering technological and legal tools aiming to empower individuals; and (3) community-based data intermediaries fostering collective rights and interests. In this article we assess these meso-level models and discuss their interaction with the macro-level legal frameworks that have evolved in the US, the EU, and China. The legal landscape has largely remained inconsistent and fragmented, with enforcement struggling to keep up with the latest developments. We argue, first, that the success of meso-logics is largely defined by global economic competition and, second, that these meso-logics may put the EU’s macro-level framework, with its mixed internal market and fundamental rights-oriented model, under pressure. We conclude that, given the relative absence of a strong macro-level framework and intense competition between governance models at the meso level, it may be challenging to avoid compromises to the European macro framework.

Data governance, Data intermediaries, Data ordering, Data sovereignty, GDPR

Personalised pricing: The demise of the fixed price?

Abstract

An online seller or platform is technically able to offer every consumer a different price for the same product, based on the information it has about that customer. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and raises the question of whether regulation is needed. In this chapter, we discuss the underlying basis of price discrimination in economic theory and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination and requires companies not only to disclose their use of price discrimination, but also to ask customers for their prior consent. Industry practice, however, shows no adoption of these two principles.

algorithms, GDPR, data protection, Personalisation, Price discrimination, Privacy

Decentralised Data Processing: Personal Data Stores and the GDPR

Janssen, H., Cobbe, J., Norval, C. & Singh, J.
International Data Privacy Law, vol. 10, num: 4, pp: 356-384, 2021

Abstract

When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies that aim to strengthen user control over the processing of their personal data. Personal Data Stores (“PDSs”) represent a class of these technologies; PDSs provide users with a device enabling them to capture, aggregate, and manage their personal data. The device provides tools for users to control and monitor access, sharing, and computation over the data on their device. The motivations for PDSs are described as (i) assisting users with their confidentiality and privacy concerns, and/or (ii) providing opportunities for users to transact with or otherwise monetise their data. While PDSs might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to responsibilities under the General Data Protection Regulation (GDPR). More specifically, the allocation of responsibilities among the key parties involved in PDS ecosystems is unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain categories of data, as proposed by some, may remain theoretical. We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers, and technologists.

GDPR, Privacy
