Keyword: Privacy
Personal Data Stores and the GDPR’s lawful grounds for processing personal data
Abstract
Personal Data Stores (‘PDSs’) provide users with a (physical or virtual) device within which they themselves can, in theory, capture, aggregate, and control access to and transfer of their personal data. Their aim is to empower users in relation to their personal data, strengthening their opportunities for data protection and privacy, and/or facilitating trade and monetisation. As PDS technologies develop, it is important to consider their role in relation to issues of data protection. The General Data Protection Regulation requires that the processing of user data be predicated on one of its defined lawful bases, and the Regulation does not favour any one basis over another. We explore how PDS architectures relate to these lawful bases, and observe that they tend to favour the bases that require direct user involvement. We then consider the issues that these architectural choices surrounding the lawful grounds may entail.
Links
Data protection, decentralisation, lawful grounds for processing, personal data stores, Privacy, Transparency
The right to encryption: Privacy as preventing unlawful access
Abstract
Encryption technologies are a fundamental building block of modern digital infrastructure, but plans to curb these technologies continue to spring up. Even in the European Union, where their application is by now firmly embedded in legislation, lawmakers are again calling for measures which would impact these technologies. One of the most important arguments in this debate is human rights, most notably the rights to privacy and to freedom of expression. And although some authors have in the past explored how encryption technologies support human rights, this connection is not yet firmly grounded in an analysis of European human rights case law. This contribution aims to fill that gap, developing a framework for assessing restrictions of encryption technologies under the rights to privacy and freedom of expression as protected under the European Convention on Human Rights (the Convention) and the Charter of Fundamental Rights of the European Union (the Charter). The first section discusses the relevant function of encryption technologies: restricting access to information (confidentiality). The second section provides an overview of governmental policies and practices impacting these technologies. The third section discusses the case law on the rights to privacy, data protection, and freedom of expression, arguing that these rights are not only about ensuring lawful access by governments to protected information, but also about preventing unlawful access by others. Because encryption technologies are an important means of reducing the risk of such unlawful access, it is then proposed that this risk be central to the assessment of governance measures in the field of encryption technologies.
The article concludes by recommending that states perform an in-depth assessment of this risk when proposing new measures, and that courts, when reviewing such measures, likewise place the risk of unlawful access at the centre of their analysis of interference and proportionality.
Links
communications confidentiality, encryption, Freedom of expression, Human rights, Privacy, unlawful access
Fundamental rights assessment of the framework for detection orders under the CSAM proposal
Shielding citizens? Understanding the impact of political advertisement transparency information
Abstract
Online targeted advertising leverages an information asymmetry between the advertiser and the recipient. Policymakers in the European Union and the United States aim to decrease this asymmetry by requiring transparency information alongside political advertisements, in the hope of activating citizens’ persuasion knowledge. However, the proposed regulations all take different directions with regard to the required content of transparency information. Consequently, not all proposed interventions will be (equally) effective. Moreover, there is a chance that transparency information has additional consequences, such as increasing privacy concerns or decreasing advertising effectiveness. Using an online experiment (N = 1331), this study addresses these challenges and finds that two regulatory interventions (DSA and HAA) increase persuasion knowledge, while the chance of raising privacy concerns or lowering advertisement effectiveness is present but slim. Results suggest transparency information interventions have some promise, but at the same time underline the limitations of user-facing transparency interventions.
Links
information disclosures, online advertising, persuasion knowledge, political attitudes, Privacy, Transparency
Annotation to Hoge Raad 25 February 2022 (Google)
Abstract
Privacy law. General Data Protection Regulation (GDPR); request for removal of search results; sensitive personal data (Art. 10 GDPR); standard of review. Litigation costs in GDPR cases; effective remedy (Art. 79 GDPR and Art. 47 of the Charter of Fundamental Rights of the EU).
Links
Annotaties, AVG, Privacy, zoekresultaten
Annotation to Hoge Raad 3 December 2021 (Hoist Finance AB)
Abstract
Preliminary ruling under Art. 392 of the Dutch Code of Civil Procedure. General Data Protection Regulation (GDPR). Legal basis for the processing of personal data in the BKR credit registration system; right to erasure; right to object.
Links
Annotaties, AVG, Privacy
Defining the scope of AI ADM system risk assessment
A matter of (joint) control? Virtual assistants and the General Data Protection Regulation
Abstract
This article provides an overview and critical examination of the rules for determining who qualifies as controller or joint controller under the General Data Protection Regulation. Using Google Assistant – an artificial intelligence-driven virtual assistant – as a case study, we argue that these rules are overreaching and difficult to apply in the present-day information society and Internet of Things environments. First, as a consequence of recent developments in case law and supervisory guidance, these rules lead to a complex and ambiguous test to determine (joint) control. Second, due to advances in technological applications and business models, it is increasingly challenging to apply such rules to contemporary processing operations. In particular, as illustrated by the Google Assistant, individuals will likely be qualified as joint controllers, together with Google and also third-party developers, for at least the collection and possible transmission of other individuals’ personal data via the virtual assistant. Third, we identify follow-on issues relating to the apportionment of responsibilities between joint controllers and the effective and complete protection of data subjects. We conclude by questioning whether the framework for determining who qualifies as controller or joint controller is future-proof and normatively desirable.
Links
frontpage, GDPR, Privacy, Recht op gegevensbescherming