Heleen Janssen is a researcher within the Blockchain Society & Policy Research Team. She has a background and strong interest in law, technology and society. Her research focuses on the governance of emerging data intermediaries that seek to empower individuals, communities, or SMEs by offering them technical and legal data governance models. These models may provide an alternative to often opaque business models that entail systemic asymmetries of information and power. Heleen focuses on the role and influence of regulators in responding to questions of power and control that may arise within and around the data governance models of these technical/legal data intermediaries.
Heleen also works as an associate researcher and affiliate lecturer in the Compliant and Accountable Systems Research Group at the Department of Computer Science and Technology, University of Cambridge, and is a member of the Microsoft Cloud Computing Research Centre.
From 2019 to 2020, she worked as coordinating legal specialist in the team "Emerging Technologies, Public Values and Fundamental Rights" at the Department of Digital Government of the Ministry of the Interior. Her responsibilities included active participation on behalf of the Dutch government in the Ad Hoc Committee on Artificial Intelligence (CAHAI, Council of Europe). She also prepared the government's response to the EU's White Paper on Artificial Intelligence (2020), and commissioned a research group to develop a fundamental rights impact assessment for public bodies to use whenever they envisage the use of algorithms (e.g. in decision-making or in tendering procedures).
From 2004 to 2018, Heleen was a (senior) legal adviser at the Department of Constitutional Affairs & Legislation. Her key responsibilities included lead authorship of the modernisation of the constitutional right to communication secrecy, and the initiation of the project "Fundamental Rights and Algorithms", which resulted in a more strategic positioning of the Minister of the Interior on this topic. On behalf of the Dutch government, she led negotiations on the modernisation of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Council of Europe, 2012–2016).
Before working for the government, Heleen wrote her PhD thesis on constitutional interpretation (Maastricht University, 1997–2002). She was a visiting PhD candidate at Harvard Law School, Cambridge, US (2000). Earlier, she received a grant from the German Bundestag to study German law in Germany (1995–1996), at the University of Tübingen, the High Court in Düsseldorf, and the European Law Academy, Trier.
Bodó, B., Giannopoulou, A., Irion, K., Janssen, H.
In: Internet Policy Review, 10 (3), 2021.
The technological infrastructures enabling the collection, processing, and trading of data have fuelled a rapid innovation of data governance models. We differentiate between macro-, meso-, and micro-level models, which correspond to major political blocs; societal-, industry-, or community-level systems; and individual approaches, respectively. We focus on meso-level models, which coalesce around: (1) organisations prioritising their own interests over the interests of other stakeholders; (2) organisations offering technological and legal tools aiming to empower individuals; (3) community-based data intermediaries fostering collective rights and interests. In this article we assess these meso-level models and discuss their interaction with the macro-level legal frameworks that have evolved in the US, the EU, and China. The legal landscape has largely remained inconsistent and fragmented, with enforcement struggling to keep up with the latest developments. We argue, first, that the success of meso-logics is largely defined by global economic competition, and, second, that these meso-logics may put the EU's macro-level framework, with its mixed internal market and fundamental rights-oriented model, under pressure. We conclude that, given the relative absence of a strong macro-level framework and intensive competition among governance models at the meso level, it may be challenging to avoid compromises to the European macro framework.
Cobbe, J., Janssen, H., Singh, J.
Personal Data Stores: a user-centric privacy utopia?
In: Internet Policy Review, Forthcoming.
Cobbe, J., Janssen, H., Norval, C., Singh, J.
In: International Data Privacy Law, 10 (4), pp. 356-384, 2021.
When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aiming to give users more control over the processing of their personal data.
Personal Data Stores ("PDSs") represent a class of these technologies; PDSs provide users with a device, enabling them to capture, aggregate, and manage their personal data. The device provides tools for users to control and monitor access, sharing, and computation over the data on their device. The motivations for PDSs are described as (i) to assist users with their confidentiality and privacy concerns, and/or (ii) to provide opportunities for users to transact with or otherwise monetise their data.
While PDSs might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain categories of data, as proposed by some, may remain theoretical.
We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers and technologists.
Cobbe, J., Janssen, H., Seng Ah Lee, M., Singh, J.
In: Computer, 53 (10), pp. 47-58, 2020.
Driven by the promise of increased efficiencies and cost-savings, the public sector has shown much interest in automated decision-making (ADM) technologies. However, the rule of law and fundamental principles of good government are being lost along the way.
In: International Data Privacy Law, 10 (1), pp. 76-106, 2020.
Companies and other private institutions see great and promising profits in the use of automated decision-making ('ADM') for commercial, financial, or work-efficiency purposes. Meanwhile, ADM based on a data subject's personal data may (severely) impact that person's fundamental rights and freedoms. The General Data Protection Regulation (GDPR) provides a regulatory framework that applies whenever a controller considers and deploys ADM on individuals on the basis of their personal data. In the design stage of the intended ADM, Article 35(3)(a) obliges a controller to carry out a Data Protection Impact Assessment (DPIA), part of which is an assessment of the ADM's impact on individual rights and freedoms. Article 22 GDPR determines under what conditions ADM is allowed and endows data subjects with increased protection.
Research among companies of various sizes has shown that there is (legal) uncertainty about the interpretation of the GDPR (including the provisions relevant to ADM). The author's first objective is to identify ways forward by offering practical guidance for executing a DPIA that includes a sliding-scale assessment of impacts on data subjects' fundamental rights. This assessment is based on four benchmarks that should help assess the gravity of potential impacts: i) to determine the impact on the fundamental right(s) at stake; ii) to establish the context in which the ADM is used; iii) to establish who benefits from the use of personal data in the ADM; and iv) to establish who is in control of the data flows in the ADM. From these benchmarks, an overall fundamental rights impact assessment of the ADM should arise. A second objective is to indicate potential factors and measures that a controller should consider in its risk management after the assessment. The proposed approach should help foster fair, compliant, and trustworthy ADM, and contains directions for future research.
2003 (PhD dissertation, Maastricht University).