Heleen Janssen is a researcher within the Blockchain Society & Policy Research Team. She has a background and strong interest in law, technology and society. Her research revolves around the governance of emerging data intermediaries that seek to empower individuals, communities, or perhaps SMEs by offering them technical and legal data governance models. These models may offer an alternative to the often opaque business models that entail systemic asymmetries of information and power. Heleen focuses on the role and influence of regulators in responding to questions of power and control that may arise in and around the data governance models of these tech/legal data intermediaries.
Heleen also works as an associate researcher and affiliate lecturer at the Department of Computer Science and Technology of the University of Cambridge, in the Compliant and Accountable Systems Research Group, and is a member of the Microsoft Cloud Computing Research Centre. On 1 September 2021 she was appointed for four years as a member of the independent Advisory Committee on Data Protection of the municipality of Amsterdam (Commissie Persoonsgegevens Amsterdam, CPA).
From 2019 to 2020, she worked as coordinating legal specialist in the team "Emerging Technologies, Public Values and Fundamental Rights" at the Department of Digital Government of the Ministry of the Interior. Her responsibilities included participating on behalf of the Dutch government in the Ad Hoc Committee on Artificial Intelligence (CAHAI, Council of Europe). She also prepared the government's response to the EU's White Paper on Artificial Intelligence (2020), and commissioned a research group to develop a fundamental rights impact assessment for public bodies to use whenever they envisage deploying algorithms (e.g. in their decision-making, or in tendering procedures).
From 2004 to 2018, Heleen was a (senior) legal adviser at the Department of Constitutional Affairs & Legislation. Her key responsibilities included lead authorship of the modernisation of the constitutional right to communication secrecy, and initiating the project "fundamental rights and algorithms", which resulted in a more strategic positioning of the Minister of the Interior on this topic. On behalf of the Dutch government, she led negotiations on the modernisation of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Council of Europe, 2012-2016).
Before working for the government, Heleen wrote her PhD thesis on constitutional interpretation (Maastricht University, 1997-2002) and was a visiting PhD candidate at Harvard Law School, Cambridge, US (2000). Earlier, she received a grant from the German Bundestag to study German law in Germany (1995-1996; University of Tübingen; High Court in Düsseldorf; European Law Academy, Trier).
Bodó, B., Janssen, H.
In: Policy and Society, 2022.
Emerging technologies permeate and potentially disrupt a wide spectrum of our social, economic, and political relations. Various state institutions, including education, law enforcement, and healthcare, increasingly rely on technical components, such as automated decision-making systems, e-government systems, and other digital tools, to provide cheap, efficient public services, and supposedly fair, transparent, disinterested, and accountable public administration. The increased interest in various blockchain-based solutions, from central bank digital currencies, via tokenized educational credentials and distributed ledger-based land registries, to self-sovereign identities, is the latest, still mostly unwritten chapter in a long history of standardized, objectified, automated, technocratic, and technologized public administration. The rapid, often unplanned, and uncontrolled technologization of public services (as happened in the hasty adoption of distance-learning and teleconferencing systems during Corona Virus Disease (COVID) lockdowns) raises complex questions about the use of novel technological components, which may or may not be ultimately adequate for the task for which they are used. The question of whether we can trust the technical infrastructures the public sector uses when providing public services is a central concern in an age where trust in government is declining: if the government's artificial intelligence system that detects welfare fraud fails, the public's confidence in the government ultimately suffers. In this paper, we provide a critical assessment of how the use of potentially untrustworthy (private) technological systems, including blockchain-based systems, in the public sector may affect trust in government. We then propose several policy options to protect trust in government even if some of its technological components prove fundamentally untrustworthy.
Janssen, H.
Persoonlijke PIMS: privacyfort of luchtkasteel?
In: Privacy & Informatie, no. 5, pp. 214-225, 2021.
Personal data are currently often processed in opaque ways, beyond the control of the data subjects. Personal information management systems (PIMS) aim to offer data subjects technological tools that give them more control over the processing of their personal data. PIMS present themselves as an alternative to the current, 'centralised' mode of data processing, in which (large) organisations collect, analyse, and pass on personal data to third parties, usually in opaque ways. PIMS offer data subjects technical instruments with which they can themselves monitor and determine when, and to whom, they transfer their data, and/or have analyses performed on their data. Although the arguments for this 'decentralisation' sound attractive, questions arise as to the extent to which PIMS can effectively remedy the problems of current data processing. This article focuses on the question to what extent PIMS can actually counter the power asymmetry between data subjects and large organisations that has arisen from current data processing practice. PIMS can offer some insight into and control over data processing, but the power asymmetry will nevertheless largely persist.
Bodó, B., Giannopoulou, A., Irion, K., Janssen, H.
In: Internet Policy Review, vol. 10, no. 3, 2021.
The technological infrastructures enabling the collection, processing, and trading of data have fuelled a rapid innovation of data governance models. We differentiate between macro, meso, and micro level models, which correspond to major political blocks; societal-, industry-, or community level systems, and individual approaches, respectively. We focus on meso-level models, which coalesce around: (1) organisations prioritising their own interests over interests of other stakeholders; (2) organisations offering technological and legal tools aiming to empower individuals; (3) community-based data intermediaries fostering collective rights and interests. In this article we assess these meso-level models, and discuss their interaction with the macro-level legal frameworks that have evolved in the US, the EU, and China. The legal landscape has largely remained inconsistent and fragmented, with enforcement struggling to keep up with the latest developments. We argue, first, that the success of meso-logics is largely defined by global economic competition, and, second, that these meso-logics may potentially put the EU’s macro-level framework with its mixed internal market and fundamental rights-oriented model under pressure. We conclude that, given the relative absence of a strong macro level-framework and an intensive competition of governance models at meso-level, it may be challenging to avoid compromises to the European macro framework.
Cobbe, J., Janssen, H., Singh, J.
Personal Data Stores: a user-centric privacy utopia?
In: Internet Policy Review, Forthcoming.
Cobbe, J., Janssen, H., Norval, C., Singh, J.
In: International Data Privacy Law, vol. 10, no. 4, pp. 356-384, 2021.
When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aimed at increasing users' control over the processing of their personal data.
Personal Data Stores ("PDSs") represent a class of these technologies; PDSs provide users with a device, enabling them to capture, aggregate and manage their personal data. The device provides tools for users to control and monitor access, sharing and computation over data on their device. The motivations for PDSs are described as (i) assisting users with their confidentiality and privacy concerns, and/or (ii) providing opportunities for users to transact with or otherwise monetise their data.
While PDSs might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while the technical means proposed by some for identifying certain categories of data may remain theoretical.
We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers and technologists.
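By way of illustration only (this sketch is not from the paper, and all names in it are hypothetical): the core PDS idea described above, user data held on the user's own device, with the user granting, revoking, and auditing access by third parties, could be modelled minimally as follows.

```python
# Minimal, illustrative model of a Personal Data Store (PDS): data stays
# under the owner's control, and third parties only read it under an
# explicit, revocable grant. Every access attempt is logged for audit.
from dataclasses import dataclass, field

@dataclass
class PersonalDataStore:
    owner: str
    _data: dict = field(default_factory=dict)
    _grants: dict = field(default_factory=dict)   # data key -> set of parties
    _audit_log: list = field(default_factory=list)

    def store(self, key, value):
        self._data[key] = value

    def grant(self, key, party):
        self._grants.setdefault(key, set()).add(party)

    def revoke(self, key, party):
        self._grants.get(key, set()).discard(party)

    def access(self, key, party):
        # Check the grant, record the attempt, and only then release data.
        allowed = party in self._grants.get(key, set())
        self._audit_log.append((party, key, "granted" if allowed else "denied"))
        if not allowed:
            raise PermissionError(f"{party} has no grant for {key!r}")
        return self._data[key]

pds = PersonalDataStore(owner="alice")
pds.store("heart_rate", [62, 64, 61])
pds.grant("heart_rate", "fitness_app")
print(pds.access("heart_rate", "fitness_app"))  # → [62, 64, 61]
```

A real PDS platform adds encryption, local computation over the data, and contractual terms; the sketch only shows why the GDPR questions above arise, since the user, the platform provider, and each grantee all touch the processing in different ways.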
Cobbe, J., Janssen, H., Seng Ah Lee, M., Singh, J.
In: Computer, vol. 53, no. 10, pp. 47-58, 2020.
Driven by the promise of increased efficiencies and cost-savings, the public sector has shown much interest in automated decision-making (ADM) technologies. However, the rule of law and fundamental principles of good government are being lost along the way.
Janssen, H.
In: International Data Privacy Law, vol. 10, no. 1, pp. 76-106, 2020.
Companies and other private institutions see great and promising profits in the use of automated decision-making ('ADM') for commercial, financial, or work-efficiency purposes. Meanwhile, ADM based on a data subject's personal data may (severely) impact that person's fundamental rights and freedoms. The General Data Protection Regulation (GDPR) provides a regulatory framework that applies whenever a controller considers and deploys ADM on individuals on the basis of their personal data. In the design stage of the intended ADM, Article 35(3)(a) obliges a controller to carry out a Data Protection Impact Assessment (DPIA), part of which is an assessment of the ADM's impact on individual rights and freedoms. Article 22 GDPR determines under what conditions ADM is allowed and endows data subjects with increased protection.
Research among companies of various sizes has shown that there is (legal) uncertainty about the interpretation of the GDPR (including the provisions relevant to ADM). The author's first objective is to identify ways forward by offering practical guidance for executing a DPIA that includes a sliding-scale assessment of impacts on data subjects' fundamental rights. This assessment is based on four benchmarks that should help gauge the gravity of potential impacts: (i) determining the fundamental right(s) at stake; (ii) establishing the context in which the ADM is used; (iii) establishing who benefits from the use of personal data in the ADM; and (iv) establishing who controls the data flows in the ADM. From these benchmarks, an overall fundamental rights impact assessment of the ADM should arise. A second objective is to indicate potential factors and measures that a controller should consider in its risk management after the assessment. The proposed approach should help foster fair, compliant and trustworthy ADM, and contains directions for future research.
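As a purely hypothetical illustration (not the article's method): a four-benchmark, sliding-scale assessment of the kind described above could be operationalised as a simple scoring function. The 0-3 scale, the equal weighting, and the gravity bands are all invented for this sketch.

```python
# Illustrative sliding-scale impact scoring over four benchmarks.
# Scale (0 = negligible .. 3 = severe), weights, and bands are assumptions.
def dpia_gravity(right_at_stake, context, beneficiary, data_control):
    """Each argument scores one benchmark from 0 (negligible) to 3 (severe)."""
    scores = {
        "fundamental right(s) at stake": right_at_stake,
        "context of the ADM's use": context,
        "beneficiary of the data use": beneficiary,
        "control over the data flows": data_control,
    }
    for name, s in scores.items():
        if not 0 <= s <= 3:
            raise ValueError(f"score for {name!r} out of range")
    # Equal-weight average of the four benchmarks, mapped to a gravity band.
    overall = sum(scores.values()) / len(scores)
    band = "low" if overall < 1 else "medium" if overall < 2 else "high"
    return overall, band

overall, band = dpia_gravity(right_at_stake=3, context=2,
                             beneficiary=1, data_control=2)
print(overall, band)  # → 2.0 high
```

In practice, such an assessment is qualitative and context-dependent; the point of the sketch is only that separate benchmark judgments can be combined into one overall gravity estimate that then drives the controller's risk-management measures.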
Janssen, H.
2003, (Dissertation, Maastricht University).