Heleen Janssen
Heleen Janssen is a researcher within the Blockchain Society & Policy Research Team. She has a background and strong interest in law, technology, and society. Her research focuses on the governance of emerging data intermediaries that seek to empower individuals, communities, and SMEs by offering them technical and legal data governance models. These models may provide an alternative to often opaque business models that entail systemic asymmetries of information and power. Heleen will focus on the role and influence of regulators in responding to questions of power and control that may arise within and around the data governance models of these technical/legal data intermediaries.
Heleen also works as an associate researcher and affiliate lecturer in the Compliant and Accountable Systems Research Group at the Department of Computer Science and Technology, University of Cambridge, and is a member of the Microsoft Cloud Computing Research Centre.
From 2019 to 2020, she worked as a coordinating legal specialist in the team "Emerging Technologies, Public Values and Fundamental Rights" at the Department of Digital Government of the Ministry of the Interior. Her responsibilities included participating on behalf of the Dutch government in the Ad Hoc Committee on Artificial Intelligence (CAHAI, Council of Europe). She also prepared the government's response to the EU's White Paper on Artificial Intelligence (2020) and commissioned a research group to develop a fundamental rights impact assessment for public bodies to use whenever they envisage the use of algorithms (e.g. in their decision-making or in tendering procedures).
From 2004 to 2018, Heleen was a (senior) legal adviser at the Department of Constitutional Affairs & Legislation. Her key responsibilities included lead authorship of the modernisation of the constitutional right to communication secrecy and the initiation of the project "Fundamental Rights and Algorithms", which resulted in a more strategic positioning of the Minister of the Interior on this topic. On behalf of the Dutch government, she led negotiations on the modernisation of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Council of Europe, 2012–2016).
Before working for the government, Heleen wrote her PhD thesis on constitutional interpretation (Maastricht University, 1997–2002) and was a visiting PhD candidate at Harvard Law School, Cambridge, US (2000). Earlier, she received a grant from the German Bundestag to study German law in Germany (1995–1996; University of Tübingen, High Court in Düsseldorf, European Law Academy, Trier).
Publications
Janssen, H., Cobbe, J., Norval, C., Singh, J. Decentralised Data Processing: Personal Data Stores and the GDPR. International Data Privacy Law, forthcoming. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3570895
When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aimed at strengthening user control over the processing of their personal data. Personal Data Stores ("PDSs") represent one class of these technologies: PDSs provide users with a device, enabling them to capture, aggregate, and manage their personal data. The device provides tools for users to control and monitor access, sharing, and computation over the data on their device. The motivations for PDSs are described as (i) assisting users with their confidentiality and privacy concerns, and/or (ii) providing opportunities for users to transact with or otherwise monetise their data. While PDSs might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among the key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain categories of data, as proposed by some, may remain theoretical. We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers, and technologists.
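To illustrate the architecture the abstract describes, the following minimal sketch models a PDS that keeps data on the user's device and releases only the results of permitted computations. It is a hypothetical illustration, not drawn from the paper: all names (PersonalDataStore, AccessGrant, etc.) and the permission model are assumptions.

```python
# Illustrative sketch (not from the paper): a toy Personal Data Store in which
# data stays local and third parties must hold a user-approved grant before
# computing over it. Only the result of the computation leaves the store.
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class AccessGrant:
    grantee: str          # the third party requesting access
    categories: set[str]  # data categories the user has approved
    purpose: str          # stated purpose, recorded for transparency

@dataclass
class PersonalDataStore:
    data: dict[str, Any] = field(default_factory=dict)   # category -> value, kept locally
    grants: list[AccessGrant] = field(default_factory=list)
    audit_log: list[str] = field(default_factory=list)   # user-visible record of access

    def grant(self, grantee: str, categories: set[str], purpose: str) -> None:
        """The user explicitly approves access for a grantee."""
        self.grants.append(AccessGrant(grantee, categories, purpose))

    def compute(self, grantee: str, category: str, fn: Callable[[Any], Any]) -> Any:
        """Run a computation over local data; raw data never leaves the store,
        only the result of `fn` does."""
        if not any(g.grantee == grantee and category in g.categories for g in self.grants):
            self.audit_log.append(f"DENIED {grantee} -> {category}")
            raise PermissionError(f"{grantee} has no grant for {category}")
        self.audit_log.append(f"ALLOWED {grantee} -> {category}")
        return fn(self.data[category])

# Usage: an app may receive an aggregate, but never the raw step counts.
pds = PersonalDataStore(data={"steps": [4200, 8100, 9300]})
pds.grant("health-app", {"steps"}, purpose="weekly activity summary")
print(pds.compute("health-app", "steps", lambda xs: sum(xs) / len(xs)))  # 7200.0
```

Even in this toy form, the GDPR questions the paper raises are visible: it is unclear whether the user, the platform provider, or the grantee acts as controller for the computation that runs on the device.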
Janssen, H., Cobbe, J., Singh, J. Personal Data Stores: a user-centric privacy utopia? Internet Policy Review, forthcoming.
Cobbe, J., Seng Ah Lee, M., Singh, J., Janssen, H. Centering the Law in the Digital State. Computer, 53 (10), pp. 47-58, 2020. doi: 10.1109/MC.2020.3006623
Driven by the promise of increased efficiencies and cost savings, the public sector has shown much interest in automated decision-making (ADM) technologies. However, the rule of law and fundamental principles of good government are being lost along the way.
Janssen, H. An approach to a fundamental rights impact assessment to automated decision-making. International Data Privacy Law, 10 (1), pp. 76-106, 2020. doi: 10.1093/idpl/ipz028
Companies and other private institutions see great and promising profits in the use of automated decision-making ("ADM") for commercial, financial, or work-efficiency purposes. Meanwhile, ADM based on a data subject's personal data may (severely) impact their fundamental rights and freedoms. The General Data Protection Regulation (GDPR) provides a regulatory framework that applies whenever a controller considers and deploys ADM on individuals on the basis of their personal data. In the design stage of the intended ADM, Article 35(3)(a) obliges a controller to carry out a Data Protection Impact Assessment (DPIA), part of which is an assessment of the ADM's impact on individual rights and freedoms. Article 22 GDPR determines under what conditions ADM is allowed and endows data subjects with increased protection. Research among companies of various sizes has shown that there is (legal) uncertainty about the interpretation of the GDPR (including the provisions relevant to ADM). The first objective of the author is to identify ways forward by offering practical handles for executing a DPIA that includes a sliding assessment of impacts on data subjects' fundamental rights. This assessment is based on four benchmarks that should help to assess the gravity of potential impacts: (i) the impact on the fundamental right(s) at stake; (ii) the context in which the ADM is used; (iii) who benefits from the use of personal data in the ADM; and (iv) who is in control of the data flows in the ADM. Together, these benchmarks should yield an overall fundamental rights impact assessment of the ADM. A second objective is to indicate potential factors and measures that a controller should consider in its risk management following the assessment. The proposed approach should help foster fair, compliant, and trustworthy ADM, and contains directions for future research.
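By way of illustration, the four benchmarks could be recorded in a simple structure like the sketch below. The numeric scale, the field names, and the worst-case combination rule are illustrative assumptions, not a scoring formula prescribed by the article.

```python
# Illustrative sketch only: one possible way to record the article's four
# benchmarks and combine them into an overall impact rating. The 1..4 scale
# and the worst-case rule are assumptions, not the article's method.
from dataclasses import dataclass

@dataclass
class FriaBenchmarks:
    # Each benchmark scored on an assumed scale: 1 (low concern) .. 4 (high concern).
    right_at_stake: int      # (i) gravity of impact on the fundamental right(s)
    context_of_use: int      # (ii) context in which the ADM is used
    beneficiary: int         # (iii) who benefits from the use of the personal data
    control_over_data: int   # (iv) who controls the data flows in the ADM

    def overall(self) -> str:
        # Assumption: the worst-scoring benchmark drives the overall rating,
        # since a severe impact on one dimension should not be averaged away.
        worst = max(self.right_at_stake, self.context_of_use,
                    self.beneficiary, self.control_over_data)
        return {1: "low", 2: "moderate", 3: "high", 4: "very high"}[worst]

# Hypothetical example: commercial credit-scoring ADM with a grave potential
# impact on non-discrimination, where the controller is the main beneficiary.
assessment = FriaBenchmarks(right_at_stake=4, context_of_use=3,
                            beneficiary=3, control_over_data=2)
print(assessment.overall())  # -> "very high"
```

The resulting rating would then feed into the risk-management step the article describes, guiding which mitigating factors and measures the controller should consider.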
Janssen, H. Constitutionele Interpretatie. Een rechtsvergelijkend onderzoek naar de vaststelling van de reikwijdte van het recht op persoonlijkheid [Constitutional Interpretation: a comparative legal study of determining the scope of the right to personality]. Sdu, 2003, 495 pp. (PhD dissertation, Maastricht University). https://cris.maastrichtuniversity.nl/ws/portalfiles/portal/38414297/1637229.pdf