Public Registers Caught between Open Government and Data Protection – Personal Data, Principles of Proportionality and the Public Interest

Lokhorst, G. & van Eechoud, M.
In: Data Protection and Privacy, Volume 12 : Data Protection and Democracy, Hallinan, D., Leenes, R., Gutwirth, S. & De Hert, P. (eds.), Hart Publishing, 2020, ISBN: 9781509932740

Abstract

For governments across the globe, public registers are an increasingly popular means of achieving a range of objectives. These include safeguarding the independence of the judiciary, upholding food hygiene and safety standards, fostering the proper use of subsidies, and protecting the public from unqualified professionals. Most public registers are subject to data protection laws because they contain some form of personal data. In the Netherlands, the number of online public registers has risen dramatically. On the basis of exploratory research on Dutch public registers, we hypothesised that governments readily assume that public registers serve their designated goals, but rarely assess their effectiveness adequately. A comprehensive analysis of registers confirms this hypothesis. That is problematic from the perspective of the EU’s General Data Protection Regulation (GDPR) and the human right to privacy as enshrined in, for example, Article 8 of the European Convention on Human Rights (ECHR). Both require that the means used to serve a (legitimate) purpose be proportionate to the (potential) privacy harms. Drawing on the Dutch experience, in this chapter we examine to what extent and by which means policy- and lawmakers actually test the effectiveness of public registers, and we analyse the privacy implications from the perspective of proportionality. The adoption of open government policies, combined with new technological possibilities, has resulted in strong growth in online public registers, and the possibilities to link data from multiple sources are multiplying. The potential privacy impacts on individuals need to be better understood and safeguarded from the design stage of public registers onwards.

Data protection, Open government, Personal data, Privacy, Public registers

Pitching trade against privacy: reconciling EU governance of personal data flows with external trade

International Data Privacy Law, vol. 10, num: 3, pp: 201-221, 2020

Abstract

This article positions the EU’s external governance of personal data flows against the backdrop of the international controversy over digital trade versus strict privacy laws. Now that the EU has defined its position on horizontal provisions on cross-border data flows and personal data protection, it is both timely and essential to reassess its strategy on international transfers of personal data in view of its future trade agreements. Given its own normative approach and regulatory autonomy, the EU has a pivotal role to play in shaping the interface between trade and privacy before the ‘free trade leviathan’ can restrict the policy choices not only of individual states but also of the EU itself. Our contribution aims to break through the present compartmentalization between privacy scholars and trade lawyers by situating personal data flows in both disciplines.

Cross-border data flow, Digital trade, EU law, GDPR, International trade law, Personal data, Privacy

Strengthening legal protection against discrimination by algorithms and artificial intelligence

The International Journal of Human Rights, 2020

Abstract

Algorithmic decision-making and other types of artificial intelligence (AI) can be used to predict who will commit crime, who will be a good employee, who will default on a loan, and so on. However, algorithmic decision-making can also threaten human rights, such as the right to non-discrimination. This paper evaluates the current legal protection in Europe against discriminatory algorithmic decisions. It shows that non-discrimination law, in particular through the concept of indirect discrimination, prohibits many types of algorithmic discrimination, and that data protection law could also help to defend people against discrimination. Proper enforcement of both instruments could thus offer meaningful protection. However, the paper shows that both legal instruments have severe weaknesses when applied to artificial intelligence. It suggests how enforcement of the current rules can be improved and explores whether additional rules are needed, arguing for sector-specific – rather than general – rules and outlining an approach to regulating algorithmic decision-making.

Algorithms, Artificial intelligence, Discrimination, GDPR, Privacy

The personal information sphere: An integral approach to privacy and related information and communication rights

JASIST, vol. 71, num: 9, pp: 1116-1128, 2020

Abstract

Data protection laws, including the European Union General Data Protection Regulation, regulate aspects of online personalization. However, the data protection lens is too narrow to analyze personalization. To define conditions for personalization, we should understand data protection in its larger fundamental rights context, starting with the closely connected right to privacy. If the right to privacy is considered along with other European fundamental rights that protect information and communication flows, namely, communications confidentiality; the right to receive information; and freedom of expression, opinion, and thought, these rights are observed to enable what I call a “personal information sphere” for each person. This notion highlights how privacy interferences affect other fundamental rights. The personal information sphere is grounded in European case law and is thus not just an academic affair. The essence of the personal information sphere is control, yet with a different meaning than mere control as guaranteed by data protection law. The personal information sphere is about people controlling how they situate themselves in information and communication networks. It follows that, to respect privacy and related rights, online personalization providers should actively involve users in the personalization process and enable them to use personalization for personal goals.

Data protection law, Fundamental rights, Personalization, Privacy

Trade and privacy: Complicated bedfellows? How to achieve data protection-proof free trade agreements

2016

Privacy, trade

Getting Data Subject Rights Right: A submission to the European Data Protection Board from international data rights academics, to inform regulatory guidance

Ausloos, J., Veale, M. & Mahieu, R.
JIPITEC, vol. 10, num: 3, 2019

Abstract

We are a group of academics active in research and practice around data rights. We believe that the European Data Protection Board (EDPB) guidance on data rights currently under development presents an important opportunity to resolve a variety of tensions and grey areas which, if left unaddressed, may significantly undermine the fundamental right to data protection. All of us were present at the recent stakeholder event on data rights in Brussels on 4 November 2019, and it is in the context and spirit of stakeholder engagement that we have created this document to explore and provide recommendations and examples in this area. The document draws on comprehensive empirical evidence as well as CJEU case law, EDPB (and, previously, Article 29 Working Party) guidance, and extensive scientific research into the scope, rationale, effects and general modalities of data rights.

Data protection, GDPR, Privacy

Privacy Protection(ism): The Latest Wave of Trade Constraints on Regulatory Autonomy

University of Miami Law Review, vol. 74, num: 2, pp: 416-519, 2020

Abstract

Countries spend billions of dollars each year to strengthen their discursive power to shape international policy debates. They do so because in public policy conversations labels and narratives matter enormously. The “digital protectionism” label has been used in the last decade as a tool to gain the policy upper hand in digital trade policy debates about cross-border flows of personal and other data. Using the Foucauldian framework of discourse analysis, this Article brings a unique perspective on this topic. The Article makes two central arguments. First, the Article argues that the term “protectionism” is not endowed with an inherent meaning but is socially constructed by the power of discourse used in international negotiations, and in the interpretation and application of international trade policy and rules. In other words, there are as many definitions of “(digital) protectionism” as there are discourses. The U.S. and E.U. “digital trade” discourses illustrate this point. Using the same term, those trading partners advance utterly different discourses and agendas: an economic discourse with economic efficiency as the main benchmark (United States), and a more multidisciplinary discourse where both economic efficiency and protection of fundamental rights are equally important (European Union). Second, based on a detailed evaluation of the economic “digital trade” discourse, the Article contends that the coining of the term “digital protectionism” to refer to domestic information governance policies not yet fully covered by trade law disciplines is not a logical step to respond to objectively changing circumstances, but rather a product of that discourse, which is coming to dominate U.S.-led international trade negotiations. 
The Article demonstrates how this redefinition of “protectionism” has already resulted in the adoption of international trade rules in recent trade agreements that further restrict domestic autonomy to protect the rights to privacy and the protection of personal data. The Article suggests that the distinction between privacy and personal data protection on the one hand and protectionism on the other is a moral question, not a question of economic efficiency. Therefore, when a policy conversation, such as the one on cross-border data flows, involves noneconomic spill-over effects on individual rights, that conversation should not be confined within the straitjacket of trade economics, but rather placed in a broader normative perspective. Finally, the Article argues that, in conducting the recently restarted multilateral negotiations on electronic commerce at the World Trade Organization, countries should rethink the goals of international trade for the twenty-first century. Such goals should determine and define the discourse, not the other way around. The discussion should not be about what “protectionism” means but about how far domestic regimes are willing to let trade rules interfere in their autonomy to protect their societal, cultural, and political values.

Privacy, Protectionism, Regulation, Trade

The Privacy Disconnect

Chapter in: Human Rights in the Age of Platforms, ed. R.F. Jørgensen, Cambridge: The MIT Press, 2019, pp: 255-284, ISBN: 9780262039055

Privacy

Panel discussion at CPDP 2020: We need to talk about filters: algorithmic copyright enforcement vs data protection

Quintais, J., Ducato, R., Mazgal, A., Zuiderveen Borgesius, F. & Hegladóttir, A.
2020

Abstract

The new Copyright in the Digital Single Market (DSM) Directive was published in May 2019. Its most controversial provision is Article 17 (ex 13), which creates a new liability regime for user-generated content platforms, like YouTube and Facebook. The new regime makes these platforms directly liable for their users’ uploads, without the possibility of benefiting from the hosting safe-harbour. This forces platforms to either license all or most of the content uploaded by users (which is near impossible) or to adopt preventive measures like filters. The likely outcome is that covered platforms will engage in general monitoring of the content uploaded by their users. This panel will discuss the issues raised by Article 17 DSM Directive and the model of algorithmic enforcement it incentivizes, with a focus on the freedom of expression and data protection risks it entails.

• Article 17 of the Copyright in the Digital Single Market Directive creates a new liability regime for user-generated content platforms.
• Does this provision introduce de facto the controversial upload filtering systems and, as a result, general monitoring of information in content-sharing platforms?
• Is Article 17 essentially in conflict with the GDPR and, in particular, the principle of data minimisation and the right not to be subject to automated decision-making? What are the potential consequences of this provision for users’ freedom of expression?
• If Article 17 can negatively affect data protection and freedom of expression, what are the possible legal and extra-legal responses to neutralise the risk?

Copyright, Data protection, Privacy

The regulation of online political micro-targeting in Europe

Internet Policy Review, vol. 8, num: 4, 2020

Abstract

In this paper, we examine how online political micro-targeting is regulated in Europe. While there are no specific rules on such micro-targeting, there are general rules that apply. We focus on three fields of law: data protection law, freedom of expression, and sector-specific rules for political advertising; for the latter we examine four countries. We argue that the rules in the General Data Protection Regulation (GDPR) are necessary, but not sufficient. We show that political advertising, including online political micro-targeting, is protected by the right to freedom of expression. That right is not absolute, however. From a European human rights perspective, it is possible for lawmakers to limit the possibilities for political advertising. Indeed, some countries ban TV advertising for political parties during elections.

Advertising, Data protection law, Elections, Europe, Micro-targeting, Politics, Privacy, Regulation, Freedom of expression
