Protecting Individuals Against the Negative Impact of Big Data: Potential and Limitations of the Privacy and Data Protection Law Approach

Oostveen, M.
2018, Series: Information Law Series, ISBN: 9789403501314

Big data, Kluwer Information Law Series, Privacy

Bibtex

@Book{ILS42, title = {Protecting Individuals Against the Negative Impact of Big Data: Potential and Limitations of the Privacy and Data Protection Law Approach}, author = {Oostveen, M.}, year = {2018}, date = {2018-01-01}, keywords = {Big data, Kluwer Information Law Series, Privacy}, }

The Golden Age of Personal Data: How to Regulate an Enabling Fundamental Right?

Oostveen, M. & Irion, K.
In: Bakhoum M., Conde Gallego B., Mackenrodt MO., Surblytė-Namavičienė G. (eds) Personal Data in Competition, Consumer Protection and Intellectual Property Law. MPI Studies on Intellectual Property and Competition Law, vol 28. Springer, Berlin, Heidelberg, 2018

Abstract

New technologies, purposes and applications to process individuals’ personal data are being developed on a massive scale. But we have not only entered the ‘golden age of personal data’ in terms of its exploitation: ours is also the ‘golden age of personal data’ in terms of regulation of its use. Understood as an enabling right, the architecture of EU data protection law is capable of protecting against many of the negative short- and long-term effects of contemporary data processing. Against the backdrop of big data applications, we evaluate how the implementation of privacy and data protection rules protects against the short- and long-term effects of contemporary data processing. We conclude that from the perspective of protecting individual fundamental rights and freedoms, it would be worthwhile to explore alternative (legal) approaches instead of relying on EU data protection law alone to cope with contemporary data processing.

automated decision making, Big data, Data protection, frontpage, General Data Protection Regulation, Privacy, profiling

Bibtex

@Chapter{Oostveen2018, title = {The Golden Age of Personal Data: How to Regulate an Enabling Fundamental Right?}, author = {Oostveen, M. and Irion, K.}, url = {https://link.springer.com/chapter/10.1007/978-3-662-57646-5_2}, year = {2018}, date = {2018-11-20}, abstract = {New technologies, purposes and applications to process individuals’ personal data are being developed on a massive scale. But we have not only entered the ‘golden age of personal data’ in terms of its exploitation: ours is also the ‘golden age of personal data’ in terms of regulation of its use. Understood as an enabling right, the architecture of EU data protection law is capable of protecting against many of the negative short- and long-term effects of contemporary data processing. Against the backdrop of big data applications, we evaluate how the implementation of privacy and data protection rules protects against the short- and long-term effects of contemporary data processing. We conclude that from the perspective of protecting individual fundamental rights and freedoms, it would be worthwhile to explore alternative (legal) approaches instead of relying on EU data protection law alone to cope with contemporary data processing.}, keywords = {automated decision making, Big data, Data protection, frontpage, General Data Protection Regulation, Privacy, profiling}, }

The Golden Age of Personal Data: How to Regulate an Enabling Fundamental Right?

Oostveen, M. & Irion, K.
2016

Abstract

New technologies, purposes and applications to process individuals’ personal data are being developed on a massive scale. But we have not only entered the ‘golden age of personal data’ in terms of its exploitation: ours is also the ‘golden age of personal data’ in terms of regulation of its use. In this contribution, we explain how regulating the processing of an individual’s personal data can be a proxy of intervention, which directly or indirectly could benefit other individual rights and freedoms. Understood as an enabling right, the architecture of EU data protection law is capable of protecting against many of the negative short- and long-term effects of contemporary data processing. The new General Data Protection Regulation certainly strengthens aspects of this core architecture, but certain regulatory innovations to cope with technological advancements and the data-driven economy appear less capable of yielding broad protection for individuals’ fundamental rights and freedoms. We conclude that from the perspective of protecting individual fundamental rights and freedoms, it would be worthwhile to explore alternative (legal) approaches to individual protection in contemporary data processing.

Big data, Data protection, enabling fundamental rights, EU law, General Data Protection Regulation, Privacy

Bibtex

@Article{Oostveen2016b, title = {The Golden Age of Personal Data: How to Regulate an Enabling Fundamental Right?}, author = {Oostveen, M. and Irion, K.}, year = {2016}, date = {2016-12-15}, abstract = {New technologies, purposes and applications to process individuals’ personal data are being developed on a massive scale. But we have not only entered the ‘golden age of personal data’ in terms of its exploitation: ours is also the ‘golden age of personal data’ in terms of regulation of its use. In this contribution, we explain how regulating the processing of an individual’s personal data can be a proxy of intervention, which directly or indirectly could benefit other individual rights and freedoms. Understood as an enabling right, the architecture of EU data protection law is capable of protecting against many of the negative short- and long-term effects of contemporary data processing. The new General Data Protection Regulation certainly strengthens aspects of this core architecture, but certain regulatory innovations to cope with technological advancements and the data-driven economy appear less capable of yielding broad protection for individuals’ fundamental rights and freedoms. We conclude that from the perspective of protecting individual fundamental rights and freedoms, it would be worthwhile to explore alternative (legal) approaches to individual protection in contemporary data processing.}, keywords = {Big data, Data protection, enabling fundamental rights, EU law, General Data Protection Regulation, Privacy}, }

Identifiability and the applicability of data protection to big data

Oostveen, M.
International Data Privacy Law, 2016

Abstract

Big data holds much potential, but it can also have a negative impact on individuals, particularly on their privacy and data protection rights. Data protection law is the point of departure in the discussion about big data; it is widely regarded as the answer to big data’s negative consequences. Yet a closer look at the criteria for applicability of EU data protection law reveals a number of weaknesses in the data protection law approach. Because the material scope of EU data protection law is dependent on the identifiability of the individual, data protection only partially applies to the big data process. Therefore, in spite of its importance, data protection law is insufficient to protect individuals from big data’s potential harms.

bescherming persoonsgegevens, Big data, Data protection, frontpage, Grondrechten, Personal data, Privacy

Bibtex

@Article{Oostveen2016, title = {Identifiability and the applicability of data protection to big data}, author = {Oostveen, M.}, url = {http://idpl.oxfordjournals.org/content/early/2016/09/06/idpl.ipw012.extract}, doi = {https://doi.org/10.1093/idpl/ipw012}, year = {2016}, date = {2016-09-27}, journal = {International Data Privacy Law}, abstract = {Big data holds much potential, but it can also have a negative impact on individuals, particularly on their privacy and data protection rights. Data protection law is the point of departure in the discussion about big data; it is widely regarded as the answer to big data’s negative consequences. Yet a closer look at the criteria for applicability of EU data protection law reveals a number of weaknesses in the data protection law approach. Because the material scope of EU data protection law is dependent on the identifiability of the individual, data protection only partially applies to the big data process. Therefore, in spite of its importance, data protection law is insufficient to protect individuals from big data’s potential harms.}, keywords = {bescherming persoonsgegevens, Big data, Data protection, frontpage, Grondrechten, Personal data, Privacy}, }

Big data: Finders keepers, losers weepers?

Sax, M.
Ethics and Information Technology, vol. 18, num: 1, pp: 25-31, 2016

Abstract

This article argues that big data’s entrepreneurial potential is based not only on new technological developments that allow for the extraction of non-trivial, new insights out of existing data, but also on an ethical judgment that often remains implicit: namely the ethical judgment that those companies that generate these new insights can legitimately appropriate (the fruits of) these insights. As a result, the business model of big data companies is essentially founded on a libertarian-inspired ‘finders, keepers’ ethic. The article argues, next, that this presupposed ‘finders, keepers’ ethic is far from unproblematic and itself relies on multiple unconvincing assumptions. This leads to the conclusion that the conduct of companies working with big data might lack ethical justification.

Big data, ethics, finders-keepers, justice, libertarianism, Personal data, Privacy

Bibtex

@Article{Sax2016, title = {Big data: Finders keepers, losers weepers?}, author = {Sax, M.}, url = {http://link.springer.com/article/10.1007/s10676-016-9394-0}, doi = {https://doi.org/10.1007/s10676-016-9394-0}, year = {2016}, date = {2016-03-26}, journal = {Ethics and Information Technology}, volume = {18}, number = {1}, pages = {25-31}, abstract = {This article argues that big data’s entrepreneurial potential is based not only on new technological developments that allow for the extraction of non-trivial, new insights out of existing data, but also on an ethical judgment that often remains implicit: namely the ethical judgment that those companies that generate these new insights can legitimately appropriate (the fruits of) these insights. As a result, the business model of big data companies is essentially founded on a libertarian-inspired ‘finders, keepers’ ethic. The article argues, next, that this presupposed ‘finders, keepers’ ethic is far from unproblematic and itself relies on multiple unconvincing assumptions. This leads to the conclusion that the conduct of companies working with big data might lack ethical justification.}, keywords = {Big data, ethics, finders-keepers, justice, libertarianism, Personal data, Privacy}, }

International and comparative legal study on Big Data

van der Sloot, B. & Schendel, S. van
2016

Abstract

Working Paper 20 was written as part of the project ‘Big Data, Privacy and Security’, undertaken by the Netherlands Scientific Council for Government Policy (WRR) to investigate the consequences of the use of Big Data in the domain of security. This background study, entitled International and Comparative Legal Study on Big Data, was written by Bart van der Sloot and Sacha van Schendel. Many countries experiment with Big Data processes, which are almost by definition transnational. The first part of the study is a quick scan of the Big Data policies, legislation and regulations in Australia, Brazil, China, France, Germany, India, Israel, Japan, South Africa, the United Kingdom and the United States. The second part presents the findings of a survey among Data Protection Authorities (DPAs) in Europe. Both parts of the study focus on the relations between Big Data, security and privacy.

Big data, frontpage, Privacy

Bibtex

@Other{vanderSloot2016, title = {International and comparative legal study on Big Data}, author = {van der Sloot, B. and Schendel, S. van}, url = {http://www.ivir.nl/international_and_comparative_legal_study_on_big_data/}, year = {2016}, date = {2016-05-12}, abstract = {Working Paper 20 was written as part of the project ‘Big Data, Privacy and Security’, undertaken by the Netherlands Scientific Council for Government Policy (WRR) to investigate the consequences of the use of Big Data in the domain of security. This background study, entitled International and Comparative Legal Study on Big Data, was written by Bart van der Sloot and Sacha van Schendel. Many countries experiment with Big Data processes, which are almost by definition transnational. The first part of the study is a quick scan of the Big Data policies, legislation and regulations in Australia, Brazil, China, France, Germany, India, Israel, Japan, South Africa, the United Kingdom and the United States. The second part presents the findings of a survey among Data Protection Authorities (DPAs) in Europe. Both parts of the study focus on the relations between Big Data, security and privacy.}, keywords = {Big data, frontpage, Privacy}, }

Should we worry about filter bubbles?

Zuiderveen Borgesius, F., Trilling, D., Bodó, B., Vreese, C.H. de & Helberger, N.
Internet Policy Review, vol. 5, num: 1, 2016

Abstract

Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. We conclude that at present there is little empirical evidence that warrants any worries about filter bubbles.

behavioural targeting, Big data, frontpage, Personal data, profiling

Bibtex

@Article{Borgesius2016, title = {Should we worry about filter bubbles?}, author = {Zuiderveen Borgesius, F. and Trilling, D. and Bodó, B. and Vreese, C.H. de and Helberger, N.}, url = {http://policyreview.info/node/401/pdf}, doi = {https://doi.org/10.14763/2016.1.401}, year = {2016}, date = {2016-04-01}, journal = {Internet Policy Review}, volume = {5}, number = {1}, abstract = {Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. We conclude that at present there is little empirical evidence that warrants any worries about filter bubbles.}, keywords = {behavioural targeting, Big data, frontpage, Personal data, profiling}, }

Is the Human Rights Framework Still Fit for the Big Data Era? A Discussion of the ECtHR’s Case Law on Privacy Violations Arising from Surveillance Activities

van der Sloot, B.
2015

Abstract

Human rights protect humans. This seemingly uncontroversial axiom might become quintessential over time, especially with regard to the right to privacy. Article 8 of the European Convention on Human Rights grants natural persons a right to complain, in order to protect their individual interests, such as those related to personal freedom, human dignity and individual autonomy. With Big Data processes, however, individuals are mostly unaware that their personal data are gathered and processed and even if they are, they are often unable to substantiate their specific individual interest in these large data gathering systems. When the European Court of Human Rights assesses these types of cases, mostly revolving around (mass) surveillance activities, it finds itself stuck between the human rights framework on the one hand and the desire to evaluate surveillance practices by states on the other. Interestingly, the Court chooses to deal with these cases under Article 8 ECHR, but in order to do so, it is forced to go beyond the fundamental pillars of the human rights framework.

Big data, conventionality, Grondrechten, Human rights, individual harm, mass surveillance, Privacy, societal interest

Bibtex

@Other{nokey, title = {Is the Human Rights Framework Still Fit for the Big Data Era? A Discussion of the ECtHR’s Case Law on Privacy Violations Arising from Surveillance Activities}, author = {van der Sloot, B.}, url = {http://www.ivir.nl/publicaties/download/1701.pdf}, year = {2015}, date = {2015-12-15}, abstract = {Human rights protect humans. This seemingly uncontroversial axiom might become quintessential over time, especially with regard to the right to privacy. Article 8 of the European Convention on Human Rights grants natural persons a right to complain, in order to protect their individual interests, such as those related to personal freedom, human dignity and individual autonomy. With Big Data processes, however, individuals are mostly unaware that their personal data are gathered and processed and even if they are, they are often unable to substantiate their specific individual interest in these large data gathering systems. When the European Court of Human Rights assesses these types of cases, mostly revolving around (mass) surveillance activities, it finds itself stuck between the human rights framework on the one hand and the desire to evaluate surveillance practices by states on the other. Interestingly, the Court chooses to deal with these cases under Article 8 ECHR, but in order to do so, it is forced to go beyond the fundamental pillars of the human rights framework.}, keywords = {Big data, conventionality, Grondrechten, Human rights, individual harm, mass surveillance, Privacy, societal interest}, }

Open Data, Privacy, and Fair Information Principles: Towards a Balancing Framework

Zuiderveen Borgesius, F. & van Eechoud, M.
2015

Abstract

Open data are held to contribute to a wide variety of social and political goals, including strengthening transparency, public participation and democratic accountability, promoting economic growth and innovation, and enabling greater public sector efficiency and cost savings. However, releasing government data that contain personal information may threaten privacy and related rights and interests. In this paper we ask how these privacy interests can be respected, without unduly hampering benefits from disclosing public sector information. We propose a balancing framework to help public authorities address this question in different contexts. The framework takes into account different levels of privacy risks for different types of data. It also separates decisions about access and re-use, and highlights a range of different disclosure routes. A circumstance catalogue lists factors that might be considered when assessing whether, under which conditions, and how a dataset can be released. While open data remains an important route for the publication of government information, we conclude that it is not the only route, and there must be clear and robust public interest arguments in order to justify the disclosure of personal information as open data.

anonymous data, Big data, Data protection, fair information principles, Freedom of information, Grondrechten, OECD privacy Guidelines, Privacy, public sector data

Bibtex

@Article{nokey, title = {Open Data, Privacy, and Fair Information Principles: Towards a Balancing Framework}, author = {Zuiderveen Borgesius, F. and van Eechoud, M.}, url = {http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2695005}, year = {2015}, date = {2015-12-03}, abstract = {Open data are held to contribute to a wide variety of social and political goals, including strengthening transparency, public participation and democratic accountability, promoting economic growth and innovation, and enabling greater public sector efficiency and cost savings. However, releasing government data that contain personal information may threaten privacy and related rights and interests. In this paper we ask how these privacy interests can be respected, without unduly hampering benefits from disclosing public sector information. We propose a balancing framework to help public authorities address this question in different contexts. The framework takes into account different levels of privacy risks for different types of data. It also separates decisions about access and re-use, and highlights a range of different disclosure routes. A circumstance catalogue lists factors that might be considered when assessing whether, under which conditions, and how a dataset can be released. While open data remains an important route for the publication of government information, we conclude that it is not the only route, and there must be clear and robust public interest arguments in order to justify the disclosure of personal information as open data.}, keywords = {anonymous data, Big data, Data protection, fair information principles, Freedom of information, Grondrechten, OECD privacy Guidelines, Privacy, public sector data}, }