Publications
Between Empowerment and Manipulation: The Ethics and Regulation of For-Profit Health Apps
Abstract
In the digital society, many of our everyday activities take place within digital choice architectures that are becoming increasingly adept at understanding and shaping our behavior. Health apps are a perfect example of this trend: they are easy to download and use, and they promise user empowerment. By collecting and analyzing user data, health apps promise to 'get to know' their users and deliver personalized feedback and suggestions for better health outcomes. But this promise of user empowerment also comes with a risk of user manipulation. Most of the popular health apps are for-profit services. To monetize their user base, they can rely on the very same user data collection, data analysis, and targeting techniques to shape the behavior of health app users in ways that benefit the health app provider rather than the users themselves. As it turns out, the conditions for empowerment largely overlap with the conditions for manipulation.
This dissertation offers an ethical and legal analysis of the tension between empowerment and manipulation in for-profit health apps, and digital choice architectures more generally. Building on ethical theories of personal autonomy and manipulation, the dissertation develops an ethical framework to evaluate the design and commercial practices of health apps. This ethical framework is then used to develop novel interpretations of key concepts in the Unfair Commercial Practices Directive (UCPD). Based on these novel interpretations of key concepts, it is argued that the UCPD has an important role to play in addressing consumer manipulation.
autonomy, consumer law, health apps, manipulation, nudging
Opinion: 'Beware the subtle blending of commerce and advice in health apps'
Are we human, or are we users? The role of natural language processing in human-centric news recommenders that nudge users to diverse content
Opinion: Policy for the CoronaCheck app is sorely lacking
The scope of Article 17 of the DSM Directive in light of the prohibition on general monitoring obligations: an Odyssey
Abstract
The Directive on copyright and related rights in the Digital Single Market ('DSM Directive') introduces new statutory obligations concerning the filtering of online content. Online content-sharing service providers ('OCSSPs') must, on the basis of information provided by rightholders, ensure that protected material is not available on their platforms. At the same time, Article 17(8) of the DSM Directive confirms that the new copyright rules must not lead to a general monitoring obligation. Despite the new filtering obligations, the EU legislature has expressly upheld the traditional prohibition on general monitoring obligations, which has been part of the liability-privilege regime of the E-Commerce Directive ('ECD') for 20 years. The European Commission's proposal for a Digital Services Act ('DSA') likewise retains the prohibition on general monitoring obligations. Against this background, the question arises how the new copyright filtering obligations should be interpreted so as to avoid a prohibited general monitoring obligation. The following analysis answers this question on the basis of a closer examination of the prohibition on general monitoring obligations in the ECD, the DSM Directive and the DSA proposal. In addition to relevant case law of the CJEU, the close connection between the prohibition on general monitoring obligations and fundamental rights is addressed.
copyright, DSM Directive, filtering, frontpage, online services
Conditions for technological solutions in a COVID-19 exit strategy, with particular focus on the legal and societal conditions
Abstract
Which legal, ethical and societal conditions need to be fulfilled for the use of digital solutions in managing the COVID-19 exit strategy? This was the central question of this research. Digital technologies can be part of solutions to societal challenges, for example to manage the pandemic and lead the Netherlands out of the COVID-19 crisis. One set of technologies that figured particularly prominently in that debate was contact tracing apps such as CoronaMelder, along with digital vaccination passports (the CoronaCheck app).
In the Netherlands, Europe and worldwide, the introduction of apps such as CoronaMelder or the CoronaCheck app was met with criticism from experts, politicians, civil society and academics. Concerns range from the lack of evidence for the effectiveness of such apps and uncertainty about the conditions that need to be fulfilled for them to achieve their goals, to our growing dependency on technology companies and worries about fundamental rights and adverse effects for vulnerable groups, such as the elderly or users without a smartphone.
The overall goal of the research was to monitor the societal, ethical and legal implications of implementing apps like CoronaMelder, and from that to draw lessons for the future use of 'technology-assisted governance solutions'. One important conclusion from the report is that 'there are no easy technological fixes, and in order for a technological solution to work, it needs to be part of a broader vision on what such a solution needs to function in society, achieve its intended goals and respect the fundamental rights of users as well as non-users.' The report also offers critical reflections on the need for democratic legitimisation and accountability and on the role of big tech, as well as insights into the societal impact of CoronaMelder and other technological solutions.
COVID-19, frontpage, information law