Latest News
Interview with Mireille van Eechoud on digital sovereignty
Digital Legal Lab member Mireille van Eechoud, Professor of Information Law at the University of Amsterdam, recently took part in a LERU Talk about what digital sovereignty means for universities and about the role of tech companies. Preserving the ‘digital sovereignty’ of universities and researchers is key to a successful digital transformation of the university sector.
Vacancy: DSA Research Fellow (Researcher)
Do you want to become a driving force behind the DSA Observatory?
We are looking for a researcher with a profile in information law to join our team.
IViR at CPDP 2024
On 22-24 May 2024, the 17th international CPDP conference was held in Brussels. Several IViR researchers attended the conference, and a number of them also moderated panels.
Upcoming events
Benelux Merken Congres
Amsterdam, The Netherlands
https://www.delex.nl/shop/opleid…

Europe’s Digital Agenda: Is the AI Act the Final Act? In Search of Common Principles
International conference
Bergen, Norway
https://www.uib.no/fg/innovasjon…

EPIP 2024 Annual Conference: “Intellectual property and the future of the data economy”
Pisa, Italy
https://epip2024.eu/

Conferentie 20 jaar master Informatierecht: ‘Het opleiden van evenwichtskunstenaars’ (Conference marking 20 years of the Information Law master’s programme: ‘Training balancing artists’)
Amsterdam, The Netherlands

Privacy Law Scholars Conference Europe (PLSC) 2024
Amsterdam, The Netherlands
https://www.ivir.nl/plsce2024/

AlgoSoc International Conference 2025
The Future of Public Values in the Algorithmic Society
Amsterdam, The Netherlands
https://algosoc.org/events/algos…

Latest publications
Machine readable or not? – notes on the hearing in LAION e.v. vs Kneschke
Kluwer Copyright Blog, 2024
Artificial intelligence, Germany, text and data mining
Bibtex
@online{nokey,
title = {Machine readable or not? – notes on the hearing in LAION e.v. vs Kneschke},
author = {Keller, P.},
url = {https://copyrightblog.kluweriplaw.com/2024/07/22/machine-readable-or-not-notes-on-the-hearing-in-laion-e-v-vs-kneschke/},
year = {2024},
date = {2024-07-22},
journal = {Kluwer Copyright Blog},
keywords = {Artificial intelligence, Germany, text and data mining},
}
Opinion of the European Copyright Society on Certain Selected Aspects of Case C-227/23, Kwantum Nederland and Kwantum België
van Eechoud, M., Metzger, A., Quintais, J. & Rognstad, O.A.
IIC, 2024
Links
Copyright
Bibtex
@article{nokey,
title = {Opinion of the European Copyright Society on Certain Selected Aspects of Case C-227/23, Kwantum Nederland and Kwantum België},
author = {van Eechoud, M. and Metzger, A. and Quintais, J. and Rognstad, O.A.},
doi = {10.1007/s40319-024-01504-1},
year = {2024},
date = {2024-07-22},
journal = {IIC},
keywords = {Copyright},
}
Taming the “Free”: Content moderation in the Fediverse and the Role of the DSA: A practical guide for server administrators in the Fediverse
Rijnswou, E. van & Verboom, C.
2024
Abstract
The current legal framework for content moderation in the Digital Services Act (DSA) is focused on centralized digital services. This makes it challenging for decentralized services, such as instances in the Fediverse, to know how to comply with the DSA. To address this issue, this report offers a practical guide for server administrators in the Fediverse to meet the DSA's content moderation obligations. In this practical guide, you will find:
- categorization of Fediverse instances under the DSA;
- content moderation obligations for all intermediary services;
- content moderation obligations for hosting services in particular; and
- content moderation obligations for online platforms in general.
In this report, instances in the Fediverse are classified as hosting services, and more precisely as online platforms. As online platforms, instances will have to comply with the general content moderation obligations for all intermediary services, as well as the additional obligations for hosting services and online platforms. At the same time, micro and small enterprises are exempt from the additional obligations for online platforms, which means that instances meeting this exemption are not subject to those obligations. To simplify the steps a server administrator can take to comply with the DSA, this report provides checkboxes to help server administrators determine whether they fall under the DSA and, if so, what their obligations are. Additionally, platforms are encouraged to take on further responsibilities by, for example, adopting voluntary codes of conduct. We also conclude that even if an instance does not meet the exact requirements of a micro or small enterprise, full compliance with the additional obligations for online platforms may be less of an enforcement priority for relatively small services. Finally, and most importantly, we advise server administrators of small instances to be transparent about their content moderation practices.
This report was written by Emese van Rijnswou and Charlotte Verboom under the supervision of Dr. João Pedro Quintais and Ot van Daalen of the Glushko & Samuelson Information Law and Policy Lab (ILP Lab) of the Institute for Information Law (IViR) of the University of Amsterdam. The ILP Lab is a student-run, IViR-led institution that develops and promotes research-based policy solutions to protect fundamental rights and freedoms in the field of European information law. The report was written in partnership with the DSA Observatory and European Digital Rights (EDRi). It reflects the recommendations and conclusions of the authors of the ILP Lab.
Links
Bibtex
@report{nokey,
title = {Taming the “Free”: Content moderation in the Fediverse and the Role of the DSA: A practical guide for server administrators in the Fediverse},
author = {Rijnswou, E. van and Verboom, C.},
url = {https://ilplab.nl/wp-content/uploads/sites/2/2024/06/Final-Content-Moderation-in-the-Fediverse-2.pdf},
year = {2024},
date = {2024-04-24},
abstract = {The current legal framework for content moderation in the Digital Services Act (DSA) is focused on centralized digital services. This makes it challenging for decentralized services, such as instances in the Fediverse, to know how to comply with the DSA. To address this issue, this report offers a practical guide for server administrators in the Fediverse to meet the DSA's content moderation obligations. In this practical guide, you will find:
- categorization of Fediverse instances under the DSA;
- content moderation obligations for all intermediary services;
- content moderation obligations for hosting services in particular; and
- content moderation obligations for online platforms in general.
In this report, instances in the Fediverse are classified as hosting services, and more precisely as online platforms. As online platforms, instances will have to comply with the general content moderation obligations for all intermediary services, as well as the additional obligations for hosting services and online platforms. At the same time, micro and small enterprises are exempt from the additional obligations for online platforms, which means that instances meeting this exemption are not subject to those obligations. To simplify the steps a server administrator can take to comply with the DSA, this report provides checkboxes to help server administrators determine whether they fall under the DSA and, if so, what their obligations are. Additionally, platforms are encouraged to take on further responsibilities by, for example, adopting voluntary codes of conduct. We also conclude that even if an instance does not meet the exact requirements of a micro or small enterprise, full compliance with the additional obligations for online platforms may be less of an enforcement priority for relatively small services. Finally, and most importantly, we advise server administrators of small instances to be transparent about their content moderation practices.
This report was written by Emese van Rijnswou and Charlotte Verboom under the supervision of Dr. João Pedro Quintais and Ot van Daalen of the Glushko & Samuelson Information Law and Policy Lab (ILP Lab) of the Institute for Information Law (IViR) of the University of Amsterdam. The ILP Lab is a student-run, IViR-led institution that develops and promotes research-based policy solutions to protect fundamental rights and freedoms in the field of European information law. The report was written in partnership with the DSA Observatory and European Digital Rights (EDRi). It reflects the recommendations and conclusions of the authors of the ILP Lab.},
}
Contesting personalized recommender systems: a cross-country analysis of user preferences
Starke, C., Metikoš, L., Helberger, N. & Vreese, C.H. de
Information, Communication & Society, 2024
Abstract
Very Large Online Platforms (VLOPs) such as Instagram, TikTok, and YouTube wield substantial influence over digital information flows using sophisticated algorithmic recommender systems (RS). As these systems curate personalized content, concerns have emerged about their propensity to amplify polarizing or inappropriate content, spread misinformation, and infringe on users’ privacy. To address these concerns, the European Union (EU) has recently introduced a new regulatory framework through the Digital Services Act (DSA). These proposed policies are designed to bolster user agency by offering contestability mechanisms against personalized RS. As their effectiveness ultimately requires individual users to take specific actions, this empirical study investigates users’ intention to contest personalized RS. The results of a pre-registered survey across six countries – Brazil, Germany, Japan, South Korea, the UK, and the USA – involving 6,217 respondents yield key insights: (1) Approximately 20% of users would opt out of using personalized RS, (2) the intention for algorithmic contestation is associated with individual characteristics such as users’ attitudes towards and awareness of personalized RS as well as their privacy concerns, (3) German respondents are particularly inclined to contest personalized RS. We conclude that amending Art. 38 of the DSA may contribute to leveraging its effectiveness in fostering accessible user contestation and algorithmic transparency.
Links
Algorithmic contestation, Digital services act, Personalisation, recommender systems
Bibtex
@article{nokey,
title = {Contesting personalized recommender systems: a cross-country analysis of user preferences},
author = {Starke, C. and Metikoš, L. and Helberger, N. and Vreese, C.H. de},
url = {https://www.tandfonline.com/doi/full/10.1080/1369118X.2024.2363926},
doi = {10.1080/1369118X.2024.2363926},
year = {2024},
date = {2024-07-03},
journal = {Information, Communication & Society},
abstract = {Very Large Online Platforms (VLOPs) such as Instagram, TikTok, and YouTube wield substantial influence over digital information flows using sophisticated algorithmic recommender systems (RS). As these systems curate personalized content, concerns have emerged about their propensity to amplify polarizing or inappropriate content, spread misinformation, and infringe on users’ privacy. To address these concerns, the European Union (EU) has recently introduced a new regulatory framework through the Digital Services Act (DSA). These proposed policies are designed to bolster user agency by offering contestability mechanisms against personalized RS. As their effectiveness ultimately requires individual users to take specific actions, this empirical study investigates users’ intention to contest personalized RS. The results of a pre-registered survey across six countries – Brazil, Germany, Japan, South Korea, the UK, and the USA – involving 6,217 respondents yield key insights: (1) Approximately 20% of users would opt out of using personalized RS, (2) the intention for algorithmic contestation is associated with individual characteristics such as users’ attitudes towards and awareness of personalized RS as well as their privacy concerns, (3) German respondents are particularly inclined to contest personalized RS. We conclude that amending Art. 38 of the DSA may contribute to leveraging its effectiveness in fostering accessible user contestation and algorithmic transparency.},
keywords = {Algorithmic contestation, Digital services act, Personalisation, recommender systems},
}
EU copyright law roundup – second trimester of 2024
Trapova, A. & Quintais, J.
Kluwer Copyright Blog, 2024
Links
Bibtex
@online{nokey,
title = {EU copyright law roundup – second trimester of 2024},
author = {Trapova, A. and Quintais, J.},
url = {https://copyrightblog.kluweriplaw.com/2024/07/03/eu-copyright-law-roundup-second-trimester-of-2024/},
year = {2024},
date = {2024-07-03},
journal = {Kluwer Copyright Blog},
}