Media Concentration Law: Gaps and Promises in the Digital Age

Media and Communication, vol. 11, iss. 2, pp. 392-405, 2023

Abstract

Power concentrations are increasing in today’s media landscape. Reasons for this include growing structural and technological dependencies on digital platform companies, as well as shifts in opinion power and control over news production, distribution, and consumption. Digital opinion power and platformised media markets have prompted the need for a re-evaluation of the current approach. This article critically revisits and analyses media concentration rules. To this end, I employ a normative conceptual framework that examines “opinion power in the platform world” at three distinct levels (individual citizen, institutional newsroom, and media ecosystem). At each level, I identify the existing legal tools and the gaps in controlling power and concentration in the digital age. On that basis, I offer a unifying theoretical framework for a “digital media concentration law,” along with core concepts and guiding principles. I highlight policy goals and fields that lie outside the traditional scope yet are relevant to addressing issues of the digital age. Additionally, the emerging European Union regulatory framework (specifically the Digital Services Act, the Digital Markets Act, and the European Media Freedom Act) reflects an evolving approach to platforms and media concentration. Finally, the analysis draws on the mapping and evaluation results of a Europe-wide study on media pluralism and diversity online, which examined (national) media concentration rules.

digital platforms, editorial independence, European regulation, media concentration, Media law, media pluralism, opinion power, structural dependency

Opinie: Internetproletariërs aller landen verenigt u! [Opinion: Internet proletarians of all countries, unite!]

Mediaforum, iss. 3, p. 85, 2023

Facebook, Internet, moderators

Personal Data Stores and the GDPR’s lawful grounds for processing personal data

Janssen, H., Cobbe, J., Norval, C. & Singh, J.
2019

Abstract

Personal Data Stores (‘PDSs’) entail users having a (physical or virtual) device within which they themselves can, in theory, capture, aggregate, and control access to and the transfer of their personal data. PDSs aim to empower users in relation to their personal data, strengthening their opportunities for data protection and privacy, and/or facilitating trade and monetisation. As PDS technologies develop, it is important to consider their role in relation to issues of data protection. The General Data Protection Regulation requires that the processing of user data be predicated on one of its defined lawful bases, whereby the Regulation does not favour any one basis over another. We explore how PDS architectures relate to these lawful bases, and observe that they tend to favour the bases that require direct user involvement. This paper considers the issues that these envisaged architectural choices surrounding the lawful grounds may entail.

Data protection, decentralisation, lawful grounds for processing, personal data stores, Privacy, Transparency

Data protection and tech startups: The need for attention, support, and scrutiny

Norval, C., Janssen, H., Cobbe, J. & Singh, J.
Policy & Internet, vol. 13, iss. 2, pp. 278-299, 2021

Abstract

Though discussions of data protection have focused on the larger, more established organisations, startups also warrant attention. This is particularly so for tech startups, who are often innovating at the ‘cutting-edge’—pushing the boundaries of technologies that typically lack established data protection best-practices. Initial decisions taken by startups could well have long-term impacts, and their actions may inform (for better or for worse) how particular technologies and the applications they support are implemented, deployed, and perceived for years to come. Ensuring that the innovations and practices of tech startups are sound, appropriate and acceptable should therefore be a high priority. This paper explores the attitudes and preparedness of tech startups to issues of data protection. We interviewed a series of UK-based emerging tech startups as the EU's General Data Protection Regulation (GDPR) came into effect, which revealed areas in which there is a disconnect between the approaches of the startups and the nature and requirements of the GDPR. We discuss the misconceptions and associated risks facing innovative tech startups and offer a number of considerations for the firms and supervisory authorities alike. In light of our discussions, and given what is at stake, we argue that more needs to be done to help ensure that emerging technologies and the practices of the companies that operate them better align with the regulatory obligations. We conclude that tech startups warrant increased attention, support, and scrutiny to raise the standard of data protection for the benefit of us all.

De toekomst van de digitale rechtsstaat. Pleidooi voor het gebruik van een mensenrechten impact assessment voor de publieke sector [The future of the digital rule of law: A plea for the use of a human rights impact assessment for the public sector]

(L)aw Matters: Blogs and Essays in Honour of prof. dr. Aalt Willem Heringa, Boekenmaker, 2022, pp. 198-204

Practical fundamental rights impact assessments

Janssen, H., Seng Ah Lee, M. & Singh, J.
International Journal of Law and Information Technology, vol. 30, iss. 2, pp. 200-232, 2022

Abstract

The European Union’s General Data Protection Regulation requires organizations to perform a Data Protection Impact Assessment (DPIA) to consider the fundamental rights risks of their artificial intelligence (AI) systems. However, assessing such risks can be challenging, as fundamental rights are often considered abstract in nature. So far, guidance regarding DPIAs has largely focussed on data protection, leaving broader fundamental rights aspects less elaborated. This is problematic because potential negative societal consequences of AI systems may remain unaddressed and damage public trust in organizations using AI. Towards this, we introduce a practical, four-phased framework assisting organizations with performing fundamental rights impact assessments. This involves organizations (i) defining the system’s purposes and tasks, and the responsibilities of the parties involved in the AI system; (ii) assessing the risks regarding the system’s development; (iii) justifying why the risks of potential infringements on rights are proportionate; and (iv) adopting organizational and/or technical measures mitigating the risks identified. We further indicate how regulators might support these processes with practical guidance.

Intermediating data rights exercises: the role of legal mandates

International Data Privacy Law, vol. 12, iss. 4, pp. 316-331, 2022

Abstract

Data subject rights constitute critical tools for empowerment in the digitized society. There is a growing trend of relying on third parties to facilitate or coordinate the collective exercise of data rights on behalf of one or more data subjects. This contribution refers to these parties as ‘Data Rights Intermediaries’ (DRIs), i.e., where an ‘intermediating’ party facilitates or enables the collective exercise of data rights. The exercise of data rights by these DRIs on behalf of data subjects can only be effectuated with the help of mandates. Data rights mandates are not expressly framed in the GDPR, and their delineation can be ambiguous. It is important to highlight that data rights are mandatable, and that this does not affect their inalienability in light of their fundamental rights nature. This article argues that contract law and fiduciary duties both have longstanding traditions and robust norms in many jurisdictions, all of which can be explored towards shaping the appropriate environment to regulate data rights mandates in particular. The article concludes that the key to unlocking the full potential of data rights mandates can already be found in existing civil law constructs, whose diversity reveals the need for solidifying the responsibility and accountability of mandated DRIs. The continued adherence to fundamental contract law principles will have to be complemented by a robust framework of institutional safeguards. The need for such safeguards stems from the vulnerable position of data subjects, both vis-à-vis DRIs and data controllers.

Beyond financial regulation of crypto-asset wallet software: In search of secondary liability

Computer Law & Security Review, vol. 49, art. no. 105829, 2023

Abstract

The blockchain space has evolved considerably since Bitcoin. Wallets are a crucial piece of software for interacting with blockchains and holding the private-public key pairs to distinct crypto-assets and securities. Wallet software can be offered by liable third parties (‘custodians’) who hold certain rights over assets and transactions. As parties subject to financial regulation, they must uphold Anti-Money Laundering and Combating the Financing of Terrorism (AML/CFT) standards by undertaking Know-Your-Customer (KYC) checks on users of their services. In juxtaposition, wallet software can also be issued without the involvement of a liable third party. As no KYC is performed and users have full ‘freedom to act’, such ‘non-custodial’ wallet software is popular in criminal undertakings. It is also required to interact with peer-to-peer applications and organisations running on blockchains, whose benefits are not the subject of this paper. To date, financial regulation fails to adequately address such wallet software because it presumes the existence of a registered, liable entity offering said software. As illustrated by the case of Tornado Cash, financial regulation fails to trace chains of secondary liability. Alas, the solution under consideration is systematic surveillance of all transactions. Against this backdrop, this paper sets forth an alternative approach rooted in copyright law. Concepts that pertain to secondary liability prove valuable in developing a flexible, principles-based approach to the regulation of non-custodial wallet software that accounts for both infringing and non-infringing uses.

blockchain, Crypto-assets, decentralised finance, non-custodial wallet, Regulation, secondary liability

Leg het me nog één keer uit: het recht op een uitleg na Uber en Ola. Annotatie bij Hof Amsterdam, 4 april 2023 [Explain it to me one more time: the right to an explanation after Uber and Ola. Annotation of the Amsterdam Court of Appeal, 4 April 2023]

Privacy & Informatie, iss. 3, pp. 114-116, 2023

Dealing with opinion power and media concentration in the platform era

LSE Blog, 2023

media concentration, Media law, Platforms
