Seeing what others are seeing: Studies in the regulation of transparency for social media recommender systems

Abstract

This dissertation asks how the law can shed light on social media recommender systems: the algorithmic tools which platforms use to rank and curate online content. Recommender systems fulfil an important gatekeeping function in social media governance, but their actions are poorly understood. Legal reforms are now underway in EU law to impose transparency rules on social media recommenders, and the goal of this dissertation is to interrogate the accountability relations implied by this regulatory project. What kinds of information is the law demanding about social media recommender systems? Who is included in these new models of accountability, and who is excluded? This dissertation critiques a dominant paradigm in recent law and policy focused on algorithmic explanations. Building on insights from critical transparency studies and algorithm studies, it argues that disclosure regulation should move from algorithmic transparency toward platform observability: approaching recommenders not as discrete algorithmic artifacts but as complex sociotechnical systems shaped in important ways by their users and operators. Before any attempt to ‘open the black box’ of algorithmic machine learning, therefore, regulating for observability invites us to ask how recommenders find uptake in practice; to demand basic data on recommender inputs, outputs and interventions; to ask what is being recommended, rather than why. Several avenues for observability regulation are explored, including platform ad archives; notices for visibility restrictions (or ‘shadow bans’); and researcher APIs. Through solutions such as these, which render visible recommender outcomes, this dissertation outlines a vision for a more democratic media governance—one which supports informed and inclusive deliberation about, across and within social media’s personalised publics.

BibTeX

@phdthesis{nokey,
  title    = {Seeing what others are seeing: Studies in the regulation of transparency for social media recommender systems},
  author   = {Leerssen, P.},
  url      = {https://dare.uva.nl/search?identifier=18c6e9a0-1530-4e70-b9a6-35fb37873d13},
  year     = {2023},
  date     = {2023-04-21},
  abstract = {This dissertation asks how the law can shed light on social media recommender systems: the algorithmic tools which platforms use to rank and curate online content. Recommender systems fulfil an important gatekeeping function in social media governance, but their actions are poorly understood. Legal reforms are now underway in EU law to impose transparency rules on social media recommenders, and the goal of this dissertation is to interrogate the accountability relations implied by this regulatory project. What kinds of information is the law demanding about social media recommender systems? Who is included in these new models of accountability, and who is excluded? This dissertation critiques a dominant paradigm in recent law and policy focused on algorithmic explanations. Building on insights from critical transparency studies and algorithm studies, it argues that disclosure regulation should move from algorithmic transparency toward platform observability: approaching recommenders not as discrete algorithmic artifacts but as complex sociotechnical systems shaped in important ways by their users and operators. Before any attempt to ‘open the black box’ of algorithmic machine learning, therefore, regulating for observability invites us to ask how recommenders find uptake in practice; to demand basic data on recommender inputs, outputs and interventions; to ask what is being recommended, rather than why. Several avenues for observability regulation are explored, including platform ad archives; notices for visibility restrictions (or ‘shadow bans’); and researcher APIs. Through solutions such as these, which render visible recommender outcomes, this dissertation outlines a vision for a more democratic media governance—one which supports informed and inclusive deliberation about, across and within social media’s personalised publics.},
}