Jaron Harambam (1983) works as a postdoctoral researcher at the Institute for Information Law of the University of Amsterdam, where he is part of the multi-disciplinary Fair News Project, in which he studies the role of algorithms in news provision. He defended his PhD dissertation, titled “The Truth Is Out There” – Conspiracy Culture in an Age of Epistemic Instability, in October 2017 (cum laude, highest distinction) at the Rotterdam Centre for Cultural Sociology at Erasmus University Rotterdam. For this research project, he carried out extensive ethnographic fieldwork in the Dutch conspiracy milieu to study what contemporary conspiracy theories are about, who actually follows them, and what these people do with these ideas in their everyday lives. The overall goal is to arrive at an empirically rich and theoretically elaborate understanding of the contemporary popularity of conspiracy theories. He has published on this research project in Cultural Sociology (2016) and in Public Understanding of Science (2015). His broader sociological interests lie at the intersections of science, popular culture, (new) media, and religion. Although most experienced with ethnographic research methods, he is knowledgeable about quantitative research as well. Earlier, he studied the commercialization of virtual game worlds, which allegedly breaks down the “magic circle” of play; this work was published in the European Journal of Cultural Studies (2011). He also conducted both quantitative and qualitative research on governmental efforts to revive rural social life through new digital technologies, published in Information, Communication & Society (2013). He is currently an editor of the Dutch peer-reviewed journal Sociologie, for which he co-edited a special issue on actor-network theory (2014), and is a founding member of COMPACT, the European COST network of scholars working on conspiracy theories.
Bountouridis, D., Harambam, J., Makhortykh, M., van Hoboken, J.
In: RecSys'19: Proceedings of the 13th ACM Conference on Recommender Systems, pp. 69-77, 2019.
Recommender systems (RS) are on the rise in many domains. While they offer great promise, they also raise concerns: lack of transparency, reduction of diversity, and little to no user control. In this paper, we align with the normative turn in computer science, which scrutinizes the ethical and societal implications of RS. We focus and elaborate on the concept of user control because it mitigates multiple problems at once. Taking the news industry as our domain, we conducted four focus groups, or moderated think-aloud sessions, with Dutch news readers (N=21) to systematically study how people evaluate different control mechanisms (at the input, process, and output phase) in a News Recommender Prototype (NRP). While these mechanisms are sometimes met with distrust about the actual control they offer, we found that an intelligible user profile (including reading history and flexible preference settings), coupled with possibilities to influence the recommendation algorithms, is highly valued, especially when these control mechanisms can be operated in relation to achieving personal goals. By bringing (future) users' perspectives to the fore, this paper contributes to a richer understanding of why and how to design for user control in recommender systems.
In: Hoe nu verder met de vaccinatietwijfel?: Tien adviezen aan Staatssecretaris Paul Blokhuis, red. Roland Pierik, Universiteit van Amsterdam, pp. 39-42, 2019, ISBN 9789090317144.
Harambam, J., Helberger, N., van Hoboken, J.
In: Philosophical Transactions of the Royal Society A, 376 (2135), pp. 1-21, 2018, ISSN: 1364-503X.
The deployment of various forms of AI, most notably machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. As in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information that news recommendation algorithms provide, by thinking about and enabling possibilities to express voice. After differentiating four ideal-typical modalities of expressing voice (alternation, awareness, adjustment, and obfuscation), which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate their news provision.
In: Sociologie, 13 (1), pp. 73-92, 2018, ISSN: 1875-7138.
The Truth dominates many public discussions today. Conventional truths from established epistemic authorities about all sorts of issues, from climate change to terrorist attacks, are increasingly challenged by ordinary citizens and presidents alike. Many have therefore proclaimed that we have entered a post-truth era: a world in which objective facts are no longer relevant. Media and politics speak in alarmist discourse about how fake news, conspiracy theories, and alternative facts threaten democratic societies by destabilizing the Truth ‐ a clear sign of a moral panic. In this essay, I first explore what sociological changes have led to (so much commotion about) the alleged demise of the Truth. In contrast to the idea that we have moved beyond it, I argue that we are amidst public battles about the Truth: at stake is who gets to decide what counts as truth, and why. I then discuss and criticize the dominant counter-reaction (re-establishing the idea of one objective and irrefutable truth), which I see as an unsuccessful de-politicization strategy. Drawing on research and experiments with epistemic democracy in the field of science studies, I end with a more effective and democratic alternative for dealing with knowledge in the complex information landscape of today.
In: Trouw, 2018.