More Than Justifications: An Analysis of Information Needs in Explanations and Motivations to Disable Personalization
Abstract
There is consensus that algorithmic news recommenders should be explainable to inform news readers of potential risks. However, debates continue over which information users need and which stakeholders should have access to this information. As the debate continues, researchers also call for more user control over algorithmic news recommender systems, for example, by turning off personalized recommendations. Despite this call, it is unclear to what extent news readers would use this feature. To add nuance to the discussion, we analyzed 586 responses to two open-ended questions: i) which information needs contribute to trustworthiness perceptions of news recommendations, and ii) whether people want the ability to turn off personalization. Our results indicate that most participants considered knowing the sources of news items important for trusting a recommender system. Additionally, more than half of the participants were inclined to disable personalization. The most common reasons to turn off personalization included concerns about bias or filter bubbles and a preference for consuming generalized news. These findings suggest that news readers have different information needs for explanations when interacting with an algorithmic news recommender and that many news readers prefer to disable personalized news recommendations.
Keywords: control, DSA, news recommenders, personalisation, trust