{"id":26515,"date":"2024-02-16T15:47:50","date_gmt":"2024-02-16T15:47:50","guid":{"rendered":"https:\/\/www.ivir.nl\/projects\/comenius\/online-disinformation\/video-script\/"},"modified":"2024-02-16T15:47:50","modified_gmt":"2024-02-16T15:47:50","slug":"video-script","status":"publish","type":"vo_project","link":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/","title":{"rendered":"Video Script"},"content":{"rendered":"\n<p class=\"has-text-align-center\"><strong>Combating disinformation \u2013 moving beyond takedowns and legal interventions<\/strong>&nbsp;<\/p>\n\n\n\n<p>By now, you probably have a good sense of how complex disinformation is: from how we define it, to how it\u2019s regulated, and most of all, what we can \u2013 and should \u2013 do about it. In the blog post, we looked at the broad range of categories that fall under the umbrella term \u2018disinformation\u2019. This includes who is responsible for creating and spreading it \u2013 whether it\u2019s government actors, private individuals, or automated bots \u2013 and the medium they use, for instance disinformation campaigns on Facebook and Twitter or through so-called \u2018peer-to-peer\u2019 networks, like WhatsApp. We also talked about the variation in the targets of disinformation, and in the harms caused to individuals and, sometimes, to broader society. Finally, the blog post touched on how disinformation impacts freedom of expression, and some key rights issues that must be borne in mind when it comes to how it\u2019s regulated.&nbsp;<\/p>\n\n\n\n<p>The infographic built on this regulation piece, outlining the differentiated responses to disinformation so far from the private sector, civil society, states, and regional organisations. 
These efforts range from self-regulation by platforms, through to imposed regulation by states and regional bodies like the European Union.&nbsp;<\/p>\n\n\n\n<p>In this video, we\u2019re going to round out this knowledge package by going beyond the legal efforts to rein in disinformation, and look at broader technological developments and educational campaigns to counter the harmful effects of disinformation (\u2026) without blocking or removing speech. The first is technological solutions that arise at the source: that is, solutions that aim to slow the spread or limit the reach of harmful speech, such as disinformation. The second is media and information literacy initiatives, which target the audience and seek to limit the harms and disruption caused by disinformation. We\u2019ll look at each of these solutions in turn.&nbsp;<\/p>\n\n\n\n<p><strong>In Technology We Trust?&nbsp;<\/strong><\/p>\n\n\n\n<p>A knee-jerk response to harmful speech \u2013 like hate speech and disinformation \u2013 is to simply remove it from platforms altogether, and in some cases, to suspend or ban the individuals responsible for sowing and spreading it. But this raises all kinds of concerns for the right to freedom of expression and other rights: who gets to decide what constitutes harmful speech? Does the assessment depend on the broader social and political context? What if the platforms get it wrong, or take it too far? For instance, during the pandemic, YouTube removed a video posted by a professor of medical and scientific research at Stanford University. In the video, the professor examined data relating to COVID-19, questioned the need for ongoing lockdowns and urged a more targeted response to protect the most vulnerable. 
YouTube removed the video on the basis that it contained \u2018medical misinformation\u2019 \u2013 a stance which has received significant criticism for removing legitimate medical information and critical commentary by an expert in the field.&nbsp;&nbsp;<\/p>\n\n\n\n<p>There are all kinds of issues with takedowns, whether they\u2019re done by platforms of their own volition, to avoid liability or political or societal backlash, or through state regulation which may capture legitimate speech \u2013 a criticism often made of Germany\u2019s NetzDG legislation, which we touched on in the infographic. Of course, takedowns may also be ordered pursuant to (legitimate) court orders, following a process which considers the right to freedom of expression and whether the restriction is proportionate in a democratic society. But even so, there are other ways to limit the reach of disinformation, without banning or restricting the speech itself.&nbsp; We\u2019re going to look at two examples.&nbsp;<\/p>\n\n\n\n<p><strong>(i) Adding Friction&nbsp;<\/strong><\/p>\n\n\n\n<p>The first is to add so-called \u2018friction\u2019 to slow the spread of disinformation.<sup>1<\/sup> This can take different forms, such as warnings and notifications before individuals can share certain content, or limiting the number of times an item can be shared by a single account.&nbsp;&nbsp;&nbsp;<\/p>\n\n\n\n<p>Adding warnings and notifications has been shown to limit the spread of harmful content. I\u2019ll give you an example. A researcher at Princeton University conducted a large-scale study of a Reddit community with 13 million subscribers, to see whether displaying community rules could reduce concerns about harassment and influence the behaviour of participants on the forum.&nbsp;The subreddit that was the focus of the study hosted discussions about peer-reviewed journal articles and live Q&amp;As with scientists \u2013 but it was also a hotbed of harassment. 
Commenters mocked Professor Stephen Hawking\u2019s medical condition during a Q&amp;A in 2015, while a discussion about research to do with obesity in women resulted in the removal of almost 1,500 out of 2,200 comments. The study assessed the impact of displaying community rules at the top of a discussion. <\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.ivir.nl\/publicaties\/download\/image-1024x180.jpeg\" alt=\"\" class=\"wp-image-16228\"\/><\/figure>\n\n\n\n<p>The notice explained the kinds of comments that weren\u2019t allowed \u2013 from memes to abusive comments \u2013 and advised that a community of 1,200 moderators encourages respectful discussion.&nbsp;<\/p>\n\n\n\n<p>The experiment found that displaying the community rules not only influenced who joined discussions, but also how they behaved. The study concluded that \u201cin online discussions, where unruly, harassing behavior is common, displaying community rules could reduce concerns about harassment that prevent people from joining while also influencing the behavior of those who do participate\u201d.<sup>2<\/sup>&nbsp;&nbsp;&nbsp;<\/p>\n\n\n\n<p>I know what you\u2019re probably thinking \u2013 this might not work so well with disinformation, right? Well, there are other ways to add friction which might be even more relevant for disinformation. In India, for example, when concerns were raised that WhatsApp was being used to circulate misinformation leading to real-world violence, Facebook (which owns WhatsApp) limited the number of chats to which news items could be forwarded.<sup>3<\/sup>&nbsp;&nbsp;<\/p>\n\n\n\n<p><strong>(ii) Dissuading Sharing&nbsp;<\/strong><\/p>\n\n\n\n<p>Short of limiting the number of people who can receive posts or messages, there are ways to dissuade individuals from sharing harmful content. One example was the #WeCounterHate project. 
AI would flag content as potential hate speech, and where that was confirmed by a human moderator, a message would be generated to alert the poster \u2013 and anyone who saw it \u2013 that the hate speech was being countered, urging individuals to think twice before retweeting it, and advising that every retweet would result in a donation to a non-profit fighting for equality.<sup>4<\/sup> The project\u2019s pilot phase showed a lot of promise: from reducing the retweet rate of hate speech by more than 65%, to one in five authors of hateful tweets deleting them.&nbsp;&nbsp;<\/p>\n\n\n\n<p><strong>In Education We Trust?<\/strong>&nbsp;<\/p>\n\n\n\n<p>One of the other ways to combat the harms of disinformation is through media and information literacy initiatives. This could include tips on how to identify so-called \u2018fake news\u2019, or explanations about how platforms use algorithms to promote certain content online and curate our news feeds, in part based on previous likes and dislikes. Unlike measures which remove content online, media and information literacy initiatives focus on better preparing users to discern what\u2019s true and what\u2019s false; what\u2019s authentic and what\u2019s not. This piece of the puzzle should not be overlooked: the internet is a bottomless pool of information, but it\u2019s where many of us now go to get our news and learn about the world around us. Ensuring that we are better consumers of news may be an important means of countering the harms of disinformation and restricting or limiting its spread. 
Moreover, <a href=\"https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/23311983.2022.2037229\" target=\"_blank\" rel=\"noreferrer noopener\">some studies<\/a> have shown that increased media and information literacy leads to decreased sharing of inaccurate stories or \u2018fake news\u2019.<sup>5<\/sup>&nbsp;&nbsp;<\/p>\n\n\n\n<p>A large-scale study looked at the effectiveness of digital media literacy interventions in the US and India, and found that \u201crelatively short, scalable interventions could be effective in fighting misinformation around the world\u201d.<sup>6<\/sup>&nbsp;<\/p>\n\n\n\n<p>Other surveys have shown more mixed results: for instance, studies in the US suggest that digital literacy measures may prove useful to identify people with inaccurate beliefs \u2013 or to help people identify misinformation \u2013 but are less effective when it comes to deterring individuals from spreading misinformation online.<sup>7<\/sup>&nbsp;&nbsp;<\/p>\n\n\n\n<p>Perhaps one reason for the different results in these studies is what is meant by \u201cdigital literacy\u201d. For instance, the studies that showed mixed results defined media literacy as \u201cfamiliarity with basic concepts related to the internet and social media\u201d \u2013 which is quite a low bar. By contrast, civil society organisations in many countries throughout Europe and beyond have launched digital literacy campaigns in schools, starting from primary school. These campaigns have as their goal \u201cactive, responsible citizens and voters\u201d.<sup>8<\/sup> Some media and information literacy initiatives specifically target disinformation. 
Take, for instance, \u2018Lie Detectors\u2019, an \u2018independent and award-winning Media Literacy organisation\u2019 which works to \u2018equip young people and teachers to tell apart fact from falsehood and opinion online\u2019.<sup>9<\/sup> Journalists deliver sessions explaining how journalism works and walking children through the basics of fact-checking and media bias. In Finland, students are tasked with creating a fake news campaign to better understand how and why disinformation is created and shared.<sup>10<\/sup>&nbsp;<\/p>\n\n\n\n<p><strong>Conclusion&nbsp;<\/strong><\/p>\n\n\n\n<p>Given the complexity of disinformation, there\u2019s unlikely to be a \u2018silver bullet\u2019 which can stop or limit its spread, or avoid all of the potential harms it causes. But as we\u2019ve shown, there are different tools in the toolkit beyond removals and takedowns which do not pose the same risks to long-standing rights like freedom of expression. These include adding friction into online discourse to make people think twice before sharing disinformation, and incorporating media and information literacy initiatives in schools and in broader public education efforts. 
After all, in technology and education we trust.&nbsp;&nbsp;<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<ol class=\"wp-block-list\" style=\"font-size:10px;list-style-type:1\">\n<li class=\"has-small-font-size\">See Molly Land and Rebecca Hamilton, \u2018Beyond Takedown: Expanding the Toolkit for Responding to Online Hate\u2019 in Propaganda, War Crimes Trials and International Law: From Cognition to Criminality 143 (Routledge 2020) at <br>https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3514234; NYU Stern Center for Business and Human Rights, \u2018Harmful Content: The Role of Internet Platform Companies in Fighting Terrorist Incitement and Politically Motivated Disinformation,\u2019 (New York 2017) at <a href=\"https:\/\/issuu.com\/nyusterncenterforbusinessandhumanri\/docs\/final.harmful_content._the_role_of_?e=31640827\/54951655\">https:\/\/issuu.com\/nyusterncenterforbusinessandhumanri\/docs\/final.harmful_content._the_role_of_?e=31640827\/54951655<\/a><\/li>\n\n\n\n<li class=\"has-small-font-size\">J. Nathan Matias, \u2018Preventing harassment and increasing group participation through social norms in 2,190 online science discussions,\u2019 Proceedings of the National Academy of Sciences of the United States of America (PNAS), 14 May 2019, at <a href=\"https:\/\/www.pnas.org\/content\/116\/20\/9785\">https:\/\/www.pnas.org\/content\/116\/20\/9785<\/a>. See also Land &amp; Hamilton, pp 9-10.<\/li>\n\n\n\n<li class=\"has-small-font-size\">Land &amp; Hamilton, p 10; V. 
Ananth, \u2018WhatsApp Races Against Time to Fix Fake News Mess Ahead of 2019 General Elections,\u2019 The Economic Times (24 July 2018) at <a href=\"https:\/\/economictimes.indiatimes.com\/tech\/internet\/whatsapp-races-against-time-to-fix-fake-news-mess-ahead-of-2019-general-elections\/articleshow\/65112280.cms\/\">https:\/\/economictimes.indiatimes.com\/tech\/internet\/whatsapp-races-against-time-to-fix-fake-news-mess-ahead-of-2019-general-elections\/articleshow\/65112280.cms\/<\/a>.<\/li>\n\n\n\n<li class=\"has-small-font-size\">See <a href=\"https:\/\/www.forbes.com\/sites\/afdhelaziz\/2019\/12\/25\/the-power-of-purpose-how-we-counter-hate-used-artificial-intelligence-to-battle-hate-speech-online\/\">https:\/\/www.forbes.com\/sites\/afdhelaziz\/2019\/12\/25\/the-power-of-purpose-how-we-counter-hate-used-artificial-intelligence-to-battle-hate-speech-online\/<\/a>; Land &amp; Hamilton, p. 13<\/li>\n\n\n\n<li class=\"has-small-font-size\"><a href=\"https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/23311983.2022.2037229\">https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/23311983.2022.2037229<\/a>; <a href=\"https:\/\/thehill.com\/changing-america\/enrichment\/education\/598795-media-literacy-is-desperately-needed-in-classrooms\/\">https:\/\/thehill.com\/changing-america\/enrichment\/education\/598795-media-literacy-is-desperately-needed-in-classrooms\/<\/a><\/li>\n\n\n\n<li class=\"has-small-font-size\">Andrew M Guess et al, \u201cA digital media literacy intervention increases discernment between mainstream and false news in the United States and India\u201d (PNAS, 7 July 2020) at <a href=\"https:\/\/www.pnas.org\/content\/117\/27\/15536\">https:\/\/www.pnas.org\/content\/117\/27\/15536<\/a>.<\/li>\n\n\n\n<li class=\"has-small-font-size\">Harvard Kennedy School, Misinformation Review: Digital literacy is associated with more discerning accuracy judgments but not sharing intentions (6 December 2021) at <a 
href=\"https:\/\/misinforeview.hks.harvard.edu\/article\/digital-literacy-is-associated-with-more-discerning-accuracy-judgments-but-not-sharing-intentions\/\">https:\/\/misinforeview.hks.harvard.edu\/article\/digital-literacy-is-associated-with-more-discerning-accuracy-judgments-but-not-sharing-intentions\/<\/a>; Sarah Brown, \u201cStudy: Digital literacy doesn\u2019t stop the spread of misinformation\u201d (MIT Management Sloan School, 5 January 2022) at <a href=\"https:\/\/mitsloan.mit.edu\/ideas-made-to-matter\/study-digital-literacy-doesnt-stop-spread-misinformation\">https:\/\/mitsloan.mit.edu\/ideas-made-to-matter\/study-digital-literacy-doesnt-stop-spread-misinformation<\/a>.<\/li>\n\n\n\n<li class=\"has-small-font-size\">The Guardian, \u201cHow Finland starts its fight against fake news in primary schools\u201d (29 January 2020) at <a href=\"https:\/\/www.theguardian.com\/world\/2020\/jan\/28\/fact-from-fiction-finlands-new-lessons-in-combating-fake-news\">https:\/\/www.theguardian.com\/world\/2020\/jan\/28\/fact-from-fiction-finlands-new-lessons-in-combating-fake-news<\/a>; &nbsp;MediaSmarts, Canada\u2019s Centre for Digital and Media Literacy at <a href=\"https:\/\/mediasmarts.ca\/\">https:\/\/mediasmarts.ca\/<\/a>. 
See also Land &amp; Hamilton, pp 10-11.<\/li>\n\n\n\n<li class=\"has-small-font-size\">p 23; <a href=\"https:\/\/www.etwinning.net\/downloads\/BOOK2021_eTwinning_INTERACTIF.pdf\">https:\/\/www.etwinning.net\/downloads\/BOOK2021_eTwinning_INTERACTIF.pdf<\/a><\/li>\n\n\n\n<li class=\"has-small-font-size\"><a href=\"https:\/\/www.theguardian.com\/world\/2020\/jan\/28\/fact-from-fiction-finlands-new-lessons-in-combating-fake-news\">https:\/\/www.theguardian.com\/world\/2020\/jan\/28\/fact-from-fiction-finlands-new-lessons-in-combating-fake-news<\/a><\/li>\n<\/ol>\n","protected":false},"parent":26509,"menu_order":2,"template":"","project_types":[],"class_list":["post-26515","vo_project","type-vo_project","status-publish","hentry","entry"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Video Script - IVIR<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/\" \/>\n<meta property=\"og:locale\" content=\"nl_NL\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Video Script - IVIR\" \/>\n<meta property=\"og:description\" content=\"Combating disinformation \u2013 moving beyond takedowns and legal interventions&nbsp; By now, you probably have a good sense of how complex disinformation is: from how we define it, to how it\u2019s regulated, and most of all, what we can \u2013 and should \u2013 do about it. 
In the blog post, we looked at the broad range&hellip; Continue reading Video Script\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/\" \/>\n<meta property=\"og:site_name\" content=\"IVIR\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:site\" content=\"@ivir_uva\" \/>\n<meta name=\"twitter:label1\" content=\"Geschatte leestijd\" \/>\n\t<meta name=\"twitter:data1\" content=\"10 minuten\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/\",\"url\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/\",\"name\":\"Video Script - IVIR\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.ivir.nl\\\/publicaties\\\/download\\\/image-1024x180.jpeg\",\"datePublished\":\"2024-02-16T15:47:50+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/#breadcrumb\"},\"inLanguage\":\"nl-NL\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"nl-NL\",\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/#primaryimage\",\"url\":\"https:\\\/\\\/www.ivir.nl\\\/publicaties\\\/down
load\\\/image-1024x180.jpeg\",\"contentUrl\":\"https:\\\/\\\/www.ivir.nl\\\/publicaties\\\/download\\\/image-1024x180.jpeg\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/video-script\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Projects\",\"item\":\"https:\\\/\\\/www.ivir.nl\\\/projects\\\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Comenius\",\"item\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"Online disinformation\",\"item\":\"https:\\\/\\\/www.ivir.nl\\\/nl\\\/projecten\\\/comenius\\\/online-disinformation\\\/\"},{\"@type\":\"ListItem\",\"position\":5,\"name\":\"Video Script\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/#website\",\"url\":\"https:\\\/\\\/www.ivir.nl\\\/\",\"name\":\"IVIR\",\"description\":\"Universiteit van Amsterdam\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/#organization\"},\"alternateName\":\"Institute for Information Law\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.ivir.nl\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"nl-NL\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/#organization\",\"name\":\"Institute for Information 
Law\",\"alternateName\":\"IVIR\",\"url\":\"https:\\\/\\\/www.ivir.nl\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"nl-NL\",\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.ivir.nl\\\/publicaties\\\/download\\\/IVIR_2023_LOGO-hoog.svg\",\"contentUrl\":\"https:\\\/\\\/www.ivir.nl\\\/publicaties\\\/download\\\/IVIR_2023_LOGO-hoog.svg\",\"width\":1,\"height\":1,\"caption\":\"Institute for Information Law\"},\"image\":{\"@id\":\"https:\\\/\\\/www.ivir.nl\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/ivir_uva\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/institute-for-information-law-ivir-\\\/\",\"https:\\\/\\\/bsky.app\\\/profile\\\/ivir-uva.bsky.social\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Video Script - IVIR","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/","og_locale":"nl_NL","og_type":"article","og_title":"Video Script - IVIR","og_description":"Combating disinformation \u2013 moving beyond takedowns and legal interventions&nbsp; By now, you probably have a good sense of how complex disinformation is: from how we define it, to how it\u2019s regulated, and most of all, what we can \u2013 and should \u2013 do about it. 
In the blog post, we looked at the broad range&hellip; Continue reading Video Script","og_url":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/","og_site_name":"IVIR","twitter_card":"summary_large_image","twitter_site":"@ivir_uva","twitter_misc":{"Geschatte leestijd":"10 minuten"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/","url":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/","name":"Video Script - IVIR","isPartOf":{"@id":"https:\/\/www.ivir.nl\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/#primaryimage"},"image":{"@id":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/#primaryimage"},"thumbnailUrl":"https:\/\/www.ivir.nl\/publicaties\/download\/image-1024x180.jpeg","datePublished":"2024-02-16T15:47:50+00:00","breadcrumb":{"@id":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/#breadcrumb"},"inLanguage":"nl-NL","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/"]}]},{"@type":"ImageObject","inLanguage":"nl-NL","@id":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/#primaryimage","url":"https:\/\/www.ivir.nl\/publicaties\/download\/image-1024x180.jpeg","contentUrl":"https:\/\/www.ivir.nl\/publicaties\/download\/image-1024x180.jpeg"},{"@type":"BreadcrumbList","@id":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/video-script\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.ivir.nl\/nl\/"},{"@type":"ListItem","position":2,"name":"Projects","item":"https:\/\/www.ivir.nl\/projects\/"},{"@type":"ListItem","position":3,"name":"C
omenius","item":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/"},{"@type":"ListItem","position":4,"name":"Online disinformation","item":"https:\/\/www.ivir.nl\/nl\/projecten\/comenius\/online-disinformation\/"},{"@type":"ListItem","position":5,"name":"Video Script"}]},{"@type":"WebSite","@id":"https:\/\/www.ivir.nl\/#website","url":"https:\/\/www.ivir.nl\/","name":"IVIR","description":"Universiteit van Amsterdam","publisher":{"@id":"https:\/\/www.ivir.nl\/#organization"},"alternateName":"Institute for Information Law","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.ivir.nl\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"nl-NL"},{"@type":"Organization","@id":"https:\/\/www.ivir.nl\/#organization","name":"Institute for Information Law","alternateName":"IVIR","url":"https:\/\/www.ivir.nl\/","logo":{"@type":"ImageObject","inLanguage":"nl-NL","@id":"https:\/\/www.ivir.nl\/#\/schema\/logo\/image\/","url":"https:\/\/www.ivir.nl\/publicaties\/download\/IVIR_2023_LOGO-hoog.svg","contentUrl":"https:\/\/www.ivir.nl\/publicaties\/download\/IVIR_2023_LOGO-hoog.svg","width":1,"height":1,"caption":"Institute for Information 
Law"},"image":{"@id":"https:\/\/www.ivir.nl\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/ivir_uva","https:\/\/www.linkedin.com\/company\/institute-for-information-law-ivir-\/","https:\/\/bsky.app\/profile\/ivir-uva.bsky.social"]}]}},"_links":{"self":[{"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/vo_project\/26515","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/vo_project"}],"about":[{"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/types\/vo_project"}],"version-history":[{"count":0,"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/vo_project\/26515\/revisions"}],"up":[{"embeddable":true,"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/vo_project\/26509"}],"wp:attachment":[{"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/media?parent=26515"}],"wp:term":[{"taxonomy":"project_types","embeddable":true,"href":"https:\/\/www.ivir.nl\/nl\/wp-json\/wp\/v2\/project_types?post=26515"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}