Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?

Content moderation, Digital services act, DSA, frontpage, Fundamental rights

Bibtex

@online{Quintais2021f, title = {Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://dsa-observatory.eu/2021/05/31/article-12-dsa-will-platforms-be-required-to-apply-eu-fundamental-rights-in-content-moderation-decisions/}, year = {2021}, date = {2021-05-31}, keywords = {Content moderation, Digital services act, DSA, frontpage, Fundamental rights}, }

The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?

Quintais, J. & Schwemer, S.
European Journal of Risk Regulation, vol. 13, iss. 2, pp. 191-217, 2022

Abstract

On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the E-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.

Art. 17 CDSM Directive, Content moderation, Copyright, Digital services act, frontpage, Online platforms

Bibtex

@article{Quintais2021e, title = {The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?}, author = {Quintais, J. and Schwemer, S.}, url = {https://www.ivir.nl/ejrr_2022/}, doi = {10.1017/err.2022.1}, year = {2022}, date = {2022-03-14}, journal = {European Journal of Risk Regulation}, volume = {13}, issue = {2}, pages = {191-217}, abstract = {On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the E-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.}, keywords = {Art. 17 CDSM Directive, Content moderation, Copyright, Digital services act, frontpage, Online platforms}, }

Intermediary Liability and Trade Mark Infringement – Proliferation of Filter Obligations in Civil Law Jurisdictions?

Senftleben, M.
pp. 381-403, 2020

Abstract

The erosion of the safe harbour for hosting in the EU Directive on Copyright in the Digital Single Market (CDSM Directive) leads to a remarkable climate change in the field of EU copyright law and the civil law jurisdictions of continental EU Member States. Inevitably, it raises the question of potential repercussions on the safe harbour for hosting and filtering standards in trademark cases. Even though online marketplaces are explicitly exempted from the new copyright rules and the CDSM Directive is not intended to neutralize the safe harbour for hosting in trademark cases, the adoption of a more restrictive approach in copyright law may quicken the appetite of trademark proprietors for similar measures in trademark law. The extension of the new copyright approach to trademark cases, however, is unlikely to yield satisfactory results. Due to the different conceptual contours of trademark rights, a system mimicking the filtering obligations following from the CDSM Directive would give trademark proprietors excessive control over the use of their trademarks in the digital environment. Such an overbroad system of automated, algorithmic filtering would encroach upon the fundamental guarantee of freedom of expression and freedom of competition. It is likely to have a chilling effect on legitimate descriptive use of trademarks, comparative advertising, advertising by resellers, information about alternative offers in the marketplace, and use criticizing or commenting upon trademarked products. As a result, consumers would receive less diverse information on goods and services and the free movement of goods and services in the internal market would be curtailed. The reliability of the internet as an independent source of trademark-related information would be put at risk. The analysis, thus, leads to the insight that a proliferation of the new filtering obligations in copyright law is undesirable and should be avoided.

algorithmic enforcement, confusion, Content moderation, descriptive use, dilution, exhaustion of trademark rights, filtering obligations, free movement of goods and services, freedom of commercial expression, freedom of competition, frontpage, market transparency, Merkenrecht, parallel imports, platform economy

Bibtex

@incollection{Senftleben2020g, title = {Intermediary Liability and Trade Mark Infringement – Proliferation of Filter Obligations in Civil Law Jurisdictions?}, author = {Senftleben, M.}, url = {https://www.ivir.nl/publicaties/download/Intermediary_Liability_and_Trade_Mark_Infringement.pdf https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3736919 https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780198837138.001.0001/oxfordhb-9780198837138}, year = {2020}, date = {2020-11-26}, pages = {381-403}, abstract = {The erosion of the safe harbour for hosting in the EU Directive on Copyright in the Digital Single Market (CDSM Directive) leads to a remarkable climate change in the field of EU copyright law and the civil law jurisdictions of continental EU Member States. Inevitably, it raises the question of potential repercussions on the safe harbour for hosting and filtering standards in trademark cases. Even though online marketplaces are explicitly exempted from the new copyright rules and the CDSM Directive is not intended to neutralize the safe harbour for hosting in trademark cases, the adoption of a more restrictive approach in copyright law may quicken the appetite of trademark proprietors for similar measures in trademark law. The extension of the new copyright approach to trademark cases, however, is unlikely to yield satisfactory results. Due to the different conceptual contours of trademark rights, a system mimicking the filtering obligations following from the CDSM Directive would give trademark proprietors excessive control over the use of their trademarks in the digital environment. Such an overbroad system of automated, algorithmic filtering would encroach upon the fundamental guarantee of freedom of expression and freedom of competition. It is likely to have a chilling effect on legitimate descriptive use of trademarks, comparative advertising, advertising by resellers, information about alternative offers in the marketplace, and use criticizing or commenting upon trademarked products. As a result, consumers would receive less diverse information on goods and services and the free movement of goods and services in the internal market would be curtailed. The reliability of the internet as an independent source of trademark-related information would be put at risk. The analysis, thus, leads to the insight that a proliferation of the new filtering obligations in copyright law is undesirable and should be avoided.}, keywords = {algorithmic enforcement, confusion, Content moderation, descriptive use, dilution, exhaustion of trademark rights, filtering obligations, free movement of goods and services, freedom of commercial expression, freedom of competition, frontpage, market transparency, Merkenrecht, parallel imports, platform economy}, }

The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market

Senftleben, M. & Angelopoulos, C.
2020

Abstract

EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). 
With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.

algorithmic enforcement, Auteursrecht, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, frontpage, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content

Bibtex

@techreport{Senftleben2020e, title = {The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market}, author = {Senftleben, M. and Angelopoulos, C.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717022}, year = {2020}, date = {2020-10-29}, abstract = {EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.}, keywords = {algorithmic enforcement, Auteursrecht, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, frontpage, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content}, }

Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society

Metzger, A., Senftleben, M., Derclaye, E., Dreier, T., Geiger, C., Griffiths, J., Hilty, R., Hugenholtz, P., Riis, T., Rognstad, O.A., Strowel, A.M., Synodinou, T. & Xalabarder, R.
2020

Abstract

The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It also discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – an impact factor that is often overlooked in the debate.

algorithmic enforcement, applicable law, collective copyright management, content hosting, Content moderation, copyright contract law, EU copyright law, filtering mechanisms, Freedom of expression, Licensing, notice-and-takedown, private international law, transformative use, user-generated content

Bibtex

@article{Metzger2020, title = {Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society}, author = {Metzger, A. and Senftleben, M. and Derclaye, E. and Dreier, T. and Geiger, C. and Griffiths, J. and Hilty, R. and Hugenholtz, P. and Riis, T. and Rognstad, O.A. and Strowel, A.M. and Synodinou, T. and Xalabarder, R.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3589323}, year = {2020}, date = {2020-05-07}, abstract = {The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It also discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. 
Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – an impact factor that is often overlooked in the debate.}, keywords = {algorithmic enforcement, applicable law, collective copyright management, content hosting, Content moderation, copyright contract law, EU copyright law, filtering mechanisms, Freedom of expression, Licensing, notice-and-takedown, private international law, transformative use, user-generated content}, }