More than 60 human rights and media organisations have written an open letter to all Members of the European Parliament urging them to reject the proposed Terrorist Content Online Regulation, because its provisions for the automatic removal of online content seriously endanger free speech.
MEPs will discuss the issue on 28 April, during the European Parliament’s next plenary session.
Dear Member of the European Parliament,
We are writing to you to share our concerns about the European Union’s proposal for a Regulation on addressing the dissemination of terrorist content online. We urge the Members of the European Parliament to vote against the adoption of the proposal.
Since 2018, we, the undersigned human rights organisations, journalists associations and researchers, have been warning against the serious threats to fundamental rights and freedoms in this legislative proposal, especially for freedom of expression and opinion, freedom to access information, right to privacy and the rule of law.
Thanks to the work of the European Parliament’s negotiating team, an extended debate and the involvement of civil society, a number of problematic issues in the proposal were addressed during the trilogues between the European Parliament and the Council of the European Union.
However, despite the outcome of the last trilogue negotiation, the final text of the proposed Regulation still contains dangerous measures that will ultimately weaken the protection of fundamental rights in the EU. It also has the potential to set a dangerous precedent for online content regulation worldwide.
The proposed Regulation is headed for a final vote in the plenary of the European Parliament in April 2021. We urge the Members of the European Parliament to vote against the adoption of the proposal for the following reasons:
1. The proposal continues to incentivise online platforms to use automated content moderation tools, such as upload filters
The short timeframe that the proposal imposes on providers to remove content considered terrorist strongly incentivises platforms to deploy automated content moderation tools, such as upload filters, in order to delete such content. Current content moderation practices are characterised by a profound lack of transparency and by the inaccuracy of automated decision making. Because it is impossible for automated tools to consistently differentiate activism, counter-speech and satire about terrorism from terrorist content itself, increased automation will ultimately result in the removal of legal content, such as news reporting and content about the discriminatory treatment of minorities and underrepresented groups. Platforms already remove massive quantities of content documenting violence in war zones, uploaded by survivors, civilians or journalists, as tracked by the Syrian and Yemeni Archives, which can hinder accountability efforts. The proposed Regulation, which lacks safeguards to prevent such practices when automated tools are in use, will only reinforce that trend. Upload filters may additionally have an adverse effect on the Internet, especially with regard to its open architecture and interoperable building blocks.
2. There is a severe lack of independent judicial oversight
The proposal calls on Member States to designate at their discretion “national competent authorities” vested with the powers to implement the Regulation’s measures, notably the issuance of removal orders. While the proposal states that these authorities must be objective, non-discriminatory and rights-respecting, we nevertheless believe that only courts, or independent administrative authorities subject to judicial review, should have a mandate to issue removal orders. The lack of judicial oversight poses a severe risk to freedom of expression, assembly, association and religion, and to access to information. It also subverts the Charter of Fundamental Rights, which protects the freedom to receive and impart information: lawful expression is protected and may only be limited subsequently, by a court and upon legitimate request.
3. Member States will issue cross-border removal orders without any checks
According to the outcome of the trilogue, any competent authority will have the power to order, within one hour, the deletion of online content hosted anywhere in the EU. This means that one Member State can extend its enforcement jurisdiction beyond its territory without prior judicial review and without consideration for the rights of individuals in the affected jurisdictions. In light of the serious threats to the rule of law in certain EU Member States, the mutual trust that underpins European judicial cooperation might be seriously undermined. Furthermore, the procedure of minimal notification to, and verification by, the affected state foreseen in the current text does not contain sufficient safeguards against state overreach and abuse of power, and it will not resolve disagreements among Member States over what constitutes terrorism, irony, art or journalistic reporting.
We urge the European Parliament to reject this proposal, as it poses serious threats to freedom of expression and opinion, freedom to access information, the right to privacy, and the rule of law. Moreover, it will set a dangerous precedent for any future EU legislation regulating the digital ecosystem by distorting the law enforcement framework under the pretext of strengthening the Digital Single Market. Therefore, the proposed Regulation on addressing the dissemination of terrorist content online as it stands now has no place in EU law.
Access Now, International
ARTICLE 19, International
Asociația pentru Tehnologie și Internet (ApTI), Romania
Association of European Journalists (AEJ), Belgium
Bits of Freedom, the Netherlands
Bulgarian Helsinki Committee, Bulgaria
Centre for Democracy & Technology (CDT), International
Chaos Computer Club (CCC), Germany
Civil Liberties Union for Europe (Liberties), International
Comité de Vigilance en matière de Lutte contre le Terrorisme (Comité T), Belgium
Committee to Protect Journalists (CPJ), International
Digitale Gesellschaft, Germany
Digital Rights Ireland, Ireland
Državljan D, Slovenia
Electronic Frontier Finland (Effi), Finland
Electronic Frontier Foundation (EFF), USA
Elektronisk Forpost Norge (EFN), Norway
Entropia e.V., Germany
European Digital Rights (EDRi), International
European Federation of Journalists (EFJ), International
Fitug e.V., Germany
Föreningen för digitala fri- och rättigheter (DFRI), Sweden
Global Forum for Media Development (GFMD), International
Global Voices, International
Helsinki Foundation for Human Rights, Poland
Hermes Center, Italy
Homo Digitalis, Greece
Human Rights Monitoring Institute, Lithuania
Human Rights Watch, International
International Commission of Jurists, International
Internationale Liga für Menschenrechte, Germany
International Federation for Human Rights (FIDH), International
Internet Governance Project, School of Public Policy at the Georgia Institute of Technology
Internet Society, International
IT Political Association of Denmark (IT-Pol), Denmark
Irish Council for Civil Liberties, Ireland
La Quadrature Du Net (LQDN), France
Latvian Human Rights Committee, Latvia
Liga voor de Rechten van de Mens, the Netherlands
Liga voor Mensenrechten, Belgium
Ligue des Droits de l’Homme, France
Ligue des Droits Humains, Belgium
Open Technology Institute, USA
Panoptykon Foundation, Poland
Ranking Digital Rights, USA
Reporters Without Borders (RSF), International
Rights International Spain, Spain
Statewatch, the United Kingdom
Vrijschrift.org, The Netherlands
Wikimedia Deutschland, Germany
Wikimedia France, France
7amleh – The Arab Center for the Advancement of Social Media, Palestine