Fighting fake news: we are still looking for easy wins and clear enemies – voices from the EC conference on countering online disinformation
More than 70% of EU internet users are concerned about online disinformation ahead of the European Parliament elections in May. Moreover, issues such as a general lack of trust in the media and the limited data made available to academia and fact-checkers are hindering the EU’s fight against disinformation. Among other measures, the European Union has responded to these problems by implementing a Code of Practice (CoP) against disinformation, whose first results were presented at a conference on Tuesday.
“The fact that we have gone over an hour shows how much interest and how much to discuss there is,” said Claire Bury, Deputy Director General of the European Commission, in her closing speech at Tuesday’s Commission conference ‘Countering online disinformation – towards a more transparent, trustworthy and accountable digital media ecosystem.’ And indeed, there was a lot to discuss. The event took stock of the achievements made in tackling disinformation in Europe and looked at what lies ahead, especially in view of ensuring free and fair Parliament elections. Some answers to the question of how to tackle disinformation were given, and some progress was reported. Yet many questions were left unanswered.
The Code of Practice – in practice
In September 2018, the European Commission introduced a Code of Practice against Disinformation. Signatories to the code include leading online platforms, social networks and advertisers. On January 29, a first report on the results of this self-regulatory approach was published. The EU’s Commissioner for Digital Economy and Society, Mariya Gabriel, summarized its findings as follows: “we have a lot to do, platforms have to step up their efforts, but we have laid the basis. Changes are visible in all the right directions.”
The conference saw presentations by various signatories of the code. After mentioning that the past years have been “quite a period for us,” the Head of Facebook’s Brussels office, Thomas Myrup Kristensen, argued that the company has stepped up its work to close fake accounts and dubious sites. And while Jon Steinberg from Google acknowledged the company’s wider responsibility as a signatory of the code, he also said that Google is only a small part of the picture and highlighted the role of NGOs, the media sector, and politicians in countering disinformation – for example, by not further politicizing the term itself.
Not every participant in the panel was as enthusiastic about the CoP. There are no immediate consequences for platforms that fail to comply with the code, getting citizens involved in the process poses a challenge, and the data made available to third parties such as fact-checkers or media educators remains very limited.
The need for a better understanding
Dr. Ľuboš Kukliš, chairperson of the European Regulators Group for Audiovisual Media Services, highlighted that “to measure success or failure of the code is quite hard since there are few things that can be objectively measured.” Member of the European Parliament Tanja Fajon added: “the code is a beginning, but the key to moving on is to understand disinformation. Once disinformation has reached people, a lot of emotions are created, and even if the content is removed, it is extremely hard to correct the damage.”
Christoph Schott from Avaaz, a global campaigning organization, proposed a different solution to this problem. Instead of deleting misinformation, it could be corrected and presented to users as false. A member of the European Parliament’s cultural committee who was among the observers supported this proposal: “it is about more information, not less. And about giving everyone the chance to access it.”
In his keynote speech, Rasmus Nielsen, Director of Research at the Reuters Institute for the Study of Journalism, also highlighted the need for a broader understanding of the media environment and of people’s perceptions of the problem. Thanks to a phenomenon called ‘automated serendipity’, using search engines for news leads to more diverse news consumption and brings users to sources they would never have found otherwise. However, Nielsen argued that the drawback is that fewer than half of users can recall where they got their information from. Moreover, it is also important to consider that only 51% of people who use mainstream media actually trust it, while the figure for social media is even lower (23%).
In EU’s fight against disinfo, we look for solutions that doesn’t cost anything+doesn’t piss anyone off – have yet to see significant investment in supporting independent news or recognition powerful people who lie is key to these problems #EUtackledisinfo https://t.co/NeBR1KnViy
— Rasmus Kleis Nielsen (@rasmus_kleis) 29 January 2019
Siada El Ramly, Director General of EDIMA, a trade association representing online platforms, reminded the audience: “the code is not expected to fix fake news as a whole, but it transmits a clear commitment by the platforms to take action. We have to give it a fair chance to see how it works out after 12 months.”
Voices from the academic realm
Representatives from academia agreed that research can do a lot to counter disinformation, but only if a few changes are implemented. First, and most urgently, platforms need to make more data available to researchers. Moreover, Luciano Morganti, Professor at the Vrije Universiteit Brussel, mentioned the problem of very slow responses to the rapid changes in the digital environment. His proposal: an emergency research response team as a framework for academic research on online disinformation. Morganti also highlighted the need to look at future problems: “research’s role is to ask ‘what if?’ questions. For example, what happens if platforms don’t comply with the code? There is no plan B so far.”
Gianni Riotta, Director of the LUISS (Free International University for Social Studies “Guido Carli”) datalab, took a different angle. For him, it was important to make clear that fake news is not a fact like gravity is for a physicist. It is not natural, but instead “developed by industry to combat democracy. We (debunkers) fail more often than we succeed, but in the end, the goal is to give trust to people that don’t trust us and to save democracy.”
As this overview shows, many actions have been taken by diverse actors to address the problem of disinformation, and many questions remain to be tackled. “Now, there is a need to coordinate all these actions and bring them together. The Commission has to work on closely monitoring the implementation of the CoP and make sure that necessary data is made available as quickly as possible,” concluded Bury. A full stream of the conference is available via this link.
by Alena Bieling