Information Manipulations Around Covid-19: France Under Attack


Authors: Iris Boyer and Théophile Lenoir

Iris Boyer is Deputy Head for Technology, Communications and Education at ISD, mainstreaming this global think tank’s efforts against extremism through scaled partnerships with tech companies, policymakers and civil society. She also leads ISD’s regional development in France and specialises in digital policy.

Théophile Lenoir develops Institut Montaigne’s research program on digital issues. The program examines the challenges associated with the collection and processing of data in European societies, in areas ranging from social cohesion to information security. His own work focuses on communication technologies and the transformation of the public sphere in France, and seeks to explore new models of governance for digital technologies.

In times of pandemic, what harm can social media posts actually cause? Covid-19 suggests they can do quite a lot. From messages claiming that bleach can cure Covid-19 to ones highlighting the weaknesses of the West in dealing with the crisis, some of the information circulating on social media has been problematic for governments and health organizations.

Whilst platforms need to do more to deal with illegal and dangerous information and to reduce amplification mechanisms, public authorities also need to become more aware of the interest online communities take in hostile narratives. This study by Institut Montaigne and the Institute for Strategic Dialogue (ISD), based on data collected by Linkfluence, shows that influencers gathered around both far-right and far-left themes are interested in anti-Europe narratives, but that only the former are also drawn to pro-authoritarian narratives. Successful influence messages exploit these interests to circulate within specific communities.

Narratives and disinformation do not circulate evenly online

Our mapping shows that the sharing of information and disinformation is always political: information, whether true or false, only circulates in communities that are interested in it. Consequently, actors can take advantage of events such as the Covid-19 crisis to craft messages that meet these interests and serve political goals. Rather than a creator of disinformation, the pandemic has been a catalyst for the creation of information, both true and false, on specific topics with political aims.
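The study does not detail the methodology behind this mapping (the underlying data comes from Linkfluence). As a purely illustrative sketch of the general idea — checking whether a given narrative circulates mainly inside specific interest communities of a sharing network — one could cluster the network by modularity and count shares per community. Everything below (the graph, the accounts, the share records) is hypothetical.

```python
# Illustrative sketch only: this is NOT the study's actual methodology.
# It shows, in general terms, how one might test whether a given link
# circulates mostly within specific communities of a sharing network.
# All data and names are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities
from collections import Counter

# Hypothetical sharing edges between accounts (e.g. retweets or reposts).
edges = [("a", "b"), ("a", "c"), ("b", "c"),
         ("d", "e"), ("e", "f"), ("d", "f"),
         ("c", "d")]
G = nx.Graph()
G.add_edges_from(edges)

# Detect communities by modularity, a common proxy for "interest clusters".
communities = list(greedy_modularity_communities(G))
membership = {node: i for i, com in enumerate(communities) for node in com}

# Hypothetical record of which accounts shared a given narrative or URL.
shares_of_url = ["a", "b", "c", "c", "d"]

# Count shares per community: a heavily skewed distribution suggests the
# narrative circulates mainly inside communities already interested in it.
per_community = Counter(membership[account] for account in shares_of_url)
print(per_community)
```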

The influencers that gather around far-right themes are the most receptive to messages that both promote authoritarian regimes and highlight Europe’s weaknesses. These influencers were also the most interested in most of the pieces of disinformation and conspiracy theories we looked at (concerning 5G, or the claim that Agnès Buzyn, former Health Minister of France, and her husband Yves Lévy, former chair of the national health research institute Inserm, plotted to wreck Professor Raoult’s work on chloroquine).

The influencers that gather around themes associated with the far-left are relatively immune to overtly pro-authoritarian narratives, but are more interested in reading about Europe’s weaknesses. They are also more drawn to pieces of disinformation and conspiracy theories that concern corporations and carry an economic dimension.

Finally, our study shows that the influencers that gather around technology and health issues are relatively impermeable to narratives that do not concern them directly. The messages that interest these influencers are tailored to their themes (for example: non-medical cures, 5G…).

France was relatively immune to international conspiracy theories during the lockdown

Messages such as the ones accusing Bill Gates of having created the coronavirus were largely absent from the discussions we looked at in France, though they were successful in the United States.

Our study suggests that, as of today, language remains a barrier to foreign disinformation. This is consistent with previous findings on the “#Macron leaks” operation, which showed that extremist groups in Russia and the United States posted English-language content in French discussion groups, hindering its circulation. Overall, even though the European Union brought to light information manipulation operations coming from China during the Covid-19 crisis, we found fewer French-language messages promoting China than messages promoting Russia. It is possible that China-related actors still operate mainly in English.

Fact-checking: a challenge for traditional media

Fake cures and dangerous information about the virus (including, for example, the idea that bleach is a remedy) were largely absent from Twitter and Facebook. This information may instead have spread mostly on messaging services such as WhatsApp or Telegram. This may be a particularity of the French context, as ISD has identified significant sharing of fake cures in English on social media, notably the harmful assertion that colloidal silver can help “resolve” coronavirus.

The absence of such messages on social media raises the difficult question of the media’s role in covering disinformation. By warning against the circulation of a piece of disinformation, media organizations can play a significant role in making it visible. The 5G misinformation illustrates this: an April 17 interview of French virologist Luc Montagnier by a French media outlet, during which Mr. Montagnier stated that 5G frequencies may have contributed to the spread of the virus, generated active debate online.

In a world of online influence and manipulations, governments, researchers and platforms are part of the solution

Most initiatives by the French government and by social media platforms have focused on sharing reliable information about the virus. This is a necessary step, but it is not a sufficient response to foreign interference. In parallel to encouraging platforms to take more action and remove illegal and dangerous content, public authorities also need to become more aware of the interest online communities take in hostile narratives.

Three dimensions are therefore crucial to ensuring that democratic societies develop their understanding of the challenges ahead, and learn to live in a world of online influence and manipulations:

  1. Governments need to recognize this challenge as a priority, by continuing to make public health-related information reliable, transparent and compelling. They should also create new regulations that open channels of communication between platforms and governments, incentivize companies to share information, and require more transparency from platforms, including through audit mechanisms.
     
  2. Researchers need access to more data from online platforms in order to understand the fragilities of public debate and the extent to which they are exploited by foreign actors. Real-time reporting on disinformation and polarization campaigns is essential to help governments understand the scope of the challenge and to make communication infrastructures a priority in state-level negotiations.
     
  3. Online intermediary platforms need to take responsibility for prioritizing authoritative information and sources, de-platforming malign ones, and down-ranking and clearly labeling misinformation. New regulation should lead them to undertake robust and transparent research, and to design privacy-compliant products and moderation systems for information-sharing and communication ecosystems. This should include private groups and messaging apps when they are involved in the spread of potentially harmful mis- and disinformation.
