Facebook debates whether to remove fake news about COVID-19

Facebook’s moderation of misinformation is regularly criticized. When it comes to COVID-19, however, the social network has been very responsive, and it still applies the strict policy it put in place at the start of the pandemic.

Today, however, Meta is questioning whether, given how the health situation has evolved, it should keep applying the same policy on Facebook.

Fake news on COVID-19: Meta consults its supervisory board

In a recent press release, Meta’s Oversight Board indicated that the company had referred a question to it. Mark Zuckerberg’s company is asking the board whether it should continue to remove disinformation about COVID-19, or whether a less restrictive approach would better align with its values and its human rights responsibilities.

As a reminder, the Oversight Board is an independent entity created by Meta. It is often described as the equivalent of a Supreme Court for moderation questions.

The board can be called upon by users who are dissatisfied with Meta’s moderation decisions, and also by the company itself when it wants recommendations on its moderation policy.

According to Meta’s Oversight Board, the company’s general approach to false information “primarily relies on contextualizing potentially false claims and narrowing their scope, rather than removing content.”

Indeed, the company believes that “Because it is difficult to precisely define what constitutes misinformation on a range of topics, removing misinformation on a large scale risks unduly interfering with user expression.”

With COVID-19, however, Meta took a different approach as early as January 2020. Instead of relying on “contextualization,” the company removed content outright.

A strict policy justified by the urgency and gravity of the situation

Why? According to the Oversight Board, which cites Meta’s request, this decision was made because “Outside health experts have told us that misinformation about COVID-19, such as false claims about cures, masking, social distancing, and the transmissibility of the virus, could contribute to the risk of imminent physical harm.”

If Meta is now asking whether it should keep applying the same measures, it is because the situation has changed. First, according to its query to the Oversight Board, authoritative sources of information were lacking at the start of the pandemic. This created a vacuum that encouraged the spread of misinformation. Today, however, people have better access to reliable information.

On the health front, Meta also believes that the development of vaccines and treatments, along with the evolution of variants, has made COVID-19 less deadly today. The company further noted that “Public health authorities are actively assessing whether COVID-19 has progressed to a less severe condition.”

Will Meta’s moderation policy change?

Nevertheless, Facebook’s parent company also recognizes that the pandemic will evolve differently from country to country, depending on factors such as vaccination rates, the resources of health systems, and citizens’ trust in their governments.

For now, it is unclear what will change in Meta’s policy on COVID-19 misinformation. But if the group obtains its Oversight Board’s approval to adopt a “less restrictive” policy, Facebook could, for example, treat misinformation about the virus the same way it treats other fake news.

As mentioned above, Meta’s general policy on disinformation is to contextualize a post and reduce its reach rather than delete it, so as not to interfere with users’ freedom of expression.
