In The News

Facebook to let users know if they’ve engaged with fake COVID-19 posts

Facebook is planning to introduce a feature that lets users know when they’ve interacted with content on the platform containing misinformation about the coronavirus pandemic.

From claims that drinking bleach can cure coronavirus, to conspiracies that the pandemic has been caused by 5G mobile phone signals, social media has been awash with copious amounts of fake news, some of which could be dangerous.

And while Facebook has been removing such posts or reducing their reach, many have still been seen by millions.

In a blog post, Facebook’s Vice President of Integrity, Guy Rosen, explained the latest way Facebook is tackling the problem: promoting a message that invites users who’ve interacted with fake coronavirus content to visit the WHO’s “myth busting” webpages.

We’re going to start showing messages in News Feed to people who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed. These messages will connect people to COVID-19 myths debunked by the WHO including ones we’ve removed from our platform for leading to imminent physical harm.

This means that Facebook users who have interacted with misinformation about the coronavirus pandemic will see a message appear at the top of their newsfeeds inviting them to help friends and family avoid false information by heading to the WHO myth busting website.

You’ll notice, however, that Facebook isn’t telling the user that they’ve engaged with false content, or what that content was. This highlights a balance Facebook has to strike: letting users know they’ve interacted with fake information without being so blunt as to make them defensive.

That latter point was demonstrated in the aftermath of the 2016 US presidential election, when Facebook introduced a “disputed” tag for content deemed inaccurate. However, the tag ultimately made users more likely to share the information, which was the opposite of the intended effect.


With this newsfeed notification, Facebook is being more subtle in the hope of avoiding defensive reactions, but there are problems with this approach, since users won’t know exactly which false information they engaged with.

Nor is it obvious whether a user seeing this notification will even realise they’ve engaged with misinformation, since the notification doesn’t say so.

Facebook users who engage with coronavirus misinformation will begin seeing this notification in their newsfeed in the next couple of weeks.

Published by
Craig Haley