What do these stories all have in common?
A privacy message that asked you to copy and paste some legal-sounding jargon to your profile to protect your information.
The claim that changing your profile picture to a giraffe would put you at risk of being hacked.
A tale about HIV-infected oranges coming to Libya.
The answer is they are all hoaxes that went super viral across Facebook over the last few years, each accumulating millions of views and hundreds of thousands of shares. There is a good chance that you saw at least one of them at some point.
Facebook has claimed a number of times that it will tackle these viral hoaxes, to little avail. The problem is that Facebook's filters need to differentiate between genuinely interesting, popular stories and viral hoaxes, two types of posts that behave in much the same way, since both inherently attract many likes, shares and comments.
Traditionally, posts that attract lots of attention are given an extra boost and rewarded with increased visibility across Facebook, making it easy for certain posts or videos to go viral. But how do you separate the useful from the bunk when both are attracting lots of attention?
Facebook thinks it has the answer (again). In a blog post on Friday, the company said it will invite users to provide feedback when a particular post goes viral: certain users will be asked whether they would rather see the viral story in their news feed or a different story. If users pick the other story, Facebook says, that is an indication that the viral story, despite being shared and commented on many times, may not actually be interesting, and it will receive less visibility.
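In ranking terms, the mechanism Facebook describes amounts to sampling user preferences in a head-to-head survey and downranking the viral post if it loses. A minimal sketch of that idea in Python follows; every name, threshold and penalty factor here is an illustrative assumption, not Facebook's actual system:

```python
import random

def survey_preference(prefers_viral_prob, n_respondents=500, seed=0):
    """Simulate the head-to-head survey: each sampled respondent picks the
    viral post with probability `prefers_viral_prob`. Returns the fraction
    of respondents who preferred the viral post. (Illustrative stand-in
    for asking real users.)"""
    rng = random.Random(seed)
    votes_for_viral = sum(
        rng.random() < prefers_viral_prob for _ in range(n_respondents)
    )
    return votes_for_viral / n_respondents

def adjust_visibility(base_score, viral_share, threshold=0.5, penalty=0.5):
    """Downrank the viral post if fewer than `threshold` of surveyed users
    preferred it over the comparison story; otherwise leave its ranking
    score untouched. Threshold and penalty are made-up values."""
    return base_score * penalty if viral_share < threshold else base_score
```

The key design choice in such a scheme is that engagement (likes, shares, comments) stops being the only ranking signal: an explicit preference survey acts as a correction on top of it.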
But will it work? It's an interesting concept, but not one without potential for failure. False positives are a real risk when users are given a 50/50 choice: even interesting stories that are not hoaxes will inevitably have some users telling Facebook they'd prefer to see the other post.
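To see why a 50/50 split invites false positives, consider a quick simulation: a story that a genuine majority of users prefer can still lose the survey purely through sampling noise when the sample of respondents is small. All the numbers below are illustrative assumptions, not anything Facebook has published:

```python
import random

def flip_rate(true_pref=0.55, n_respondents=50, trials=2000, seed=1):
    """Estimate how often a story that a true majority prefers
    (true_pref > 0.5) still loses a survey of `n_respondents` users,
    by running `trials` simulated surveys and counting the losses."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(trials):
        votes = sum(rng.random() < true_pref for _ in range(n_respondents))
        if votes / n_respondents < 0.5:  # surveyed minority: a false positive
            losses += 1
    return losses / trials
```

With 55% true preference and only 50 respondents per survey, a meaningful fraction of surveys come out the wrong way, which is the false-positive problem in miniature: small samples plus a knife-edge threshold will sometimes punish legitimate stories.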
Another potential failure mode: many viral hoaxes fool a lot of people, which is exactly why they go viral. So there is no guarantee that the users asked for feedback won't pick the viral post as the one they'd rather see, simply because they don't realise it is a hoax.
Facebook's blog post claims that viral posts are "typically anomalies", which raises the question of why Facebook could not add some manual intervention when certain posts take off, having a human check whether the post is legitimate instead of relying solely on its software.
Do you think this will work? Or do you think we haven’t seen the end of viral hoaxes just yet?