YouTube tackles conspiracy videos with “information cues”

YouTube has announced that it will try to fight fake news by offering viewers “information cues” on videos that garner “significant debate”.

By “information cues”, YouTube means small boxes offering further context related to the video, and by videos that garner significant debate, YouTube basically means conspiracy videos.

The Google/Alphabet-owned video-sharing website has proposed a context-to-content approach similar to Facebook’s “related articles” feature, in which the social network offers up related articles alongside a post promoting fake news in a bid to provide a little more context (a feature Facebook introduced after scrapping its “disputed flag” system, which didn’t work at all).

So, if a video on YouTube that promotes a conspiracy theory qualifies for YouTube’s “information cues”, viewers can expect a box to pop up containing a few words on the subject of the conspiracy that, when clicked, links to an external website. At the moment, at least, that website will be Wikipedia.

So if a conspiracy video starts babbling on about how the moon landings were actually filmed in a studio somewhere in Hollywood, the “information cue” would point the viewer to the Wikipedia page on the moon landings.

This is YouTube’s first real attempt to fight fake news and conspiracy theories. It comes in the aftermath of heavy criticism levelled at the video-sharing site after conspiracy videos began trending and appearing at the top of search results immediately following a number of mass shootings, including the massacre at Marjory Stoneman Douglas High School.

However, this approach has some issues. For one, we’re not really keen on the information cues leading to Wikipedia articles. One of the first rules you’ll learn at school when writing essays these days is never to reference Wikipedia as a source – it is, after all, open for anyone to edit and is itself a collection of information drawn from more reputable sources. So why YouTube is relying solely on Wikipedia, we don’t know.

Additionally, the information cues from Wikipedia only contain a small snippet of information before being cut off, at which point the viewer has to click the provided link to continue reading. Is a small snippet going to provide enough context – or enough motivation to click away from the video? As Gizmodo pointed out, a video claiming the moon landings were faked offered up an information cue with some generic information about the moon landings before being cut off.

With that said, the context-to-content approach is a potentially viable partial solution to the problem of fake news. It doesn’t censor information and doesn’t require anyone to fulfil that dreaded role of “arbiter of truth”. It gives Internet users easy, handy access to relevant information that may put what they are watching or reading into more context, and it may even lure some conspiracy-inclined viewers a little further outside their echo chambers.

For YouTube, however, information cues are a version 1.0 solution. There is some way to go before they are likely to have any real effect.

Published by Craig Haley