In The News

AI fake porn videos raise serious concerns

Move over, fake news. Now we have another emerging problem: AI-generated fake porn.

Known as “deepfake” videos, these realistic fakes use artificial intelligence to stitch the head of one person onto the body of another.

And yes, needless to say, this new technology is currently being used to create fake adult-only videos. Several videos have already sprung up on various adult websites featuring the heads of celebrities including Gal Gadot, Emma Watson and Jessica Alba almost seamlessly stitched onto the bodies of adult actresses. Named after “deepfakes”, the user handle of the person behind them, the videos are proving immensely popular on sites like Reddit and Celebrity Jihad.

The AI “deepfakes” software works by taking a number of photographs of a particular person as input and using them to build a three-dimensional model of that person's face. The software then superimposes that 3D model onto the body of another person to create a realistic video.
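
As a rough illustration of the final superimposition step only (the AI model-building is far more involved and not shown here), below is a toy Python sketch that alpha-blends a rendered face crop onto a target video frame. The arrays are random placeholders standing in for real images, and the function name and shapes are purely hypothetical.

import numpy as np

def blend_face_onto_frame(frame, rendered_face, mask, top, left):
    # frame:         H x W x 3 target frame (the "body" footage)
    # rendered_face: h x w x 3 face crop produced by the face model
    # mask:          h x w blend weights in [0, 1] (1.0 = take the face pixel)
    # top, left:     where the face crop is placed within the frame
    out = frame.astype(np.float32)
    h, w, _ = rendered_face.shape
    alpha = mask[..., None]                      # broadcast mask over colour channels
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = alpha * rendered_face + (1.0 - alpha) * region
    return out.astype(np.uint8)

# Random placeholder data standing in for a real video frame and a rendered face.
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
face = np.random.randint(0, 256, size=(120, 100, 3)).astype(np.float32)
mask = np.ones((120, 100), dtype=np.float32)

composited = blend_face_onto_frame(frame, face, mask, top=100, left=270)
print(composited.shape)   # (480, 640, 3)

In the real software this blending would happen frame by frame, with the rendered face produced by the learned model rather than random data.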

Why is this a problem?

Many are worried by the prospect of Internet users being able to create realistic fake videos, and for good reason. While those who developed the software seem content with stitching the face of their favourite celebrity onto the body of an adult entertainment actress, a public app already lets almost anyone do the same thing regardless of their technical ability, and more mainstream apps, which would let the public choose for themselves how they use this technology, are already being discussed.

One potential issue is that this technology lends itself to the already established “sextortion” industry, in which crooks attempt to steal nude photographs of victims and then blackmail them into paying (either with money or with further compromising photos) under the threat of having the photographs released. If no nude photos of a particular victim exist, the crooks may simply “photoshop” realistic images. Now, with this technology, the same crime can be orchestrated using faked videos instead.

Variations of this technology have existed for years and have already been used in Hollywood. A similar technique used photographs and videos of the late Paul Walker (along with help from his similar-looking brother Cody) to allow producers to finish the movie Furious 7 after the actor tragically died before filming could be completed.

One thing is for sure: unless technology remains as good at spotting these fakes as it is at creating them, giving anyone the means to create realistic, seamless but ultimately fake videos of whatever they please could land us in quite a bit of trouble.

Published by
Craig Haley