Pornhub and Twitter ban 'deepfake' celebrity face-swap porn videos

Wednesday, 07 Feb, 2018

In December, Motherboard reported on an alarming trend: People were placing the faces of celebrities such as Gal Gadot, Scarlett Johansson, Taylor Swift and others in pornographic videos that looked extremely real.

The adult-video giant has announced that "deepfakes" will be banned on its platform, making it clear that the site considers such films to be nonconsensual. A spokesperson said: "We do not tolerate any non-consensual content on the site and we remove all said content as soon as we are made aware of it".

The videos were popularised by those who frequent Deepfakes, a subreddit where people paste famous people's faces onto the bodies of porn performers by using a machine-learning algorithm. The practice is obviously done without the celebrities' consent, leading many opponents of the videos to classify them in the same league as revenge porn.

GIF-hosting platform Gfycat followed suit, removing "objectionable" GIFs from the site.

While the videos may soon be harder to find, the nature of the web means they are likely impossible to remove completely. "We have investigated these servers and shut them down immediately", the service said. While the platform is friendlier to pornographic images, it says that it will not allow the distribution of material made without the subject's consent. In an inventive and relatively wholesome twist, someone also placed Carrie Fisher's likeness into Rogue One, comparing it side-by-side with the actual job that Disney did for the original. All that is commonly needed to create the fake videos is a set of images of the celebrity.

In a statement to PCMag on Wednesday, Pornhub Vice President Corey Price said the company introduced a submission form in 2015, which lets users easily flag nonconsensual content such as revenge porn for removal.

More videos are available on a subsection of Reddit, where a user under the name deepfakes has created a desktop app that visitors can download to create their own face-swap porn videos.