Reddit bans community for Deepfake sex tape software (variety.com).
Reddit has banned a community dedicated to FakeApp, an artificial intelligence (AI) video editing application that has gained some notoriety for being used for the production of so-called Deepfake celebrity porn clips. This follows an earlier ban of communities dedicated to the sharing of Deepfake videos.
The ban comes after the earlier purging of FakeApp-produced clips shared on Reddit from hosts such as Gfycat, perhaps because one of Emma Watson incorporated those dubious in-the-bath scenes from the video leaked during Fappening 2.0 last year (15th Mar. 2017).
Deepfakes, an allusion to Douglas Adams’ fictional computer Deep Thought in The Hitchhiker’s Guide to the Galaxy, later paid homage to by the chess machine of the same name that evolved into IBM’s grandmaster-beating Deep Blue, not only made the video editing technique of pasting a celebrity’s head from one clip onto a porn star’s body in another much easier, but also upped the faking game with an algorithm that does not simply switch heads from video to video: it scans learned representations of the source celebrity’s face to find the ones that best match the expression of the face being replaced at each point in a clip. This undeniably, and unsurprisingly, piqued and kept interest.
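For the technically curious, the widely reported recipe behind these FakeApp-style swaps is a pair of autoencoders sharing a single encoder, with one decoder per identity: because each decoder must reconstruct its own face from the same shared code, that code ends up capturing pose and expression while the decoder supplies the identity. Below is a rough, minimal sketch of that idea in PyTorch; the layer sizes, names and training details are illustrative assumptions, not FakeApp’s actual code.

```python
# Minimal sketch of the shared-encoder / twin-decoder autoencoder idea behind
# FakeApp-style face swapping. Layer sizes, names and training details are
# illustrative assumptions, not the application's actual code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: maps a 64x64 RGB face crop to a compact latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.LeakyReLU(0.1),    # 64 -> 32
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),   # 32 -> 16
            nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Identity-specific decoder: reconstructs ONE person's face from the code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.1),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),         # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_celebrity = Decoder()   # trained only on celebrity face crops
decoder_performer = Decoder()   # trained only on the performer's face crops

params = (list(encoder.parameters())
          + list(decoder_celebrity.parameters())
          + list(decoder_performer.parameters()))
opt = torch.optim.Adam(params, lr=5e-5)
loss_fn = nn.L1Loss()

def train_step(celebrity_batch, performer_batch):
    """Each decoder learns to rebuild its own identity from the SHARED code,
    so the code ends up encoding pose/expression rather than identity."""
    opt.zero_grad()
    loss = (loss_fn(decoder_celebrity(encoder(celebrity_batch)), celebrity_batch)
            + loss_fn(decoder_performer(encoder(performer_batch)), performer_batch))
    loss.backward()
    opt.step()
    return loss.item()

def swap(performer_faces):
    """The trick: encode the performer's face, decode with the celebrity decoder,
    yielding the celebrity's face wearing the performer's expression and pose."""
    with torch.no_grad():
        return decoder_celebrity(encoder(performer_faces))
```

In a real pipeline the swapped 64x64 crops would then be aligned, colour-matched and blended back into each frame, which is where most of the remaining faking effort, and most of the visible artefacts, live.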
Reddit users who visited the community Thursday found it banned, complete with a note reading: “This subreddit was banned due to a violation of our content policy, specifically our policy against involuntary pornography.” A Reddit spokesperson declined to comment further.
A step raising memories of the subreddit shut down during the first Fappening (Blog 17th Sept. 2014), with a parting explanation note that “Every man is responsible for his own soul” and Reddit wanting to be responsible for avoiding having celebrity pitbull lawyer of choice Marty Singer tear them a new soul.
Reddit initially remained mum on the phenomenon, but rolled out new guidelines specifying the ban of involuntary pornography last week. That’s when the company also took the step to ban a number of communities associated with Deepfakes.
Which, considering celebrity porn fake pictures have been around as long as the internet, suggests it is the ease of creating videos of “involuntary pornography”, rather than celebrity porn per se, that gave Gfycat and then Reddit the pre-lawyer-threat virtual collywobbles; not surprising considering the FBI investigation is building a case that disaffected Americans elected an incredibly toxic orange-looking President because Russians, for a laugh on their version of Candid Camera, told them Crooked Hillary would, as well as causing WWIII, take away their home arsenal of assault rifles and conspiracy dungarees too.
And others were quick to follow with the involuntary birthday suit blocking: What are ‘Deepfakes,’ and why are Pornhub and Reddit banning them? (esquire.com).
Reddit, Twitter, and Pornhub have all announced measures to wipe explicit deepfakes from their platforms. Pornhub told Motherboard it will remove any “non-consensual” content. Reddit issued a ban on what it called “involuntary pornography” and banned the deepfakes subreddit. Twitter told Verge it will suspend accounts that create or share “intimate media” like deepfakes without the subject’s consent. Other video platforms like Discord and Gfycat have also moved to ban non-consensual face-swapping on pornography.
But with Esquire pointing out, under a “What’s the punishment” subheading and with reference to a Wired article titled “People can put your face on porn—and the law can’t help you”:
Legally, there is little protection against deepfakes because the body doing the act isn’t the celebrity’s own. These videos aren’t made from illegally stolen nudes, and as Wired wrote, “You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.” You can even drag the First Amendment into it, defending deepfakes as art, satire, and “free speech” because they weren’t technically created illegally.
Assuming, of course, those creating, or hosting, are not attempting to profit commercially, which might explain why the subscription-offering Pornhub were quick to pull them and have you pulling off over something else; hopefully something they will not be sued over for copyright infringement, as they, and most other porn 2.0 “hubs”, have been in the past (Wikipedia).
And although not “technically created illegally”, it would only be a matter of time, if the Deepfake trend continued to get attention, before a celebrity head atop a porn star body was engaged in an act extreme enough to warrant a charge of defamation; let’s not forget sites have removed celebrity fakes after lawyer threats before (rollingstone.com, Oct. 2011).
But will banning banish them? Certainly not in the short term; as leaked pics and vids constantly prove, it’s impossible to put the cork back in the bottle once something has been uploaded:
It took us less than 30 seconds to find banned “deepfake” AI smut on the internet (theregister.co.uk).
There are sites ready and willing to host deepfakes. One such, EroMe, describes itself as “the best place to share your erotic pics and porn videos”. A company representative told El Reg it didn’t see anything wrong with deepfakes and thinks of them as “parody”.
Indeed, free photo and video sharing site EroMe will be glad of the free advert, as doubtless will the many sites sharing the clips alongside ads you can peruse, and hopefully click on, while you are there too, all hoping Deepfakes proves the same bumper ad-revenue payday as the Fappenings.
But one site at least seems either to have some backing from the FakeApp creators, or to have harvested their tutorials and video guides along with download links to the application, complete with its own “Trump is punk rock” apparel shop and a “Vape Tube” of much tattooed alt-Gandalf mega “smoke” ring blowing tricks too: The Deepfake Society (thedeepfakesociety.com).
And strangely, a lot of the “watch in HD” links on the Deepfake Society’s XXX sister-site are still working links to Pornhub, perhaps suggesting Pornhub’s reaction had as much to do with the free advertising as anything.
Are Deepfakes a faking game changer? They likely are, but to be honest, as good as they are, and with the “involuntary porn” label that got them banned adding to the tabloid and tech-media moral panic over AI and VR turning us all into unwitting revenge porn stars (Latest Picks 22nd May 2017), they will not be fooling anyone who doesn’t also think Bigfoot is surfing for them in Washington state, or who really believes an elected orange tycoon’s real desire is to make anything great again but his own ego.
But with the “involuntary” fuss they have caused, the eventual loser from an increased panic over anything fake that is news, and a crackdown on them and likely on old-skool celebrity fakes in general, may be the very web 2.0 communities that created them.
Updated 3rd January 2019
Scarlett Johansson on fake AI-generated sex videos: ‘Nothing can stop someone from cutting and pasting my image’ (washingtonpost.com).
Having had her face “grafted into dozens of graphic sex scenes by anonymous online ‘creators’”, she admits that, while “demeaning”, there is not much she can do:
“I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself. There are far more disturbing things on the dark web than this, sadly.”
Which of course would be published on the not-so-dark web too, with much conviction and without the threat of possible conviction for the creator or uploader; ScarJo adding that while you can challenge somebody using your likeness in the US under the Right of Publicity (Wikipedia), the same does not apply in Europe and the United Kingdom, as noted before in the Rihanna vs Topshop T-shirt court case (Pick of the Week 21st May 2013), for which a UK-living sketcher and caricaturist of celebs in the buff, often doing things with oversized vegetables, is rather glad.
So, with celebrity image and video fakery not in any way being something new, it’s perhaps best put in perspective as a price to pay for being in the public eye, it being a “different situation” for “someone who loses a job over their image being used like that”; a situation noted, and collectively stoked, in the internet moral panics much loved by tabloids in an Insta-era where uploading several selfies a day for potential use as a head shot is the norm, and where the much more frequent actual sharing of real nude pics as revenge porn has put Facebook et al. in a tizz this year with the discovery that they and their platforms may be held responsible (Latest Picks 13th Jan. 2018).
Vulnerable people like women, children and seniors must take extra care to protect their identities and personal content. That will never change no matter how strict Google makes their policies. (Google in September added “involuntary synthetic pornographic imagery” to its ban list, allowing anyone to request the search engine block results that falsely depict them as “nude or in a sexually explicit situation.”)
But she admits that “if a person has more resources, they may employ various forces to build a bigger wall around their digital identity”, something noted during the original Fappening when, without a high-powered and well-paid lawyer, such requests to either search engines or law enforcement were often met with a wrinkle of the nose and a suggestion not to take any more such pictures (Pick of the Week 21st October 2014).
Updated 8th October 2019
And despite a tabloid moral panic over political propaganda Deepfakes influencing elections (dailystar.co.uk, Sept. 2019), finally allowing Russia to help conspiracy-dungaree-wearing Midwesterners elect Mr. Magoo as president next:
Deepfake videos online ‘almost all porn’ as 15,000 AI-generated clips found (dailystar.co.uk).
Almost 15,000 deepfake videos—AI-generated clips where different faces are superimposed digitally onto different bodies—were discovered online.
Of those, almost all—around 96%—were porn, according to a report on the study by Massachusetts Institute of Technology.
Indeed, surprising no one, with 3% of the remaining 4% being meme-esque Nicolas Cage, Jim Carrey or Keanu Reeves, and the need for Deepfaked propaganda questionable considering those likely to consume it already emphatically believe [insert liberal figure here] is an utter un-American pinko rotter aiming to take their assault rifles away and stop them bankrupting themselves when getting sick in middle and old age.
Recent/related stories
- Pornhub Fuck Your Period campaign (Latest Picks 1st February 2018)
- YouTube star Chrissy Chambers wins damages in landmark UK ‘revenge porn’ case (Latest Picks 18th January 2018)
- VR revenge porn: Angry exes could create 3D avatars of past lovers (Latest Picks 22nd May 2017)