GitHub’s Deepfake Porn Crackdown Still Isn’t Working

In late November, a deepfake porn maker claiming to be based in the US uploaded a sexually explicit video to the world’s largest site for pornographic deepfakes, featuring TikTok influencer Charli D’Amelio’s face superimposed onto a porn performer’s body. Despite the influencer presumably playing no role in the video’s production, it was viewed more than 8,200 times and captured the attention of other deepfake fans.

“So nice! What program did you use for creating the deepfake??” one user going by the name balascool commented. “I love charli.” D’Amelio’s agent did not reply to a request for comment.

The video’s creator, “DeepWorld23,” has claimed in the comments that the program was a deepfake model hosted on developer platform GitHub. This program was “starred” by 46,300 other users before being disabled in August 2024 after the platform introduced rules banning projects for synthetically creating nonconsensual sexual images, aka deepfake porn. It became available again in November 2024 in an archived format, where users can still access the code.

GitHub’s crackdown is incomplete, as the code—along with others taken down by the developer site—also persists in other repositories on the platform. A WIRED investigation has found more than a dozen GitHub projects linked to deepfake “porn” videos evading detection, extending access to code used for intimate image abuse and highlighting blind spots in the platform’s moderation efforts. WIRED is not naming the projects or websites to avoid amplifying the abuse.

“It’s not easy to always remove something the moment it comes online,” says Henry Ajder, an AI adviser to tech companies including Meta and Adobe, of the challenge of moderating open source material online. “At the same time, there were red flags that were pretty clear.”

“When we look at intimate image abuse, the vast majority of tools and weaponized use have come from the open source space,” says Ajder. But they often start with well-meaning developers, he says. “Someone creates something they think is interesting or cool and someone with bad intentions recognizes its malicious potential and weaponizes it.”

Some, like the repository disabled in August, have purpose-built communities around them for explicit uses. The model was positioned as a tool for deepfake porn, says Ajder, becoming a “funnel” for abuse, which predominantly targets women.

Other videos uploaded to the porn-streaming site by an account crediting AI models downloaded from GitHub featured the faces of popular deepfake targets, celebrities Emma Watson, Taylor Swift, and Anya Taylor-Joy, as well as other less famous but very much real women, superimposed into sexual situations.

The creators freely described the tools they used, including two that GitHub has removed but whose code survives in other repositories.

Perpetrators on the prowl for deepfakes congregate in many places online, including in covert community forums on Discord and in plain sight on Reddit, compounding deepfake prevention attempts. One Redditor offered their services using the archived repository’s software on September 29. “Could someone do my cousin,” another asked.

Torrents of the main repository banned by GitHub in August are also available in other corners of the web, showing how difficult it is to police open-source deepfake software across the board. Other deepfake porn tools, such as the app DeepNude, have been similarly taken down before new versions popped up.

“There’s so many models, so many different forks in the models, so many different versions, it can be difficult to track down all of them,” says Elizabeth Seger, director of digital policy at cross-party UK think tank Demos. “Once a model is made open source and publicly available for download, there’s no way to do a public rollback of that,” she adds.

One deepfake porn creator with 13 manipulated explicit videos of female celebrities credited one prominent GitHub repository marketed as an “NSFW” version of another project that encourages responsible use and explicitly asks users not to use it for nudity. “Learning all available Face Swap AI from GitHUB, not using online services,” their profile on the tube site says, brazenly.

GitHub had already disabled this NSFW version when WIRED identified the deepfake videos. But other repositories branded as “unlocked” versions of the model were available on the platform on January 10, including one with 2,500 “stars.”

“It is technically true that once [a model is] out there it can’t be reversed. But we can still make it harder for people to access,” says Seger.

If left unchecked, she adds, the potential for harm of deepfake “porn” is not just psychological. Its knock-on effects include intimidation and manipulation of women, minorities, and politicians, as has been seen with political deepfakes affecting female politicians globally.

But it’s not too late to get the problem under control, and platforms like GitHub have options, says Seger, including intervening at the point of upload. “If you put a model on GitHub and GitHub said no, and all hosting platforms said no, for a normal person it becomes harder to get that model.”

Reining in deepfake porn made with open source models also relies on policymakers, tech companies, developers and, of course, creators of abusive content themselves.

At least 30 US states have some legislation addressing deepfake porn, including bans, according to nonprofit Public Citizen’s legislation tracker, though definitions and policies are disparate, and some laws cover only minors. Deepfake creators in the UK will also soon feel the force of the law: on January 7, the government announced it would criminalize both the creation and the sharing of sexually explicit deepfakes.

Source: Wired