A group of Minneapolis women discovered that a male friend had secretly created explicit deepfakes of them and more than 80 other women using AI. The shocking case, detailed in a new CNBC investigation, exposes a dangerous legal void in which victims have almost no recourse against AI-powered sexual exploitation, and shows how everyday social media photos can be weaponized in ways lawmakers never anticipated.
The nightmare began in summer 2024, when Jessica Guistolise and her friends learned the unthinkable: a male friend had been secretly harvesting their Facebook photos and feeding them into an AI service called DeepSwap to create explicit videos and images. And not just of them. More than 80 women across the Twin Cities had been targeted in what amounts to one of the most extensive deepfake abuse cases documented to date.
"I heard that camera click, and I was quite literally in the darkest corners of the internet," Guistolise told CNBC in an exclusive investigation. "Because I've seen myself doing things that are not me doing things." The sound of camera shutters now triggers panic attacks, her eyes swelling with tears as the trauma resurfaces.
But here is the most disturbing part: the perpetrator appears to have broken no laws. Because the women were not minors and he never distributed the content, authorities found no crime had been committed. "He did not break any laws that we're aware of," said Molly Kelley, one of the victims and a law student. "And that is problematic."
This legal vacuum reflects how quickly AI has outpaced regulation. Less than a decade ago, creating convincing deepfakes required serious technical expertise. Now, as CNBC's investigation reveals, anyone with an internet connection can access "nudify" services through Facebook ads, Apple and Google app stores, or simple web searches.
"That's the reality of where the technology is right now, and that means that any person can really be victimized," Haley McNamara, senior vice president at the National Center on Sexual Exploitation, told CNBC. The services often masquerade as playful face-swapping tools while serving primarily pornographic purposes.
DeepSwap itself remains elusive. The company has shifted its claimed headquarters from Hong Kong to Dublin, listing "MINDSPARK AI LIMITED" as its corporate entity. CNBC could not locate CEO Penyne Wu online or get responses from marketing manager Shawn Banks. The opacity suggests these services deliberately operate in regulatory gray zones.