As AI-generated photos of Taylor Swift, currently one of the most influential celebrities in the world, went viral across X, one of the many trolls and incels who shared the deepfake pornographic pics f*cked around and taunted Swift’s fans. He, of course, found out what Swifties are capable of.
“I don’t care how powerful Swifties are, they’ll never find me,” the X user, who goes by the handle @Zvbear, posted, adding that “I’m like the Joker I use fake numbers and addresses.”
🔥🚨DEVELOPING: Taylor Swift fans are blaming the AI images of her that went viral yesterday on my friend Zvbear, they have been doxing him, attempting to get him arrested and even murdered. Swifties should be held accountable for this unreal behavior. pic.twitter.com/N5N29QUH9X
— Dom Lucre | Breaker of Narratives (@dom_lucre) January 25, 2024
Swifties took up the challenge, tracked him down, and attributed the account to Zubear Abdi, a 28-year-old Somali man living in Canada.
zubear abdi aka @zvbear
— 𝗯𝗿𝗼𝘁𝗵𝗲𝗿… (@theantiqueswift) January 25, 2024
toronto, ontario, canada
studied: University of Toronto Scarborough
Somalian but lives in Canada
still need an exact addy though. pic.twitter.com/N8L7yZsaso
@Zvbear later said he was taking his account private until the “tsunami passes,” conceding defeat to the fans who had doxed him.
“Now I’m dealing with Swifties. A whole different animal,” he continued. “This is a Tactical Retreat, every great army has done this.”
— jesse 🌟 (@ThePrimeJesse) January 25, 2024
X has blocked searches for “Taylor Swift” in an attempt to stop the further spread of the AI-generated images. Users searching for “Taylor Swift” or “Taylor Swift AI” are met with an error message, although some have found workarounds by slightly altering the search terms or using quotation marks.
X later issued a statement saying that they have “a zero-tolerance policy” for Non-Consensual Nudity (NCN) images, and are “actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”
Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We're closely…
— Safety (@Safety) January 26, 2024
On Friday, White House press secretary Karine Jean-Pierre called on lawmakers to move to protect people from this type of content. Lawmakers echoed the call. Rep. Yvette D. Clarke (D-NY) noted that the abuse is nothing new: “For yrs, women have been targets of deepfakes w/o their consent,” but advances in AI have made creating deepfakes easier and cheaper.
What’s happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes w/o their consent. And w/ advancements in AI, creating deepfakes is easier & cheaper.
— Yvette D. Clarke (@RepYvetteClarke) January 25, 2024
This is an issue both sides of the aisle & even Swifties should be able to come together to solve.
Rep. Joe Morelle (D-NY) called for the passage of the Preventing Deepfakes of Intimate Images Act, emphasizing that deepfakes don’t just happen to celebrities like Swift: “they’re happening to women and girls everywhere, every day.”
The deepfake images made of Taylor Swift are abhorrent. And deepfakes don't just happen to celebrities—they're happening to women and girls everywhere, every day.
— Joe Morelle (@RepJoeMorelle) January 28, 2024
We need to put a stop to this by passing my legislation, the Preventing Deepfakes of Intimate Images Act.
Information for this story was found via the Wall Street Journal, BBC News, X, and the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.