Taylor Swift backed by White House, SAG-AFTRA over explicit AI photos amid growing crisis
Washington DC - Taylor Swift has received support from the White House and SAG-AFTRA, along with millions of fans, after sexually explicit AI-generated photos of the pop star were spread around X, formerly known as Twitter.
Swifties rallied around the 34-year-old singer after the disturbing images went viral on Thursday, quickly leading phrases like "Protect Taylor Swift" to trend on the platform.
As fans expressed their concerns, many others noted that such AI photos are, unfortunately, no longer uncommon, and that far too little has been done to prevent their circulation.
On Friday, White House press secretary Karine Jean-Pierre condemned the explicit images of Swift and urged social media platforms like X to take stronger actions to prevent such material from spreading.
"We are alarmed by the reports of the circulation of images… of false images, to be more exact. And it is alarming," Jean-Pierre said during a press briefing.
"So while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people."
Jean-Pierre further called for legislation specifically targeting such images and noted that AI remains a focus of the Biden administration. In November 2023, the Department of Justice established the first-ever helpline providing support for "survivors of image-based sexual abuse," covering both authentic and AI-generated explicit images spread without consent.
Along with support from the White House, Swift was also backed by SAG-AFTRA, which released a statement urging protections for performers against non-consensual AI-generated material.
AI images of Taylor Swift draw attention to widespread issue of deepfakes
The organization's statement, also released on Friday, called the images "upsetting, harmful, and deeply concerning."
"The development and dissemination of fake images – especially those of a lewd nature – without someone's consent must be made illegal," the statement read. "As a society, we have it in our power to control these technologies, but we must act now before it is too late."
SAG-AFTRA's historic strike in 2023 sought protections against the use of AI in Hollywood, but many have argued the final agreement did not go far enough.
Despite fears that actors could be replaced by AI-generated performers, the final agreement does not prevent studios from using generative AI technology. Instead, studios are now required to notify the union when they use such content, though compensation may remain elusive, as the technology often draws on elements from multiple performers whose identities can be difficult to prove.
Though Swift's superstardom certainly fueled the rapid spread of the AI images, women everywhere remain frequent targets of AI tools and deepfake pornography. A 2023 study from Home Security Heroes found that 98% of AI-generated deepfake videos online are pornographic in nature, and 99% of such content features female subjects.
Just last year, a teenage student at a New Jersey high school called for new protections for victims of AI-generated pornography after fake, sexually explicit images of her and 30 of her female classmates were spread by their male peers, as reported by ABC6.
The disturbing situation facing Swift comes as X, under owner Elon Musk, has made significant cuts to its content moderation teams, leaving victims even more vulnerable to the widespread circulation of abusive content.
Cover photo: Amy Sussman / GETTY IMAGES NORTH AMERICA / Getty Images via AFP