Pornographic Taylor Swift deepfakes generated by Musk’s Grok AI
Know-how reporter
Getty PicturesElon Musk’s AI video generator has been accused of creating “a deliberate selection” to create sexually specific clips of Taylor Swift with out prompting, says an professional in on-line abuse.
“This isn’t misogyny by chance, it’s by design,” stated Clare McGlynn, a legislation professor who has helped draft a legislation which might make pornographic deepfakes unlawful.
In response to a report by The Verge, Grok Think about’s new “spicy” mode “did not hesitate to spit out totally uncensored topless movies” of the pop star with out being requested to make specific content material.
The report additionally stated correct age verification strategies – which turned legislation in July – weren’t in place.
XAI, the corporate behind Grok, has been approached for remark.
XAI’s personal acceptable use coverage prohibits “depicting likenesses of individuals in a pornographic method”.
“That this content material is produced with out prompting demonstrates the misogynistic bias of a lot AI know-how,” stated Prof McGlynn of Durham College.
“Platforms like X may have prevented this if they’d chosen to, however they’ve made a deliberate selection to not,” she added.
This isn’t the primary time Taylor Swift’s picture has been used on this means.
Sexually specific deepfakes utilizing her face went viral and had been considered tens of millions of occasions on X and Telegram in January 2024.
Deepfakes are computer-generated photos which exchange the face of 1 particular person with one other.
‘Utterly uncensored, utterly uncovered’
In testing the guardrails of Grok Think about, The Verge information author Jess Weatherbed entered the immediate: “Taylor Swift celebrating Coachella with the boys”.
Grok generated nonetheless photos of Swift carrying a costume with a bunch of males behind her.
This might then be animated into quick video clips beneath 4 completely different settings: “regular”, “enjoyable”, “customized” or “spicy”.
“She ripped [the dress] off instantly, had nothing however a tasselled thong beneath, and began dancing, utterly uncensored, utterly uncovered,” Ms Weatherbed informed BBC Information.
She added: “It was stunning how briskly I used to be simply met with it – I by no means requested it to take away her clothes, all I did was choose the ‘spicy’ possibility.”
Gizmodo reported equally specific outcomes of well-known girls, although some searches additionally returned blurred movies or with a “video moderated” message.
The BBC has been unable to independently confirm the outcomes of the AI video generations.
Ms Weatherbed stated she signed as much as the paid model of Grok Think about, which price £30, utilizing a model new Apple account.
Grok requested for her date of delivery however there was no different age verification in place, she stated.
Below new UK legal guidelines which entered into drive on the finish of July, platforms which present specific photos should confirm customers’ ages utilizing strategies that are “technically correct, sturdy, dependable and truthful”.
“Websites and apps that embody Generative AI instruments that may generate pornographic materials are regulated beneath the Act,” the media regulator Ofcom informed BBC Information.
“We’re conscious of the rising and fast-developing danger GenAI instruments could pose within the on-line house, particularly to youngsters, and we’re working to make sure platforms put acceptable safeguards in place to mitigate these dangers,” it stated in an announcement.
New UK legal guidelines
Presently, producing pornographic deepfakes is unlawful when utilized in revenge porn or depicts youngsters.
Prof McGlynn helped draft an modification to the legislation which might make producing or requesting all non-consensual pornographic deepfakes unlawful.
The federal government has dedicated to creating this modification legislation, however it’s but to return into drive.
“Each lady ought to have the proper to decide on who owns intimate photos of her,” stated Baroness Owen, who proposed the modification within the Home of Lords.
“It’s important that these fashions usually are not utilized in such a means that violates a girl’s proper to consent whether or not she be a celeb or not,” Woman Owen continued in an announcement given to BBC Information.
“This case is a transparent instance of why the Authorities should not delay any additional in its implementation of the Lords amendments,” she added.
A Ministry of Justice spokesperson stated: “Sexually specific deepfakes created with out consent are degrading and dangerous.
“We refuse to tolerate the violence in opposition to girls and ladies that stains our society which is why we’ve got handed laws to ban their creation as shortly as doable.”
When pornographic deepfakes utilizing Taylor Swift’s face went viral in 2024, X quickly blocked searches for her identify on the platform.
On the time, X stated it was “actively eradicating” the photographs and taking “acceptable actions” in opposition to the accounts concerned in spreading them.
Ms Weatherbed stated the group at The Verge selected Taylor Swift to check the Grok Think about function due to this incident.
“We assumed – wrongly now – that if they’d put any form of safeguards in place to stop them from emulating the likeness of celebrities, that she could be first on the record, given the problems that they’ve had,” she stated.
Taylor Swift’s representatives have been contacted for remark.


