Ordinary pop stars have bodyguards; Taylor Swift has the Swifties, an army of hardcore fans who mobilize whenever they believe the singer is being wronged. Music manager Scooter Braun has experienced this firsthand, as has the young footballer Alejandro Balde, who dared to say publicly that he didn't like Taylor Swift's music. But the Swifties' latest operation has a more serious cause: deepfake pornography of Taylor Swift.
On Thursday, posts spread on the platform X that appeared to show nude pictures of the pop singer. The images, however, were fabricated using artificial intelligence (AI). According to US media reports, a single post generated up to 45 million views and hundreds of thousands of likes before it was finally deleted after 17 hours. As is typical of social networks, the content spread rapidly, faster than X's moderators could react. This is despite the fact that publishing non-consensual nude images violates X's terms of service, as the platform's safety team emphasized again today.