Artificial intelligence (AI) generated nude photos of pop star Taylor Swift have sparked outrage from fans and US politicians. One image was viewed 47 million times on the web service X, formerly Twitter, before it was removed on Thursday. According to US media, the fake photo was online for about 17 hours. Other accounts, however, continued to distribute the images.
The singer's fans pushed back against the spread of the material with massive displays of solidarity, uploading real pictures of their idol so that anyone searching for Taylor Swift would see no fakes at all. They also reported the deepfakes to the platform's support team.
“What happened to Taylor Swift is nothing new”
The images are said to originate from a Telegram group in which various users specialize in generating deepfakes, the platform "404 Media" reported. The creators reportedly did not expect their fabrications to spread as widely as they did on X.
"What happened to Taylor Swift is nothing new," said Democratic Congresswoman Yvette Clarke of New York, who is campaigning for legislation to crack down on deepfake nude images. "For years, women have been targeted by fakes without their consent. And with the development of artificial intelligence, it will become easier and cheaper to create deepfakes," she added.
“We have to take precautions”
Activists and regulators fear that easy-to-use AI tools could lead to an uncontrollable flood of fake and harmful content. Deepfakes are a technology in which real people's faces are inserted into photos or videos using artificial intelligence.
Republican Congressman Tom Kean warned that artificial intelligence technology is advancing faster than its regulation. “Whether Taylor Swift is a victim or another young person in our country, we must take protective measures to combat this alarming trend,” he said.
The only good thing about this, influencer Danisha Carter wrote on X, is that it "happened to Taylor Swift, who probably has enough power to get a law passed to end this phenomenon. You're sick."
According to reports, US billionaire Elon Musk's platform promised to take action against the content in question and its creators. Taylor Swift's management initially did not comment on the incident.