San Francisco, Jan 26 (SocialNews.XYZ) Google-owned YouTube has deleted more than 1,000 deepfake scam ad videos of celebrities from its platform.
YouTube said it is “investing heavily” to stop AI celebrity scam ads.
After a 404 Media probe into such fake celebrity ads, YouTube deleted more than 1,000 videos tied to an advertising ring that used AI to make celebrities such as Taylor Swift, Steve Harvey, and Joe Rogan appear to promote Medicare scams.
The videos had amassed nearly 200 million views, and both users and celebrities had regularly complained about them, the report said.
YouTube said it is "aware" that its platform is being used to run AI-generated celebrity scam ads and is working to stop such deepfakes.
The YouTube action came as non-consensual deepfake pornography of Taylor Swift went viral on X, where one post garnered more than 45 million views and 24,000 reposts and remained live for around 17 hours before it was removed.
A report from 404 Media found that the images may have originated in a group on Telegram, where users share explicit AI-generated images of women.
Users in the group also reportedly joked about how the images of Swift went viral on X.
According to the latest research from cybersecurity firm Deeptrace, about 96 per cent of deepfakes are pornographic, and they almost always portray women.
Source: IANS