Gayle King warns others to be careful of AI scams after her likeness was used in a fake weight-loss product video.

Technology that can create realistic fake images and videos is a growing concern; celebrities and influencers are now having to confirm whether viral ads featuring them are real.

October 8, 2023

Gayle King recently had to set the record straight about a fake video featuring her likeness. The viral video promoted weight-loss products from the Artipet company and appeared to use King's voice to tell followers to "learn more about my secret."

Upon learning of the video, King took to social media to warn her followers about the dangers of AI scams. She clarified that the footage was actually taken from a promotion for her radio show, and cautioned fans not to be fooled by increasingly common AI-generated videos.

Manipulating King's likeness is especially dangerous given her large social media following of nearly 1 million users. But she isn't the only celebrity to fall victim to this kind of scam: Tom Hanks, who has almost 10 million followers, recently had his image used without his consent in a dental ad.

Experts warn that deepfake imagery is a growing concern as the underlying technology becomes more advanced and more widely available. With celebrities and influencers now having to issue statements confirming whether viral ads featuring them are legitimate, it's important to stay vigilant and aware of the potential for these AI scams.
