Dr. Michael Mosley's image used in fraudulent deepfake health scams after his death

He passed away in June 2024.

July 18, 2024.

Social media has been hit by a disturbing trend: deepfaked videos promoting health scams that feature well-known medical professionals, including the late Dr. Michael Mosley. Along with TV doctors Dr. Hilary Jones and Dr. Rangan Chatterjee, Dr. Mosley was caught up in the scams even after his death at the age of 67 while on vacation in Greece.

One particularly troubling deepfaked video surfaced online after Dr. Mosley's death, in which he appeared to endorse a product that claimed to "normalize" blood sugar levels and urged diabetes patients to abandon their insulin and other medications. It is just one example of the vile scams being promoted through these fake videos, which circulate widely on Facebook.

It's important to note that none of the medical professionals featured in these videos has any connection to the products being advertised. Dr. Jones addressed the issue in a BMJ report, saying he has been falsely linked to products claiming to fix blood pressure and diabetes, as well as to hemp gummies sold under names such as Via Hemp Gummies, Bouncy Nutrition, and Eco Health.

Unfortunately, these deepfaked videos are difficult to control. Even if they are taken down, they often resurface under different names the next day. John Cormack, a retired doctor who worked with the BMJ on their investigation, points out that it is much cheaper for scammers to create these videos than to conduct actual research and bring legitimate products to market.

When asked for a statement, a spokesperson for Meta, Facebook's parent company, said it would investigate the examples highlighted by the BMJ and encouraged users to report any content that violates its policies. The BMJ recommends that if you come across a deepfake, you should contact the person being impersonated to verify its authenticity, leave a comment questioning its legitimacy, and report it to the platform where you found it.

Sadly, this is not the first time celebrities have been targeted by deepfakes promoting products they have no association with. In one instance, singer Taylor Swift's likeness was used in an ad for what appeared to be a high-end cookware giveaway, later revealed to be a deepfake scam. In another, actor Tom Hanks appeared in a fake dental plan ad, prompting him to confirm on social media that the video was indeed a deepfake.

Beyond promoting scams, deepfakes have also been used for more disturbing purposes. In March, it was reported that more than 250 British celebrities, including Channel 4 News presenter Cathy Newman, had been victims of deepfake pornography, an alarming violation of their privacy.

As the technology behind deepfakes becomes more advanced, the fakes are getting harder to spot. However, cybersecurity company Norton suggests looking out for unnatural eye movements, facial expressions, and posture, as well as distorted movement and abnormal coloring in the video. Other tell-tale signs include too-perfect hair, teeth that don't appear individually defined, poor audio quality, and a lack of background noise in outdoor scenes.

Be wary, too, of videos claiming to offer ground-breaking health treatments that are not being reported by reputable news sources. And if you're still unsure, you can run a still from the video through Google's reverse image search to trace it back to its source and check whether the original image has been manipulated.

The use of deepfakes is not limited to doctors and Hollywood stars, and even ordinary people can become victims. Radio host Zoe Ball, for instance, recently spoke out about a scam that used her face to endorse a financial plan, warning her followers to be cautious and not fall for the false claims.

The impact of AI on the entertainment industry is also a cause for concern. Hollywood actors and writers went on strike last year in part to protest the use of AI to replace their work, and in April more than 200 names from the music industry, including Katy Perry, Billie Eilish, and Stevie Wonder, signed an open letter expressing fears about the threat AI poses to their livelihoods.

If you come across a deepfake, whether it's promoting a scam or impersonating a celebrity, it's important to take action. Report it to the platform where you found it, and alert the person being impersonated if possible. We must all do our part to combat this dangerous trend and protect ourselves from falling victim to these fake videos.
