A convincing new scam is on the rise, and many people do not know the technology behind it even exists: AI voice cloning.

Fraudsters have become far more sophisticated and are no longer limited to poorly written emails.

17 September 2024

James Nesbitt, the well-known actor, was taken aback when he heard his own voice speaking to him. It was not a recording or a voiceover for a project, but an AI-generated clone: exactly the kind of audio scammers now use to trick people into sending money. Many of us have received similar approaches, usually messages from unknown numbers claiming to be a friend or family member in urgent need of cash. Those messages are annoying but usually easy to dismiss. A new type of scam, however, is far more convincing and far harder to detect.

The latest data reveals that more than a quarter of UK adults have been targeted by a high-tech voice cloning scam in the past year. What makes the scam so dangerous is that nearly half of those surveyed did not even know such technology exists, a gap in awareness that leaves them especially vulnerable. The survey, carried out for Starling Bank among more than 3,000 people, found that voice cloning scams are becoming increasingly common. Scammers use artificial intelligence to replicate the voice of a friend or family member from just a few seconds of audio, and with so many recordings posted on social media, gathering enough material for a convincing clone is easy.

The scam works like this: the fraudster contacts the victim posing as a close friend or family member in urgent need of financial help, using the cloned voice in a phone call, voice message, or voicemail to make the emergency seem genuine. Strikingly, nearly 1 in 10 of those surveyed admitted they would send money in this situation even if the call seemed suspicious, which puts millions of people at risk. And despite the scam's growing prevalence, only 30% of respondents said they would know what to look out for if they were targeted.

In response, Starling Bank has launched its "Safe Phrases" campaign to raise awareness and help people protect themselves against voice cloning scams. The idea is simple: agree a unique phrase with your loved ones that only you would know. If a caller claiming to be a friend or family member cannot give the agreed phrase, the call can immediately be treated as a likely scam.

Financial fraud is on the rise, with reported cases in England and Wales up 46% in the past year. According to Starling Bank's research, the average UK adult has been targeted by a fraud scam five times in the last 12 months. Lisa Grahame, the bank's chief information security officer, emphasized the importance of being aware of these scams and of taking simple steps to protect yourself and your loved ones. She also pointed out that people routinely post recordings of their own voices online without realizing how vulnerable that can make them to fraudsters.

To demonstrate how easy voice cloning has become, Starling Bank enlisted Nesbitt, who agreed to have his own voice cloned using AI. The experience was a shock for the actor, whose distinctive voice is central to his career. He said he was surprised at how advanced the technology now is and how easily it could be put to criminal use in the wrong hands. As a father, he found the thought of his children being scammed in this way frightening, and he has committed to setting up a safe phrase with his own family and friends.
