No doubt you’ve heard of the scam in which a perpetrator calls an elderly person and pretends to be their grandchild or another close relative. The usual routine is to feign distress, claim to be in some kind of trouble, and ask for an urgent transfer of money to resolve the situation.
While many grandparents will realize that the voice doesn’t belong to their grandchild and hang up, others won’t and, eager to help their anxious relative, will go ahead and send money to the caller’s account.
A Washington Post report published Sunday reveals that some scammers have taken the ruse to a whole new level by deploying AI voice-cloning technology, making it even more likely that a target will fall for the trick.
To pull off this more sophisticated version of the scam, criminals need “an audio sample of just a few sentences,” according to the Post. The sample is then run through one of the many widely available online tools that use the original voice to create a replica, which can be made to say whatever the scammer wants simply by typing in phrases.
FTC data shows there were more than 36,000 reports of so-called impostor scams in 2022 alone, with more than 5,100 of them taking place over the phone. Reported losses reached $11 million.
There are fears that as AI tools become more effective and accessible, more people will fall for the scam in the coming months and years.
However, the scam still requires some planning, as a determined perpetrator needs to find an audio sample of a person’s voice as well as the phone number of the intended victim. Audio samples, for example, can be sourced from popular sites like TikTok and YouTube, while phone numbers can also be tracked down online.
The scam can also take many forms. The Post cites an example in which a scammer, posing as a lawyer, contacted an elderly couple to tell them their grandson was in custody on suspicion of a crime and that more than $15,000 in legal fees was needed. The fake lawyer then pretended to hand the phone to the grandson, whose cloned voice pleaded for help paying the fees, which the couple duly did.
They only realized they’d been duped when their grandson called them later that day for a chat. It’s believed the scammer may have cloned his voice from YouTube videos the grandson had posted, though it’s hard to be sure.
Some are calling for the companies that develop AI voice-cloning technology to be held accountable for such crimes. But until that happens, it seems certain that many more people will lose money to this nefarious scam.
Source: Digital Trends
I am Garth Carter and I work at Gadget Onus. I specialize in writing for the Hot News section, focusing on topics that are trending and highly relevant to readers. My passion is presenting news stories accurately and in an engaging manner that captures my audience’s attention.