[imagesource: Needpix]
Advances in artificial intelligence have added an eerie new layer to online and phone scams.
One of the first of many victims was Benjamin Perkin, whose elderly parents lost thousands of dollars in a recent voice-cloning scam.
His parents received a nightmare phone call from an alleged lawyer, who claimed that their son had killed an American diplomat in a car accident, was sitting in prison, and needed money for legal fees. They didn’t think twice before rushing to his rescue.
They rushed to the banks to get the cash and then received another call saying their son needed an additional $21 000 (just under R400 000) before a court date later that day.
Perkin’s parents sensed that something about the call was off, but felt strongly that they had indeed spoken to their dear son. Then the real Perkin phoned his parents later that night for a surprise check-in, and the truth unravelled, forcing them to reckon with the fact that they had been duped.
Scammers had likely used videos that Perkin posted on YouTube, training an AI tool to generate his voice with the audio samples.
Impostor scams were already the second most popular racket in the US, with over 36,000 reported victims in 2022, but inexpensive online tools are adding an advanced layer of creepy: they can turn a short audio file into a voice replica, letting a fraudster “speak” whatever he types.
The Washington Post quoted Hany Farid, a professor of digital forensics at the University of California at Berkeley, as saying: “It’s the perfect storm… [with] all the ingredients you need to create chaos.”
Experts say federal regulators, law enforcement and courts are woefully ill-equipped to crack down on this growing scam. Most victims have few leads to identify the perpetrator and it is difficult for police to trace calls and funds from scammers operating around the world. And there’s little legal precedent for courts to hold the companies that make the devices accountable for their use.
“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice,” Farid said. “Now… if you have a Facebook page… or if you record a TikTok and your voice is out there for 30 seconds, people can clone your voice.”
Companies such as ElevenLabs, an AI voice-synthesising start-up founded in 2022 that converts a short vocal sample into an artificially generated voice, have come under fire for the nefarious ways its text-to-speech tool has been used:
ElevenLabs [has] been used to replicate the voices of celebrities, like a fake Emma Watson reciting a passage from Adolf Hitler’s “Mein Kampf”. ElevenLabs did not respond to a request for comment, but in a Twitter thread the company said it is adding safeguards to prevent abuse, including restricting free users from creating custom voices and launching a tool to detect AI-generated audio.
In the meantime, people are losing thousands.
“The money is gone,” Perkin said. “No insurance. No getting it back. It’s gone.”
If a loved one tells you they need money, put that call on hold and try calling your family member separately to fully suss out the situation.
Also, never pay people in gift cards or cash, as those are difficult to trace.
Stay safe and sane out there, y’all.
[source:washingtonpost]