
New voice cloning technology allows scammers to impersonate anyone

By Editor

Published March 17, 2023 5:02 p.m. ET

Artificial intelligence expert Marie Haynes says AI tools will soon make it difficult to distinguish an AI-generated voice from a real person's. (Dave Charbonneau/CTV News Ottawa)

As artificial intelligence technology continues to advance, scammers are finding new ways to exploit it.

Voice cloning has emerged as a particularly dangerous tool, with scammers using it to imitate the voices of people their victims know and trust in order to deceive them into handing over money.

“People will soon be able to use tools like ChatGPT, or even Bing and eventually Google, to create voices that sound very much like their own voice and use their cadence,” said Marie Haynes, an artificial intelligence expert. “And it will be very, very hard to distinguish from an actual, real live person.”

She warns that voice cloning will be the new tool of choice for scammers pretending to be someone else.

Carmi Levy, a technology analyst, explains that scammers can even spoof the phone numbers of family and friends, making it appear as though the call is actually coming from the person they are impersonating.

“Scammers are using increasingly sophisticated tools to convince us that when the phone rings it is in fact coming from that family member or that significant other. That person that we know,” he says.

Levy advises people who receive suspicious calls to hang up and call the person they believe is contacting them directly.

“If you get a call and it sounds just a little bit off, the first thing you should do is say, ‘Okay, thank you very much for letting me know. I’m going to call my grandson, my granddaughter, whoever it is you’re telling me is in trouble, directly.’ Then get off the phone and call them,” he advises.

Haynes also warns that voice cloning is just the beginning, with AI powerful enough to clone someone’s face as well.

“Soon, if I get a FaceTime call, how am I going to know that it is legitimately somebody that I know?” she says. “Maybe it’s somebody pretending to be that person.”

As this technology becomes more widespread, experts are urging people to be vigilant and to verify calls from friends and family before sending any money.

“There are all sorts of tools that can take the written word and create a voice out of it,” says Haynes. “We are quickly going to find that scam calls are really, really on the rise.”
