Georgia mom Debbie Shelton Moore becomes victim of scammers using AI to sound like daughter
A Georgia mother said she nearly suffered a “heart attack from sheer panic” when scammers used artificial intelligence to recreate her daughter’s voice, making it seem as if she had been kidnapped by three men and was being held for a $50,000 ransom.
Debbie Shelton Moore received a phone call from a number with the same area code as her daughter Lauren’s, and assumed Lauren was calling because she had just been in a car crash.
When she answered, she heard Lauren’s voice on the other end, but it wasn’t her daughter calling.
“My heart is beating, and I’m shaking,” Shelton Moore told WXIA. “It just sounded so much like her, it was 100% believable. Enough to almost give me a heart attack from sheer panic.”
One of the men demanded a ransom from Shelton Moore in exchange for her daughter’s release.
“By this time the man had said ‘Your daughter’s been kidnapped and we want $50,000’ and then they had her crying ‘mom, mom,’” Shelton Moore added. “It was her voice and that’s why I was totally freaking out.”
The man on the phone claimed Lauren was in the trunk of his car.
Shelton Moore opened her daughter’s location on her phone, which showed her stalled on a parkway.
Shelton Moore’s husband overheard the phone call and decided to FaceTime Lauren, who said she was safe and sounded confused by the call, making the parents realize they were the targets of a scam.
“All I was thinking was how am I gonna get my daughter, how in the world are we supposed to get him money,” Shelton Moore said.
After being reassured that Lauren was safe, Shelton Moore and her husband, who works in cybersecurity, called the Cherokee County Sheriff’s Office, which notified the Kennesaw Police, who sent officers to check up on Lauren.
Lauren was already aware of scams like the one that targeted her mom because of videos on social media, according to the outlet.
While Shelton Moore says she is aware of most tactics used by scammers, she was unprepared to hear her own daughter’s distressed voice.
“I’m very well aware of scammers and scams and IRS scams and the fake jury duty,” she said. “But of course, when you hear their voice, you’re not going to think clearly and you will panic.”
Following the encounter, Shelton Moore implemented a new rule with her family: a code word to use in case any of them is ever in a genuine emergency.
In March, the Federal Trade Commission warned about the rise in AI-based scams and told the public to be wary of unknown phone numbers calling with what sounds to be a family member on the other end of the line.
“Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now. A scammer could use AI to clone the voice of your loved one,” the report reads. “All he needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.”
The FTC recommends that potential scam victims not panic and instead try to call the person directly at a known phone number; if that fails, they should call a friend or family member of the person.
Scammers will make you pay “in ways that make it hard to get your money back,” including wire transfers, cryptocurrency or prepaid gift cards.