
Georgia mom Debbie Shelton Moore falls victim to scammers who use AI to sound like her daughter
A Georgia mother said she almost had a “heart attack out of sheer panic” when scammers used artificial intelligence to mimic her daughter’s voice, making it appear as if she had been kidnapped by three men and was being held for a $50,000 ransom.
Debbie Shelton Moore received a call from a number with the same area code as her daughter Lauren’s phone, and assumed her daughter was calling because she had been in a car accident.
When she answered, what sounded like Lauren’s voice came through on the other end, but it wasn’t her daughter who was calling.
“My heart was beating and I was shaking,” Shelton Moore told WXIA. “It just sounded so much like her, it was 100% believable. Enough to almost give me a heart attack out of sheer panic.”
One of the men demanded a ransom from Shelton Moore in exchange for her daughter’s safe return.
“By that point, the man had said, ‘Your daughter has been kidnapped and we want $50,000,’ and then they were yelling, ‘Mom, Mom,’” Shelton Moore added. “It was her voice and that’s why I freaked out.”
The man on the phone claimed Lauren was in the trunk of his car.
Shelton Moore pulled up her daughter’s location on her phone, which showed she had stopped near a park.
Shelton Moore’s husband overheard the call and decided to FaceTime Lauren directly. Lauren, safe and thoroughly confused, answered his call, and the parents realized they were the targets of a scam.
“I was just thinking, how am I going to get my daughter, how on earth are we going to get him the money,” Shelton Moore said.
After confirming Lauren was safe, Shelton Moore and her husband, who works in cybersecurity, called the Cherokee County Sheriff’s Office, which notified the Kennesaw Police Department; officers were dispatched to check on Lauren.
According to the outlet, Lauren was already aware of scams like the one targeting her mother from videos posted on social media.
While Shelton Moore says she’s aware of most scammers’ tactics, she wasn’t prepared to hear her own daughter’s distressed voice.

“I’m well aware of scammers, IRS scams and fake jury duty scams,” she said. “But of course, if you hear her voice, you’re not going to think straight and you’re going to panic.”
After her recent brush with the new scam, Shelton Moore and her family adopted a new rule: they came up with a code word to use in case any of them is ever in a real emergency.
In March, the Federal Trade Commission warned of the rise in AI-powered scams and urged the public to beware of calls from unfamiliar phone numbers with what appear to be family members on the other end of the line.
“Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now. A scammer could use AI to clone a loved one’s voice,” the warning said. “All he needs is a short audio clip of your family member’s voice – which he could get from content posted online – and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.”
The FTC recommends that potential fraud victims not panic and instead try to reach the supposedly kidnapped person at a phone number they know to be theirs. If that fails, they should try a friend or family member of that person.
The agency also warned that scammers typically ask victims to “pay in a way that makes it difficult to get your money back,” such as a wire transfer, cryptocurrency, or prepaid gift cards.