AI Helps Fake Kidnapping by Cloning Teen’s Voice

In a disturbing new twist on phone scams, an Arizona mother recently received a chilling phone call from someone impersonating her daughter. The caller, speaking in the daughter's voice, claimed she had been kidnapped and demanded a $1 million ransom, threatening that she would be killed otherwise. What made the situation particularly unsettling was that the voice on the other end of the line sounded almost identical to the daughter's. It was later revealed that the call had been made using an AI-generated clone of her voice.

The mother, Candace DeStefano, said she began recording the call as soon as she realized something was not right. She later played the recording for her daughter, who was startled to hear her own voice urging her mother to comply with the kidnapper's demands.

According to computer science experts, a convincing voice clone can now be created from even the briefest audio clip. With just three seconds of someone's voice, an AI program can mimic their tone, inflection, and even their emotions. “In the beginning, it would require a larger amount of samples,” explained Subbarao Kambhampati, a professor of computer science at Arizona State University. “Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound.”

While AI voice-cloning technology has many useful applications, such as helping people with speech impairments to communicate, it also has the potential to be used for nefarious purposes. Scammers and cybercriminals, in particular, are finding ways to exploit this technology to deceive people.

To avoid falling victim to such scams, FBI experts suggest being cautious about what information is shared on social media. Scammers often target people with public profiles, which make it easier to gather details about a target. The FBI also recommends asking the caller questions about the “abductee” that only the real person would know, and agreeing in advance on a family emergency word or question so relatives can quickly determine whether a situation is genuine or a scam.

The use of AI in phone scams is a growing concern, and people need to stay vigilant to avoid being deceived. Combating the problem will depend largely on public awareness and education. As Candace DeStefano advises, “Stay safe!”
