Artificial intelligence, or AI, has come a long way since 1952, when computer scientist Arthur Samuel created a checkers-playing program for the IBM 700 series. At the time, he could hardly have predicted how widespread AI applications would become, or that the technology would one day be used to defraud and harm people. Today, scammers are using AI to stage fake kidnappings by spoofing phone numbers and cloning voices. The Federal Trade Commission categorizes these crimes as “imposter scams.”

Spoofing a phone number used to be complicated, requiring specialized knowledge and a direct connection to the phone company. But VoIP (voice over internet protocol), phone service delivered over the internet, has made spoofing easy. Companies like Grasshopper, Magic Jack, RingCentral, and Zoom often let users set their display name and number through a web interface, without ever being connected to a physical phone line. This is how scammers spoof the numbers that appear on their victims’ caller ID.
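To see why the displayed number is so easy to manipulate, here is a minimal sketch of how a programmable VoIP service places an outbound call, using Twilio’s Python SDK purely as an illustration. The credentials and phone numbers below are placeholders, and reputable providers like Twilio only accept a “from” number you have purchased or verified, which is exactly the safeguard that lax carriers and illegitimate gateways skip.

    from twilio.rest import Client

    # Placeholder credentials; a real account SID and auth token would go here.
    client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")

    # The "from_" field is simply what shows up on the recipient's caller ID.
    # Legitimate providers check that you own or have verified this number;
    # spoofing happens when that check is skipped.
    call = client.calls.create(
        to="+15005550006",       # hypothetical recipient
        from_="+15005550001",    # hypothetical displayed number
        url="https://demo.twilio.com/docs/voice.xml",  # call instructions
    )
    print(call.sid)

The point is that, to the software, the caller ID is just another field in a request; nothing about the call itself ties that field to a real phone on a real desk.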

AI voice generators use text-to-speech technology to read text aloud in human-sounding voices. Microsoft’s new VALL-E, for example, can “closely simulate a person’s voice when given a 3 second voice sample.” Voice samples can be pulled from YouTube videos, TikToks, other social media posts, or voicemail greetings to perpetrate an imposter scam. These scams, in which family members are tricked into believing a loved one is calling in need of help and money, are nothing new; they are just becoming more prevalent.

In 2022, an estimated $11 million was stolen through thousands of imposter scams carried out over the phone. Just this month, a mother in Arizona told reporters she was “100 percent convinced” that the person crying and begging for help on the phone was her daughter, after scammers used a cloned voice to stage a fake kidnapping and demand $1 million in ransom for her safe return. Luckily, this mother was able to confirm, while staying on the line with the criminal, that her daughter was safe on a ski trip with a friend.

More victims of imposter scams are inevitable, because improvements in AI voice generators make it harder for loved ones to doubt the voice they hear on the other end of the line, and the ease of phone number spoofing doesn’t help. Still, it is possible to avoid becoming a victim. If you receive a call from someone claiming to be a loved one in need of help and money:

  • Stay on the line, but text your loved one and ask if they’re okay. 
  • If someone is with you when you answer the phone, have them call your loved one to check on them. 
  • Ask the caller to FaceTime you.
  • Never give anyone credit card information over the phone or agree to send money through a payment app.