Cybercriminals are using artificial intelligence (AI) to carry out virtual kidnapping scams, in which they impersonate real individuals and demand ransom payments.
One common tactic is to use AI voice cloning tools to create convincing audio recordings of a victim’s loved ones; some scammers go further and use deepfake technology to fabricate video of the loved one being held captive. The goal is to prey on the victim’s emotions and pressure them into paying quickly. Victims have paid thousands of dollars before realizing they were scammed.
The scam typically begins with target selection. Scammers search social media for people who have recently posted about their children or other loved ones. Once they have identified a target, they contact the victim and claim to have kidnapped that loved one.
To make the claim convincing, the scammer plays a voice clone of the loved one generated with AI tools, or in some cases a deepfake video of the supposed captive. The scammer then demands a ransom payment and threatens to harm the loved one if it is not paid.
In April 2023, Jennifer DeStefano of Arizona received a call from someone claiming to have kidnapped her daughter. The caller demanded a $1 million ransom and threatened to harm the child; Jennifer could hear what sounded like her daughter crying in the background but was not allowed to speak to her. The demand eventually dropped to $50,000, but before paying, Jennifer confirmed that her daughter was safe and had never been kidnapped.
This is just one example of how AI-powered virtual kidnapping scams are being used to exploit people’s emotions. As AI technology continues to develop, it is likely that these scams will become more sophisticated and widespread.
The sources for this piece include an article by Trend Micro.