Deepfake Voice Scams Rise: Protect Your Family Now


The Rise of Deepfake Scams and How to Protect Yourself

In a shocking incident that left a Florida family in distress, Sharon Brightwell received a frantic call from her daughter, April, claiming she was involved in a car crash that injured a pregnant woman. However, this was not the real April on the line—her voice had been cloned using advanced AI technology. This deepfake scam led to a terrifying sequence of events that ultimately cost Brightwell thousands of dollars.

The scam began when a caller, posing as April, claimed she had been driving while texting and that her phone had been taken by the police. The caller then asked for $15,000 to secure her release. Believing the story, Brightwell gathered the money, placed it in a box, and sent it via an Uber courier. It wasn’t until another call came in demanding $30,000 for the "injured pregnant woman" that her grandson realized something was wrong.

April later shared the emotional toll of the experience on a GoFundMe page, expressing her deep frustration and loss of faith in humanity. “Evil is too nice a word for the kind of people that can do this,” she wrote.

Deepfake Scams Are on the Rise

Deepfake scams are no longer limited to targeting seniors. This summer, the voice of U.S. Secretary of State Marco Rubio was faked using AI, and scammers attempted to use it to contact foreign and domestic leaders. Similarly, Susie Wiles, President Trump's chief of staff, was targeted by deepfake voice scammers earlier this year.

According to Matthew Wright, PhD, a professor and Chair of Cybersecurity at Rochester Institute of Technology, the increase in these crimes is due to advancements in AI technology. “The technology is getting better and easier to use,” he said. “A few years ago, it required more technical skill, but now with services like ElevenLabs, it's very accessible.”

ElevenLabs is one of the most prominent companies offering realistic voice cloning for businesses, including customer service applications. However, a March report from Consumer Reports criticized companies like ElevenLabs, Lovo, and Speechify for being too lenient in their oversight of non-consensual voice cloning. These companies reportedly only require a checkbox confirming that the person whose voice is being used has given consent.

The Human Element Behind the Scams

Wright emphasizes that there are real people behind these deepfake scams. While criminals once relied on cold calls to target vulnerable individuals, they now have a powerful new tool at their disposal. “A lot of it is organized crime,” he explained. “There are organizations that kidnap people in one country and take them to another, like Malaysia, where they are held in secured facilities and turned into slaves.”

The first step in many of these scams is finding vocal examples that can be cloned. Wright noted that even private individuals can be targeted through social media. “If you’ve posted videos of yourself with friends or family, scammers can use those clips to create a convincing deepfake,” he said. “Even 30 seconds of someone’s voice can be enough to make a realistic clone.”

How to Stay Safe from Deepfake Scams

Wright warns that even hastily created deepfakes can be deceptive. Studies have shown that people often struggle to detect whether a voice is genuine or fake, especially with short audio snippets. “It’s not reliable to assume you can tell if it’s your friend or family member just by listening,” he said.

When you receive a call from an unknown number claiming to be a loved one, it's important to remain skeptical. "Scammers often create a sense of urgency and try to put you in an emotional state to get what they want," Wright said. "Requests for money should always raise red flags."

He also recommends using banks for transactions, since bank employees are trained to recognize scams and can flag fraudulent activity. Additionally, creating a unique password or code with friends and family can provide an extra layer of security. "If you have a secret that only you and the person know, it can help verify if the call is legitimate," Wright said.

Final Thoughts

As deepfake technology continues to evolve, so do the methods used by scammers. Staying informed and taking proactive steps to protect yourself is crucial. Whether it's tightening your privacy settings on social media, using secure transaction methods, or establishing a secret code with loved ones, there are ways to reduce the risk of falling victim to these scams.

If you have a story about a scam or security breach that impacted you, consider sharing it with others. Your experience could help raise awareness and prevent future victims.
