Cybercriminals are using artificial intelligence to clone people's voices, and the crimes are leading to distressing phone calls for people around the country. All that's needed is a brief clip of someone talking, often pulled from social media, and scammers can produce an eerily similar clone of the voice — a deception so convincing that a mother thinks she's hearing her own child.
Victims of voice-cloning scams are tricked into thinking they're talking to a distraught relative who desperately needs money because they've been in a car accident, robbed, or kidnapped. Readily available technology lets cybercriminals respond in real time during these calls by typing sentences into their voice-cloning apps, which then speak the words in the cloned voice. Some go as far as researching personal details about the victim's relative to make the call more believable. Cybercriminals often ask for forms of payment that are difficult to trace or recover, such as gift cards, wire transfers, reloadable debit cards, and even cryptocurrency. As always, requests for these kinds of payments should raise a major red flag. To protect yourself from becoming a victim of a voice-cloning scam, do the following: