AI Voice-Based Scams Rise as One-Third of Victims Can't Tell if the Voice is Real or Not
As audio deepfake technology goes mainstream alongside other AI-based tools, new data shows that victims are plentiful and unprepared for this kind of attack. Imagine you get a call from your child or grandchild saying they're in trouble, perhaps an accident, and need money. And it really does sound like them. Would you help? Scammers using AI to synthesize sound-alike voices in vishing calls are counting on it.