Phone Scammers Evolve: AI-Powered Voice Mimicry Poses New Threat

In the ever-evolving battle against phone scammers and robocalls, a growing concern is the use of artificial intelligence (AI) to mimic victims’ voices, making these scams even more convincing. While efforts have been made to curb scam calls, individuals still need to bolster their phone defenses and remain vigilant.
Phone scammers and robocalls have become an epidemic, with billions of spam calls plaguing people worldwide. Voice security company Hiya reported a staggering 6.5 billion spam calls in a single quarter. The problem is particularly acute in the United States, where people receive an average of 12 scam calls per person per month and one in four calls is unwanted, according to the company’s Q2 report.
AI Voice Mimicry Adds a Dangerous Twist
The latest development in phone scams is the use of AI to record victims’ voices and replicate them in vishing (voice phishing) attacks. Generative AI text-to-speech technology lets scammers pose as someone familiar to the victim, even weaving in personal details to make the call more believable. This puts individuals at risk of inadvertently sharing sensitive information with scammers.
As scammers become more sophisticated, individuals need to strengthen their defenses against phone scams. Cross-referencing multiple apps th

[…]
