If you receive an unusual phone call from a family member in trouble, be cautious: the caller could be a scammer using AI voice-cloning technology to impersonate your relative. The Federal Trade Commission has issued a warning about fraudsters using commercially available voice-cloning software to run family emergency scams.
These scams have been around for a long time: the perpetrator impersonates a family member, usually a child or grandchild, calls the victim, and claims to be in desperate need of money to deal with an emergency. According to the FTC, AI-powered voice-cloning software can make the impersonation sound even more authentic, duping victims into handing over their money.
“All he (the scammer) needs is a short audio clip of your family member’s voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one,” the FTC says in the Monday warning.
The FTC did not immediately respond to a request for comment, leaving it unclear whether the US regulator has noticed an increase in voice-cloning scams. However, the warning comes just a few weeks after The Washington Post detailed how scammers are using voice-cloning software to prey on unsuspecting families.
[…]
This article has been indexed from CySecurity News – Latest Information Security and Hacking Incidents