FTC Warns of Family Scams Enabled by AI Enhancements


Category: Threat Actor Activity | Industry: Global | Level: Strategic | Source: FTC

The Federal Trade Commission has issued a warning to consumers about an advanced version of the family emergency scam. Scammers are using AI voice-cloning technology to mimic the voice of a distressed family member and deceive victims into handing over money. Voice cloning enables a scammer to open a conversation under the guise that a loved one is in a dire situation, such as a car accident or an arrest. Because people have rarely needed to verify a familiar voice, most victims are moved to help based on voice recognition alone. The FTC recommends contacting the family member or loved one directly at a known phone number to confirm the story, and/or corroborating the story with other individuals.
