
AI fraud warning: Voice clones are turbocharging scams, FTC Chair warns

April 26, 2025

AI tools, particularly voice clones, are supercharging scams, FTC Chair Lina Khan warned.

Speaking at an AI and venture capital event on Thursday, Khan said officials will have to be proactive to stop AI-powered fraud, Bloomberg reported.

She noted that regulators are already seeing AI "turbocharge" scams, citing as an example voice clones that can mimic family members in distress, according to Bloomberg.

"We need to be vigilant early," Khan said, according to Bloomberg. "If anything you need to be especially vigilant on the front-end because it’s much more difficult to solve these problems after."


Voice clones have already proven capable of fooling people. A CNN report noted that fake kidnapping calls demanding ransoms using the technology were becoming more common. Jennifer DeStefano told the outlet about receiving a call demanding $1 million for her daughter before realizing the caller had used AI to copy her daughter's voice.

"The voice sounded just like Brie’s, the inflection, everything," she told CNN.

AI voice clones have even been used to create popular songs featuring fake versions of real artists, and a relatively easy-to-spot fake image of the Pentagon under attack caused a real dip in the stock market.

"As this stuff becomes more embedded in how daily decisions are being made, I think they invite and merit a lot of scrutiny," Khan said, according to Bloomberg. "Those problems and concerns are quite urgent and I think enforcers, be it at the state level or the national level, are going to be acting."