Not your grandson...
AI strikes again.
From the Washington Post:
The man calling Ruth Card sounded just like her grandson Brandon. So when he said he was in jail, with no wallet or cellphone, and needed cash for bail, Card scrambled to do whatever she could to help.
“It was definitely this feeling of … fear,” she said. “That we’ve got to help him right now.”
Card, 73, and her husband, Greg Grace, 75, dashed to their bank in Regina, Saskatchewan, and withdrew 3,000 Canadian dollars ($2,207 in U.S. currency), the daily maximum. They hurried to a second branch for more money. But a bank manager pulled them into his office: Another patron had gotten a similar call and learned the eerily accurate voice had been faked, Card recalled the banker saying. The man on the phone probably wasn’t their grandson.
That’s when they realized they’d been duped.
“We were sucked in,” Card said in an interview with The Washington Post. “We were convinced that we were talking to Brandon.”
Advancements in artificial intelligence have added a terrifying new layer, allowing bad actors to replicate a voice with just an audio sample of a few sentences. Powered by AI, a slew of cheap online tools can translate an audio file into a replica of a voice, allowing a swindler to make it “speak” whatever they type.
Are the AI companies moving so fast that they’re not thinking about long-term ramifications?