Deepfake Scams Are Distorting Reality Itself

Imagine you meet someone new. Be it on a dating app or social media, you chance across each other online and get to talking. They’re genuine and relatable, so you quickly take the conversation out of the DMs to a platform like Telegram or WhatsApp. You exchange photos and even video call each other. You start to get comfortable. Then, suddenly, they bring up money.

They need you to cover the cost of their Wi-Fi access, maybe. Or they’re trying out this new cryptocurrency. You should really get in on it early! And then, only after it’s too late, you realize that the person you were talking to was in fact not real at all.

They were a real-time AI-generated deepfake hiding the face of someone running a scam.

This scenario might sound too dystopian or science-fictional to be true, but it has happened to countless people already. With the spike in the capabilities of generative AI over the past few years, scammers can now create realistic fake faces and voices to mask their own in real time. And experts warn that those deepfakes can supercharge a dizzying variety of online scams, from romance to employment to tax fraud.

David Maimon, the head of fraud insights at identity verification firm SentiLink and a professor of criminology at Georgia State University, has been tracking the evolution of AI romance scams and other kinds of AI fraud for the past six years. “We’re seeing a dramatic increase in the volume of deepfakes, especially in comparison to 2023 and 2024,” Maimon says.

“It wasn’t a whole lot. We’re talking about maybe four or five a month,” he says. “Now, we’re seeing hundreds of these on a monthly basis across the board, which is mind-boggling.”

Deepfakes are already being used in a variety of online scams. One finance worker in Hong Kong, for example, paid $25 million to a scammer posing as the company’s chief financial officer in a deepfaked video call. Some deepfake scammers have even posted instructional videos on YouTube, carrying a disclaimer that they are for “pranks and educational purposes only.” Those videos usually open with a romance scam call, in which an AI-generated handsome young man talks to an older woman.

More traditional deepfakes—such as a pre-rendered video of a celebrity or politician, rather than a live fake—have also become more prevalent. Last year, a retiree in New Zealand lost around $133,000 to a cryptocurrency investment scam after seeing a Facebook advertisement featuring a deepfake of the country’s prime minister encouraging people to buy in.

Maimon says SentiLink has started to see deepfakes used to create bank accounts in order to lease an apartment or engage in tax refund fraud. He says an increasing number of companies have also seen deepfakes in video job interviews.

“Anything that requires folks to be online and which supports the opportunity of swapping faces with someone—that will be available and open for fraud to take advantage of,” Maimon says.

