
Man Scammed by Deepfake Video and Audio of His Friend

A deepfake scam in China has heightened fears that artificial intelligence will make it easier for hackers to steal money and financial information. Hackers using advanced AI software reportedly convinced a man in northern China that he was transferring money to a friend, when the funds were in fact directed to a fraudulent account.

The victim was based in Baotou, a city in China’s Inner Mongolia region, and local police reported that the hacker used AI to recreate the friend’s likeness, face and voice included, on a video call. The deepfake led the victim to believe he was wiring 4.3 million yuan ($622,000) so his friend could make a bidding deposit. When the friend said he knew nothing about the transaction, the man reported the scam to the authorities, who say they have recovered most of the stolen funds and are still working to retrieve the rest, Reuters reported.

Weibo, the Chinese microblogging site, has become a forum for discussing the ongoing threat of deepfake scams, with the hashtag “#AI scams are exploding across the country” going viral and drawing more than 120 million views on Monday. “This shows that photos, voices, and videos all can be utilized by scammers,” one user wrote. “Can information security rules keep up with these people’s techniques?”

The incident comes as AI scams increase drastically worldwide, with reports of scammers using AI to replicate voices and solicit money over the phone. The Department of Homeland Security released a report warning about deepfake scams, saying, “It is apparent that the severity and urgency of the current threat from synthetic media depend on the exposure, perspective, and position of who you ask. The spectrum of concerns ranged from ‘an urgent threat’ to ‘don’t panic, just be prepared.’”

In the UK, the CEO of an energy firm wired €220,000 (approx. $243,000) to a Hungarian supplier’s bank account after receiving a phone call from what he believed was his boss. The voice actually belonged to a scammer using AI voice technology to mimic the boss, and the CEO told The Wall Street Journal that he recognized his boss’s subtle German accent and the “melody” of his voice.

Meanwhile, in the U.S., a police department in southern Wisconsin warned residents earlier this month about similar scams after a resident reported receiving a call from someone who “sounded like their relative,” NBC Chicago reported. Although police said they couldn’t officially verify that the voice was AI-generated, they wrote in a Facebook post, “We want our community to be aware that this technology is out there.”

The Department of Homeland Security said these scenarios will “undoubtedly increase as the cost and other resources needed to produce usable deep fakes simultaneously decreases.” To avoid becoming a victim, the DHS recommends calling the person who supposedly requested money to verify the request, and staying alert to common scam demands, which include asking “for victims to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs.”
