Man defrauded of Rs. 45,000 in AI voice scam; Know how to stay safe from fakes


Over the last few months, the adoption of artificial intelligence (AI) has been rapid. The world’s biggest tech companies have made efforts to incorporate the technology into their suites of products, making the user experience a lot more convenient. However, the rapid advancements in AI have also given rise to AI scams. Yes, even cybercriminals have jumped on the AI bandwagon. There have been several instances of threat actors defrauding victims using deepfakes or fake voices. Now, shocking details have emerged of a Lucknow man who was defrauded of nearly Rs. 45,000 after he was targeted with a fake, AI-generated voice of a relative. Here is what happened in this AI voice scam.

AI voice scam

According to an ANI report, Kartikeya, a 25-year-old resident of Vineet Khand in Lucknow’s Gomti Nagar, received a call from an unknown number. The caller introduced himself as Kartikeya’s maternal uncle and explained that he wished to send Rs. 90,000 to someone but wasn’t able to carry out the transaction via UPI. According to Lucknow Police, the man then asked Kartikeya to send the money to that person.

Quoting Kartikeya, The Times of India reported, “The impersonator sent five messages of transactions of a total of 90,000 and asked me to send the money to a UPI number as he was not able to send the same through his UPI. After I read the messages, I transferred a total of 90,000 from my bank account to the UPI number the miscreant had given to me.”

However, some of the transactions failed, meaning Kartikeya was only able to send Rs. 44,500 to the account.

Some time later, Kartikeya received multiple SMS messages stating that Rs. 10,000, Rs. 20,000, and Rs. 40,000 had been credited to his account. However, when he checked his account balance, the money was nowhere to be found. He immediately contacted the police, and a case has now been registered at the Gomti Nagar Police Station, its SHO, Deepak Pandey, said.

Triveni Singh, a cyber expert and former SP of the Cyber Cell, said, “The scammers will often use this technique to pose as family members, friends, or even customer service representatives to trick the victim into divulging personal information or sending money.”

This is not the first instance where AI was used to defraud people. In a previous case, Radhakrishan P S, a former Coal India executive, lost Rs. 40,000 in a deepfake scam after he received a video call from a former colleague urgently requesting Rs. 40,000 for his sister’s surgery. The video call itself had been AI-generated using deepfake technology.

How to stay safe against AI scams

1. Be careful when answering calls from unknown numbers; scammers can spoof legitimate-looking numbers to trick you.

2. Whenever you get any communication that stresses words such as “urgent” or “immediately”, always take a step back and go through the content carefully (a simple illustration of this check follows the list).

3. Before acting, take a moment to contact the loved one directly through a different channel to verify and confirm the request.

4. Never open a link or scan a QR code that arrives with such messages. If the sender claims to be from an institution, search online for that institution and see if you can find more information.

5. Do not share your financial information with anyone. Remember, discretion is the better part of valor.
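For readers who filter their messages programmatically, here is a minimal Python sketch of the kind of red-flag check that tips 2, 4, and 5 describe. Everything in it, from the keyword set to the regex patterns, is an illustrative assumption for this sketch rather than a vetted scam detector; treat it as a starting point, not a safeguard.

```python
import re

# Illustrative heuristic only: these keyword lists and patterns are
# assumptions made for this sketch, not a proven detection method.
URGENCY_WORDS = {"urgent", "immediately", "right now", "emergency"}
MONEY_PATTERNS = [r"\bupi\b", r"\brs\.?\s*\d", r"\btransfer\b"]
LINK_PATTERN = r"https?://\S+"

def red_flags(message: str) -> list:
    """Return the reasons a message deserves a second look before acting."""
    text = message.lower()
    flags = []
    if any(word in text for word in URGENCY_WORDS):
        flags.append("urgency language (tip 2)")
    if re.search(LINK_PATTERN, text):
        flags.append("contains a link (tip 4)")
    if any(re.search(pattern, text) for pattern in MONEY_PATTERNS):
        flags.append("mentions money or UPI (tip 5)")
    return flags

if __name__ == "__main__":
    sms = "URGENT: uncle needs Rs. 90000, send to this UPI number immediately"
    print(red_flags(sms))
    # -> ['urgency language (tip 2)', 'mentions money or UPI (tip 5)']
```

A heuristic like this proves nothing on its own; its only job is to force the pause that tip 3 recommends, so that any flagged message is verified through a separate channel before money moves.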

