'Please Help Me': How Deepfakes Can Steal Your Money in Seconds
Financial institutions have started issuing public interest communications to alert customers.

The misuse of Deepfake technology in financial fraud has become a growing concern around the world. Fraudsters leverage this advanced technology to deceive individuals with highly realistic but fake videos and audio.

What Is Deepfake Technology?

Deepfake technology enables the creation of realistic videos, audio recordings and images that mislead viewers by superimposing one person's likeness onto another and altering their words and actions, thereby presenting a false narrative or spreading misinformation.

Deepfakes, a blend of “deep learning” and “fake,” refer to synthetic audio-visual content. The technology employs artificial intelligence (AI) to transpose one individual’s facial movements or voice onto another person in video or audio recordings.

How Does the Deepfake Fraud Happen?

The scam typically begins with an unsolicited message, call or video call.

Using Deepfake technology, a fraudster may initiate a video call and briefly display a face that closely resembles someone you know. They will then quickly disconnect, switching to a voice call and claiming network issues. They can also mimic the voice of the person they are impersonating.

Typically, the fraudster will fabricate an emergency and request money urgently.

They might mention a medical emergency or a similar crisis, appealing to you for help. Given the sophistication of the technology, it is easy to believe that you are being contacted by someone you know in genuine need.

Fraudsters exploit the victim’s emotional connection to the person they are impersonating, framing the request for money as urgent and genuine.

Due to the advanced nature of Deepfake technology, the visuals and audio are highly convincing, leading the victim to believe they are genuinely in contact with their acquaintance in need.

How To Safeguard Yourself From Deepfake Scams?

According to Kotak Mahindra Bank’s safe banking tips, if someone you know contacts you seeking urgent financial help, try to verify their need by contacting people close to them.

If possible, you should meet in person before making any payments.

Don’t agree to send payments to alternate numbers via wallets/UPI.

Don’t transfer funds to anyone unless you have satisfied yourself that the need is genuine.

In a significant case, CNBC reported that in January 2024 an employee at a firm in Hong Kong transferred US$25 million to fraudsters. The instruction came during a video call purportedly involving her chief financial officer and other colleagues.

Later, it was revealed she hadn’t been on a call with any of them; fraudsters had used Deepfake technology to imitate their appearances, leading her to unwittingly send the money.

In India, a man in Kerala fell prey to an artificial intelligence scam in 2023 and lost Rs 40,000 after he received a call from someone claiming to be his former colleague.

Given the threat posed by misuse of the technology, financial institutions have started issuing public interest communications to alert customers.

For example, Bank of Baroda, one of India’s leading public sector banks, recently launched a banking fraud awareness campaign to raise awareness of new-age financial frauds, such as AI-generated Deepfake scams, that can dupe even the most vigilant customers.

The ads emphasise that by staying alert, watchful and able to recognise a fraud or con, customers can protect themselves and their confidential financial information and enjoy a safe and secure online banking and shopping experience.
