(Image source: Freepik.com)
The misuse of deepfake technology in financial fraud has become a global problem. Scammers use this advanced technology to create highly realistic but fake videos and audio recordings to deceive people.
What is Deepfake Technology?
Deepfake is a combination of "deep learning" and "fake" and refers to artificially generated audio-visual content. The technology uses artificial intelligence (AI) to superimpose one person's face or voice onto another in a video or audio recording, making it possible to manipulate or mislead viewers by putting words into someone's mouth or showing them doing things they never did.
How do deepfake scams happen?
The scam often begins with a message, call or video call. Using deepfake technology, scammers can place a video call and briefly display a face that looks very similar to someone the victim knows. They then hang up abruptly and switch to a voice call, claiming there are network issues, and may also imitate the voice of the person they are impersonating. Scammers typically fabricate an emergency, such as a medical crisis, and demand money immediately. Given how sophisticated the technology is, it is easy to believe that someone you know is calling because they genuinely need your help. Scammers exploit the victim's emotional connection to the person they are impersonating and make the situation feel urgent and real so that money is sent without hesitation. Because deepfake video and audio are so convincing, victims believe they are actually communicating with someone they know.
How can you protect yourself against deepfake fraud?
According to Kotak Mahindra Bank's safe banking tips: if someone you know approaches you for urgent financial help, contact people close to them to verify the need. If possible, meet in person before making any payment. Do not send money to alternate numbers via wallet/UPI, and do not transfer funds to a third party until you are sure the money is really needed.

In a notable case reported by CNBC in 2024, an employee of a company operating in Hong Kong transferred funds after receiving instructions during a video call in which the chief financial officer and other colleagues appeared to take part. It was later revealed that he had not spoken to any of them; fraudsters had used deepfake technology to mimic their appearances, and the money was sent before the deception was discovered.
In India in 2023, a man in Kerala fell victim to an AI scam and lost Rs 40,000 after receiving a call from someone claiming to be his former colleague. Faced with the dangers of such misuse of the technology, financial institutions have begun issuing public service alerts to warn customers. Bank of Baroda, one of India's leading state-owned banks, for example, recently launched a fraud awareness campaign aimed at alerting customers to emerging financial fraud such as AI-generated deepfakes and urging them to stay vigilant. The campaign emphasizes that by staying alert and recognizing and reporting scammers, customers can protect themselves and their sensitive financial information and continue to bank and shop online safely.