
How Financial Institutions Can Safeguard Against Deepfakes

As more and more people embrace digital banking, deepfake technology is a serious threat.

Deepfake technology makes it possible to manipulate a person's facial and vocal likeness with alarming accuracy. For the most part, this can be seen as harmless entertainment. But what if your likeness were used to drain your savings or commit fraud?

As the tools for creating deepfakes become cheaper and easier to use, the need to guard against these cybercrimes has come to the forefront.

A deepfake is a video, image, or audio recording that has been distorted, manipulated, or synthetically created using deep learning techniques to present an individual, or a hybrid of several people, saying or doing something they did not say or do.

These deepfakes are often used in digital injection attacks: sophisticated, highly scalable, and replicable cyberattacks in which manipulated footage bypasses the camera on a device or is injected directly into a data stream.

The Chief Operating Officer of iiDENTIFii, Murray Collyer, says, “Digital injection attacks present the highest threat to financial services, as the AI technology behind them is affordable and the attacks are rapidly scalable.”

In fact, a recent digital security report by technology partner iProov illustrates how, in an indiscriminate attempt to bypass an organization’s security systems, some 200 to 300 attacks were launched globally from the same location within a 24-hour period.

As more people set up digital accounts and do their banking online, financial crime and cybercrime have become inextricably linked. Interpol states that financial and cybercrimes are the world’s leading crime threats and are projected to increase the most.

Collyer noted, “Deepfake technology is one of the most rapidly growing threats within financial services, yet not all verification technologies are resilient to it. Password-based systems, for example, are highly susceptible to fraud. South Africa needs to strengthen its technology to outwit cybercriminals.”

While deepfakes are a severe threat, the technology and processes exist to safeguard financial services companies against this method of fraud.

A growing share of face biometric technology incorporates some form of liveness check, such as wink-and-blink prompts, to verify and authenticate customers. Liveness detection uses biometric technology to determine whether the individual presenting is a real human being rather than a presented artifact. This means the technology can detect a deepfake that is played back on a device and held up to the camera.

While many liveness detection technologies can determine whether someone is committing fraud by holding a physical artifact (for example, a printed picture or a mask of the person transacting) up to the camera, many solutions cannot detect digital injection attacks.
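To make that distinction concrete, here is a minimal, hypothetical sketch of how a verification service might combine presentation-attack detection with basic defences against injected or replayed video. The field names, thresholds, and signed-capture check are invented for illustration and do not describe iiDENTIFii's or iProov's actual products.

```python
from dataclasses import dataclass


@dataclass
class CaptureResult:
    """Hypothetical output of a face-capture step on the user's device."""
    frames: list                   # raw video frames from the device camera
    liveness_score: float          # 0.0-1.0 from a presentation-attack detector
    capture_signed: bool           # True if the capture came from a trusted camera pipeline
    timestamp_skew_seconds: float  # difference between capture time and server time


LIVENESS_THRESHOLD = 0.9   # illustrative value only
MAX_TIMESTAMP_SKEW = 5.0   # reject stale or replayed captures


def verify_capture(capture: CaptureResult) -> bool:
    """Reject presentation attacks (printed photos, masks, replayed screens)
    and flag likely digital injection (synthetic video fed in without a real camera)."""
    # Presentation-attack detection: blink prompts, texture and depth cues, etc.
    if capture.liveness_score < LIVENESS_THRESHOLD:
        return False
    # Injection defence: require an attested camera pipeline so a synthetic
    # stream cannot simply be substituted for the genuine camera feed.
    if not capture.capture_signed:
        return False
    # Replay defence: a fresh timestamp limits reuse of previously recorded footage.
    if abs(capture.timestamp_skew_seconds) > MAX_TIMESTAMP_SKEW:
        return False
    return True
```

The design point the sketch is meant to show is that a high liveness score alone is not enough; the service also needs some evidence that the frames genuinely came from the device's camera at the time of the check.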

Collyer says specialized technology is required to combat deepfakes.

“Within iiDENTIFii, we have seen success with the use of sophisticated yet accessible 4D liveness technology, which includes a timestamp and is further verified through a three-step process where the user’s selfie and ID document data are checked with relevant government databases. This enables us to accurately authenticate someone’s identity,” explained Collyer.
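As a rough illustration of the three-step process Collyer describes (a timestamped liveness capture, a selfie-to-ID comparison, and a check against government records), the sketch below strings the steps together. Every helper function, field name, and threshold here is a placeholder for illustration, not iiDENTIFii's actual API.

```python
from dataclasses import dataclass


@dataclass
class IDDocument:
    """Hypothetical ID document data extracted during onboarding."""
    number: str
    holder_name: str
    photo: bytes


def passes_liveness_check(selfie_capture: bytes) -> bool:
    """Placeholder: a real system would run timestamped liveness detection here."""
    return bool(selfie_capture)


def face_match_score(selfie_capture: bytes, id_photo: bytes) -> float:
    """Placeholder: a real system would compare biometric templates."""
    return 1.0 if selfie_capture and id_photo else 0.0


def government_record_matches(number: str, holder_name: str) -> bool:
    """Placeholder: a real system would query the relevant government database."""
    return bool(number and holder_name)


def authenticate_identity(selfie_capture: bytes, doc: IDDocument) -> bool:
    """Illustrative three-step identity check following the process described above."""
    if not passes_liveness_check(selfie_capture):            # Step 1: live person, not a replay
        return False
    if face_match_score(selfie_capture, doc.photo) < 0.85:   # Step 2: selfie matches ID photo
        return False
    return government_record_matches(doc.number, doc.holder_name)  # Step 3: database check
```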

With the right technology, it is possible not only to protect consumers and businesses against deepfake financial crimes but also to create a user experience that is simple, accessible, and safe for all.

Collyer will be among the speakers at the 8th installment of the AML, Financial Crime Southern Africa Conference. The high-level conference is being hosted at the Indaba Hotel Fourways in South Africa and is attended by professionals from banks, insurance and investment companies, service providers, government, and MLCOs from non-designated financial service providers.



Joan Banura

Joan Banura is an aspiring journalist with a passion for all things tech. She is committed to providing insightful and thought-provoking content that keeps our readers informed and engaged.