Imagine receiving a call from a loved one and hearing their unmistakable voice on the other end of the line saying that they urgently need money or your account information. This may sound like a scam lifted straight from science fiction, but — with the rapid development of artificial intelligence (AI) tools — it is a growing reality.
Gur Geva, founder and CEO of iiDENTIFii, a remote biometric digital authentication and automated onboarding technology company, said: “The technology required to impersonate an individual has become cheaper, easier to use, and more accessible. This means that it is simpler than ever before for a criminal to assume one aspect of a person’s identity.”
All a criminal needs to stage an attack is a short audio clip of a family member’s voice — often scraped from social media — and a voice-cloning program.
The Federal Trade Commission (FTC), an independent agency of the United States government, last week issued an alert urging consumers to be vigilant about calls in which scammers sound exactly like their loved ones.
The potential of this technology is vast. Microsoft, for example, recently piloted an AI tool that, given a short sample of a person’s voice, can generate audio in a wide range of languages. While the tool has not been released for public use, it illustrates how easily voice can be manipulated as a medium.
Voice cloning exposes fault lines in voice biometrics. Historically, Geva says, voice has been seen as an intimate and infallible part of a person’s identity. For that reason, many businesses and financial institutions have used it as part of their identity verification toolbox.
Audio recognition technology has been an attractive security solution for financial services companies, with voice-based banking enabling customers to issue account instructions by voice command. Voice biometrics offers real-time authentication, replacing the need for security questions or even PINs. Barclays, for example, integrated Siri to facilitate mobile banking payments without the need to open or log into the banking app.
“As voice cloning becomes a viable threat, financial institutions need to be aware of the possibility of widespread fraud in voice-based interfaces. For example, a scammer could clone a consumer’s voice and transact on their behalf,” explains Geva.
The rise of voice cloning illustrates the importance of sophisticated, multi-layered biometric authentication processes.
Drawing on experience, research, and global insight, iiDENTIFii has created a remote biometric digital verification technology that can authenticate a person in under 30 seconds and, more importantly, triangulates the person’s identity with their verified documentation and their liveness.
iiDENTIFii uses biometrics with liveness detection, protecting against impersonation and deepfake attacks.
“Even voice recognition with motion requirements is no longer enough to ensure that you are dealing with a real person,” says Geva. “Without high-security liveness detection, synthetic fraudsters can use voice cloning, along with photos or videos, to spoof the authentication process.”
Geva emphasized that while identity theft is growing in scale and sophistication, the tools available to prevent fraud are intelligent, scalable, and up to the challenge.