The federal government should use part of the $10 billion allocated in the budget for cybersecurity defences to combat people using AI to bypass biometric safeguards, including voice authentication, a Greens senator said.
On Friday, Guardian Australia reported that Centrelink’s voice authentication system can be fooled using a free online AI cloning service and just four minutes of audio of the user’s voice.
After Guardian Australia journalist Nick Evershed cloned his own voice, he was able to access his account using his cloned voice and customer reference number.
The Voiceprint service, provided by Nuance, the Microsoft-owned voice software company, was being used by 3.8 million Centrelink customers at the end of February, and more than 7.1 million people had verified their voice with the same system at the Australian Taxation Office.
Despite being alerted to the vulnerability last week, Services Australia has not indicated it will change its use of voice identification, saying the technology is a “highly secure authentication method” and that the agency continually scans for potential threats and makes ongoing improvements to ensure customer safety.
Greens senator David Shoebridge said the finding was “deeply concerning” for people who depend on government services, and that there needed to be a regulatory framework for the collection and use of biometric data.
“The concerns here go beyond the use of AI to fool the voiceprint,” he said. “There are few, if any, protections on the collection or use of our biometric data to power and train corporate AI systems.
“We cannot rely on a hit-and-run approach to digital security where problems are only solved once they embarrass the federal government.”
Shoebridge said the $10 billion in funding in last year’s budget for the Australian Signals Directorate’s Redspice cyber defence program should include investment to ensure threats such as the misuse of AI can be identified and protected against across the whole of government.
Guardian Australia has sought comment from the deputy prime minister and defence minister, Richard Marles, who is responsible for the ASD.
Shoebridge said the government should also audit agencies’ use of voice identification to ensure any further security flaws are identified and fixed.
“The government’s main objective in using such technologies is to reduce operational costs, rather than doing what is best for the millions of Australians who depend on government agencies and services,” he said. “These government savings are almost always paid for by Centrelink customers.”
In the 2021-22 financial year, Services Australia reported using voice biometrics to authenticate 56,000 calls each business day, and more than 39% of all calls to Centrelink’s main business lines.
Between August 2021 and June 2022 it was used on 11.4% of all child support calls, or more than 450 each business day.