AI-powered voice-cloning becomes scammers’ weapon of choice

Technology
24 May 2023

More people are falling victim to elaborate deepfakes, says cyber security specialist Recorded Future.

Voice-cloning technology is being turbocharged by AI and takes bank fraud, executive impersonation and a range of other scams to the next level, according to cyber security company Recorded Future.

Its latest report on deepfakes revealed AI-enhanced voice replication could defeat standard security protocols such as multifactor authentication, and that the technology was readily available.

“Currently, the most effective use of voice cloning technologies is in generating one-time samples that can be used in extortion scams, disinformation, or executive impersonation,” it said.


“One of the most popular voice cloning platforms on the market is ElevenLabs’s Prime Voice AI, a browser-based text-to-speech software that allows users to upload ‘custom’ voice samples for a premium fee.”

“Voice cloning technologies, such as ElevenLabs, lower the barrier to entry for inexperienced English-speaking cybercriminals seeking to engage in low-risk impersonation schemes and provide opportunities for more sophisticated actors to undertake high-impact fraudulent schemes.”

Recorded Future threat intelligence analyst Alexander Leslie said financial services, retail, media and government services were struggling to fight against “perpetrators that are getting smarter and sneakier”.

“Technologies such as AI are making it easier for scammers to fool consumers and professionals, but also to spread complex scams quickly and at scale.”

“The rise of deepfakes is a great example of this, and what we’re seeing is that more Australians are falling victim to these elaborate schemes.”

The report, titled I Have No Mouth, And I Must Do Crime, highlights six areas of particular vulnerability to voice cloning: executive impersonation, bank fraud, callback scams, family emergency scams, AI music and copyright infringements, and disinformation.

It said the impersonation of high-level executives to defraud enterprises and banks had been used many times over the past four years, with scammers trying to trick targets into participating in recorded phone calls or video chats.

Organisations should develop unique voiceprints for their executives to make their voices harder to replicate, it said. Bank helplines, which often used voice recognition to authenticate callers, might already be moving away from this technology in favour of more sophisticated detection software.

Mr Leslie said voice cloning detection systems included real-time voice analysis capability and anti-spoofing technology such as liveness detection. Vulnerable organisations should also consider staff training and education.

“I advise Australian organisations to train their employees on the risks associated with voice cloning and how to identify suspicious activity related to voice cloning attacks, including executive impersonations,” he said.

Callback scams, also known as “Wangiri”, were a popular technique that targeted both individuals and enterprises. Scammers called victims and disconnected after one ring in the hope that the victim would return the call. The victim was then manipulated into staying on the call as long as possible, often by being placed on hold. The return call was routed through an international number that unbeknownst to the victim accumulated international calling fees.

Family emergency scams involved the scammer posing as a family member or friend in need of emergency financial assistance. Scammers faked authority figures, such as police, lawyers or doctors, or replicated the voices of loved ones. Organisations could establish emergency contact verification protocols to mitigate the risk of this.

AI was also the driving force behind musical mimicry designed to get around copyright, and behind the impersonation of public figures made to appear to say things they never actually said. These could be used to create fake news reports, manipulate audio or spread disinformation through social media.

Mr Leslie said there was an urgent need for an organised response.

“The outlook for voice cloning and its use in particular in banking fraud, disinformation, social engineering, copyright infringement and more is bleak if we do not immediately adopt an industry-wide approach to mitigating associated risks.

“Mitigation strategies need to be multidisciplinary, and adopting a framework that educates employees, users and consumers will be more effective in the short-term than fighting abuse of the technology itself, which should be a long-term strategic goal.”

The federal government recently funded the $58 million launch of a National Anti-Scam Centre to help address the issue.

About the author

Philip King is editor of Accounting Times, Accountants Daily and SMSF Adviser, the leading sources of news, insight, and educational content for professionals in the accounting and SMSF sectors. Philip joined the titles in March 2022 and brings extensive experience from a variety of roles at The Australian national broadsheet daily, most recently as motoring editor. His background also takes in spells on diverse consumer and trade magazines. You can email Philip on: [email protected]
