Deepfakes work because they exploit trust
Professor Monica Whitty, Professor of Human Factors in Cyber Security at Monash University – who is well-known for her work on the prevention, disruption, and detection of cyber fraud – says deepfakes tap into people’s natural instincts.
“Humans tend to trust faces, voices and familiar people. Deepfakes take advantage of that instinct.”
She said a lack of open discussion increases vulnerability.
“The data shows that many Australians don’t talk openly about deepfake scams – with only a third discussing AI-generated scams with their relatives or friends. That means fewer opportunities to share warning signs or learn from others’ experiences.”
Despite nearly three-quarters of Australians (74%) agreeing that they should set up a safe word with their loved ones to confirm it’s really them, only one in five (20%) say they have set one up.
Roberts says having a simple way to verify who you’re speaking with is becoming increasingly important. “Scammers can fake voices now, so it’s okay to double-check. In fact, it’s smart.”
That’s also why CommBank introduced CallerCheck, allowing customers to verify whether a caller claiming to be from the bank is legitimate by triggering a security message in their CommBank app.
“Be vigilant. Educate yourself. And if things look suspicious, talk with others about it,” Professor Whitty added.
What Australians and small businesses are experiencing
Around one in four Australians (27%) say they witnessed a deepfake scam in the past year. The most common types were:
- Investment scams (59%)
- Business email compromise scams (40%)
- Relationship scams (38%).
Around four in ten (41%) small business owners are familiar with deepfake scams.
Small businesses reported that half of all deepfake scam attempts (50%) arrived by email, yet only 55% had cross-checked supplier payment details in the last six months.
Roberts said more open conversations at home and work are essential.
“Scammers are using AI to create fake investment videos, deepfake celebrities, and even voice and text clones of loved ones, senior executives and government officials. Talking openly about this technology is one of the easiest ways to help stay ahead of it.”
A national cross-sector effort is needed
Roberts says deepfakes require coordinated action across the scams ecosystem.
“We recognise the impact of scams on Australians and support the Australian Government’s Scams Prevention Framework to introduce obligations initially across banks, telcos and digital platforms. Deepfakes are showing up on social media, messaging platforms, websites and even through phone calls – and we welcome stronger protections across those industries, as well as banking.
“Deepfakes are new, but protecting yourself hasn’t changed – and with stronger protections across all channels, we can help keep more Australians safe,” Roberts added.
How to help protect yourself from deepfake scams
Roberts says the core approach remains unchanged.
“The principles of ‘Stop. Check. Reject.’ can still help beat even the most convincing AI-enhanced scams,” Roberts said.
Investment scams – deepfake celebrities and experts with ‘don’t-miss-out’ success stories
Deepfake videos imitate well-known people to promote fake investments.
- Stop: Avoid investing through a social media link and be especially cautious of any investment ad featuring a celebrity.
- Check: Speak with someone you trust, such as your independent financial advisor, before transferring money, and check ASIC’s Moneysmart Investor Alert List.
- Reject: If you’re unsure, block, delete and report suspicious content to the platform where you saw the deepfake.
“Hey Mum/Dad” phishing scams – urgent calls and texts from someone you love
Voice and text cloning technology can convincingly mimic a family member.
- Stop: Slow down – urgency is a tactic used to create panic.
- Check: Set up a safe word for your family to use to help protect each other.
- Reject: Hang up and call back via their usual number.
Small business invoice scams – AI-altered documents
Scammers use AI to create realistic invoice copies with altered payment and contact details.
- Stop: If anything looks different – the account number, the tone, the logo – stop and take a closer look.
- Check: Always verify new payment beneficiaries or changes to banking details through a trusted channel. Call the supplier or service provider (e.g. a plumber or conveyancer) on an independently verified number – not the one on the invoice – before paying.
- Reject: If something feels off, delete the email or invoice and use your verified contact details instead.
Romance scams – deepfake faces and fake video calls
AI can create real-time face swaps that appear authentic.
- Stop: Be suspicious if a romantic interest you’ve never met in person asks for money – remember, even video calls can be faked.
- Check: Talk to a friend – secrecy is a major risk factor – and arrange to meet them safely in person.
- Reject: Never send money to someone you haven’t met in person.
Business impersonation – fake CEOs and fake voices
Deepfake audio and voice clones can impersonate senior executives to make urgent payment requests seem legitimate.
- Stop: Don’t act on unexpected instructions to transfer money.
- Check: Verify urgent payment requests via a trusted independent channel.
- Reject: Report concerns to Finance or IT immediately.
Notes to editors
Research conducted in September 2025 with 1,988 respondents nationally.
- 42% said they were familiar with deepfake scams.
- 89% said they could spot a deepfake scam, but only 42% correctly distinguished between real and AI-generated images when tested (those aged over 65 were 6% less accurate).
- 27% said they had witnessed a deepfake scam in the past year – with 59% being investment scams, 40% business email compromise (payment redirection) scams, and 38% relationship scams.
- 67% have not discussed AI-generated scams with their relatives or friends.
- 74% said they should set up a safe word but only 20% said they have.
Small-business-related insights:
- 41% said they were familiar with deepfake scams.
- 50% of deepfake scam attempts arrived by email, according to small businesses.
- 55% of small businesses said they had cross-checked supplier payment details in the last six months.
- 48% said they verify suspicious information.
Multimedia assets relating to this announcement are available. Please email [email protected] for more details.