How good are Australians at spotting an AI-powered deepfake scam?

Our new research shows only four in ten can. Encouragingly, the simple steps to help protect yourself stay the same, even against AI-powered scams.

13 January 2026

Nearly nine in ten Australians (89%) are at least somewhat confident they can spot an AI-generated scam, but new research from CommBank shows the opposite: when tested, Australians correctly distinguished real images from AI-generated ones only 42% of the time, worse than a random guess. Australians aged over 65 were only 6% less accurate than younger respondents, showing that deepfakes can fool people of all ages.

At the same time, fewer than half of Australians (42%) are familiar with AI-enhanced scams, despite deepfakes exploding across social media platforms, websites, messaging apps and telecommunication channels.

Deepfakes are new but the steps to protect yourself haven’t changed

James Roberts, General Manager of Group Fraud, said: “The findings reveal a growing gap between confidence and reality – and that gap is exactly what scammers are looking to exploit as they increasingly turn to AI to target everyday Australians and small businesses.”

He said Australians should not feel overwhelmed by the pace of technological change.

“The good news is that the steps that keep people safe don’t need to evolve at the same speed as the technology. Deepfakes might be new, but the same tried-and-tested habits – slowing down, checking details and speaking with someone you know and trust, such as a family member – remain your best defence, even against AI-powered scams.”

Some of the images below are real and some are generated by AI. Can you tell which are real and which are AI?

Deepfakes work because they exploit trust

Professor Monica Whitty, Professor of Human Factors in Cyber Security at Monash University – who is well-known for her work on the prevention, disruption, and detection of cyber fraud – says deepfakes tap into people’s natural instincts.

“Humans tend to trust faces, voices and familiar people. Deepfakes take advantage of that instinct.”

She said lack of open discussion increases vulnerability.

“The data shows that many Australians don’t talk openly about deepfake scams – with only a third discussing AI-generated scams with their relatives or friends. That means fewer opportunities to share warning signs or learn from others’ experiences.”

Despite nearly three-quarters of Australians (74%) agreeing that they should set up a safe word with their loved ones to confirm it’s really them, only one in five (20%) say they have set one up.

Roberts says having a simple way to verify who you’re speaking with is becoming increasingly important. “Scammers can fake voices now, so it’s okay to double-check. In fact, it’s smart.”

That’s also why CommBank introduced CallerCheck, allowing customers to verify whether a caller claiming to be from the bank is legitimate by triggering a security message in their CommBank app.

“Be vigilant. Educate yourself. And if things look suspicious, talk with others about it,” Professor Whitty added.

What Australians and small businesses are experiencing

Around one in four Australians (27%) say they witnessed a deepfake scam in the past year. The most common types were:

  • Investment scams (59%)
  • Business email compromise scams (40%) and
  • Relationship scams (38%).

Around four in ten (41%) small business owners are familiar with deepfake scams.

Small businesses reported that half of all deepfake scam attempts (50%) arrived by email, yet only 55% had cross-checked supplier payment details in the last six months.

Roberts said more open conversations at home and work are essential.

“Scammers are using AI to create fake investment videos, deepfake celebrities, and even voice and text clones of loved ones, senior executives and government officials. Talking openly about this technology is one of the easiest ways to help stay ahead of it.”

A national cross-sector effort is needed

Roberts says deepfakes require coordinated action across the scams ecosystem.

“We recognise the impact of scams on Australians and support the Australian Government’s Scam Prevention Framework to introduce obligations initially across banks, telcos and digital platforms. Deepfakes are showing up on social media, messaging platforms, websites and even through phone calls – and we welcome stronger protections across those industries, as well as banking.

“Deepfakes are new, but protecting yourself hasn’t changed – and with stronger protections across all channels, we can help keep more Australians safe,” Roberts added.

How to help protect yourself from deepfake scams

Roberts says the core approach remains unchanged.

“The principles of ‘Stop. Check. Reject.’ can still help beat even the most convincing AI-enhanced scams,” Roberts said.

Investment scams – deepfake celebrities and experts with ‘don’t-miss-out’ success stories.
Deepfake videos imitate well-known people to promote fake investments.

  • Stop: Avoid investing through a social media link and be especially cautious of any investment ad featuring a celebrity.
  • Check: Speak with someone you trust, such as an independent financial adviser, before transferring money, and check ASIC’s Moneysmart Investor Alert List.
  • Reject: If you’re unsure, block, delete and report suspicious content to the platform where you saw the deepfake.

“Hey Mum/Dad” phishing scams – urgent calls and texts from someone you love
Voice and text cloning technology can mimic a family member perfectly.

  • Stop: Slow down – urgency is a tactic used to create panic.
  • Check: Set up a safe word for your family to use to help protect each other.
  • Reject: Hang up and call back via their usual number.
Small business invoice scams – AI-altered documents
Scammers use AI to create realistic invoice copies that change payment and contact details.

  • Stop: If anything looks different – the account number, the tone, the logo – stop and take a closer look.
  • Check: Always verify new payment beneficiaries or changes to banking details through a verified channel. Call the supplier or service provider (e.g. plumber or conveyancer) on a verified number – rather than the one on the invoice – before paying.
  • Reject: If something feels off, delete the email or invoice and use your verified contact details instead.

Romance scams – deepfake faces and fake video calls
AI can create real-time face swaps that appear authentic.

  • Stop: Don’t send money to a romantic interest you’ve never met in person – remember, even video calls can be faked.
  • Check: Talk to a friend – secrecy is a major risk factor – and arrange to meet safely in person.
  • Reject: Never send money to someone you haven’t met in person.

Business impersonation – fake CEOs, fake voices

  • Stop: Don’t act on unexpected instructions to transfer money.
  • Check: Verify urgent payment requests via a trusted independent channel.
  • Reject: Report concerns to Finance or IT immediately.

Notes to editors

Research conducted in September 2025 with 1,988 respondents nationally.

  • 42% said they were familiar with deepfake scams.
  • 89% said they could spot a deepfake scam but only 42% were able to correctly distinguish between real and AI-generated images when tested (while those over 65 were 6% less accurate).
  • 27% said they had witnessed a deepfake scam in the past year – with 59% being investment scams, 40% business email compromise (payment redirection) scams, and 38% relationship scams.
  • 67% have not discussed AI-generated scams with their relatives or friends.
  • 74% said they should set up a safe word but only 20% said they have.

Small-business-related insights:

  • 41% said they were familiar with deepfake scams.
  • 50% of deepfakes arrived by email according to small businesses.
  • 55% of small businesses said they had cross-checked supplier payment details in the last six months.
  • 48% said they verify suspicious information.

Multimedia assets relating to this announcement are available. Please email [email protected] for more details.

Newsroom

For the latest news and announcements from Commonwealth Bank.

Things you should know

Media releases are prepared without considering an individual reader’s objectives, financial situation or needs. Readers should consider the appropriateness to their circumstances. Visit Important Information to access Product Disclosure Statements or Terms and Conditions which are currently available electronically for products of the Commonwealth Bank Group, along with the relevant Financial Services Guide. Target Market Determinations are available here. Loan applications are subject to credit approval. Interest rates are correct at the time they are published and are subject to change. Fees and charges may apply.