
How to stay safe from AI-powered fraud.

From impersonating your bank or the IRS to mimicking a loved one’s voice, scammers are harnessing AI to create scams so realistic that even the savviest among us may not recognize them. Below, we walk through some of the AI tools scammers are relying on and outline how you can protect yourself and your loved ones from financial fraud.

What are AI-powered scams?

“An AI-powered scam is when a bad actor leverages AI technology to lend authenticity to the scam they’re trying to perpetrate,” says Christopher Garcia, corporate investigations manager with Commerce Bank. “They may use ChatGPT to sound more professional or mimic the language a bank or government agency might use. They can also use voice and even video impersonations, as in ‘grandparent schemes,’ where a scammer mimics a grandchild in distress to lure grandparents into sending money.”

Who is susceptible to AI-powered scams?

“The short answer is: anyone of any age,” reports Garcia. In fact, the FTC says that adults ages 18 to 59 are 34% more likely than those over 60 to report losing money to fraud.

What are examples of recent AI-powered scams?

  • More convincing phishing emails and text messages: “In the past, a lot of phishing emails (emails purporting to be from reputable companies in order to induce the recipient to reveal personal information) were caught by spam filters for red flags like poor grammar, misspellings or strange sentence structures,” says Garcia. “Now, bad actors can run their communications through ChatGPT and even ask it to mimic a bank or government institution. The communications sound more credible, and are more likely to slip through both automated spam filters and our own mental scam detectors.”

  • Video deepfake impersonations: In February 2024, a financial worker at a multinational firm was duped into transferring $25 million into fraudsters’ accounts after scammers used deepfake technology to impersonate the company’s CFO and several other staffers on a live video conference call.

  • Voice deepfake impersonations: In “grandparent schemes,” or family emergency scams, bad actors use AI combined with data harvested from social media accounts to clone a loved one’s voice, then use that voice to trick a parent or grandparent into sending money to help a family member who is supposedly in trouble.

How can you protect yourself from AI-powered scams?

“I used to say ‘trust, then verify,’ but these days just ‘verify’ may be a better mantra,” says Garcia. Some other good rules to follow are:

  • Pause and ask yourself: Does this make sense?
    “Often scammers use fear, urgency and intimidation to get us into a flustered state where we’re more likely to divulge information we otherwise wouldn’t,” says Garcia. “Always remember to slow down and ask yourself whether what the individual is asking you to do actually makes sense for the situation. If it doesn’t, there’s a very good chance you’re being scammed.” If you receive a message that a loved one is in danger, call that person directly and/or get in touch with their parents or friends before divulging any personal information or taking action.

  • Do not answer phone calls or texts from anyone you don’t recognize — even if they reference a company or organization that you do recognize.
    It’s easier than ever for scammers to impersonate institutions, even tricking your Caller ID into displaying the name of a legitimate company. No matter what the hook is, whether it’s that you’ve won a prize or that your account service will be disrupted if you don’t pay, look up the phone number of the institution in question and call them yourself.

  • Never give out your personal information without verifying that you are speaking with the actual organization.
    If a caller, text or email asks you to verify your first or last name, address, driver’s license number, Social Security number, credit card number or any banking information (even if they claim they just need to verify your identity), do not provide it. Instead, call the institution directly.

  • Report scams.
    Remember that even the savviest among us have been tricked by scammers. You can report any scam experience to the FTC at ReportFraud.ftc.gov.

Some final words of advice from Garcia: “Trust your gut — if something feels wrong, it likely is. Talk to friends and loved ones about common scams, and keep in mind that it can happen to anyone.”

