Fraud Education and Prevention Articles

The New Guidelines for Identifying AI Deepfakes

Mar 04, 2026
In September 2025, the American Bankers Association (ABA) and the Department of Justice (DOJ) jointly released new guidelines for detecting AI deepfake media.

There are many variations on deepfake media scams, so consumers should be prepared for new iterations to surface.


How To Detect a Deepfake

When deepfake media first started to emerge, there were often telltale signs, like hands with too many fingers. Technological advancements have made these gaffes less common.

Here's what the ABA and DOJ endorse as deepfake red flags now:

  • Blurry or distorted facial features.
  • A person blinking too much or too little in a video.
  • Inauthentic-looking hair and teeth.
  • Audio and video being out of sync.
  • A flat or unnatural tone of voice.
  • Unnatural shadows or lighting.
  • Emotionally charged messages involving fear or urgency.
  • Requests for money, passwords, personal information, or secrecy.
  • Uncharacteristic behavior from friends, family, or public figures.

How To Avoid Falling Victim to a Deepfake Scam

The partners also recommend these actions to protect yourself from becoming a victim of a scam involving deepfake media:

  • Stop and think before reacting to a video, image, or voice or video call. Is the person asking for money, information, or an action like sharing a post on social media? (These are the things scammers typically request.)
  • Verify the legitimacy of people and requests. If a phone or video call feels urgent, hang up and call back using a number you trust. (Remember, Synovus will never call, email, or text you asking for personal information, login credentials, or computer access.) Use trusted numbers, official websites, and reverse image/video search tools (such as Google Images) to see whether an image has appeared elsewhere on the internet in suspicious contexts.
  • Create codewords or phrases with loved ones. If a suspicious call or video reaches you, ask for the codeword before acting.
  • Do what you can to limit your digital footprint. Scammers use photos, voice clips and videos of people found online to create deepfake media.
  • Never share a video or image online without verifying the source.

It can be frustrating to feel that anything you see online could be fake, and a scam at that. Staying attuned to the latest AI developments and adopting a "verify everything" policy can help you stay ahead of scammers. And if you do fall for a deepfake, as many people do, the ABA and DOJ ask victims to report the incident to their local police, the FBI's Internet Crime Complaint Center, and, if any money changed hands, their financial institution.2,3


Important disclosure information

This content is general in nature and does not constitute legal, tax, accounting, financial or investment advice. You are encouraged to consult with competent legal, tax, accounting, financial or investment professionals based on your specific circumstances. We do not make any warranties as to accuracy or completeness of this information, do not endorse any third-party companies, products, or services described here, and take no liability for your use of this information.

  1. McAfee, "McAfee Study Reveals People's Deep Concerns About the Impact of AI-Generated Deepfakes During Critical Election Year," published April 18, 2024. Accessed March 2, 2026.
  2. American Bankers Association, "Deepfake Media Scams," published September 2, 2025. Accessed March 2, 2026.
  3. Federal Bureau of Investigation, "Internet Crime Complaint Center," accessed March 2, 2026.
  4. Audrina Sinclair, "AI deepfakes are easier to make, harder to spot and made to fool you," CBS News, published February 13, 2026. Accessed March 2, 2026.