- Jul 07, 2025
- 9 min read
Hot New Fraud Trends in 2025: AI Scams, Pig Butchering, and Mobile Payment Attacks

When an American couple received a call from their "grandson" asking for help, they didn't hesitate to act. The caller claimed he was in jail without his phone or wallet following a car crash and urgently needed money for bail. The grandmother rushed to the bank and took out as much cash as she could. Only later did she learn it was a scam: the voice on the call had been a deepfake.
In 2025, AI is everywhere, with deepfakes now accounting for 7% of global fraudulent activity. However, deepfakes are just one part of a broader wave of new sophisticated threats to watch out for this year.
In this article, we explore the most dangerous fraud trends and what both businesses and users can do to mitigate risk.
Landscape of fraud in 2025
Fraudsters are no longer just sending out mass phishing emails that land straight in your spam folder. They're using recent breakthroughs in AI to laser-focus on vulnerable gaps across all sectors, wherever they can find them.
Current fraud trends indicate that criminals are exploiting these advances to make their attacks more personalized, more automated, and easier to launch than ever.
According to Sumsub's 2024 Identity Fraud Report, the democratization of fraud has also made it easier than ever for individuals to engage in fraudulent activities. Between 2021 and 2024, worldwide identity fraud rates more than doubled due to the sheer scale of attacks made possible by easily accessible fraud-as-a-service platforms and AI.
Suggested read: Fraud-as-a-Service: How $20 Can Cause Millions in Damage | "What The Fraud?" Podcast
Sectors where user identity is crucial are particularly at risk: the five industries most affected by identity fraud in 2024 were dating, online media, banking & insurance, video gaming, and crypto. This is also a worldwide problem, with fraud rates surging in every region in 2024, from a 112% increase in the US and Canada to 150% in Africa. The most common types of identity fraud are fake documents, chargeback fraud, and account takeovers, followed by rapidly emerging deepfakes.
Meanwhile, some industries are seeing a shift in fraud activities. Bank fraud trends are going mobile; app spoofing, account takeovers, and in-app social engineering attacks are scaling faster than many cybersecurity teams can respond.
AI-driven fraud is also hitting the crypto sector hard, with a notable shift toward potentially life-ruining pig butchering scams and pump-and-dump schemes. While these scams may have been around for a while, AI has made the barriers to entry for fraudsters lower than ever before.
In short, fraud in 2025 isn't just more common. It's become more accessible and more personalized at unprecedented scales.
AI-driven scams: Deepfakes and beyond
Photo, video, and audio deepfakes
The rise of deepfakes and generative AI tools for creating synthetic identities has led to nothing short of a revolution in fraud. Its impact on public trust is perhaps even more insidious, as scammers can now impersonate people in convincing ways. What once took considerable Photoshop expertise now takes a matter of minutes with publicly available AI models. This is leading to scams that can cost the public their entire life savings and con businesses out of millions.
For example, one woman from Salisbury, in England, was scammed out of £20k ($27k) after seeing a deepfake advertisement featuring British Prime Minister Keir Starmer promoting a crypto investment opportunity. Similar incidents occurred in Cyprus, where several citizens fell victim to investment fraud involving manipulated videos of politicians endorsing fake investment opportunities. In one case, losses exceeded €40k ($45.6k), with some victims reporting they had invested their last remaining savings.
These deepfake scams often deliberately target vulnerable people and exploit their very worst fears for financial gain, like a mother's fear of her daughter being kidnapped. In one sinister scam that is tragically becoming more and more common, a man deepfaked the voice of his victim's 15-year-old daughter and said he would kill her unless she paid him $1 million.
It's not just people's fears that scammers exploit, but their hopes and dreams too. In New Zealand, for example, patients with Type 2 diabetes have been conned by deepfake doctors into buying supposedly groundbreaking treatments promising to cure them of their illnesses. These promises are untrue, and potentially life-threatening if victims stop taking their real treatments as a result.
While users should stay alert to AI threats, businesses can play an important role in reducing these risks by:
- Investing more resources in educating users about the risks of fraud;
- Using Digital Risk Protection services and tools to stop fake advertisements and unauthorized uses of their brand or name.
However, deepfakes are just the tip of the iceberg. AI has dramatically expanded the range, scale, and believability of fraud.
Synthetic ID scams
Financial fraud trends are also being increasingly driven by synthetic identities, made by stitching together AI-generated headshots, fabricated addresses, and real stolen credentials to pass KYC checks at banks, crypto exchanges, and fintechs. These "Frankenstein identities" can successfully onboard unnoticed, lie dormant, and then be used to commit fraud, sometimes at considerable scale.
As part of a huge investigation by Toronto Police called Project Déjà Vu, one person was found to have opened hundreds of accounts using synthetic identities. These identities were used to apply for and open accounts at banks and financial institutions across Ontario. The scam resulted in confirmed losses of about CA$4 million ($2.9 million).
AI-mutated malware
AI may also be changing how malware behaves, enabling ransomware and phishing campaigns to adapt in real time to evade detection and to tailor attacks to victim behavior. Researchers at Cornell University demonstrated this concept with AI-driven attack frameworks capable of bypassing most antivirus systems, which could pose a huge threat to both businesses and the general public.
Fraudsters don't need to be tech geniuses; they just need access. Deepfake tools are now plug-and-play, and fraud-as-a-service marketplaces are thriving on the dark web. Anyone with as little as $50 can buy a ready-made synthetic ID kit or AI-generated voice bots for social engineering attacks.
As large language models get smarter and biometric checks get harder to spoof, the arms race between fraud and detection is heating up.
Suggested read: Fraud Trends for 2025: From AI-Driven Scams to Identity Theft and Fraud Democratization
Cryptocurrency fraud: Pig butchering scams, pump-and-dumps, and more
According to Sumsub's Crypto Industry Report, fraud has surged as more people have started using cryptocurrencies. Global fraud rates in the crypto industry are up 48%, and the threat remains persistent.
As the adoption of digital assets continues to grow, so does the creativity of bad actors exploiting emerging gaps. Like general trends in fraud, fraud-as-a-service and AI are making cryptocurrency fraud more complex, personalized, and manipulative than ever before.
Pig butchering
One of the most damaging new schemes is the "pig butchering" scam. This type of scam originated in China, where it was called "sha zhu pan", or "the killing pig game". In essence, scammers gain a victim's trust to "fatten them up" before "slaughtering" them, in other words, taking their funds.
Scammers typically make contact through dating apps or messaging services, build trust, and then lure targets onto crypto trading platforms. According to Chainalysis, revenue from these scams grew by nearly 40% year-over-year, with many operations linked to organized criminal networks using AI to improve targeting and dialogue.
Suggested read: Pig Butchering: Inside the Billion-Dollar Scam Factories | "What The Fraud?" Podcast
Crypto drainers
Crypto drainer kits are also booming. These malicious scripts are designed to empty cryptocurrency from a victim's wallet into a wallet controlled by an attacker. Victims can be tricked into connecting their wallets to fraudulent platforms, like fake NFT marketplaces or DeFi services.
These kits are often sold as part of fraud-as-a-service packages on the dark web, giving this type of scam an incredibly low barrier to entry. According to the Chainalysis Crypto Crime Report, drainers played a key role in the $2.2 billion stolen from victims in 2024.
Pump-and-dump schemes
While pump-and-dump schemes are nothing new, AI has given crypto scammers vastly more options. Fraudsters can now use bots, synthetic social media accounts, and deepfakes of key opinion leaders to "pump" up the prices of low-cap tokens. Once the price spikes, the scammers "dump" their overvalued holdings, leaving retail investors with next to nothing. According to Chainalysis, 3.59% of all tokens launched in 2024 (74,037 out of 2,063,519) displayed patterns of pump-and-dump schemes.
Last year, four individuals in Australia were criminally charged for allegedly orchestrating a coordinated scheme to inflate the value of Australian stocks before offloading them at artificially high prices. They now face up to 15 years in prison and fines exceeding $1 million.
Suggested read: 8 Crypto Scams to Be Aware of in 2025: A Guide for Businesses and Users
Mobile payment fraud trends: QR code scams to social engineering
From buying a cup of coffee to paying your bills, mobile payment methods make our lives much easier. But they can also make scammers' lives more convenient if you're not careful.
Common mobile payment fraud trends to watch out for
- Account takeovers: Fraudsters can use stolen credentials or exploit weak authentication to hijack mobile banking and payment accounts. Once inside, they can drain balances, make unauthorized transfers, or add new payment methods to siphon off money. Sumsub's 2024 Fraud Report showed account takeover cases surged by 250% year-over-year.
- QR code scams: QR codes are everywhere and are increasingly used in payments. Fraudsters are fully aware of this and can swap legitimate QR codes for malicious ones that redirect users to phishing sites or initiate fake payment requests (a simple defensive check is sketched after this list). One reported case involved scammers placing counterfeit QR codes in Tyne & Wear Metro parking lots, tricking commuters into sending money to fraudsters.
- Fake payment apps: Cybercriminals can develop spoofed mobile payment apps that look and feel identical to legitimate ones but steal users' data and funds. These apps often spread through unofficial app stores or phishing links. In November 2024, a fake application targeting users of the Indian instant payment system UPI was reported spreading across India via WhatsApp.
- Social engineering in mobile payments: Fraudsters can impersonate trusted contacts or support agents via SMS or messaging apps, tricking users into approving fake transactions or sharing their one-time passwords.
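On the QR code point above, businesses that accept QR-initiated payments can catch many swapped-code attacks with nothing more than an allowlist check on the decoded payload before any redirect happens. The Python sketch below is a minimal illustration, not a production control: the domain names and the is_suspicious_qr_payload helper are hypothetical and would need to reflect a company's real payment endpoints.

```python
from urllib.parse import urlparse

# Hypothetical allowlist of payment hosts a business actually uses.
TRUSTED_PAYMENT_HOSTS = {"pay.example.com", "checkout.example.com"}

def is_suspicious_qr_payload(decoded_text: str) -> bool:
    """Return True if a decoded QR payload should not be opened automatically."""
    try:
        url = urlparse(decoded_text.strip())
    except ValueError:
        return True  # not parseable as a URL at all

    if url.scheme != "https":
        return True  # plain HTTP, deep links, or raw wallet addresses

    host = url.hostname or ""
    # Exact-match check: lookalikes such as pay.example.com.evil.net fail here.
    return host not in TRUSTED_PAYMENT_HOSTS

if __name__ == "__main__":
    print(is_suspicious_qr_payload("https://pay.example.com/invoice/123"))  # False
    print(is_suspicious_qr_payload("http://pay-example.com.scan-now.net"))  # True
```

A check like this only covers the redirect step; it does nothing against physically swapped codes that point to well-formed but fraudulent merchant accounts, which is where transaction monitoring (discussed later) still matters.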
Suggested read: What Is Social Engineering Fraud and Why Is It on the Rise?
Financial fraud: From BEC to scamming insurers
The financial landscape has changed significantly in recent years, and it's no surprise that financial fraud is changing with it.
Key financial fraud trends to monitor
- Synthetic identity fraud: Synthetic identities can be used to open credit lines, loans, or accounts while bypassing red flags.
- Loan stacking and application fraud: Borrowers can apply for multiple loans simultaneously across different lenders, often using fake or stolen documents.
- Invoice and Business Email Compromise (BEC) scams: Fraudsters may impersonate executives or suppliers to trick companies into wiring funds to fake accounts. Business email compromise scams have evolved with AI-enhanced email spoofing, making them harder to spot (a lookalike-domain check is sketched after this list). According to the 2025 AFP Payments Fraud and Control Survey, 79% of organizations experienced payments fraud attacks or attempts in 2024. BEC remains a major threat, with 63% of respondents citing it as the top fraud avenue.
- Insurance fraud: False claims and staged accidents are being exacerbated by AI-generated fake evidence, including photos, videos, and documents, making fraud investigations more complex and costly.
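Because many BEC attempts lean on lookalike sender domains (a swapped character, an extra hyphen, a different TLD), even a lightweight string-similarity check at the mail gateway can surface them for human review. The sketch below uses Python's standard difflib; the domains and the 0.85 threshold are illustrative assumptions, not figures from the survey cited above.

```python
import difflib
from typing import Optional

# Hypothetical domains the finance team legitimately transacts with.
KNOWN_DOMAINS = {"acme-supplies.com", "yourcompany.com"}

def lookalike_of(sender_domain: str, threshold: float = 0.85) -> Optional[str]:
    """Return the trusted domain a sender domain closely imitates, if any."""
    sender_domain = sender_domain.lower().strip()
    if sender_domain in KNOWN_DOMAINS:
        return None  # exact match: a legitimate sender, not a lookalike
    for known in KNOWN_DOMAINS:
        if difflib.SequenceMatcher(None, sender_domain, known).ratio() >= threshold:
            return known
    return None

if __name__ == "__main__":
    print(lookalike_of("acrne-supplies.com"))  # flags acme-supplies.com
    print(lookalike_of("acme-supplies.com"))   # None
```

Flagged mail would typically be routed to quarantine or a second-approval step for any payment instruction, rather than blocked outright.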
Romance scams: Incredibly personal and incredibly cruel
Romance scams unfortunately remain a cruel staple of the fraud landscape, made all the more realistic and personal by recent advances in AI. Dating platforms are a particularly high-risk area for identity fraud: the Sumsub 2024 Fraud Report found that they lead identity fraud rates with an 8.9% share, double the rate in the banking and insurance sector.
These scams often start on dating apps, social media, or chat platforms, where fraudsters can create believable profiles, sometimes using deepfake photos or videos to appear real. Once trust is built, the fraudster can create an elaborate story, like a sudden medical emergency, to extract money from a smitten victim fast. Romance scammers often urge victims to invest in fake crypto projects or send funds to wallets, exploiting the victim's desire to help or perhaps build a future with a romantic partner.
Last year, in Hong Kong, police arrested 27 people involved in a large-scale deepfake romance scam. The syndicate used AI face-swapping and voice-changing technology to create convincing fake personas on dating platforms, then lured victims into fake cryptocurrency investments with fabricated profit records. Victims were convinced by real-time deepfake video calls and lost millions.
Romance scams consistently rank among the highest in terms of monetary loss per victim. The emotional manipulation involved means victims often delay reporting, allowing scammers to exploit them for longer periods.
To avoid these scams, users should always remember to cross-check names, photos, and bios online, since scammers often use stolen identities. Users need to stay alert, and platforms should remind them to be especially cautious if someone asks for money or personal information, even subtly.
Suggested read: Detecting Romance and Dating Scams: A Guide for Dating Platforms and Their Users
What's next? In a changing fraud landscape, anyone can become a scammer
AI and other recent technological changes have led to nothing short of a revolution in fraud, with criminals now able to use advanced technologies and exploit systemic vulnerabilities at a scale never seen before. Here are some of the most concerning trends:
- Fraud-as-a-service: Fraud-as-a-service has democratized cybercrime, allowing virtually anyone with basic technical skills to launch sophisticated attacks using toolkits often bought on the dark web.
- AI-driven scams: Impersonations generated with AI can deceive consumers and businesses, and are expected to continue in the years to come.
- Vishing with AI: Voice-cloned calls can be made to fool targets into thinking they are talking to someone else.
- Synthetic data attacks: AI-generated fake transaction histories and credit scores can be used to bypass identity verification.
- Subscription renewal scams: Fake renewal notices may trick victims into making payments to fraudsters.
- Formjacking: Malicious code on payment forms can steal card details during legitimate transactions.
- Click fraud: AI bots can mimic human clicks to inflate ad costs for businesses and skew metrics (a simple rate-based heuristic is sketched after this list).
- Fake crypto exchanges: Counterfeit platforms can steal user deposits before disappearing.
- Flash loan attacks: These exploit vulnerabilities in DeFi smart contracts, using liquidity from flash loans to manipulate cryptocurrency prices and steal funds from protocols.
- Ransomware & blackmail: Extortion demands are often made in crypto following data encryption attacks.
- Data poisoning: Fake data can be fed to AI fraud detection systems to weaken defenses.
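Of the trends above, click fraud is one where a small amount of server-side logic already goes a long way: most bot-driven click bursts break simple rate limits long before advanced behavioral models are needed. The Python sketch below is purely illustrative; the window size, the threshold, and the ClickRateMonitor class are assumptions rather than a reference to any particular ad platform.

```python
from collections import defaultdict, deque

# Illustrative thresholds; in practice these would be tuned per campaign.
WINDOW_SECONDS = 60
MAX_CLICKS_PER_WINDOW = 20

class ClickRateMonitor:
    """Flags source IPs whose click rate exceeds a human-plausible ceiling."""

    def __init__(self):
        self._clicks = defaultdict(deque)  # ip -> timestamps of recent clicks

    def record_click(self, ip: str, timestamp: float) -> bool:
        """Record a click and return True if this IP now looks automated."""
        window = self._clicks[ip]
        window.append(timestamp)
        # Drop clicks that have fallen outside the sliding window.
        while window and timestamp - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) > MAX_CLICKS_PER_WINDOW

if __name__ == "__main__":
    monitor = ClickRateMonitor()
    # A burst of 30 clicks from one IP within three seconds gets flagged.
    flagged = [monitor.record_click("203.0.113.7", t * 0.1) for t in range(30)]
    print(any(flagged))  # True
```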
Staying ahead of fraud trends: How users and businesses can stay safe online
The 2025 fraud landscape is an increasingly complex and often frightening place. Fraudsters are harnessing powerful tools like AI and the democratization of fraud to launch sophisticated attacks that not only bypass our traditional defenses, but also undermine our very understanding of who we can trust.
To stay ahead of these evolving fraud trends, both businesses and individuals need to be proactively cautious. This means staying vigilant by verifying identities, using scam-tracking tools, and being wary of anything that feels off, like unsolicited and urgent requests for money.
For businesses, a multi-layered and adaptable strategy is key, something Sumsub champions through advanced fraud prevention solutions. By combining AI-powered real-time monitoring, adaptive identity verification, and integrated biometric checks, Sumsub helps companies detect suspicious activity before it has a chance to escalate.
Suggested read: Fraud Detection and PreventionâBest Practices for 2025
Fraud prevention strategies
In 2025, the traditional defenses that have kept us largely safe until now are facing new and rapidly changing threats.
The good news is that threat detection and prevention practices are evolving just as rapidly as fraud to keep people safe. Companies should implement comprehensive multi-layered solutions that can detect fraud at various stages, and fight AI with AI. The kit to successfully fight fraud in 2025 includes the following (a minimal transaction-monitoring sketch follows the list):
- Advanced identity verification solutions
- Biometrics
- Transaction monitoring
- Behavioral pattern checks
- Device fingerprinting
- Background checks
- Password protection
- Deepfake detection
- Public education about staying safe
- Collaboration between financial institutions, law enforcement, and tech firms
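To make the transaction monitoring and behavioral pattern checks items above more concrete, here is a minimal rule-based sketch in Python. It is not Sumsub's implementation: the thresholds, the AccountProfile structure, and the rule names are assumptions chosen for illustration, and production systems layer many more signals (device fingerprints, geolocation, fraud-network links) on top of simple rules like these.

```python
from dataclasses import dataclass, field
from statistics import mean, pstdev

# Illustrative rule thresholds; real systems tune these per segment and currency.
MAX_TX_PER_HOUR = 10
ZSCORE_LIMIT = 3.0

@dataclass
class AccountProfile:
    """Rolling view of one account's recent spending behavior."""
    recent_amounts: list = field(default_factory=list)
    tx_timestamps: list = field(default_factory=list)

def assess_transaction(profile: AccountProfile, amount: float, ts: float) -> list:
    """Return the names of any rules a single transaction triggers."""
    alerts = []

    # Velocity rule: too many transactions within the last hour.
    recent = [t for t in profile.tx_timestamps if ts - t <= 3600]
    if len(recent) + 1 > MAX_TX_PER_HOUR:
        alerts.append("velocity")

    # Amount-anomaly rule: amount far outside the account's usual range.
    if len(profile.recent_amounts) >= 5:
        mu, sigma = mean(profile.recent_amounts), pstdev(profile.recent_amounts)
        if sigma > 0 and (amount - mu) / sigma > ZSCORE_LIMIT:
            alerts.append("amount_anomaly")

    profile.tx_timestamps.append(ts)
    profile.recent_amounts.append(amount)
    return alerts

if __name__ == "__main__":
    profile = AccountProfile(recent_amounts=[20, 25, 22, 30, 27])
    print(assess_transaction(profile, amount=2500, ts=1000.0))  # ['amount_anomaly']
```

In a real deployment, alerts like these would feed a fraud-scoring layer rather than block transactions outright, keeping friction low for legitimate users.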
Some solutions that companies need to detect and prevent fraud include:
- User Verification
- Business Verification
- Transaction Monitoring
- Geolocation Tracking
- Deepfake Detection
- Device Fingerprinting
- Fraud Scoring
- Fraud Network Detection
- Behavioral Intelligence
FAQ
What is the top fraud trend in 2025?
AI-driven scams and synthetic identities are the top fraud trends in 2025. While fake documents still account for more cases overall, fraudsters are increasingly using deepfakes and automated tools to bypass traditional security measures.
What is the FTC fraud trend?
The FTC reports that consumers lost more than $12.5 billion to fraud in 2024, a 25% increase compared to the previous year. Consumers reported losing the most money to investment scams.
How is AI being used in fraudulent activities?
Fraudsters use AI to create deepfake videos, synthetic voices, and fake documents, allowing them to craft highly convincing scams.
What are the trends in check fraud?
Check fraud remains a persistent threat, but it is evolving alongside rising identity fraud and tactics like remote deposit capture scams, where the same check is deposited multiple times.
What is pig butchering in cryptocurrency scams?
Pig butchering scams involve building a fake, often romantic, relationship over time to lure victims into investing in fraudulent crypto schemes, only to drain their funds once trust is established. The name evokes the image of fattening a pig before slaughter.