- Jan 09, 2025
- 22 min read
Fraud Trends 2025: How to Stay Resilient. "What The Fraud?" Podcast
Dive into the world of fraud with the "What the Fraud?" podcast! Today's guest is Paul Maskall, Behavioral Lead at UK Finance and the City of London Police. Tom and Paul explore how our relationship with technology creates new vulnerabilities and amplifies risks. They unpack the behavioral science behind fraud and cybercrime, offering key insights on how businesses can fight back.
TOM TARANIUK: Hello and welcome to the season finale of series two of "What the Fraud?", a podcast by Sumsub where digital fraudsters meet their match. I'm Thomas Taraniuk, currently responsible for some of our very exciting partnerships here at Sumsub, the global digital verification platform helping to verify and secure users, businesses, and transactions. In this episode, it feels apt to look ahead at 2025, exploring the latest trends and shifts in the fraud landscape. From the rise of AI-driven fraud to changes in global regulations, today's episode will provide a view of what to expect next year, the ongoing challenges, and how the fight against fraud is transforming.
Joining me today for a 360-degree perspective on the future of fraud prevention and detection in 2025 is Paul Maskall. Paul is recognized as a leading UK influencer in fraud prevention and works closely with UK Finance and the City of London Police, leveraging behavioral strategies to combat fraud and empower public awareness. Specializing in psychology, Paul explores why humans are susceptible to online crime, addressing issues like risk assessment, cognitive bias, and emotional manipulation. Paul, welcome to "What The Fraud?" How are you doing today?
PAUL MASKALL: I'm very well, thank you. I'm very excited about this, to be fair.
TOM TARANIUK: Thank you for joining us, Paul. Looking forward to hearing your fraud predictions on our show today. One thing that's really interesting about having you on board is that when Paul joined us in series one, we explored the psychology of fraudsters.
Suggested read: Psychology of a Fraudster: "What The Fraud?" Podcast
And now we're shifting our focus on this show to the perspective of consumers as well. Fraudsters have long exploited emotional triggers, now aided by the rise of AI and increasingly sophisticated technologies.
The evolution of fraudulent manipulations in recent years and emerging vulnerabilities in 2025
How have these manipulations evolved in recent years, and do you think there are going to be any sort of new emotional vulnerabilities that fraudsters are targeting next year as we roll into it?
PAUL MASKALL: I find it interesting because we often talk about criminals becoming more sophisticated, but sometimes, perhaps controversially, I disagree.
From a criminology perspective, four factors are needed:
- opportunity
- pressure
- capability
- rationalization.
Rationalization is the internal story a criminal tells themselves to justify committing the crime. And the problem is technology: it lowers the barrier to entry for all four of those factors.
Suggested read: From AI to Fraud Democratization and Real Identity Theft: Fraud Trends 2024-2025
Everybody's been conditioned to give their data away for pretty much every 10% Gymshark offer or whatever. You've been conditioned to give that data away, and you've got an internet connection, so that capability barrier is also lowered.
And the pressure: because I don't have to see the whites of your metaphorical eyes, I don't need that much pressure in order to commit the crime. But it's about remembering that the reason we have this issue around fraud and scams is the way we have changed in this space. I don't think there will be any new emotion, although there might be a different package and a different flavor of the manipulation.
But the biggest question is: what's the difference between fraud and marketing?
If we really dig down, there is almost zero difference between how fraud works and how marketing works. But we generally refer to social engineering as something very interesting and different.
TOM TARANIUK: There is a funnel at the end of the day, getting from the consumer, the person who's being defrauded, through to the other side as well.
I mean, would you say the change is also that people are becoming more tech-savvy? And from your perspective, why do even tech-savvy individuals fall victim to online fraud?
Why do even tech-savvy individuals fall victim to online fraud?
PAUL MASKALL: Because it's nothing to do with savviness. Again, from that marketing aspect, emotions get links clicked. Just because you're more familiar with technology, or more savvy, doesn't matter, because in these sorts of situations your intuition, your emotions, are triggered.
So, for instance, to give you an example: have you ever taken a text message from a partner the wrong way?
TOM TARANIUK: Most definitely. That's one of my blind spots.
PAUL MASKALL: It's everybody's blind spot, yes. The problem is, no matter how savvy or familiar you are with tech, that text message carries the emotional context of whether you had a nice morning or an argument the night before.
It is the emotional context of how you read it. For example, a simple question like "What do you want for dinner tonight?" can take on different meanings depending on how you feel.
In these situations, that text message from a partner is no different from how social engineering and online manipulation work. Our intuition, gut instinct, or savviness is often replaced with emotional projection.
So, no matter how intelligent or savvy you are, it doesn't matter: everyone is vulnerable. At some point, you will be told exactly what you wanted to hear, but at the wrong time, when you shouldn't have heard it.
How can a business integrate behavioral strategies into their fraud prevention frameworks?
TOM TARANIUK: More often than not, emotions can get in the way of everything else. That's from a user perspective, right? But when we're talking about a business, is there any way a business can integrate, let's say, behavioral strategies into their fraud prevention frameworks as they move forward?
PAUL MASKALL: I always find this to be a bit of a misnomer because, when discussing psychology and related topics, people often ask me, "Well, how does this affect businesses?" And my response is: businesses are also run by people. Your CEO, C-suite executives, security team, everyone involved in the organization: they are all people. At the top level, fraud and cyber threats aren't seen as emotional problems, until they are. The question is: will leadership invest in security? Will they implement the right tools? Will they provide proper education? Or will it remain just a checkbox compliance exercise on a spreadsheet?
Instead of treating security as a compliance task, how do we integrate it into company culture and make the right investments, not just in terms of financial resources but also in fostering awareness and responsibility?
From a security perspective, after ten years of teaching, training, and presenting on this topic, I've observed that security teams, IT teams, and investigators tend to be the most biased. The problem is that we are exposed to threats day in and day out.
Security professionals are passionate about what they do, and they often expect the rest of the organization (above, below, and alongside them) to be equally invested. But the reality is, they're not. Again, security isn't perceived as an emotional problem, until it is.
Motivation in this space is similar to language learning. Imagine if I told you, "I want you to learn German, but you'll never visit Germany or speak to a German person." That's not how motivation works.
So, from a business strategy perspective, internal education and security initiatives should start with the reality that most employees don't inherently care about security.
If you acknowledge that from the beginning, you can build an approach that truly engages them, and that's the most effective way to create real change.
TOM TARANIUK: It's certainly a bleak way of looking at it. Of course, it may also be true. But if we consider the emotional aspect, this is one way individuals fall victim to fraud.
However, when it comes to protecting themselves, invoking an emotional reaction can also be a defense mechanism. Fraud is evolving rapidly. From your perspective, is this one of the practical steps individuals can takeânot only for themselves but also as employees within a companyâto adapt?
Can they not only manage their emotional responses to fraud but also improve their digital literacy and, ultimately, stay ahead of these threats?
The emotional side of fraud: a defense mechanism or a weakness?
PAUL MASKALL: Again, it comes down to the fact that no matter what tools you have in place, no matter how much email filtering, detection, and intervention you have, the principle is the same. I do a lot of work with the banks on how we intervene with somebody who is under manipulation in that sort of scenario.
From a business perspective, it's important to realize that the expectation shouldn't be that humans will always be able to spot a phishing email or recognize social engineering attempts. Instead, the focus should be on reducing the burden before it even reaches the individual. So, how do we implement effective mitigation measures? That's a crucial question.
But beyond technical defenses, does your e-learning actually work? Is your education effective? Does your company culture reinforce security awareness? Ultimately, it all comes down to emotional mindfulness, because fraud and scams are just symptoms of much larger issues, such as disinformation, online grooming, and economic abuse.
Fraud and scams also raise a fundamental question: do I believe what I see? If I could provide therapy for everyone in the UK, I absolutely would, but that's not exactly fiscally responsible. Still, emotional mindfulness is key.
If someone is under emotional stress or facing external pressures, just like with that ambiguous text message from a partner, it becomes an individual challenge.
Recognizing that security and safety, both personally and in business, are deeply connected to well-being is essential.
I do a lot of work in insider risk, around staff approaches and manipulation and all that sort of thing. And it all comes down to: is the staff actually in that kind of wellbeing space?
Just because you've got anxiety or depression or you're going through something doesn't necessarily make you a security risk, but it is a massive factor.
TOM TARANIUK: Super interesting, especially your points around the systems. Not every employee wants to go through that. For the big companies, it's often just ticking a box, making sure that everyone is mindful, right? And statistically, even if they say everyone's completed the course, it doesn't mean that everyone's retained it. That was a really interesting point you made: the individual also needs to be motivated. There are other factors relating to how they take everything on board and how easy they are to manipulate. But bigger companies are able to actually implement these flows, track them retrospectively, and see who needs help. When we're talking about smaller companies, who don't have the money to invest in awareness campaigns or LMS systems, what could they do? Is it outside talent, or are there other avenues they can go down?
Which strategies should smaller companies implement to successfully combat fraud?
PAUL MASKALL: I think very often it starts from wellbeing, and it's interesting because, again, I often find the larger organizations are far worse at it than smaller organizations in these sorts of situations.
Very often it's the question of: what is it that you're protecting? What is the value you're protecting? Are you protecting personal data? Intellectual property? The value that comes from it, whether the threat is fraud or cyber? That is essentially what I, as a criminal, am exfiltrating, whether I get direct access from a pure cyber aspect or I go down a much more social engineering route and piggyback on somebody else's credentials. But it's about simplifying it as much as you possibly can.
TOM TARANIUK: From the perspective that there are too many digital services to remember them all, and they also lack a physical presence, it becomes much harder to create a mental connection.
Think about it: You're safe in your home, your car. If they all had passcodes, you'd remember them easily because they are tangible. But when it comes to dating websites, Facebook, or other online platforms, forming an emotional connection, similar to a physical object you can hold, is much more difficult.
I'm not sure if it's a factor, but it would be interesting to hear your thoughts.
PAUL MASKALL: It is 100% a factor. How do we make a problem or risk feel even more emotional than it already is? I can give you the top ten security tips in the world, but I can't make you care. Motivation is the real challenge, especially in education.
So, how do we bridge that gap? One approach is to create an emotional connection by using real-world analogies, making abstract concepts feel more tangible and relevant. Many of the internal security programs I run are framed around a simple but crucial question: why don't you care about security? Because, truthfully, most people don't, and that's the best place to start.
Then there's the intuition myth: the idea that "if it's too good to be true, it probably is." We encourage people to trust their gut in these situations, but intuition isn't some magical white knight that will always save you from manipulation. That's a dangerous misconception.
Your intuition isn't designed to be correct; it's designed to make you comfortable. And that distinction is critical.
TOM TARANIUK: We often overestimate the power of our intuition and gut instinct, treating them as something almost magical. But in reality, they are simply the result of our accumulated experiences.
The problem is that people, and businesses, tend to be reactive rather than proactive. We don't take action until something directly affects us, triggering an emotional response. Maybe I lost my job because of a mistake. Maybe I lost a customer because they were defrauded on my platform. It's only when the consequences hit home that we start to pay attention.
So, how can businesses and individuals shift their mindset from being reactive to being proactive? Do they have to experience emotional turmoil before truly understanding the impact of fraud?
Shifting from a reactive to a proactive mindset: How businesses and individuals can combat fraud
PAUL MASKALL: In a perfect world, we'd have an incredible strategy that prevents the pain of a data breach or the emotional turmoil that comes with it. But the reality is that we must first recognize the importance of setting aside our biases. The challenge is that this cultural shift needs to be driven from the top.
I'm frequently brought in by C-suite executives and security teams, often to address the "hearts and minds" aspect of the issue. Organizations, including banks, bring me in to help make security an emotional issue, because people need to reflect on their own behavior. Fraud is one of the few crimes where you can be the victim, often unknowingly.
The language around this topic is often inherently shaming, which can be problematic. It's the same for businesses. They might say, "We're not big enough to be targeted. We don't have anything to lose." But I remind them that, even with just £50, losing £45 is still significant; it's all relative.
The key is to shift the conversation and make people realize that this could happen to them. We need to get individuals and organizations to reflect on their own behavior, because case studies aren't always effective. You can be in the same industry, with the same turnover, same demographic, and yet still fall victim to fraud.
There's a tendency to distance ourselves from the problem, to think, "It won't happen to us." But recognizing our own emotional biases is the first step in changing that mindset. It's getting over that bit, and that's probably 90% of the challenge.
TOM TARANIUK: And I'm sure individuals and businesses can also imagine, "Oh, well, that happened to someone else. It's not going to happen to me because I did X, Y, and Z", but you don't know what you don't know, right? And there's a bias there that is very difficult to escape.
Paul, with fraud tactics evolving so quickly, what do you think businesses have learned from 2024 that they can apply in the coming year?
What have businesses learned from 2024 that they can apply in 2025?
PAUL MASKALL: From a business and banking perspective, it's clear that criminals are increasingly relying on social engineering tactics. We've seen that, in many cases, unauthorized fraud has decreased from a banking standpoint, and direct cyberattacks have also gone down. However, over the last few years, we've witnessed a growing reliance on fraud methods like Authorized Push Payment fraud for individuals, as well as SEO scams, invoice fraud, and supplier scams targeting businesses.
Suggested read: What is Authorized Push Payment (APP) Fraud? Complete Guide
Whatâs interesting is that, despite the rise in these types of fraud, we have a variety of controls in place, including AI-driven interventions. The question is: How important will these measures be in the future?
The real challenge lies in reducing these risks. We must reevaluate our educational approaches: how effective is our current training, and how does it align with our organizational culture? One of the key lessons from the past year has been the significant impact of friction.
For instance, take a bank, especially with the new PSR APP reimbursement rules that came in in October. Generally speaking, we're looking at mandatory reimbursement, so you will most likely get your money back. But as a result, banks and organizations are going to have to put in a lot more friction in order to intervene and show you: "Hold on a second. This isn't real. We believe this is a scam." But what we've learned is that the real problem is this: have you ever tried to tell a person that they're in a relationship they shouldn't be in?
TOM TARANIUK: You just press the Skip button.
PAUL MASKALL: Yes. And again, the issue is, if you've got an emotional bulldozer pushing you through that process, then no matter what process you have in place, if you are being manipulated, the clue's in the title: you don't know you're being manipulated. But the banks, from a finance point of view, have learned to ask: "Okay, how do I intervene? You can lead the horse to water, but how do I make it drink?" How do we explore the behavioral elements of intervention and get people to come to the realization themselves? And that's no different to things like insider risk and internal security, because if somebody is being manipulated, that friction sits between safety and convenience. And that's the balance that's really difficult to find, especially now.
TOM TARANIUK: In a situation where an individual has been manipulated into making a payment, let's say they're about to send 3K or 4K in a romance scam, where is the point of no return? At what moment do they reach that critical decision? They'll get to a page asking, "Are you sure you want to send money to this account?"
It used to be that, back in the day, you'd receive a phone call to confirm, "Are you sure you want to proceed with this payment?" But, as you've already mentioned, by that point, the victim is too far gone. The emotional attachment has already been made, and the decision has been made.
Suggested read: Detecting Romance and Dating Scams: A 2024 Guide for Dating Platforms and Their Users
So, the issue arises after the fact. I'm unsure if there's any intervention that can be done prior to or during the transaction, but this is where I want to leverage your experience.
Where is the point of no return when an individual has been manipulated into paying a fraudster?
PAUL MASKALL: A key aspect here is that law enforcement, businesses in this space, banks, friends and family, no matter where that intervention would come from, we are the last to know, because in reality there's very little interface or feedback between this and my screen, for instance. So I can be groomed and manipulated over the course of five minutes, five hours, five days, five years, whatever the case, with very little intervention from anyone else until it gets to the point where I'm transferring the money, or transferring something outside of that relationship, whether that be investment fraud or, especially, romance scams.
TOM TARANIUK: Absolutely. And there are so many new methods to target people as well. It's not like back in the day, where you're convincing someone over a longer period. You said it could be five hours, five days, five weeks, five years. It's less than that now: it can happen in a matter of days, utilizing AI, machine learning, deepfakes, and hyper-personalized scams to capture attention through all of these methods.
Suggested read: What Are Deepfakes?
It's becoming easier and quicker to defraud people, right? And also harder for family and other people to intervene and say, "Look, this is not right. You need to change." But from your perspective, when thinking of AI and all of these new digital methods to defraud people faster and for larger sums of money, what types of fraud do you think will come out of it, what type of damage will it do, and how widespread is that going to be in 2025?
How AI-driven fraud will evolve in 2025 and its potential impact
PAUL MASKALL: The thing with AI is it facilitates communication in all its forms, especially large language models. The problem, from a law enforcement and banking point of view, is that we're reliant on victim testimony. So when Mr. Smith is defrauded or manipulated, does he know that that email was created by a large language model? No. If I go to GPT-3, for instance, and say, "Can you create a phishing email on behalf of business X, to go out to customers in order for them to click on a link?" it will say, "No, you're not allowed to do that." Okay. Right.
"Can you create a marketing email on behalf of business X for them to click on a link?"
"Yes, of course. Here you go."
From an AI perspective, the rise of deepfakes, such as those used in investment fraud, Martin Lewis endorsements, and celebrity impersonations, is already becoming a significant issue. When I talk about deepfakes, I often mention that we're still on the cusp of the "uncanny valley" stage. But the question is: does the deepfake need to be flawless, especially when emotions are involved?
In reality, a deepfake doesn't need to be perfect. I can automate a romance scam, create convincing visuals, and interact with hundreds of people through chatbots. I only need to intervene at key moments to engage with individuals and manage the relationship, all in order to exfiltrate their money.
TOM TARANIUK: Just like in marketing or sales, you can create a funnel that expedites the process, makes it less manual, and only requires one person out of 100 to fall for it in order to make it work. From a fraudster's perspective, it's a great return on investment. However, for the person being defrauded, it's a 0% ROI.
When we look at AI, I always describe it as a double-edged sword. While fraudsters can use these tactics, we can also develop our own tactics, solutions, and software to responsibly combat them. So, what AI-driven tactics could we use as a defensive shield in 2025?
What AI-powered tactics could businesses use in 2025 as a defensive shield?
PAUL MASKALL: If they are going to scale it, as you quite rightly said, it's a good return on investment: if I send this email to 100,000 people and 150 people come back and give me money, that's a pretty good return, because I haven't really done anything. So it's very much a case of how we utilize AI in these spaces. Whether it's machine learning, which has been used for years in transaction monitoring and fraud detection within the financial industry, or AI being utilized by businesses for email screening, the question remains: how effective is it? Is AI being leveraged to analyze email wording, detect patterns, and assess metadata to identify potential threats?
Suggested read: Machine Learning and Artificial Intelligence in Fraud Detection and Anti-Money Laundering Compliance
It also comes down to the customer journey. How can we integrate AI into that process, particularly within the user interface (UI), to both support and detect potential threats? Additionally, if a person accesses their account, can AI be used to analyze device telemetry and build a profile of the individual to enhance security?
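To make that concrete, here is a minimal sketch of the kind of transaction-monitoring model mentioned above: an unsupervised anomaly detector (scikit-learn's IsolationForest) scoring a payment by amount, time of day, and simple device/payee-novelty flags. The features, thresholds, and sample data are illustrative assumptions only, not the actual pipeline of any bank or of Sumsub.

```python
# Minimal sketch: unsupervised anomaly scoring for transaction monitoring.
# Features, thresholds, and sample data are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [amount_gbp, hour_of_day, is_new_device, is_new_payee]
history = np.array([
    [25.0, 12, 0, 0],
    [60.0, 18, 0, 0],
    [15.0,  9, 0, 1],
    [80.0, 20, 0, 0],
    [30.0, 13, 0, 0],
] * 40)  # repeated to mimic a longer transaction history

model = IsolationForest(contamination=0.01, random_state=42)
model.fit(history)

# A hypothetical new payment: large amount, 3 a.m., new device, new payee
candidate = np.array([[3500.0, 3, 1, 1]])
score = model.decision_function(candidate)[0]  # lower = more anomalous

if score < 0:
    print(f"Flag for friction/intervention (score={score:.3f})")
else:
    print(f"Allow (score={score:.3f})")
```

In practice a bank would combine far more behavioral and telemetry features, but the shape of the approach is the same: score each event and add friction above a risk threshold rather than relying on the customer to spot the scam.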
TOM TARANIUK: That's really interesting. You mentioned that this is an arms race: a constant game of cat and mouse where we're always trying to catch up with the bad actors, who remain one step ahead. They have access to new technologies and the financial motivation to stay ahead.
When it comes to AI and its use cases, this is just my perception, but it seems that regulated markets, such as banks, and well-funded businesses will have the advantage. Similar to how the U.S. leads in the global arms race, these organizations have the resources to push forward, adopt, and implement new technologies more rapidly.
But where does that leave unregulated or smaller firms that still need to protect themselves and their users from these kinds of scams? Without the same fiscal capabilities, how can they keep up in this evolving landscape?
How can small unregulated firms protect themselves from AI scams?
PAUL MASKALL: It's very difficult to measure AI fraud, because AI facilitates communication; it facilitates everything, and again, it facilitates marketing. Think about scrolling through your social media feed: you see AI-generated imagery everywhere, you see AI-generated copy everywhere. What's the difference? So in reality it's going to be very, very difficult for us to measure across the board. However, it still comes back down to the culture aspect, the emotional aspect, and the education aspect, as well as putting those tools in place.
But it's also about teaching that kind of digital media literacy: if something comes through, how do we actually rely on our processes? Take invoice scams, hypothetically, the biggest scam we have in the country for businesses across the board. That invoice scam has always been relatively effective, with or without AI.
Suggested read: Payment Fraud Guide 2024: Detection and Prevention
As a supplier or an organization filtering emails, I might notice small discrepancies: a slight difference in the URL, a suspicious link, or an unfamiliar sender. In some cases, it could even be an email account takeover. However, if I use AI to generate an invoice that looks more legitimate, or to refine the copy of an email to make it more convincing, the scam becomes even more believable.
Strong verification
But no matter how sophisticated the deception, it can't override a strong verification process. If I implement a system where two sets of eyes review a payment before it's approved, or if I make it a policy to call my supplier using a trusted, pre-verified number, I can significantly reduce the risk.
Ultimately, the key is recognizing that employees may not inherently care about security. That's why education is crucial, but beyond that, organizations should also map out their processes and identify where human judgment is a primary factor. If a system relies heavily on human judgment, additional safeguards should be in place to confirm legitimacy, ones that don't depend solely on an individual's ability to detect fraud. This helps reduce the burden on employees and minimizes the risk of a single point of failure.
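As an illustration of the "two sets of eyes plus a pre-verified contact" process described above, here is a minimal maker-checker sketch. The supplier registry, field names, and rules are hypothetical; a real finance team would wire equivalent checks into its own payment and approval tooling.

```python
# Minimal sketch of a maker-checker payment check against a pre-verified
# supplier registry. All names, fields, and rules are hypothetical.
from dataclasses import dataclass

# Pre-verified supplier details, captured out of band (e.g. at onboarding,
# confirmed by phone on a trusted number), not taken from the invoice itself.
VERIFIED_SUPPLIERS = {
    "acme-ltd": {"iban": "GB29NWBK60161331926819", "phone": "+44 20 7946 0000"},
}

@dataclass
class PaymentRequest:
    supplier_id: str
    iban: str                        # account details from the incoming invoice
    amount_gbp: float
    requested_by: str                # the "maker"
    approved_by: str | None = None   # the independent "checker"

def can_release(payment: PaymentRequest) -> tuple[bool, str]:
    supplier = VERIFIED_SUPPLIERS.get(payment.supplier_id)
    if supplier is None:
        return False, "Unknown supplier: verify via a trusted channel first."
    if payment.iban != supplier["iban"]:
        return False, "Bank details differ from the verified record: call the supplier on the pre-verified number."
    if payment.approved_by is None or payment.approved_by == payment.requested_by:
        return False, "A second, independent approver is required (maker-checker)."
    return True, "Release payment."

# Example: an invoice arrives with 'updated' bank details and no second approver.
request = PaymentRequest("acme-ltd", "GB00FAKE00000000000000", 12_400.0, "alice")
print(can_release(request))
```

The design point is that the check compares invoice details against data captured out of band, so a convincing invoice, email, or voice clone alone cannot redirect the payment, and no single employee's judgment is the last line of defense.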
TOM TARANIUK: There might be a single point of failure there. But when we're talking about hybrid attacks, these are on the rise as well, combining multiple techniques as you've mentioned. It might be automation combined with identity fraud using deepfakes. When fraudsters favor this type of approach, what can businesses do? You're saying we should have a framework in place and get rid of that immediate bias from individuals, but there are always going to be some attacks that fall through the gap, right? Because they can get more and more convincing over time, and even the proof of trust can be manipulated with AI voices, emails, everything else.
Combating hybrid fraud attacks
PAUL MASKALL: If I were to pose as a buyer, for instance, I know I want an invoice to be paid. The barrier to utilizing tools in this space keeps lowering; criminals share a lot of stuff on Telegram groups, and they're better at tool sharing and data sharing than we are. But it's interesting, because the chances of me being able to take over a mobile number that you've already used in the past, and pull off a full email account takeover, and so on, all simultaneously, are lower. Yes, I can look at voice synthesis as well, but there are a number of different factors that are less likely to line up.
Again, it's not completely unlikely, but that process is still going to rely on the bias and on the attacker actually being able to get past it. And if it does happen, if somebody targets you in that sort of situation, because no security is 100%, what do you do in the event? If I have got through that manipulation, if I have taken over the supplier's email, if I've synthesized the voice, if I've gone through all of that, the fact that you have put those processes in place and shown accountability around fraud protection still matters. Even with the new failure to prevent fraud legislation coming into effect in 2025, as long as you have put those processes in place, shown the investment and the willingness, and not been negligent about it, that counts. But it's also about resilience: how do you and your staff respond to a situation? How does your entire organization, whether you are two people or a multi-million pound international organization, respond? Because it will get in. If somebody really wants to get in, they will.
TOM TARANIUK: It sounds like we need to strike the fear of fraud into individuals and companies globally.
PAUL MASKALL: Yes and no. I'm always a bit on the fence about positive versus negative framing, because with things like fraud, yes, fear sells, potentially, from a sales perspective. But there's also the larger aspect of actually protecting the things that you value. So whether it's your organization, or the data and information your customers have trusted you with, there's the positive framing. And very often, especially around personal security and safety and making staff feel valued, there's a positive framing route as well.
How fraudsters use victims' emotional state
TOM TARANIUK: I'd like to look at it like that. But when we look at the psychology, what's the negative framing while we're doing this? You don't lose a license, you don't lose the bottom line, you don't lose employees, and you don't lose your customer base. That's why I mentioned the sort of fear around it: the fear of loss from taking a hit like that. And from the point of view of the actual fraudster coming in and targeting an individual, it's: "Okay, we're targeting you because you've lost access to your bank account. Please sign in here." So it's often that they're using those fear tactics to gain control of the person's emotions as well.
PAUL MASKALL: And in reality, that's exactly what I, as a fraudster, want you to do. I want you at either end of those spectrums. Loss aversion is one of those cognitive biases: I will overvalue or overinflate a loss relative to an equivalent gain. So I want you either excited or thrilled about something, or panicked and fearful, because there's something called the affect heuristic, where you make judgments very, very quickly based on your immediate emotional response, or bias.
So with the affect heuristic, it's very much the case that I can manipulate you through an emotional state.
TOM TARANIUK: There have been a lot of great references here, not only to how people can protect themselves, but also to how people are targeted, and I think that understanding can help a lot of our listeners. But lastly, Paul, we've covered quite a lot, so just to summarize: what are three priorities that businesses, and individuals working in businesses, should focus on to stay resilient in 2025?
3 key priorities for staying resilient as a business and individual in 2025
PAUL MASKALL: First of all, realizing that this is much more of an emotional problem than anything else. What is your company's wellbeing? What is your staff's wellbeing? What support do you provide across your organization? Is it just a compliance exercise, or is it something you are really fortifying, not just in your culture, but in recognizing that, because of our relationship with technology, security and safety are synonymous with our wellbeing?
Two: map out your processes. What are your crown jewels and your values, and where are your intervention points? What are the points at which you can reduce the burden on the individual, so that you filter out threats or engage early, and actually have those processes in place?
Three: policy. Beyond reducing the burden and looking at interventions, policy is about getting rid of the bias in the judgment. What relies on a human making a decision that could result in lost money, lost data, a data breach, or other impact? And how do we circumvent that, or put a process in place to verify and establish that trust?
TOM TARANIUK: Three excellent points, actually, which I'm going to take on board as well.
Quick-fire round
TOM TARANIUK: Paul, before we let you go, we'd like to wrap up the show with a bit of fun. Five quick-fire questions to get to know you a little bit better. Are you ready?
PAUL MASKALL: Okay. All right. I mean, I'm intrigued.
TOM TARANIUK: When choosing a digital wallet, do you go for more features or better security?
PAUL MASKALL: Probably better security, but that's my own bias.
TOM TARANIUK: Second: passwords or biometric authentication?
PAUL MASKALL: Biometric.
TOM TARANIUK: Very interesting as well. Is online fraud more about technology flaws or human error?
PAUL MASKALL: Human. I might lean that way.
TOM TARANIUK: I agree. And number four, what's one habit that you rely on to stay safe online?
PAUL MASKALL: Emotional mindfulness.
TOM TARANIUK: Number five. Finally, if you could have any other career other than the one that you can't leave, what would it be?
PAUL MASKALL: Actor.
TOM TARANIUK: Actor, really? What would you act in?
PAUL MASKALL: I'm a presenter at heart anyway, so I present probably 150 to 200 times a year as it is, and I like being on stage. So if there was anything else I would have done, it would probably be that.
TOM TARANIUK: Well, it sounds like you're following your dreams anyhow.
Wrapping up
What a great conversation we've had today with Paul Maskall. Companies, small or large, must look to protect themselves from fraudulent actors with new technologies, remits, and frameworks. Fraud is an emotional issue more than anything else, so invoking an emotional connection when addressing it is key, and case studies alone may not work. As Paul shared, emotional susceptibility is one of the main drivers of falling victim to fraud, so being able to take a step back is key. Bias needs to be dismantled, and individuals need to be open to the threats and challenges others face within the business, but also outside of it.
Secondly, whether you're a small company or a large multinational, you can implement changes and challenge the current frameworks and education programs your employees go through, to stay ahead of the threat of fraud in the new year. You can't teach through fear of fraudsters alone; you also need to put a positive spin on why you need to protect your business.
Third, and finally, make sure to sit down and map out all of your business's touchpoints and processes: money going in and out of the business through paying bills, procurement, and other channels.
And on the other side, enhance your onboarding journey so it is robust enough to filter fraudsters out from among your users and merchants. Remember: as fraudsters democratize access to nefarious technologies to coax money out of businesses and individuals, organizations such as Sumsub are on the other side of the battle doing the same, democratizing access to technologies that can stop fraud and playing the cat-and-mouse game required to stay ahead of the fraudster.