Santander has created a series of deepfake videos to warn people how realistic they already are and highlight the threat from fraudsters.
The bank, which has a branch in Bolton, is placing the videos, which depict its fraud lead Chris Ainsley and influencer Timi Merriman-Johnson, also known as @mrmoneyjar, on social media to help raise awareness.
In one of the videos, Mr Ainsley appears to say: “For scammers, it is a powerful tool they can use to steal your money.”
He goes on to list the tell-tale signs of a potential deepfake, saying: “Look for blurring around the mouth. The person might blink less frequently than usual.
“If they are wearing glasses, the light reflections might not look right. The background may not feel natural. If something looks strange, trust your instincts.”
The video shows Mr Ainsley saying HM Revenue and Customs (HMRC) and banks are often impersonated.
He adds: “I’ve even been impersonated myself. This video is just the latest example. As you might have guessed, this isn’t me, this is a deepfake, created to warn you about deepfakes.”
He adds that people should consider whether a video was received from a reliable source and whether it is asking for money.
“If they are asking for money, that’s a big red flag,” he says.
The video depicting Mr Merriman-Johnson, meanwhile, purports to offer an “incredible investment opportunity”.
The real Mr Merriman-Johnson then says: “That definitely wasn’t me. That was a deepfake video,” adding that it is “something we’re all likely to see more of in the future”.
Deepfakes are videos, audio recordings or images of real people that have been digitally manipulated using artificial intelligence (AI) to convincingly misrepresent a person or organisation.
Santander warned that with generators and software widely available, fraudsters simply require authentic footage or audio of their intended victim – often found online or through social media – to create a deepfake.
An Opinium survey of 2,000 people for Santander in July found that just over half (53%) of people have either not heard of the term deepfake or do not understand what it means, while just 17% were confident they could easily identify a deepfake video.
The survey also indicated that many people have encountered a deepfake, often on social media. Over a third (36%) of people surveyed said they have knowingly watched a deepfake.
Six in 10 (59%) people said they are already more suspicious of what they see or hear because of deepfakes.
Mr Ainsley, head of fraud risk management at Santander, added: “Generative AI is developing at breakneck speed, and we know it’s ‘when’ rather than ‘if’ we start to see an influx of scams with deepfakes lurking behind them.
“We already know fraudsters flood social media with fake investment opportunities and bogus love interests, and unfortunately, it’s highly likely that deepfakes will begin to be used to create even more convincing scams of these types.
“More than ever, be on your guard and just because something might appear legitimate at first sight – doesn’t mean it is.”
Mr Merriman-Johnson said: “As I said in the video, if something sounds too good to be true, it probably is.”
He added: “If you are ever in doubt as to whether a company or individual is legitimate, you can always search for them on the Financial Conduct Authority Register.”
Here are Santander UK’s top tips to spot a deepfake:
1. Most deepfakes are still imperfect. Whether there is blurring around the mouth, less blinking than normal, or odd reflections – look out for the giveaways.
2. But at some point, deepfakes will become impossible to distinguish from real videos, so context is important. Ask yourself the same common-sense questions you do now. Is this too good to be true? If this is real, why isn’t everyone doing it? If this is legitimate, why are they asking me to lie to my family or my bank?
3. Know what types of scams deepfakes are likely to be used for. Criminals are most likely to use deepfakes in investment scams and impersonation fraud, such as romance scams. If you know the tell-tale signs of these scams, you will be able to spot them – even if a deepfake has been used.