New Data Shows 70% of Fintechs Report Increased Fraud as Synthetic Identities Become Harder to Detect
IDology, a GBG Company, today announced new data highlighting the growing problem of fraud fueled by generative artificial intelligence (Gen AI), a trend that is putting pressure on fintechs to protect against fraud without compromising the customer experience.
Gen AI has given criminals a path to work faster, scale attacks, and create more believable phishing scams and synthetic identities. Fraud is taking place on an industrial scale, and fintechs remain a prime target.
With Gen AI in the hands of fraudsters, fintechs are more vulnerable than ever. As synthetic identities become increasingly believable and difficult to detect, having an accurate and complete picture of customers is mission-critical both for deterring fraud and for improving the onboarding experience.
"These numbers indicate a need for action," said James Bruni, Managing Director at GBG IDology. "While Gen AI is being used to escalate fraud tactics, its ability to quickly scrutinize vast volumes of data can also be a boon for fintechs, allowing them to fast-track trusted identities and escalate those that are high-risk. The powerful combination of AI, human fraud expertise and cross-sector industry collaboration will help fintechs verify customers in real-time, authenticate their identities and monitor transactions across the enterprise and beyond to protect against difficult-to-detect types of fraud, such as synthetic identity fraud."
To learn more about GBG IDology's end-to-end coverage of the customer identity lifecycle, including standalone and multi-layered capabilities, thousands of diverse data sources, and leading technology for document authentication, tamper detection, and ID + selfie verification, stop by booth #3601 during Money20/20, October 27-30 in Las Vegas, Nevada.