Here’s the thing about fraud: You don’t want to be calculating the ROI of fraud prevention after the fact, when it’s too late, because the downside of waiting can be treacherous. Unfortunately, the prevailing mindset at banks and other companies may be that they can’t spare the resources for anti-fraud efforts until they’ve already come under attack.
Garrett Laird, director of product management at Amount, a digital origination and decisioning SaaS platform powering consumer and small business deposit account opening and loan origination, told PYMNTS that many financial institutions (FIs) don’t reconsider their anti-fraud methods until it’s too late.
“You may not have realized it yet,” Laird said, “but they’re going to hit you.” And, he observed, “the fraudsters are jerks — and they like to hit you on holidays and on weekends, at two in the morning.”
The conversation was part of the “What’s Next in Payments” series focused on protecting the perimeter of various organizations from cyberattacks and hacks — keeping fraudsters out while letting good customers in and letting them transact with ease and speed.
Working with banks and credit unions and helping them originate credit products digitally, said Laird, means that decisioning, pricing, fraud and verification are all key considerations that must be handled simultaneously and in real time. For a bank, every new application, whether someone is opening a deposit account or applying for a loan or credit card, is a potential entry point, and if a fraudster does manage to get through, the impact can be serious.
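Handling those checks simultaneously implies firing them off concurrently and assembling the results while the applicant waits, rather than running each step one after another. As a rough illustration only, a minimal sketch of that pattern in Python might look like the following; the check functions and their return values are hypothetical placeholders, not Amount’s actual services.

```python
import asyncio

# Hypothetical stand-ins for the real-time checks named above;
# in a real platform each would call out to its own service.
async def run_decisioning(application: dict) -> dict:
    return {"approved": True}

async def run_pricing(application: dict) -> dict:
    return {"apr": 0.129}

async def run_fraud_checks(application: dict) -> dict:
    return {"risk_score": 0.12}

async def run_verification(application: dict) -> dict:
    return {"identity_verified": True}

async def evaluate_application(application: dict) -> dict:
    """Fire all checks concurrently so the applicant gets an answer in real time."""
    decisioning, pricing, fraud, verification = await asyncio.gather(
        run_decisioning(application),
        run_pricing(application),
        run_fraud_checks(application),
        run_verification(application),
    )
    return {
        "decisioning": decisioning,
        "pricing": pricing,
        "fraud": fraud,
        "verification": verification,
    }

if __name__ == "__main__":
    result = asyncio.run(evaluate_application({"name": "Jane Doe", "product": "deposit"}))
    print(result)
```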
A single account, he said, can act as a “gap” or a “loophole” that enables a broader group of criminals to take advantage, and fraudsters are notorious for seeking out banks’ “soft spots.” Thus, a single application gives way to waves of hundreds of other applications, all probing for an entry point for a scam or breach.
“We’ve been direct lenders ourselves,” he said of the platform’s capabilities, “and we have built technology that we ourselves have used, and we feel confident about giving to other financial institutions and helping them launch new products.”
A tech-enabled onboarding experience, said Laird, underpinned by artificial intelligence (AI) and machine learning, can not only beef up security but also foster a positive customer experience, so that legitimate relationships prove sticky and long-lived.
“It all leads to better conversions when you keep your customers happy,” said Laird, rather than losing that same would-be customer to an FI that offers a better user experience.
He noted there are several data sources that can be used to glean insights from emails, passwords, linked bank accounts and uploaded documents, all in the service of identity verification.
“There’s a waterfall that we can put applicants through,” he said. “Suppose we’ve just discovered a fraud ring and they’re really good at forging documents and they’re beating some [of an FI’s] controls.”

“We can put an extra layer of friction in their way,” he said, “escalating to manual review queues so the fraud operations teams can put ‘eyes’ on how that fraud ring is evolving … and not getting in the front door in the first place.”

AI, he said, powers third-party fraud models that help detect fraudulent applications, another tool in the (rules-driven) anti-fraud toolbox.
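One way to picture such a waterfall is as a chain of checks that starts with low-friction signals and escalates, step by step, toward document checks and a manual review queue. The sketch below is a conceptual illustration of that escalation pattern only; the check names, thresholds and outcomes are hypothetical and do not represent Amount’s actual rules or models.

```python
from dataclasses import dataclass, field
from enum import Enum

class Decision(Enum):
    APPROVE = "approve"            # low risk: let the good customer straight in
    STEP_UP = "step_up"            # add friction, e.g. request a document upload
    MANUAL_REVIEW = "manual"       # route to the fraud-operations review queue
    DECLINE = "decline"            # high-confidence fraud

@dataclass
class Applicant:
    email_risk: float                      # e.g. score from an email-intelligence source
    device_risk: float                     # device or behavioral signal
    document_score: float | None = None    # set once a document has been uploaded
    flags: list[str] = field(default_factory=list)

def waterfall(applicant: Applicant) -> Decision:
    """Escalating checks: cheap, low-friction signals first; friction only as needed."""
    # Step 1: passive signals look clean (hypothetical thresholds) -> approve.
    if applicant.email_risk < 0.2 and applicant.device_risk < 0.2:
        return Decision.APPROVE

    # Step 2: clearly high-risk combination -> decline outright.
    if applicant.email_risk > 0.9 and applicant.device_risk > 0.9:
        return Decision.DECLINE

    # Step 3: no document yet -> add a layer of friction and request one.
    if applicant.document_score is None:
        return Decision.STEP_UP

    # Step 4: forged-looking documents go to the manual review queue so
    # fraud ops can put "eyes" on how the ring is evolving.
    if applicant.document_score < 0.5:
        applicant.flags.append("possible_forged_document")
        return Decision.MANUAL_REVIEW

    return Decision.APPROVE
```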
“We’ve sought to be proactive about having the right data and processes in place to make decisions in an intelligent way,” he said, adding that “it’s not just about keeping out the ‘bad,’ it’s about letting the ‘good’ in and making things as painless as possible for them.”