Visa employs AI and machine learning to combat fraud, according to James Mirfin, the company's global head of risk and identity solutions. Between October 2022 and September 2023, Visa blocked $40 billion in fraudulent activity, nearly double the figure from the previous year.
Scammers use AI to generate primary account numbers (PANs) and test them repeatedly. This technique, known as an enumeration attack, involves submitting combinations of PANs, CVVs, and expiration dates until a valid combination is found, and it results in $1.1 billion in annual fraud losses. To detect and block such attacks, Visa assigns a real-time risk score to every transaction, which is particularly important for card-not-present (online) transactions, where no physical card is involved.
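Visa has not published the internals of its scoring model, but the pattern behind enumeration attacks lends itself to a simple illustration. The Python sketch below is a minimal, hypothetical velocity check: it flags bursts of authorization attempts against the same BIN (the first six digits of a PAN), the signature of automated card testing. The thresholds, function names, and scoring weights are assumptions made purely for illustration and are unrelated to Visa's production system, which Mirfin says evaluates more than 500 attributes per transaction.

```python
from collections import defaultdict, deque
import time

# Hypothetical thresholds for illustration -- not Visa's actual parameters.
MAX_ATTEMPTS = 20      # attempts per BIN tolerated inside the window
WINDOW_SECONDS = 60    # sliding-window length in seconds

# BIN (first six digits of the PAN) -> timestamps of recent authorization attempts
attempts_by_bin = defaultdict(deque)

def risk_score(pan: str, declined: bool, now: float | None = None) -> float:
    """Return a 0-1 risk score for a single authorization attempt.

    Enumeration attacks fire many PAN/CVV/expiry combinations in a short
    burst, mostly producing declines, so a velocity check on the BIN is
    enough to surface the pattern in this toy example.
    """
    now = time.time() if now is None else now
    window = attempts_by_bin[pan[:6]]

    # Evict attempts that have fallen out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(now)

    velocity_risk = min(len(window) / MAX_ATTEMPTS, 1.0)
    decline_risk = 0.3 if declined else 0.0
    return min(velocity_risk + decline_risk, 1.0)

if __name__ == "__main__":
    # A burst of declined attempts against one BIN drives the score toward 1.0.
    for i in range(30):
        score = risk_score("4111111111111111", declined=True, now=1000.0 + i)
    print(f"risk after burst: {score:.2f}")
```

A production system would combine far more signals, such as merchant history, device fingerprints, and geography, and feed them to a trained model rather than a hand-tuned rule.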
“We look at over 500 different attributes around [each] transaction, we score that and we create a score – that’s an AI model that will actually do that. We do about 300 billion transactions a year,” Mirfin told CNBC.
The company also rates the likelihood of fraud for token provisioning requests to counter social engineering and other scams. “Every single one of those [transactions] has been processed by AI. It’s looking at a range of different attributes and we’re evaluating every single transaction,” Mirfin said.
In the past five years, Visa has invested $10 billion in technology to reduce fraud and enhance network security. However, cybercriminals are now using generative AI, voice cloning, and deepfakes to create more convincing scams. “Romance scams, investment scams, pig butchering — they are all using AI,” Mirfin noted.
Mirfin explained: “If you think about what they’re doing, it’s not a criminal sitting in a market picking up a phone and calling someone. They’re using some level of artificial intelligence, whether it’s voice cloning, whether it’s a deepfake, whether it’s social engineering. They’re using artificial intelligence to enact different types of that.”
“So, if you see a new type of fraud happening, our model will see that, it will catch it, it will score those transactions as high risk and then our customers can decide not to approve those transactions,” Mirfin added.
One such tactic, known as “pig butchering,” involves scammers building a relationship with a victim over time and then convincing them to invest in fake cryptocurrency platforms.