
Three Reasons Finance Pros Don’t Love Your AI (Yet)


Marne Martin is CEO of Emburse, whose innovative travel and expense (T&E) solutions power forward-thinking organizations.

While companies pour resources into AI, finance departments have mixed emotions about the very investments made to benefit them.

My company’s survey of 1,500 finance professionals across the U.S. and U.K. revealed that two-thirds of respondents’ companies had invested in AI. They’re greenlighting AI-driven enterprise tools to enhance internal processes, purchasing GenAI licenses and allocating a substantial portion of their own finance budgets to AI-related initiatives.

But fewer than one in five named AI as a top departmental priority. Instead, cost-cutting and compliance continue to dominate finance agendas. Even as they approve its purchase left and right, finance departments aren’t using AI to its full extent within their own functions.

Where Finance Teams Are: Three Major AI Concerns

I interpret this gap between investment and prioritization to mean there’s still a lack of trust in AI. Finance professionals have real concerns about AI accuracy and risk. Some share the feeling that they’re being left out of the conversation. If technology providers want to win users’ hearts (and their dollars), we need to start understanding and addressing the cultural tensions shaping AI adoption.

1. AI is designed for Gen-Z, but today’s workforce is multigenerational.

It’s not surprising that younger generations are quicker to adopt AI. It’s even less surprising when you consider the lack of training. Just 38% of companies offer formal AI education.

Digital natives can often teach themselves a new app from a YouTube video, but it’s a different story for Baby Boomers. Many tech providers build products with their early adopters in mind, but that subset doesn’t always reflect the whole user base.

For the first time in history, we have five generations represented in the workforce. That means tech companies must account for users with a greater diversity of learning styles and technical literacy than ever before. If you want to encourage adoption across the entire finance department, you need to design for multigenerational teams with unequal access to training and AI experience.

2. Privacy and accuracy outweigh AI’s potential efficiencies.

Many finance professionals still approach AI with a healthy dose of skepticism. Some of the primary concerns today include data privacy and security, risk of errors or inaccurate outputs and whether AI-generated insights can be trusted at all.

That concern is justified. If AI sits at the heart of an organization’s most critical systems, it must meet the highest standards of compliance and performance to earn finance’s trust. Finance professionals should be reassured that their data isn’t used to train commercial models or shared with third parties. They should know their customer data is safe and secure in virtual private clouds, for example. Large language models (LLMs) that generate accurate AI insights require domain specificity; explaining how your LLMs are trained on proprietary finance datasets may help quell those fears.

Sometimes, building trust means refining AI products based on customer feedback. At Emburse, for instance, users found the reliability of our OCR technology frustrating. OCR kicks off nearly every expense workflow, with users snapping a photo of a receipt or uploading an invoice for reimbursement. Our accuracy hovered at around 60% to 80%. On the low end, the solution especially struggled with global currencies and languages.

This level of accuracy wasn’t good enough for global enterprise customers. So, we ditched our third-party OCR solution and built our own. We trained our models on more than 50,000 real receipts and invoices, including multiple currencies, date formats, languages and tax structures. We layered in quality control checks that use AI to audit AI, cross-validating labels with competing models and escalating mismatches for human review.
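The "AI auditing AI" idea above can be illustrated with a short sketch: run two independent extraction models over the same receipt and escalate any field-level disagreement for human review. This is a hypothetical illustration, not Emburse's actual pipeline; the field names and `cross_validate` helper are assumptions for the example.

```python
# Hedged sketch of cross-validating OCR labels with competing models.
# The two model outputs are stubbed as plain dicts; in a real pipeline
# they would come from separate extraction models.

FIELDS = ["vendor", "date", "currency", "total"]

def cross_validate(primary: dict, secondary: dict) -> dict:
    """Compare two models' field labels; flag disagreements for review."""
    mismatches = [f for f in FIELDS if primary.get(f) != secondary.get(f)]
    return {
        "fields": primary,                      # primary model's labels
        "needs_human_review": bool(mismatches),  # escalate on any mismatch
        "mismatched_fields": mismatches,
    }

# Example: the models agree on everything except the total amount.
a = {"vendor": "Cafe Rio", "date": "2024-03-02", "currency": "USD", "total": "18.40"}
b = {"vendor": "Cafe Rio", "date": "2024-03-02", "currency": "USD", "total": "18.49"}
result = cross_validate(a, b)
print(result["needs_human_review"])   # True
print(result["mismatched_fields"])    # ['total']
```

The design choice worth noting: escalating only mismatched documents keeps human review focused on the small slice of extractions where the models disagree, rather than sampling everything.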

As a result, accuracy levels improved significantly. We knew getting OCR right would be a turning point for building customer trust in AI. By working to improve outputs, providers can ease concerns around reliability and errors.

3. They don’t trust what they can’t see—and if they don’t trust, they’ll leave.

The AI trust gap doesn’t just affect adoption. It can also affect retention. Almost one-quarter of finance professionals said they would consider changing jobs if their employer failed to take sufficient precautions to manage AI risk. Thirty percent said they might leave if their company prioritized AI investments over human development.

That’s a clear message for their employers: manage your AI investments carefully, or we’ll walk. But what should fintech providers take from this?

Finance professionals, perhaps more than those in any other function, want to know how things work under the hood. Understandably, they want to know whether the technology they’re being asked to use is a partner or a threat. They navigate strict compliance standards. They need to understand how models work, where risk lives and when human review and oversight are necessary.

In the spend management space, for example, we see a lot of platforms with AI-driven fraud detection or policy enforcement, but few that show finance teams how these systems actually perform.


As the CEO of a fintech company developing AI-powered products, I’ve found one way to close the understanding gap is to give finance teams direct visibility into how your AI is performing. That could look like a tool or dashboard that illustrates where the AI is highly accurate or where human intervention might be needed. Being able to track model accuracy over time and understand how their inputs affect future outputs is what can convert skeptical users into advocates.
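The dashboard idea described above can be sketched in a few lines: track per-field accuracy over a rolling window of human-verified extractions and flag any field that drops below a threshold as needing intervention. The class name, window size and 95% threshold are illustrative assumptions, not a real product spec.

```python
from collections import defaultdict, deque

class AccuracyTracker:
    """Rolling per-field accuracy over the last N human-verified results.

    Hypothetical sketch of the kind of visibility a dashboard could give
    finance teams: where the AI is accurate, and where review is needed.
    """

    def __init__(self, window: int = 500):
        # Each field keeps only its most recent `window` outcomes.
        self.results = defaultdict(lambda: deque(maxlen=window))

    def record(self, field: str, correct: bool) -> None:
        """Log whether the model's label for `field` matched human review."""
        self.results[field].append(correct)

    def accuracy(self, field: str) -> float:
        r = self.results[field]
        return sum(r) / len(r) if r else 0.0

    def needs_review(self, field: str, threshold: float = 0.95) -> bool:
        """Flag fields whose rolling accuracy falls below the threshold."""
        return self.accuracy(field) < threshold

# Usage: 90 correct and 10 incorrect "total" extractions.
tracker = AccuracyTracker(window=100)
for ok in [True] * 90 + [False] * 10:
    tracker.record("total", ok)
print(tracker.accuracy("total"))      # 0.9
print(tracker.needs_review("total"))  # True
```

Because the window is rolling, the metric reflects recent model behavior rather than lifetime averages, which is what lets users see accuracy improve over time as the model is refined.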

Finance is a high-stakes, audit-heavy function, and fintech providers must be prepared to develop products that meet the industry’s growing demands for clearer AI performance metrics.

Beloved AI products start with better listening.

Finance teams need to believe that their expertise still holds value and that their organizations are investing not just in AI, but in them. It’s up to us to understand where finance teams are coming from and deliver AI solutions that build trust between them and their tools.

The stickiest, most trusted technology companies know loyalty is a game won in the margins between “user” and “evangelist.”


Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.
