The Financial Revolutionist

Bridging the trust gap between AI and human advice

The chorus of financial services brands that say they’re exploring the use of generative artificial intelligence to improve customer experiences keeps growing. The hope is that customer frustrations with rules-based chatbots will be addressed by generative AI agents that can offer contextually relevant answers to customer questions. They’ll also be able to interpret slang, misspelled words and terms regular people use instead of financial jargon. (Think “I want to get paid electronically” instead of “I want to set up a direct deposit.”)

One hurdle banks and fintechs face is a customer base that may not fully trust the input it gets from an AI agent.

The findings of a recent FINRA Investor Education Foundation study shed light on the consumer trust deficit. A survey of 1,033 consumers earlier this year found that just 5% said they used AI when making financial decisions, compared to 63% who said they sought advice from financial professionals and 56% who spoke with friends and family. About a quarter of respondents said they consulted financial management apps, though it’s not clear whether those apps were AI-supported.

FINRA Foundation researchers presented four different hypothetical statements about personal finance to survey participants. For each statement, half of study participants were told the information was provided by AI, while the other half were told it was offered by a human financial professional. 

Interestingly, the trust gap between AI and humans narrowed on some questions. For example, in response to a statement on stock and bond performance, 34% of respondents said they trusted it when it was attributed to AI, compared to 33% who trusted it when it was attributed to a financial professional. Distrust ran higher for AI, though: 24% of respondents distrusted the AI-attributed information, compared to 19% who said they didn’t trust the human input.

“If there's a lesson here, it's that one should not make assumptions about consumer responses to AI and that we should strive to make financial information and its sources work well for everybody,” Gerri Walsh, president of the FINRA Foundation, told American Banker last week.

Experts who spoke with the industry publication said the level of consumer trust in AI-powered input likely depends on the complexity of the question.

“If I want to check my balance, I want to make a payment, then automation is great,” said Max Ball, principal analyst at Forrester Research. “If it is fraught, if it is something really important to me like my financial future … I need something that has the human element and the human empathy.”

A key takeaway for banks and financial firms is to be transparent with consumers about what an AI-powered assistant can do, suggested Andrew Way, senior director of research at Corporate Insight.

Others commenting on the study’s conclusions say AI can be a useful tool for informing the advice human financial advisers provide to their clients.

“From an investment perspective, AI has the possibility of both shortening and improving the research process for advisors, allowing them to get informed ideas to clients faster,” Nathan Wallace, wealth manager at New York-based Savvy Advisors, told Financial Planning earlier this month. “We do not consider AI a threat to human financial advice, but rather an amazing tool that supercharges our advisors’ ability to deliver insightful and timely advice to clients.”

What are your thoughts on consumer trust in AI-provided financial advice? Let us know at editorial@thefr.com.