CFPB Issue Spotlight Analyzes “Artificial Intelligence” Chatbots in Banking


WASHINGTON, D.C. – The Consumer Financial Protection Bureau (CFPB) today released a new issue spotlight on the expansive adoption and use of chatbots by financial institutions. Chatbots are intended to simulate human-like responses using computer programming and help institutions reduce the costs of customer service agents. These chatbots sometimes have human names and use popup features to encourage engagement. Some chatbots use more complex technologies marketed as “artificial intelligence” to generate responses to customers.

The CFPB has received numerous complaints from frustrated customers trying to receive timely, straightforward answers from their financial institutions or raise a concern or dispute. Working with customers to resolve a problem or answer a question is an essential function for financial institutions – and is the basis of relationship banking.

“To reduce costs, many financial institutions are integrating artificial intelligence technologies to steer people toward chatbots,” said CFPB Director Rohit Chopra. “A poorly deployed chatbot can lead to customer frustration, reduced trust, and even violations of the law.”

Approximately 37% of the United States population is estimated to have interacted with a bank’s chatbot in 2022, a figure that is projected to grow. Among the top ten commercial banks in the country, all use chatbots of varying complexity to engage with customers. Financial institutions advertise that their chatbots offer a variety of features to consumers like retrieving account balances, looking up recent transactions, and paying bills. Much of the industry uses simple rule-based chatbots with either decision tree logic or databases of keywords or emojis that trigger preset, limited responses or route customers to Frequently Asked Questions (FAQs). Other institutions have built their own chatbots by training algorithms with real customer conversations and chat logs, like Capital One’s Eno and Bank of America’s Erica. More recently, the banking industry has begun adopting advanced technologies, such as generative chatbots, to support customer service needs.
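As a rough illustration of the rule-based approach described above, the sketch below matches a customer message against a small keyword table and falls back to an FAQ link when nothing matches. The keywords, responses, and URL are hypothetical assumptions for illustration only and are not drawn from any institution’s actual system.

# Minimal sketch of a rule-based chatbot: keywords trigger preset responses,
# and unrecognized messages are routed to an FAQ page.
# All keywords, responses, and the FAQ URL below are illustrative assumptions.

FAQ_URL = "https://example-bank.test/faq"  # hypothetical

PRESET_RESPONSES = {
    "balance": "Your current balance is shown under Accounts > Summary.",
    "transactions": "Your most recent transactions are listed under Activity.",
    "pay bill": "You can schedule a payment under Payments > Pay a Bill.",
}

def respond(message: str) -> str:
    """Return the first preset response whose keyword appears in the message,
    or fall back to the FAQ page when nothing matches."""
    text = message.lower()
    for keyword, reply in PRESET_RESPONSES.items():
        if keyword in text:
            return reply
    return f"Sorry, I didn't understand that. Please see our FAQ: {FAQ_URL}"

if __name__ == "__main__":
    print(respond("What's my balance?"))              # matches a keyword
    print(respond("I want to dispute a charge"))      # unrecognized: routed to FAQ

A limitation this sketch makes visible is the one the spotlight highlights: a message such as a dispute, which may invoke federal rights, falls through to a generic FAQ response unless a keyword happens to match.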

Financial products and services can be complex, and the information being sought by people shopping for or using those products and services may not be easily retrievable or effectively reduced to an FAQ response. Financial institutions should avoid using chatbots as their primary customer service delivery channel when it is reasonably clear that the chatbot is unable to meet customer needs.

The spotlight found the use of chatbots raised several risks, including:

  • Noncompliance with federal consumer financial protection laws. Financial institutions run the risk that when chatbots ingest customer communications and provide responses, the information chatbots provide may not be accurate, the technology may fail to recognize that a consumer is invoking their federal rights, or it may fail to protect their privacy and data.
  • Diminished customer service and trust. When consumers require assistance from their financial institution, the circumstances could be dire and urgent. Instead of finding help, consumers can face repetitive loops of unhelpful jargon. Consumers also can struggle to get the response they need, including an inability to access a human customer service representative. Overall, their chatbot interactions can diminish their confidence and trust in their financial institutions.
  • Harm to consumers. When chatbots provide inaccurate information regarding a consumer financial product or service, there is potential to cause considerable harm. Inaccurate answers could lead a consumer to select a product or service that does not meet their needs, or to incur fees or other penalties if they are given wrong information about making payments.

Federal consumer financial protection laws place a variety of relevant legal responsibilities on financial institutions, such as obligations to respond to consumer disputes or questions or otherwise competently interact with customers about financial products or services. When market participants deploy new technologies, they should do so in ways that comply with existing law and, ideally, improve the quality of customer care.

Read today’s issue spotlight.

The CFPB is actively monitoring the market and expects institutions using chatbots to do so in a manner consistent with their obligations to customers and under the law. The CFPB also encourages people who are having trouble getting answers to their questions because of a lack of human interaction to submit a consumer complaint with the CFPB.

Consumers can submit complaints about financial products and services by visiting the CFPB’s website or by calling (855) 411-CFPB (2372).

Employees who believe their company has violated federal consumer financial laws are encouraged to send information about what they know to whistleblower@cfpb.gov.

###

The Consumer Financial Protection Bureau (CFPB) is a 21st century agency that helps consumer finance markets work by making rules more effective, by consistently and fairly enforcing those rules, and by empowering consumers to take more control over their economic lives. For more information, visit www.consumerfinance.gov.

