
Pros and cons of ChatGPT for finance and banking


ChatGPT and other large language models (LLMs) show tremendous promise in streamlining many aspects of finance and banking. At the same time, they introduce new challenges that teams must address to ensure safety, maintain compliance and mitigate risk.

“While many banks remain cautious due to the heavily regulated nature of the financial services industry, the use of generative AI is likely to increase in the coming years,” said David Donovan, executive vice president and head of financial services, North America, at digital transformation consultancy Publicis Sapient.

Financial institutions will likely see the best outcomes by focusing on use cases where ChatGPT has already shown positive results.

“At this point in development, companies in the financial sector can have a degree of confidence when using LLMs for practice areas where these tools have a demonstrated record of success,” said Fabio Caversan, vice president of digital business and innovation at digital consultancy Stefanini Group.

These areas include customer service and support, marketing and human resources. These functions primarily involve the use of language, which ChatGPT and other LLMs handle well. However, “at this point, I don’t think that LLMs’ numerical and statistical capabilities are at a high enough level to recommend using them for fraud detection or financial forecasting,” Caversan said.

Banks and other financial institutions are cautiously exploring new use cases, but Caversan has yet to see any breakout successes. It will take more time to fully understand LLMs’ potential for effectiveness in financial services.

9 use cases for LLMs in finance and banking

Industry experts believe ChatGPT-like technology could provide value to the financial sector in several ways.

1. Summarizing complex insights

Priya Iragavarapu, vice president of data and analytics delivery at digital transformation consultancy AArete, expects to see ChatGPT and other LLMs play a leading role in communicating complex insights derived from critical analysis — such as gap, trend and spend analysis — as well as forecasting. LLMs could summarize these insights into plain English to provide a unique, customized message for each customer.
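
A minimal sketch of what this could look like, assuming the OpenAI Python SDK (v1.x): the analysis results, model name and prompt wording below are hypothetical, and the figures would come from an upstream analytics pipeline rather than from the model itself.

```python
# Sketch: turning pre-computed spend-analysis results into a plain-English
# customer message. Assumes the OpenAI Python SDK (v1.x); the model name,
# prompt wording and data fields are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical output of an upstream spend/trend analysis pipeline.
analysis = {
    "customer": "Acme Manufacturing",
    "period": "Q2",
    "spend_change_pct": 12.4,
    "top_category": "logistics",
    "notable_trend": "recurring overdraft fees on the operating account",
}

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "You write short, plain-English summaries of financial "
                       "analysis for bank customers. Avoid jargon and do not "
                       "invent figures that are not provided.",
        },
        {
            "role": "user",
            "content": f"Summarize these findings for the customer:\n{analysis}",
        },
    ],
)

print(response.choices[0].message.content)
```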

2. Streamlining underwriting processes

Administering mortgages and auto and personal loans involves a thorough underwriting process with regulatory requirements. Iragavarapu believes that LLMs can assist in this process when domain-specific data is incorporated into model training.
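
As one possible reading of incorporating domain-specific data into model training, the sketch below submits a supervised fine-tuning job built on a hypothetical JSONL file of anonymized underwriting examples, using the OpenAI fine-tuning API. Whether fine-tuning, retrieval or plain prompting is the right mechanism depends on the institution's data, and any real training set would need compliance and privacy review first.

```python
# Sketch: fine-tuning a base model on domain-specific underwriting examples.
# Assumes the OpenAI Python SDK (v1.x); the file name and model are
# placeholders, and real underwriting data would need anonymization and
# compliance review before use.
from openai import OpenAI

client = OpenAI()

# underwriting_examples.jsonl: one chat-formatted example per line, e.g.
# {"messages": [{"role": "system", "content": "You are an underwriting assistant."},
#               {"role": "user", "content": "<application summary>"},
#               {"role": "assistant", "content": "<underwriter's documented assessment>"}]}
training_file = client.files.create(
    file=open("underwriting_examples.jsonl", "rb"),
    purpose="fine-tune",
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # placeholder base model
)
print(job.id, job.status)
```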

3. Enhancing customer service and experience

Sujatha Rayburn, vice president of information management and analytics at Delta Community Credit Union, Georgia’s largest credit union, is exploring how LLMs could improve existing chatbots. She has found that LLMs complement traditional chatbot approaches with better semantic and language processing and natural conversational capabilities. This helps them perform complex, multistep tasks and assist in customer decision-making.

4. Automating compliance

Rayburn is also experimenting with using generative AI to reduce compliance costs by decreasing the time and labor expended on document reading and comprehension. For example, automated regulatory relevance checking might take in complex regulatory conditions and match them against aspects of the business. This could help identify potential compliance violations with know-your-customer and anti-money laundering processes that require identity verification, checks against sanctions lists and suspicious transaction monitoring.
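
As a rough sketch of how such relevance checking might work, the snippet below scores hypothetical internal process descriptions against a regulatory clause using text embeddings and cosine similarity. The model name, example texts and threshold are assumptions, and similarity scores would only be a first-pass filter ahead of human review.

```python
# Sketch: flag internal process descriptions that appear relevant to a
# regulatory clause using embedding similarity. Assumes the OpenAI Python
# SDK (v1.x); model name, texts and threshold are illustrative only.
import math
from openai import OpenAI

client = OpenAI()

clause = ("Institutions must verify customer identity and screen customers "
          "against applicable sanctions lists before account opening.")

processes = [
    "New-account workflow: collect ID documents and run sanctions screening.",
    "Marketing team schedules quarterly email campaigns to existing customers.",
    "Transaction monitoring rules flag unusually large cross-border transfers.",
]

resp = client.embeddings.create(
    model="text-embedding-3-small",
    input=[clause] + processes,
)
vectors = [item.embedding for item in resp.data]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

clause_vec, process_vecs = vectors[0], vectors[1:]
for text, vec in zip(processes, process_vecs):
    score = cosine(clause_vec, vec)
    flag = "RELEVANT" if score > 0.5 else "likely unrelated"  # arbitrary threshold
    print(f"{score:.2f}  {flag}  {text}")
```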

5. Assessing and managing risk

Donovan is exploring LLMs’ ability to analyze vast amounts of data to identify patterns, trends and correlations in the area of risk assessment. This capability could enable financial institutions to make better risk assessments and more informed lending decisions, leading to lower default rates and higher profitability.

6. Improving personalization

Peter-Jan Van De Venn, vice president of global digital banking at digital consultancy Mobiquity, is investigating how LLMs can help banks adjust the language and content of their digital interactions to match clients’ specific needs, preferences and behavior. This, in turn, could make customer journeys more personalized.

7. Automating document processing

Iu Ayala, CEO and founder of AI consultancy Gradient Insight, is looking at how ChatGPT can automate processing of financial documents, such as loan applications, insurance claims and account opening forms. By understanding and extracting relevant information from unstructured data, LLMs can accelerate and streamline document processing, resulting in improved efficiency and reduced processing times.
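
A minimal sketch of that extraction step, assuming the OpenAI Python SDK and a made-up fragment of application text: the model is asked to return only the fields of interest as JSON, which downstream systems could then validate against the source documents.

```python
# Sketch: extracting structured fields from unstructured application text.
# Assumes the OpenAI Python SDK (v1.x); the document text, field list and
# model name are illustrative, and extracted values should be validated
# against source documents before any decision is made.
import json
from openai import OpenAI

client = OpenAI()

document_text = (
    "I, Jane Example, am applying for a personal loan of $25,000 over "
    "48 months to consolidate existing credit card debt. My annual income "
    "is approximately $82,000."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    response_format={"type": "json_object"},
    messages=[
        {
            "role": "system",
            "content": "Extract applicant_name, loan_amount_usd, term_months, "
                       "purpose and annual_income_usd from the text. Reply with "
                       "a JSON object only; use null for missing fields.",
        },
        {"role": "user", "content": document_text},
    ],
)

fields = json.loads(response.choices[0].message.content)
print(fields)
```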

8. Translating legalese into plain language

Jay Jung, president and founder of strategic consulting firm Embarc Advisors, is exploring how ChatGPT can help interpret legal documents such as term sheets, purchase agreements and side letters. In his experience, most of these are written in legalese, which is difficult for nonexperts to understand — but ChatGPT can quickly translate such documents into succinct plain language.

9. Generating executive briefs

Jung has also explored ChatGPT’s ability to package financial analysis and insights into executive briefs. Writing up key bullet points is quick, but packaging takeaways into board or executive briefs can be laborious. “ChatGPT is a great tool where I can drop in the bullets and then have ChatGPT write out the brief and adjust the tone,” Jung said. “It saves me a ton of time.”
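
A minimal sketch of that workflow, assuming the OpenAI Python SDK; the bullet points, model name and tone instruction are placeholders for whatever an analyst would actually supply.

```python
# Sketch: turning analysis bullet points into a short executive brief with a
# specified tone. Assumes the OpenAI Python SDK (v1.x); bullets, model name
# and tone are illustrative only.
from openai import OpenAI

client = OpenAI()

bullets = """\
- Quarterly revenue up 8% quarter over quarter, driven by the SMB segment
- Gross margin down 1.5 points on higher funding costs
- Recommend holding new branch openings until rates stabilize
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Turn the provided bullet points into a one-paragraph "
                       "board brief in a formal, neutral tone. Do not add facts "
                       "beyond the bullets.",
        },
        {"role": "user", "content": bullets},
    ],
)
print(response.choices[0].message.content)
```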

Challenges of using ChatGPT in finance

Financial enterprises, which tend to be risk averse, should proceed cautiously when adopting generative AI. Addressing the following challenges is key to safely scaling use of ChatGPT and other LLMs.

  • Duty of care. Van De Venn is concerned about how using ChatGPT could affect duty-of-care requirements. In his view, first-line support on simple questions about products is acceptable, but providing financial advice is typically bound by regulations designed to protect consumers from unsuitable advice. “It’s up to the regulators to act fast and provide clarity,” he said.
  • False information. Given LLMs’ propensity to fabricate false information that appears legitimate, Caversan said he has qualified personnel vet all results and use cases for accuracy. Organizations should focus on AI augmentation rather than end-to-end automation, combining tools’ benefits with employees’ creativity and critical thinking skills.
  • Data security and privacy. Rayburn is concerned about ensuring the security and privacy of customer data when using LLMs. Organizations must protect sensitive data entered into chat interfaces from unauthorized access or disclosure.
  • Contextual understanding. According to Rayburn, the current crop of LLMs lacks nuanced understanding of a customer’s context. Further training on vast amounts of banking data and documents will be required for models to understand the language and terminology specific to the banking industry.
  • New cost models. Rayburn also mentioned that LLMs require significant investment in specialized hardware, software and cloud services to operate efficiently. Organizations must establish new approaches to evaluate the total cost of ownership for new LLM services, including the workflows required to properly customize and contextualize new models.
  • Bias. Ayala said that organizations must take care to avoid perpetuating biases present in models’ training data, which could have detrimental consequences in finance. Implementing bias detection, ongoing monitoring and continuous model improvement can help mitigate these risks, as illustrated in the sketch after this list.
  • Transparency and explainability. Popular LLMs like ChatGPT currently have limited ability to explain their reasoning or cite sources for their claims. The EU’s Digital Services Act, due to go into effect in 2024, mandates that firms provide algorithmic transparency for decisions made by online platforms. Ayala said that finance firms must develop techniques to understand and interpret LLM output to ensure compliance with regulatory requirements and provide clarity to users.
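
To make the bias point above concrete, here is a minimal, generic sketch of the kind of monitoring check Ayala describes: it compares approval rates across a protected attribute in a hypothetical decision log and applies the common four-fifths rule of thumb. The data, group labels and threshold are illustrative, and a real program would rely on established fairness tooling and legal guidance rather than this simple ratio alone.

```python
# Sketch of a simple fairness check on model-assisted loan decisions:
# compare approval rates across a protected attribute and compute the
# disparate-impact ratio. Data, group labels and the 0.8 threshold
# (the "four-fifths rule") are illustrative assumptions.
from collections import defaultdict

# Hypothetical decision log: (group label, approved?)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

totals, approvals = defaultdict(int), defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    approvals[group] += int(approved)

rates = {g: approvals[g] / totals[g] for g in totals}
print("approval rates:", rates)

ratio = min(rates.values()) / max(rates.values())
print(f"disparate-impact ratio: {ratio:.2f}")
if ratio < 0.8:  # four-fifths rule of thumb
    print("Warning: approval rates differ enough to warrant review.")
```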


