Building a Foundation for Generative AI in Financial Services
This article explores strategic approaches for IT leaders in financial services organizations to implement GenAI, focusing on establishing a center of excellence, identifying initial use cases and maintaining regulatory compliance.
Generative AI (GenAI) has the potential to revolutionize how organizations operate by introducing innovative solutions that can enhance efficiency, improve customer experiences and drive competitive advantage. However, many financial services organizations are still in the early stages of exploring and adopting GenAI solutions.
How to lay the groundwork for GenAI success
Build an AI center of excellence
As a first step, we recommend establishing a dedicated center of excellence (CoE). This cross-functional team, comprising representatives from IT, business units, risk management, compliance and audit, will guide the organization's GenAI strategy and implementation. We also strongly recommend identifying an executive sponsor who can help make sure that GenAI projects have the necessary resources, funding and organizational support to succeed. The best practice is to have co-leads of this governance group — ideally the CIO and the head of risk. This pairing not only balances innovation with risk oversight but can also help facilitate a deeper organization-wide understanding of the potential and risks of these emerging technologies.
The CoE's responsibilities should include:
- Validating and prioritizing potential GenAI use cases based on business value, complexity, data readiness, and market/business needs and requirements, including corporate and regulatory compliance. Use cases also provide an opportunity to teach the broader organization about GenAI technologies, their associated risks and the mitigations that can be employed.
- Developing and enforcing governance, risk and compliance (GRC) policies and controls around the development, testing and deployment of GenAI models.
- Aligning with applicable regulatory requirements and engaging with oversight bodies as needed.
- Facilitating knowledge sharing, education and awareness-building across the organization.
- Ensuring that the artifacts defining the organization's standards, policies and procedures are consistent, clear and definitive.
- Getting leadership buy-in and sponsorship to spearhead execution, adoption and diligent use across the organization.
- Building continuous improvement into the CoE framework by regularly curating and updating its materials.
- Conducting post-implementation reviews of use cases to assess whether the expected results were achieved. Such learnings will help the broader organization better understand the risks and potential of GenAI technologies and foster cross-functional learning about maximizing the business value of future use cases.
By involving key stakeholders from the beginning, the CoE can help financial services IT leaders more easily manage the evolving risks and complexities of GenAI. This proven approach fosters a collaborative environment that supports the effective adoption of these transformative technologies and helps the organization learn from all use case implementations, both successful and unsuccessful.
AI education and training
Next, we highly encourage investing in comprehensive training and education programs to build GenAI literacy among your employees. From IT practitioners to business users, a shared understanding of the capabilities, limitations and best practices around GenAI is essential for successful implementation, solution scaling and risk mitigation.
The education and training program should cover a range of topics, including:
- Fundamentals of GenAI: How these models work, their underlying technologies and the key differences compared to traditional AI/ML.
- Use case identification and evaluation: Guidance on how to assess potential GenAI use cases, considering factors like business value, data readiness and risk profile.
- Responsible AI practices: Instruction on ethical AI principles, bias mitigation, data privacy and explainability requirements.
- Governance and control frameworks: Overview of the organization's GenAI policies, approval processes and monitoring procedures.
- Hands-on workshops and simulations: Opportunities for employees to experiment with GenAI tools and apply their learnings to real-world scenarios.
- Security requirements and considerations within a GenAI pipeline: Detail where, how and why security should be considered and applied.
What GenAI use cases should you pursue?
Begin with low-risk internal applications
When implementing GenAI solutions in the financial services industry, we recommend starting with relatively low-risk, non-customer-facing use cases. These early applications can demonstrate tangible results, help teams build confidence with the tools and generate buy-in for further initiatives.
Compliance and audit
Compliance and audit processes are both promising areas for early use case exploration. AI-powered tools can assist auditors by automating the review of historical documentation, transcripts and other data sources, allowing them to focus more on analysis and decision-making. This improves productivity, especially in the audit preparation process ahead of fieldwork, and helps establish a framework for the controlled use of GenAI within the organization.
Security
Other high-value use cases can be found in cybersecurity and information security. As bad actors increasingly leverage AI and machine learning (ML) to identify vulnerabilities and automate attacks, InfoSec teams can harness these same technologies to enhance their defensive capabilities. Continuous penetration testing, automated threat detection and AI-driven incident response are all areas where GenAI can strengthen an organization's security posture — especially at a time when qualified security experts can be hard to find.
Coding assistants
GenAI can be a powerful tool for coding, especially when dealing with legacy systems. It can help developers understand and remediate old code, particularly when the original creators are no longer with the company. For instance, when faced with end-of-life platform migrations, AI can analyze legacy code, provide insights and even suggest initial remediation steps. This gives developers a head start in updating or replacing outdated systems. Additionally, AI-assisted code generation can improve productivity by automating repetitive coding tasks, allowing developers to focus on more complex problem-solving. While coding represents a low-risk use case that doesn't directly impact customers, it can significantly enhance efficiency and help modernize financial institutions' technology infrastructure.
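To make the legacy-analysis idea concrete, the sketch below shows one way a team might prepare a legacy source file for LLM-assisted review: splitting it into context-window-sized chunks and wrapping each in a review prompt. The `ask_model()` wrapper is a hypothetical stand-in for whatever approved GenAI service the organization uses, and the chunk size is illustrative.

```python
# Sketch: preparing a legacy source file for LLM-assisted review.
# ask_model() is a placeholder (assumption) for the organization's
# approved GenAI endpoint; chunking keeps each request within a
# model's context window.

def chunk_source(source: str, max_lines: int = 40) -> list[str]:
    """Split a source file into line-bounded chunks for review."""
    lines = source.splitlines()
    return ["\n".join(lines[i:i + max_lines])
            for i in range(0, len(lines), max_lines)]

def build_review_prompt(chunk: str) -> str:
    """Wrap a code chunk in instructions for the model."""
    return ("You are reviewing legacy code scheduled for migration.\n"
            "Explain what this code does and flag any logic that must "
            "be preserved:\n\n" + chunk)

def ask_model(prompt: str) -> str:
    """Placeholder for the real model call -- wire up your approved service."""
    raise NotImplementedError

# Example: a synthetic 100-line legacy file yields three review prompts.
legacy_file = "\n".join(f"MOVE FIELD-{n} TO OUT-{n}." for n in range(100))
prompts = [build_review_prompt(c) for c in chunk_source(legacy_file)]
print(len(prompts))  # 100 lines / 40 per chunk -> 3 prompts
```

In practice, each prompt would be sent through the approved service and the responses collected for a developer to verify — the model's explanation is a starting point for remediation, not a substitute for review.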
What customer-facing GenAI solutions should you consider?
Financial services leaders should also explore opportunities to leverage GenAI in customer-facing scenarios, such as chatbots and virtual assistants.
- Intelligent chatbots: Advanced AI-powered chatbots can handle customer inquiries, provide product information and assist with basic transactions, offering 24/7 support.
- Automated call summarization: After customer service calls, AI can generate accurate summaries, saving agents time and improving service quality.
- Personalized customer interactions: AI can help customer service representatives access relevant customer information quickly, enabling more personalized and meaningful conversations.
- Financial calculations: AI can assist with tasks like risk profiling, portfolio recommendations based on customer risk profiles, and analysis of investment outlooks, market trends and demographic patterns.
- Customer loyalty: AI can drive higher satisfaction rates through personalized communication campaigns (e.g., direct mail, email and phone outreach).
Because they directly impact the customer experience and carry higher risks, customer-facing GenAI applications require a more cautious approach, with robust controls and more human oversight to ensure the accuracy and appropriateness of the generated outputs. These types of applications often handle sensitive customer data and financial transactions, making them prime targets for cybersecurity threats. Moreover, any errors or inconsistencies can lead to immediate customer dissatisfaction, reputational damage and potential regulatory issues.
Customer-facing AI needs to be thoroughly vetted and controlled for accuracy, compliance and appropriate responses. Employ a strong testing methodology that exercises the large language models (LLMs), training data and synthetic data behind a solution, with input conditions and permutations measured against desired results. Control testing is essential to mitigate risks and ensure the reliability of AI-generated outputs.
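As a minimal sketch of what such control testing could look like, the hypothetical harness below runs permutations of input conditions through a `generate()` stand-in (representing the deployed model) and checks every output against required and forbidden phrases. The function names, phrase lists and topics are illustrative assumptions, not part of any specific product.

```python
# Sketch of a control-testing harness for GenAI outputs.
# generate() is a stand-in (assumption) for the deployed model;
# in real control testing it would call the production pipeline.
from itertools import product

def generate(prompt: str) -> str:
    """Stand-in for the model; returns a canned, compliant reply."""
    return "Based on your profile, please consult a licensed advisor before investing."

REQUIRED = ["licensed advisor"]    # phrases every output must contain
FORBIDDEN = ["guaranteed return"]  # phrases no output may contain

def control_check(output: str) -> list[str]:
    """Return the list of control failures for one generated output."""
    text = output.lower()
    failures = [f"missing required phrase: {p}" for p in REQUIRED if p not in text]
    failures += [f"contains forbidden phrase: {p}" for p in FORBIDDEN if p in text]
    return failures

# Exercise permutations of input conditions against the desired results.
topics = ["retirement", "college savings"]
tones = ["formal", "casual"]
results = {
    (topic, tone): control_check(generate(f"Give {tone} guidance on {topic}."))
    for topic, tone in product(topics, tones)
}
passed = sum(1 for failures in results.values() if not failures)
print(f"{passed} of {len(results)} permutations passed")  # 4 of 4
```

A production harness would expand the permutation space, log every failure for human review and gate deployment on the pass rate — the phrase checks here stand in for richer evaluations of accuracy and tone.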
Regulatory considerations for GenAI
Financial organizations must navigate a complex regulatory landscape that demands a high degree of transparency and accountability. Regulatory bodies will expect these institutions to demonstrate compliance with established policies and guidelines and a clear understanding of the intent behind their use of these powerful technologies.
A key area of focus for regulators will be financial services firms' ability to prove the decision-making processes and controls surrounding their GenAI implementations. Organizations must be prepared to provide detailed documentation about how they have evaluated potential risks, trained their models and implemented safeguards for the appropriate and ethical use of AI-generated outputs.
Regulators will also scrutinize the data used to train GenAI models, looking for evidence that financial services firms have taken proactive steps to address issues of bias, privacy and data integrity. The ability to trace the provenance of data and demonstrate the deliberate curation of training datasets will be crucial in satisfying regulatory requirements.
Additionally, data sovereignty and residency are critical considerations, especially in light of regulations such as the Securities and Exchange Commission (SEC) guidelines, General Data Protection Regulation (GDPR), Gramm-Leach-Bliley Act (GLBA), Payment Card Industry Data Security Standard (PCI DSS) and Personally Identifiable Information (PII) protection laws. Financial institutions must store and process data in compliance with these regulations, which may involve keeping data within specific geographic boundaries and adhering to strict data handling protocols.
Beyond the technical aspects of GenAI deployment, financial services leaders must also be prepared to engage with regulators on the broader implications of these technologies. Discussions around the potential impact on customer experiences, the management of reputational risks and the overall governance frameworks will be essential to building trust and maintaining a constructive dialogue with oversight bodies.
Get started now
While GenAI technology may still be in its early stages, the pace of innovation shows no signs of slowing. The time to act is now.
By cultivating a culture of continuous learning and adaptation, financial services leaders can equip their organizations to navigate the evolving GenAI landscape and unlock the full potential of these transformative technologies.
This report may not be copied, reproduced, distributed, republished, downloaded, displayed, posted or transmitted in any form or by any means, including, but not limited to, electronic, mechanical, photocopying, recording, or otherwise, without the prior express written permission of WWT Research. It consists of the opinions of WWT Research and as such should not be construed as statements of fact. WWT provides the Report "AS-IS", although the information contained in the Report has been obtained from sources that are believed to be reliable. WWT disclaims all warranties as to the accuracy, completeness or adequacy of the information.