The financial services industry is at an artificial intelligence (AI) inflection point. Advances in AI technology, and the third-party vendors that offer it, present an array of exciting products, solutions, and other opportunities to save time and money.
However, these new opportunities also come with risks, including data privacy and compliance issues.
Let’s explore some of the most common third-party AI-related risks in financial services and how your financial institution can better navigate the fast-evolving AI landscape.
Related: Webinar: Managing Third-Party AI Risk: What You Need to Know Today
Before financial institutions (FIs) can address AI risk, they need to understand common AI vendor risks.
Related: TPRM 101: Top Third-Party Vendor Risks for Financial Institutions
While third-party AI risk shouldn’t be underestimated, there are some action steps FIs of all sizes can take to ensure they properly mitigate and manage risk:
Whether AI systems and solutions are used internally or via third-party relationships, it’s crucial to understand how AI is used across your organization. Regularly assess AI decision-making processes, such as automated credit scoring and loan underwriting systems and AML/CFT transaction detection services, to ensure decisions are not based on biased or incorrect data—which leads us to our next actionable insight.
Related: Credit Risk - You’ll Take the Blame If Your Vendor Doesn’t Have the Credit
One of the main critiques of AI use in financial services and other industries is bias. AI systems, particularly generative AI (genAI), can produce false, biased, or otherwise unacceptable results at any point. These outputs can create operational and compliance risks, and they can also cause reputational damage.
AI washing, which occurs when companies or organizations exaggerate or misrepresent their use of AI in marketing or communications, can also result from a vendor-related incident. For example, if a vendor distributes your FI's marketing materials, issues can arise if the system's data points or parameters are incorrect or if the distribution methods don't follow compliance guidelines.
Ensure your FI has a monitoring process to detect bias and other issues with AI systems. Implement mechanisms to detect unacceptable results early so your FI can take corrective action immediately.
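To make this concrete, here is a minimal Python sketch of what such a monitoring check might look like. The group labels, data fields, and the 80 percent threshold are illustrative assumptions, not a prescribed methodology or any specific vendor's API.

```python
# Minimal sketch: flag large approval-rate disparities in AI-driven decisions.
# Field names, groups, and the 80% threshold are illustrative assumptions only.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of dicts like {"group": "A", "approved": True}."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for d in decisions:
        totals[d["group"]] += 1
        approvals[d["group"]] += int(d["approved"])
    return {g: approvals[g] / totals[g] for g in totals}

def disparity_alerts(decisions, threshold=0.8):
    """Flag any group whose approval rate falls below `threshold` times the
    highest group's rate (a 'four-fifths rule' style screen)."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    return [g for g, r in rates.items() if best and r / best < threshold]

sample = [
    {"group": "A", "approved": True}, {"group": "A", "approved": True},
    {"group": "A", "approved": False}, {"group": "B", "approved": True},
    {"group": "B", "approved": False}, {"group": "B", "approved": False},
]
print(disparity_alerts(sample))  # ['B'] -> escalate for human review
```

In practice, a flagged result like this would trigger escalation to compliance staff so corrective action can be taken quickly.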
Related: Regulating the Future: What Financial Institutions Need to Know About AI and Regulatory Risks
Not all FIs will take—or should take—the same approach to vendor relationships and AI. Knowing your FI's risk appetite is part of safely using AI, as well as other services and technology.
An institution's risk appetite, determined by its board, should be documented in a statement that guides strategic decision-making and resource allocation. This statement typically outlines the institution's tolerance for risk in key areas like lending, technology, and operational risk.
Institutions should set a risk tolerance that accounts for both the benefits and the risks of AI use in specific areas. For example, AI used in credit scoring or decision-making should be within the institution's risk tolerance, especially if errors could lead to consumer harm.
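As a rough sketch of how a risk appetite statement might translate into something measurable, an FI could record per-use-case tolerances and compare them against observed performance. The use cases, metrics, and limits below are hypothetical examples, not regulatory figures.

```python
# Minimal sketch: express an AI risk appetite as per-use-case limits and check
# observed metrics against them. All use cases and numbers are hypothetical.
RISK_TOLERANCE = {
    # Use case: maximum acceptable error rate where consumer harm is possible.
    "credit_scoring": {"max_error_rate": 0.02, "human_review_required": True},
    "marketing_content": {"max_error_rate": 0.10, "human_review_required": False},
}

def within_appetite(use_case: str, observed_error_rate: float) -> bool:
    limit = RISK_TOLERANCE[use_case]["max_error_rate"]
    return observed_error_rate <= limit

print(within_appetite("credit_scoring", 0.05))     # False -> outside appetite
print(within_appetite("marketing_content", 0.05))  # True  -> within appetite
```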
Related: ERM 101: What’s Your FI’s Risk Appetite?
A risk management control—a measure, process, or mechanism put in place to mitigate risk—reduces the likelihood of a risk event occurring or minimizes the impact if it does occur.
FIs should implement AI controls, monitor their deployment, and adjust them as needed. For example, an FI could set up a committee to oversee ethical AI use or require human review before acting on any AI-suggested decision.
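As an illustration of the human-review control, here is a minimal Python sketch of a queue that holds AI-suggested actions until a reviewer approves them. The names and structure are assumptions for illustration, not a specific product's API.

```python
# Minimal sketch of a human-in-the-loop control: AI-suggested actions are
# queued and only become executable after an explicit reviewer decision.
from dataclasses import dataclass, field

@dataclass
class SuggestedAction:
    description: str
    approved: bool = False
    reviewer: str = ""

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)

    def submit(self, action: SuggestedAction):
        self.pending.append(action)

    def approve(self, index: int, reviewer: str):
        self.pending[index].approved = True
        self.pending[index].reviewer = reviewer

    def executable(self):
        return [a for a in self.pending if a.approved]

queue = ReviewQueue()
queue.submit(SuggestedAction("Close dormant account flagged by AML model"))
queue.approve(0, reviewer="compliance.analyst")
print([a.description for a in queue.executable()])
```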
Related: Risk Management Controls in Banking
AI is constantly evolving, and each advancement brings new challenges along with new capabilities. That's why teams involved in deploying AI systems must continually train on the latest best practices and ethical considerations as part of the organization's broader compliance training program.
FIs should also regularly assess AI usage, both internally and with vendors, to maintain a clear understanding of how data is being used.
Related: 5 Mistakes That Will Sink Your Compliance Training Program
If your FI recently integrated AI or plans to use it soon, the National Institute of Standards and Technology's (NIST) AI Risk Management Framework is an excellent resource. The framework's core functions for AI risk management are Govern, Map, Measure, and Manage.
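As one way to picture how those functions might translate into day-to-day work, here is a brief sketch that maps each function to example tasks. The tasks listed are illustrative assumptions, not NIST requirements.

```python
# Minimal sketch: map the NIST AI RMF core functions to example tasks an FI
# might track. The tasks themselves are illustrative, not NIST requirements.
AI_RMF_PLAN = {
    "Govern": ["Assign AI oversight to a committee", "Document AI policies"],
    "Map": ["Inventory AI use cases, internal and vendor-provided"],
    "Measure": ["Track error and bias metrics for each AI system"],
    "Manage": ["Prioritize and remediate issues; adjust controls as needed"],
}

for function, tasks in AI_RMF_PLAN.items():
    print(f"{function}: {', '.join(tasks)}")
```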
Ncontracts also offers various resources, including vendor management and contract management, for FIs as they begin their AI vendor journeys.
Related: The Vendor Management Solution Buyer’s Guide
A risk assessment is the first step in the risk assessment lifecycle, a multi-step process that includes audits, findings, and actions based on those findings. Before performing a risk assessment, especially for a new risk area such as vendor AI risks, it is helpful to refer to a model risk assessment provided through a knowledge-as-a-service (KaaS) provider like Nrisk.
As you examine your FI's AI risk management framework, consider the key elements of model risk management, which can also be applied to AI.
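To make the scoring side of a risk assessment concrete, here is a minimal sketch of one common convention: inherent risk scored as likelihood times impact, with residual risk discounting that score by control effectiveness. The 1-to-5 scales and the adjustment formula are illustrative, not a prescribed methodology.

```python
# Minimal sketch of a simple risk-assessment calculation.
# Scales (1-5) and the control-effectiveness adjustment are illustrative only.
def inherent_risk(likelihood: int, impact: int) -> int:
    """Likelihood and impact each scored 1 (low) to 5 (high)."""
    return likelihood * impact

def residual_risk(likelihood: int, impact: int, control_effectiveness: float) -> float:
    """control_effectiveness: 0.0 (no mitigation) to 1.0 (fully mitigated)."""
    return inherent_risk(likelihood, impact) * (1 - control_effectiveness)

# Example: an AI-driven underwriting model with strong validation controls.
print(inherent_risk(4, 5))       # 20  -> high inherent risk
print(residual_risk(4, 5, 0.6))  # 8.0 -> moderate residual risk
```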
Related: Whitepaper: Creating Reliable Risk Assessments
Too often, FIs don’t perform due diligence on their vendors, and by the time they notice red flags, the relationship is on a downward spiral. Or worse, the FI’s operations, compliance posture, reputation, finances, and consumer relationships are ruined because of the partnership.
Before using a vendor’s services or solutions, you should consider its finances and compliance posture, evaluate its security systems, and study how the vendor handles nonpublic information to ensure data security.
One valuable tool is a set of questions to ask potential vendors about their use of AI.
Another valuable tool is the Financial Services Information Sharing and Analysis Center’s (FS-ISAC) Generative AI Vendor Risk Assessment Guide. The guide aims to simplify the vendor due diligence process and includes a structured assessment model and a vendor questionnaire that facilitates reporting and record-keeping.
Related: Webinar: Mastering Vendor Tiering
If your FI notes red flags before or after a vendor relationship has begun, there are ways to address the issues.
When drafting contracts and service level agreements (SLAs), specify requirements mandating transparency and accountability around AI use. If a contract is already in place, consider addendums to address new AI-related risks, or draft new agreements as needed.
If the vendor hesitates to share the information, frame the needed transparency as a mutual benefit to enhance trust and collaboration. Remind the vendor that regulators are carefully evaluating FIs for vendor-related violations and that being proactive could save both parties headaches (and even penalties) in the long run.
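As a simple illustration of tracking those contract requirements, an FI might keep a checklist of AI-related terms and flag any agreement that is missing them. The clause names below are examples, not standard contract language.

```python
# Minimal sketch: check a vendor contract record against the AI-related terms
# the FI wants in every agreement. Clause names are illustrative examples.
REQUIRED_AI_TERMS = {
    "disclosure_of_ai_use",
    "right_to_audit_ai_outputs",
    "data_usage_and_retention_limits",
    "notification_of_model_changes",
}

def missing_terms(contract_terms: set) -> set:
    return REQUIRED_AI_TERMS - contract_terms

existing_contract = {"disclosure_of_ai_use", "data_usage_and_retention_limits"}
print(missing_terms(existing_contract))
# Missing terms -> address via an addendum or a new agreement.
```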
Related: How to Develop an SLA for Third-Party Providers
While we've discussed the risks associated with third-party AI relationships in depth, it's also important to remember why your FI is using AI in the first place. AI can significantly enhance your FI's strategic capabilities, saving you valuable time and resources. FIs adopting AI are already seeing productivity gains. The McKinsey Global Institute estimates that genAI could add between $200 billion and $340 billion in value annually across the global banking sector, or 2.8 to 4.7 percent of total industry revenues.
Third-party AI tools are also helping FIs uncover new opportunities. For example, predictive analytics, which uses data to forecast future events, can identify new market opportunities and customer needs. By analyzing large data sets, AI can identify trends and patterns that may not be immediately visible, allowing institutions to tailor their offerings and approach.
Related: Q&A: The Future of Artificial Intelligence and Contract Management
Managing AI and vendor risks can be a cumbersome task, but the right vendor management solution can help your FI stay informed about your vendors and their activities so you can mitigate risk and act with confidence.
Want to learn more about vetting and onboarding technology vendors?
Download The Ultimate Guide to Fintech and Third-Party Vendor Onboarding.