As we continue to see regulators focus on innovation and emerging technologies, it comes as no surprise that the agencies are turning their attention to artificial intelligence (AI) and machine learning (ML) and how banks are employing them.
In fact, late last year, the federal agencies issued an Interagency Statement on the Use of Alternative Data in Credit Underwriting, which began offering some guidance on these topics. While recognizing the potential of such technologies and the benefits they can bring to consumers, the statement cautions financial institutions to consider the impacts of their use, particularly as they relate to fair lending laws; prohibitions against unfair, deceptive, or abusive acts or practices; and the Fair Credit Reporting Act. This guidance strikes a balance between encouraging “responsible use” of data and reminding banks of certain lending regulations.
And while the guidance is a step in the right direction, ICBA is working to ensure that community banks have sufficient information to utilize these technologies to the fullest and in accordance with applicable laws and regulations. So, when the Federal Reserve, the Federal Deposit Insurance Corp., the Office of the Comptroller of the Currency and the Consumer Financial Protection Bureau reached out this summer to meet on AI and ML, ICBA made sure community banks were at the table.
In what quickly became a discussion about the use of these technologies, ICBA shared member feedback concerning the potential benefits for community banks and their customers.
Perhaps most importantly, we emphasized the novelty of these technologies and cautioned regulators against prematurely creating parameters around their use, which could stymie market development. Some regulators are still reviewing their authorities relevant to AI applications. They will be submitting plans to the Office of Management and Budget to achieve consistency with an executive order on AI that President Trump issued in 2019.
Right now, the primary barrier to adoption of AI and ML is uncertainty about examiner expectations. And while community banks could benefit from regulatory clarity, it is also clear that any specific guidance at this juncture would get ahead of actual use cases.
That is why we encouraged regulators to let the market mature further before putting up guardrails or prohibitions. Instead, we recommended they provide illustrative examples of permissible data inputs and approved back-testing models.
AI and ML represent the next level of technology, allowing for more efficient underwriting, risk management and other processes. But in and of themselves, they do not create new products or answers.
For example, in the case of lending, AI and ML can provide a data-rich complement to a FICO score in underwriting. Viewed through this lens, it is not the technology itself that warrants scrutiny; it is how the technology is applied that needs to be evaluated.
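To make the idea of a complement to traditional underwriting concrete, here is a minimal, purely illustrative sketch in Python. The features (monthly cash flow, overdraft count), thresholds, model choice and "second look" rule are all hypothetical assumptions made for illustration; they do not come from the interagency statement, ICBA, or any particular bank's process.

```python
# Illustrative sketch only: a hypothetical underwriting workflow in which a
# machine-learning score built from alternative data complements, rather than
# replaces, a traditional FICO-based rule. All features, thresholds, and data
# below are invented for illustration.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(seed=0)

# Synthetic applicants: a FICO score plus two hypothetical alternative-data
# features (average monthly cash flow and overdraft count over the past year).
n = 1_000
fico = rng.integers(550, 830, size=n)
cash_flow = rng.normal(2_500, 900, size=n)
overdrafts = rng.poisson(1.5, size=n)

# Synthetic repayment outcomes, loosely tied to the features above.
p_repay = 1 / (1 + np.exp(-(0.01 * (fico - 650) + 0.0005 * cash_flow - 0.3 * overdrafts)))
repaid = rng.random(n) < p_repay

# Train a simple model on the alternative-data features only, so its output
# reads as a complement to the FICO score rather than a substitute for it.
X_alt = np.column_stack([cash_flow, overdrafts])
model = GradientBoostingClassifier().fit(X_alt, repaid)

def underwrite(fico_score: int, monthly_cash_flow: float, overdraft_count: int) -> str:
    """Hypothetical decision rule: approve on strong FICO alone; otherwise
    consult the alternative-data model for a 'second look'."""
    if fico_score >= 720:
        return "approve (traditional criteria)"
    alt_score = model.predict_proba([[monthly_cash_flow, overdraft_count]])[0, 1]
    if alt_score >= 0.8:
        return "approve (second look via alternative data)"
    return "refer to manual review"

print(underwrite(680, 3200.0, 0))
```

In a sketch like this, it is the application, not the algorithm, that raises the compliance questions the agencies flagged: which data inputs are permissible, and whether the decision rule holds up under fair lending and FCRA scrutiny.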
With projections that AI and ML could save the financial services industry $1 trillion, we expect these technologies will continue to take hold in banking, offering efficiencies in everything from lending processes to risk management solutions. But, as with any technology that community banks use, whether directly or through partners, it is important to thoroughly understand it and know how to mitigate any associated risks.
Community banks can identify the sweet spot between risk and innovation that allows them to leverage AI and ML for the customer’s benefit and ensure compliance in the process. And ICBA will continue to work with regulators and others to support this evolution.
Michael Emancipator is ICBA vice president and regulatory counsel.