Artificial intelligence
Artificial intelligence, or AI, is creating new possibilities in a variety of fields, including financial services. However, AI also introduces risks, including safety and soundness risks for lending institutions.
As the regulator of the Farm Credit System, the Farm Credit Administration wants to ensure that System institutions have the resources they need to manage these risks while taking advantage of the benefits that AI affords.
In 2023, the FCA board adopted innovation as one of our agency’s top priorities, and in January 2024, the board issued an innovation philosophy statement (PDF) to lay out our position on innovation, including AI, in the System and the agency. The statement included a five-part framework to guide our efforts in this area.
Listed below are resources from our regulations, our examination manual, and outside sources that provide relevant guidance to System institutions on managing the risks associated with AI. We will continue to monitor the use of AI in the System, so check this page periodically for updates and additional guidance.
Statutory and regulatory requirements
System institutions should consider all statutory and regulatory requirements when deciding whether and how to use AI in their business operations. Depending on how it is used, AI-based technology may trigger legal requirements in several areas, such as electronic communications, credit decisions, and risk management.
System institutions should also remain vigilant against model bias to ensure that AI-based systems do not impair their ability to comply with nondiscrimination and data privacy laws. Using AI does not absolve an institution of its responsibility to comply with legal requirements.
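To make the model bias point concrete, here is a minimal sketch of one common screening technique: computing an adverse impact ratio (the "four-fifths rule") across applicant groups from model-driven credit decisions. The column names, sample data, and 0.8 threshold are hypothetical illustrations, not FCA requirements; any real fair lending analysis should be designed with compliance and legal staff.

```python
import pandas as pd

# Hypothetical example: model-driven credit decisions with an applicant
# group attribute. Column names and sample values are illustrative only.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   1],
})

# Approval rate for each group
rates = decisions.groupby("group")["approved"].mean()

# Adverse impact ratio: lowest group approval rate divided by the highest.
# A ratio below 0.8 (the "four-fifths rule") is a common flag for further
# review, not a legal determination.
air = rates.min() / rates.max()
print(f"Approval rates by group:\n{rates}\n")
print(f"Adverse impact ratio: {air:.2f}")
if air < 0.8:
    print("Flag: ratio below 0.8 -- review the model for potential bias.")
```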
In October 2023, FCA issued a final rule on cyber risk management, which has particular relevance to AI. This rule, which takes effect on Jan. 1, 2025, amends and expands portions of the following regulatory sections:
- Regulation 609.905, which covers general risk management practices for cyber risk
- Regulation 609.930, which covers cyber risk management in detail
- Regulation 609.935, which covers business planning
- Regulation 609.945, which covers records retention by System institutions
Examination manual
Before adopting technologies that rely on artificial intelligence, System institutions should review the following sections of the FCA examination manual:
- 31.7 Information Technology & Security (PDF), which provides guidance on evaluating the effectiveness of IT and security processes to determine whether sufficient internal controls are in place to support critical business functions and protect information assets (including any processes that utilize AI)
- 31.1 Direction & Control of Operations:
- Model Risk Management (PDF), which provides guidance on evaluating the overall model risk framework, including any models that use AI (see the monitoring sketch after this list)
- Third-Party Risk Management (PDF), which provides guidance on the overall processes and controls for managing third-party risk and relationships, including relationships with third-party vendors that provide AI resources
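As one illustration of the ongoing-monitoring element of a model risk framework, the sketch below computes a population stability index (PSI), a widely used statistic for detecting drift between a model's development-time score distribution and its current production distribution. The function, sample data, and the informal 0.25 rule of thumb are hypothetical illustrations, not FCA guidance.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a baseline (development) score distribution and a
    current (production) distribution. Higher values mean more drift."""
    # Decile edges taken from the baseline distribution
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    # Clip so out-of-range production scores land in the end bins
    actual = np.clip(actual, edges[0], edges[-1])
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Floor the proportions to avoid log(0)
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(650, 50, 5000)  # hypothetical development scores
current = rng.normal(635, 55, 5000)   # hypothetical production scores

psi = population_stability_index(baseline, current)
print(f"PSI: {psi:.3f}")  # an informal rule of thumb treats PSI > 0.25 as significant drift
```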
External sources of guidance
System institutions may find these external sources helpful when considering AI-related risks:
- Federal Financial Institutions Examination Council IT WorkPrograms
- National Institute of Standards and Technology Trustworthy & Responsible Artificial Intelligence Resource Center
For more information
If you work for a System institution and have questions about AI and our expectations for your approach to AI-based systems, please contact your examiner-in-charge.