President Biden issued an executive order on 30 October 2023 titled "Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence" that addresses the governance of AI technologies across a range of sectors, including financial services. The directive focuses on both fostering AI innovation and managing the associated risks. The order was issued just prior to the AI Safety Summit in the UK and before the final text of the EU AI Act was released. It is important to understand how this order may shape AI governance in financial institutions.
AI Safety and Security
The executive order emphasizes the need for AI systems within financial institutions
to be safe and secure. It mandates comprehensive evaluations of AI systems to
enhance their reliability in processing financial transactions and managing data. This
requirement is essential for protecting financial assets and consumer data.
Regulatory Oversight and Innovation
The directive encourages financial regulators such as the Federal Reserve, FDIC,
OCC, and CFPB to use their authority to protect consumers from AI-related threats
and to ensure financial stability. It suggests that rulemaking, and clarification of how
existing regulations apply to AI, may be needed, identifying risks such as fraud,
discrimination, and threats to privacy.
Third-Party AI Services and Due Diligence
The order advises financial institutions to conduct rigorous due diligence and
ongoing monitoring of third-party AI services. This recommendation aims to mitigate
risks associated with the use of external AI applications and services, while aligning
with consumer protection laws and ethical standards.
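As an illustration of what ongoing monitoring of a third-party model might look like in practice, the sketch below computes a population stability index (PSI) between a vendor model's baseline score distribution and current production scores. The data, bin count, and 0.25 review threshold are hypothetical and would depend on an institution's own model risk management standards.

```python
# Hypothetical sketch: drift monitoring for a third-party scoring service using
# the population stability index (PSI). Baseline scores are captured at vendor
# onboarding; current scores come from recent production traffic.
import numpy as np

def population_stability_index(expected: np.ndarray,
                               actual: np.ndarray,
                               bins: int = 10) -> float:
    """PSI between a baseline and a current sample of model scores."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip to avoid division by zero and log(0) in sparsely populated bins.
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(1)
baseline = rng.beta(2, 5, size=10_000)   # scores recorded at onboarding
current = rng.beta(2.5, 5, size=10_000)  # scores observed this month
psi = population_stability_index(baseline, current)
print(f"PSI = {psi:.3f} (values above roughly 0.25 commonly trigger a review)")
```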
Bias in Automated Systems
The order specifically calls attention to potential biases in automated underwriting
models and collateral valuation systems within the housing finance sector. It
suggests that entities regulated by the Federal Housing Finance Agency and the
Consumer Financial Protection Bureau should assess these systems for biases
affecting protected groups.
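A bias assessment of an automated underwriting model could start with something as simple as comparing approval rates across protected-class groups. The sketch below, using hypothetical column names and example data, computes each group's approval rate relative to the most favored group and flags ratios below the common four-fifths heuristic; a real assessment would go well beyond this single metric.

```python
# Minimal sketch of a disparate-impact check for an automated underwriting model.
# Assumes a DataFrame with hypothetical columns "group" (the applicant's
# protected-class category) and "approved" (the model's binary decision);
# the 0.8 threshold reflects the common "four-fifths rule" heuristic.
import pandas as pd

def disparate_impact_ratios(decisions: pd.DataFrame,
                            group_col: str = "group",
                            outcome_col: str = "approved") -> pd.Series:
    """Return each group's approval rate divided by the highest group's rate."""
    rates = decisions.groupby(group_col)[outcome_col].mean()
    return rates / rates.max()

# Example: flag any group whose ratio falls below the four-fifths threshold.
df = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B", "A"],
    "approved": [1, 1, 0, 1, 0, 1],
})
ratios = disparate_impact_ratios(df)
flagged = ratios[ratios < 0.8]
print(ratios)
print("Groups below the 0.8 threshold:", list(flagged.index))
```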
Tenant Screening and Housing/Credit Advertising
The directive tasks the Department of Housing and Urban Development and the
Consumer Financial Protection Bureau with issuing guidance on the use of tenant
screening systems and on the use of AI in housing and credit advertising. The aim is
to clarify how federal laws apply to algorithms used in these areas.
Consumer Protection and Oversight
The executive order implies an increased need for oversight of AI tools in financial
services. It highlights the importance of ensuring AI systems' compliance with
consumer protection laws and managing emerging risks to prevent discrimination.
This includes a focus on the explainability and transparency of AI systems.
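Explainability requirements are often operationalized as per-applicant "reason codes" that identify the factors most responsible for an adverse decision. The sketch below assumes a simple logistic-regression scoring model with hypothetical feature names and synthetic data; for non-linear models, post-hoc attribution tools such as SHAP are a common substitute.

```python
# Hypothetical sketch: deriving adverse-action "reason codes" from a linear
# credit-scoring model by ranking each feature's contribution to the score.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["debt_to_income", "utilization", "months_on_file"]

# Synthetic training data standing in for historical applications.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X @ np.array([-1.5, -1.0, 0.8]) + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

def reason_codes(applicant: np.ndarray, top_n: int = 2) -> list[str]:
    """Rank features by how strongly they pushed the score toward denial."""
    # For a linear model, coefficient * (value - population mean) approximates
    # each feature's contribution relative to a typical applicant.
    contributions = model.coef_[0] * (applicant - X.mean(axis=0))
    order = np.argsort(contributions)  # most negative (most adverse) first
    return [feature_names[i] for i in order[:top_n]]

applicant = X[0]
print("Approval probability:", model.predict_proba(applicant.reshape(1, -1))[0, 1])
print("Top adverse factors:", reason_codes(applicant))
```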
Responsible AI Development
The order suggests that responsible AI development is key to maintaining consumer
trust and financial stability. It advocates for AI systems in the financial sector to be
transparent, fair, and equitable, aligning with ethical practices.
Considerations for Financial Institutions
Financial institutions are encouraged to undertake a series of actions in response to
the executive order. These include conducting AI risk assessments, enhancing AI
transparency, investing in AI literacy, engaging with regulators, updating data
governance policies, adopting privacy-enhancing technologies, forming AI development partnerships, and staying prepared for changes in the regulatory landscape.
In summary, the executive order reflects a significant movement towards structured
AI governance in the financial sector. It promotes the adoption of enhanced safety,
security, and ethical standards in AI, along with regulatory oversight, to balance AI
benefits with potential risks.