
EU institutions explore AI in financial services

As our mid-year legal outlook shows, the legal impact of AI adoption is not limited to the tech sector. Financial services firms are investing heavily in the technology, but as they roll out AI they need to do so in a way that meets the standards imposed on them by financial regulation.* Recent papers from EU institutions shed some light on the areas that firms should focus on.

AI through the lens of MiFID II

The European Securities and Markets Authority has released a statement on the use of artificial intelligence in the provision of retail investment services. ESMA hopes the statement will remind banks and investment firms using AI about their key obligations under MiFID II.

The guidance tells firms to:

  • Focus on clients: ESMA stresses that firms should have an “unwavering commitment” to act in clients’ best interests.
  • Be transparent: Firms should give clear, fair and not misleading information about how they use AI to provide investment services.
  • Have bespoke controls: ESMA says that it is crucial to have effective risk management frameworks specific to AI implementation. Controls should assess both AI inputs (e.g. quality of data) and outputs (e.g. accuracy of information).
  • Be dynamic: Firms should tailor risk management processes and procedures to address the unique challenges posed by AI. These processes, such as testing and monitoring, should be dynamic to allow for timely changes to be made.
  • Look beyond approved tools: Firms should consider their controls around how employees use AI systems both with and without the approval of senior managers.
  • Think about third party risks: ESMA reminds firms to do their due diligence when outsourcing critical and important operational functions to third parties.
  • Keep records: These should include records of decisions made during AI deployments and records of client complaints.
  • Upskill the board…: The management body should understand how AI is used in the firm and foster a culture of risk ownership, transparency and accountability.
  • …and employees: Staff training should cover AI risks, ethics and regulation.

ESMA also notes that firms should consider other requirements under the EU rulebook such as the Digital Operational Resilience Act (DORA) and the AI Act.

Taking the pulse of financial services on AI 

Businesses from all sectors are starting to ramp up preparations to comply with the 113 articles, 13 annexes and 180 recitals of the EU’s AI Act. Touted as the world’s first AI law, the AI Act regulates the development and use of AI in the EU and introduces significant new obligations for high-risk systems.

The European Commission has launched a consultation to better understand the practical impact of AI – and AI regulation – in financial services. The consultation takes the form of a survey covering general questions on how AI is used, specific questions about particular use cases, and additional questions on the AI Act as it applies to the financial sector.

The consultation closes on 13 September 2024. The Commission then intends to publish a report on its findings. Things to look out for include:

  • Analysis on the governance and risk management measures that firms have put in place around AI
  • How firms respond to the questions on general purpose AI (including generative AI aka genAI)
  • How the responses differ for different sub-sectors (e.g. comparing AI adoption between banks and asset managers)

This survey presages future guidance to clarify the supervisory expectations for the use of AI by the financial services sector.

Looking ahead

As AI becomes mainstream, the financial sector is deepening its reliance on the technology. For example, we have recently written about how payments firms are using AI for more than just tackling fraud and how banks can use genAI as a productivity tool across their business. In any case, as regulated firms explore novel areas in which to deploy AI and genAI, they need to pay close attention not only to incoming AI-specific rules but also to the existing financial services framework.

 

* See, for example, this FT article, which explains how European banks and fintechs are embracing AI but notes that regulatory concerns may slow deployment at scale.

Banks also had to be careful to roll out the nascent technology while adhering to the industry's strict compliance rules and navigating an uncharted regulatory environment.


Tags

artificial intelligence, ai, mifid, eu, banking, fintech, funds, insurance, payments, securities, operational resilience