EU financial services firms must consider how their adoption of AI relates to both the EU AI Act and the existing financial regulatory framework. Recent updates from supervisors and the European Parliament on how firms are using AI highlight the extent of this overlap.
ECB: AI’s impact on banking
The European Central Bank has published a supervision newsletter on AI’s impact on banking. This follows workshops it ran with 13 banks in 2025, focusing on use cases for credit scoring and fraud detection.
The ECB’s findings include:
Changes to risk management: Banks have had to update their governance and compliance frameworks. About half of the banks in the sample have introduced dedicated policies or committees to oversee AI.
Accountability gap: More work is needed to ensure that the second and third lines of defence adequately oversee the use of AI. Some banks have appointed dedicated senior roles in the first line, such as a Chief AI Officer, sometimes held as a dual role by the Chief Data Officer.
Third party exposure: As reliance on external providers grows, so too do data privacy and operational resilience risks. The ECB is interested in potential deficiencies in operational resilience frameworks regarding cybersecurity and third-party risk management.
Data governance yet to adapt: The ECB found only a few examples of firms effectively applying data management standards and adapting them to the specific requirements of AI models. The ECB plans to follow up on data quality in the context of its work on risk data aggregation and risk reporting.
Preparing for the AI Act: Banks are conducting compliance self-assessments and system inventories, and implementing processes for new AI use cases.
Looking ahead: The ECB will continue monitoring how banks are using AI, with a focus on governance and risk management.
EBA: Mapping the AI Act
The European Banking Authority has mapped the AI Act against existing rules for banks and payments firms. In a letter to the European Commission, the EBA describes how EU financial services law already addresses key AI Act requirements.
The EBA focused on the use of AI for creditworthiness assessment and credit scoring, which are high-risk AI systems under the Act. For example, the Act requires deployers of high-risk AI systems to monitor operations, keep logs and report incidents. The EBA found that these complement existing requirements to identify, manage, monitor and report risks under CRD, keep records under CRR and PSD2, and report major ICT-related incidents under DORA.
A factsheet summarises the EBA’s view that:
the AI Act does not contradict EU banking and payment law,
firms may need to integrate AI Act compliance into existing frameworks, and
the EBA does not need to introduce new guidelines or review existing ones in response to the AI Act.
Parliament: Impact of AI on the financial sector
The European Parliament has adopted a resolution on AI in financial services. This follows a report from ECON (see our previous blog post: Parliamentary committee warns of overlap between EU AI rules and financial regulation).
The resolution notes the opportunities AI presents, including in fraud detection and transaction monitoring. It also flags risks arising from data bias, model opacity and over-reliance on a small number of providers.
The Parliament asks the Commission and supervisors to issue guidance rather than produce new rules. It also urges regulators to cooperate more closely on this topic, for example by aligning interpretations and improving cross-border coordination.