Yesterday, on February 27, 2025, the European Court of Justice (ECJ) issued a landmark ruling on the transparency of automated credit ratings in case C-203/22 “Dun & Bradstreet Austria”. The ruling strengthens the rights of data subjects and obliges companies to disclose their assessment procedures. The ECJ’s decision and its reasoning have direct implications for companies that use automated decision-making (ADM), including AI applications.
Background
The plaintiff, an Austrian consumer, wanted to conclude a mobile phone contract but was rejected by the provider due to a negative credit rating. This rating was based on an automated analysis by Dun & Bradstreet Austria, without the plaintiff receiving any further information on the underlying criteria or the data used. She then sued for information about the rating method and the data used. Ultimately, the ECJ had to clarify whether such an automated credit rating assessment satisfies the requirements of the General Data Protection Regulation (GDPR).
The ECJ ruled that there was a violation of the GDPR because the company had not sufficiently fulfilled its transparency obligations. I have already reported in detail on the prohibition of automated decisions under Art. 22 GDPR in connection with HR tools, also in light of the Schufa ruling from 2023. Such automated decision-making is only permissible under data protection law in the exceptional cases listed in Art. 22(2)(a)–(c) GDPR: where the decision is necessary for entering into, or performance of, a contract; where it is authorised by specific legal provisions; or where the data subject has given explicit consent.
Prerequisite for fair and transparent data processing
However, even where such a legal basis exists, it remains necessary to inform data subjects sufficiently about what the company does with which of their personal data, how, and when, see Art. 12 GDPR: “The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, …” This concerns both the information that must be provided before data collection and processing begin (Articles 13 and 14) and the information that the controller must provide at the request of the data subject, as in the present case (Article 15). The scoring company Dun & Bradstreet fell far short of this requirement: the plaintiff received none of the information required under Articles 13 and 14 GDPR when the data was collected, and the information provided under Article 15 was insufficient.
Articles 13 to 15 of the GDPR list in detail what information must be provided. According to Art. 13(2), this includes information that is “necessary to ensure fair and transparent processing,” which, with regard to ADM under Art. 22, is specified in Art. 13(2)(f) as “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” (see also Art. 15(1)(h) GDPR). The ECJ has now clarified what this means and thus provided clear guidelines for companies that use or want to use ADM.
The ECJ’s transparency requirements for ADM
The controller must describe the procedure and the principles that are specifically applied in such a way that the data subject can understand which of their personal data has been used in the context of automated decision-making and in what way. According to the ECJ, this means, within the meaning of Art. 15(1)(h) GDPR, all information “which is relevant for the procedure and the principles of automated processing of personal data to achieve a specific result based on this data.”
In the present case, that means the personal data processed in order to form a “factor,” such as date of birth, address, gender, etc. How was the automated assessment arrived at in the individual case? Which data sources and calculation methods were used? Here, this concerns the mathematical formula used to calculate the “score” in question, but also the specific intervals within which different data are assigned the same value for the same factor. In the previous instance, experts had also requested a list of the “scores” of other individuals that had been calculated in the six months before and after the calculation of the plaintiff’s “score” using the same calculation rule.
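To make the terminology more concrete, here is a deliberately simplified sketch in Python of what interval-based factors and a scoring formula can look like. All factors, intervals, weights and the threshold are invented for illustration and have nothing to do with Dun & Bradstreet’s actual method.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    age: int
    annual_income: float      # EUR per year
    years_at_employer: float

def age_factor(age: int) -> int:
    # Interval logic: every value inside the same band receives the same factor value.
    if age < 25:
        return 40
    if age < 40:
        return 70
    if age < 65:
        return 90
    return 60

def income_factor(income: float) -> int:
    if income < 20_000:
        return 30
    if income < 40_000:
        return 60
    return 90

def employment_factor(years: float) -> int:
    if years < 1:
        return 20
    if years < 5:
        return 60
    return 90

# The "mathematical formula": a weighted sum of the individual factor values.
WEIGHTS = {"age": 0.2, "income": 0.5, "employment": 0.3}
APPROVAL_THRESHOLD = 70

def credit_score(a: Applicant) -> float:
    return (WEIGHTS["age"] * age_factor(a.age)
            + WEIGHTS["income"] * income_factor(a.annual_income)
            + WEIGHTS["employment"] * employment_factor(a.years_at_employer))

applicant = Applicant(age=30, annual_income=25_000, years_at_employer=2)
score = credit_score(applicant)
print(f"Score: {score:.0f} -> {'approved' if score >= APPROVAL_THRESHOLD else 'rejected'}")
```

In this toy model, the “intervals” the ruling mentions correspond to the bands inside each factor function, and the “mathematical formula” to the weighted sum of the factor values.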
Ultimately, the information or explanation must be such that it enables the data subject to understand the automated decision and to challenge it. According to recital 58 of the GDPR, this includes ensuring that the information is concise, easily accessible, and understandable, and that it is provided in clear and plain language, using visual elements where appropriate. The ECJ speaks here of a right to an explanation of the logic involved in the automated processing of the data subject’s personal data and of the result that processing produced, such as a credit rating.
In doing so, the ECJ rejected certain arguments put forward by Dun & Bradstreet and made statements that go far beyond the present case:
Transmission of the algorithm is not sufficient
Automated decision-making processes based on complex machine learning algorithms (e.g., neural networks) are difficult for consumers and even companies to understand—the black box problem. The ECJ states: “Neither the mere transmission of a complex mathematical formula (such as an algorithm) nor the detailed description of each step of an automated decision-making process satisfies these requirements, as neither constitutes a sufficiently precise and comprehensible explanation.” Meaningful information about the logic is required, not a detailed explanation of the algorithm or even its disclosure. In my opinion, it is obvious that this would regularly be gibberish to the layperson. And the GDPR does not require this anyway.
Secrecy does not protect against the obligation to provide information
According to the ECJ, a controller cannot refuse, or severely restrict, the information provided to data subjects simply by invoking the fact that the explanation or information concerned would reveal its own trade secrets or protected data of third parties. If such an argument is put forward, the allegedly protected information and data must be transmitted to the competent supervisory authority or the competent court, which will then weigh up the conflicting rights and interests in order to determine the scope of the data subject’s right to information.
Counterfactuals as a possible solution!
The court also makes it clear that, for a transparent and comprehensible explanation, it may be sufficient to provide information on the extent to which a deviation in the personal data taken into account would have led to a different result. This is a very important and forward-looking statement that companies in the field of ADM should take to heart!
This refers to so-called counterfactual explanations, i.e., hypothetical scenarios that show a data subject what changes would need to be made to the input data in order to obtain a different decision. This means that not only is the result of an automated decision disclosed, but also which factors influenced the decision and how they would need to change in order to achieve a different result.
Such a counterfactual explanation could look like this:
“Your application was rejected due to a low credit score of 540. If your annual income had been €5,000 higher or if you had been able to prove that you had been employed in your current job for two years longer, you would have been approved.”
Counterfactuals provide a clear, understandable, and interpretable explanation that helps to understand the decision-making process and the reasons behind a decision. The person concerned not only learns that they have been rejected, but also why, and what they could change. Counterfactuals can thus provide exactly the kind of meaningful information about the logic involved that the GDPR requires, by showing which data was relevant. The person affected can identify errors or inaccurate or unfair criteria (e.g., incorrect income data) and request a correction or a manual review, or lodge an appeal, i.e., assert their rights.
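Continuing the toy scoring model sketched above (all names, values and thresholds remain purely hypothetical), a counterfactual explanation for a rejected applicant could, in the simplest case, be generated by a brute-force search over single-feature changes:

```python
# Builds on the toy model above (Applicant, credit_score, APPROVAL_THRESHOLD);
# everything here is illustrative, not a statement about any real scoring system.
from dataclasses import replace

def find_counterfactual(applicant):
    """Return a human-readable counterfactual for a rejected applicant, or None."""
    if credit_score(applicant) >= APPROVAL_THRESHOLD:
        return None  # already approved, nothing to explain

    # Candidate single-feature changes, from smaller to larger adjustments.
    candidates = (
        [(replace(applicant, annual_income=applicant.annual_income + d),
          f"if your annual income had been EUR {d:,} higher")
         for d in (5_000, 10_000, 20_000)]
        + [(replace(applicant, years_at_employer=applicant.years_at_employer + d),
            f"if you had been with your current employer {d} years longer")
           for d in (2, 5)]
    )

    for changed, description in candidates:
        if credit_score(changed) >= APPROVAL_THRESHOLD:
            return (f"Your application was rejected (score {credit_score(applicant):.0f}). "
                    f"It would have been approved {description}.")
    return None  # no single-feature change in the search space flips the decision

print(find_counterfactual(Applicant(age=30, annual_income=25_000, years_at_employer=2)))
```

A naive search like this returns the first change that flips the decision, not necessarily the smallest or most plausible one; for more complex models, dedicated counterfactual methods and careful validation are needed.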
Counterfactuals can also alleviate the black box problem by making decisions more tangible and allowing certain patterns to be identified and corrected more easily. For example, they can reveal potential discrimination, such as people from certain neighborhoods systematically receiving lower credit ratings. However, the truth is that with highly complex AI-based scoring models, the creation of comprehensible counterfactuals is technically challenging in itself.
First Reactions
The Austrian Data Protection Authority (Datenschutzbehörde, DSB) has closely followed the proceedings and emphasized in statements that transparency in automated decision-making processes is essential for the protection of personal data. It welcomed the ECJ’s clarification that data subjects have a right to comprehensive information about how such systems work.
At the European legislative level, the ruling was seen as confirmation of the rights enshrined in the GDPR. Members of the European Parliament stated that the ruling underscores the need to strictly apply existing data protection laws and, where necessary, adapt them to meet the challenges posed by advancing automation.
Consumer protection organizations in Germany and Austria welcomed the ruling as an important step toward strengthening consumer rights. They called on companies to make their credit assessment procedures more transparent and to ensure that consumers can understand how decisions about their creditworthiness are made.
The business community was divided: while some companies see the decision as an opportunity to strengthen customer trust through increased transparency, others expressed concerns about the protection of trade secrets and the practical implementation of the required disclosures.
Overall, the ECJ ruling is expected to have a long-term impact on the practice of automated data processing in Europe.
Conclusion
All processing of personal data must have an effective legal basis, in particular from the catalog of Art. 6 GDPR. For ADM, the narrow limits of Art. 22 must also be observed. However, this alone is not enough: for processing to be fair and transparent, and thus lawful, the data subject must also be informed in such a way that they actually understand the facts and circumstances, the context, and the implications. Highly complex technical functions and abstract formulas must be explained to laypersons in a way they can understand. The ECJ has now issued groundbreaking guidelines on what this must look like and how it can work. In my opinion, counterfactuals are an effective means of complying with the transparency obligations as far as possible. The prerequisite is, of course, that those who use ADM have a minimum level of knowledge and understanding of the system they are using or offering. Unfortunately, there is no way around this effort.