Data Protection & Privacy 2024

Last Updated February 13, 2024

Belgium

Law and Practice

Osborne Clarke is an international legal practice with over 330 partners and more than 1,300 lawyers in 26 locations. In Brussels, the firm’s data, IP and IT experts work together as a team to support high-profile Belgian and international clients on complex regulatory matters, including the implementation of the Digital Services Act, the Digital Markets Act and the Digital Operational Resilience Act (DORA). Osborne Clarke has a strong international client base in a range of industry sectors, including life sciences, retail and financial services (in particular fintech), as well as specialist technology clients and companies in the digital sector. The team advises on data privacy matters at every level, from communicating with the Belgian Data Protection Authority, drafting data protection policies and carrying out data protection audits, to assisting clients in disputes before the Belgian Data Protection Authority.

Article 22 of the Belgian Constitution provides for the right to protection of private and family life, and forms the cornerstone of the Belgian-specific laws governing or impacting privacy in general. In addition, Article 8 of the European Convention on Human Rights has direct effect in Belgium and is a further cornerstone of privacy protection in the Belgian legal order.

However, from the point of view of digital technologies and innovation, the most important instrument in Belgium for businesses is the General Data Protection Regulation (Regulation (EU) 2016/679), or “GDPR”, which applies directly in all member states of the European Union. Alongside the European legislation, the Belgian Law of 30 July 2018 on the protection of natural persons with regard to the processing of personal data also applies. This Belgian legislation adapts a number of principles enshrined in the GDPR to the activities of specific state and public bodies. In respect of businesses, it does not add much to, or deviate much from, the standard rules laid down by the GDPR.

Belgium established its supervisory authorities through the Law of 3 December 2017, as required by the GDPR. The main supervisory authority is vested with investigative and corrective powers and is entitled to fine a controller or processor that does not comply with the GDPR or the Belgian Law of 30 July 2018. The fines listed in Article 83 of the GDPR may not, however, be imposed on public authorities and their appointees or agents, unless the entity is a legal person governed by public law offering goods or services on a market (Article 221, §2 of the Belgian Law of 30 July 2018).

In addition to the GDPR and the Belgian Law of 30 July 2018, other laws have been enacted to respect privacy and fundamental rights in different fields, such as electronic communications, electronic commerce, direct marketing, use of CCTV, etc.

At present, no specific legal regime has been enacted with respect to artificial intelligence (AI).

The Belgian Data Protection Authority (DPA) consists of:

  • an executive committee;
  • a general secretariat;
  • a first-line service;
  • an authorisation and opinion service;
  • an inspection service; and
  • a litigation chamber.

The DPA has the right to conduct audits. 

Furthermore, investigations may be launched on the DPA’s own initiative or following a complaint lodged by a data subject, or by a body, organisation or association that has been properly constituted in accordance with the law of an EU member state, has statutory objectives in the public interest, and is active in the field of protecting data subjects’ rights and freedoms.

Alongside the DPA, different regulators and public authorities have a role to play in respect of data sharing, open data and the national implementation of the EU data spaces strategy.

With respect to AI, it is still unclear whether the DPA will be vested with regulatory powers and, if so, to what extent. That being said, there is little doubt that the DPA will exercise its powers in relation to automated decision-making, and the impact of AI projects on fundamental rights, as often as it can.

The DPA must comply with the GDPR and the Belgian Law of 30 July 2018. When a complaint is filed or an investigation is launched, there will usually be an initial fact-finding phase in which the authority asks the business concerned to provide factual information. Afterwards, proceedings on the merits can be brought before the Litigation Chamber of the DPA, in which the parties can submit their respective arguments in writing and may be heard.

After the proceedings, the Litigation Chamber is entitled to:

  • dismiss the complaint;
  • order the dismissal of the prosecution;
  • order the stay of proceedings;
  • propose a settlement;
  • issue warnings and reprimands;
  • order compliance with the requests brought by the data subject relating to the exercise of their rights;
  • impose periodic penalty payments; or
  • impose administrative fines.

In the event that the DPA imposes an administrative fine, such fine must be effective, proportionate and dissuasive, pursuant to Article 83 of the GDPR. Furthermore, specific circumstances must be taken into account when imposing an administrative fine and deciding on its amount.

If the respondent does not agree with the decision handed down by the Litigation Chamber, it may lodge an appeal before the Market Court (a section of the Brussels Court of Appeal) within 30 days of notification of the decision. The Market Court can overturn the decision in whole or in part and remand the case, or rule on all grounds itself and substitute its own decision.

As mentioned in 1.1 Laws, the GDPR applies in all EU member states, including Belgium. As a regulation, it is directly applicable and no implementing act is needed. To ensure that the GDPR is applied consistently across the EU and that the supervisory authorities of the member states co-operate with each other, the European Data Protection Board (“the Board”) has been established.

The Board consists of the head of each member state’s supervisory authority, including the head of the DPA and the European Data Protection Supervisor, or their respective representatives.

Furthermore, the European directive on privacy and electronic communications of 12 July 2002, also known as the ePrivacy Directive (Directive 2002/58/EC), has been transposed into Belgian legislation, including through the Law of 30 July 2018 on the protection of natural persons with regard to the processing of personal data. As Belgium is a federal state, some powers are assigned to the communities and regions; this has no impact, however, on the data protection legislation, which applies throughout the entire territory.

There are currently no major privacy or data protection (including AI) non-governmental organisations (NGOs) or industry self-regulatory organisations (SROs).

As one of the first comprehensive data protection frameworks of its kind, the GDPR is also one of the most developed.

The GDPR applies across the EU and is directly applicable at national level, which means that the main data protection principles enshrined in the GDPR apply in all EU member states. The GDPR leaves member states little room for national regulation, although they may, for example, deviate in the following cases.

  • The minimum age at which a child’s consent is lawful is 16 years; however, member states may lower this age to a minimum of 13 years. Belgium has availed itself of this possibility and set the age at 13.
  • Member states may further specify the conditions for processing a national identification number or any other identifier of a general nature. In Belgium, the processing of a national identification number is in principle forbidden and is permitted only for a limited range of public bodies and organisations, under the conditions laid down by the Law of 8 August 1983 on organising a national register of natural persons. Narrow use is authorised in the context of electronic signatures or online authentication processes.
  • Member states may maintain or introduce further conditions, including limitations, with regard to the processing of genetic data, biometric data or health-related data. Belgium has imposed specific security measures with respect to the processing of special categories of data.

There were several new developments in Europe in 2023.

Adequacy Decision on the EU-US Data Privacy Framework

A new adequacy decision for the EU–US Data Privacy Framework (“Privacy Framework”) was adopted by the European Commission on 10 July 2023, allowing personal data transfers from the EU to the US. Personal data can now safely be transferred from the EU to US companies that participate in the Privacy Framework. The Privacy Framework introduces new binding safeguards intended to address the concerns raised by the European Court of Justice in its Schrems II judgment.

Decision of the DPA of 24 May 2023

The DPA also handed down some key decisions. In a decision dated 24 May 2023, the Litigation Chamber ruled that the processing of the personal data of a complainant was unlawful, including data transfers based on the Foreign Account Tax Compliance Act (FATCA) Agreement, and that the international exchange of information for tax purposes was forbidden. The Market Court overturned this decision in its judgment of 20 December 2023, ruling that such transfers are not forbidden. In light of the foregoing, it appears that the bodies involved do not shy away from departing from earlier decisions where they consider those decisions to be manifestly ill-founded.

Decision of the European Court of Justice

In the course of 2023, the European Court of Justice (ECJ) continued to issue landmark rulings that clarify or contextualise several key rules and principles of the GDPR. It is not possible to provide a systematic and exhaustive report on this body of case law, but it is worth highlighting, in particular, the rights of data subjects and the possibility of claiming compensation for infringements of the GDPR.

Data subject’s rights

In several decisions, the ECJ clarified the scope and extent of the right of access. Firstly, the right of access entails not only the right to be informed about the processing operations, but also the right to receive a copy of the personal data undergoing processing. This means that, as a rule, the data subject also has the right to obtain a copy or an extract of documents, or even entire documents or extracts from databases, when this appears essential to enable data subjects to exercise their rights effectively. The intention or the reasons underlying the request to access such documents do not matter and cannot be invoked to dismiss the data subject’s access request (ECJ, 4 May 2023, C-487/21; 26 October 2023, C-307/22). Secondly, the data subject must also be informed about the specific recipients to whom the data has been made available or disclosed, as well as the consultation operations carried out by employees on behalf of their employer, and employers must maintain an up-to-date overview of all data recipients (ECJ, 12 January 2023, C-154/21; 22 June 2023, C-579/21).

Damages and compensation

Data subjects have the right to claim compensation for material and non-material damages in the event of an infringement of the provisions of the GDPR. Through a series of cases in 2023, the ECJ clarified the nature of this right and delineated the respective powers of the EU and its member states to regulate the details of compensation claims and awards.

Firstly, it is now clear that material and non-material damages are concepts that must receive a uniform and harmonised interpretation. Member states cannot define a “de minimis” threshold, but they are free to define the extent of financial compensation by setting rules on evidence, procedure, etc. In any case, however, the compensation must be effective and essentially equivalent to what other member states offer; in other words, member states have a limited margin in that respect.

Secondly, data subjects must demonstrate the existence of an infringement, and also show that they have suffered some harm. Said harm can be purely moral and may even result from the mere fear of being exposed, but it must differ from the mere infringement of the GDPR provisions. Subject to that requirement, the infringing entity is deemed to have breached its duty of care, unless it can demonstrate that the damage is not attributable to it; in other words, that there is no causal link between the infringement and the harm suffered.

It is also clear that damages cannot be punitive or have a deterrent function: the severity of the infringement will not lead to higher damages (ECJ, 4 May 2023, C-300/21; 14 December 2023, C-456/22; 14 December 2023, C-340/21; 21 December 2023, C-667/21). Interestingly, this line of cases is broadly aligned with the general views on compensation claims for GDPR infringements under Belgian law.

In 2024, several hot topics are likely to keep the data protection community busy.

New Data Protection Regulator

A new commissioner or commissioners will be appointed to the Irish Data Protection Commission. Given Ireland’s position as the lead data protection regulator for numerous EU-based (tech) companies, it is crucial to stay informed about new potential approaches or priorities that may emerge. Consequently, it will also be important to monitor whether the Belgian authority is likely to adopt any new corresponding approaches or priorities in response.

Adequacy Decision on the EU–US Data Privacy Framework

As mentioned in 1.7 Key Developments, a new adequacy decision has been approved for the transfer of personal data outside the EU to US companies that are part of the Privacy Framework. It is still unclear whether the EU–US Data Privacy Framework will stand. A number of complaints have been lodged to challenge this adequacy decision as adopted by the European Commission, but it remains to be seen whether these complaints will succeed.

A first review of the framework is scheduled for July 2024.

In the US, the two-tier redress mechanism of the framework is operational, while European Data Protection Authorities are gearing up in 2024 for the implementation of the live complaints mechanism and are actively promoting awareness among individuals.

Artificial Intelligence Act

At the end of 2023, the European Parliament and the Council reached a political agreement on the Artificial Intelligence Act, which lays down comprehensive rules for trustworthy AI, aiming to ensure that AI in Europe is safe and respects fundamental rights and democracy, while allowing businesses to thrive and expand.

Since many AI applications process personal data (for example, as part of the data sets used to train machine-learning systems, or when such models are applied), this new legislation needs to be taken into account in combination with the legal framework that applies to personal data processing.

As previously mentioned, the most relevant data protection legislation in Belgium comprises the GDPR and the Belgian Law of 30 July 2018. In addition to these rules, other sector-specific laws may apply, depending on the sector.

The GDPR will, in principle, apply where the following conditions are met:

  • personal data is processed wholly or partly by automated means, or forms (or is intended to form) part of a filing system (Article 2 of the GDPR); and
  • the controller or processor has an establishment in the EU, or the processing relates to the offering of goods or services to, or the monitoring of the behaviour of, data subjects in the EU (Article 3 of the GDPR).

Data Protection Officers

According to Article 37 of the GDPR, a data protection officer (DPO) must be appointed when:

  • data processing is carried out by a public authority or body, with the exception of courts acting in their judicial capacity;
  • the core activities of the controller or the processor consist of processing operations which require regular and systematic monitoring of data subjects on a large scale; or
  • the core activities of the controller or the processor consist of processing special categories of data, or data relating to criminal convictions and offences, on a large scale (as set forth in Articles 9 and 10 of the GDPR).

The contact details of the DPO must be notified to the DPA, which provides an online form for this purpose.

Criteria Necessary to Authorise Collection, Use or Other Processing of Data

In order to process personal data, the data controller must select an appropriate legal basis among those permitted under Article 6 of the GDPR. These include the consent of the data subject, although the GDPR standards are fairly demanding: consent must be freely given, specific, informed and unambiguous.

Other possible legal bases are:

  • compliance with a legal obligation;
  • performance of a contractual obligation;
  • protection of the data subject’s or another natural person’s vital interests;
  • the performance of a task carried out in the public interest or in the exercise of official authority; or
  • the pursuance of the legitimate interests of the data controller or of a third party.

The DPA applies a strict approach to the choice of legal basis and requires controllers to demonstrate in detail not only that they have identified and described their legitimate interest, but also that they have assessed why and how that interest prevails over the data subjects’ fundamental rights.

Privacy by Design or by Default

Both when determining the means of processing and at the time of the processing itself, the controller is required to implement appropriate technical and organisational measures (eg, pseudonymisation) designed to meet the requirements of the GDPR, including all the data protection principles, and to protect the rights of data subjects (Article 25.1 of the GDPR). In addition, the controller is obliged to implement appropriate technical and organisational measures by default to ensure that the processing relates only to the personal data necessary for each specific purpose (Article 25.2 of the GDPR).

In order to clarify privacy by design and by default, the European Data Protection Board has published a set of guidelines and a list of examples that can be taken into account (Guidelines 4/2019 on Article 25, Data Protection by Design and by Default, Version 2.0, adopted on 20 October 2020).

Impact Assessment

In some instances, the data controller may be required to carry out a data protection impact assessment – for example, where the use of new technologies is likely to result in a high risk to the rights and freedoms of natural persons. Such an impact assessment is always required in the following cases:

  • a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing and on which decisions are based that produce legal effects concerning the natural person, or similarly significantly affect the natural person;
  • processing on a large scale of special categories of data or of personal data relating to criminal convictions and offences; and
  • systematic monitoring of a publicly accessible area on a large scale.

Furthermore, a Belgian list of processing operations requiring an impact assessment has been approved. For instance, such an assessment must be performed when the health data of a data subject is collected by automated means using an active implantable medical device (Decision of the General Secretariat, No 01/2019 of 16 January 2019). If the processing is not included in this list, the controller still needs to take into account the criteria provided in the Article 29 Working Party Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679.

Internal or External Privacy Policies

As the GDPR imposes several information requirements, companies typically implement a privacy policy to ensure that data subjects are aware of the processing of their personal data and of their rights under data protection legislation. Furthermore, such policies make it easier for companies to demonstrate compliance with the accountability principle.

Rights of Data Subjects

When a controller processes the personal data of a data subject, it needs to ensure that the following rights are made available to the data subject:

  • the right of access (Article 15 of the GDPR);
  • the right to rectification (Article 16 of the GDPR);
  • the right to erasure or the right to be forgotten (Article 17 of the GDPR);
  • the right to restriction of the processing (Article 18 of the GDPR);
  • the right to data portability (Article 20 of the GDPR); and
  • the right to object (Article 21 of the GDPR).

In addition to these rights, in the event that the data subject has given consent, they may withdraw that consent at any time (Article 7.3 of the GDPR). Furthermore, the data subject enjoys the right to lodge a complaint with the supervisory authority, which in Belgium is the DPA (Article 77 of the GDPR).

Anonymisation, Pseudonymisation of Data

In the event that data is anonymised, it does not qualify as personal data, meaning that it falls outside the scope of the GDPR. This is not the case with pseudonymised personal data: personal data that has been pseudonymised can still be attributed to a natural person by using additional information, and should therefore still be considered information relating to an identifiable natural person. However, pseudonymisation is an effective tool to mitigate the risks to data subjects and may help data controllers and processors comply with their data protection obligations.
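
By way of illustration only, the following minimal Python sketch shows one common pseudonymisation technique: replacing a direct identifier with a keyed hash. The key, record and field names are invented for this example; the point it illustrates is that anyone holding the key can re-link the pseudonym to the individual, which is why pseudonymised data remains personal data under the GDPR.

import hmac
import hashlib

# Hypothetical secret key; in line with the logic of the GDPR, it should be
# stored separately from the pseudonymised data set.
SECRET_KEY = b"example-key-kept-separately"

def pseudonymise(identifier: str) -> str:
    # Replace a direct identifier with a keyed hash (a pseudonym).
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Invented example record.
record = {"name": "Jan Janssens", "diagnosis": "hypertension"}
pseudonymised_record = {
    "subject_id": pseudonymise(record["name"]),
    "diagnosis": record["diagnosis"],
}

# Whoever holds SECRET_KEY can regenerate the same pseudonym and re-link the
# record to the individual, so the data remains "personal data". Only if
# re-identification is no longer reasonably possible (eg, the key and all
# direct identifiers are irreversibly destroyed) does the data approach
# anonymisation and fall outside the scope of the GDPR.
print(pseudonymised_record)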

Automated Decision-Making, Profiling

In Belgium, automated decision-making, including profiling, is in principle not allowed if the decision is based solely on automated processing and produces legal effects concerning the data subject or similarly significantly affects them. Automated individual decision-making remains allowed in the following three cases:

  • if it is necessary for the conclusion or performance of a contract;
  • if a law allows it (eg, to prevent tax fraud); or
  • if the decision-making is based on explicit consent being provided by the data subject.

Furthermore, profiling resulting in discrimination against natural persons on the basis of the special categories of personal data is also prohibited.

Injury or Harm

See 1.7 Key Developments.

Sensitive Data

Article 9 of the GDPR provides an overview of the categories of data that are considered sensitive, namely:

  • racial or ethnic origin;
  • political opinions;
  • religious or philosophical beliefs, or trade union membership;
  • genetic data;
  • biometric data for the purpose of uniquely identifying a natural person;
  • data concerning health; and
  • data concerning a natural person’s sex life or sexual orientation.

In principle, processing the above-mentioned data is not allowed, unless one of the conditions of Article 9.2 of the GDPR applies.

The GDPR imposes additional obligations with regard to the processing of personal data falling under special categories. For example, the processing of personal data for the purpose of providing healthcare must be carried out under the responsibility of a professional who is bound by professional secrecy or by another person bound by a duty of confidentiality.

EU member states may maintain or introduce further conditions, including limitations, with regard to the processing of genetic data, biometric data or health-related data. Belgium has taken this opportunity to impose additional obligations on controllers when processing such data. As a result, controllers must:

  • designate the (categories of) persons who may consult the sensitive personal data;
  • make this list available to the DPA; and
  • ensure that these persons are obliged to respect the confidential nature of the data processed (eg, by entering into a contract).

The GDPR does not mention children’s sensitive data. However, the regulation provides for increased protection for the processing of children’s personal data (Article 8 of the GDPR).

Electronic Communication Data

Pursuant to Article 95 of the GDPR, the GDPR does not impose additional obligations on natural or legal persons in relation to processing in connection with the provision of publicly available electronic communications services in public communication networks in the EU, in so far as they are subject to specific obligations with the same objective set out in Directive 2002/58/EC (the “ePrivacy Directive”). In principle, the ePrivacy Directive, as transposed into the Belgian Electronic Communications Act and the Law of 30 July 2018, prohibits the processing of communications data without the users’ consent. There are limited exceptions – for instance, for the purposes of evidence in commercial transactions or in the context of call-centre operations.

Employment Data

In principle, the employer is allowed to process the personal data of its employees where this is necessary to fulfil its specific rights and obligations under employment law (contractual necessity, legal necessity, legitimate business interest). The processing of sensitive data is in principle prohibited; it is allowed, however, where permitted by labour law or where the employee has given their consent. In this regard, given the relationship of authority between the parties, consent is often unlikely to be a reliable legal basis, as it may prove difficult for the employer to demonstrate the absence of pressure.

Digital Services Act

As of 17 February 2024, all EU intermediary service providers, marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms must comply with the Digital Services Act (DSA), which applies in all EU member states. The DSA incorporates the intermediary liability rules enshrined in the e-Commerce Directive and imposes additional rules on intermediary service providers, depending on the type of services provided. For example, hosting providers must implement a notice-and-takedown mechanism, and online platforms must implement an internal complaint mechanism and suspend the accounts of users who frequently upload illegal content to their platform. It should be noted that on 31 January 2024, the Belgian government tabled a bill to implement the DSA.

In Belgium, the use of email for advertising is prohibited without the prior, free, specific and informed consent of the addressee of the messages (Article XII.13 of the Belgian Code of Economic Law). When sending marketing communications via email, the sending company must ensure that (i) the recipient is provided with clear and intelligible information on the right to object to receiving such advertising in the future; and (ii) suitable means are designated and made available so that this right can be exercised efficiently by electronic means.

The Belgian Royal Decree of 4 April 2003 regulating the sending of electronic commercial communications provides for exceptions to this rule, notably where the following cumulative criteria are met:

  • the customer has purchased goods and has directly given their contact details;
  • the contact details are only used to advertise the company’s own similar products; and
  • the customer is informed that they can object to such use at any time (free of charge and in an easy way).

Besides the application of the general data protection rules, the protection of employees’ privacy and personal data in Belgium is guaranteed by specific protection mechanisms such as collective bargaining agreements. For example, Collective Bargaining Agreement No 68 of 16 June 1998 (“CBA 68”) lays down the conditions and principles with regard to camera surveillance in the workplace, and CBA No 81 of 26 April 2002 (“CBA 81”) develops a specific regime concerning the monitoring of internet use and email.

  • CBA 68: Camera surveillance in the workplace is only permitted for the purposes specifically set out in this CBA and only if the employer has informed the employees of such surveillance. The objectives relate to health and safety, the protection of company property, the monitoring of the production process or the monitoring of the employee’s work. Only in the first three cases may the monitoring be continuous, provided that the monitoring of the production process relates to the monitoring of machinery.
  • CBA 81: Monitoring electronic communication is authorised only for the specified purposes. Continuous monitoring is never justifiable, as it is deemed disproportionate.

However, an employee’s right to privacy is not absolute. A balance between an employee’s right to privacy and an employer’s legitimate interest to protect the business or comply with its own obligations is always required. Hence, it is likely that, as part of the employer’s authority, there might be a legitimate interest in monitoring employees to the extent that the process is relevant and proportionate.

Artificial Intelligence

The draft AI Act provides that:

  • AI applications in the workplace are classified as high risk; and
  • workers and their representatives must be informed when AI systems are deployed in the workplace (transparency is essential in safeguarding employees’ interests).

Whistle-Blower Hotlines and Anonymous Reporting

Belgium chose to transpose the EU Whistle-Blowing Directive (Directive (EU) 2019/1937) in two stages: first for the public sector and then for the private sector. The Belgian implementing legislation came into force on 2 January 2023 for the public sector and on 15 February 2023 for the private sector. The material scope of the Belgian law is broader than that of the Directive, in that it also covers tax and social fraud.

As from 15 February 2023, legal entities in the private sector with at least 50 employees are obliged to:

  • set up or arrange for the outsourcing to external providers of an internal reporting channel and follow-up procedure;
  • provide clear and accessible information to their employees regarding the internal and external reporting channels; and
  • ensure confidentiality and appropriate protection for whistle-blowers.

Each legal entity in the private sector (broadly defined as “an organisation, incorporated or not, carrying out one or more activities”) with at least 50 employees needs to comply with the new obligations. It should be noted that:

  • if such a Belgian legal entity is part of an international group of companies, it is not sufficient to set up one central notification procedure at group level, due to the particularities of Belgium and other jurisdictions as compared to the minimum protection set at EU level;
  • any global whistle-blowing policy must always comply with the stricter local rules in each country;
  • the internal reporting channel must be implemented by each legal entity after consultation with the existing employee representative bodies; and
  • there is no mandatory medium through which implementation must take place, but the adoption of a simple policy (rather than introducing the new procedure as part of the work rules or in a collective agreement) is the most appropriate approach.

Legal Standards of Regulators

In Belgium, the procedure can be summarised as follows: proceedings are initiated either (i) when a data subject files a complaint alleging that their personal data has been processed unlawfully; or (ii) when the DPA opens an investigation on its own initiative. Following proceedings before the Litigation Chamber, the DPA has the power to impose fines (see also 1.3 Administration and Enforcement Process).

Prior to the imposition of a sanction by the Belgian DPA, an investigation must be launched. After said investigation, the investigator can decide whether to refer the matter to the Litigation Chamber of the Belgian DPA.

During this phase, after both parties have filed their submissions and following the hearing, the Litigation Chamber may impose a preliminary sanction. The person suspected of breaching data protection law is then entitled to submit observations on the sanction form. Once those observations have been communicated, a final sanction is imposed, which remains subject to appeal.

Such appeals may be lodged within 30 days of notification of the decision before the Brussels Court of Appeal (Market Court section). The fines are transferred to the state treasury. Note that in Belgium there is no official calculation method; there is only a so-called penalty form, which is sent to the parties following the hearing before the Litigation Chamber and the DPA’s initial decision. The DPA does, however, follow the European Data Protection Board’s Guidelines 04/2022 on the calculation of administrative fines under the GDPR.

All decisions are published on the website of the DPA (including the judgments of the Court of Appeal). These decisions contain information on the relevant facts, imposed fines and other procedural steps. The involved parties are often anonymised. The Belgian DPA issued a draft and non-binding settlement policy in December 2023, consolidating and codifying a recent practice. This is a very welcome development and an invitation to data controllers and processors to engage in a constructive and interactive discussion with the regulator in the event their practices are challenged as part of a complaint or investigation.

The goal is to take a pragmatic approach, enabling faster and more efficient handling of cases, in which the parties can be heard and data controllers can openly discuss the technical and operational aspects of their compliance. A settlement proposal will be submitted to the parties for comments. The policy mentions that a hearing can be organised if needed and that appropriate confidentiality arrangements can be made. Plaintiffs will be heard as well, and a formal decision will be issued and published (albeit possibly in anonymised form). This decision can still be appealed by any interested party.

A settlement implies that the defendant acknowledges the facts but not necessarily the existence of an infringement of the legal provisions. It may involve a monetary fine, along with a commitment to remedy the (alleged) non-compliance.

Some procedural aspects still need to be worked out or detailed, such as the extent to which the terms of the settlement will be kept confidential, but this will become clearer as the practice develops.

Potential Enforcement Penalties

The supervisory authority has been granted several powers under the GDPR, among them (Article 58 of the GDPR):

  • investigative powers; and
  • corrective powers to issue warnings and reprimands, and the authority to order the controller/processor to act in accordance with the data protection legislation.

Furthermore, the authority has the possibility to impose administrative fines, in light of the general conditions (Article 83 of the GDPR).

Leading Cases

Third parties can lodge an appeal against Belgian DPA decisions before the Market Court

Article 78, §1 of the GDPR states that “without prejudice to any other administrative or non-judicial remedy, each natural or legal person shall have the right to an effective judicial remedy against a legally binding decision of a supervisory authority concerning them”.

Article 108 of the Belgian Law of 3 December 2017 establishing the Belgian DPA provides that the decisions of the latter’s Litigation Chamber may be appealed before the Market Court of the Brussels Court of Appeal by the parties to the dispute at stake. It does not stipulate that third parties, who are not directly involved in the dispute but may have an interest in challenging the decision, have the right to appeal before that court.

However, the Constitutional Court has ruled that the fact that Article 108 does not provide a right of appeal for third parties not involved in the initial dispute is inconsistent with Articles 10 and 11 of the Constitution. In addition, it has given a clear indication of the remedy to be adopted by the legislator (ie, to extend the appeal to the Market Court to third parties, on terms and conditions yet to be defined) and has organised a transitional regime that appears to allow all third parties harmed by a decision of the Litigation Chamber of the DPA to lodge an appeal against it before the Market Court within 30 days from the date on which the third party can reasonably be considered to have had knowledge of the decision and, for older decisions, at the earliest 30 days from the publication of the Constitutional Court’s judgment in the Belgian Official Gazette.

Decision regarding a complaint concerning the use of a geolocation system

The DPA issued a reprimand following a complaint about a geolocation system installed in the service vehicles of a local government. Although the DPA does not rule out that such GPS tracking can be lawful under strict conditions, the Litigation Chamber ruled that this was not the case here. The data controller had initially carried out the processing without a legal basis and had subsequently based it on an incorrect legal ground.

Case C-604/22: Request for a preliminary ruling from the Brussels Court of Appeal (Belgium) lodged on 19 September 2022 — IAB Europe v Belgian DPA

In a dispute between IAB Europe and the Belgian DPA, the CJEU was asked whether IAB Europe qualifies as a (joint) controller within the meaning of the GDPR and whether the TC String (Transparency and Consent String) developed by IAB Europe and used by publishers and advertising companies constitutes personal data.

The Belgian DPA’s Anticipated Enforcement Priorities

The following elements were high on the agenda of the Belgian DPA:

  • the processing of sensitive personal data;
  • the legitimacy of the processing;
  • the transfer of personal data outside the EEA;
  • the processing of biometric data; and
  • the online collection of personal data using cookies and similar technologies.

On 12 December 2019, the Belgian DPA published a draft Strategic Plan 2020–2025, highlighting its priorities and areas of focus, which include the following five main sectors: telecommunications and media, public authorities, direct marketing, education, and SMEs. Three important social topics are also closely monitored by the authority: online data, sensitive data and images/CCTV.

In November 2021, the Belgian DPA also published its plan for 2024, as part of the long-term objectives of the Belgian DPA’s strategic plan for 2020–2025. The main goals are:

  • timely processing of cases;
  • quality in processing cases;
  • the efficiency of the Belgian DPA;
  • co-operating with partners;
  • raising awareness and increasing knowledge about data protection;
  • the well-being of the Belgian DPA’s employees; and
  • increasing the visibility of the Belgian DPA.

The Belgian DPA also highlighted its key sectors of enforcement for 2024:

  • the position of the data protection officer, including the correct or incorrect appointment of a DPO and whether the DPO in practice has sufficient time and resources to properly perform their tasks;
  • direct marketing, including direct marketing communications for commercial, not-for-profit or political purposes (especially with the upcoming elections in Belgium), as well as data trading for direct marketing purposes on the basis of gaming tournaments; and
  • transparent and accessible information about data processing activities and cookies, including investigations based on the “Cookie Checklist” recently released by the Belgian DPA.

Private Litigation

In order to obtain damages, Article 82 of the GDPR requires the following:

  • firstly, a breach of data protection rules;
  • secondly, damage;
  • thirdly, that the damage must result from the breach; and
  • fourthly, liability for the breach.

Under these conditions, the data subject may claim damages from the controller or the processor.

Class Actions

With the Law of 28 March 2014 regarding collective redress, the Belgian legislature introduced the concept of class actions into Belgian law (so-called actions for collective redress). Actions for collective redress are only available to consumers and small and medium-sized enterprises represented by a “group representative”, when they suffer damage as a result of a common cause.

For the filing of class actions, Belgium has also laid down a specific approval procedure for consumer organisations established outside its territory.

For example, in 2020 the Belgian Official Gazette published the Ministerial Decree of 30 September 2020 approving NOYB (None of Your Business), Max Schrems’ privacy rights organisation, as a qualified entity under the collective action scheme set out in the Belgian Code of Economic Law. NOYB is therefore able to file representative actions in Belgium and claim damages on behalf of a company’s users for breaches of various laws relating to consumer protection, including Belgian data protection legislation. A precedent is Test-Achats (“Test Aankoop” in Dutch), which has already filed a lawsuit against Facebook for breaching data protection legislation in relation to Cambridge Analytica.

Law enforcement authorities have the authority to request information on personal data in connection with serious crimes. Generally speaking, the power to request access lies with the public prosecutor or the investigating judge, an independent magistrate. In most cases, access requires prior authorisation, except in limited circumstances and for specific investigation techniques. The Belgian Constitution, the European Convention on Human Rights and the EU Charter of Fundamental Rights set the standards for the admissibility of any interference with the right to privacy, data protection and family life. International treaties on human rights are deemed to have direct effect in the Belgian legal order and can therefore be directly applied by a judge.

With the adoption of the E-Evidence Regulation, the legal landscape and practice could evolve significantly.

Belgium is a signatory to the OECD Declaration on Government Access to Personal Data Held by Private Sector Entities. It also implemented Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks (and amending Directive 2002/58/EC). However, the Belgian implementing legislation has faced significant challenges and some aspects of it have been declared invalid by the Constitutional Court of Belgium.

Access to data for foreign intelligence, national security or anti-terrorism purposes is regulated in specific laws in Belgium. These laws usually appoint an internal body or administration. The Act of 30 June 2018 governs the processing of personal data by said bodies or administrations, in specific chapters, in accordance with Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities, for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA. 

There is currently no legal basis in Belgium that allows organisations to invoke a foreign government access request as a legitimate basis to collect and transfer personal data. This will change to some extent once the EU E-Evidence Regulation becomes applicable.

There are currently no key privacy issues, conflicts or public debates in relation to government access to personal data for law enforcement or national security purposes.

As the GDPR is a European instrument, all EU countries are subject to the same requirements. However, when personal data is transferred outside the EU, the following rules must be taken into account to ensure that the level of protection of data subjects under the GDPR is not undermined.

The transfer of data outside the EU is subject to:

  • an adequacy decision of the European Commission (Article 45 of the GDPR);
  • appropriate safeguards (Article 46 of the GDPR), such as binding corporate rules; or
  • in the event of a specific situation, one of the derogations as set forth in Article 49 of the GDPR.

General

As mentioned in 4.1 Restrictions on International Data Issues, the transfer of data outside the EU is, in principle, subject to an adequacy decision or having appropriate safeguards in place.

The European Commission has (so far) recognised the following as providing adequate protection: Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, the Republic of Korea, Switzerland, the United Kingdom (under both the GDPR and the Law Enforcement Directive), the United States (commercial organisations participating in the EU–US Data Privacy Framework) and Uruguay.

In the absence of an adequacy decision, data transfers outside the EU are still possible if appropriate safeguards have been enforced. These could be binding corporate rules, standard data protection clauses as adopted or approved by the European Commission, etc.

Derogations

In the absence of an adequacy decision for a specific third country or appropriate safeguards, it is still possible to transfer personal data to a third country or an international organisation subject to one of the following conditions:

  • the data subject has explicitly consented to the proposed transfer, after having been informed of the possible risks;
  • the transfer is necessary for the performance of a contract between the data subject and the controller or the implementation of pre-contractual measures taken at the data subject’s request;
  • the transfer is necessary for the conclusion or performance of a contract concluded in the interest of the data subject;
  • the transfer is necessary for important reasons of public interest;
  • the transfer is necessary for the establishment, exercise or defence of legal claims;
  • the transfer is necessary in order to protect the vital interests of the data subject or other persons, where the data subject is physically or legally incapable of giving consent; or
  • the transfer is made from a register which, according to EU or member state law, is intended to provide information to the public and which is open to consultation.

As mentioned in 4.2 Mechanisms or Derogations That Apply to International Data Transfers, for a company to transfer data, an adequacy decision or other adequate safeguards are required. As many of these mechanisms have been approved before, no additional government notification or approval is required.

However, in the event that the company invokes binding corporate rules, the latter must have been approved by a supervisory authority before personal data can be transferred to a third country (Article 47.1 of the GDPR).

There are currently no specific data localisation requirements in Belgium.

There are currently no obligations in Belgium that would oblige a data controller or data processor to share any software code, algorithms, encryption, or similar technical detail with the government.

The general rules apply with regard to organisations collecting or transferring data in connection with foreign government data requests, foreign litigation proceedings or internal investigations.

In this regard, see 3.3 Invoking Foreign Government Obligations and 4.2 Mechanisms or Derogations That Apply to International Data Transfers.

European entities can sometimes face repercussions from the extraterritorial enforcement of unilateral sanctions by third countries. The EU considers such enforcement contrary to international law and has adopted Regulation (EC) No 2271/96 (the “blocking statute”) as a means of protection. The blocking statute has been implemented in Belgian legislation through the Law of 2 May 2019.

The blocking statute prohibits European entities from complying with the listed extraterritorial sanctions and from co-operating with the relevant third-country authorities.

Since 2018, the blocking statute has applied to US sanctions against Iran and Cuba.

AI

With regard to AI, reference is made to the paragraph on Artificial Intelligence in 2.4 Workplace Privacy. However, the following is worth re-stating.

  • A new Artificial Intelligence Act is in the pipeline to address the increased use of AI systems and tools. In addition, two proposals have been made to address the liability issues that come with AI tools.
  • In Belgium, there is likewise growing concern regarding the transparency of government agencies using AI systems. In light of this, a legislative proposal was introduced to amend the law on access to government information (freedom of information). The proposal emphasises that governments can only reap the benefits of AI systems if they make transparency a top priority. Under this proposal, public authorities would be obliged to disclose the main algorithmic rules online, in particular where they are used for individual decisions. Furthermore, where administrative documents include individual decisions partly or entirely generated by algorithms, citizens would have the right to receive more detailed information, in easily intelligible form, on the extent to which and the manner in which the algorithmic processing contributed to the decision-making process, the data processed and its sources, the input factors (or combinations of input factors) that contributed to the decision, and the processing operations carried out. Finally, public authorities would be required to conduct and disclose an impact assessment in accordance with Article 35 of the GDPR. These safeguards should confer additional benefits, such as increased transparency and trust in AI systems.

Biometric Data

In Belgium, the situation regarding biometric data is somewhat cumbersome, as the Belgian DPA has issued strict guidance on the processing of biometric data, sparking controversy. In short, in its recommendations, the Belgian DPA examines the conditions for the lawful processing of biometric data under the GDPR and takes the view that the processing of all biometric data is prohibited under Article 9 of the GDPR, regardless of whether such data is processed for identification or for verification/authentication purposes. Furthermore, the Belgian DPA states that unless the data controller can cumulatively rely on a legal ground in accordance with Article 6 of the GDPR and one of the exceptions exhaustively listed in Article 9.2 of the GDPR, the processing of biometric data is prohibited. In sum, the Belgian DPA considers that the processing of biometric data can only be based on two possible exceptions: “explicit consent” or “compelling public interest”.

“Dark Patterns”

The Digital Services Act (DSA) has introduced mechanisms to counter disinformation, deep fakes and other online harms. In addition, this new piece of legislation addresses the problem of so-called “dark patterns”. Online platforms sometimes use dark patterns to push users into undesirable decisions that have negative consequences for them. The new DSA rules address this issue and ensure that online platforms cannot design, organise or operate their online interfaces in a way that deceives or manipulates their users or distorts their ability to make informed decisions.

Organisations in Belgium have not yet established protocols for digital governance, AI or fair data practice review boards or committees to address the risks of emerging or disruptive digital technologies.

See 2.5 Enforcement and Litigation.

When carrying out due diligence operations in corporate transactions, responsibility lies with the prospective buyer to make sure that the target company:

  • has put in place all the necessary policies (including privacy and cookie policies);
  • has entered into a data processing agreement where another company processes data on behalf of the target company or the target company acts as a data processor; and
  • maintains a register of data processing activities.

Furthermore, it needs to be determined whether the company is required to appoint a DPO, as set out in 2.1 Omnibus Laws and General Requirements.

In addition, since personal data (eg, of employees and customers) could be disclosed to potential buyers in the course of due diligence operations, the target company must ensure that the necessary safeguards have been provided.

There are currently no specific laws mandating the disclosure of an organisation’s cybersecurity risk profile or experience.

Companies and organisations are urged to publish a “Co-ordinated Vulnerability Disclosure Policy”. Through sectoral authorities, professional organisations and the Cyber Security Coalition in Belgium, they will be informed of significant threats or vulnerabilities. Organisations of vital interest will also receive targeted, non-public alerts through the Early Warning System (EWS) of the regulator, the Centre for Cybersecurity Belgium.

In recent years, Europe has introduced a number of new legislative initiatives to keep up with the latest technological developments. As a member of the EU, Belgium is subject to all the new legal regulations that have recently come into force, such as the Digital Markets Act, the Digital Services Act and the Data Act.

Furthermore, several other proposed regulations are expected to be adopted soon, for example, the Artificial Intelligence Act, as mentioned in 1.8 Significant Pending Changes, Hot Topics and Issues.

There are currently no other significant issues.

Osborne Clarke

Bastion Tower
Marsveldplein 5 Place du Champ de Mars
B–1050 Brussels
Belgium

www.osborneclarke.com

Trends and Developments



Credit Scoring – Caught Between the Rock of the AI Act and the Hard Place of the GDPR?

In the constantly evolving financial landscape, credit scoring plays a key role in determining a person’s credit standing. As artificial intelligence advances, it is essential that credit-scoring mechanisms are examined through the lens of the Artificial Intelligence Act (AI Act) and the GDPR. This article aims to provide an insightful update on the AI Act and its impact on credit scoring, while also exploring the interplay with the GDPR.

There is no legal definition of credit scoring under the GDPR or under the AI Act. It is typically described as a systematic and mathematical method of calculating and predicting the credit risk and/or creditworthiness of potential borrowers. Credit scores are typically generated for financial institutions by credit bureaus, which compile data on the potential borrower’s credit history from various sources, such as banks, credit card companies and other financial institutions. In practice, credit applications are often first scored by an automated system that takes into account the applicant’s credit score. The application is then either automatically approved by the system or passed on for human review (again, using the credit score).
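
To make those mechanics concrete, the following simplified Python sketch (with invented weights and thresholds, purely for illustration and not drawn from any actual scoring model) mirrors the flow described above: an automated system computes a score and either decides the application itself or routes it to human review. Where the system decides alone, the GDPR rules on solely automated decision-making (Article 22) come into play.

from dataclasses import dataclass

@dataclass
class Applicant:
    late_payments: int      # number of recorded late payments
    debt_to_income: float   # eg, 0.35 means 35%

def credit_score(a: Applicant) -> int:
    # Illustrative linear scorecard; the base value and weights are invented.
    raw = 700 - 40 * a.late_payments - 200 * a.debt_to_income
    return max(300, min(850, round(raw)))

def route_application(a: Applicant) -> str:
    # Route the application based on the automated score (thresholds invented).
    score = credit_score(a)
    if score >= 650:
        return "auto-approved"              # solely automated decision
    if score >= 500:
        return "referred for human review"  # a human uses the score as one input
    return "auto-declined"                  # solely automated decision

print(route_application(Applicant(late_payments=0, debt_to_income=0.20)))  # -> auto-approved (score 660)
print(route_application(Applicant(late_payments=2, debt_to_income=0.50)))  # -> referred for human review (score 520)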

The AI Act

On Friday 8 December 2023, the European Union reached a political agreement on the shape and content of the ground-breaking EU regulation on artificial intelligence – the AI Act – after 38 hours of discussion over three days. Uncertainty about the final text persisted until 22 January 2024, when it was leaked online, consisting of an 892-page table comparing the different mandates of the AI Act, followed by a 258-page document containing the consolidated text. On 2 February 2024, the AI Act was finalised and approved by all 27 EU member states. On 13 February 2024, it was adopted by the European Parliament’s Committee on the Internal Market and Consumer Protection. It is scheduled to be submitted for a plenary vote on 10–11 April 2024.

What’s new?

The overall shape of the AI Act, with a tiered, risk-based approach (prohibited systems, high-risk systems, limited/minimal systems), has not changed from the original April 2021 proposal from the European Commission, but there have been some significant changes and additions, as outlined below.

1. Broad definition of an AI system, with a focus on autonomy

The definition of AI in the final text is intended to distinguish AI from “simpler systems” and is said to be aligned with the OECD’s definition of AI as:

“A machine-based system designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments”.

The main elements in this definition are “infers” and “autonomy”, which clearly differentiate an AI system from any other software where the output is pre-determined by the programming. The new definition came about after criticism of the original definition of AI systems (which was tied to a specific list of technologies and methods) and aims to ensure that the definition is future-proof.

2. List of prohibited systems

What to include in the list of prohibited AI practices, considered to pose an unacceptable risk to safety and fundamental rights, was a particularly controversial part of the negotiations. The final text contains a closed list of prohibited AI systems, including:

  • systems exploiting the vulnerabilities of a person or group due to specific characteristics, leading to significant harm;
  • social scoring systems;
  • systems inferring emotions in workplaces or educational institutions, except for medical or safety reasons;
  • biometric categorisation systems that individually categorise a person based on sensitive information, except for labelling or filtering lawfully acquired biometric data sets in the area of law enforcement; and
  • real-time remote biometric identification systems in publicly accessible spaces for law-enforcement purposes, unless used for specific listed purposes, such as searching for victims of human trafficking or sexual exploitation, or the prevention of terrorist attacks.

3. High-risk AI – two types

The second tier of regulation concerns AI applications that are considered to pose a high level of risk to safety or to fundamental rights. The AI Act considers two types of AI systems as high risk: (i) AI intended to be used as a product (or the safety component of a product) covered by specific EU legislation, such as civil aviation, vehicle safety, marine equipment, toys, lifts, pressure equipment and personal protective equipment; and (ii) AI systems listed in Annex III, such as remote biometric identification systems, AI used as a safety component in critical infrastructure, and AI used in education, employment, credit scoring, risk assessment and pricing in health and life insurance, law enforcement, migration and the democratic process. Since the AI systems in this category are considered to be high risk, they are subject to the most stringent regulatory requirements, including the establishment of risk and quality management systems, data governance, human oversight, cybersecurity measures, post-market monitoring, and maintenance of the required technical documentation.

  • High-risk AI – exceptions: The most significant change to the Commission’s original proposal is that, as agreed in earlier phases of the negotiations, there will be a carve-out for AI that falls within the specified high-risk categories but which does not, in fact, pose a significant risk to safety or fundamental rights. This exception will become very important in practice, as it is expected that AI system providers will try to argue that their system does not pose such risks, aiming to avoid falling under the stringent obligations.
  • High-risk AI – FRIA: The final text requires public sector bodies and private entities providing public services (education, healthcare, housing, social services, and entities engaged in credit scoring or life and health insurance) to undertake a fundamental rights impact assessment (FRIA) prior to deploying a high-risk AI system. In short, this new requirement means that these entities must document the risks, the control and risk-mitigation measures, the categories of natural persons concerned, the intended frequency of use and the deployer’s processes in which the system will be used.

4. Requirements may apply to the entire value chain

All companies should assess the impact of the AI Act on their business. The AI Act imposes obligations on entities that “place AI systems on the market” and applies to different parties at different points in the AI supply chain. It is therefore important for companies to assess their position within this framework in terms of obligations along the supply chain of planned or existing AI systems, products or services.

It is worth noting that for high-risk systems, the statutory obligations of a provider may shift to another actor in the value chain (a deployer, importer, distributor or other third party) if one of the following three conditions is met: (i) they have put their name or trade mark on the system after it has already been placed on the market or put into service; (ii) they have made a substantial modification to the system after it has been placed on the market or put into service, provided that it remains high risk; or (iii) they have modified the intended purpose of the AI system in a way that renders it high risk.

5. Right to complain

Any individual or legal entity having grounds to believe that the AI Act has been violated is granted the right to lodge a complaint with a market surveillance authority.

6. “General purpose AI” and foundation models

The AI Act creates two tiers of obligations for general-purpose AI (GPAI): baseline transparency obligations applying to all GPAI models, and additional obligations for models considered to pose systemic risk. This set of obligations is distinct from the core risk-based tiers of regulation in the AI Act. Also new is the AI Office, which will be created within the Commission to centralise oversight of general-purpose AI models at EU level.

7. Right to an explanation

For several years, the question of whether individuals are entitled to an explanation when a business performs automated individual decision-making has been a hot topic. The AI Act now explicitly confirms this right, but only for high-risk AI systems listed in Annex III: a data subject now has the right to a meaningful explanation regarding the role of the AI system in the decision-making and the key elements of the decision made. This is without prejudice to the existing right to explanations under the GDPR, discussed below.

Critical deadlines

The AI Act will come into force 20 days after publication in the Official Journal of the EU, currently expected in Q2–Q3 2024. After this, the following compliance timelines will apply (a simple calculation of these milestone dates is sketched after the list):

  • Six months – enforcement of prohibited AI practices will commence.
  • 12 months – GPAI obligations will take effect, with one exception. GPAI models that have been placed on the market before this date will have an additional 24 months to comply.
  • 24 months – most other obligations will take effect from this date.
  • 36 months – obligations for high-risk systems listed in Annex II will take effect.
  • 48 months – obligations for high-risk AI systems intended for use by public authorities that were on the market before the coming into force of the AI Act will take effect.
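
Purely for illustration, the short Python sketch below computes these milestone dates from an assumed entry-into-force date. The date used is a hypothetical placeholder, since the actual date depends on publication in the Official Journal.

```python
# Illustrative only: deriving the compliance milestones listed above
# from an assumed (hypothetical) entry-into-force date.
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months (day-of-month clamping is
    omitted for simplicity; the example date falls on the 1st)."""
    years, month_index = divmod(d.month - 1 + months, 12)
    return d.replace(year=d.year + years, month=month_index + 1)

entry_into_force = date(2024, 7, 1)  # hypothetical assumption

milestones = [
    (6, "enforcement of prohibited AI practices"),
    (12, "GPAI obligations take effect"),
    (24, "most other obligations take effect"),
    (36, "obligations for high-risk systems listed in Annex II"),
    (48, "obligations for pre-existing public-authority high-risk systems"),
]
for months, label in milestones:
    print(add_months(entry_into_force, months), "-", label)
```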

Credit Scoring and the AI Act

The impact of the AI Act on the financial sector will differ depending on the particular AI system deployed or developed. The financial sector holds an uncertain position within the AI Act. While financial services are not explicitly categorised among high-risk systems in the annexes, specific references to credit institutions or banks are made in various sections. Additionally, certain AI systems used in the context of access to and enjoyment of essential private and public services, like credit scoring, must adhere to the requirements set for high-risk AI systems.

Indeed, institutions using AI systems intended to assess the creditworthiness of natural persons or to establish their credit score will have to comply with the far-reaching obligations applicable to high-risk systems and carry out a FRIA (see above). Reference is made to recital 37 of the preamble to the AI Act: “In particular, AI systems used to evaluate the credit score or creditworthiness of natural persons should be classified as high-risk AI systems, since they determine those persons’ access to financial resources or essential services such as housing, electricity, and telecommunication services”.

Furthermore, as discussed earlier, banks and insurance companies employing such systems must acknowledge that EU citizens have the right to file complaints about the functioning of AI systems with the competent supervisory authorities. Additionally, these entities must ensure their ability to provide explanations for decisions based on high-risk AI systems.

Finally, it is important to address the question of how far trade secret protection extends as a ground for refusing to provide an explanation. No practical guidance exists under the AI Act, but the opinion of Advocate General Pikamäe in Case C-634/21 before the Court of Justice of the European Union (CJEU), regarding Article 22 of the GDPR, leads to the conclusion that trade secrets cannot serve as an absolute ground for refusal and that at least a minimum amount of information must be provided to the data subject. In practice, it will be a matter of balancing these competing interests (see the SCHUFA case, below).

Data Protection as a Top Priority

The final trend to be discussed is, more accurately, an ongoing theme that remains of utmost importance: organisations must continue to prioritise adherence to data protection rules, even as new legislation such as the AI Act comes into play.

While the Belgian Data Protection Authority (DPA) appears to be primarily focused on cookies, smart cities and the role of data protection officers, a stronger focus on the financial sector and the explainability of algorithmic decisions in 2024 is expected. This shift will not only be due to the introduction of the AI Act, but will also be influenced by important decisions at the European level.

With respect to AI, it is still unclear whether or not the Belgian DPA will also be vested with regulatory powers and, if so, to what extent. That being said, there is little doubt that the DPA will exercise its powers as often as it can in relation to automated decision-making and the impact of AI projects on fundamental rights.

At the European level, however, one of the first decisions in which the CJEU considers what amounts to automated decision-making within the meaning of Article 22 of the GDPR has recently been published: the so-called SCHUFA case (Case C-634/21).

This is important, as the decision may result in credit-scoring agencies in the EU being required to obtain consumers’ express consent before calculating their creditworthiness and to provide consumers with an opportunity to object to a credit score. The outcome of the case depends heavily on specific German federal law, but the lessons from the judgment are relevant throughout the EU, and in particular in Belgium.

Facts of the case

SCHUFA is a German credit agency that carries out credit-scoring activities for third parties such as financial institutions. The plaintiff was denied credit by a financial institution after SCHUFA provided it with a negative credit score. The plaintiff requested SCHUFA to erase the – in her view “wrong” – credit score and to grant access to her personal data and the logic of its scoring process. SCHUFA, however, merely informed her of the score and of the basic functioning of its scoring process, without disclosing the concrete data used and its weighting, arguing that the calculation method was a trade secret. After the plaintiff lodged a complaint with the DPA in Hesse, Germany, the DPA found that SCHUFA’s granting of access, as well as its refusal to delete the credit score, complied with German data protection law. The plaintiff disagreed, and the case was brought before the Administrative Court in Wiesbaden, Germany, which referred two questions on the interpretation of Articles 22 and 6 of the GDPR to the CJEU.

Scope of Article 22 of the GDPR

Article 22 of the GDPR lays down a general prohibition on decisions based solely on automated processing, including profiling, that produce legal effects concerning the data subject or similarly significantly affect them, subject to certain limited exceptions. Where such automated individual decision-making is permitted, safeguards must be in place.

CJEU decision

The CJEU rejected SCHUFA’s claim that its involvement was limited to “preparatory acts” and that decisions were made solely by the lender. Instead, the CJEU held that SCHUFA itself was engaging in automated individual decision-making within the meaning of Article 22 of the GDPR when it created credit scores as a result of automated processing and lenders relied heavily on those scores to establish, implement or terminate contracts.

The CJEU emphasised three conditions for determining engagement in automated decision-making:

  • a decision must be made;
  • it must be based solely on automated processing, including profiling; and
  • it must produce legal effects concerning the individual, or otherwise produce an effect that is equivalent or similarly significant in its impact on the individual.

According to the CJEU, all of these conditions were met.

The CJEU interpreted “decision” broadly, encompassing acts that may affect the individual in various ways, including the calculation of a credit score. It further ruled that the calculation of the creditworthiness score produced significant effects. This was clear from the question posed by the referring court, according to which lenders rely heavily on the score (a low credit score results in a bank rejecting the loan application in almost all cases).

Conversely, if a credit reference agency or other similar provider issues a score that is not relied on heavily by those taking the final decision – for example, because lenders attach significant weight to other factors – then the issuing of the score would not be covered by Article 22.

As a result, organisations using AI-based assessments must either incorporate human review or comply with the requirements of Article 22 of the GDPR. The first option is challenging, as the person making the final decision must have the necessary expertise and time to review the AI system’s initial decision. It is therefore critical for developers to provide transparency, and it is up to the user/deployer to become familiar with the functionalities of the AI system. If no human review is possible, the organisation must provide data subjects with detailed information on the method used to calculate the score and the reasons leading to the result. The parameters and their respective weight in the result should also be communicated so that the data subject can effectively challenge the decision.
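
As a purely illustrative sketch of what such transparency might look like in practice, the snippet below returns a score together with the parameters, their respective weights and their contributions to the result, in the spirit of the disclosure the SCHUFA judgment points towards. The model, names and figures are hypothetical assumptions, not a prescribed compliance mechanism.

```python
# Illustrative sketch: surfacing the parameters and their weights alongside
# the score, so the data subject can understand and challenge the decision.
# The weights and inputs below are hypothetical assumptions.

HYPOTHETICAL_WEIGHTS = {
    "on_time_payment_ratio": 0.6,
    "inverse_debt_to_income": 0.3,
    "credit_history_length": 0.1,
}

def explain_score(normalised_inputs: dict[str, float]) -> dict:
    """Return the score with each parameter's weight and contribution."""
    contributions = {
        name: round(weight * normalised_inputs[name], 3)
        for name, weight in HYPOTHETICAL_WEIGHTS.items()
    }
    return {
        "score": round(sum(contributions.values()), 3),
        "parameters": HYPOTHETICAL_WEIGHTS,
        "contributions": contributions,
    }

print(explain_score({
    "on_time_payment_ratio": 0.95,
    "inverse_debt_to_income": 0.75,
    "credit_history_length": 0.8,
}))
```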

Conclusion

The AI Act and the GDPR should be seen as complementary frameworks, each with its own set of rules and obligations. In practice, however, both will often need to be applied simultaneously. Organisations involved in the development or use of AI systems will benefit from already having established the required data protection controls and policies. The onus will also be on organisations to understand how they are deploying AI and to implement processes to continuously monitor their use of AI in light of an increasingly complex EU framework, which takes into account not only the technical mechanisms of AI but also the specificities of its use across different industries and public services.

Osborne Clarke

Bastion Tower
Marsveldplein 5 Place du Champ de Mars
B–1050 Brussels
Belgium

www.osborneclarke.com
