Data Protection & Privacy 2023

Last Updated February 06, 2023


Law and Practice


VJT & Partners is a Hungarian commercial law firm located in Budapest, with countrywide coverage and major international referral networks. It is a full-service law firm advising international and domestic corporate clients, and is especially highly regarded in information technology and data protection matters. The privacy team is made up of five professionals and provides legal support on a wide range of data protection issues, from niche challenges through complex GDPR audit projects to legal representation of clients before the Data Protection Authority. Highlights include creating hi-tech solutions for GDPR challenges, such as supporting a software developer company in building Data Hawk, software that screens companies’ data processes and makes proposals on an automated basis (without human intervention).

Legal Background

The basis of data protection can be found in the Hungarian constitution, which states that everybody has the right to privacy and that an independent authority shall oversee the protection of personal data.

The major laws in the data protection field are Act CXII of 2011 on the Right of Informational Self-Determination and on Freedom of Information (Data Protection Act) and the General Data Protection Regulation (GDPR).

To implement the GDPR and Directive (EU) 2016/680 of the European Parliament and of the Council (Law Enforcement Directive), the Data Protection Act was completely amended in July 2018, and now contains three groups of provisions:

  • additional procedural and substantive rules for data processing that falls under the scope of the GDPR (where the GDPR itself permits the application of national laws or derogation therefrom);
  • rules for data processing that does not fall under the scope of the GDPR; and
  • rules for data processing for law enforcement, national security and national defence purposes.

Enforcement Environment

The most important administrative sanction is the fine, which may go up to EUR20 million or 4% of the annual turnover, whichever is higher. In addition to the administrative GDPR sanctions, Hungarian laws also provide other types of sanctions, as follows.

  • Civil sanctions: individuals may bring private actions against data controllers and processors for violations of data protection rules. An individual may claim both pecuniary and non-pecuniary damages.
  • Criminal sanctions: criminal penalties may apply if the abuse of personal data is committed for financial gain, or if it causes significant detriment to individuals.
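The administrative fine cap described above is simply the higher of two quantities. As a minimal illustrative sketch (the function name and example figures are our own, not part of the GDPR text):

```python
# Minimal sketch of the administrative fine cap described above:
# up to EUR 20 million or 4% of annual turnover, whichever is higher.
# The function name and the example figures are illustrative only.

def gdpr_fine_cap(annual_turnover_eur: float) -> float:
    """Upper limit (in EUR) of an administrative GDPR fine."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# For a company with EUR 1 billion turnover, 4% exceeds EUR 20 million:
print(gdpr_fine_cap(1_000_000_000))  # 40000000.0
```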

The authority responsible for monitoring the application and enforcement of Hungarian data protection laws is the National Authority for Data Protection and Freedom of Information (the “Authority”).

Within its advisory powers, the Authority is particularly active in advising lawmakers on legislative measures in the data protection area. It also issues recommendations to controllers and the general public from time to time, although it has highlighted several times that the European Data Protection Board (EDPB) or its predecessor, the Article 29 Working Party (WP29), is the main body entrusted with interpreting the GDPR.

The Authority also has various authorisation powers in line with the GDPR, but these powers are rarely used, as they are rather specific (eg, the approval of binding corporate rules or the approval of codes of conduct or certifications).

Apart from fines, the Authority may impose several other corrective measures, with the following being particularly common:

  • reprimands to controllers or processors (where data processing has infringed Hungarian data protection laws);
  • bans on data processing;
  • ordering the controller or processor to comply with data subject right requests;
  • ordering the controller or processor to bring their operations into compliance with data protection laws;
  • ordering the controller to communicate the data breach to the affected data subjects; and
  • ordering the erasure or rectification of personal data, or a restriction on data processing.

The Authority may conduct audits and has wide investigatory powers. Investigations are usually initiated upon a complaint, but the Authority may also open them ex officio.

In its enforcement framework, the Authority has two main kinds of procedures.

  • Investigation: this is a preliminary phase in which the Authority aims to collect sufficient evidence on which to decide whether or not to launch an administrative procedure. After the Authority has mapped all the relevant circumstances of a possible breach of data protection laws, it can do one of the following:
    1. close the case and declare that no breach of data protection rules has occurred;
    2. call on the controller to remedy the unlawful data processing within 30 days; or
    3. launch an administrative procedure (if the controller does not remedy the situation within 30 days or if the gravity of the breach justifies doing so).
  • Administrative procedure: this is the main enforcement procedure by which the Authority may impose fines or other corrective measures, and it may be launched even without a prior preliminary investigation phase.

Both procedures may be launched ex officio or based on a complaint (but in the administrative procedure only the data subject concerned can file a complaint).

In general, the Authority has a broad selection of investigatory powers, including making on-site visits and accessing equipment used in the course of data processing. The Authority usually provides very short deadlines for controllers to present the GDPR-compliant documentation, so the GDPR’s accountability principle must be taken seriously.

Controllers and processors may challenge the Authority’s decision on the merits before the Budapest-Capital Regional Court. Such a legal remedy does not in itself have suspensive effect.

As an EU member state, Hungary adopted several GDPR implementation packages to bring Hungarian law into line with the GDPR. The Authority has also confirmed that the GDPR prevails in the event of any direct conflict between it and Hungarian privacy rules.

Moreover, the Authority follows the guidelines, opinions and similar soft law issued by the EDPB, and respects that the EDPB is the main body for interpreting the GDPR.

In cross-border proceedings, the Authority also co-operates with the supervisory authorities of other member states: it suspends its proceedings until the lead supervisory authority makes its decision under the GDPR’s one-stop-shop rule.

The future regulatory landscape of e-privacy and big data largely depends on the final adoption of the widely debated E-Privacy Regulation and the Data Act. Moreover, the application of the Digital Services Act and the Data Governance Act is likely to raise new issues in the field of data protection as well.

Although the Authority is rather strict, privacy awareness in Hungary is still in its infancy, so the role of NGOs and self-regulatory organisations remains marginal.

There are two NGOs that aim to specifically address privacy issues, “MADAT” (the Hungarian Association for Privacy Awareness) and “NADAT” (the National Data Protection Association). There are also other NGOs that generally assist in the promotion and enforcement of human rights, such as “TASZ” (the Hungarian Civil Rights Union), and as part of this work represent those clients whose right to privacy has been violated. TASZ also regularly shares privacy-related educational materials on its website.

In certain sectors, such as marketing, there are organisations that also cover sector-specific areas of privacy; for example, “IAB” Hungary and the Hungarian Data & Marketing Association share research and news on the topic of online marketing.

The Authority is one of the strictest authorities in the context of GDPR interpretation. It has an especially granular approach to purpose specification and data minimisation (see 2.1 Omnibus Laws and General Requirements). This has made business management difficult for international companies that want to use one uniform privacy policy across different countries, as this is hardly possible in Hungary due to the local expectations of the Authority.

It remains to be seen whether international EDPB practice will bring any change in this context. The Authority recognises that the EDPB is solely authorised to interpret the GDPR; however, it may continue its old practice in any matter that is not explicitly regulated by the EDPB.

The Authority’s GDPR enforcement practice is gradually becoming more punitive, as more and more of the Authority’s fines are approaching the upper limit of GDPR fines (see 2.5 Enforcement and Litigation).

Hungary adopted the so-called GDPR Omnibus Act by amending 86 sectoral acts in dozens of sectors (including finance, healthcare and online marketing) in April 2019, and there has been only one significant development at a legislative level since then.

The Data Protection Act states that, as from January 2022, the Authority may order hosting service providers to remove data displayed via an electronic communication network (electronic data) if the display would otherwise constitute a serious privacy breach and the data relates to children, or to special categories of personal data or criminal data.

The Authority has remained active and made hundreds of decisions public since the GDPR came into force. During 2022, the Authority advanced its case law by addressing data protection aspects in, among others, the use of artificial intelligence (AI), employee surveillance with CCTV cameras and direct marketing.

Milestone Rulings on Data Protection in AI

The Authority’s decision on the use of AI was a milestone in Hungarian regulatory practice, as it made the Authority the first Hungarian watchdog to address this technology.

The case involved a bank’s processing of personal data: the bank used AI to automatically analyse the recorded audio of customer service calls. The software then ranked the calls, recommending which callers should be called back as a priority based on their emotional state.

The bank determined the purposes of the processing activity as quality control based on variable parameters, the prevention of complaints and customer migration, and the improvement of its customer support efficiency. However, according to the Authority, the bank’s privacy notice referred to these processing activities in general terms only, and no material information was made available regarding the voice analysis itself.

The bank based the processing on its legitimate interests to retain its clients and to enhance the efficiency of its internal operations. The Authority emphasised that the only lawful legal basis for the processing of emotion-based voice analysis would be the consent of the data subjects.

Additionally, the Authority highlighted that although the bank had carried out a data protection impact assessment, it had failed to present substantial solutions to address the high risks. In conclusion, the Authority, in addition to imposing a record fine of HUF250 million (approximately EUR640,000), obliged the bank to cease the analysis of emotions in the course of processing voice recordings.

The Authority has also organised conferences, eg, the fourth annual DPA conference, where the use of AI was the main issue addressed. It is also pre-eminent in user education, having made several training videos available on hot topics such as data breaches.

On a judicial level, the Supreme Court upheld the Authority’s decision stating that it is lawful to process personal data for the purpose of compiling a press list of the wealthiest people in Hungary with the largest family undertakings by estimating their fortunes, provided that the data processing otherwise respects the GDPR.

Direct Marketing

Direct marketing has recently been a very hot topic. The Authority has addressed it in numerous decisions, and it is likely to remain a focus in the future. Companies are therefore advised to review their marketing practices.

In a milestone direct marketing case against Magyar Éremkibocsátó Kft, the Authority imposed a HUF30 million (approximately EUR74,500) fine because the company processed the contact data of thousands of individuals in the absence of adequate prior privacy information, a concretely defined purpose and a valid legal basis.

The decision is notable for providing an official interpretation of how data subjects must give consent for a data controller to perform direct marketing activities through several different channels. The decision’s key takeaways are the following:

  • separate, explicit consent is required individually for each marketing purpose and channel;
  • this does not preclude offering an option whereby consent can be given to all specified purposes at the same time, provided it remains possible to give separate consent for only certain purposes;
  • the purpose must be specified – the Authority stated that receiving direct marketing “electronically” is too broad, as it allows for different interpretations;
  • as direct marketing is an umbrella concept, its implementation must also be specified for the sake of transparency (ie, it must specify whether it includes own advertisements or third-party advertisements, as well as specifying the channel); and
  • in the case of offline communication, it is not sufficient to refer to the online privacy notice, because there are data subjects who either do not have access to the internet or for whom it is difficult to find the information on the internet during or before ordering by mail or telephone.

The Schrems II Judgment

The Schrems II judgment is another hot topic. It still raises many issues in Hungary, as no proper standards have been developed for assessing the adequacy of the data-importing country in a practical, smooth, cost-efficient and safe manner (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers). In December 2022, the European Commission launched the process to adopt a new adequacy decision for the EU-US Data Privacy Framework to comply with the Schrems II judgment. Final adoption is expected during 2023, although privacy experts are concerned about its future.

Data Protection Officers

The rules on appointing a data protection officer (DPO) in Hungary are the same as elsewhere in the EU. Appointing a DPO is necessary if:

  • data processing is carried out by a public authority or body (except for the courts when they act in their judicial capacity);
  • the core activity performed consists of processing operations that require regular and systematic monitoring of data subjects on a large scale; or
  • the core activity performed involves the processing of special categories of personal data or data relating to criminal convictions and offences on a large scale.
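The three appointment triggers above can be sketched as a simple decision function. This is a hedged simplification: the boolean inputs stand in for what are in reality legal judgments (eg, whether processing is a “core activity” or “large scale”), and the function name is our own.

```python
# Hedged sketch of the three DPO-appointment triggers listed above
# (Article 37(1) GDPR). The boolean inputs are illustrative simplifications:
# "core activity" and "large scale" are legal judgments, not simple flags.

def dpo_required(is_public_body: bool,
                 acts_in_judicial_capacity: bool,
                 core_large_scale_monitoring: bool,
                 core_large_scale_special_data: bool) -> bool:
    """Return True if any of the appointment triggers applies."""
    # Public authorities and bodies, except courts acting in a judicial capacity
    if is_public_body and not acts_in_judicial_capacity:
        return True
    # Core activities: large-scale monitoring of data subjects, or large-scale
    # processing of special categories / criminal convictions data
    return core_large_scale_monitoring or core_large_scale_special_data

# A private company systematically monitoring data subjects on a large scale:
print(dpo_required(False, False, True, False))  # True
```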

The responsibilities of the DPO stem directly from the GDPR, with the primary responsibilities being as follows:

  • to inform and advise the controller or the processor and the employees who carry out processing about their obligations under data protection laws;
  • to monitor compliance with data protection laws (such as identifying processing activities, analysing and checking the compliance of processing activities, and issuing recommendations on compliance);
  • to provide advice on the data protection impact assessment and monitor its performance;
  • to co-operate with and act as the contact point for the Authority; and
  • to assist in maintaining the records of processing (not an explicit GDPR requirement, but recommended as best practice).

The appointment of the DPO must be notified to the Authority via the Authority’s website.

The Authority has issued numerous explanatory guidelines about appointing DPOs, but they do not usually contain new information compared to the international WP29 Guideline No 243 on DPOs, upon which organisations should rely when deciding whether or not to appoint DPOs.

Legal Bases for Data Processing

Personal data may be processed only if there is adequate legal ground to do so. The GDPR recognises the following six legal grounds:

  • the data subject gives consent;
  • the data processing is necessary for the performance of a contract (to which the data subject is a party) or to take steps at the request of the data subject prior to entering into a contract;
  • the processing is based on a necessity for compliance with a legal obligation to which the controller is subject (such legal obligation must be set out in an act of parliament or a municipal decree, according to the Data Protection Act);
  • the processing is necessary to protect the vital interests of the data subject or of another natural person;
  • the data processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; and
  • the data controller (or a third party) has a legitimate interest in the data processing.

If the data processing involves sensitive personal data, additional conditions must be met (see 2.2 Sectoral and Special Issues).

Privacy by Design and Default

Even in the pre-GDPR era, the Authority considered it important to examine whether companies integrated the basic data protection principles into their processes, so privacy by design and default are not completely new concepts in Hungary.

Privacy by design means that, even in the early stages (eg, when decisions are made) and throughout the entire cycle of the processing, controllers must use appropriate technical and organisational measures to implement the basic data protection principles and address key privacy concerns.

Privacy by default requires controllers to integrate appropriate measures so that data processing by default is limited to an “as-needed” basis in the context of the amount of personal data collected, the duration of processing and access rights.

Privacy Impact Assessments

Controllers must carry out privacy impact assessments (PIAs) in high-risk data processes. The Authority has not issued any specific guidelines about PIAs, but the international WP29 Guideline on this topic (No 248) is relevant in Hungary as well.

In 2019, the Authority published the list of data-processing operations that require a prior PIA to be carried out (the “PIA list”). Several processing activities that include the use of emerging technologies were listed in the document (see 5.1 Addressing Current Issues in Law).

Controllers may freely decide on the PIA methodology they wish to use; however, the Authority recommends using the Hungarian version of the French Data Protection Authority’s PIA software, which it has published on its website.

Privacy Policies

Controllers are obliged to provide thorough information to data subjects about the use of their personal data. The information to be provided differs depending on whether the data is collected directly from the data subject (in which case Article 13 of the GDPR applies) or is obtained from someone else or even created by the controller (in which case Article 14 of the GDPR applies).

In 2016, the Authority issued a guideline on how the controller should prepare external privacy policies. The bottom line is that the information must be provided on a purpose level. Broad data processing purposes such as “HR management” are not acceptable in the eyes of the Authority – the purposes must be specified in a way that enables only one interpretation (eg, “recruitment”). After the purpose is specified, the Authority also expects controllers to display in the privacy notice all the relevant circumstances of the data processing for the given purpose.

As for internal policies, the Authority concludes that a lack of internal policies does not automatically lead to GDPR sanctions, but the controller must implement adequate technical and organisational measures to prove compliance with the GDPR. The controller must decide on its own what measures to take, but such measures may include the preparation of internal policies as well.

Data Subject Rights

The GDPR gives several rights to data subjects to guarantee that they retain control over their personal data. The Authority requires data controllers not only to inform individuals about the following rights, but also to give meaningful information about what each right means, as well as in what situations and how it can be exercised.

  • Right to access: this allows data subjects to request a copy of their personal data or supplementary information about the data processing from the controller. The data subject does not have to justify their request.
  • Right to be forgotten/right to erasure: individuals may request their personal data be deleted in several cases – eg, if it is no longer necessary for the purpose for which it was originally collected, following a successful objection to the processing or if the individual has withdrawn their consent.
  • Right to rectification: the individual can request their inaccurate or incomplete personal data be corrected.
  • Right to restriction of the processing: if the individual exercises this right, the controller’s processing is further limited to storing the personal data. Restriction may be requested, for example, if the data is no longer necessary for the processing but the individual needs it in the future for legal claims.
  • Right to object to the processing: this right relates to purposes where the legal basis is legitimate interest. If the data subject objects, the data controller must check whether the data subject’s interests override those of the data controller.
  • Right to data portability: if the data processing is based on consent or is necessary for the performance of a contract and in both cases it is carried out by automated means, the data subject can request to receive the personal data in a portable format, or can request that the controller transmits the data to another controller.
  • Right to not be subject to a decision based solely on automated decision-making (including profiling): an individual has this right if the decision produces legal effects concerning them or if it similarly significantly affects them.

The Authority has been very active in enforcing data subjects’ rights and has taken the 30-day response deadline seriously. Therefore, it is important for controllers to implement adequate policies on handling data subjects’ requests and also to implement technical/organisational measures so that data subjects’ requests can easily be fulfilled (eg, to use software to localise the data subjects’ personal data in different records).

Anonymised, De-identified and Pseudonymised Personal Data

The GDPR and the Data Protection Act only apply to personal data that allows the direct or indirect identification of a person. Anonymisation means that such connection is lost forever, and therefore anonymised personal data does not fall under the scope of these laws.

De-identification and pseudonymisation are useful data security measures, but because it remains possible to reconnect the personal information with the individual, de-identified and pseudonymised personal data as a general rule remain within the scope of the GDPR.

In the Authority’s view, the data remains personal as long as the data subject is identifiable, and it is not necessarily relevant whether the controller itself can identify the data subject (see 1.7 Key Developments).

Use of New Technology

The GDPR addresses new technology such as profiling, automated decision-making, online tracking, big data and AI (see 5.1 Addressing Current Issues in Law).

Breach of Personal Rights

The Data Protection Act authorises data subjects to bring private actions against data controllers or processors for breaches of privacy laws. They may claim both pecuniary and non-pecuniary damages in front of the court.

In the case of non-pecuniary damages, it is enough to prove that the privacy right of the data subject has been violated; beyond this, no proof of any non-pecuniary disadvantage has to be provided.

Personal Data

The GDPR covers three types of personal data:

  • “normal” personal data, which does not require further safeguards;
  • sensitive personal data; and
  • criminal convictions and offences data.

Sensitive personal data includes data relating to racial or ethnic origin, political opinion, religious or philosophical beliefs, trade union membership, genetic data or biometric data (for the purpose of uniquely identifying a natural person), health data and data about the sex life or sexual orientation of an individual.

Under the Data Protection Act, personal data relating to criminal convictions and offences may be processed – unless the law states otherwise – on the legal basis applicable to special categories of personal data.

In the case of sensitive personal data and personal data relating to criminal convictions and offences, the Authority expects controllers to check whether any additional special condition is fulfilled under Article 9 of the GDPR (apart from the basic six legal grounds under Article 6 of the GDPR). If the controller is unable to demonstrate proper legal grounds this way, the data processing is prohibited.

Hungary also regulates sector-specific data (which is not necessarily sensitive personal data), to which special rules apply under sector-specific acts. The controller may usually invoke Article 6(1)(c) of the GDPR if these acts determine the permitted scope of data processing, including the categories of personal data, the purposes and conditions of the processing, the persons authorised to process the personal data and the duration of processing.

Financial Data

Financial data is regulated in various Hungarian financial acts and comes under professional secrecy rules (such as insurance secrets, bank secrets and securities secrets). The Hungarian financial acts provide detailed rules on confidentiality and the disclosure of financial secrets.

Health Data

Health data is sensitive personal data. In Hungary, Act XLVII of 1997 on the Processing and Protection of Health Care Data and Associated Personal Data (the Health Data Act) provides detailed rules on processing health data. In general, health data can be processed only for a given purpose authorised by the Health Data Act, or if the patient gives explicit consent. The recent amendment of the Health Data Act (which followed the creation of the National eHealth Infrastructure) allows patients direct access to their health data.

Communications Data

Electronic communications data is regulated in detail in Act C of 2003 on Electronic Communications (the Electronic Communications Act). Electronic service providers may process electronic communication data for the purposes set out in the Electronic Communications Act (such as billing) to the extent it is necessary, and in line with privacy by design they must implement appropriate measures to prevent accidental or illegitimate interception of communication.

Voice Telephony and Text Messaging

Traditional voice telephony is regulated in various sector-specific laws (such as consumer and financial regulations), which provide rules on recording calls. In voice-to-voice (non-automated) calls, the user may be called for direct marketing, information, public opinion polling or market research only if they do not object to such communication. The Authority has also provided guidelines on how controllers should provide the necessary privacy information to users over the phone. Voice over Internet Protocol (VoIP) is not explicitly regulated by Hungarian data protection laws, but the future ePrivacy Regulation is expected to set rules in this context.

Text messaging as a form of electronic communication is primarily relevant in terms of Hungarian anti-spam laws, according to which users may not receive electronic advertisements without providing prior consent.

Children’s or Students’ Data

Children’s data in the context of information society services may be processed based on the child’s consent if said child is over 16 years of age. If the child is under 16 years of age, parental consent is required. In the context of offline services, Act V of 2013 on the Civil Code is relevant, which provides rules on the legal capacity of children.

The processing of the data of pupils and students that is used for assessment is included in the PIA list as an activity that results in higher risk.

Employment Data

Employment data is regulated by the Labour Code, which limits the scope of data that may be processed by the employer by defining that an employee may only be requested to disclose such information that is necessary for the establishment, completion or termination of the employment relationship, or for exercising claims arising from the Labour Code.

Internet, Streaming and Video Issues

Regarding internet, streaming and video issues, the only processing activity on which the Authority has issued a statement since the GDPR came into force is the use of cookies. The Authority’s view is in line with European practice and WP29’s Opinion 04/2012 on Cookie Consent Exemption. In June 2018, it stated that cookies may be set on the device of the user based on:

  • the website manager’s legitimate interest – if the cookie is used for the sole purpose of carrying out the transmission of communication over an electronic communications network or is strictly necessary for the provision of the service explicitly requested by the subscriber or user (eg, authentication cookies); or
  • the user’s consent – in basically all other cases.

If the processing is based on consent, the website manager needs to provide sufficient information about the cookies used. This information must include:

  • the name of the cookie;
  • the function of the cookie;
  • the storage period; and
  • personal data processed by the cookies.

Social Media

Social media is not explicitly regulated in Hungary by sector-specific data protection rules. In January 2021, the Authority issued a statement on the lawful use of social media, providing basic rules for website operators who use embedded social media modules (eg, tracking pixels) on their websites, including providing proper privacy notices and consent mechanisms in the context of such modules. Concerning social media, the EDPB published a decision stating that the fulfilment of contractual obligations cannot serve as a legal basis for processing personal data under terms of service for the purpose of behavioural advertising, as this is not a core element of social media services. This decision is likely to steer the Authority’s approach as well.

Search Engines

Search engines are also not explicitly regulated by sector-specific data protection rules. The Authority issued a guideline on how to handle right-to-be-forgotten (RTBF) cases in line with the Costeja Judgment No C-131/12 of the CJEU and WP29 Guideline No 225 on implementing the judgment. There is also Hungarian case law on interpreting the scope of RTBF rules and the delisting criteria.

Online Platform Content

Online platform content (such as hate speech, disinformation and terrorist propaganda) is not specifically regulated in Hungary and usually does not have Hungarian data protection relevance. Online platform providers usually qualify as intermediary service providers and thus must remove any illegal content upon being notified of such by users, within the deadline set by Hungarian laws.

As of January 2022, the Authority may order platform providers to remove certain online content if it has data protection relevance (eg, if the online content seriously violates the privacy right of children or if it relates to special categories of personal data or data relating to criminal convictions).

Act XLVIII of 2008 on Business Advertising Activity still provides that explicit consent is required from the individual for unsolicited electronic marketing communication (via email, fax or SMS), regardless of whether said individual is a recipient in a B2B or B2C context. However, based on the GDPR, the Authority has recognised that legitimate interests may be a proper legal ground for electronic marketing in existing client relationships, provided that an opt-out feature is ensured.

Automated marketing calls are only permitted based on the explicit consent of the user. Non-automated (voice-to-voice) marketing calls are permitted only if the user has not objected to such calls (ie, there is no § or other similar objection mark in the relevant applicable telephone directory).

Hungarian law does not have specific rules on behavioural advertising; thus, the GDPR rules apply. As a strict rule, tracking cookies that allow behavioural advertising may only be set on the user’s device based on prior explicit consent, and the website operator must have full knowledge of third-party cookies on its website. In the context of social media marketing, the Authority has made it clear that website operators must use inactive social plug-ins so that the user may control what information will be transferred from the website to their social media.
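The "inactive plug-in" requirement is typically met with a two-click pattern: the third-party widget script is only injected after the user opts in, so no data flows to the social network beforehand. A minimal sketch, assuming a hypothetical consent API (none of these names come from the Authority's guidance):

```javascript
// Minimal sketch of a "two-click" social plug-in gate (hypothetical API).
// The live widget script is only released once the user has granted consent
// for that vendor; before that, the page shows only a static placeholder.
function createPluginGate() {
  const consented = new Set(); // vendors the user has opted in to

  return {
    // Record the user's explicit opt-in for a vendor (eg, "facebook").
    grant(vendor) {
      consented.add(vendor);
    },
    // Returns the script URL to inject, or null while consent is missing.
    scriptFor(vendor, url) {
      return consented.has(vendor) ? url : null;
    },
  };
}
```

On a real page, a non-null return from `scriptFor` would trigger injection of a `<script>` element in place of the placeholder; until then, the plug-in stays inactive and transfers nothing.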

The Authority has issued a guideline in the workplace privacy context (the “Workplace Guideline”), which includes the basic principles of data processing and numerous special issues, including recruitment, employee monitoring and whistle-blowing operations.

The Authority places high importance on the basic data protection principles in the workplace environment. Workplace data processing purposes must be well specified, and only data that is strictly necessary for the employment relationship can be processed. In general, consent is not a proper legal ground for data processing, due to the subordinate relationship between the employer and the employee. The most common legal grounds used by employers are either legal obligation (under Article 6(1)(c) of the GDPR) or their legitimate interest (when the data processing purpose may not be connected with a specific legal obligation).

The employer has the right to monitor workplace communication, but certain guarantees must be provided. According to the Workplace Guideline, the following are the most important:

  • the employer may not review and store private files in the course of monitoring;
  • the employer must give the employee the opportunity to be present during the check if possible;
  • the employer must consider the proportionality principle and whether a less privacy-invasive measure is available (eg, if websites are blocked, website monitoring may not be needed);
  • the employer must take a gradual approach (eg, if possible, only email titles may be checked first, to filter out private messages, and only then may the content be accessed); and
  • the employee must be properly notified before the respective monitoring takes place.

The Authority has not addressed new emerging technological means (like threat detection, e-discovery or loss prevention programmes), but the above guarantees may apply mutatis mutandis. Also, WP29 Opinion No 2/2017 on data processing at work is relevant in the Hungarian context as well.

In general, the employer must consult with the works council 15 days before implementing any workplace privacy or employee monitoring measure.

Hungary also has a specific whistle-blowing act that provides detailed rules on how the employer may lawfully operate a whistle-blowing system, including rules on the subject matter of the reports, access to reports, confidentiality rules, limitations on complainants and procedural rules.

In Hungarian civil and administrative procedure, no specific standard applies to alleging violations of data protection laws, but the rules on evidence set out in the applicable procedural acts must be respected.

The Authority usually collects evidence by asking the controllers to provide the relevant information and documents. Based on the GDPR “super principle” of accountability, the burden of proof to demonstrate compliance with data protection laws is on the controller.

The Authority’s GDPR enforcement practice has gradually become more punitive, with more of its fines approaching the upper limit of GDPR fines.

In May 2020, the Authority imposed a GDPR fine of HUF100 million (close to EUR260,000) on a Hungarian telecoms company for a security breach, after an ethical hacker reported a security vulnerability to the company. Even though there was no actual theft or leak of personal data, apart from the access by the ethical hacker, the Authority still imposed a large fine, as it considered the vulnerability of the database to be high (identity theft or misuse of personal data could potentially have occurred). This leading case shows that there is some shift in enforcement focus from traditional GDPR issues to data breach management and cybersecurity.

In February 2022, the Authority achieved another breakthrough by imposing a record fine of HUF250 million (approximately EUR640,000) concerning the improper use of AI technology (see 1.7 Key Developments).

Private Litigation

The Hungarian courts do not have specific standards on alleging violations, but the applicable rules on evidence set out in procedural acts must be respected. Apart from written evidence, witnesses and expert opinions are often used in litigation.

The Data Protection Act authorises individuals to bring private actions against data controllers or processors for breaches of data protection laws. From June 2023, class actions may also be filed for GDPR violations. Within the framework of the class action, competent authorities and representative organisations will be entitled to bring lawsuits before the courts and ask for civil law reparations in cases where the unlawful data protection practice affects a large number of consumers.

In line with the GDPR, the Data Protection Act states that the burden of proof in litigation to demonstrate compliance with data protection laws lies with the controller/processor involved as a defendant.

Both damages and injunctive relief may be obtained through the courts. In a recent case, a Hungarian court issued a preliminary injunction in which a newspaper had to remove the names of the owners of a Hungarian company from the online version of its list of richest Hungarians, due to privacy concerns.

In 2018, Act XC of 2017 on the Criminal Procedure (the “New Criminal Procedure Code”) came into effect, which included completely new rules on law enforcement’s access to data and surveillance.

As a general rule, law enforcement authorities may collect information without prior official approval, except where the collection of information would be more privacy-intrusive. For example, law enforcement authorities may file information requests to service providers, but they need the prior approval of the public prosecutor if the information request is directed to financial organisations or electronic communication network service providers. Similarly, law enforcement authorities may only pursue certain covert surveillance activities (eg, covert surveillance of information systems, covert searches, covert surveillance of a specific location, opening mail or other closed packages, and interception) based on prior judicial approval.

Even when the law enforcement agency is authorised to unilaterally pursue covert surveillance activity, it is not sufficient merely to refer to “law enforcement/prosecution purposes”. The New Criminal Procedure Code makes it clear that collecting secret information via concealed devices is possible only if:

  • it is reasonable to assume that the information or evidence sought is essential for the purpose of criminal proceedings and cannot be otherwise obtained (necessity test);
  • the use of the concealed device does not result in the disproportionate restriction of the fundamental rights of the person concerned, or of another person, in relation to the prosecution objective pursued (balance test); and
  • the use of the concealed device will likely result in obtaining information or evidence relating to the crime (relevance test).

The New Criminal Procedure Code also provides numerous other safeguards by specifying – among other things – who may access the data, when the access is possible, what measures may be taken and when the collected data must be erased.

Moreover, following the pattern of the Law Enforcement Directive, the Data Protection Act itself provides detailed rules on how the law enforcement authorities must process the personal data, including rules on privacy by default, data subject rights, data security measures and logging (to make the activities of law enforcement traceable). In this area, the GDPR is not applicable, but the Authority may still impose a fine on the relevant authorities, based on a breach of the Data Protection Act. Such fine is capped at HUF20 million (approximately EUR50,000).

Access to data for national security purposes is regulated in detail in Act CCXV of 1995 on National Security Services (the “National Security Services Act”).

The national security agencies have wider access to data than law enforcement agencies and have particularly wide access to certain service providers’ records and government records. On the other hand, certain covert surveillance activities that are more privacy-intrusive (such as covert surveillance in closed areas or covert surveillance of an information system) are subject to prior judicial approval.

The National Security Services Act provides that collecting secret information is possible only if the information required to perform the national security tasks set out in the National Security Services Act cannot be obtained otherwise. However, unlike the New Criminal Procedure Code, the National Security Services Act does not require the performance of a balance test to examine whether the national security purpose disproportionately restricts personal rights.

Overall, national security agencies have wider power and may collect information based on more flexible rules than those imposed on law enforcement agencies.

The Data Protection Act was amended in July 2018 to regulate data processing for national security purposes. The GDPR is not applicable in this area, but the Authority still supervises whether the national security agency complies with the provisions of the Data Protection Act, and may impose a fine of up to HUF20 million (approximately EUR50,000).

The Data Protection Act provides detailed rules on how the national security agencies must process personal data, although some rules are more flexible for them than they are for law enforcement agencies (eg, data breaches must be notified to the Authority only once the national security interest has ceased to exist, and rules on electronic logging are less rigid).

Hungarian organisations may invoke a foreign EU member state authority’s access request as a legitimate basis to collect and transfer personal data, as long as that authority’s power is properly granted under the respective member state’s law.

Hungarian organisations may transfer personal data to non-EU authorities only if the GDPR conditions on international transfers are met (see 4. International Considerations) – that is, the transfer is based on an adequate level of data protection. This means that, in most cases, a direct request of a non-EU authority is not in itself a legitimate basis upon which to collect and transfer personal data (as most of these authorities could not provide an adequate level of data protection). In such cases, based on Article 48 of the GDPR, Hungarian organisations should generally refuse direct requests from non-EU authorities and refer to existing mutual legal assistance treaties (if there is such an agreement between Hungary and the given state).

In March 2018, the CLOUD Act was adopted by the US Congress to enable US law enforcement agencies to request direct access to electronic data in a cross-border setting. In this context, the EDPB took the position that the CLOUD Act is contrary to the GDPR, and reaffirmed its position that direct requests from US agencies (like other non-EU authorities) are not in themselves a legitimate basis for collecting and transferring personal data.

Hungary does not participate in a CLOUD Agreement with the US, and legal uncertainty will remain as long as the US and the EU do not reach an international agreement on access requests.

In its annual report in 2018, the Authority emphasised that the regulation of intelligence services, and their actual practice, have always been among its top priorities, especially as data subjects are hardly in a position to enforce any of their data subject rights, due to the secrecy of the interventions. The intelligence service has received criticism in Hungary. When the Snowden case was a hot topic, it was even argued in Hungarian privacy circles that the operation of the Hungarian intelligence services raises very similar issues to those in the US.

This topic has now become hotter than ever in Hungary. In summer 2021, the Pegasus scandal came to light, in which Hungarian intelligence services allegedly targeted journalists’ mobile phones with Israeli Pegasus spyware based on the “national security interest”. The Authority inspected the use of Pegasus and found that the intelligence service acted in line with the National Security Services Act. However, some privacy activists consider that the Pegasus scandal shows that the National Security Services Act gives overly broad statutory powers to Hungarian intelligence, without proper privacy guarantees.

Transfers within the European Economic Area (EEA) are permitted and are treated the same way as transfers within Hungary. The same applies to “adequate countries” (ie, those countries deemed by the European Commission as being adequate in terms of the high level of their privacy standards).

Transfers of personal information to any entity outside the EEA or adequate countries are possible only if some additional mechanisms are fulfilled (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers).

International data transfers (outside the EEA and adequate countries) are permitted only if certain additional mechanisms are fulfilled. These additional mechanisms should be examined in the following order.

First, the transferor must examine whether additional privacy safeguards have been taken to achieve an adequate level of privacy protection, including:

  • binding corporate rules for transfers made within company groups;
  • standard contractual clauses (SCCs) approved by the EU Commission;
  • approved certification mechanisms;
  • approved codes of conduct; and
  • individual transfers approved by the Authority.

In light of the Schrems II decision, it must be emphasised that relying on the documentation of these safeguards (eg, merely signing an SCC) is not sufficient. On the contrary, it must be factually assessed and documented that the guarantees provided by the safeguards can indeed be fulfilled in practice. This includes a legal assessment of the data-importing jurisdiction (such as whether access to personal data by public authorities is proportionate and whether there is any effective remedy available to data subjects) and the implementation of supplementary measures (such as encryption) to ensure compliance with the EU level of protection of personal data.

If none of the above special safeguards are taken, the data subject must give their informed consent specifically for the given international transfer.

In the absence of the above safeguards and consent, international transfers are still permitted if any special derogation rules apply under Article 49 of the GDPR, including when the transfer is necessary for:

  • the performance of the contract with the data subject;
  • the protection of the vital interests of an individual;
  • public interest; or
  • the establishment, exercise or defence of legal claims.

Finally, in the absence of derogation rules, an international transfer is still permitted if the following conditions are met:

  • the data transfer is not repetitive;
  • the transfer applies only to a limited number of data subjects;
  • the transferring of the data is necessary for a compelling and overriding legitimate interest of the controller that overrides the interests, rights and freedoms of the data subject;
  • the data controller has examined all the circumstances of the data transfer; and
  • the data controller has provided adequate safeguards regarding the protection of personal data.

Transfers within the EEA/adequate countries are permitted. Transfers to countries outside the EEA/adequate countries are also permitted if the additional mechanisms set out in 4.2 Mechanisms or Derogations that Apply to International Data Transfers are followed; in general, in such cases, no further notification to or approval from the Authority is required.

Certain public records within the scope of national data assets (such as land registry records, ID records, company registers, criminal records and close to 30 other public records) may be maintained only in-country. Data processing activities for certain public bodies (such as government offices or ministries) may also be maintained in-country only.

Similarly, online betting service providers must maintain their servers in Hungary. This means that the data located on their servers must be maintained in-country as well.

In all cases, such data may not be transferred outside Hungary.

There is no Hungarian law that would explicitly require the sharing of software codes, algorithms or similar technical details with any authority. However, Hungarian authorities may ask for such information if it is relevant to a given official process. For example, if the open-source code of some software had vulnerabilities that led to a data breach, the Authority could require the controller to share information about such software.

The following limitations may apply to transfers of data in connection with foreign official access requests.

  • In the absence of a proper data transfer mechanism (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers), a direct request from a non-EU authority is not in itself a legitimate basis upon which to collect and transfer personal data. The organisation should generally refuse such requests and refer to an existing mutual legal assistance treaty (see 3.3 Invoking Foreign Government Obligations).
  • Certain sectoral financial laws (eg, insurance and banking regulations) also suggest that the fulfilment of foreign access requests without the involvement of Hungarian authorities may breach the financial service providers’ obligation of professional secrecy.
  • Foreign access requests must be in line with the Hungarian trade secret regulation (eg, the authority must have a legitimate purpose, such as the prevention of violations of law, and the request must be limited to the extent justified by the purpose).

See 4.6 Limitations and Considerations.

Emerging technologies were a focus of the GDPR when it was being drafted, which led to the handling of various technological legal issues, including automated decision-making and the processing of biometric data. Aside from this, the long-awaited ePrivacy Regulation is expected to tackle issues related to electronic communications, including the hot topics of cookies, direct marketing using online communications, and behavioural advertising (profiling) via tracking.

Big Data Analytics

The Authority has not issued specific guidelines on big data. However, the PIA list includes the processing activity of combining data from various sources for matching and comparison purposes, which is a classic use of big data.

Automated Decision-Making

Since the GDPR became applicable in 2018, companies have needed to follow strict rules when they decide to use automated decision-making (including profiling). As a rule of thumb, individuals have the right not to be subject to decision-making based solely on automated means. Secondly, individuals have the right to receive thorough information about the logic used to make decisions based on their personal data.

The Authority has also added automated decision-making that has a legal effect or other significant effect on individuals to the PIA list.


Profiling is any form of automated processing of personal data in which personal data is used to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. According to this definition, profiling has three core elements:

  • it is automated;
  • it concerns personal data; and
  • the data is used for the evaluation of aspects of the person’s life.

If profiling is used, the data controller should make sure to adhere to the general data protection principles. For example, the profiling must be visible, fair, transparent and in line with data minimisation. The controller should consider using aggregated or anonymised data if it cannot otherwise justify the collection of data.

The Authority has also added profiling activities to the PIA list, such as profiling to assess solvency, profiling using the data of children, or profiling by way of evaluating personal data systematically and on a large scale.

Artificial Intelligence

AI is not yet regulated by law in Hungary, but the government has set up the Coalition on Artificial Intelligence, which aims to create the legal background for using such technology. AI is also considered a hot topic on the European level, with the European Commission proposing the AI Act in April 2021, highlighting the necessity for a European regulatory framework to unify the safeguards provided by member states. The Authority has also addressed the data protection aspects of the use of artificial intelligence (see 1.7 Key Developments).

Internet of Things (IoT)

IoT is not specifically regulated in Hungary, but in 2019 the Authority issued guidance on using a form of IoT: smart energy meters. In this guidance, the Authority followed the logic of the former international soft-law maker, WP29, which also issued an opinion on smart meters.

Furthermore, the PIA list states that a data protection impact assessment needs to be carried out if a public utilities provider uses smart meters that send consumption information via a network.

Biometric Data

Biometric data is considered a special category of personal data. This means that, aside from having a lawful legal ground, a further condition under Article 9 of the GDPR must also be met in order for the data processing to be lawful (see 2.2 Sectoral and Special Issues). In 2016, the Authority discussed the use of biometric systems in the employment context. Although the guideline was issued before the GDPR came into force, it is expected that the Authority will continue to rely on it in the future. The guideline highlighted that four core issues need to be considered before implementing such a system:

  • whether the biometric system is fundamentally necessary to achieve the purpose of the data processing (ease and better cost-efficiency are not considered to be fundamental necessities);
  • whether the system will be sufficiently efficient to fulfil the given need (considering the particular feature of the technology);
  • whether the privacy implications of the system are in proportion to the advantages brought by the system; and
  • whether there are any other solutions that would be less intrusive to privacy.

The Authority also addressed the use of biometric information in an employment context in 2019. In its statement, it concluded that there may be situations where using biometric information can be lawful (eg, in a research lab where employees also work with deadly viruses); however, in general, using biometric systems to monitor employees is neither essential nor the least intrusive method, and therefore is not likely to be lawful.

The PIA list also includes two types of processing of biometric information:

  • where the processing of biometric data for the purpose of uniquely identifying a natural person involves systematic monitoring; and
  • where the processing of biometric data for the purpose of uniquely identifying a natural person concerns vulnerable data subjects, particularly children, employees and mentally ill people.

Facial Recognition

Facial recognition is a use of biometric data, so the same rules apply as for biometric data.


Both the former WP29 and the Hungarian Authority issued opinions on the use of geolocation data to monitor employees through data sent by in-built GPS in company cars. Such geolocation information can be used to track the vehicles based on the employer’s legitimate interest if there is a compelling reason to do so (eg, organising routes for couriers, tracking vehicles that transport goods of great value), but it may not be used to monitor employees outside of their working hours. Thus, employers should allow employees to turn off GPS tracking when they are not working if they are allowed to use the vehicle outside of work. The employees should receive thorough information about GPS tracking, and should also be allowed to object to it.

The PIA list includes geolocation data as a factor that indicates a higher risk for individuals if it is used to monitor or create profiles of people.


The Hungarian drone law came into effect in January 2021. Although it included various rules on the operation of drones, it did not include privacy-related provisions. However, the monitoring or recording of another’s property via unauthorised drone use has become a criminal offence. The PIA list also contains the operation of drones flown above public places or areas open for the public. Aside from the list, the Authority issued a thorough recommendation about the use of drones in 2014.

Digital governance or fair data practice review boards are not established in Hungary.

The Authority has not recently published any enforcement decision in the emerging digital and technology area, nor has Hungarian court practice recently tackled this area.

The Authority recognises parties’ legitimate interest to transfer client databases in corporate transactions, but otherwise it has not yet issued any guidance on how to conduct due diligence in such transactions in a privacy-friendly manner (such as using data anonymisation techniques).

It has become clear in the Hungarian M&A market that the GDPR increases the buyer’s risk in the course of acquisition, which must be properly addressed (eg, by auditing the seller’s GDPR practice and including representations and warranties for data protection and cybersecurity).

In line with the NIS Directive, certain digital service providers (such as online marketplace providers, search engine providers and cloud-based IT service providers) must notify the Hungarian computer security incident response team (the Special Service for National Security) of cybersecurity breaches that could have a major effect on their operation within the EU.

The DMA and DSA

On the legislative level, Hungarian laws are being adapted to enforce the newest EU digital package, namely the Digital Markets Act (DMA) and the Digital Services Act (DSA).

On the policy level, in 2020 the Ministry of Justice established a taskforce called the Digital Freedom Commission to address the regulation of tech companies. In the same year, the Digital Freedom Commission published a white paper on the most crucial fields of the regulation of tech companies, including tax-related, privacy and child protection issues. However, the operation of the Digital Freedom Commission has been suspended due to the adoption of the DSA and DMA.

The Hungarian Competition Authority

Hungarian Authority cases involving big tech companies are emerging. Apart from the privacy cases before the Authority, the Hungarian Competition Authority (HCA) also plays a very active role in focusing on big tech companies.

The HCA has initiated competition supervision proceedings for unfair commercial practices in the context of the data collection practices of Facebook and Viber. The operators advertised their services as being “free”. While it was true that users did not have to pay for these services, Facebook and Viber benefited economically from collecting the users’ data and monitoring the users’ activities.

In the case of Facebook, the HCA considered this commercial practice to be an unfair infringement of competition law, but the Hungarian Supreme Court later overturned this decision. In this respect, the Supreme Court took the view that for the word “free” to be misleading, there has to be a substantive disadvantage for consumers that is capable of directly influencing consumer decisions, and there is no such disadvantage in Facebook’s case. The competition supervision procedure against Viber is still ongoing. In short, a consistent regulatory approach has not yet been worked out in Hungary.

Cybersecurity has become a burning issue in Hungary, in light of the Authority’s recent cybersecurity decision in which it imposed its second biggest fine (see 2.5 Enforcement and Litigation). The key takeaway for companies is that they must pay attention not just to having information security policies, but also to their implementation and regular testing of the effectiveness of the applied security measures.

Although only the general GDPR cybersecurity rule applies for most companies (with no detailed Hungarian cybersecurity regulation), this is still a challenging area as Hungarian security awareness is low and the burden of deciding on the right measures lies entirely with individual companies.

The Information Security Act

The Information Security Act applies for certain organisations (eg, critical service providers) and imposes additional requirements such as the classification of security breaches, the appointment of a security officer, a ban on data transfers outside the EEA, and logging or reporting security breaches (even if such breaches do not involve personal data).

VJT & Partners

Kernstok Károly tér 8
1126 Budapest

+36 1 501 9900

Trends and Developments


Data Breach Management: A Hot Topic in Hungary

In May 2020, the Hungarian Data Protection Authority (the “Authority”) imposed a significant fine of HUF100 million (close to EUR260,000) on one of the key Hungarian telecoms companies, DIGI. This was a milestone in the Authority’s enforcement practice, as the highest fine that it had imposed prior to this in a traditional GDPR enforcement case was HUF30 million.

The DIGI case shows that there is a clear shift in enforcement focus away from traditional GDPR issues towards cybersecurity and data breach management. For example, in 2021 the Authority completed 530 data breach control proceedings, which is nearly 95% of the total number of Authority control proceedings.

The trends show that data breach management has become a hot topic in Hungary, and it is worth examining it in a Hungarian context.

Notification of the Authority

The Authority sets a very low threshold for notification. In general, data controllers must notify the Authority whenever there is reasonable certainty that a data breach has occurred and the breach could have any adverse effect on data subjects (even a minor one).

Article 33 of the GDPR states that it is not necessary to notify the supervisory authority about a data breach when “it is unlikely to result in a risk to the rights and freedoms of data subjects”.

Based on this exemption rule, a more nuanced position can be taken in the context of the “likelihood of an adverse effect”. The following categories are used for differentiation:

  • not occurred – when there is absolute certainty that no adverse effect occurred;
  • not likely – when there is no evidence that an adverse effect has occurred;
  • likely – when it is likely that there will be an adverse effect;
  • highly likely – when it is almost certain that there will be an adverse effect; and
  • occurred – when there is evidence that an adverse effect occurred.

Based on Article 33 of the GDPR, it can be argued that “not occurred” and “not likely” are not reportable categories. However, the Authority adopts a black-and-white approach, whereby anything beyond “not occurred” is reportable in practice.

The Authority takes the position that the data controller must fully exclude the possibility that an adverse effect occurred to exempt itself from notification. In its 2020 Annual Report, the Authority provides the following examples for exemption:

  • an IT system suffered a ransomware attack, but in the course of investigation the independent IT expert concluded that there was no unauthorised access to data and the controller had backup to fully restore the data (encrypted by the virus) within a short period of time;
  • unauthorised access to a database has taken place, but the database is encrypted and the encryption key has not been compromised; and
  • personal data was accidentally forwarded to a trusted partner, who erased the personal data irretrievably.

Notification of the data breach must be made either by post or online. The notification form is very detailed (the Word version runs to 26 pages). Although this could help controllers approach the risk assessment better, it also makes meeting the 72-hour notification deadline more difficult (even if the controller completes the notification in phases).

Risk Assessment

Data controllers are free to choose their own risk assessment method, but the Authority especially recommends that they assess the following criteria:

  • source of risk – eg, whether it is from an internal accident or an external attack, and whether in the latter case there was just an encryption of data or also theft of data, etc;
  • the controller’s environment – eg, the controller must check whether it has proper encryption measures and backups, etc;
  • personal data circumstances – eg, how much data was affected, what kind of data it was, the level of identification of data subjects, etc; and
  • possible adverse effects – eg, whether it could result in identity theft, financial damage, damage to reputation, etc.

Data controllers must select from four levels of severity of data breach in the Authority’s notification form (ie, low, medium, high and very high), and are recommended to apply the European Union Agency for Network and Information Security (ENISA) methodology, which uses the following formula to identify these four levels:

severity of data breach = DPC × EI + CB,

where DPC, EI and CB stand for the following –

  • DPC: data processing context, depending on whether it is simple data, sensitive data, behavioural data or financial data (scores from one to four);
  • EI: ease of identification of the data subject (score from nought to one depending on the level of identification); and
  • CB: circumstances of the breach (scores from nought to one depending on the level of loss of integrity, loss of confidentiality or loss of availability).

Based on this calculation, the severity of the data breach can be low (below two), medium (between two and three), high (between three and four) and very high (above four).
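The calculation above can be sketched in a few lines of code. This is a minimal illustration, not an official implementation: the function name is invented, and the handling of scores falling exactly on a band boundary (two, three or four) is an assumption, as the text does not specify it.

```python
def breach_severity(dpc: float, ei: float, cb: float) -> tuple[float, str]:
    """ENISA-style severity score: SE = DPC x EI + CB.

    dpc: data processing context (scores 1-4)
    ei:  ease of identification of the data subject (0-1)
    cb:  circumstances of the breach (0-1)
    The bands follow the text above; treating boundary values as the
    lower band is an assumption made here for illustration.
    """
    score = dpc * ei + cb
    if score < 2:
        level = "low"
    elif score < 3:
        level = "medium"
    elif score < 4:
        level = "high"
    else:
        level = "very high"
    return score, level

# Hypothetical example: sensitive data (DPC = 3), fully identifiable
# data subjects (EI = 1), loss of confidentiality (CB = 0.5):
print(breach_severity(3, 1, 0.5))  # -> (3.5, 'high')
```

A score of 3.5, being above three, would also point towards notifying the data subjects themselves.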

Depending on the result of the risk assessment, the data controller may decide to notify data subjects (this is highly recommended if the ENISA score is above three). Furthermore, based on the gathered inputs and risk assessment, the data controller may decide on the necessary steps to be taken to mitigate the effects of the data breach.

Notification of the Data Subjects

If the data controller finds that there is a high risk to data subjects, based on risk assessment, the data controller must notify them as well. Prompt notification is highly recommended, especially when the mitigation measure itself requires prompt action from data subjects (eg, a swift change of log-in data).

In this context, when the data controller notifies the Authority about the data breach, it must also notify the Authority whether:

  • the data subjects have been notified about the data breach;
  • the data subjects will be notified about the data breach; or
  • the data subjects will not be notified.

Whichever decision the controller makes, they must be able to demonstrate that the decision was adequate based on the “super principle of accountability” under the GDPR.

Data Breach Management

There is no clear guidance on how to run a proper data breach management process; the Authority always assesses this on a case-by-case basis. The bottom line is that the data controller must do everything in its power to remedy the data breach – ie, mitigate the effects of the data breach and correct its processes to prevent future breaches.

Nevertheless, the Authority recommends that data controllers take the following steps in the course of data breach management:

  • have a proper data breach internal policy in place;
  • have a dedicated data breach management team;
  • involve the senior management and the Data Protection Officer in the process;
  • make in-depth analyses of breaches by engaging a third party (eg, forensic analysis);
  • continuously report and document the process; and
  • work out an action plan to mitigate the effects of breaches and to correct the processes to prevent future breaches.

Authority’s Enforcement Framework

In its enforcement framework, the Authority first conducts an investigation as a preliminary phase of the process to collect evidence. In this phase, the Authority examines the following in particular:

  • whether the data controller met the notification deadlines;
  • whether the data controller carried out an appropriate investigation and took all the necessary steps to remedy the data breach; and
  • whether the data breach occurred due to a lack of proper security measures at the controller’s organisation.

The Authority must complete the preliminary investigation within 60 days, after which it either closes the case, declaring that no data protection rules were breached, or launches an administrative procedure (the main proceeding, in which the Authority may impose fines).

Based on the GDPR’s one-stop-shop rule, in an EU cross-border data breach, the Authority usually suspends the proceeding in order to identify the lead supervisory authority. Once the lead supervisory authority takes over the case, the Authority terminates its proceeding.

The DIGI Case and Its Lessons

In the DIGI case, the Authority imposed a GDPR fine of HUF100 million (close to EUR260,000) for a security breach of a Hungarian telecoms company where an ethical hacker reported the security vulnerability.

DIGI had not fixed a vulnerability in the open-source content management system of its publicly available website, even though the vulnerability had been known for more than nine years and could have been detected and fixed using appropriate tools. The ethical hacker exploited this vulnerability and accessed DIGI’s database containing the personal data of approximately 322,000 data subjects.

DIGI reported the personal data breach and eliminated the vulnerability by installing the necessary updates and deleting the database.

There was no actual theft or leak of personal data, apart from the access gained by the ethical hacker, but the Authority still imposed a large fine, as it considered the vulnerability of the database to have been high (it could potentially have resulted in identity theft or the misuse of personal data).

The case is currently before the Hungarian national courts to rule on the lawfulness of the Authority’s decision.

This is a very instructive example as it shows that it is not enough to have an information security policy in place: data controllers must actually implement adequate security measures. In the absence of adequate security measures, the Authority can impose a high fine even if the data controller managed the data breach adequately. Of course, data controllers might also consider not reporting a breach to the Authority, but this would be the first aggravating factor in the context of a GDPR fine if the Authority were to discover the data breach.


Conclusion

Data breaches have become the Authority’s enforcement priority. The Authority has the strictest enforcement policy in this field (especially in the light of possible fines), so it is highly recommended for businesses to spend more time on working out proper security measures and data breach management.

In addition, despite the uniform GDPR rules, there are several local Hungarian particularities, which make close attention to this area all the more important.

Finally, the recent adoption of the NIS2 Directive will also likely have an important impact on cybersecurity and data breach management in Hungary.

VJT & Partners

Kernstok Károly tér 8
1126 Budapest

+36 1 501 9900
