Data Protection & Privacy 2020 Comparisons

Last Updated March 09, 2020

Contributed By VJT & Partners

Law and Practice


VJT & Partners is a Hungarian commercial law firm located in Budapest, with countrywide coverage and major international referral networks. It is a full-service law firm advising international and domestic corporate clients, and is especially highly ranked in information technology and data protection matters. The privacy team has four professionals and provides legal support on a wide range of privacy issues, from niche challenges through complex GDPR audit projects to legal representation of clients before the Data Protection Authority. VJT & Partners won the Wolters Kluwer “Best Data Protection Team Of The Year” award in 2018 and 2019 based on its assistance in creating hi-tech solutions for GDPR challenges, and especially for the support it provided to a software developer company in developing Data Hawk, software that screens companies’ data processes and makes proposals on an automated basis (without human intervention).

Legal Background

The basis of data protection can be found in the Hungarian constitution, which states that everybody has the right to privacy and that an independent authority shall oversee the protection of personal data.

The major laws in the data protection field are Act CXII of 2011 on the Right of Informational Self-Determination and on Freedom of Information (Data Protection Act) and the General Data Protection Regulation (GDPR).

To implement the GDPR and Directive (EU) 2016/680 of the European Parliament and of the Council (Law Enforcement Directive), the Data Protection Act was comprehensively amended in July 2018, and now contains three groups of provisions:

  • additional procedural and substantive rules for data processing that falls under the scope of the GDPR (where the GDPR itself permits the application of national laws or derogation therefrom);
  • rules for data processing that does not fall under the scope of the GDPR; and
  • rules for data processing for law enforcement, national security and national defence purposes.

Enforcement Environment

The most important administrative sanction is the fine, which may be up to EUR20 million or 4% of the total annual turnover (whichever is higher). In addition to the administrative GDPR sanctions, Hungarian laws also provide for other types of sanctions:

  • civil sanctions – individuals may bring private actions against data controllers and processors for violations of data protection rules. An individual may claim both pecuniary damages and non-pecuniary damages; and 
  • criminal sanctions – criminal penalties may apply if the abuse of personal data is committed for financial gain, or if it causes significant detriment to individuals.

The authority responsible for monitoring the application and enforcement of Hungarian data protection laws is the National Authority for Data Protection and Freedom of Information (the Authority).

Within its advisory powers, the Authority is particularly active in advising lawmakers on legislative measures in the data protection area. The Authority also issues recommendations to controllers and the general public from time to time, although it has highlighted several times that the European Data Protection Board (EDPB) or its predecessor, the Article 29 Working Party (WP29), is the main body entrusted with interpreting the GDPR.

The Authority also has various authorisation powers in line with the GDPR, but these powers are rarely used, as they are rather specific (such as approval of binding corporate rules or approval of codes of conduct or certifications).

Apart from fines, the Authority may impose several other corrective measures, with the following being particularly common:

  • reprimands to controllers or processors (where data processing has infringed Hungarian data protection laws);
  • ban on data processing;
  • ordering the controller or processor to comply with data subject right requests;
  • ordering the controller or processor to bring their operations into compliance with data protection laws;
  • ordering the controller to communicate the data breach to the affected data subjects; and
  • ordering the erasure or rectification of personal data, or a restriction on data processing.

The Authority may conduct audits and has wide investigatory powers. Investigations are usually initiated upon a complaint, but the Authority may also initiate them ex officio.

In its enforcement framework, the Authority has two main kinds of procedures:

  • Investigation – this is a preliminary phase in which the Authority aims to collect sufficient evidence, based on which it decides whether or not to launch an administrative procedure. After the Authority has mapped all the relevant circumstances of a possible breach of data protection laws, it can either:
    1. close the case and declare a lack of breach of data protection rules;
    2. call the controller to remedy the unlawful data processing within 30 days; or
    3. launch an administrative procedure (if the controller did not remedy the situation within 30 days or if the gravity of the breach justifies the launch of the administrative procedure).
  • Administrative procedure – this is the main enforcement procedure, in which the Authority may impose fines or other corrective measures. This procedure may be launched even in the absence of a prior preliminary investigation phase.

Both procedures may be launched ex officio or based on a complaint (but in the administrative procedure only the data subject concerned can file a complaint).

In general, the Authority has a broad selection of investigatory powers, including making on-site visits and accessing equipment used in the course of the data processing. The Authority usually provides very short deadlines for controllers to present the GDPR-compliant documentation, so the GDPR’s accountability principle must be taken seriously.

Controllers and processors may challenge the Authority’s decision on the merits before the Budapest-Capital Regional Court. This legal remedy does not by itself have suspensive effect.

As Hungary is an EU Member State, the Hungarian lawmakers adopted several GDPR implementation packages to bring Hungarian law in line with the GDPR. The Authority has also confirmed that the GDPR prevails if there is any direct conflict between it and the Hungarian privacy rules.

Moreover, the Authority follows the guidelines, opinions and similar soft law issued by the EDPB, and respects that the EDPB is the main body for interpreting the GDPR.

In cross-border proceedings, the Authority also co-operates with other Member States: it suspends the proceeding until the lead supervisory authority makes its decision based on the one-stop shop rule of the GDPR.

Although the Authority is rather strict, privacy awareness in Hungary is still in its infancy, so the role of NGOs and SROs remains marginal.

The one NGO that aims to tackle this is MADAT (Hungarian Association for Privacy Awareness). There are also other NGOs, such as TASZ (Hungarian Civil Rights Union), which generally assist in the promotion and enforcement of human rights, and as part of this work represent those clients whose right to privacy has been violated. TASZ also regularly shares privacy-related educational materials on its website.

In certain sectors, such as marketing, there are organisations that also cover sector-specific areas of privacy; for example, IAB Hungary and the Hungarian Data & Marketing Association share research and news on the topic of online marketing.

The Authority is one of the strictest authorities in the context of GDPR interpretation, with an especially granular approach to purpose specification and data minimisation (see 2.1 Omnibus Laws and General Requirements). This has made business management difficult for international companies that wanted to use one uniform privacy policy across different countries, as this was hardly possible in Hungary due to the local expectations of the Authority.

It remains to be seen whether the international EDPB practice will bring any change in this context. As mentioned, the Authority recognises that the EDPB is solely authorised to interpret GDPR, but the Authority may continue its old practice in any matter that is not explicitly regulated by the EDPB.

On the other hand, the GDPR enforcement practice has not been aggressive so far, as the Authority’s fines have been rather low compared to the upper limit of GDPR fines (please see 2.5 Enforcement and Litigation).

As Hungary has recently implemented the GDPR, numerous key developments have occurred in the last 12 months. In April 2019, the Hungarian Parliament adopted the second GDPR implementation package, the so-called GDPR Omnibus Act, by amending 86 sectoral acts in dozens of sectors, including finance, healthcare and online marketing.

In 2019, the Hungarian Authority made 36 decisions public and imposed its first GDPR fines, mostly for data breaches and the mishandling of data subject right requests.

Due to the increased number of Hungarian data protection officers, the Authority organised its annual DPO conference online for the first time in 2019 and made the training video materials public, covering numerous important GDPR matters such as the legitimate interest test, data subject rights, handling data breaches and data protection impact assessments.

In June 2019, the Supreme Court’s judgment in case BH2019/272 shook the Hungarian privacy world, as it took a much more flexible interpretation of personal data than has been common in Hungarian data protection practice.

Namely, the Authority takes a broad interpretation of the notion of personal data by using the "absolute approach", according to which data remains personal data as long as the data subject remains identifiable by the controller or any other person.

In contrast, the Supreme Court used the "relative" approach by narrowing down the question to whether the data subject is identifiable by the controller. The Supreme Court found that an organisation is not a data controller and does not process "personal data" if it processes pseudonymised medical data without the identification key (please see 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation).

One of the key future issues is whether the Authority keeps its own former practice or relies more on the interpretation of the EDPB. It is interesting that privacy notices were not in the Authority’s focus in 2019, although they were one of its enforcement priorities for years. It seems that the Authority tried to avoid any confrontation of its old practice with EDPB/WP29 guidelines and focused more on areas that dictate simpler or more uniform logic, like data breaches and handling data subject right requests. On the other hand, as long as there are no specific EDPB guidelines, the Authority may use only its own practice.

The other key issue is whether or not the Authority accepts the Supreme Court’s more flexible interpretation of personal data. It would be of pivotal importance to clarify this as there is currently a legal uncertainty regarding when an organisation falls under the scope of the GDPR.

Data Protection Officers

The rules on appointing a data protection officer (DPO) in Hungary are the same as anywhere else in the EU. Appointing a DPO is necessary if:

  • the processing is carried out by a public authority or body (except for courts when they act in their judicial capacity);
  • the core activity performed consists of processing operations that require regular and systematic monitoring of data subjects on a large scale; or
  • the core activity performed involves the processing of special categories of personal data or data relating to criminal convictions and offences on a large scale.

The DPO must be announced to the Authority via its website.

The Authority issued numerous explanatory guidelines about appointing DPOs, but they usually do not contain new information compared to the international WP29 Guideline No 243 on DPOs. When deciding whether or not to appoint DPOs, organisations should rely on this guideline.

Legal Bases for Data Processing

Personal data may be processed only if there is an adequate legal ground. The GDPR recognises the following six legal grounds:

  • the data subject gives consent;
  • the data processing is necessary for the performance of a contract (to which the data subject is party) or to take steps at the request of the data subject prior to entering into a contract;
  • the processing is based on a necessity for compliance with a legal obligation to which the controller is subject (such legal obligation must be set out in an act of the parliament or a municipal decree, according to the Data Protection Act);
  • the processing is necessary to protect the vital interests of the data subject or of another natural person;
  • the data processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; and
  • the data controller (or a third party) has a legitimate interest for the data processing.

If the data processing involves sensitive personal data, additional conditions apply in order for the processing to be lawful (please see 2.2 Sectoral and Special Issues).

Privacy By Design and Default

Even in the pre-GDPR era, the Authority considered it important to examine whether companies integrated the basic data protection principles into their processes, so privacy by design and default are not completely new concepts in Hungary.

Privacy by design means that, even in the early stages (eg, when decisions are made) and through the entire cycle of the processing, controllers must use appropriate technical and organisational measures to implement the basic data protection principles and address key privacy concerns.

Privacy by default requires controllers to integrate appropriate measures so that data processing by default is limited on an “as-needed” basis in the context of the amount of personal data collected, the duration of processing and access rights.

Privacy Impact Assessments

Controllers must carry out privacy impact assessments (PIAs) for high-risk data processing operations. The Authority has not issued any specific guidelines on PIAs, but the international WP29 guideline on this topic (No 248) is relevant in Hungary as well.

In 2019 the Authority published the list of data processing operations that require prior PIA to be carried out (the PIA list). Several processing activities that include the use of emerging technologies were listed in the document (please see 5.1 Addressing Current Issues in Law).

Controllers may freely decide on the PIA methodology they wish to use; however, the Authority recommends using the Hungarian version of the French data protection authority’s PIA software, which it has published on its website.

Privacy Policies

Controllers are obliged to provide thorough information to data subjects about the use of their personal data. The provision of information differs if it is collected from the data subject directly (in this case Article 13 of the GDPR applies) or if the data is obtained from someone else – or even created by the controller (Article 14 of the GDPR applies).

In 2016, the Authority issued a guideline on how the controller should prepare external privacy policies. The bottom line is that the information must be provided on a purpose level. Broad data processing purposes such as “HR management” are not acceptable in the eyes of the Authority – the purposes must be specified in a way that only one interpretation is possible (such as “recruitment”). After the purpose is specified, the Authority also expects controllers to display in the privacy notice all the relevant circumstances of the data processing for the given purpose.

As to internal policies, the Authority concludes that a lack of internal policies does not automatically lead to GDPR sanctions, but the controller must implement adequate technical and organisational measures to prove compliance with the GDPR. The controller must decide on its own what measures to take, but such measures may include the preparation of internal policies as well.

Data Subject Rights

The GDPR gives several rights to data subjects to guarantee that they regain control over their personal data. The Authority requires data controllers not only to inform the individuals about the following rights, but also to give meaningful information about what the right means, and in what situations and how it can be exercised.

  • Right to access: this allows data subjects to request a copy of their personal data or supplementary information about the data processing from the controller. The data subject does not have to justify his/her request.
  • Right to be forgotten/right to erasure: individuals may request their personal data to be deleted in several cases – eg, if it is no longer necessary for the purpose for which it was originally collected, following a successful objection to the processing or if the individual has withdrawn his/her consent.
  • Right to rectification: the individual can request his/her inaccurate or incomplete personal data to be corrected.
  • Right to restriction of the processing: if the individual exercises this right, the controller’s processing is limited to storing the personal data. Restriction may be requested, for example, if the data is no longer necessary for the processing but the individual needs it for future legal claims.
  • Right to object to the processing: this right relates to purposes where the legal basis is legitimate interest. If the data subject objects, the data controller needs to check whether the data subject’s interests override those of the data controller.
  • Right to data portability: if the data processing is based on consent or is necessary for the performance of a contract, and in both cases is carried out by automated means, the data subject can request to receive the personal data in a portable format, or can request the controller to transmit the data to another controller.
  • Right not to be subject to a decision based solely on automated decision-making (including profiling): an individual has this right if the decision produces legal effects concerning him/her or similarly significantly affects him/her.

In 2019, the Authority was very active in enforcing data subject rights, and took the 30-day response deadline seriously. Therefore, it would be important for controllers to implement adequate policies on handling data subject requests and also to implement technical/organisational measures so that the data subject requests could be easily fulfilled (eg, to localise the data subjects’ personal data in different records using software).

Anonymised, De-identified and Pseudonymised Personal Data

The GDPR and the Data Protection Act only apply to personal data that allows the direct or indirect identification of a person. Anonymisation means that such connection is lost forever, and therefore anonymised personal data does not fall under the scope of these laws.

De-identification and pseudonymisation are good methods of enhancing data security, but they do not remove the possibility of reconnecting the data with the person, so de-identified and pseudonymised personal data as a general rule remain under the scope of the GDPR.

In the Authority’s view, the data remains personal as long as the data subject is identifiable, and it is not necessarily relevant whether the controller itself can identify the data subject (please see 1.7 Key Developments and 2.5 Enforcement and Litigation).

Use of New Technology

The GDPR addresses new technology such as profiling, automated decision-making, online tracking, Big Data and AI (please see 5.1 Addressing Current Issues in Law).

Breach of Personal Rights

The Data Protection Act authorises data subjects to bring private actions against data controllers or processors for breaches of privacy laws. They may claim both pecuniary and non-pecuniary damages in front of the court.

In case of non-pecuniary damages, it is enough to prove that the privacy right of the data subject has been violated; beyond this, no proof of any non-pecuniary disadvantage has to be provided.

The GDPR distinguishes three types of personal data:

  • “normal” personal data, which does not require further safeguards;
  • sensitive personal data; and
  • criminal convictions and offences data.

Sensitive personal data includes data relating to racial or ethnic origin, political opinion, religious or philosophical beliefs, trade union membership, genetic data or biometric data (for the purpose of uniquely identifying a natural person), health data and data about the sex life or sexual orientation of the individual.

Under the Data Protection Act, personal data relating to criminal convictions and offences may be processed – unless the law states otherwise – on the legal basis applicable to special categories of personal data.

In the case of sensitive personal data and personal data relating to criminal convictions and offences, the Authority expects the controllers to check whether any additional special condition is fulfilled under Article 9 of the GDPR (apart from the basic six legal grounds under Article 6 of the GDPR). If the controller is unable to demonstrate proper legal grounds this way, the data processing is prohibited.

Hungary also regulates sector-specific data (which is not necessarily sensitive personal data), for which special rules apply under the sector-specific acts. The controller may usually invoke Article 6(1)(c) of the GDPR if these acts determine the permitted scope of data processing, including the categories of personal data, the purposes and conditions of the processing, the persons authorised to process the personal data and the duration of processing.

Financial data is regulated in various Hungarian financial acts and comes under professional secrecy rules (such as insurance secrets, bank secrets, and securities secrets). These acts apply not only to legal entities, but also to natural persons, so financial data remains under the scope of the GDPR. The Hungarian financial acts provide detailed rules on confidentiality and the disclosure of financial secrets.

Health data is sensitive personal data. In Hungary, Act XLVII of 1997 on the Processing and Protection of Health Care Data and Associated Personal Data (the Health Data Act) provides detailed rules on processing health data. In general, health data can be processed only for a given purpose authorised by the Health Data Act, or if the patient gives explicit consent. The recent amendment of the Health Data Act (which followed the creation of the National eHealth Infrastructure) allows patients direct access to their health data.

Electronic communication data is regulated in detail in Act C of 2003 on Electronic Communications (the Electronic Communications Act). Electronic service providers may process electronic communication data for the purposes set out in the Electronic Communications Act (such as billing) to the extent it is necessary, and in line with privacy by design they must implement appropriate measures to prevent accidental or illegitimate interception of communication.

Traditional voice telephony is regulated in various sector-specific laws (such as consumer and financial regulations), which provide rules on recording calls. In voice-to-voice (not-automated) calls, the user may be called for direct marketing, information, public opinion polling or market research only if he/she did not object to such communication. The Authority has also provided guidelines on how the controllers should provide the necessary privacy information to users over the phone. Voice over IP is not explicitly regulated by Hungarian data protection laws, but in this context the future ePrivacy Regulation is expected to set rules.

Text messaging as a form of electronic communication is primarily relevant in terms of Hungarian anti-spam laws, according to which users may not receive electronic advertisements without providing prior consent.

Children’s data in the context of information society services may be processed based on the child’s own consent if he/she is at least 16 years old; if the child is under 16, parental consent is required. In the context of offline services, Act V of 2013 on the Civil Code is relevant, which provides rules on the legal capacity of children.

The processing of data of pupils and students which is used for assessment is included in the PIA list as an activity that results in higher risk.

Employment data is regulated by the Labour Code, which limits the scope of data that may be processed by the employer by defining that an employee may only be requested to disclose such information that is necessary for the establishment, completion or termination of the employment relationship, or for exercising claims arising from the Labour Code.

Regarding internet, streaming and video issues, the only processing of personal data on which the Authority has provided a statement since the GDPR came into force is the use of cookies. The Authority’s view is in line with the European practice and Opinion 04/2012 on Cookie Consent Exemption by the WP29. In June 2018, it stated that cookies may be set on the device of the user based on:

  • the website manager’s legitimate interest – if the cookie is either used for the sole purpose of carrying out the transmission of communication over an electronic communications network or is strictly necessary for the provision of the service explicitly requested by the subscriber or user (eg, authentication cookies); or
  • the user’s consent – in basically all other cases.

If the processing is based on consent, the website’s manager needs to provide sufficient information about the cookies used. This information must include:

  • the name of the cookie;
  • the function of the cookie;
  • the storage period; and
  • personal data processed with the cookies.

Social media is not explicitly regulated in Hungary by sector-specific data protection rules. In 2018, the Authority examined whether the household GDPR exemption applies in the context of Facebook groups, finding, for example, that sharing photos in a public group falls under the scope of the GDPR.

Search engines are also not explicitly regulated by sector-specific data protection rules. The Authority issued a guideline on how to handle right to be forgotten (RTBF) cases in line with the Costeja Judgment No C-131/12 of the CJEU and the WP29 guideline No 225 on implementing the judgment. There is also Hungarian case law on interpreting the scope of RTBF rules and the delisting criteria.

Internet content (such as hate speech, disinformation and terrorist propaganda) is not specifically regulated in Hungary and usually does not have Hungarian data protection relevance. However, providers of social media and other online platforms of such content qualify as intermediary service providers and thus must remove any illegal content upon notice of the users within the deadline set by Hungarian laws.

Act XLVIII of 2008 on Business Advertising Activity still provides that explicit consent is required from the individual for unsolicited electronic marketing communication (via email, fax or SMS), regardless of whether he/she is a recipient in a B2B or B2C context. However, based on the GDPR, the Authority has recognised that legitimate interest may be a proper legal ground for electronic marketing in existing client relationships.

Automated marketing calls are only permitted based on the explicit consent of the user. Non-automated (voice-to-voice) marketing calls are permitted only if the user has not objected to such calls (ie, there is no objection mark next to the user’s entry in the relevant phone directory).

Hungarian law does not have specific rules on behavioural advertising; thus, the GDPR rules apply. As a strict rule, tracking cookies that allow behavioural advertising may only be set on the user’s device based on prior explicit consent, and the website operator must have full knowledge of third-party cookies on its website. In the context of social media marketing, the Authority has made it clear that website operators must use inactive social plug-ins so that the user may control what information will be transferred from the website to social media.

The Authority has issued a guideline in the workplace privacy context (the Workplace Guideline), which includes the basic principles of data processing and numerous special issues, including recruitment, employee monitoring and whistle-blowing operations.

The Authority places high importance on the basic data protection principles in the workplace environment. Workplace data processing purposes must be well specified, and only data that is strictly necessary for the employment relationship may be processed. In general, consent is not a proper legal ground for data processing, due to the subordinate relationship between the employer and employee. The most common legal grounds used by employers are either legal obligation (under Article 6(1)(c) of the GDPR) or their legitimate interest (where the data processing purpose cannot be connected with a specific legal obligation).

The employer has the right to monitor workplace communication, but certain guarantees must be provided. According to the Workplace Guideline, the following are the most important:

  • the employer may not review and store private files in the course of monitoring;
  • the employer must give the employee the possibility to be present during the check if possible;
  • the employer must consider the proportionality principle and whether a less privacy-invasive measure is available (eg, if websites are blocked, website monitoring may not be needed);
  • the employer must take a gradual approach (eg, if possible, only emails’ titles may be checked first, to filter out private messages, and only then the content); and
  • the employee must be properly notified before the respective monitoring.

The Authority has not addressed new emerging technological means (like threat detection, e-discovery or loss prevention programmes), but the above guarantees may apply mutatis mutandis. Also, WP29 Opinion No 2/2017 on data processing at work is relevant in the Hungarian context as well.

In general, the employer must consult the works council 15 days before implementing any workplace privacy or employee monitoring measure.

Hungary also has a specific whistle-blowing act that provides detailed rules on how the employer may lawfully operate a whistle-blowing system, including rules on the subject matter of the reports, access to reports, confidentiality rules, limitations on complainants, and procedural rules.

In the Hungarian civil and administrative procedure, no specific standard applies to alleging violations of data protection laws, but the Hungarian rules on evidence set out in the applicable procedural acts must be respected.

The Authority usually collects evidence by asking the controllers to provide the relevant information and documents. Based on the GDPR "super principle" of accountability, the burden of proof is on the controller to demonstrate compliance with data protection laws.

The GDPR enforcement practice has not been aggressive so far, with fines being rather low compared to the upper limit of GDPR fines. Most of the fines were below HUF3 million (approximately EUR8,900). In May 2019, the Authority imposed its highest fine: HUF30 million (approximately EUR89,000) against Sziget, the biggest cultural and musical festival of Hungary.

As part of its practice, Sziget carried out thorough screenings of the festival goers, which included photocopying IDs and taking photos of the guests upon entrance. The main reasons for the fine were the following:

  • Sziget breached the storage limitation principle, as there was no justification for storing the data for one year after the end of the festival;
  • Sziget breached the data minimisation principle, as storing only the image and the name of the festival-goer would have been justifiable for entry purposes; all other data (eg, citizenship, type, number and expiration date of the ID, date of birth and gender) was deemed excessive;
  • the given consent was not voluntary, as the participants would have been denied entry – and therefore the service – if they had refused to give consent; and
  • proper security screening could be achieved without data processing (eg, using metal detectors or physical screening).

Private Litigation

The Hungarian courts do not apply specific standards to alleging violations, but the applicable rules on evidence set out in the procedural acts must be respected. Apart from written evidence, witnesses and expert opinions are often used in litigation.

The Data Protection Act authorises individuals to bring private actions against data controllers or processors for breaches of data protection laws. Class actions, however, are not allowed.

In line with the GDPR, the Data Protection Act states that the burden of proof in litigation to demonstrate compliance with data protection laws lies with the controller/processor involved as a defendant.

In June 2019, the Supreme Court (Curia) delivered a decision of pivotal importance for the interpretation of personal data. The case involved the storage of medical data in pseudonymised form at a national level.

The key question was whether medical data remains personal data if the organisation that receives it in pseudonymised form does not have the means to re-link the data to the person to whom it relates.

The Curia concluded that, even though there was a theoretical possibility of re-linking the data to the person, the organisation itself was not in a position to carry out such re-linking without further information, and was legally prohibited from doing so. Thus, the organisation did not have the means to reconnect the information with the person, and consequently did not process personal data and did not fall under the scope of data protection laws.

In 2017, Act XC of 2017 on the Criminal Procedure (New Criminal Procedure Code) was adopted, which included completely new rules on law enforcement’s access to data and surveillance.

As a general rule, law enforcement authorities may collect information without prior official approval, except where the collection of information would be more privacy-intrusive. For example, law enforcement authorities may submit information requests to service providers, but they need the prior approval of the public prosecutor if the information request is directed to financial organisations or electronic communication network service providers. Similarly, law enforcement authorities may only pursue certain covert surveillance activities (eg, covert surveillance of information systems, covert searches, covert surveillance of a specific location, opening mail or other closed packages, and interception) based on prior judicial approval.

Even when the law enforcement agency is authorised to unilaterally pursue covert surveillance activity, it is not sufficient merely to refer to “law enforcement/prosecution purposes”. The New Criminal Procedure Code made it clear that collecting secret information via concealed devices is possible only if:

  • it is reasonable to assume that the information or evidence sought is essential for the purpose of criminal proceedings and cannot be otherwise obtained (necessity test);
  • the use of the concealed device does not result in a disproportionate restriction of the fundamental right of the person concerned or of another person in relation to the prosecution objective pursued (balance test); and
  • the use of the concealed device will likely result in obtaining information or evidence relating to the crime (relevance test).

The New Criminal Procedure Code also provides numerous other safeguards by – among others – specifying who may access the data, when the access is possible, what measures may be taken and when the collected data must be erased.

Moreover, following the pattern of the Law Enforcement Directive, the Data Protection Act itself provides detailed rules on how the law enforcement authorities must process the personal data, including rules on privacy by default, data subject rights, data security measures and logging (to make the activities of law enforcement traceable). In this area, the GDPR is not applicable, but the Authority may still impose a fine on the relevant authorities, based on a breach of the Data Protection Act. Such fine is capped at HUF20 million (approximately EUR59,000).

Access to data for national security purposes is regulated in detail in Act CXXV of 1995 on National Security Services (National Security Services Act).

The national security agencies have wider access to data than law enforcement agencies, and have particularly wide access to certain service providers' records and governmental records. On the other hand, certain covert surveillance activities that are more privacy-intrusive (such as covert surveillance in closed areas or covert surveillance of an information system) are subject to prior judicial approval.

The National Security Services Act provides that collecting secret information is possible only if the information required to perform the national security tasks set out in the Act cannot be obtained otherwise. However, unlike the New Criminal Procedure Code, the National Security Services Act does not require a balance test to examine whether the national security purpose disproportionately restricts personality rights.

Overall, national security agencies have wider powers and may collect information based on more flexible rules than law enforcement agencies. Ultimately, however, privacy is protected by the limits set out in the Data Protection Act.

The Data Protection Act was amended in July 2018 to adequately regulate data processing for national security purposes. In this area, the GDPR is not applicable, but the Authority still supervises whether the national security agency complies with the provisions of the Data Protection Act, and may impose a fine of up to HUF20 million (approximately EUR59,000).

Similar to law enforcement agencies, the Data Protection Act provides detailed rules on how the national security agencies must process the personal data, though some rules are more flexible for them (eg, data breaches must be notified to the Authority only once the national security interest has ceased to exist, and rules on electronic logging are less rigid).

Hungarian organisations may invoke a foreign EU Member State authority’s access request as a legitimate basis to collect and transfer personal data, as long as the authority’s power is properly granted under the respective Member State’s law.

Hungarian organisations may transfer personal data to non-EU authorities only if the GDPR conditions on international transfers are met (please see 4 International Considerations) – ie, the transfer is based on an adequate level of data protection. This means that, in most cases, a direct request of a non-EU authority is not in itself a legitimate basis upon which to collect and transfer personal data (as most of these authorities could not provide an adequate level of data protection). In such cases, based on Article 48 of the GDPR, Hungarian organisations should generally refuse direct requests of non-EU authorities and refer to existing mutual legal assistance treaties (if there is such an agreement between Hungary and the given state).

In March 2018, the US Congress adopted the Cloud Act, giving US law enforcement agencies the possibility to request direct access to electronic data in a cross-border setting. In this context, the EDPB took the position that the Cloud Act is contrary to the GDPR, and reaffirmed its position that direct requests from US agencies (like those from other non-EU authorities) are not in themselves legitimate bases for collecting and transferring personal data.

Hungary has not entered into an agreement with the USA under the Cloud Act, and the legal uncertainty will remain as long as the USA and the EU do not reach an international agreement on access requests.

In its annual report in 2018, the Authority emphasised that the regulation of intelligence and its actual practice have always been among its top priorities, especially as data subjects are hardly in a position to enforce any of their data subject rights, due to the secrecy of the interventions. The intelligence services have also received criticism in Hungary: when the Snowden case was a hot topic, it was even argued in Hungarian privacy circles that Hungarian intelligence raises very similar issues to those in the US. The Authority welcomes the new rules in this area, as the means and methods of information gathering are more precisely defined than in the previous regulation. It remains to be seen, however, what the practice will bring.

Transfers within the European Economic Area (EEA) are permitted and are treated the same way as transfers within Hungary. The same applies to transfers to adequate countries (countries deemed by the EU Commission to provide an adequate level of privacy protection).

Transfers of personal information to any entity outside the EEA and the adequate countries are possible only if certain additional mechanisms are fulfilled (please see 4.2 Mechanisms That Apply to International Data Transfers).

International data transfers (outside the territories of the EEA and adequate countries) are permitted only if certain additional mechanisms are fulfilled. These additional mechanisms should be examined in the following order.

First, the transferor must examine whether additional privacy safeguards are in place to achieve an adequate level of privacy protection, including:

  • Binding Corporate Rules for transfers made within company groups;
  • Privacy Shield for transfers to US companies/organisations;
  • Standard Contractual Clauses approved by the EU Commission;
  • approved certification mechanism;
  • approved code of conduct; and
  • individual transfers approved by the Authority.

If none of the above special safeguards are taken, the data subject must give his/her informed consent specifically for the given international transfer.

In the absence of the above safeguards and consent, international transfers are still permitted if any special derogation rules apply under Article 49 of the GDPR, including when the transfer is necessary for:

  • the performance of the contract with the data subject;
  • the protection of the vital interests of an individual;
  • public interest; or
  • the establishment, exercise or defence of legal claims.

Finally, in the absence of derogation rules, the international transfer is still permitted if the following conditions are met:

  • the data transfer is not repetitive;
  • the transfer applies only to a limited number of data subjects;
  • the transfer is necessary for a compelling legitimate interest of the controller that is not overridden by the interests, rights and freedoms of the data subject;
  • the data controller has assessed all the circumstances of the data transfer; and
  • the data controller has provided suitable safeguards for the protection of the personal data.

Transfers within the EEA/adequate countries are permitted. Transfers to countries outside the EEA/adequate countries are also permitted if the additional mechanisms set out in 4.2 Mechanisms That Apply to International Data Transfers are followed; in general in such cases, no further notification to or approval from the Authority is required.

Certain public records within the scope of national data assets (such as land registry records, ID records, company register, criminal records and close to 30 other public records) may be maintained only in-country.

Similarly, online betting service providers must maintain their servers in Hungary. This means that the data located on their servers must be maintained in-country as well.

In both cases, data may not be transferred outside of Hungary.

There is no Hungarian law that would explicitly require sharing software code, algorithms or similar technical details with any authority. However, Hungarian authorities may ask for such information if it is relevant to a given official procedure. For example, if the source code of software had vulnerabilities that led to a data breach, the Authority may require the controller to share information about that software.

The following limitations may apply to transfers of data in connection with foreign official access requests:

  • In the absence of a proper data transfer mechanism (please see 4.2 Mechanisms That Apply to International Data Transfers), a direct request from a non-EU authority is not in itself a legitimate basis upon which to collect and transfer personal data. The organisation should generally refuse such requests and refer to an existing mutual legal assistance treaty (please see 3.3 Invoking a Foreign Government).
  • Certain sectoral financial laws (such as insurance and banking regulations) also suggest that the fulfilment of foreign access requests without the involvement of Hungarian authorities may breach the obligation of professional secrecy of the financial service providers.
  • Foreign access requests must be in line with the Hungarian trade secret regulation (eg, the authority must have a legitimate purpose, such as the prevention of violations of law, and the request must be limited to the extent justified by the purpose).

Please see 4.6 Limitations and Considerations.

Emerging new technologies were a focus during the drafting of the GDPR, which led to the regulation of various technological legal issues, including automated decision-making and the processing of biometric data. Aside from this, the long-awaited ePrivacy Regulation is expected to tackle issues related to electronic communications, including the hot topics of cookies, direct marketing using online communications and behavioural advertising (profiling) via tracking.

Big Data Analytics

The Authority has not issued a specific guideline on Big Data. However, the PIA list includes the processing activity of combining data from various sources for matching and comparison purposes, which is a classic use of Big Data.

Automated Decision-making

Since the GDPR became applicable in 2018, companies have needed to follow strict rules when deciding to use automated decision-making (including profiling). As a rule of thumb, individuals have the right not to be subject to decisions based solely on automated means. In addition, individuals have the right to receive meaningful information about the logic used to make decisions based on their personal data.

The Authority has also added automated decision-making that has legal effect or other significant effect on the individuals to the PIA list.

Profiling

Profiling is any form of automated processing of personal data where personal data is used to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. According to this definition, profiling has three core elements:

  • it is automated;
  • it concerns personal data; and
  • the data is used for the evaluation of aspects of the person’s life.

If profiling is used, the data controller must adhere to the general data protection principles. For example, the profiling must be fair, transparent and in line with data minimisation. The controller should consider using aggregated or anonymised data if it cannot otherwise justify the collection of the data.

The Authority has also added certain profiling activities to the PIA list, such as profiling to assess solvency, profiling using children's data, and profiling based on the large-scale and systematic evaluation of personal data.

Artificial Intelligence

Artificial intelligence is not yet regulated by law in Hungary, but the Government has set up the Coalition on Artificial Intelligence, which aims to create the legal background for using such technology. AI is also considered a hot topic on the European level. In February 2020, the European Commission issued its white paper on the use of AI, highlighting the necessity for a European regulatory framework to unify the safeguards provided by Member States.

Internet of Things (IoT)

IoT is not specifically regulated in Hungary, but in 2019 the Authority issued guidance on the use of one form of IoT: smart energy meters. In this guidance, the Authority followed the logic of WP29, the former EU advisory body on data protection, which had also issued an opinion on smart meters.

Furthermore, the PIA list states that a data protection impact assessment needs to be carried out if a public utilities provider uses smart meters that send consumption information via a network.

Facial Recognition

Facial recognition is a use of biometric data, so the same rules apply as for biometric data.

Biometric Data

Biometric data is considered a special category of personal data. This means that, aside from having a lawful legal ground for the processing, a further condition must also be met in order for the data processing to be lawful (see 2.2 Sectoral and Special Issues). In 2016, the Authority discussed the use of biometric systems in the employment context. Although the guideline was issued before the GDPR came into force, the Authority is expected to continue to rely on it. The guideline highlighted four core issues that need to be considered before implementing such a system:

  • whether the biometric system is fundamentally necessary to achieve the purpose of the data processing (bigger comfort or even better cost-efficiency are not considered fundamental necessities);
  • whether the system will be sufficiently efficient to fulfil the given need (considering the particular features of the technology);
  • whether the privacy implications of the system are in proportion to the advantages brought by the system; and
  • if there are any other solutions that would cause less intrusion to privacy.

The Authority also addressed the use of biometric information in an employment context in 2019. In its statement, it concluded that there may be situations where using biometric information can be lawful (eg, in a research lab where employees also work with deadly viruses); however, in general, using biometric systems to monitor employees is neither essential nor the least intrusive method, and therefore is likely not to be lawful.

The PIA list also includes two types of biometric data processing:

  • where the processing of biometric data for the purpose of uniquely identifying a natural person involves systematic monitoring; and
  • where the processing of biometric data for the purpose of uniquely identifying a natural person concerns vulnerable data subjects, particularly children, employees and mentally ill people.

Geolocation

Both the former WP29 and the Hungarian Authority issued opinions on the use of geolocation data to monitor employees through the in-built GPS of company cars. Such geolocation information can be used to track the vehicles based on the employer’s legitimate interest if there is a compelling reason to do so (eg, organising routes for couriers, or tracking vehicles that transport goods of great value), but it may not be used to monitor employees outside their working hours. Thus, if employees are allowed to use a vehicle outside of work, employers should allow them to turn off GPS tracking when they are not working. The employees should receive thorough information about GPS tracking, and should also be allowed to object to it.

The PIA list includes geolocation data as a factor that indicates a higher risk for individuals if it is used to monitor people or create profiles of them.

Drones

Drones were regulated EU-wide in 2019. Although the regulations include various rules on the operation of drones, they do not include privacy-related provisions, aside from the requirement that drone operators have relevant data protection knowledge in order to prevent drones from breaching the privacy of those they record. On the other hand, the PIA list contains the operation of drones flown above public places or areas open to the public. Aside from the list, the Authority issued a thorough recommendation on the use of drones in 2014.

Digital governance or fair data practice review boards are not established in Hungary.

There is no recently published, available enforcement decision of the Authority in the emerging digital and technology area. Hungarian court practice has not recently tackled this area, either.

The Authority recognises the parties' legitimate interest in transferring client databases in corporate transactions, but has not yet issued any guidance on how to conduct due diligence in such transactions in a privacy-friendly manner (such as using data anonymisation techniques).

It has become clear in the Hungarian M&A market that the GDPR increases the buyer’s risk in the course of acquisition, which must be properly addressed (such as auditing the seller’s GDPR practice and including representations and warranties for data protection and cybersecurity).

In line with the NIS Directive, certain digital service providers (such as online marketplace providers, search engine providers and cloud-based IT service providers) must notify the Hungarian computer security incident response team – the Special Service for National Security – of cybersecurity incidents that have a substantial impact on their operations within the EU.

It is worth mentioning that in December 2019 the Hungarian Competition Authority (HCA) fined one of the largest tech companies, Facebook (whose online profiling is under constant privacy debate), HUF1.2 billion (close to EUR4 million) for misleading internet users. The HCA considers that the message “It’s free and always will be” on the Facebook registration page distracts the users' attention from the fact that they pay for the service with their own personal data, and also from the scope and consequences of such consideration.

In the emerging technology area, the Authority has also released a statement on the use of blockchain, in which it evaluated a theoretical situation where blockchain is used to store payment information. It highlighted that using the technology to store personal data raises numerous concerns. Among others, the Authority pointed out that every single blockchain user around the world who adds even a single block to the blockchain would be considered a data controller. This also means that no single authority can be competent for the blockchain system, even though the system itself is a single system.

VJT & Partners

Kernstok Károly tér 8
1126 Budapest

+36 1 501 9900