The fundamental provisions for privacy and data protection in Greece are the following in order of priority:
1. The Treaty on the Functioning of the EU (TFEU) and Regulation (EU) 2016/679
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (GDPR) is the main legislation for the protection of personal data. The GDPR is directly applicable in Greece and prevails over any conflicting provision of national law, including the Constitution. The GDPR provides for the imposition of administrative fines (Article 83), as well as the obligation to compensate for damage suffered (Article 82), in the event of violation of its provisions.
2. Constitution
The basic principles for the privacy of communications and the protection of personal data are set out in the Greek Constitution. The respective articles are included in the chapter regarding fundamental individual rights. More specifically:
3. Civil Code
Articles 57-59 of the Greek Civil Code include fundamental provisions for the protection of the personality of the individual. The offence of insulting an individual’s personality may substantiate civil claims for injunction, compensation and moral damages.
4. Laws
In general, enforcement measures include the imposition of administrative penalties, as well as criminal charges and fines. At the same time, civil claims for injunction, compensation and moral damages may be substantiated by injured parties.
5. Regulatory acts and guidelines issued by the competent authorities
Lastly, the competent independent authorities, such as the HDPA and the Hellenic Authority for Communication Security and Privacy, issue regulatory acts and guidelines.
The Hellenic Data Protection Authority
Area of jurisdiction – powers
As stated in 1.1 Laws, the HDPA is a constitutionally provided independent authority whose main purpose is to supervise and monitor the implementation of the GDPR, Law 4624/2019 and any other legislation concerning the protection of the data subject from the processing of his/her personal data. The powers of the HDPA are listed in Articles 51 et seq of the GDPR and Law 4624/2019 and include, among others: (a) the power to supervise the application of the GDPR and the law for the protection of the individual from personal data processing; (b) the power to advise the government and other institutions on new legal provisions regarding personal data processing; (c) the power to conduct audits and investigations to ensure the application of the legislation; (d) the power to handle complaints; (e) the power to co-operate with other supervisory authorities to ensure the uniform application of the legislation; and (f) the power to impose penalties upon violation of the legislation.
Audits and investigations
The investigative and corrective powers of the HDPA are of particular importance and are provided by Article 58 of the GDPR and Article 15 of Law 4624/2019. Audits are initiated by the HDPA itself (ex officio), following the filing of a complaint, or following information provided by another state authority. During such audits, the HDPA has the power to obtain, from either the data controller or the data processor, access to all the personal data and information required for the purposes of the audit and the performance of its tasks; no duty of confidentiality or right of privacy may be invoked against the HDPA. The audits are performed by members of the HDPA, who act as special investigating officers and have all the respective investigative powers provided by the Code of Criminal Procedure. During the audits and for the purposes of such audits, the HDPA can:
The Hellenic Authority for Communication Security and Privacy
Area of jurisdiction – powers
As stated in 1.1 Laws, the Hellenic Authority for Communication Security and Privacy is a constitutionally provided independent authority whose main purpose is the protection of the privacy of post/mail, the freedom of communication in any other manner and the security of networks and information. In order to achieve its purpose, the Authority has, in accordance with Article 6 of Law 3115/2003, the following powers: (a) to conduct audits of public entities, as well as private entities; (b) to hold hearings, investigate complaints and impose penalties; and (c) to issue regulations regarding the assurance of the confidentiality of communications.
Audits and investigations
The Authority is competent to conduct audits of facilities, technical equipment, archives, databases and documents belonging to state authorities or to private entities active in the field of post, telecommunications or other services related to communications. Audits are initiated either by the Authority itself (ex officio) or following the filing of a complaint. The Authority collects information, has the power to confiscate any means used to violate the privacy of communications, and may summon to a hearing any person relevant to its mission. The Authority may impose administrative penalties after a hearing at which the parties involved are invited to present their views.
Lastly, it should be noted that for the time being there is no regulator for artificial intelligence (AI) in Greece and AI is not included in the areas of jurisdiction of the existing regulators.
The administrative process before the HDPA is governed by the provisions of Law 3051/2002 and the Code of Administrative Procedure.
Decision 9/2022 of the HDPA, as amended, includes the Rules of Operation of the HDPA and provides that every case must follow the basic procedural steps:
The HDPA may issue decisions in the form of provisional measures applicable until the issuance of its definitive decision on the merits of the case.
The HDPA’s decisions are binding on their addressees, while its enforceable acts are subject to appeal before the Administrative Courts and to applications for annulment before the Council of State.
Greece is an EU member state. The GDPR is directly applicable in Greece and supersedes any provision of national law. Furthermore, Law 4624/2019 provides the necessary measures for the implementation of the GDPR and the operation of the HDPA. All relevant decisions taken at EU level apply directly in Greece, including issues concerning data privacy rules with any non-EU state.
During the last five years, non-governmental organisations have been established in Greece aiming: (i) to protect the privacy and personal data of individuals; (ii) to raise awareness among individuals on the protection of privacy and personal data; (iii) to educate individuals about their rights to privacy and personal data; (iv) to identify new risks regarding the protection of privacy and personal data deriving from developing technologies; and (v) to participate in the development of the legal environment regarding the protection of privacy and personal data.
It is worth pointing out that in 2022 the HDPA imposed the largest fine ever imposed on a company, namely EUR20 million, for violation of a data subject’s right of access to personal data, pursuant to a complaint filed by an active NGO on behalf of the data subject (decision 35/2022).
Greek law follows the EU model, and the HDPA is well organised, quite active and relatively aggressive in its enforcement of the law.
During the years 2020-2023, the HDPA focused on digital transformation projects, including an initial upgrade of the HDPA’s web portal and integrated information systems, as well as the byDesign project, which developed a GDPR compliance web application for use by all interested parties, together with educational materials and programmes on the subject of “data protection by design and by default” addressed to IT and communications professionals. Moreover, recent developments of the HDPA’s integrated information system pertain, among other things, to the management of data breach incidents, the self-assessment by controllers of their level of data security and protection, assistance to data subjects in exercising their rights, and the submission of complaints, data breach notifications, etc, through the HDPA’s digital portal.
As far as enforcement is concerned, the HDPA during 2023 issued 30 decisions finding infringements of the law and imposing penalties in most cases. Such decisions include findings of infringements regarding the processing of data from major banks, telecommunications services providers, electric power providers and even the Independent Authority for Public Revenue. The HDPA has also opined in regard to legislation proposed by the Government concerning the introduction of a system of a single ID for all natural persons and issued a Regulation concerning data processing for political campaigns.
Recently, hot topics have revolved around AI and include the following:
Another hot topic is the EU proposal for a Regulation on Child Sexual Abuse Material providing rules to prevent and combat child sexual abuse. Concerns have been expressed with regard to the effectiveness of the proposed rules, which could lead to an overall deterioration of cryptography and security of communications for all users, including children.
Another hot topic is the EU proposal for a Regulation on the co-operation of the Supervisory Authorities of the member states in cross-border cases of strategic importance. The Regulation is positively awaited and expected to provide greater legal certainty.
Data Protection Officer
The Data Protection Officer (DPO) is responsible for monitoring the compliance of the data controller or processor with the GDPR and the applicable legislation, and is the contact person for the HDPA and the data subjects. The DPO’s role is advisory, not decision-making, and entails no personal liability for the DPO if the data controller or processor fails to comply with the law. Articles 6-8 of Law 4624/2019 regulate the duties of the DPO in the public sector. The function of the DPO is crucial, and the HDPA pays special attention to the proper enforcement of the relevant obligations (eg, decision 2/2024 imposing a fine of EUR25,000 upon the Ministry of Rural Development and Food for not having appointed a DPO for a long period).
The appointment of a DPO is mandatory in the following cases:
Criteria Necessary to Authorise Collection, Use or Other Processing of Personal Data
For the processing of an individual’s personal data to be lawful, at least one of the following conditions must apply:
Privacy by Design and by Default
The principle of privacy by design and by default is a basic condition of compliance under Article 25 of the GDPR. The risk of personal data breaches increases with the development of technology and communications, creating the need to protect data by design and by default. The security systems of data controllers and processors should be designed in a “preventive” way, so as to avoid future breaches. Furthermore, protection by design and by default requires the implementation of technical and organisational measures ensuring that the processing is carried out only for the fulfilment of the intended purpose.
Data Protection Impact Analysis (DPIA)
Article 35 of the GDPR provides for the obligation to carry out a data protection impact assessment in specific cases, in particular where data processing uses new technological means, such as AI, and is likely to result in a high risk to the rights and freedoms of the data subject.
The impact assessment is carried out in five stages, namely:
The impact assessment is necessary especially when the processing of personal data is based on automated means, when it comes to large-scale data processing (big data) and in the case of systematic monitoring of public space on a large scale. The HDPA has issued decision 65/2018, which groups the processing operations that require a DPIA into three categories depending on:
Data Subject Access Rights
A basic condition for compliance with the GDPR is the satisfaction of data subjects’ requests regarding the exercise of their rights. The data controller or processor must be able to demonstrate that it applies the appropriate measures and the required procedures, and that it respects the rights of the data subjects. The rights are detailed in Articles 12-22 of the GDPR and include:
Profiling and Automated Decision-Making
Profiling and automated decision-making present risks to the rights and freedoms of data subjects, by perpetuating stereotypes and creating discrimination against individuals. The GDPR grants the individual the right not to be subject to a decision based solely on automated processing, except: (a) if such a decision is necessary for entering into or performing a contract between the data subject and the data controller; (b) if the data subject has granted his/her consent; or (c) if it is authorised by a provision of Greek or EU law which simultaneously defines suitable measures to safeguard the rights of the individual.
Moreover, Law 4624/2019 prohibits profiling and automated decision-making in the processing of personal data by public authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences, the execution of criminal penalties and the safeguarding against and prevention of threats to national security, unless expressly provided for by a provision of Greek or EU law which simultaneously defines adequate guarantees for the freedoms and rights of the individual. The Article 29 Data Protection Working Party has issued guidelines on automated decision-making and profiling for the purposes of the GDPR (WP251 rev.01).
Compensation Claim
Article 40 of Law 4624/2019 provides for the judicial protection of the data subject against the data controller or processor for violations of the GDPR. More specifically, it provides that the data subject may file a lawsuit before the court of the district where the data controller or processor has its establishment or where the data subject has his/her residence. However, when the data subject files a lawsuit against a public authority acting in its capacity as a “sovereign”, the lawsuit must be filed before the court of the district where the public authority has its seat.
Similarly, Article 80 of Law 4624/2019 establishes the civil liability of public authorities for unlawfully causing damage to the data subject, which may result in payment of compensation and/or moral damages. Further information on civil liability may be found in 2.5 Enforcement and Litigation (Civil liability – class action section).
“Sensitive” Personal Data or Personal Data Belonging to “Special Categories”
Definition
These personal data are those which by their nature are particularly “sensitive” in relation to the fundamental rights and freedoms of the data subject. Such data are characterised as “data of special categories” in the GDPR and Greek law, and include personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, genetic or biometric data, health data and data concerning a person’s sex life or sexual orientation.
General prohibition of processing
The general rule is that the processing of such “sensitive” personal data is prohibited.
Exceptions
Exceptionally, the processing of such “sensitive” personal data is allowed in the following situations:
Rules of processing
All appropriate and special measures for the protection of the interests of the data subjects must be implemented, taking into account the level of technology, the implementation cost, the purposes of processing, and the risks and their significance for the rights and freedoms of the data subjects. Such measures may include, among others:
Financial data
Financial data are personal data but not “sensitive” data, as they do not fall within the above-stated “special categories”. The main processors of financial data are banking and credit institutions, which have a statutory confidentiality obligation towards their customers and their financial data.
Health data
Health data are in their vast majority “sensitive” data and include medical examinations, diseases, disabilities, medical history, risk of developing diseases, psychological state, etc. In Greece, a system for electronic prescription of medicines, electronic patient medical records and telemonitoring has been implemented (eHealth). The HDPA has issued a decision regarding eHealth (decision 138/2013); a directive regarding processing in the context of the COVID pandemic (directive 1/2020); opinions regarding the Health Cards of Athletes (limiting the types of personal data processed and the processing thereof); a directive regarding data processing concerning expenditures of public hospitals (directive 3/2015); a directive regarding data processing concerning promotional activities in the pharma sector (directive 5/2016); a directive regarding data processing concerning the surgery lists of public hospitals (directive 8/2016); and decisions upon complaints concerning processing of health data from insurance companies or the Public Social Security Fund (decision 5/2021).
Minors’ data
Greek law sets the age limit at 15 for consent to information society services, so minors who have reached the age of 15 can validly consent to the processing of their personal data. For minors under 15, consent must be given by their legal representative (ie, parents or persons with custody, guardianship, etc); otherwise, the consent is null and void (Article 21 of Law 4624/2019).
Genetic data
Greek law prohibits the processing of genetic data for health and life insurance purposes (Article 23 of Law 4624/2019).
Employment data
This issue is analysed in 2.4 Workplace Privacy.
Internet, Streaming and Video Issues
Cookies
Law 3471/2006 provides that the storage of information, or the acquisition of access to information already stored, in the terminal equipment of a subscriber or user is only permitted if the subscriber or user has given his/her consent after being expressly informed (“opt-in”). In addition, the HDPA has issued recommendation 1/2020 with the aim of providing clear instructions to processors of electronic communications data regarding the management of cookies. The HDPA points out that, for the placement of a tracker on the user’s terminal equipment to be lawful, the user must have given his/her consent after being comprehensively informed. The user’s consent is especially necessary when cookies are installed for the purpose of online advertising, or where third-party trackers are used, such as the Google Analytics service for the purpose of statistical analysis. The user’s consent is not required when cookies installed on the terminal equipment are technically necessary for the operation of the website or the provision of the internet services requested by the user.
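The opt-in rule described above can be sketched as a simple server-side gate. This is an illustrative sketch only: the cookie names and category labels are assumptions for the example, not terms of the law.

```python
# Hypothetical cookie inventory; under the opt-in rule of Law 3471/2006,
# only technically necessary cookies are exempt from prior consent.
COOKIE_CATEGORIES = {
    "session_id": "necessary",    # required to operate the website
    "analytics_id": "analytics",  # eg, third-party statistics trackers
    "ad_profile": "advertising",  # online advertising trackers
}

def cookies_allowed(consented_categories: set) -> list:
    """Return the cookies that may be set for a user who has opted in
    to the given categories; 'necessary' cookies need no consent."""
    return [name for name, cat in COOKIE_CATEGORIES.items()
            if cat == "necessary" or cat in consented_categories]

print(cookies_allowed(set()))           # → ['session_id']
print(cookies_allowed({"analytics"}))   # → ['session_id', 'analytics_id']
```

With no consent recorded, only the technically necessary cookie survives the gate; each further category must be unlocked by an explicit opt-in.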
Location data
Law 3471/2006 includes provisions for the processing of “location data”, which are data processed in an electronic communications network or by an electronic communications service that reveal the geographic location of the user’s terminal equipment. The law provides that the processing of location data must be necessary for the fulfilment of the intended purpose (principle of “purpose limitation” provided also in the GDPR). To this purpose, the processing is permitted only for the performance of the contract to which the subscriber or user is a contracting party or for the implementation of measures during the pre-contractual stage, upon the subscriber’s request (decision 4/2022 of the HDPA).
Behavioural or targeting advertising
The issue is analysed in 2.3 Online Marketing.
Electronic communications
Law 3471/2006 on the protection of privacy and personal data in electronic communications establishes obligations for providers of publicly available electronic communications services, such as obligations for the processing of traffic and location data and the satisfaction of the special rights of users and subscribers. In addition, the law provides rules applicable to controllers for the recording of calls, the conditions of access to information stored in user terminal equipment (cookies) and the legality of promoting goods and services by telephone, email and SMS. The provisions of the above law apply in addition to the GDPR.
Other Issues
Data subject rights
This issue is analysed in 2.1 Omnibus Law and General Requirements.
Unsolicited Commercial or Marketing Communications Through Electronic Means
This category includes various electronic communications or communications through automated means, such as:
Such unsolicited commercial or marketing communications, without human intervention, are permitted only if the data subject has granted his/her prior consent by “opt-in”. Otherwise, such communications are considered unwanted (ie, “spam”) (Article 11 of Law 3471/2006).
Unsolicited commercial or marketing communications with human intervention, such as telephone calls, are permitted, unless the data subject has informed the service provider that he/she does not want to receive such calls by “opt-out”. Service providers must keep a record of those subscribers who have stated their objection to receiving such telephone calls (Article 11 of Law 3471/2006, as amended by Article 16 of Law 3917/2011).
Exceptionally, Unsolicited Commercial or Marketing Communications Through Emails
Email contact data that have been acquired lawfully in the context of the sale of goods, the supply of services or other transactions may be used for the direct marketing and promotion of similar goods or services, even if the recipient has not granted his/her prior consent. However, the recipient must be given, in a clear and distinct manner, the opportunity to object, easily and free of charge, to the collection and use of his/her electronic contact data, both at the time of collection and in every email message (Article 11 § 3 of Law 3471/2006).
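The interplay between the opt-in rule and this “soft opt-in” exception can be sketched as follows. This is a simplified illustration (it omits, for instance, the requirement that the goods or services be “similar” to those of the original transaction), not a compliance tool.

```python
def may_email_marketing(existing_customer: bool,
                        consented: bool,
                        unsubscribed: bool) -> bool:
    """Simplified sketch of Article 11 of Law 3471/2006 for email:
    marketing requires prior consent ('opt-in'), except towards
    existing customers for similar goods/services ('soft opt-in'),
    and never once the recipient has objected ('unsubscribe')."""
    if unsubscribed:  # an objection must be honoured at any time
        return False
    return consented or existing_customer

# An existing customer who never opted in may still be emailed
# about similar goods, until he/she objects.
print(may_email_marketing(existing_customer=True,
                          consented=False,
                          unsubscribed=False))  # → True
```

The key design point mirrored here is that the objection check comes first: once the recipient has opted out, neither prior consent nor the customer relationship can justify further messages.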
Rules for Unsolicited Commercial or Marketing Communications
In any case, unsolicited commercial or marketing messages should refer clearly and precisely to the identity of the sender and the address where the recipient can request the interruption of any further communications.
Unsolicited commercial or marketing messages should clearly state their “commercial” nature in the subject line of the message.
The HDPA has issued Guideline 2/2011 with examples and best practices to obtain electronically the consent of the data subject.
Objection to Any Unsolicited Communications
Individuals have the right to declare to the HDPA that they do not want their personal data to be subject to any processing for the purposes of marketing and promotion of sales of goods and services from a distance. The HDPA keeps records of the identity of the above individuals, the so-called “List of Article 13” (according to Article 13 of Law 2472/1997).
Provisions applicable to the processing of personal data of employees are included in both the GDPR and Law 4624/2019. More specifically:
The HDPA has issued various guidelines and decisions on the processing of personal data at work, including Guideline 115/2001 on the protection of personal data of employees and Guidelines 1/2021 and 2/2020 on the protection of personal data through telework.
Enforcement Penalties
Administrative fines for private entities and individuals
According to Article 83 of the GDPR, administrative fines imposed by the HDPA may amount to up to EUR10 million or, in the case of an undertaking, up to 2% of its total worldwide annual turnover, whichever is higher; for the more serious infringements listed in Article 83(5), the fines may amount to up to EUR20 million or up to 4% of such turnover, whichever is higher.
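The two-tier cap of Article 83 GDPR can be illustrated with a short calculation, using hypothetical turnover figures:

```python
def gdpr_fine_cap(worldwide_turnover_eur: float, serious: bool) -> float:
    """Maximum administrative fine under Article 83 GDPR.

    'serious' selects the higher tier of Article 83(5)
    (EUR20 million or 4% of worldwide annual turnover);
    otherwise the Article 83(4) tier applies
    (EUR10 million or 2%). For undertakings, the cap is
    whichever amount is higher.
    """
    if serious:
        return max(20_000_000, 0.04 * worldwide_turnover_eur)
    return max(10_000_000, 0.02 * worldwide_turnover_eur)

# A hypothetical undertaking with EUR1 billion worldwide turnover:
# the higher-tier cap is 4% of turnover, ie, EUR40 million.
print(gdpr_fine_cap(1_000_000_000, serious=True))  # → 40000000.0
```

For small undertakings the fixed amounts dominate; the percentage-based cap only becomes relevant once turnover exceeds EUR500 million.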
Administrative fines for public entities
According to Article 39 of Law 4624/2019, administrative fines imposed by the HDPA upon public entities are limited to the amount of EUR10 million.
Criminal penalties (Article 38 of Law 4624/2019)
Unauthorised access to databases and personal data, or the copying, deletion, alteration, damaging, collection, recording, organisation, structuring, storage, adaptation, retrieval, search, comparison, combination, restriction or destruction of data, is punishable with imprisonment of up to one year.
Further use, transmission, transfer, conveyance, notification or granting of access to personal data acquired through the acts described above is punishable with imprisonment of up to five years. If special categories of data are involved, the punishment is imprisonment of at least one year and a monetary penalty of up to EUR100,000.
If, in addition to the above, there is also an intention to acquire illegal profit or cause damage exceeding the amount of EUR120,000, the punishment is imprisonment for up to ten years.
If, in addition to the above, there is a risk to the free operation of democracy or national security, the punishment is imprisonment for up to ten years and a monetary penalty up to EUR300,000.
Civil liability – class action
According to the provisions of the Civil Code, any person or entity that violates the law and wrongfully damages another is liable in tort to compensate for any direct loss to the injured party’s property, as well as for reasonably expected loss (ie, loss of profit); moral damages may also be claimed. Greek law does not provide for legal actions by entities that cannot demonstrate a legal interest in bringing the action. An action may have many plaintiffs or many defendants, but the decision will only bind the litigants and not third parties.
Legal Standards That Regulators Must Establish to Allege Violations of Privacy/Data Protection Laws
The regulators must establish full proof of any violations. The legal standards for such proof do not differ from the legal standards applied by the courts.
Leading Enforcement Cases in Greece
According to information provided by the HDPA, during the first five years of the GDPR, the HDPA has issued approximately 100 decisions imposing fines and penalties in a total amount of approximately EUR30 million. The majority of the decisions were issued against private entities, although there were some against public authorities as well. Indicatively:
Personal Data
Law 4624/2019 (Articles 43-86) transposed Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data. The above law provides, in summary, the following:
Such rights may be exercised against the data controller and not against the data processor; eg, a laboratory performing DNA examinations on behalf of the prosecuting authorities cannot share the results of the DNA examinations with the suspect data subject.
Communications
Law 5002/2022 provides the conditions and procedures for the lifting of the confidentiality of communications; the privacy of communications is a fundamental individual right under Article 19 of the Constitution (1.1 Laws). According to the above law:
Law 4624/2019, which transposed Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, also applies to safeguarding against and the prevention of threats to national security (Article 43). Therefore, the analysis in 3.1 Laws and Standards for Access to Data for Serious Crimes applies here as well.
Greece is a signatory to the OECD Declaration on Government Access to Personal Data Held by Private Sector Entities dated 14 December 2022.
General Principles for Transfers of Personal Data
According to Article 75 of Law 4624/2019, the transfer of personal data from the Greek state authorities to the authorities of non-EU countries or international organisations is permitted, provided the other provisions of Law 4624/2019 are met as stated in 3.1 Laws and Standards for Access to Data for Serious Crimes (Personal Data section), and provided:
The transfer of personal data is not permitted, despite the existence of an adequacy decision and the need to safeguard the public interest, if the protection of the fundamental rights and interests of the data subject cannot be ensured in the specific case. The data controller assesses the level that would ensure protection of the above rights of the data subject based on the guarantees for the protection of the personal data offered by the recipient of the personal data in the non-EU country.
The transfer of personal data from another EU member state requires prior authorisation by the competent data protection authority of such member state. Such prior authorisation is not required if the transfer of the personal data is necessary for the prevention of an immediate and serious threat against public safety of a member state or of a non-EU country and the prior authorisation cannot be obtained in a timely manner.
Transfers of Personal Data to Recipients in Non-EU Countries
In individual and specific cases, the Greek state authorities may transfer personal data to recipients of non-EU countries that are not competent for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, provided the other provisions of Law 4624/2019 are met as stated in 3.1 Laws and Standards for Access to Data for Serious Crimes (Personal Data section), and provided:
When personal data are transferred to non-EU countries, at least the same level of protection as the GDPR must be ensured. It is worth mentioning that the GDPR (Article 48) does not affect international agreements concluded between the EU and non-EU countries which govern the transfer of personal data and provide appropriate guarantees for data subjects.
The Clarifying Lawful Overseas Use of Data Act (CLOUD Act) was adopted by the Congress of the United States of America on 23 March 2018 with the aim of improving procedures for both US and foreign authorities in obtaining access to data held by service providers in the context of criminal investigations. The CLOUD Act may endanger the protection provided by the GDPR, because it extends its scope beyond the borders of the USA to EU member states where the GDPR applies. The competent bodies of the EU have evaluated the US legislation and have concluded that the conditions of Article 48 of the GDPR are not met, because sufficient guarantees for the security of the personal data of EU citizens are not ensured in the territory of the USA. The EU has expressed its concern about the possibility of individual member states entering into bilateral CLOUD Act implementing agreements with the USA and has called on the Commission to adopt an EU-wide harmonised policy on the issue. Greece has not signed such an agreement with the USA.
Much public debate revolves around whether, and at what point, the individual whose communications were subject to the lifting of confidentiality should be informed of it.
Law 5002/2022 regulates the lifting of the confidentiality of communications for reasons of national security and for particularly serious crimes:
Transfers of Personal Data Within the EU
According to the GDPR (Article 44), the transfer of personal data from an EU member state to another EU member state may take place freely, provided the other provisions of the GDPR are met.
Transfers of Personal Data to a Non-EU Country or International Organisation
According to the GDPR (Article 45), the transfer of personal data from an EU member state to a non-EU country or international organisation may take place freely, without any specific authorisation, if the European Commission has decided that such non-EU country or international organisation ensures an adequate level of protection for personal data.
The European Commission has so far recognised that the following non-EU countries and dependencies provide adequate protection: Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, the Republic of Korea, Switzerland, the UK, Uruguay and the USA (commercial organisations participating in the EU-US Data Privacy Framework).
With the exception of the UK, the above-mentioned adequacy decisions do not cover data exchanges in the law enforcement sector, which are governed by Law Enforcement Directive (EU) 2016/680 and analysed in 3.3 Invoking Foreign Government Obligations.
In the absence of an adequacy decision by the European Commission as described in 4.1 Restrictions on International Data Issues, transfers of personal data to third countries or international organisations may take place without any specific authorisation, if the data controller or data processor has provided appropriate safeguards and on condition that enforceable data subject rights and effective legal remedies are available, such as:
In the absence of any of the above appropriate safeguards, transfers of personal data to a third country or international organisation may take place only on one of the following conditions:
Transfers of personal data to an EU member state, a third country or an international organisation are not notified, nor do they require prior approval. However, the data controller or processor must enter the transfers in the records of processing activities (Article 30 of the GDPR), stating at least the recipient and the documentation proving the existence of appropriate safeguards. Such records, including records of transfers, should be made available to the HDPA upon request.
According to the GDPR (Articles 13 and 14), the data controller must, upon collection of personal data, provide the data subject with specific information, such as the controller’s identity and contact details, the purposes of the processing, and the recipients or categories of recipients of the personal data. Among such information, the data controller must inform the data subject whether he/she intends to transfer the personal data to a non-EU country or international organisation, and of the existence or absence of an adequacy decision, appropriate safeguards or other mechanisms discussed in 4.2 Mechanisms or Derogations That Apply to International Data Transfers.
In view of the above, if the information notice provided does not include the fact that the data controller intends to transfer the personal data to a non-EU country or international organisation, the controller must inform the data subject anew about such intended transfer prior to the transfer of personal data. However, the data controller is not obliged to inform the data subject about the transfer of the personal data within the EU. In any case, the recipients or categories of recipients of the personal data stated in the information notice should include the foreign recipient of the personal data to be transferred.
Of course, the other terms of the GDPR must be met, such as, for example, the following:
Under Greek law there is no specific obligation to share software code, algorithms or similar technical details with the HDPA. However, all technical and organisational measures should be made available to the HDPA upon request in the context of a regular audit or investigation.
Transfers Not Authorised by EU Law
According to the GDPR (Article 48), any judgment of a court and any decision of an administrative authority of a third country requiring a Greek private data controller or processor to transfer or disclose personal data may only be recognised or enforceable in any manner if based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the EU or Greece.
Transfers Permitted by Greek Law
The transfer of personal data from the Greek state authorities to the authorities of non-EU countries or international organisations is permitted, provided the other provisions of Law 4624/2019 are met, which adopted Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, the safeguarding against and the prevention of threats to national security and on the free movement of such data. Please see 3. Law Enforcement and National Security Access and Surveillance.
There are no “blocking” statutes, meaning that there are no Greek laws or statutes which prohibit compliance with EU regulations. As already stated in 1.1 Laws, EU Regulations are directly applicable in Greece and supersede any provision of national law, including the Constitution.
Big Data
Big data refers to a large amount of personal data that is generated very quickly from various sources such as social media, the internet, GPS applications, digital systems or humans. There are no specific laws for big data.
Automated Decision-Making; Profiling
This issue is analysed in 2.1 Omnibus Laws and General Requirements (Profiling and Automated Decision-Making section).
AI
The European Commission addresses the multiple challenges raised by AI in the Proposal for a Regulation (AI Act), with the aim of establishing harmonised rules on AI. In Greece, Law 4961/2022 defines specific obligations for AI service providers in the public and private sectors, such as the obligation to comply with the GDPR when processing personal data, the obligation to perform an algorithmic impact assessment prior to the implementation of an AI system, and transparency obligations.
Internet of Things (IoT)
The Internet of Things (IoT) describes any technology which: (a) enables devices or a group of interconnected or related devices to perform automatic processing of digital data, including technology related to the interconnection of devices, vehicles and buildings with electronic components, software, sensors, actuators, radio links and network connections; and (b) enables the collection and exchange of digital data in order to offer a variety of services to users, with or without human involvement. IoT is used in homes (eg, “smart home” products), healthcare, vehicles, industry and security systems to create a “smart net” that transfers information in real time, creating benefits such as savings in money and time. Law 4961/2022 includes provisions for the use and application of IoT and imposes specific obligations on importers, distributors and providers of IoT devices in order to protect privacy and personal data. In the event of a violation, a fine ranging from EUR15,000 to EUR100,000 may be imposed.
Biometric Identification
Biometric identification is used by the police authorities, as it contributes effectively to combating criminal activity for the purposes of public safety. Biometric identification systems based on AI are “high risk” systems and fall into two categories: (a) those that use “unique” physiological characteristics, such as facial recognition, fingerprints or iris recognition; and (b) those related to psychological factors, such as voice analysis, psychological state, etc. In order for the processing of data based on biometric identification to be lawful, the competent authorities must adhere to the specific requirements provided by Law 4961/2022 regarding transparency obligations and algorithmic impact assessment.
Law 4727/2020 concerns digital governance in the public sector. Its aim is to digitalise the public sector and provide suitable conditions for people and businesses to communicate with it, using IT and communication technologies. The law provides in detail for the adoption of the necessary measures and the implementation of structures that enable citizens to communicate directly with state agencies, cutting through public sector bureaucracy and allowing the swift digital satisfaction of their requests. The creation of a digital state facilitates the daily life of Greek citizens, because their requests (eg, the issuance of certificates of all types) are satisfied immediately, without their having to visit the respective public agency in person. Apart from the above, there are no organisations that establish protocols for digital governance or AI, nor fair data practice review boards or committees to address the risks of emerging or disruptive digital technologies. Until now, Greece has followed and applied all regulations and developments at EU level in the digital field. Major agencies in the public and private sectors make their own assessments and implement measures to achieve the above.
Further to what was discussed in 2.5 Enforcement and Litigation (Leading Enforcement Cases in Greece section), the most important decisions of the HDPA during 2023 concern infringements of the law by Greek banks and telecommunications providers. The HDPA imposed a penalty of EUR10,000 upon a major Greek telecommunications services provider for processing the data of a subscriber who had not consented to their use for marketing and profiling purposes (decision 5/2023). The HDPA had previously imposed a penalty of EUR60,000 upon the same company for wrongfully sending a complaining subscriber, who had requested the recording of his/her own discussion with the provider, the recorded discussion between the provider and another subscriber instead. The HDPA imposed a penalty of EUR60,000 upon another major Greek telecommunications services provider for unsolicited SMS messages (decision 10/2023). The HDPA imposed a fine of EUR30,000 upon a major Greek bank for forwarding details of a customer’s bank account to a third party (decision 4/2023), and a further penalty upon the same bank for forwarding customers’ data to a debt collection company (decision 25/2023). A penalty of EUR50,000 was imposed upon the Public Transport Company for the processing of customers’ data (decision 30/2023). Lastly, penalties of EUR60,000 and EUR10,000 were imposed upon another major Greek bank for forwarding a customer’s data and for failure to satisfy the right of access to personal data collected through its CCTV system, respectively (decisions 35/2023 and 36/2023).
It is up to the parties to a corporate transaction to arrange the manner and conditions for the processing of personal data in compliance with the provisions of the law. Entities whose activities include the processing of personal data are obliged to have respective policies in effect, which are subject to due diligence. Furthermore, the parties undertake specific obligations towards each other as to the treatment and the permitted processing of personal data of all companies involved in the transaction. Lastly, there are rules set by the HDPA concerning the provision of credit rating services and information on the financial status or insolvency of traders, and the HDPA has dealt in several cases with complaints regarding the processing of personal data by credit rating service providers or their refusal to allow access to information and personal data by third parties (decisions 135/2017, 9/2017 and 6/2006).
Greek law does not, for the time being, include provisions mandating disclosure of an organisation’s cyber risk profile or experience. Law 5086/2024 provides for the establishment of the National Cybersecurity Authority, whose purpose is the organisation, co-ordination, implementation and control of an integrated framework of strategies, measures and actions to achieve a high level of cybersecurity in Greece.
The digital market for products and services in the EU is developing rapidly, and it is expected that in the future the majority of transactions will take place online. The EU has so far regulated specific areas related to the digital transactions of goods and services, in particular:
The possibility of using technology and the many modern means of communication enhances the quality of people’s lives but also endangers the protection of privacy and personal data. The Greek legislator has recognised that the use of technology is linked to private life, as reflected in the Constitution. See further 1.1 Laws.
8, Karneadou street
Athens 106 75
Greece
+30 2107217232
+30 2130993965
georgountzou@gkplaw.gr www.gkplaw.gr
The Legal Implications of Generative AI
Generative artificial intelligence (AI) stands as a beacon of innovation, empowering machines to autonomously create original content across various mediums, from text to images and beyond. Leveraging technologies such as generative adversarial networks (GANs), reinforcement learning and deep neural networks, generative AI reshapes the landscape of creative expression and problem-solving. Its ability to produce diverse and high-quality content has found applications in fields as varied as art, medicine and finance, revolutionising how we approach complex tasks. As generative AI continues to evolve, understanding its multifaceted applications becomes increasingly crucial for researchers, developers and policymakers alike. From generating realistic images to crafting human-like dialogue, generative models exhibit remarkable versatility and potential. However, the widespread adoption of generative AI also prompts a nuanced examination of its legal and ethical implications.
In exploring the boundaries of AI creativity, questions arise regarding intellectual property rights, data privacy and algorithmic bias. As society grapples with these challenges, collaborative efforts are essential to ensure that generative AI realises its transformative potential responsibly. By fostering dialogue and developing comprehensive frameworks, we can navigate the evolving landscape of generative AI while upholding ethical principles and safeguarding individual rights.
Types of Generative AI
Exploring applications of generative AI
Generative AI constitutes a specialised branch within the field of artificial intelligence, leveraging sophisticated machine learning methodologies such as semi-supervised or unsupervised learning algorithms. These applications rely on AI models, such as GANs and Recurrent Neural Networks (RNNs), to create content that is often indistinguishable from human-generated content. Their primary function revolves around the creation of digital content spanning images, audio, videos, code and textual material. Generative AI is used, amongst other things, for text generation, such as content creation or coding; for image generation and manipulation, creating art or deepfakes or image-to-image translation; for audio generation, such as music composition and voice synthesis; for video generation; for game design; and even for medical and scientific research or simulation and training for autonomous vehicles, robots and AI systems.
This process hinges on a training regimen wherein algorithms are exposed to extensive datasets containing pairs of input and output examples. Through iterative learning, these algorithms discern intricate patterns within the input data, enabling them to generate outputs that align with the desired specifications. This training paradigm facilitates the development of AI systems capable of autonomously producing content that exhibits remarkable fidelity and complexity, mirroring human-generated counterparts in various domains.
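The statistical learning described above can be illustrated, in heavily simplified form, with a toy “generative” model that learns which word tends to follow which in a training corpus and then samples statistically probable output. The corpus and names below are invented purely for illustration; real generative systems use neural networks at vastly greater scale.

```python
import random
from collections import defaultdict

# Toy "generative model": learn which word tends to follow which
# from a training corpus, then sample statistically probable output.
corpus = "the cat sat on the mat and the cat slept".split()

transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)  # observed continuations of each word

def generate(start, length=5, seed=0):
    random.seed(seed)
    word, out = start, [start]
    for _ in range(length):
        if word not in transitions:  # no observed continuation
            break
        word = random.choice(transitions[word])
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

The sketch mirrors only the principle: every output is driven entirely by patterns in the training data, which is why the quality and provenance of training data carry the legal weight discussed in this article.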
Advances and innovations
Today’s AI systems can autonomously generate creative content across written, visual and auditory realms with minimal human input, creating works virtually indistinguishable from human creations. For instance, advanced text-to-image generators such as DALL·E 2 swiftly produce images based on textual prompts. Trained on a massive dataset of over 650 million image-text pairs, DALL·E 2 goes beyond simple imitation, grasping contextual understanding. OpenAI, the developer behind ChatGPT and DALL·E 2, pioneered the use of Aesthetic Quality Comparison, training a model to predict human aesthetic judgements using video data. This approach allows DALL·E 2 to craft art consistent with human perception, though it operates differently from human perception itself.
Listed below are a few interesting examples that gained traction during the past months:
While chatbots like the above have quickly risen in popularity, generative AI is also used in different areas and sectors.
Image generation and manipulation
Generative AI commonly generates images from text prompts, allowing users to describe their desired image. The AI interprets these prompts to produce realistic images that are customisable in subjects, styles and settings. This interaction creates diverse visual content, aiding creative expression and design across domains.
These types of generative AI include, amongst others, functionalities such as semantic image-to-image translation, image completion, image super-resolution and image manipulation.
Software and coding
Generative AI is already affecting software development and coding, boosting productivity and code quality. This rapidly evolving field holds vast potential to inspire new avenues of software innovation while improving efficiency.
A key application in software development is code generation, which extends to code completion, automated testing and enabling natural language interfaces. This enables developers to interact with software systems using human language instead of programming languages.
Video creation
Generative AI streamlines video production with novel features, automating tasks such as video compositions, animations and special effects. These tools create high-quality video content from scratch, enhancing resolution, manipulation and overall completion.
Such functionalities entail video style transfers and video predictions.
Audio generation
Generative AI is also used to create audio. Audio generation can be categorised as follows:
Text generation and summarisation
Certain AI models are trained on large datasets to generate up-to-date and authentic content. Some of the most common use cases of generative AI applications used for text generation and summarisation are listed below:
Legal Implications
Data privacy
Generative AI systems are fed with training data and learn to generate statistically probable outputs that have similar characteristics. It is therefore clear that AI enables the collection and use of large amounts of data, both personal, ie, information relating to an identified or identifiable individual, and non-personal; data feeds AI systems and AI systems generate more data.
The fact that the effectiveness and fairness of AI tools depend on the quality and quantity of data puts individuals’ fundamental right to protection of personal data and private life in jeopardy. Inclusion of personal data in training sets poses privacy and other risks to individuals, including, inter alia, that information in training data could foreseeably be produced as part of a generative AI system’s output. In the absence of a more specific regulatory framework at European level, the General Data Protection Regulation (GDPR) is called upon to address and regulate key aspects of the functioning of AI as far as the processing of personal data is concerned.
The overriding issue that emerges is the applicability of some principles governing the processing of personal data, which is highly impacted by generative AI systems. In particular, the following issues arise with regard to such principles:
Transparency
The principle of transparency is fundamental to the protection of personal data. Transparency is primarily achieved by providing individuals with the necessary information regarding the processing of their personal data, whether collected directly by the individuals themselves or by third parties. In their privacy notices, organisations must inform individuals that their personal data may be used to train and test a generative AI system as well as about the purpose for which their personal data are processed, explain the logic behind AI-powered automated decisions, and highlight risks for the individuals.
Purpose limitation
Personal data must be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes or beyond the affected individuals’ reasonable expectations. During the development and deployment life-cycle of an AI system, organisations should carefully evaluate the compatibility with the purpose for which the personal data used in its development were collected.
Data minimisation
Personal data must be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed. While vast amounts of data are required to train generative AI systems to achieve their full potential, developers, providers and deployers of generative AI systems should limit the collection, use and further processing of personal data only to what is necessary to fulfil the legitimate identified purposes. Therefore, personal data must only be used as training data if required to achieve the legitimate identified purposes of the generative AI system, while the use of anonymisation or pseudonymisation techniques should be taken into consideration.
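By way of illustration only, a pseudonymisation step of the kind contemplated above might replace direct identifiers with keyed hashes before records enter a training set. The field names, salt and sample record below are hypothetical.

```python
import hashlib

# Illustrative salt; in practice the key would be stored separately
# from the pseudonymised data and rotated per policy.
SECRET_SALT = b"rotate-and-store-separately"

def pseudonymise(record: dict) -> dict:
    """Replace assumed identifier fields with truncated keyed hashes."""
    out = dict(record)
    for field in ("name", "email"):  # hypothetical identifier fields
        if field in out:
            digest = hashlib.sha256(SECRET_SALT + out[field].encode()).hexdigest()
            out[field] = digest[:16]
    return out

record = {"name": "Maria P.", "email": "maria@example.com", "age_band": "30-39"}
print(pseudonymise(record))
```

Note that keyed hashing is pseudonymisation, not anonymisation: as long as the key is retained, re-identification remains possible, so the GDPR continues to apply to the pseudonymised data.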
Accuracy
Pursuant to Article 5 of the GDPR, personal data must be accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay. It is obvious that the accuracy of the output of generative AI systems depends heavily on the accuracy of the training data. Therefore, generative AI systems must rely on accurate, reliable and representative data, and false or inaccurate personal data must be excluded from training data. But even when trained with representative, high-quality data, the output generated by generative AI systems may contain inaccurate or false information, including personal data, leading to hallucinations, in which a tool confidently asserts that a falsehood is real. To mitigate the risks posed by the potential lack of accuracy of generative AI systems, it is important to indicate when there is uncertainty regarding generative AI responses, so that individuals have the chance to validate the output, eg, by citing the sources on which the output is based and using technical safeguards.
Privacy by design and by default
Rapid technological change poses new risks to data protection. Some of the unique characteristics of AI render compliance with data protection laws more challenging in comparison with more “traditional” IT systems. In line with the privacy by design and by default principle, organisations should conduct a data protection impact assessment to identify, assess and address the risks posed by generative AI systems at every stage of their life-cycle. The state-of-the-art security measures designed to implement data protection principles must be implemented in an effective manner, and security safeguards should be integrated into the processing in order to meet the requirements of the GDPR and protect the rights of individuals.
Biased or inaccurate information
Bias can occur in various stages of an AI system life-cycle. AI systems are based on machine learning data-driven techniques, so the primary source of bias is data collection. If generative AI systems are trained with data which may not be diverse or representative and/or reflect discrimination, they may generate outputs which have discriminatory effects on individuals based on their gender, race, age, health, religion, disability, sexual orientation or other characteristics. Furthermore, if training data is not balanced or the system architecture is not designed to handle diverse inputs, the generative AI system may produce biased outputs. In addition, bias may be introduced if the generative AI system is not tested with diverse inputs or monitored for bias after deployment.
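A first, very rough check against unrepresentative training data of the kind described above is simply to measure how groups are represented in the dataset. The sketch below is illustrative only; the field, the rows and the 30% threshold are invented assumptions, and real bias auditing involves far more than headcounts.

```python
from collections import Counter

# Hypothetical training rows with a single demographic field.
training_rows = [
    {"gender": "F"}, {"gender": "F"}, {"gender": "M"},
    {"gender": "F"}, {"gender": "F"}, {"gender": "F"},
]

def representation(rows, field):
    """Return each group's share of the dataset for the given field."""
    counts = Counter(r[field] for r in rows)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

shares = representation(training_rows, "gender")
# Flag any group falling below a chosen threshold, here 30%.
underrepresented = [g for g, s in shares.items() if s < 0.30]
print(shares, underrepresented)  # "M" is flagged as under-represented
```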
Apart from the above, inadequate or biased training data and incorrect model assumptions may lead generative AI systems to generate responses which contain false or misleading information presented as fact, the so-called “hallucinations”. AI hallucinations may have significant consequences and create liabilities for organisations and/or persons using incorrect output from generative AI systems. For example, such an organisation or user could suffer reputational damage, or even face charges of libel as well as negligence claims, if they have used generative AI to provide advice.
Mitigation of AI hallucinations
The best way to mitigate the impact of AI hallucinations is to stop them before they happen. It must be ensured that generative AI systems are trained on diverse, balanced, well-structured and high-quality data from reliable sources. Datasets must be transparent to make AI outcomes understandable and traceable. Also, organisations should establish the AI system’s responsibilities and limitations; this will help the system complete tasks more effectively and minimise irrelevant, hallucinatory outputs. Finally, the generative AI system must be tested and evaluated thoroughly before use and on an ongoing basis as it evolves and improves.
In any case, human oversight must be established to mitigate the risks of AI hallucinations. As a final measure for the prevention of hallucination, a human being should review, filter, correct and validate generative AI outputs. Outputs generated by generative AI systems must be verified against a credible source. Human reviewers may also provide relevant expertise and increase the accuracy and safety of AI systems, upholding human values.
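The human-oversight gate described above can be sketched as a simple routing rule: outputs with low model confidence or no cited source are withheld for human review before publication. The fields, threshold and labels below are assumptions for illustration, not a prescribed mechanism.

```python
def route_output(output: dict, threshold: float = 0.8) -> str:
    """Route a generated output either to publication or to a human reviewer."""
    # Withhold low-confidence outputs for human validation.
    if output.get("confidence", 0.0) < threshold:
        return "human_review"
    # Withhold outputs that cite no credible source.
    if not output.get("sources"):
        return "human_review"
    return "publish"

print(route_output({"text": "…", "confidence": 0.95, "sources": ["report.pdf"]}))  # publish
print(route_output({"text": "…", "confidence": 0.95, "sources": []}))              # human_review
```

The design choice mirrors the text: automation handles the routing, but the final say on uncertain or unsourced content rests with a human reviewer.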
The Draft EU AI Act
With its proposal for the first-ever legal framework on AI (the “Draft EU AI Act”), which is expected to be adopted soon, the European Commission attempts to address the challenges posed by AI, following a proportionate, risk-based approach. Pursuant to the Draft EU AI Act, AI systems will have to meet data transparency obligations. AI system providers will have to ensure that AI systems intended to interact with individuals are designed and developed in such a way that individuals are informed that they are interacting with an AI system, unless this is obvious from the circumstances and the context of use. In addition, users of an AI system that generates or manipulates image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and would falsely appear to a person to be authentic or truthful (a “deepfake”) will have to disclose that the content has been artificially generated or manipulated. High-risk AI systems are subject to much stricter transparency obligations, as well as the requirement for appropriate human oversight.
Intellectual property
Generative AI applications pose a new challenge to current intellectual property laws due to their capability to independently produce original content. Different levels of AI involvement in content creation are described in academia. For instance, “AI-assisted work” implies that a natural person, not AI, is functionally considered the author of the work, while “AI-generated” indicates that no natural person qualifies as the author.
A significant concern revolves around attributing ownership to AI-generated works. Traditional copyright laws designate human creators as owners, but determining authorship becomes unclear with generative AI. This ambiguity may trigger disputes over intellectual property rights, as multiple parties could claim ownership of AI-generated content. High-profile lawsuits by content creators, such as the New York Times and Getty Images, against generative AI developers in the US and EU have escalated these concerns.
The discourse on this issue appears twofold, as legal theorists focus on both stages: the algorithm’s training and the generation of outcomes.
Training of the algorithm
The primary copyright concern regarding AI training revolves around the possibility that training datasets might contain copyrighted text or materials. Lawfully reproducing or using these materials in the training process requires permission from rights holders or specific legal provisions allowing their use in training language models.
Training generative AI algorithms, such as large language models (LLMs), involves large-scale datasets and numerous potential rights holders, making it highly challenging to identify all the rights holders and obtain explicit licences.
On the one hand, it has been argued that the use of training datasets could be lawful under the text and data mining (TDM) exception provided by Directive 2019/790. Based on the definition of TDM as “any automated analytical technique aimed at analysing text and data in digital form to generate information, including patterns, trends, and correlations”, the training of LLMs could fall within this definition.
On the other hand, Article 4(2) of this Directive dictates that reproductions and extractions of content, such as described above, may only be retained for as long as is necessary for the purposes of text and data mining. This could result in the obligation of the trainers of LLMs to delete copyrighted content as soon as the training of the algorithm is concluded.
Output generation
The discussion on output generation includes not only the potential infringement of materials used during the training of LLMs by the outputs produced but also the possibility of legally protecting such outputs via copyright or patent legislation.
In a nutshell, the output generated by a generative AI application could lead to two main outcomes:
A generative AI output could potentially infringe legal rights in two primary ways: firstly, if the output closely resembles legally protected elements of pre-existing materials; and secondly, if the output incorporates protected aspects of pre-existing materials through unauthorised adaptations or modifications, in which case it would likely be considered a derivative creation. Another important aspect of this intriguing problem is that the CJEU recently determined in YouTube v Cyando (Joined Cases C-682/18 and C-683/18) that if platforms fail to comply with any of three distinct duties of care, they will be directly accountable for violations of the right to communicate a work publicly.
Regarding the outcome’s protection, the European Parliament emphasises that existing intellectual property legislation still applies when the creative outcome primarily stems from human intellectual activity, albeit with assistance from an AI system. The CJEU, as demonstrated in the Painer case (Case C-145/10), confirms that copyright-protected works can indeed be created with the aid of a machine or device. According to CJEU case law, predominant human intellectual activity is evident when a human creator utilising generative AI exercises free and creative choices during the conception, execution and/or editing phases of the work.
Similarly, a broader interpretation of the inventive step requirement may be warranted when considering patent protection for inventive outcomes produced with generative AI support. This interpretation would focus on non-obviousness to a person skilled in the art, assisted by AI, ie, an AI-aided human expert, as many scholars have pinpointed.
Conclusion
Generative AI holds immense promise for innovation and creativity, yet its proliferation underscores the pressing need for robust legal and ethical frameworks. Intellectual property rights face new complexities as AI-generated content blurs the lines of ownership and authorship. Moreover, concerns regarding data privacy and algorithmic biases demand careful consideration in the development and deployment of generative models. As society grapples with these challenges, collaborative efforts between policymakers, technologists and legal experts are paramount. By fostering dialogue and cultivating responsible practices, we can harness the transformative potential of generative AI while mitigating its risks. Ultimately, navigating the evolving landscape of generative AI requires a balanced approach that fosters innovation, safeguards ethical principles, and upholds the rights and dignity of individuals.
10 Solonos Street,
106 73 Athens
Greece
+30 2103625943
+30 2103647925
central@balpel.gr www.ballas-pelecanos.com