The new Data Protection & Privacy 2025 guide features over 30 jurisdictions. It provides the latest information on the legal and regulatory framework for data protection, privacy litigation and collective redress, data privacy regulation for artificial intelligence and the internet of things, sectoral issues including advertising and employment law, data localisation requirements, and blocking statutes.
Last Updated: March 11, 2025
Introduction to the Data Protection & Privacy Guide
Data privacy has become a fundamental concern for individuals, businesses and governments worldwide, as the proliferation of digital technologies and the increasing reliance on data-driven services have transformed how personal data is collected, processed and shared. This transformation has brought about significant benefits, including enhanced connectivity, personalised services and economic growth. However, it has also raised critical questions about the protection of personal data and the privacy rights of individuals.
Data privacy regulation is a dynamic and evolving field, shaped by the interplay of technological advancements, societal expectations and legal frameworks. In many jurisdictions, data privacy laws are built on core principles such as transparency, accountability and user consent. These principles are designed to ensure that individuals have control over their personal data and that organisations processing data do so responsibly. Key elements of data privacy regulation often include requirements for data security, data minimisation and the rights of individuals to access, correct and delete their data.
One of the most significant challenges in data privacy regulation remains the issue of cross-border data transfers. As data transfers are part of everyday business, regulators must address the complexities of ensuring that personal data transferred to other jurisdictions remains adequately protected. This has led to the development of mechanisms such as Standard Contractual Clauses (SCCs), Binding Corporate Rules (BCRs) and adequacy decisions, which provide frameworks for international data transfers. Many jurisdictions, particularly in the MENA region, have recently adopted this approach and published data transfer regulations that sometimes require specific approval by state authorities. For instance, Saudi Arabia's Personal Data Protection Law (PDPL) requires data transfers occurring in the banking context to be approved by the Central Bank.
Similarly, major jurisdictions increasingly apply prohibitions and far-reaching restrictions on cross-border transfers to jurisdictions with questionable human rights practices, leading to de facto data localisation. The US government issued an Executive Order addressing the risk that countries could use advanced technologies, particularly artificial intelligence systems, to process large sets of personal data that could then be exploited for malicious cyber activities. Jurisdictions also frequently control the export, transit and brokering of dual-use technology incorporating large sets of data by applying export control regulations and requiring entities to obtain prior approval of the data transfer from export control authorities.
European Data Act
For a long time, protection focused only on personal data and personal information. The Chinese Data Security Law has provided a framework for the protection and transfer of important non-personal data since 2021, and the EU has now significantly expanded protection to cover non-personal data by adopting the European Data Act (DA). The DA represents a significant legislative effort to ensure fair access to and use of data within the EU. It complements existing data protection frameworks, such as the GDPR, by establishing new rules for how users of connected products and services can utilise the data they generate and how data holders can derive economic value from it. The DA aims to foster a competitive data market, promote data-driven innovation and enhance data accessibility, addressing key challenges in the digital economy. It introduces comprehensive rules on how data generated by connected products and related services can be accessed and shared, establishing a data access and sharing regime that applies to business-to-consumer and business-to-business interactions as well as to public sector bodies.
The scope of the DA is broad, impacting a wide range of stakeholders, including manufacturers of connected products (such as IoT devices like smart cars and home devices), providers of related services, data holders, data recipients, public sector bodies and certain providers of data processing services, such as cloud computing services.
The DA's requirements cover both personal and non-personal data, with a primary focus on non-personal data; personal data continues to be governed by the GDPR. The DA imposes specific obligations on certain cloud computing service providers, referred to as “data processing services”. These providers must facilitate switching without charging fees or imposing obstacles, ensuring that customers can transition smoothly to a different service provider. The DA requires providers to include mandatory terms in customer agreements guaranteeing the right to switch providers, and to comply with technical obligations that facilitate switching. The European Commission is currently developing SCCs for switching between data processing services.
The DA applies to manufacturers and related service providers established outside the EU, provided the connected products and related services are placed on the market in the EU. This extraterritorial scope is intended to ensure that users can exercise their access rights under the DA, regardless of the provider's location.
European Artificial Intelligence Act
The European Artificial Intelligence Act (AI Act) marks a pioneering effort by the EU to establish a unified legal framework for the regulation of artificial intelligence systems. As the first comprehensive legislation of its kind, the AI Act aims to address the unique challenges and opportunities presented by AI technologies, ensuring that they are developed and used in a manner that is safe, ethical and aligned with fundamental rights. The AI Act establishes requirements for high-risk AI systems to ensure transparency, accuracy and data quality, addressing concerns about the potential misuse of AI technologies.
The AI Act complements the GDPR by setting forth additional obligations for high-risk AI systems to ensure responsible data processing. While the GDPR mandates lawful, fair and transparent data processing, the AI Act imposes further restrictions on certain AI practices, such as social scoring and real-time facial recognition, to prevent discrimination and protect privacy. The AI Act emphasises reducing bias and ensuring transparency, particularly for high-risk AI systems, by requiring that users are informed when interacting with AI and understand how decisions are made.
Regarding the relationship between the GDPR and AI models, a number of questions remain unanswered. Most recently, the European Data Protection Board (EDPB) issued an opinion addressing key legal questions, such as when and how AI models can be considered anonymous, whether and how legitimate interest can serve as a legal basis for developing or using AI models, and what happens if an AI model is developed using personal data that was processed unlawfully.
The role of data protection authorities
Data protection authorities (DPAs) play a crucial role in enforcing data privacy laws and ensuring compliance. These authorities are responsible for monitoring data processing activities, conducting investigations and imposing penalties for non-compliance. They also provide guidance to organisations on best practices for data protection and facilitate co-operation among international regulators.
In the EU, the EDPB co-ordinates the activities of national DPAs, ensuring consistent application of the GDPR across member states. The EDPB issues guidelines and recommendations on various aspects of data protection, helping to harmonise interpretations of the GDPR and address emerging privacy issues.
Challenges in cross-border data transfers
Cross-border data transfers present significant challenges for data privacy regulation and continue to be a hot topic. The Schrems II decision by the Court of Justice of the European Union (CJEU) in 2020 highlighted the complexities of cross-border data transfers, invalidating the EU-U.S. Privacy Shield and emphasising the need for robust safeguards. In response, the European Commission adopted a new adequacy decision for the EU-U.S. Data Privacy Framework (DPF) in 2023, allowing data transfers to US organisations that self-certify under the framework.
Recent developments involve not only increased regulation of non-personal data transfers and the adoption of laws mandating data localisation, but also a trend in privacy litigation. Courts are increasingly awarding damages to individuals for violations of data transfer rules, and no longer only in high-risk contexts. This trend requires companies to carefully assess and consider the risks of using services provided by foreign vendors.
Earlier this year, the General Court of the European Union made a significant ruling, awarding damages for the transfer of an IP address to the United States at a time when the DPF was not in place. The court held that the website operator was liable for data transfers made through a third-party API embedded on the website, even though the website operator had not conducted the transfer itself. Such decisions may have implications for companies operating in both low-risk and high-risk contexts, as they could face mass tort litigation for using third-party services that transfer non-sensitive and device-related data to third countries.
Litigation and enforcement trends
Such decisions illustrate that data privacy litigation is on the rise, with individuals and organisations increasingly seeking redress for privacy violations. In many jurisdictions, data privacy laws provide a basis for claims for immaterial damages, although the determination of such damages remains a contentious issue. Recent court decisions in the EU have clarified some aspects of compensation, emphasising that it should correspond to actual harm rather than serve as a punitive measure. However, courts tend to interpret relevant statutes broadly to ensure effective protection of user privacy rights, which may lead to further waves of mass-claim litigation.
The introduction of collective redress mechanisms, such as the Representative Actions Directive in the EU, has fuelled this trend and expanded legal protection for consumers, enabling them to file collective actions for data protection violations. This development increases liability risks for companies, particularly in cross-border contexts, and highlights the importance of robust compliance programmes.
Data access and portability
The ability to access and port personal data is a key aspect of data privacy regulation. Laws such as the GDPR grant individuals the right to access their data and transfer it to another service provider, promoting transparency and competition in digital markets. The DA builds on these principles by establishing new rules for data access and portability, ensuring that users of connected products and services can leverage the data they generate.
The DA mandates that data holders make data available to users in a common, machine-readable format promptly and at no cost. Providers of connected products or services must inform users about the extent of data availability. The DA also facilitates data portability, requiring data processing service providers to enable customers to switch to another service provider without barriers. In addition, the DA includes measures to rebalance negotiating power for small and medium-sized enterprises in relevant contracts.
The intersection of data privacy and competition law
This is an emerging area of focus, particularly in the context of digital markets. The CJEU has ruled that competition authorities may take GDPR violations into account when assessing whether a company has abused its dominant market position, provided they co-operate with the competent data protection authorities. This decision has significant implications for organisations with dominant market positions that accumulate extensive personal data.
The interplay between data privacy and competition law highlights the need for a holistic approach to regulation, where privacy and competition concerns are addressed in tandem. This approach ensures that data-driven markets remain competitive while protecting individuals' privacy rights.
Conclusion
The landscape of data privacy law is complex and constantly evolving, reflecting the rapid pace of technological change and the growing importance of data in the digital economy. As jurisdictions worldwide seek to balance innovation with privacy protection, the EU's comprehensive regulatory framework serves as a model for other regions. Navigating this landscape requires a deep understanding of the legal and regulatory frameworks that govern data privacy, as well as the ability to adapt to new developments and challenges.