Metaverse
The metaverse is generally used to describe a virtual 3D world in which users (represented by avatars) can communicate, interact and conclude transactions through various technologies (such as virtual or augmented reality, NFTs and blockchain), although there is no generally accepted definition.
There are no specific laws or regulations governing the metaverse in the Netherlands. The laws applicable to the “offline” world are generally applicable to the “online” metaverse. The metaverse does pose legal challenges, especially in relation to intellectual property rights (and intermediary liability) and data protection.
Intellectual Property (and Intermediary Liability)
In the metaverse, products and services can be displayed which are protected by intellectual property rights, such as copyrights and trademark rights. Although this brings new (business) opportunities, holders of such intellectual property rights should carefully review the user terms of metaverse platforms prior to accessing and using the metaverse, as these terms may contain broad licensing clauses or even result in a transfer of intellectual property rights. Furthermore, risks may arise in relation to counterfeit products offered in the metaverse, for example in the form of NFTs. When counterfeit or infringing NFTs are sold, challenges arise in enforcing intellectual property rights and, for example, in identifying the “infringers”. This also raises questions regarding the intermediary liability of the provider of the metaverse platform.
Data Protection
When personal data is processed relating to EU data subjects, the General Data Protection Regulation (GDPR) is generally applicable. Challenges may arise in determining the relevant actors, such as the (joint) data controller(s), the data processor(s), the supervisory authority and the data subjects. The extent to which the purposes and methods are determined by the provider of the metaverse platform and the actors in the metaverse should be assessed for each instance. In addition, special and sensitive personal data can be processed in the metaverse, such as payment or medical data. For special categories of personal data stricter rules apply, which are further detailed in the laws of each member state of the European Union. In the Netherlands, national identification numbers, such as a citizen service number, may only be processed if this is prescribed by law. It is therefore of importance to assess which personal data is processed and what the legal basis is that will be invoked. Considering the global reach of the metaverse, the applicability of other (local) data protection laws may also be triggered when processing personal data.
The digital economy is a driver of many new laws and regulations. The key laws and regulations in the Netherlands relating to the digital economy are set out below.
An important recent development in the regulation of the digital economy is the entering into force of the EU Digital Markets Act (DMA) and the EU Digital Services Act (DSA).
The DMA aims to ensure that large online platforms that act as “gatekeepers” in digital markets behave in a fair way, while protecting European consumers and entrepreneurs and improving competition.
The DSA regulates different online intermediation services, aiming to ensure a safe online environment, protect the rights of users, and create a level playing field for companies. A list of intermediary services qualifying as Very Large Online Platforms and Very Large Online Search Engines has been published in the Official Journal of the European Union; these companies must also comply with the stricter obligations under the DSA.
In the Netherlands, the DSA will be further implemented in the DSA Implementation Act. Both the Authority for Consumers & Markets (ACM) and the Data Protection Authority (AP) will function as supervisory authorities. The ACM has recently published a guideline for the intermediary services that are governed by the DSA, setting out the applicable rules and obligations.
This development brings several challenges. Distinguishing between the different categories of intermediaries may prove difficult given the open and vague norms, and the roles and competencies of both supervisory authorities need to be clarified further. In the Netherlands, various parties responded to the first proposal of the DSA Implementation Act, flagging a number of other challenges in relation to the DSA.
Laws and Regulations
In the Netherlands, there are no specific laws or regulations in relation to cloud and edge computing. Cloud and edge computing services are subject to general Dutch and European laws and regulations, the key ones being the Dutch Civil Code, the GDPR, the Dutch Telecommunications Act and the NIS Directive, which is to be repealed by the NIS2 Directive (2022/2555) with effect from 18 October 2024. The NIS2 Directive has yet to be implemented into Dutch law. Pursuant to the NIS and NIS2 Directives, providers of critical and essential cloud or digital services must comply with the security requirements set forth therein.
The concept of cloud computing is described in the NIS2 Directive, which aims to achieve a high level of cybersecurity across the member states for operators of critical infrastructure and essential services. The NIS2 Directive describes cloud computing services as “digital services that enable on-demand administration and broad remote access to a scalable and elastic pool of shareable computing resources, including where such resources are distributed across several locations.”
Regulated Industries
The following industries or sectors are subject to more stringent regulatory requirements, which can also relate to the use of cloud and edge computing services.
Financial services
The financial services sector is subject to stringent regulations relating to cloud computing. The most notable development is the Digital Operational Resilience Act for the financial sector (DORA), which entered into force on 16 January 2023 and stipulates requirements to ensure digital resilience in the financial sector, including rules on ICT risk management, incident reporting, operational resilience testing and ICT third-party risk monitoring. The use of cloud computing services by financial institutions is also subject to the Dutch Financial Supervision Act and its further substantiation by the supervisory authority, the Dutch Central Bank (De Nederlandsche Bank, or DNB), such as the “Circulaire Cloud Computing” and the “Good practices for managing outsourcing risks” published by the DNB. The DNB further supervises compliance with the European Banking Authority Guidelines on outsourcing arrangements of 25 February 2019 (the “EBA Guidelines”).
Healthcare
Specific provisions apply to the processing of medical (patient) data. On 1 July 2017, the law on clients’ rights to electronic data processing in healthcare entered into force, regulating the secure exchange of medical (patient) data.
Other regulated industries
The other regulated industries are (i) electricity and (ii) telecoms.
Data Protection Issues
The main issues with cloud and edge computing from a data protection perspective relate to data transfers outside of the European Economic Area (EEA).
The transfer of personal data outside the EEA is generally only allowed if the third country in question ensures an adequate level of protection of personal data. The GDPR specifies under which circumstances personal data can be transferred to third countries, for instance in case of: (i) an adequacy decision (which has been adopted for the United Kingdom post-Brexit); (ii) appropriate safeguards (such as Binding Corporate Rules or Standard Contractual Clauses); or (iii) other specific derogations (such as the data subject’s consent).
Cloud providers often host customers’ personal data in countries outside of the EEA, such as the United States. In the Schrems II decision of the ECJ, the transfer mechanism for EU-US data transfers (the Privacy Shield) was declared invalid. The use of Standard Contractual Clauses remains possible for EU-US data transfers, provided that a Data Transfer Impact Assessment (DTIA) is conducted prior to transferring the personal data and that, where necessary, additional measures are taken.
On 10 July 2023, the Data Privacy Framework entered into force, constituting a new EU-US data transfer mechanism. It succeeds the Privacy Shield and aims to facilitate legitimate transfers of personal data from the EU to the US, provided that the US recipient has self-certified under the Data Privacy Framework. Critics argue that the Data Privacy Framework will also be invalidated by the ECJ, given its similarities with previous frameworks such as the Privacy Shield and Safe Harbour.
AI Act
On 14 March 2024, the AI Act was adopted by the European Parliament (EP). The AI Act consists of rules governing the use of Artificial Intelligence (AI) within the EU. The primary goal of the EP is to ensure that AI systems used within the EU are safe, transparent, traceable, non-discriminatory and environmentally friendly. Additionally, the EP takes the position that AI systems should be overseen by people rather than being fully automated, and it advocates for a uniform definition of AI applicable to all future AI systems. The text adopted by the EP must now be formally adopted by the Council of the EU.
Until the AI Act is adopted by the Council of the EU, there is no specific legislation in the Netherlands governing AI. The Dutch government has in the meantime taken proactive steps to harness AI’s potential while mitigating the associated risks. In October 2019, the Dutch government unveiled its Strategic Action Plan for Artificial Intelligence, a comprehensive framework outlining the nation’s ambitions to capitalise on AI’s socio-economic benefits. Central to this plan is fostering collaboration through the Dutch AI Coalition, a partnership between companies, government entities and educational institutions aimed at implementing AI initiatives across various sectors.
The government’s strategic vision underscores AI’s role in addressing pressing societal issues such as population ageing, climate change, and healthcare. Nonetheless, it emphasises the imperative of safeguarding fundamental rights like privacy, non-discrimination, and autonomy in the face of AI advancements.
To promote ethical innovation, the Ministry of the Interior and Kingdom Relations introduced the Toolbox for Ethically Responsible Innovation. This toolbox provides developers with guidance on prioritising public values and fundamental rights in AI projects. Key principles within the toolbox stress the importance of incorporating safety measures into technology development, particularly concerning the processing of personal data.
In July 2019, the DNB issued guidelines outlining principles for the responsible use of AI in the Dutch financial sector. The DNB has highlighted the potential of AI to enhance business processes, while also underscoring the need for accountability, fairness and transparency to mitigate risks such as reputational damage and harm to consumers.
Liability
While the Dutch Civil Code lacks specific provisions addressing AI, liability principles are still applicable. Manufacturers may be held liable for AI-related damages under existing product liability laws, contingent on factors like product safety expectations and defectiveness (Article 6:185 of the Dutch Civil Code). Moreover, fault-based liability (Article 6:162 of the Dutch Civil Code) may come into play in cases where neither product liability nor possessor liability (Article 6:173 of the Dutch Civil Code) applies.
Recognising the need for legal clarity in the digital economy, the European Commission proposed revisions to the Product Liability Directive in September 2022. These revisions aim to establish frameworks for AI-related liability, ensuring legal protection and accountability in AI deployment.
Data Protection
Data protection remains a paramount concern, with the GDPR and the Dutch Implementation Act providing the regulatory framework. The Dutch Data Protection Authority oversees compliance, emphasising the principles of lawfulness, fairness, and transparency in AI-related data processing activities.
Intellectual Property
Dutch patent law does not explicitly protect AI systems, but certain components, such as software or training models, may be patentable. Challenges may arise regarding inventorship of AI-generated inventions, as the European Patent Office has rejected the designation of an AI system as inventor in patent applications. It is still unclear whether copyright protection applies to AI-generated works, although it is argued that such protection applies where human intervention contributes significantly to their creation.
The Trade Secrets Directive safeguards against unauthorised use and disclosure of confidential information, potentially encompassing AI systems as trade secrets if certain criteria are met.
Fundamental Rights
The Dutch government is acutely aware of AI’s implications for fundamental rights, as evidenced by the childcare benefits scandal in September 2018. This scandal underscored the discriminatory impact of AI algorithms and prompted a re-evaluation of AI governance practices. The government advocates for a human-centred approach to AI, prioritising respect for human rights and public values. To this end, it has issued guidelines for government agencies on algorithm use, emphasising transparency, accountability, and public engagement.
In conclusion, while specific AI regulation is still lacking in the Netherlands, strategic initiatives and collaborative efforts demonstrate a commitment to responsible AI deployment. By addressing ethical, legal and societal considerations, the government seeks to maximise the benefits of AI while safeguarding fundamental rights and promoting public welfare.
The Internet of Things (IoT) is one of the main developments in our current digitalised society. From smart connected cars to smart cities, day-to-day objects and assets are now connected and equipped with data-driven technologies.
GDPR
On 25 May 2018, the GDPR came into effect. The GDPR has had major implications for all businesses in the EU, as well as all businesses that offer goods or services to EU-based customers and use their personal data. Challenges that may arise for providers of IoT solutions include the need to:
The e-Privacy Regulation
The e-Privacy Directive (2002/58/EC) is expected to be repealed by the e-Privacy Regulation, which is not yet approved. The e-Privacy Regulation expressly regulates and applies to machine-to-machine (M2M) communications, in particular relating to the confidentiality of data of such communications, with certain exceptions possibly applying.
The Digital Content and Digital Services Directive (2019/770)
The Digital Content and Digital Services Directive is implemented in the Dutch Civil Code and applies in the Netherlands to certain aspects concerning contracts for the supply of digital content and digital services. The Directive aims to promote the internal market and achieve a high, and as uniform as possible, level of consumer protection by harmonising a number of aspects in consumer contract law. The Directive stipulates, for example, that consumers have the right to receive (security) updates for all digital content (such as games and applications), digital services (like streaming) and all goods incorporating technology (such as IoT devices) for as long as they can reasonably expect such updates.
The Dutch Media Act 2008 aims to ensure that a diverse range of radio and TV channels are accessible to the public. It lays down requirements for both public and commercial broadcasters. It mandates regulations for public broadcasters regarding programming and advertising on their channels.
Supervision of compliance with the Dutch Media Act 2008 falls within the purview of the Dutch Media Authority (Commissariaat voor de Media). This oversight extends to broadcasters (TV, radio), commercial video-on-demand services (VODs), and video-sharing platform services (VSPs), although large VSPs often operate outside Dutch jurisdiction.
Commercial video-on-demand services are assessed against five criteria (primary purpose, mass media nature, economic orientation, editorial responsibility and cataloguing structure) to determine their regulatory status. Notable examples within Dutch jurisdiction are Netflix and YouTube channels operated by uploaders.
Video-sharing platforms like YouTube offer audiovisual content without editorial responsibility, therefore falling outside Dutch jurisdiction. However, complaints regarding audiovisual media services registered in the Netherlands but available in other European countries can be lodged with the Dutch Media Authority.
The Dutch Media Act 2008 aims to protect audiences from harmful content, regulate commercial communication, promote European, national, and independent works, and address the specific nature of public service media. Broadcasters and video-on-demand service providers have the freedom to determine the form and content of their offerings within legal boundaries.
For commercial media service providers, the rules on advertising, sponsoring and product placement are vital. Advertising may only be shown during clearly demarcated commercial breaks, and adherence to the rules of the Dutch Advertising Code Foundation is mandatory. Sponsoring and product placement disclosures must precede or follow the sponsored content or programmes, with strict prohibitions on inducements and excessive product focus. Product placement is prohibited in programmes targeting children under twelve and in programmes concerning news and current affairs, consumer issues, or religious or spiritual content. Stricter rules on advertising and commercial communication apply to public media service providers in order to maintain their public and independent nature.
To establish a commercial broadcasting station in the Netherlands, a license or registration from the Dutch Media Authority is necessary, along with potential requirements from the Dutch Authority for Digital Infrastructure or agreements with cable operators. Applications for renewal must be submitted five months before expiration, and licenses or registrations are valid for five years.
In summary, the Dutch Media Act 2008 plays a crucial role in regulating the audiovisual media landscape, ensuring accessibility, protecting audiences, and promoting diverse content while imposing strict standards on advertising and commercial communication. Compliance with this law is essential for broadcasters and media service providers to operate within Dutch jurisdiction.
The Dutch Telecommunications Act is applicable to electronic communication providers and distinguishes between:
In the Netherlands, communication providers (such as providers of landline or mobile telephony, internet access or an internet network, email or webmail services, video conferencing services and internet telephony) are granted general authorisation to operate, without needing specific licenses, permits, or consents. However, there is a requirement to register with the Authority for Consumers & Markets (Autoriteit Consument & Markt, or ACM) before commencing operations. Registration entails providing details about the provider’s corporate structure, turnover, and services offered in the country.
Upon successful registration, the provider is listed in the public register of communication companies and receives a unique registration number. Any subsequent changes to activities must be promptly notified to the ACM for updating the registration. Additionally, registered communication companies are obligated to annually report their turnover from communication services to the ACM. Based on this information, the ACM imposes an annual fee, which varies depending on the turnover. Companies with a turnover of EUR2 million or less are exempted from this fee.
However, mobile operators and other spectrum users are required to obtain a license to install or operate specific mobile network equipment.
Technology Agreements
The freedom of contract forms an important principle in Dutch contracting law. Therefore, parties are largely able to deviate from the Dutch Civil Code (unless specific mandatory provisions apply) and determine the contractual arrangements and remedies.
Prior to concluding a technology agreement, parties may enter into a letter of intent (also referred to as an LOI, MOU or Heads of Terms). This can be either a binding or a non-binding agreement setting out the process for the upcoming negotiations and the key elements of the technology agreement. A letter of intent can also allow the parties to terminate negotiations without incurring liability, which is relevant because pre-contractual liability can arise under Dutch law.
After the conclusion of a letter of intent (or a variation thereof), parties will generally negotiate a technology agreement (usually a Master Services Agreement (MSA), depending on the type and nature of the technology agreement). In technology agreements, the following matters are generally subject to negotiation:
Regulated Industries
The following industries or sectors are subject to more stringent regulatory requirements (see also 3.1 Highly Regulated Industries and Data Protection):
Laws and Regulations
The rules regarding digital signatures are outlined in the eIDAS Regulation (Regulation (EU) No 910/2014) and Article 3:15a of the Dutch Civil Code (DCC). These provisions distinguish between three categories of electronic signatures, each of which can be equated with a handwritten signature under certain conditions.
The following three categories of electronic signatures are distinguished:
The ordinary electronic signature is qualified as such when data in electronic form is attached to or logically associated with other data in electronic form and used by the signatory to sign. This includes scanned signatures.
More requirements are placed on the advanced electronic signature. Such a signature must:
The qualified electronic signature, where the signature is verified using a certificate issued by a recognised certification service provider, provides the most guarantees and is equated with a handwritten signature by the eIDAS Regulation, with recognition throughout the EU.
The advanced and ordinary electronic signatures are equated with a handwritten signature under Article 3:15a DCC if the method used is sufficiently reliable. This is an open norm that must be assessed based on the specific circumstances of the case.
Evidential Value of Electronic Signatures
If the advanced or ordinary electronic signature is considered sufficiently reliable, it has the same legal effects as a handwritten signature, just like a qualified electronic signature. Evidentially, this means that the electronically signed document is a private document and generally carries conclusive evidential value according to Article 156a in conjunction with Article 157 of the Dutch Code of Civil Procedure. If the advanced or ordinary signature is not considered sufficiently reliable, the document has free evidential value, and its assessment is left to the discretion of the court.
Gustav Mahlerplein 70
1082 MA
Amsterdam
Netherlands
+31 (0) 88 374 49 00
+31 (0) 10 412 79 41
www.habrakenrutten.com
Introduction
The Dutch business community and the government have a positive view of technology and the opportunities and benefits it offers. Cloud and AI are embraced, albeit with caution and an awareness of the risks. In general, however, such risks are deemed to be low and acceptable, which differs from the position in a number of other EU countries.
2023 witnessed remarkable strides in the dynamic realm of TMT, setting the stage for transformative trends that will shape the near future. From the surge in Generative AI adoption and the imperative to curtail algorithmic biases to the convergence of Augmented Reality (AR), Virtual Reality (VR) and Artificial Intelligence (AI) signalling an era of unparalleled customisation, this article explores the key developments driving innovation across industries, with an emphasis on the developments in relation to AI.
Delving into the realm of robotics, the integration of machine learning algorithms and sensor technologies brings us closer to the realisation of collaborative human-robot ventures. As the digital landscape braces for an increase in cyber threats, a unified defence against sophisticated AI-powered attacks becomes paramount, emphasising the need for standardised cybersecurity frameworks and collaborative efforts between industries and governments to safeguard data and privacy. An instrumental role to curtail these threats is foreseen in the EU Artificial Intelligence Act (AI Act).
This article will also discuss the latest TMT trends regarding competition law, the various inbound legislations on the use of data, an update on the 5G and FM frequency auctions, and important developments in light of personal data protection and taxes.
EU AI Act
On 9 December 2023, the European Parliament and the Council of the EU reached a provisional agreement on the text of the AI Act, which is likely to be the world’s first comprehensive law regulating the use of AI. The agreement must still be formally approved by all EU member states and the entire European Parliament, and most of the AI Act’s provisions will apply two years after it enters into force. The AI Act is risk-based, which means that the obligations of providers and users of AI systems depend on the level of risk that the respective AI technology may create and/or generate, and that AI systems with a higher risk of affecting society will be subject to stricter requirements.
The AI Act attempts to establish guidelines to navigate the complexities of AI-generated content and authorship. The legislation recognises the importance of human creativity and addresses concerns surrounding ownership and attribution in instances where AI autonomously produces content. Furthermore, the AI Act integrates provisions for digital rights management tailored to AI-generated media, safeguarding against the unauthorised use and potential misuse of copyrighted material.
Prohibited uses
The proposal bans AI systems that contravene EU values, for example through the violation of fundamental rights. This applies, for example, to “social scoring” by governments or companies, or to the classifying or profiling of people based on their social behaviour or personal characteristics. The banned systems include:
High-risk AI
High-risk systems, such as AI systems that may cause significant harm to human health, safety and/or fundamental rights, are subject to more restrictions under the AI Act. For instance, AI used in controlling critical infrastructure in the fields of water, gas and electricity or medical devices will be subject to more scrutiny. Biometric identification, categorisation and emotion recognition systems are also considered high-risk.
Minimal risk
Minimal risk AI systems are those that can be developed and used according to existing legislation – ie, there are no additional requirements for them in the proposed regulation. Such systems include spam filters and search engines.
Specific transparency risk
Users should be made aware that they are interacting with a chatbot when using such an AI system – eg, ChatGPT. There will also be a requirement for registration in a central European database before the application can be marketed in Europe. In addition, it is mandatory to record all copyrighted data used to train AI systems and to disclose its use in a detailed overview. Deepfakes and other AI-generated content need to be labelled as such, and users need to be informed when biometric categorisation or emotion recognition systems are used. Finally, providers need to make content detectable as having been artificially generated or manipulated, by designing the system in such a way that artificially generated or manipulated audio, video, text and images are marked in a machine-readable format.
Innovation
To support innovation in AI, the AI Act includes exemptions for research activities and for AI components provided under open-source licences. The AI Act also promotes so-called test environments (sandboxes), set up by government agencies, in which AI can be tested before it is brought to market.
Position of citizens
The European Parliament wants to strengthen citizens' rights, for example by enabling citizens to file complaints about AI systems and to receive explanations of decisions made using AI systems.
Next steps
On 9 December 2023, the European Parliament and the Council of the EU reached a preliminary political agreement on the AI regulation; they are still working on the formal legal text before the regulation is adopted by both bodies. Once the AI regulation is adopted, member states will begin preparing for its application domestically. Most of its rules will take effect two years after the regulation enters into force, ie, in mid-2026 as currently expected.
Furthermore on AI, on 11 December 2023, the Dutch government announced by way of a letter from the Minister of Digitalisation that it is currently in the process of creating guidance on safe and responsible use of generative AI within the government. In addition, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) announced in its yearly plan for 2024 that algorithms and AI will be a specific area of focus.
Technology Trends in Competition Law
The EU Digital Markets Act (DMA) became applicable in May 2023, and intends to ensure fair and contestable digital markets. In September 2023, the European Commission designated six companies as so-called “gatekeepers” under the DMA for the first time – Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft.
Being designated as a gatekeeper comes with various obligations to maintain fair and effective competition. By March 2024, gatekeepers must have adjusted their businesses to comply with the DMA. Apple, ByteDance and Meta have all appealed their designations. More gatekeepers may be designated by the Commission in 2024. The case law resulting from the different proceedings will likely shape the Commission’s DMA enforcement in 2024 and the years to come.
The EU Digital Services Act (DSA) intends to create a safer digital space where the fundamental rights of users are protected, and to establish a level playing field for businesses. It already applies to designated online platforms and search engines, and will apply to all regulated companies as of February 2024. In December 2023, the Commission adopted a second set of designation decisions under the DSA, designating three very large online platforms: Pornhub, Stripchat and XVideos.
These designations follow a first set of designation decisions of 19 very large online platforms and search engines in April 2023. Amazon and Zalando have already challenged their designation by the Commission. Separately, in December 2023, the Commission opened formal proceedings against X to assess whether it may have breached the DSA in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers. Again, the outcome of the different proceedings will likely guide the Commission's DSA enforcement in future. Amazon, in particular, has put forward wide-ranging arguments against the Commission’s designation, including challenges to the scope of the DSA itself.
In 2024, the Dutch Authority for Consumers and Markets (ACM) will likely continue to pay particular attention to companies’ use of algorithms and AI.
The focus in relation to antitrust damages cases before the Dutch civil law courts will likely continue to shift from follow-on cartel damages cases to abuse of dominance damages cases, specifically in relation to Big Tech. In 2024, the Dutch litigation landscape is expected to change even further in relation to Big Tech companies that have been designated under the DMA/DSA. A new wave of private enforcement actions may therefore be forthcoming in the Netherlands shortly.
Reporting Obligations for Platform Companies in Respect of Tax
EU rules were introduced in 2023 (under Directive 2021/514, also known as DAC-7) that extended EU tax transparency requirements to digital “platforms”. The aim of DAC-7 is to improve administrative tax co-operation – and counter tax fraud/tax evasion – and to address the challenges posed by the digital platform economy. Accordingly, an obligation was introduced for “platform operators” to provide information on income derived by sellers through those platforms.
An entity qualifies as a platform operator if it contracts with sellers in order to provide those sellers with access to and use of an online “marketplace” platform. A platform operator falls under the scope of the DAC-7 rules if, in brief, it is tax resident in, has been incorporated under the laws of, or is operationally active in an EU member state. The rules affect platform operators offering sellers access to:
The Netherlands implemented the DAC-7 directive into its local laws as of 1 January 2023, and platform operators had to file their first DAC-7 reports, covering 2023, in January 2024.
Delayed Introduction of Digital Services Tax
The Dutch political parties that formed the 2021–2023 government (which collapsed in the summer of 2023) agreed to introduce a digital services tax in the Netherlands. However, unlike several other EU member states, they did not progress this topic, and no such tax has yet been introduced. The political parties that won the elections on 22 November 2023 did not include any significant proposals relating to the introduction of a digital services tax in their campaign plans.
Accordingly, for now it remains most likely that the Netherlands will await global acceptance of the OECD’s initiative (also known as “Pillar 1”) to adapt the multilaterally agreed international tax system currently in place, in such a way that some of the world’s largest multinational e-commerce businesses would need to start paying income taxes based not on where they are located or otherwise active, but on where their consumers are located.
5G Auction to be Held
The 5G frequency auction that was originally scheduled for 2023 was postponed to early 2024 due to a dispute involving satellite communications provider Inmarsat. Now that this auction is likely to finally take place this year, media and telecommunications companies can brace themselves for an important year.
(Upcoming) Data-Related Legislation
In an attempt to keep up with the fast-evolving landscape of digital governance and the accumulation of a significant quantity of data in the digital space, the EU has undertaken significant legislative initiatives to fortify its financial, cybersecurity and data management frameworks, including the Digital Operational Resilience Act (DORA), the NIS2 Directive, the EU Data Act and the EU Data Governance Act.
Digital Operational Resilience Act
DORA is a ground-breaking legislative move in the EU, reshaping the way financial institutions manage operational risks in an increasingly digital era. Addressing the escalating reliance on digital infrastructure and the severe consequences of operational disruptions, DORA introduces a comprehensive framework to counteract cyber threats and enhance operational resilience.
Under DORA, outsourcing service providers in the EU face heightened requirements, compelling them to invest significantly in robust cybersecurity measures, compliance frameworks and incident response capabilities. This shift is expected to fuel demand for specialised cybersecurity services within the outsourcing industry.
DORA mandates financial institutions to identify and manage cyber risks through regular risk assessments, resilience testing and incident reporting to competent authorities. The regulation aims to ensure the stability and security of the financial sector, safeguard consumers and strengthen the overall economy. Virtually all financial entities in the EU fall under DORA's purview, including key third parties providing ICT-related services.
Outsourcing is central to DORA, with financial institutions facing stringent responsibilities regarding cybersecurity assessments and monitoring of third-party service providers. Key requirements include addressing ICT third-party risks through rigorous contractual arrangements, location indications for data processing, service descriptions, access guarantees, exit strategies, audits and performance targets.
DORA was adopted on 14 December 2022 and entered into force on 16 January 2023; in-scope companies must achieve compliance by 17 January 2025. As a binding regulation applicable across all EU member states, its requirements should already be integrated into ongoing outsourcing agreements, emphasising the need for prompt adherence. Within the Netherlands, the Dutch Authority for the Financial Markets (Autoriteit Financiële Markten) will be responsible for supervising compliance with DORA.
NIS2 Directive
The EU Directive on measures for a high common level of cybersecurity across the Union (“NIS2 Directive”) signifies a monumental step towards harmonising cybersecurity standards across EU member states. Replacing the previous NIS Directive, the NIS2 Directive broadens its scope to cover medium-to-large enterprises and public organisations performing vital functions for the economy or society.
The NIS2 Directive introduces measures for elevating the cybersecurity posture of critical infrastructure operators and digital service providers, focusing on risk management, incident reporting and security certification schemes. Compliance with the NIS2 Directive is essential for outsourcing service providers, necessitating investments in advanced cybersecurity technologies, workforce training and robust incident-response capabilities.
Notably, the relationship between DORA and the NIS2 Directive is critical, with both pieces of legislation sharing significant requirements. As a regulation, DORA applies directly to all EU member states, while the NIS2 Directive is a directive requiring implementation into national law. The NIS2 Directive provides an exemption to ensure overlapping DORA provisions take precedence.
The NIS2 Directive, which entered into force in January 2023, requires EU member states to transpose it into national legislation by 17 October 2024. As organisations seek expert guidance on cybersecurity compliance, demand for cybersecurity consulting services is likely to increase, offering a competitive advantage to outsourcing providers with robust security measures.
Furthermore, the Dutch legislature expects to publish the proposal for the Dutch implementation of the NIS2 Directive for public consultation in the first quarter of 2024.
Data Act
In addition to DORA and the NIS2 Directive, the EU Data Act promises to have a profound impact on the technology and outsourcing landscape, particularly concerning the vast amounts of data processed through internet of things (IoT) services. Aimed at establishing a unified data governance framework across the EU, the EU Data Act emphasises data sharing and management practices, prioritising data protection and privacy.
The EU Data Act introduces new rules regarding data governance, access and sharing, enhancing data portability and interoperability. For outsourcing providers, compliance requires robust data management practices, stringent privacy measures and secure data-sharing protocols. Organisations aligned with the principles of the General Data Protection Regulation (GDPR) will find themselves well-prepared for the EU Data Act, while non-compliance may pose challenges in serving European clients or processing EU citizens' data.
The Data Act was adopted by the Council of the EU on 27 November 2023 and entered into force on 11 January 2024. It will apply 20 months later, from 12 September 2025, prompting outsourcing providers to closely monitor developments and proactively ensure adherence to its provisions. In the Netherlands, it is not yet clear which supervisory authority will be responsible for monitoring compliance with this regulation.
EU Data Governance Act
Finally, 2023 marked the enactment of the EU Data Governance Act. This regulation on data governance seeks to improve the development of trustworthy data-sharing systems by regulating the re-use of data held by the public sector. In order to achieve this purpose, the EU Data Governance Act provides for four broad sets of measures:
The EU Data Governance Act applies to both personal and non-personal data, whereas the GDPR only applies to personal data. Where personal data is in scope, the GDPR will apply alongside the EU Data Governance Act.
The EU Data Governance Act entered into force on 23 June 2022 and has been applicable since September 2023. Within the Netherlands, the Netherlands Authority for Consumers and Markets (Autoriteit Consument en Markt) is responsible for supervising compliance with the EU Data Governance Act.
Data Protection – Transferring Personal Data to the US
Perhaps the most impactful trend with respect to data protection was the adoption of the adequacy decision for the EU-U.S. Data Privacy Framework (EU-US DPF), succeeding the Privacy Shield. This decision reinstates the possibility of data transfers between EU organisations and those in the US that have self-certified compliance with the principles of the EU-US DPF. Following the EU, the UK and Switzerland have also adopted their respective transfer mechanisms: the UK Extension to the EU-US DPF and the Swiss-U.S. DPF.
Given the previous invalidation of predecessors such as the “Safe Harbour” agreement and the “Privacy Shield” by the Court of Justice of the European Union (CJEU), the durability of the EU-US DPF remains to be seen. One attempt by a member of the French Parliament, Philippe Latombe, to challenge the decision through an application for interim relief has already failed, as the CJEU held that he had not established serious and irreparable harm; the CJEU has therefore not (yet) reviewed this transfer mechanism on the merits. However, given a declared intention to initiate legal action against the EU-US DPF by the person responsible for the invalidation of its two predecessors, such a review is still likely to occur in the next couple of years.
This and other data protection-related trends will be discussed in more detail in the Netherlands Trends & Developments chapter of Chambers Data Protection & Privacy 2024.
Cloud
Although the Netherlands has a relatively high adoption of cloud solutions, many enterprises still have a major roadmap for bringing applications to the cloud. These projects can be challenging, especially if it is not recognised early on that certain applications will need to be replaced by applications with slightly different functionality, which increases the risk of failure.
The Dutch government has a positive view of technology and the cloud, and has adopted a new cloud policy that allows broad use of cloud applications. US cloud providers are not viewed with suspicion, and the risks associated with the US CLOUD Act are usually deemed acceptable.
Conclusion
In conclusion, the continuously evolving TMT landscape witnessed transformative developments in 2023 that set the stage for the future. The adoption of Generative AI, the convergence of AR/VR and AI, and the advancements in robotics signal a paradigm shift in innovation. As the digital landscape faces escalating cyber threats, the EU AI Act and collaborative defence mechanisms take centre stage, emphasising the importance of standardised cybersecurity frameworks.
The EU AI Act, with its risk-based approach, is poised to become the world's first comprehensive law regulating AI use. It addresses concerns surrounding AI-generated content, emphasising human creativity, ownership and attribution. The Act's focus on prohibited uses and the regulation of high-risk AI systems demonstrates a commitment to safeguarding fundamental rights and societal well-being.
In the realm of competition law, the DMA and DSA are reshaping digital markets by designating gatekeepers and establishing a safer online space. These designations, along with ongoing legal proceedings, will significantly influence DMA and DSA enforcement in the coming years.
In the realm of taxation, the extension of EU tax transparency requirements to digital platforms (DAC-7) and the upcoming reporting obligations for platform operators mark a significant step in countering tax fraud in the digital platform economy.
The delay in the introduction of a digital services tax in the Netherlands reflects a cautious approach, possibly awaiting global acceptance of the OECD's initiative. Furthermore, the 5G and FM frequency auctions, initially scheduled for 2023, are anticipated in early 2024, bringing substantial developments for media and telecommunications companies in the Netherlands.
The legislative initiatives on data management – including DORA, the NIS2 Directive, the EU Data Act and the EU Data Governance Act – underline the EU's commitment to fortifying digital governance. DORA's impact on financial institutions and outsourcing providers is evident, emphasising the need for robust cybersecurity measures.
The EU Data Governance Act, enacted in 2023, focuses on trustworthy data-sharing systems, setting measures for re-using public sector data and facilitating cross-sector and cross-border data sharing. This legislation, alongside the EU Data Act, strengthens data protection and privacy, aligning with GDPR principles.
Furthermore, the adequacy decision for the EU-US DPF reinstates data transfers, marking a pivotal trend in data protection. However, the sustainability of the EU-US DPF, given past invalidations, remains a crucial aspect to monitor.
As we move forward, TMT will continue to be an area deserving of close legislative attention, as it remains subject to technological advancements and instrumental in balancing innovation with societal well-being. The next phase will require adaptability and collaboration across industries to navigate the complexities of the rapidly changing digital environment.
Beethovenstraat 545
1083 HK Amsterdam
The Netherlands
+31 651 289 224
+31 20 301 7300
herald.jongen@gtlaw.com
www.gtlaw.com