Data Protection & Privacy 2026

Last Updated March 10, 2026

Brazil

Law and Practice

Authors



Lopes Pinto, Nagasse Advogados is based in São Paulo. The firm provides expertise across many areas, including corporate and business law, tax and planning, data protection (LGPD, GDPR, and PIPL), contracts, regulation, digital assets, blockchain, transportation, logistics, labour, infrastructure, agribusiness, banking and finance, bioscience, civil law, corporate governance, compliance, tech law, and legal risks. The team of highly skilled professionals possesses in-depth experience of national and multinational companies and law firms, and the modus operandi of organisations and businesses. Lopes Pinto, Nagasse Advogados prides itself on being a highly ethical firm, focused on achieving results and providing excellent service to its clients.

In Brazil there are two sources of personal data protection and privacy laws: (i) primary, or original, and (ii) secondary, or derived.

Primary or original sources are concentrated in the Brazilian Federal Constitution and the General Data Protection Law (LGPD), since Brazil has adopted the centralist principle in which the protection of personal data is governed by federal rules rather than local rules. This means that any provision on the protection of personal data has its normative and conceptual origin in the Federal Constitution and the legal framework for the subject, ie, the LGPD.

Within the scope of the Federal Constitution, even before the general concept of personal data and its protection was recognised, the safeguarding of people's intimacy was already provided for in Article 5, X, considered the most essential sphere of personal autonomy. With Constitutional Amendment No 115/22, the specific protection of personal data was entered into the list of fundamental rights, becoming part of the constitutional core to be protected.

In terms of secondary or derived sources, reference should be made to the Consumer Protection Code (Law 8.078/90), which established rules for the use of personal data (it did not expressly refer to “processing”) in consumer relations. Subsequently, the Access to Information Law (Law 12.527/11) was enacted. Although it did not specifically refer to personal data, it introduced rules on how citizens could exercise the right to information, provided for in the Federal Constitution (Article 5, XXXIII), reinforcing the mechanisms designed to protect the inviolability of intimacy. With the Civil Rights Framework for the Internet (Law 12.965/14), the protection of privacy was further consolidated, since the circulation of personal data and information online was afforded its own regulation.

The big revolution came in 2018, with the introduction of the General Data Protection Law (Law 13.709/18). Materially inspired by the European Union's General Data Protection Regulation, this Law gave personal data the status of a “fundamental right protected in practice”, with modern and efficient mechanisms aimed at ensuring that the processing of this data should only occur within strict and well-defined guidelines.

From then on, and particularly following the enactment of Law 13.853/19, which established the National Data Protection Authority (ANPD) and the National Council for the Protection of Personal Data and Privacy, the LGPD gained an “enforceable” arm of personal data protection, similar to the international concept of “enforcement”. As a result, the ANPD, a federal agency whose main legal mandate is to regulate the processing of personal data in Brazil, began to issue key regulations on matters ranging from the role of the Data Protection Officer (DPO, referred to in the LGPD as the “Data Officer”) to the international transfer of personal data.

This entire normative group, however, has the territorial scope defined by the LGPD itself. The law provides (Article 3, I) that its concepts and rules apply whenever “the processing operation is carried out in the national territory”; it also reaches processing carried out abroad when the purpose is to offer goods or services to individuals located in Brazil, or when the personal data was collected in Brazil (Article 3, II and III). The territorial criterion is therefore broader than strict territoriality.

An important aspect of these rules is their relationship to legal issues regarding cybersecurity and AI. For example, the LGPD establishes that it is the obligation of the processing agent – especially the controller – to adopt cybersecurity measures (encryption, access control, cross-verification, 2FA, and others) to protect personal data.

In the field of AI, the interaction with personal data protection standards is even more significant. Suffice it to mention that the mechanisms employed by AI and its developments (machine learning, active virtualisation, and others) rely on large databases to identify patterns, establish correlations, and generate results, which in general involves processing personal data, including sensitive data.

In Brazil, the processing of personal data can only be carried out under certain principles (LGPD, Article 6):

  • adequacy;
  • good faith;
  • purpose;
  • free access;
  • non-discrimination;
  • necessity;
  • accountability;
  • prevention;
  • data quality;
  • security; and
  • transparency.

Based on these principles, there are the so-called “essential requirements” for any processing of personal data. The first is that the processing must be carried out under the rules of the LGPD, which involves investigating where the data was collected, where it is processed, and for what purposes (the “triad”).

The other requirements are (Articles 7 and 11 of the LGPD):

  • consent of the data subject;
  • compliance with a legal or regulatory obligation by the controller;
  • execution of a contract or preliminary procedures in a contract to which the data subject is a party, at their request;
  • execution of public policies;
  • regular exercise of rights in judicial, administrative or arbitration proceedings;
  • legitimate interest of the controller or third party;
  • protection of the life or physical safety of the data subject or a third party;
  • credit protection;
  • conducting studies by a research body; and
  • protection of health by health professionals.

The LGPD also establishes the main rights of the data subject (Articles 17 and 18):

  • access to personal data;
  • anonymisation, blocking or deletion of unnecessary, excessive or non-compliant data;
  • confirmation of the existence of processing;
  • data correction;
  • deletion of personal data processed with consent;
  • information about the possibility of denying consent and the consequences of denial;
  • information on shared use of data;
  • intimacy;
  • freedom;
  • portability;
  • privacy;
  • revocation of consent; and
  • ownership.

In the context of the processing of personal data, meeting the principles and respecting the rights is not sufficient. Each processing agent – controller and operator – must observe the “organic requirements” for this processing. The main “organic requirement” is the centralised and unified control of personal data processing operations, provided for in Article 37 of the LGPD. This requires that the agent should institute, maintain, and manage a system capable of providing, in real time, up-to-date information on the circulation of personal data in the organisation, including the categories of data, the purpose of the processing, and the protection measures in place.

In this sense, the processing agent should consider adopting strict and transparent internal policies, protocols for the regular processing of data, specific procedures and, at a more objective level, control tools, such as the assessment of the impact of the processing on the privacy of individuals and the assessment of legitimate interest (LIA).

Sensitive data – which the LGPD defines as data concerning racial or ethnic origin, religious conviction, political opinion, membership of a trade union or of a religious, philosophical or political organisation, health, sex life, and genetic or biometric data (Article 5, II) – receives special protection under the law.

Accordingly, this data may only be processed under the following legal bases:

  • consent;
  • compliance with a legal or regulatory obligation by the controller;
  • execution of public policies;
  • regular exercise of rights in judicial, administrative or arbitration proceedings;
  • guarantee of fraud prevention and the security of the data subject;
  • protection of the life or physical safety of the data subject or a third party;
  • conducting studies by a research body; and
  • protection of health by health professionals.

Personal data related to children and adolescents may also be processed, but the legislation requires that this be done under the open-ended criterion of the “best interest” of the child or adolescent. This means that processing this type of personal data must consider the special protection given to the intimacy and privacy of children and adolescents, without creating an insurmountable restriction. The legislation also imposes two conditions: the processing must comply with the principles and rules of the LGPD and, in the case of children, the consent of a parent or legal guardian is required.

In the context of the processing of personal data for research and development, especially of health and life protection products and services, the subject is more complex and must be evaluated from two angles: that of the processing carried out by the government and that of the processing resulting from applications and mechanisms provided by private companies.

The LGPD provides (Article 13) that this processing must occur exclusively within the public research body – which, in other words, means that hiring third parties for processing and sharing the data is not allowed – and under strict security and reliability protocols, including anonymisation or pseudonymisation.

Even in these cases, the processing must respect the principles and requirements of the LGPD, especially if it involves handling of sensitive data or data of high relevance to individuals.

The same level of care applies to the processing resulting from applications and mechanisms provided by private companies (MedTechs, HealthTechs, etc).

Even stricter criteria may also apply. For example, in the case of R&D, the processing of personal data must be based on a “strictly legitimate purpose”, ie, one that, in origin, has a legitimacy that cannot be questioned. In addition, the processing must rely on irreversible anonymisation, in which the data can no longer be linked to a specific person and which therefore places it outside the scope of the LGPD.

In terms of the commercial use of health data, Brazil follows the European approach: processing this data for commercial purposes or to obtain economic advantage is prohibited, except for the provision of health services.

In the context of personal data processing and AI, two premises are essential: “human in the loop” (sometimes expressed as “human on top”) and strict compliance with data protection legislation.

These principles give rise to some requirements. The first is that the use of personal data in AI systems, models, or resources must always consider risk-based regulation, which prohibits AI being used for behavioural manipulation or sensitive categorisation (eg, the use of biometric data). The second is the strict and rigorous assessment of high-risk AI, considered to be systems that can significantly impact fundamental rights, public safety or the health of the population, in line with the criteria of the European AI Act. The third requirement is that AI should always be subject to active human supervision, especially in the case of automated decisions.

In Brazil, the regulation of AI is still under discussion. The most recent document in this regard is Bill 2,338, approved by the Senate and currently under consideration in the Chamber of Deputies. According to the Bill, some high-risk categories should require much stricter restrictions, such as employee recruitment and selection, justice and law enforcement, and facial recognition and biometrics.

The Bill also considers some modalities of AI as excessive risk (Article 14), and as a result prohibits models whose objective is, for example, behavioural manipulation, real-time biometric identification, and emotion recognition.

The expression “data breach” is part of a larger category of events harmful to personal data called “security incident.” Resolution CD/ANPD 15/24 defines a security incident as “any confirmed adverse event, related to the violation of the properties of confidentiality, integrity, availability, and authenticity of personal data security” (Article 3, XII).

In general, a security incident can encompass different types of attacks on personal data, including the violation itself, improper disclosure, and data leakage, ranging in severity from the most minor to the most serious.

The Brazilian regulator has established a set of measures necessary to deal with a security incident, manage its effects and prevent its repetition over time, as outlined below.

  • Communication to the regulator: the controller must formally notify the ANPD of the incident within three business days of becoming aware of the occurrence.
  • Communication to the data subject: the controller must formally communicate the incident to the data subject within three business days of becoming aware that the incident has affected the data subject's data.
  • Information: the controller must include, in the communication, a description of the nature and category of personal data affected.
  • Recordkeeping: the controller must maintain a detailed record of the incident for five years.
  • Management: the DPO takes over the management of the incident.
  • Compliance: the controller and DPO establish how and to what extent regulatory compliance was impacted by the incident.

A security incident may trigger an inspection action by the regulator, with or without the application of penalties. There is always the possibility that an incident that compromises a large volume of data or exceeds the controller's capacity to contain it will be investigated by the ANPD or even by the Public Prosecutor's Office. In addition, an incident may also lead to claims for indemnification (reparation) against the controller, requiring it to compensate data subjects for losses and damages (LGPD, Article 42).

In Brazil, the regulatory authority for personal data and its protection has been well established since Law 13.853/19, which created the ANPD.

This authority is competent to, among other responsibilities (LGPD, Article 55-J):

  • co-ordinate with public regulatory authorities to exercise their powers in specific sectors of economic and governmental activities subject to regulation;
  • enter into an agreement with processing agents to eliminate irregularities, legal uncertainty or litigation in the context of administrative proceedings;
  • deliberate, in the administrative sphere, on a conclusive basis, on the interpretation of this Law;
  • provide for the forms of publicity of personal data processing operations;
  • issue regulations and procedures on personal data protection and privacy, as well as on personal data protection impact reports for cases in which the processing represents a high risk to the guarantee of the general principles of personal data protection provided for in this Law;
  • draft guidelines for the National Policy for the Protection of Personal Data and Privacy;
  • supervise and apply sanctions in the case of non-compliant data processing, through an administrative process that ensures the adversarial process, full defence and the right to appeal;
  • implement simplified mechanisms, including by electronic means, for the registration of complaints about the violation of this Law; and
  • promote co-operation with personal data protection authorities in other countries, of an international or transnational nature.

The ANPD's role, although predominantly guiding (at this stage of Brazilian legislation), has a strongly instructive, supervisory and sanctioning character.

This is possibly why its investigative process, especially regarding direct violations of personal data protection legislation, is particularly detailed. Conducted under the General Coordination of Inspection (CGF), this process focuses on evaluating, analysing and determining the level of compliance with the General Data Protection Law by processing agents, and may have a preventive or repressive nature.

The sequence is:

  • the investigation can start with reports from data subjects, notifications of security incidents, impact reports or sample inspections;
  • the ANPD notifies the processing agent to present information, evidence, and proof, which may include document collection and even local audits;
  • the ANPD verifies the consistency of the responses, and some measures, especially prevention, can be adopted at this stage;
  • opening of an Administrative Sanctioning Proceeding (PAS), if the irregularities continue or are sufficiently serious;
  • establishment of the PAS and notification of the investigated party;
  • production of evidence and analysis of facts by the technical area of the ANPD;
  • the processing agent can present their defence;
  • decision on the application of sanctions, with the possibility of appeal; and
  • deadline for the investigated party to appeal.

The issue of binding status still generates debate, but in general it is understood that the ANPD's decisions are binding, albeit in a strict sense. This means that they bind personal data processing agents (controllers and operators) and are definitive in the administrative sphere, after the applicable appeals. However, the “atypical” binding effect – that extending to other spheres of power or to agents who did not participate in the administrative process – remains debatable. This is the case, for example, for ANPD investigations shared with the Administrative Council for Economic Defense (CADE), whose findings could inform the internal proceedings of this Council on anti-competitive practices. Although such co-ordination exists, the binding effect, due to the lack of essential legal prerequisites, is not yet fully mandatory in these cases.

Investigative actions within the scope of the ANPD begin with the General Coordination of Inspection (CGF), and from there unfold within a very specific procedure.

A complaint, a news item or an event of public knowledge can start an investigation. At this stage, the ANPD is presented with a “cold fact”, raw material for analysis. The next stage is the preparatory procedure, which, even though it is not mandatory, allows the ANPD to collect preliminary elements to support a potential PAS. The next step is the collection of evidence by the Regulator's technical team, which searches different sources for the material necessary to “instruct” the investigation process. Subsequently, the party being investigated, who may be a personal data processing agent, is notified and has ten business days to respond.

According to the LGPD, different penalties can be applied to cases of violation of legal rules on personal data. The sanctions list (LGPD, Article 52) includes:

  • warning;
  • fines of up to 2% of the revenue of the private legal entity, group or conglomerate, limited, in total, to BRL50 million per infraction;
  • daily fines, subject to a total limit (see previous bullet);
  • publication of the infraction;
  • blocking of the personal data to which the infraction refers until it is regularised;
  • deletion of the personal data to which the infringement refers;
  • partial suspension of the operation of the database to which the infraction refers;
  • suspension of the exercise of the personal data processing activity to which the infringement refers; and
  • partial or total prohibition of the exercise of activities related to data processing.

As for the criteria for setting penalties, the legislation provides that the seriousness and nature of the infraction, the size of the offending agent, the advantage obtained or intended, recidivism, the co-operation of the agent, the mitigation measures adopted by them and their economic situation may be taken into account.

There have been many developments in recent months in terms of privacy, security in the processing of personal data and inspection, especially in the expansion of certain basic concepts.

Some notable events include the following.

  • The advancement of operations with crypto-assets has required that extra care be taken with personal data, especially data submitted to blockchain-based trading. These operations often involve personal data of the traders that crosses borders, which implies international sharing and storage in indeterminate locations.
  • The pressure on the processing of sensitive personal data has significantly increased. The LGPD's protection of this data does not seem to be sufficient to stop practices such as “biotyping”, passive biometrics, and the transfer of databases without the knowledge (and even without agreement) of the data subjects.
  • The judiciary has explicitly recognised, following a decision of the Superior Court of Justice (STJ), that there is presumed moral damage (without the need for explicit proof) if a processing agent shares personal data without the knowledge or agreement of the data subject.
  • The international transfer of personal data has gained more attention from the ANPD, following the publication of Resolution 19, which imposes stricter conditions on this transfer.

For organisations, there are many lessons:

  • policies alone are insufficient; organisations must demonstrate a complete, dynamic mapping of data flows, so that the data subject knows how and for what purpose their data is processed;
  • embed privacy considerations from the outset when developing new AI products, services, and algorithms;
  • implement quick and effective incident response plans;
  • remember that liability for personal data is joint and several; and
  • adopt rigorous anonymisation and pseudonymisation techniques when dealing with sensitive data and when training AI models, to minimise risk.

Privacy is now treated as a competitive differentiator and a driver of customer loyalty (privacy-led marketing), beyond just a compliance requirement.

Arguably, the volume of litigation and disputes involving privacy and personal data is growing year by year. These conflicts are no longer limited to financial issues; they relate to the improper, unauthorised, rights-violating or even illegal use of personal data for commercial purposes and, in some cases, illicit activities.

With the consolidation of the LGPD and the structuring of the ANPD, the trend towards greater judicialisation is being observed at different levels, including in the main Brazilian courts.

Litigation has originated from these main sources:

  • security incidents;
  • misuse of personal data;
  • inadequate processing, without purpose and without legal basis;
  • invasive biometrics;
  • sharing for financial purposes;
  • violation of minimum rights by some mass consumer companies;
  • improper commercial profiling; and
  • use of personal data to contract unsolicited services.

The parties to these disputes range from data subjects and consumers to civil society entities and the Public Prosecutor's Office. There are already signs that groups of data subjects, impacted by the misuse of their data, are organising to file complaints with the regulator, establishing a regulatory basis for relevant lawsuits. Employees, with the support of their unions, are taking action against practices such as the use of biometric data by companies to monitor working hours in remote work and the so-called “broad base”, a non-formal platform that gathers personal data of various types, including sensitive data, to establish behavioural patterns and guide hiring decisions based on employee profiles.

Litigation related to personal data and privacy has also strongly influenced the courts, which now recognise convictions based on presumed moral damage, or even moral damage for the repeated violation of personal data legislation. Although with some caution, jurisprudence already considers that, under certain circumstances, moral damage arises from the mere absence or ineffectiveness of organisations' internal mechanisms, which results in security incidents. Incidents that were previously considered “external fortuitous” (attributable to causes beyond the organisation's control) are beginning to be seen as an “internal failure”, a defect in practices and policies, under the responsibility of the personal data controller.

Assessing the effects of a data breach, for example, is still done cautiously, and on a case-by-case basis. Factors include whether the violated data is sensitive, whether the impacted person is a child or elderly, whether the event had public repercussions, or whether it amounted to nothing more than a mere inconvenience.

More recently, some courts, especially the one in São Paulo, have decided that “hacker” attacks on systems cannot always be considered an “external fortuitous” (cause beyond the control of the organisation). If the company does not demonstrate, with evidence, that it adopts and practices adequate security measures, the case is one of “internal fortuitousness”, in line with the principle of accountability (LGPD, Article 6, X).

Another important basis for decisions is to consider that the data controller is only exempt from being liable for a personal data breach if it proves one of the so-called “exclusions”, as provided for in Article 43 of the LGPD.

At the end of 2025, the Superior Court of Justice recognised the existence of presumed moral damage in a case involving the sharing of consumers' personal data – monthly income, address, and personal telephone numbers – with third parties, regardless of proof of actual harm, on the grounds that this type of conduct violates the legitimate expectations, privacy, and personality rights of the data subject.

A topic under discussion in Brazil is whether there is a basis for collective or diffuse (indistinct) compensation for damages resulting from violations of personal data protection legislation.

In general terms, collective redress in Brazil is a regulated and mature matter in the courts, serving to safeguard rights that are not strictly individual and to provide financial compensation (if applicable), based on the Federal Constitution and developments under the Consumer Protection Code and the Public Civil Action Law (Law 7.347/85).

What has been put to the test is whether, in the context of personal data as a fundamental right, in which the privacy and intimacy of individuals are at stake, collective or diffuse remedies are the most appropriate.

The General Data Protection Law does not refer to “diffuse damage”, but to “collective damage” (Article 42). The distinction matters. In cases of “diffuse damage”, the impact falls on an indistinct set of individuals, not necessarily belonging to a specific or determined collective. In the case of “collective damage”, a group or class of individuals is impacted, directly or indirectly, and thereby acquires the right to seek reparation. Therefore, if a security incident impacts not only certain individuals, but a collective of them, the legal framework for comprehensive reparation (“collective”) applies. This reparation covers both the patrimonial (material) and the moral (immaterial) aspects, and the controller or operator is fully responsible.

Collective damage can be claimed by the Public Prosecutor's Office, the Public Defender's Office, the Union, states, municipalities or legally constituted associations.

However, the reparability of the damage requires the fulfilment of certain conditions. First, the damage must have resulted from the activity of the processing agent with respect to the personal data under their control, that is, data that they themselves handle. Second, the responsibility of the processing agent must be evidenced, not assumed. In this regard, one of the most important discussions is whether this liability is objective or subjective, that is, whether it depends on proof of fault on the part of the agent (recent court decisions indicate that this liability is objective). Third, the harmful event must be assessed against the exclusions provided for in the LGPD (Article 43), which can relieve the processing agent of responsibility for the event and, consequently, of the obligation to provide compensation or reparation.

In Brazil, the regulation of the internet of things (IoT) has advanced significantly, but there are still no specific provisions on the protection and processing of non-personal data in this field.

However, some non-specific laws deal with this subject.

  • Industrial Property Law (Law 9.279/96): deals with the protection and processing of non-personal data in the context of business secrets and confidential technical elements.
  • Civil Rights Framework for the Internet (Law 12.965/14): establishes principles, guarantees, rights and duties for the use of the internet, including keeping records of connection and access to applications in general.
  • Sectoral regulations: agencies such as the telecommunications regulator (Anatel) and the Central Bank (Open Finance) have guidelines on the sharing of non-personal data in financial services.

In general terms, companies also use some non-personal data protection mechanisms within the scope of their respective businesses:

  • non-personal data processing agreements;
  • cloud and IoT contracts; and
  • non-personal data protocols.

However, one issue worries companies and regulators: the so-called “grey area”, or “inflection zone”. This arises when certain activities involve the coexistence of the processing of personal data with the processing of non-personal data. In such cases, due to the existence of “associated personal data”, the LGPD applies, at least in a dominant or preponderant manner.

Brazil does not yet have its own Data Act, such as the EU Data Act, which determines who can access and make use of data generated by connected products (the “provenance data”). The ANPD has intensified studies on anonymisation and may, in the coming months, regulate the use and sharing of IoT technical data.

Something that can help a lot in this regard is that Brazil and the European Union have now mutually recognised (in the case of the EU this happened very recently, in 2026) that their respective personal data protection systems offer an adequate level of protection for this data. This is a significant development, which may even facilitate the procedures for the international transfer of personal data between the two blocs.

The interaction between data protection legislation and regulation adopts, in general terms, the integrative principle. This means that the two subsystems (legislative and regulatory) are harmoniously co-ordinated to offer individuals a set of safeguards that work together. In this way, while the legislation (especially the LGPD) places the processing of personal data within specific guidelines (requirement for legal bases, observance of principles, greater care with sensitive data, etc), regulation operates at a higher level, seeking to order and discipline the behaviour of processing agents and ensure compliance with the legislation, including through sanctions.

If the LGPD establishes that the processing of personal data is only possible within the established legal hypotheses (the so-called “legal bases” in Articles 7 and 11) and that non-compliance may generate penalties and an obligation to repair damages, the regulation (through the ANPD and its rules) works to ensure that these legal requirements are met from a “macro” point of view. The government can use mechanisms (inspections, audits, etc) to ensure that the guidelines established by the LGPD are observed.

On the other hand, there is the relationship between this framework and non-personal data. In general terms, if there is personal data being processed, whether alone or in combination with non-personal data, or even with strictly technical data (which does not touch on aspects of the privacy and intimacy of individuals), the LGPD applies, as does the regulatory oversight exercised by the ANPD. This concept, known as “cross-regulation”, allows different regulators to act within the same process, each in its own scope of action, so enabling “extended protection” to occur.

In this sense, the different regulators interact to ensure, for example, “privacy by design” and “privacy by default”. This means, for example, that AI systems and capabilities must be created and configured to protect data automatically, that personal data used in model training must be protected by measures (such as anonymisation) to prevent the AI model from memorising and exposing sensitive data (model inversion), and that data protection impact assessments (DPIAs) are carried out on high-risk AI systems.
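The anonymisation measure mentioned above can be illustrated with a minimal sketch; the record fields and the identifier list below are hypothetical examples, not drawn from any statute or ANPD guidance. The idea is simply that direct identifiers are removed before a record is used for model training:

```python
# Illustrative sketch only: a "privacy by design" step that strips direct
# identifiers from records before they are used to train a model.
# The field names ("name", "cpf", "email", "phone") are hypothetical
# examples, not taken from any law or regulation.

DIRECT_IDENTIFIERS = {"name", "cpf", "email", "phone"}

def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record without direct identifiers."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

training_row = {
    "name": "Ana",
    "cpf": "123.456.789-00",
    "age_band": "30-39",
    "purchases": 7,
}
print(strip_identifiers(training_row))  # {'age_band': '30-39', 'purchases': 7}
```

Note that removing direct identifiers alone is usually pseudonymisation rather than full anonymisation; in practice, quasi-identifiers and re-identification risk would also have to be assessed.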

The idea of “regulatory interaction” (which Brazil has adopted, at least in part), although useful and operational, reflects a distinction also drawn by the EU Data Act: personal data, which relate directly to the privacy and intimacy of individuals, and non-personal data, where the focus is on competition, innovation, access and sharing. In both cases, organisations move away from the role of “owners” of data and assume that of “controllers”; this change has significant legal and operational impacts.

In the context of strictly personal data (typically personal, related to the privacy and intimacy of individuals), the focus is on its protection, since this has to do with fundamental rights, established by the Federal Constitution. “Protection” includes a whole set of legal and regulatory guidelines, such as:

  • access (LGPD, Article 9);
  • confirmation of existence (LGPD, Article 19);
  • portability (LGPD, Article 18, V); and
  • revocation of consent (LGPD, Article 18, IX).

In the field of non-personal data (not related to natural persons), the idea of “protection” is mainly related to the following.

  • Sharing: this should only occur under “FRAND (fair, reasonable and non-discriminatory)” conditions, which can help prevent holders of essential patents from refusing to license them or demanding abusive fees, impacting competition and technological innovation.
  • Right of access and use: users (companies or consumers) have the right to access data generated by connected products and services, most often in real time.
  • Interoperability: users have the right to switch between data processing service providers, including cloud providers (cloud switching).
  • Protection of secrets: manufacturers and developers can refuse to share data if there is a risk of exposing industrial secrets.
  • Termination with migration: users must be guaranteed data portability at the end of their contract, without facing abusive fees.

For both personal and non-personal data, organisations must take certain actions, including:

  • update storage, retention and deletion policies, to ensure that portability is possible at the end of the contractual relationship, with rollback and contingency plans;
  • implement standards for data portability between cloud providers with a reduction (or elimination) of switching fees;
  • implement safeguards that are at least reasonable against unlawful access to data by third-country governments;
  • include FRAND terms in non-personal data contracts;
  • map data generated by IoT devices and ensure that the interface allows direct user access to the data;
  • adapt their infrastructure accordingly;
  • review policies and agreements on personal data; and
  • review contracts and data agreements in general.

For personal data, the Brazilian authority is the ANPD, as established in the LGPD, Article 55-J, I.

A recent issue involves the ANPD՚s competence to “ensure the observance of commercial and industrial secrets” (LGPD, Article 55-J, II, and ANPD Ordinance 1/21, Article 16, XXI). Commercial and industrial secrets, in general, fall within the Industrial Property Law (Law 9.279/96), which provides penalties for acts of violation of these secrets, since secrecy is essential to safeguard the competitiveness and economic value of the information. In practice, what the LGPD establishes (or should be understood to establish) is that the protection of personal data must also account for the protection of commercial and industrial secrets related or linked to them. This means that the ANPD does not technically have “direct” authority over these secrets, but a derivative or ancillary role in ensuring their protection.

For non-personal data, other Brazilian regulatory entities can act, such as CADE for competition (or competition-related) issues, Anvisa, ANS and even, for financial services and products, the Central Bank.

Online tracking (and its derivatives, such as active tracking) and its technologies, including cookies, web beacons, fingerprinting, and SDKs (software development kits), are governed by the LGPD. In practice, these techniques make it possible both to single an individual out from others and to establish their location in real time. As a result, their regulation falls to the ANPD.

The fundamental requirement for compliance in these cases is that the collection of personal data is transparent and based, whenever possible, on the user՚s consent, allowing the user to manage their own settings.

For consent models, it is possible to apply the following approaches.

  • Opt-in: requires prior, free, informed and unambiguous consent for the use of cookies that are not strictly necessary, such as tracking cookies for personalised advertising or behaviour analysis.
  • Opt-out: for strictly necessary cookies (technical or functional), which guarantee the proper functioning of the page in navigation.
  • Third-party and tracking cookies: the ANPD does not regard pure opt-out (tracking first and only then offering the option to object) as a compliant approach.

In any scenario, certain “essential rules” must be applied, as follows.

  • Third-party cookies: being more invasive, they require greater rigour in obtaining consent, as they track the user on different pages and applications.
  • Children and adolescents: require extra safeguards, via measures proportional to the risks.
  • Preference management: a cookie banner must be adopted for the user to reject or accept specific categories of cookies.
  • Transparency: pages and applications must have a “Cookie Policy”, explaining what personal data is collected, the purpose and the duration of storage.
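The opt-in/opt-out split described above can be sketched as follows; the category names and their grouping into essential and non-essential sets are illustrative assumptions, not a classification prescribed by the LGPD or the ANPD:

```python
# Illustrative sketch of the consent model described above: strictly
# necessary cookies may run by default, while all other categories
# require prior opt-in. Category names are hypothetical examples.

ESSENTIAL = {"technical", "functional"}  # strictly necessary: may run without opt-in
NON_ESSENTIAL = {"advertising", "analytics", "third_party"}  # require prior opt-in

def allowed(category: str, user_opt_ins: set) -> bool:
    """Strictly necessary cookies run by default; all others need prior consent."""
    if category in ESSENTIAL:
        return True
    return category in user_opt_ins

# A user who has only consented to analytics cookies:
consents = {"analytics"}
print(allowed("technical", consents))    # True  (strictly necessary)
print(allowed("advertising", consents))  # False (no prior opt-in)
print(allowed("analytics", consents))    # True  (explicit opt-in)
```

The key design point is that the default answer for any non-essential category is refusal: consent must be recorded affirmatively before such a cookie runs.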

In Brazil, personalised and targeted advertising is basically regulated by the LGPD, the Consumer Protection Code and, to some extent, the Brazilian Advertising Self-Regulation Code (administered by CONAR).

This type of activity depends on some conditions, including:

  • free, informed, specific and unambiguous consent (if used), obtained through active confirmation (opt-in) rather than pre-checked boxes;
  • the ECA Digital (Law 15,211/25) has specific rules for digital platforms, prohibiting the profiling of children and adolescents for targeted advertising, including the requirement of age verification mechanisms and parental consent;
  • explicit purpose and well-established legal basis (LGPD, Articles 7 and 11);
  • non-discrimination and non-abusiveness (undue intrusion);
  • children՚s advertising is considered an abusive practice, according to the CDC (and CONAR՚s understanding), due to the vulnerability and inexperience of the target audience;
  • content restrictions for certain audiences (children and the elderly, for example, via sectoral agreements or good practices in advertising and content marketing); and
  • transparency for the data subject.

In the context of work and related activities, even outside the company՚s physical environment, compliance depends on the application of the LGPD and the Consolidation of Labour Laws (CLT). This is a typical case of combined regulation, in which two systems come together to protect the privacy and intimacy of workers. In a way, this harmonises the employer՚s directive authority (CLT, Article 2), the employee՚s self-determination and fundamental rights, including the protection and control of their personal data.

The monitoring of employees (considered work monitoring) is permitted under the legislation and by the courts, but without exaggeration, with reasonable prudence and with specific criteria, including:

  • background checks: only in exceptional situations, where the nature of the function or the level of trust requires it;
  • BYOD (bring your own device): requires clear policies separating professional from personal use;
  • control of working hours: monitoring must respect private life;
  • candidate data: collect only those strictly necessary for the vacancy;
  • specific and explicit purpose: monitoring must not be generic;
  • limited to the professional context: monitoring must not extend to personal or family matters;
  • work tools: monitoring can include professional tools; and
  • transparency.

In an M&A scenario, attention to personal data and to the privacy and intimacy of individuals must be rigorous and absolute. The risks, including the possibility of a security incident, increase as negotiations progress. In addition, commercial and industrial secrets, generally associated with individual developers and inventors whose data may be exposed, are at stake.

In this context, some requirements are non-negotiable, including the following.

  • Legal basis: avoid diffuse bases, including for sharing.
  • Purpose of the processing of personal data: must be precise, objective, explicit (non-generic) and transparent.
  • Mapping and stratification of risks in the target (company).

In due diligence, the following practices should be observed.

  • Cautious sharing: minimise the exchange of documents containing personal data.
  • Use of anonymised or aggregated information whenever possible.
  • Secure virtual data room (VDR): data transfers must take place in secure digital environments, with access control, encryption, and audit logs.
  • Confidentiality agreements (NDAs) must include data protection clauses, holding the parties responsible for their improper handling.

On the change of control and signing, the following applies.

  • R&W (representations and warranties): purchase and sale agreements (SPAs) must contain representations and warranties confirming that the target company complies with the LGPD.
  • Risks and indemnities: due diligence can reveal data liabilities, which leads to adjustments in the purchase price or specific indemnification clauses for regulatory sanctions.
  • Transfer of assets: in the transfer of databases, it must be verified that the original purpose of the collection of personal data allows the transfer to the new controller, and whether the operation constitutes an international transfer of personal data.

For transparency notices to data subjects, the following applies.

  • Data subjects (customers, employees and others) must be notified of the change of controller (especially if there is a change in the way their data will be processed).
  • Opt-out and consent (or authorisation): it may be necessary to notify the data subjects and allow them to exercise their rights, including opt-out if the future processing is incompatible with the original purpose.

For post-closing considerations, the following should be observed.

  • Deletion of unnecessary data: data not necessary for business continuity must be deleted or anonymised.
  • Systems audit (legacy): legacy systems of the acquired company may be vulnerable. Technical integration must ensure that the security of personal data is maintained (security by design).
  • Policy harmonisation: unify privacy policies of the buyer and the acquiree.

The topic of international transfer of personal data has been under discussion in Brazil since the enactment of the General Data Protection Law in 2018. Considered a form of “external sharing”, international transfer raises serious challenges, such as the risk of “infinite dispersion” of data and the difficulty of repatriating it when the processing ends or can no longer be carried out.

Recently, the ANPD issued Resolution 19, which specifically addresses this issue. According to the Resolution, transfer is the “processing operation through which a processing agent transmits, shares or makes available access to personal data to another processing agent”. If the agent receiving the data is outside Brazil, the transfer is international, as established by the LGPD (Article 5, XV), which considers it as a movement “of personal data to a foreign country or international organization of which the country is a member”.

In the LGPD, the international transfer of personal data is permitted (Article 33), but only:

  • to countries or international organisations that provide a level of protection of personal data adequate to that provided for in this Law;
  • where the national authority authorises the transfer;
  • where the transfer is necessary for international legal co-operation between public intelligence, investigative and prosecution bodies, in accordance with the instruments of international law;
  • where the transfer is necessary for the execution of public policy or legal attribution of the public service, with prior public disclosure in accordance with Article 23, I of this Law;
  • where the transfer is necessary for the protection of the life or physical safety of the data subject or a third party;
  • where the transfer results in a commitment assumed in an international co-operation agreement;
  • where necessary to comply with items II, V and VI of Article 7 of the LGPD;
  • where the controller provides guarantees of compliance with the principles, the rights of the data subject and the data protection regime provided for in the LGPD, in the form of:
    1. specific contractual clauses for a given transfer;
    2. standard contractual clauses;
    3. global corporate standards;
    4. regularly issued seals, certificates and codes of conduct; and
  • where the data subject has provided their specific and informed consent to the transfer.

In the above-mentioned Resolution 19, the ANPD also established which standard clauses must be adopted by controllers if the international transfer of personal data can use this mechanism.

Fortunately, the EU and Brazil now mutually recognise that their personal data protection systems (GDPR and LGPD) are equivalent, which facilitates international transfers based on “essential equivalence” (LGPD, Article 33, I).

For the international transfer of personal data, some protocols and conditions must be met, especially after ANPD Resolution 19.

An international transfer of personal data can only occur under one of the “specific regulatory conditions” established by the LGPD (Article 33). In practice, each of these “regulatory conditions” is an “additional legal basis” (in addition to those provided for in Articles 7 and 11 of the LGPD) for the transfer, a kind of double layer of protection. This allows the data subject to be sure that their data will only be shared with another nation under the legal guarantee of “formal and material adequacy”, the so-called principle of double command.

In international transfers of personal data, some guarantees are required, such as:

  • although regulatory approval is not required for each transfer, the use of at least one of the mechanisms provided for in Article 33 of the LGPD is mandatory;
  • express purpose and well-established legal basis;
  • data mapping, security assessment, and implementation of safeguards;
  • it may also be necessary to base the transfer on binding corporate rules (BCRs) if the transfer takes place within the same economic group, provided that they are approved by the competent data protection authority; and
  • a transfer impact assessment (TIA) may be required ‒ the national controller must assess whether the legislation of the destination country compromises data protections.

For international transfers of non-personal data (any data not directly related to the privacy of individuals), certain safeguards are also required, including:

  • an express and transparent purpose;
  • certain US export-control regimes (such as ITAR and EAR) require approval to transfer technical data, software, or technology to “unauthorised persons” or embargoed countries;
  • formal request to government agencies;
  • compliance with bilateral or multilateral treaties;
  • identification of a compliance officer;
  • knowing precisely what data is going out and who will receive it;
  • contracts with standard clauses or evidence of encryption; and
  • legal assessment, ie, making sure that the transfer does not violate security or privacy laws.

Neither the LGPD nor the Brazilian data regulator clearly imposes a restriction on the location of personal data, which means that, by definition, the data does not need to be stored exclusively in Brazil, or even physically located in Brazil. In practice, the data must be “locatable”, but not necessarily “localised”, that is, processing agents must know where the data can be found, but this does not necessarily imply that it must be stored in a specific country or confined to a specific storage location.

However, some sectors, such as finance, have more explicit rules on the location of personal data. The Central Bank, for example, as a regulator, requires that financial institutions that use cloud services hosted abroad must ensure full and immediate access to data by the Brazilian regulator. If the cloud is in a country without a co-operation agreement with the Central Bank, the contracting requires prior authorisation from this regulator (CMN Resolution 4,893/21, Articles 11 and 12). The health sector is another that sets limitations on the international transfer of data related to clinical research, the development of medical resources and pharmacology. In particular, raw, sensitive and restricted data must be kept locally or transferred under confidentiality rules.

Remote access is another hot topic. As a rule, remote access is considered an international transfer of data, especially personal data. Under the LGPD, access (which is a form of processing, Article 5, X) does not require data and those who access it to be in the same country. It follows that access from Brazil to personal data in another country does constitute an international transfer and is therefore subject to the requirements of the LGPD and Resolution No 19 of the ANPD.

Blocking statutes are basically legal norms adopted by a country or economic bloc to prevent or nullify the effects of the legislation of other countries that try to impose sanctions or trade restrictions, which can protect local companies or economic sectors from foreign penalties or sanctions.

These statutes interact with the mechanisms for the recognition of foreign court decisions and with data protection laws (especially those governing personal data), greatly restricting foreign discovery, compliance with foreign sanctions, and the cross-border transfer of personal and non-personal data. In practice, they are a safeguard based on the principle of extraterritoriality, whose main mission is to protect national sovereignty, commercial and industrial secrets and the privacy of citizens.

Some countries, such as Canada, China, and France, have passed laws prohibiting citizens and companies from providing documents or information to foreign court proceedings unless under international treaties such as the Hague Convention on the Taking of Evidence Abroad. In Brazil, the Federal Supreme Court continuously reaffirms that foreign laws and acts do not have automatic legal effects in Brazil, since they depend on a local homologation process or, at least, on international co-operative acts.

The blocking rules, even if they were not created with personal data in mind, act as a “formal reinforcement” of the legislation for the protection of this data. A good example of this is the LGPD, which prohibits the international transfer of personal data to countries that do not guarantee an adequate level of protection similar to that of Brazil. In this sense, the discovery (the production of evidence) that involves or depends on personal data, without valid authorisation, violates the LGPD.

In terms of the international transfer of personal data, the most recent development of note is that Brazil and the European Union have recognised that their frameworks for the protection and safeguarding of personal data are “materially equivalent” and ensure an adequate level of protection (European Commission adequacy decision of 26 January 2026 and ANPD Resolution 32).

Another development refers to the EU-US Data Privacy Framework (DPF), whose validity was upheld in September 2025 by the General Court of the European Union, which reinforces legal certainty for EU-US transfers.

Other developments are also underway, for example:

  • the transfer of non-personal data to countries with more advanced research;
  • “relativised” international transfer where the data is “dual-localised”, stored in the country of origin and in a destination country; and
  • transfer for an essential period.

In addition, changes are expected in the near future:

  • intensification of regulatory oversight of the ANPD;
  • more requirements for personal data mapping and privacy impact reports;
  • stricter obligations for personal data policies;
  • ANPD must publish a list of countries considered safe for data transfers (adequacy decisions);
  • the DPF may face new legal challenges from activists, despite having been upheld;
  • focus on sensitive data and AI; and
  • emergence of new concepts such as “reserved data” and “data in process”.
Lopes Pinto, Nagasse Advogados

Rua Helena, 235, 4th floor
Vila Olímpia
São Paulo
Brazil
04552-050

+55 11 2665 9200

+55 11 98311 0108

contato@lopespinto.com.br

www.lopespinto.com.br/

Trends and Developments


Author



Bialer Falsetti Associados (BFA) is a legal and public policy consulting firm established in São Paulo in 2009 as a boutique firm to address the needs of companies in the technology, media and telecommunications industries. The firm also has an office in Brasília to ensure proximity to regulators, policy makers and quick turnaround for clients’ needs. Its lawyers have experience at cutting-edge law firms, legal departments of large companies and government institutions, aiming to provide high quality services in an efficient and personalised way. BFA handles public policy mandates and advises on the regulatory aspects of the technology sector.

Introduction

As our lives become more digital, privacy and data protection have increasingly become everyday topics and concerns. In the past decade, the concept of informational self-determination has ceased to be restricted to the world of privacy professionals and has become a topic of dinner-table conversation among friends. These conversations touch on basic human rights as well as political decisions by global leaders.

This coming year is set to represent a turning point for privacy and data protection in Brazil. As of 2026, the Brazilian Data Protection Authority, an entity which previously operated under administrative and financial constraints, has been transformed into a national regulatory body with technical, administrative and financial autonomy. The Agency will more than double the number of employees dedicated to carrying out its legal mandate. As a result, the responsive approach that has characterised the Agency’s activities is likely to evolve gradually into a more robust one, with numerous enforcement activities and potentially heavier sanctions being applied.

Protection of Children

In addition to the transformation of its legal nature, with the increase in both financial and human resources, the Brazilian Data Protection Agency (ANPD) enters 2026 with a new mandate: it has become the organisation within the Federal Administration responsible for enforcing the Brazilian Minors՚ Online Safety Law, known as the ECA Digital, that was enacted in September 2025.

The centrality of this topic for the Agency՚s activities is such that the Regulatory Agenda for the 2025–2026 biennium was updated at the end of the year, expanding the agenda to include aspects of the implementation of the ECA Digital, namely:

  • age verification mechanisms;
  • the scope and general obligations of the ECA Digital applicable to suppliers of information technology products or services; and
  • oversight and sanctioning of the ECA Digital, including the revision of existing resolutions around proceedings and enforcement procedures.

The regulatory initiative includes the preparation of a guidance document to clarify the main concepts related to the scope of application of the ECA Digital, as well as the duties applicable to information technology products or services. Additionally, the introduction of interpretative guidelines is important for ensuring a proper understanding of the scope of application of the ECA Digital and providing greater legal certainty for regulated entities in the process of implementing the law. An upcoming guideline will also establish clarifications on the duties of prevention, protection, information, and security, which are some of the most challenging concepts to be defined within the new framework. Establishing boundaries to ensure that organisations are clear on what needs to be done will be critical for the purposes of regulatory compliance.

While there is an international trend towards online safety laws with specific provisions for minors, with age verification and assurance mechanisms increasingly being considered and calls for more parental supervision tools, Brazil seems to have taken the lead with a more structured and comprehensive set of obligations. Time will tell if all these duties can be implemented without substantially disrupting the online experience for both minors and adults.

Priorities for Regulation and Enforcement in Brazil

A map of priorities for 2026-2027 was published by ANPD at the end of the year. This is an important indication as to which areas the ANPD will focus its attention and enforcement initiatives, and therefore where additional caution should be taken by processing agents. Among others, the following key topics were identified.

  • Rights of personal data subjects, especially regarding the processing of biometric data: the ANPD has emphasised technical security measures and the underlying legal basis for sensitive categories (eg, health and financial data). This has been one of the most challenging topics since the approval of the Brazilian Data Protection Law (LGPD) in 2018, as it raises numerous questions from processing agents on the correct approach to implementation. Decisions usually involve substantial investments in processes, systems and personnel training. Furthermore, specific aspects of the Brazilian market may require regulations that differ from best practices adopted in other jurisdictions. This year, we can expect enforcement activities related to data subjects’ rights in different areas, as well as some related to the secondary use of personal data for the delivery of targeted commercial advertising, especially through profiling techniques.
  • Protection of children and adolescents in the digital environment to verify how processors will comply with the requirements of the ECA Digital: in 2026, this will involve inspection activities aimed at verifying the legality of the processing of personal data of these subjects and proposing safeguards to controllers within the scope of inspection activities (such as age verification mechanisms). In 2027, the enforcement activities are expected to expand to include considerations around configuration, by design and by default, of the most protective model available in relation to privacy and personal data protection, including parental supervision tools. A number of now-established industry practices may have to be reviewed to comply with the new standards. Enforcement activities will also look at the adoption of measures to prevent children and adolescents from accessing inappropriate, unsuitable, or legally prohibited content, including age verification mechanisms.
  • Processing of personal data by the government: the ANPD will focus on promoting and disseminating greater compliance with the LGPD, especially regarding the sharing of personal data, the adoption of technical safeguards in the management and governance of processed data, and the use of biometric data. Considering that governments are often the largest processors of personal data, clear parameters are central to ensuring compliance with data protection laws and constitutional principles around fundamental rights, in a much broader sense than the privacy discussion per se. While there is an international trend towards optimising the sharing and use of data to generate value and better public policies, the concerns that such sharing, especially of personal data, raises from a privacy perspective cannot be overstated.
  • AI and emerging technologies in the context of personal data processing has also been identified as a priority for the ANPD. In addition to the scope of automated decision-making parameters, in the context of training and use of AI systems, the following will also be reviewed: (i) rights of data subjects; (ii) principles of the LGPD; (iii) legal hypotheses; and (iv) good practices and governance.

AI

AI has increasingly caught the attention of data protection regulators due to its intersection with personal data, even though so much of AI does not involve any personal data. While the ANPD has already established specific proceedings to address the use of personal data in AI training, its legal authority in the AI field is currently limited to the revision of automated decisions or the use of personal data, as that is specifically provided for in the LGPD.

However, there are advanced legislative initiatives to expand the ANPD’s legal authority, including making it the central co-ordinating body for AI regulation in Brazil. International experience shows that this is not a one-size-fits-all discussion: different institutional arrangements are possible, none of which is necessarily the best or the right one. The challenge is to ensure that the regulation of AI and of data protection do not merge into a single approach, as that could hinder the development of AI and, consequently, the social and economic benefits it can bring.

Data, Competition and Digital Markets

The central role that data plays in the organisation of the economy cannot be overstated and has led to an overlap of important discussions around the role of data, the structure of markets and the power that companies operating in this area have.

On the regulatory front, discussions around the need for a Digital Markets Act and a Digital Services Act will continue to be part of the conversation around preparing digital public policies aimed at addressing a new momentum and organisation of the economy and society. Competition authorities will continue to move closer to the data universe, prompting reflections on the role of data from a more structural perspective, rather than from an individual rights approach. Legislative initiatives are less likely to advance in 2026 considering the presidential election cycle and the World Cup, which are likely to divert attention from digital policies, at least for a few weeks.

Cross-Border Data Flow

Finally, cross-border data flow is one of the most central issues around the world and is always a hot topic, especially given the many geopolitical situations the world currently faces. Data needs to flow freely to ensure international trade, to enable the provision of products and services, and for society to develop in a more integrated manner. Brazil starts 2026 with a mutual adequacy decision with the EU; this is long overdue and is expected to bring legal certainty to a number of businesses. While the LGPD establishes data transfer mechanisms similar to those of the GDPR, a number of them were still pending regulation, and the market was anxiously awaiting an EU adequacy decision to resolve some of the legal uncertainty. With the decision from the EU, Brazil can now work on expediting adequacy decisions with other jurisdictions, as well as on expediting equivalence decisions for substantially similar contractual clauses.

Final Remarks

While regulation typically lags a few steps behind technological development, the coming months are expected to be particularly active in advancing the regulation of the digital environment, especially on matters affecting the processing of personal data. A more robust and detailed regulatory framework for the protection of minors online will be a priority, closely followed by advancements around the intersection between AI and personal data. The establishment of a regulatory framework for AI will be a long-term project. The goal will be to ensure that regulators are able to incorporate innovation into their oversight, while individual rights are ensured and protected.

Bialer Falsetti Associados

Av. Santo Amaro, 3330 - cj 12
Brooklin, São Paulo - SP, 04556-300
Brazil

+55 11 5102-2297

linkedin.com/company/bialer-falsetti

Trends and Developments

Author



Bialer Falsetti Associados (BFA) is a legal and public policy consulting firm established in São Paulo in 2009 as a boutique firm to address the needs of companies in the technology, media and telecommunications industries. The firm also has an office in Brasília to ensure proximity to regulators, policy makers and quick turnaround for clients’ needs. Its lawyers have experience at cutting-edge law firms, legal departments of large companies and government institutions, aiming to provide high quality services in an efficient and personalised way. BFA handles public policy mandates and advises on the regulatory aspects of the technology sector.
