Data Protection & Privacy 2025

Last Updated March 11, 2025

Brazil

Law and Practice

Authors



Lopes Pinto, Nagasse Advogados is based in São Paulo. The firm provides expertise across many areas, including corporate and business law, tax and planning, data protection (LGPD, GDPR and PIPL), contracts, regulation, digital assets, blockchain, transportation, logistics, labour, infrastructure, agribusiness, banking and finance, bioscience, civil law, corporate governance, compliance, tech law, and legal risks. The team of highly skilled professionals possesses in-depth experience of national and multinational companies and law firms, and the modus operandi of organisations and businesses. Lopes Pinto, Nagasse Advogados prides itself on being a highly ethical firm, focused on achieving results and providing excellent service to its clients. Since 2006, it has been recognised as one of the most highly regarded law firms by Época, a Brazilian news and analysis magazine.

In Brazil, the protection of privacy is fundamentally established in the Federal Constitution of 1988 (Article 5, X), which declares it inviolable and therefore protected against any act of invasion or misuse. The Civil Code (Article 21) likewise provides that the private life of individuals is inviolable. Both rules are in tune with the basic principle of the inviolability of privacy, and both extend their effects to the three recognised dimensions of privacy: the private, the intimate and the secret. In addition, directly or indirectly, other rules also play a strongly complementary role in the protection of privacy, such as the Statute of the Child and Adolescent (Law 8,069/90), the Consumer Protection Code (Law 8,078/90) and the Civil Rights Framework for the Internet (Law 12,965/14).

Resulting from the protection of privacy is the protection of personal data – those references or attributes that identify or can identify a living natural person. In this context, the General Data Protection Law, known as the LGPD (Law 13,709/18), was enacted; its main purpose is to give effect to the protection of personal data, a component of individual assets included in the Constitution as a fundamental right of each citizen.

From a legislative point of view, the Brazilian LGPD is mirrored in the General Data Protection Regulation, or GDPR, approved in the European Union in 2016. The concepts, premises and principles of the GDPR have, in general, been incorporated into the LGPD, with only minor differences in terminology. An example is the expression “personal data”, which both laws adopt, whereas some other regimes (such as the CCPA in the United States) prefer “personal information” – even though “information” denotes an organised composition of “data”, while “data” better describes a minimum referential unit.

Brazilian legislation differs from US legislation. While the Brazilian legislation applies to the entire national territory and to all activities, in the US certain activities have their own regulation (HIPAA, for the health sector), and individual states have their own laws as well, as in California with the CCPA.

The regulation model in Brazil is called “universal”, which means that a single regulation applies to all activities, all segments and all agents. The regulator model reflects this choice and is called “concentrated”: a single agent has the competence to regulate the protection of personal data and, to a certain extent, the privacy of individuals.

From time to time it is said that the Brazilian Judiciary also has a regulatory role, as it can act to halt conduct that violates privacy and data protection. But this is debatable, insofar as the Judiciary essentially depends on being provoked in order to act, while the traditional regulator can act on its own initiative. Under the General Data Protection Law, the Brazilian regulator is the National Data Protection Authority (ANPD), an agent that recently gained the status of a federal agency.

In general terms, the ANPD is responsible for ensuring, implementing and supervising compliance with the LGPD and other rules on personal data protection, and thereby protecting the fundamental rights of freedom and privacy and the free development of the personality of the natural person. The role of this regulator is broad and goes beyond inspecting and applying sanctions for violations of the LGPD. In addition, it also has normative, interpretative and deliberative functions, with an extensive instrumental arsenal to ensure legal certainty to relationships involving the processing of personal data and to the various agents.

The protection of privacy and the protection of personal data are strongly associated, so the regulator’s action on issues related to personal data generally has effects on privacy, and vice versa. This means that regulatory action regarding a breach of personal data almost always benefits individuals’ privacy as well.

As a rule, the ANPD, the Brazilian regulator, can act on its own initiative, if and when any event – announced, disclosed or discovered – puts the privacy of individuals at risk or represents a violation of the terms of the protection legislation. But it can also act on the external initiative, of a person whose data was improperly accessed, for example.

The person interested in provoking the ANPD’s action can file a “petition”, that is, a request in which they inform that they were unable to exercise their rights as a data subject before the controller. But presenting this “petition” requires that the data subject has first addressed the controller, in a formal and proven manner.

Another possibility is to submit a “complaint” to the ANPD, that is, a communication of an alleged violation of Brazilian personal data protection legislation. But this “complaint” has peculiarities, such as that it does not relate, in general, to a specific situation of a certain holder of personal data.

Once the regulator begins to act, an administrative proceeding can be opened, so that the facts and circumstances related to the reported event can be ascertained, guaranteeing all those involved a full defence and adversarial proceedings. At the end, the regulator decides either to archive the case or, depending on the situation, to apply a penalty, which can range from a warning to a fine of 2% of the controller’s revenue, limited to BRL50 million per infraction.

In the last two years, the Brazilian regulator has acted more intensively, and many administrative proceedings have been concluded, most of them with stricter recommendations and even penalties (the so-called sanctioning processes).

Some of these cases draw more attention:

  • the Ministry of Health, opened in 2022 to investigate non-compliance with the ANPD’s request, absence of a personal data officer (DPO) and failure to report a security incident; and
  • Santa Catarina State Department of Health, opened in 2022 for failure to communicate security incidents to data subjects, absence of security measures and non-compliance with ANPD determinations.

But the regulator has also turned its efforts to the private sector, and more than twenty inspection proceedings have been opened against aviation companies, telecom giants, gas distributors and even managers of ride-hailing apps.

The Brazilian regulator has faced challenges regarding the regulation of AI, especially due to the exponential visibility and relevance of the topic, its technological tools, and its implications for people’s privacy and data protection.

Brazil does not yet have a specific normative and regulatory system for AI. It was only in December 2024 that the Senate approved a Bill regulating AI, and this Bill is now being examined in the Chamber of Deputies. The Bill brings important concepts, especially by placing the human being at the centre of decisions.

Among other rules, the text of the Bill considers as high risk any AI system that can cause significant harm to individuals or groups, which includes:

  • selection of students, recruitment of workers, and concessions of public services;
  • management of migratory processes;
  • evaluation of calls for essential services, and operation of autonomous vehicles; and
  • biometric identification systems.

For the Project, high-risk systems will be subject to strict governance, permanent monitoring and the requirement of bias mitigation measures. Organisations that use these technologies will need to conduct security testing and adopt practices that ensure transparency and fairness.

In the field of copyright, companies that use protected content to train AI tools will have to remunerate the owners of the works, subject to principles of proportionality and reasonableness. The Bill also addresses the use of data, with rules on moderation of use, centralised registration and benefits (direct and indirect) to rights-holders, as the case may be.

Some topics, such as social media algorithms and online content moderation, were not included in the current Bill and will be dealt with separately. But some systems were banned, including autonomous weapons, citizen ranking tools for access to public goods and services, and risk assessments for criminal behaviour.

One of the main concerns about AI regulation – including by the Brazilian personal data regulator – is to ensure that the use of AI considers and respects the privacy of individuals and their fundamental rights, especially regarding the processing of sensitive personal data.

For 2025, the big challenge is to ensure that AI systems, including legal and operational systems based on it, are aligned with the General Data Protection Law, ensuring that personal data, isolated or in an information model, is handled safely and responsibly.

The regulation of AI is a very important step, but with it some issues need to be properly addressed. Subjects such as cognitive biometrics have been gaining more and more space, and great advances in wearable technology allow consumer devices, including brain-computer interfaces, fitness wearables, and extended reality headsets, to process and transmit data about human mental states and conditions – cognitive, emotional, affective and conative. All of this raises legitimate concerns about the misuse of sensitive data and the risks associated with exposing deeply intimate elements. As a result, organisations can exploit this data for commercial purposes and governments can use it in mass surveillance practices, which would violate fundamental rights and guarantees and compromise privacy.

With AI and the protection of related personal data, new concepts emerge, such as the so-called “intimate data”, or “psychorights”, directly related to the most sensitive and vulnerable parts of human identity, with direct effects on things such as productivity and cognitive enhancement, usually aimed at economic interests.

And not only that: regulators around the world are concerned with the “targeted processing” of personal data by AI systems – that is, digital interference with personal identity, privacy, cognitive freedom and mental integrity, based on collected data, for the purpose of “programming” or “reprogramming” conduct and behaviour, in some cases for political or commercial purposes, affecting the privacy of individuals.

Privacy litigation was not, in Brazil’s recent past, a consistent trend. But with the Civil Rights Framework for the Internet and the General Data Protection Law, or LGPD, the number of these disputes has increased.

Some themes were more recurrent in these litigations:

  • consumer groups and digital rights organisations filed class actions against companies for data leaks and misuse of personal information, seeking redress and changes in treatment practices;
  • the increased use of facial recognition technologies and biometric data collection has given rise to lawsuits questioning the legality and ethics of these practices;
  • the issue of the transfer of personal data outside Brazil, especially to countries without an adequate level of protection, has generated litigation around compliance with the LGPD; and
  • legal disputes over the collection of personal data by messaging apps and social networks, including issues about consent, authorisation, and transparency, were also noted.

The protection of the personal data of minors has become a relevant topic, with litigation over the responsibility of platforms in the protection of this data and the privacy of children and adolescents.

International developments in privacy protection, especially regarding personal data and its use, have had a strong impact in Brazil. Some examples of this are:

  • advancement of legislation, based on the LGPD, inspired by the European GDPR;
  • incorporation of international standards, providing greater harmonisation and facilitating international trade, especially in sectors that deal with personal data;
  • increasing public awareness of privacy rights and the importance of personal data protection;
  • greater accountability of companies and organisations in general;
  • implementation challenges, especially regarding the lack of infrastructure and technical knowledge;
  • increased level of technical co-operation for the secure exchange of personal data, which is key in digital trade and working in global networks; and
  • intensification of the protection of sensitive personal data.

In Brazil, litigation involving privacy and personal data protection is increasing in number and complexity. For example, in 2021 there were 274 court decisions that considered the General Data Protection Law, but between 2022 and 2023 this number rose from 665 to 1,206 (the data are from Brazilian entities, supported by the United Nations Development Programme (UNDP)).

The areas in which the most litigation involving personal data and the right to privacy has occurred are consumer law, civil law and labour law. In the field of consumer law, more and more people have resorted to Article 20 of the LGPD, which allows consumers (of a product or service) to be informed of the criteria of this type of decision, taken solely based on computerised processing of personal data that affects the interests of the holders, such as decisions regarding personal profile, professional, consumer, credit or personality aspects.

Subjects such as digital geolocation evidence in labour lawsuits, identification of people via biometrics and access to data stored by employers have dominated labour law issues. In this area there are also disputes challenging automated decisions related to transport apps, including over the nature of the underlying relationship (whether an employment relationship or a merely commercial one, in the civil sphere).

Other topics have dominated privacy litigation, such as the sharing of personal data, its transfer to foreign countries, the use of personal data in the financial segment for the prevention of illicit acts and for credit analysis, and the use of personal data in pharmacological research.

The financial sector is one of the most representative in privacy lawsuits, accounting for about 26% of LGPD-related lawsuits, which shows the complexity of processing personal data in this sector, especially regarding the use of personal information for credit assessment and fraud prevention.

The idea of protecting fundamental rights through collective instruments is not exactly new, and some court decisions prior to the LGPD show this. The Brazilian Constitution establishes that fundamental rights, such as the protection of personal data, can be protected collectively, which favours access to justice for a greater number of individuals, faster and more economically than the filing of thousands of separate lawsuits based on the same security incident.

Article 22 of the General Data Protection Law provides that: “The defense of the interests and rights of data subjects may be exercised in court, individually or collectively, in accordance with the provisions of the relevant legislation, regarding the instruments of individual and collective protection.” This basically means that the LGPD has well absorbed the concept of “class protection”, a type of protection that goes beyond merely individual interests, since it covers needs based on a “general and abstract interest”.

On the other hand, other modalities of privacy protection and responsible and consequent use of personal data have already been admitted in Brazil. One of these modalities is the Conduct Adjustment Agreement, or TAC, in which the government, taking upon itself the defence of meta-individual rights and prerogatives, signs, with the organisation accused of violating privacy, a kind of formal commitment, with specific clauses and conditions, which can, if not fulfilled, be taken to the Judiciary as an enforceable title. An example of this was the TAC that the Public Prosecutor’s Office of the Federal District and Territories formalised with a digital financial institution responsible for the data leak of almost 20,000 account holders. In this case, the banking institution paid BRL1.5 million in moral damages, intended for public agencies that fight cybercrimes.

In Brazil, although not at the pace of other countries, especially those in the EU, there are certain initiatives aimed at regulating IoT. Some Bills are in progress, such as the following.

  • 3,949/20 – national Internet of Things policy, addressing aspects such as security, privacy, and protection of personal data. Objective: create a regulatory environment that stimulates innovation and the development of IoT solutions.
  • 1,126/21 – regulation of connected devices, defining responsibilities for manufacturers and service providers. Objective: increase the security and protection of personal data collected through IoT devices.
  • 2,494/21 – creation of guidelines for the implementation of IoT solutions in areas such as health, public safety, and agriculture. Objective: to promote the responsible and safe use of technology to improve efficiency in strategic sectors.
  • 2,885/21 – security of personal data in IoT devices, with rules for the protection and management of private information. Objective: to combat the misuse of data and ensure the privacy of data subjects and users.
  • 3,183/21 – interoperability of IoT devices, aiming to ensure that different systems can communicate effectively. Objective: to facilitate the integration of technologies and promote a more cohesive IoT ecosystem.

Unfortunately, there is still no legal framework for IoT in Brazil. Even with Decree 9,854/19, the issue was not resolved. This is because this Decree is not exactly a technical regulation of IoT, but only a set of premises aimed at implementing and developing IoT in the national territory.

In the current context, and while IoT regulation is not approved, the General Data Protection Law has been used to safeguard personal data circulating in IoT environments and systems, so that the privacy of data subjects is minimally protected. Subjects such as collection, sharing, use, and processing of data in IoT systems are considered according to the rules of the LGPD.

But even so, with the new IoT trends, specific regulation is highly necessary to deal with topics such as:

  • AI, which will make IoT-connected devices more efficient and user-friendly;
  • “edge computing”, which can speed up the exchange of information within the network and give devices autonomy, relieving the load on servers;
  • LEO-based satellite connectivity, which can expand the possibilities of IoT, especially in agriculture and logistics, thanks in part to the wide coverage and low latency provided by near-Earth satellites; and
  • convergence of LPWAN technology, which promises to bring more multi-connectivity, providing end-to-end (E2E) connections, which are very important for sectors such as mobility.

Personal data is part of the assets of individuals, and the Brazilian Federal Constitution established that the protection of this data is a fundamental right, which means a right placed in a category above other rights. As a result, there is no longer a “free zone” for the processing and use of personal data, especially with the emergence of the General Data Protection Law.

From this regulation, processing personal data becomes an expressly disciplined activity, and this includes not only the processing itself, but also the need for a purpose, the obligation to use a “legal basis” for each processing, respect for privacy as a premise and limitation regarding the processing of special category data, which includes sensitive data (LGPD, Article 11).

Basically, the regulation on personal data, provided for in the LGPD, is based on protective principles, among which are adequacy, necessity, quality, transparency, security, and non-discrimination (LGPD, Article 6). But that is not all. The legislation requires that all data processing respects the rights of the data subject (Articles 17 and 18, LGPD), which includes access to data, correction, updating, deletion and anonymisation.

In addition, outstanding actions, such as the need for a centralised record of the processing of personal data, maintained by the controller (Article 37, LGPD), the minimisation (essentiality and necessity) of data to be processed, and the controller’s liability for damages resulting from the misapplication of the LGPD or its non-compliance, complement the data protection framework in Brazil.

Even with the National Internet of Things Plan, instituted by Decree 9,854/19, and Law 14,108/20, which creates incentives for IoT systems and reduces to zero the rate of certain taxes and contributions for machine-to-machine communications, there are Bills under discussion on the subject.

Some concerns surround the topic of IoT. Protecting users’ privacy is one of them. In IoT, devices are always collecting and sharing personal data, in some way and to some extent, which raises questions about their misuse. Another is the security of connected devices. With the interconnection of things, different risks arise, such as unauthorised access to systems and the possibility of cybercrimes.

Civil liability is also a concern. With IoT, it is possible for a connected device to cause harm to other people, due to security failures, malfunctions, or decisions made autonomously. After all, this is a dilemma that is difficult to solve. Devices connected via IoT can, at some point in the chain, interact inappropriately, and this cannot always be attributed to a construction or design flaw. This is because there will always be room for a certain “autonomy of the non-conscious will”, which makes the responsibility for events enter a grey area.

That is also why, for IoT companies, the integrity of user data should be the highest priority, supported by a data protection strategy that is fully compliant with legislation, whatever it may be. As devices continuously record and process personal data, it is necessary to meticulously address the data protection obligations of each business, including how user data is stored, processed, and handled. As such, when producing IoT devices, privacy by design is essential, which can include purpose limitation, data minimisation, accuracy, storage limitation, integrity, and confidentiality.

In general, Brazil has a regulator for issues involving personal data, the National Data Protection Authority, or ANPD, established by the LGPD (Article 55-A), created by Law 13,853/19 and whose structure was established by Decree 10,474/20. Under the legislation, the ANPD “aims to guarantee the fundamental right to the protection of personal data, the fundamental rights of freedom and privacy, and the free development of the personality of the natural person” (Decree 10,474, Article 1).

Unlike other nations, the regulatory regime for the protection of personal data in Brazil follows the “one theme, one regulator” model, which means that, in this matter, the ANPD has a “superposition” role, prevailing over any other public agents whenever the topic concerns personal data.

In short, what are called “cookies” are small text files that record information about the user’s activity for a period of time. Cookies can store browsing history, logins and passwords, and support services such as search engines’ suggestion features. Because of this ability to store information, cookies can also operate in web-based text editors, spreadsheets and presentations, and even offline.

There are first-party and third-party cookies. First-party cookies are generated by the visited domain itself; from the point of view of the page visited, they are the “digital traces”, or “footprints”, that the user leaves when looking for a product, searching for information or even making a compliment or complaint. When the system generates a cookie, it carries an identifier that records the information in the company’s database, as well as in the user’s browser. Third-party cookies come from a source external to the domain – that is, companies that are third parties to the visited page and that also set their own cookies to record visitors’ information.

In general, cookies capture user information, and can even capture personal data, hence the controversy around the subject. The captured personal data may not only be put to a use with which the data subject does not agree, but can also be stored indefinitely, creating a “non-expiring” database.

In the EU, under guidance associated with the GDPR and the ePrivacy rules, cookie lifetimes are commonly limited to around 12 months, and with them the personal references they capture or use; the LGPD has no similar rule. What can be done is to apply the principle of necessity (LGPD, Article 6, III), according to which data can only be stored for the period necessary to fulfil its stated purpose. Thus, if a cookie carries personal data that no longer needs to (or can no longer lawfully) be used, it becomes legally invalid.
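As an illustration of the necessity principle applied to cookie lifetimes, the sketch below binds each cookie’s expiry to its declared purpose. It is a minimal sketch in Python using only the standard library; the purposes and retention periods are invented assumptions, not figures taken from the LGPD or from ANPD guidance.

```python
from http.cookies import SimpleCookie

# Hypothetical retention periods per cookie purpose, in seconds —
# illustrative assumptions only, not values prescribed by any law.
RETENTION = {
    "session": 0,                      # expires when the browser closes
    "preferences": 60 * 60 * 24 * 30,  # 30 days
    "analytics": 60 * 60 * 24 * 365,   # ~12 months, mirroring EU practice
}

def build_cookie(name: str, value: str, purpose: str) -> str:
    """Build a Set-Cookie header whose lifetime is bound to its purpose."""
    cookie = SimpleCookie()
    cookie[name] = value
    max_age = RETENTION[purpose]
    if max_age:                        # persistent cookies get a hard expiry
        cookie[name]["max-age"] = max_age
    cookie[name]["secure"] = True      # basic security hygiene
    cookie[name]["httponly"] = True
    return cookie[name].OutputString()

header = build_cookie("lang", "pt-BR", "preferences")
```

A cookie built this way never outlives the purpose it was declared for, which is the practical effect of applying Article 6, III to cookie storage.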

To regulate cookie issues, it is essential to have a policy, and it is important to include some conditions, such as the types of cookies that the controller can use, what their functionalities are, how long they are stored or kept, how they are used, how they can be avoided and what the consequences are of a “personal block” of cookies.

In addition, it is necessary to make clear to the user that some cookies (session cookies, for example) do not collect personal data directly, but only store information in the form of an identifier. There are also cookies that, even unintentionally, record all the text fields filled in during browsing, which makes the data collection very varied, or even malicious, depending on the intention of whoever produced the cookie.

Personalised advertising is not exactly a new concept. It has been known since the 1970s as a marketing strategy that uses data about consumer behaviour and preferences to produce targeted and relevant ads, or even to speculate about potential user preferences for one product or service or another.

Legally, personalised advertising is delimited by some rules, especially the Consumer Protection Code (CDC), the General Data Protection Law (LGPD) and, less directly, by Law 12,232/10. The main legal element that guides personalised advertising is legal responsibility, which involves not only respecting people’s rights and prerogatives, including their privacy, but also considering the protection of personal data and the principles of transparency and adequate information.

The legal context of personalised advertising generally considers the following aspects.

  • Data collection.
  • Audience segmentation.
  • Algorithms and machine learning.
  • User experience.
  • Privacy challenges.
  • Transparency and consent (if applicable).
  • Results and metrics.
  • Essentiality and necessity.

Labour relations in Brazil have long been part of concerns about the protection of privacy and security of personal data, not least because without processing personal data, employer–employee interaction would be impossible.

This processing is present during all phases of the employment relationship:

  • in the pre-contractual phase, with the obtaining of identification data, curriculum vitae and references of the candidate for the job vacancy;
  • in the execution phase, via data collection for registration, payment of salaries, union membership and health; and
  • in the post-employment relationship phase, with the storage of data of former employees for labour and social security purposes and making them available to inspection agencies.

In this sense, the regulation of the processing of personal data in the employment relationship faces some (major) challenges.

  • Processing of sensitive personal data, such as those related to biometrics, sexual orientation, pathologies and union membership – how to do this to give this data additional protection coverage, in order to store it in a watertight manner and with strictly limited access.
  • Image processing, so that an essential need (such as giving access to the company’s premises) does not transform ordinary data into sensitive data, based on the inadequate or pernicious handling of these images.
  • Sharing personal data with public bodies in a way that only occurs under essential circumstances and, even then, using secure and approved platforms.
  • Facial and body recognition.
  • Inclusion of data on race and ethnicity in employee records (Law 14,553/23).
  • Processing of data of children and adolescents.

Transactions with assets, whatever they may be, raise very serious questions in terms of privacy protection and the processing of personal data. In general, such transactions require handling personal data, without which the operations cannot take place. But this handling raises several considerations.

  • Digital assets (cryptocurrencies, tokens, etc) often operate on decentralised networks, and their transactions are recorded on blockchains, which offers a degree of transparency with strong privacy implications.
  • Most blockchains are public, which means that transactions can be traced. Even where access occurs via pseudonyms, data analysis can reveal the identity of users, compromising their privacy.
  • Certain cryptocurrencies have been specifically designed to offer greater privacy, using techniques such as obfuscation of addresses and transactions, but this does not mean an absolute guarantee that the personal data involved will not be discovered and used irregularly.
  • Regulators are concerned about privacy in asset transactions, precisely because of the difficulty in balancing transparency to combat money laundering, cybercrime, and tax evasion with the protection of users’ privacy and their personal data.
  • Privacy in asset transactions involves the topic of consent, but, from the perspective of Brazil’s General Data Protection Law, consent is a very fluid legal basis that allows its withdrawal at any time by the user, which would leave the transaction without specific legal support.
  • Smart contracts, which automate blockchain transactions, can be programmed to include privacy and data protection clauses, but effective implementation is highly complex.
  • Technologies such as zk-SNARKs (which change the way data is shared) have been explored to enable verifiable transactions without revealing personal data or sensitive information.
  • Privacy concerns can affect the adoption of digital assets, as users in this environment are hesitant to engage with systems that do not ensure adequate protection for their personal data.
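The traceability concern above – that pseudonymous addresses on a public ledger can be linked into a behavioural profile by simple analysis – can be sketched with a toy example. Everything here (the ledger entries, the address and merchant names) is invented for illustration; no real blockchain API is involved.

```python
from collections import defaultdict

# Toy public ledger: every entry is visible to anyone, as on most
# public blockchains. Addresses are pseudonyms, not real identities.
ledger = [
    {"from": "addr_9f3", "to": "merchant_A", "amount": 120},
    {"from": "addr_9f3", "to": "pharmacy_B", "amount": 45},
    {"from": "addr_2c7", "to": "merchant_A", "amount": 10},
]

def profile(ledger):
    """Group transactions by pseudonymous sender — the kind of analysis
    that can turn a 'pseudonym' into an identifiable behavioural profile."""
    by_sender = defaultdict(list)
    for tx in ledger:
        by_sender[tx["from"]].append((tx["to"], tx["amount"]))
    return dict(by_sender)

p = profile(ledger)
# addr_9f3 is now linked to a pharmacy purchase — if that address is ever
# tied to a person (an exchange KYC record, a public donation), the whole
# history becomes personal data for the purposes of the LGPD.
```

The point of the sketch is that pseudonymity alone is not anonymisation: aggregation across publicly visible records is enough to build a profile.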

Basically, processing personal data in the asset trading environment considers the same principles and requirements applicable in any other data processing, which must include at least:

  • an explicit purpose and at least one legal basis authorised by law;
  • minimisation of personal data (less is more);
  • transparency;
  • personal data protection rules by design;
  • a prior privacy impact assessment;
  • legal compliance;
  • security incident management;
  • establishing controller–operator rules;
  • sharing agreements with third parties; and
  • adoption of centralised registry protocols.

Recently, the National Data Protection Authority, the Brazilian regulator, approved and published a specific rule on the international transfer of personal data, especially from Brazil to other countries. This rule (Resolution CD/ANPD 19/2, or International Data Transfer Regulation) establishes conditions under which the international transfer of personal data is possible and considered legitimate.

The transfer, according to the ANPD rule, must characterise the international sending or receiving of data (therefore involving different countries) (Article 5, ANPD), must refer to processing subject to national legislation on the protection of personal data, and must be supported by a legal hypothesis and a valid international transfer mechanism (Article 4, ANPD).

In addition, as established in the rule (Article 9, ANPD), “The international transfer of data may only be carried out to meet legitimate, specific, explicit and informed purposes to the holder, without the possibility of further processing in a manner incompatible with these purposes...”, and even so, “it shall be limited to the minimum necessary to achieve its purposes, covering the pertinent data, proportionate and not excessive in relation to the purposes of the data processing.”

On the other hand, every international transfer of data will always depend on a transfer mechanism considered valid (Article 9, II, ANPD):

  • an adequacy decision issued by the ANPD;
  • standard contractual clauses, binding corporate rules (BCRs), or specific contractual clauses; or
  • in the cases of LGPD, Article 33, II, “d”, and III to IX.

In Brazil, international transfers of personal data are not necessarily subject to government notification or approval. However, there are regulatory conditions, provided for both in the General Data Protection Law and in Resolution CD/ANPD 19/2, or International Data Transfer Regulation.

These conditions, in practice, subject the exporter of data abroad to a series of procedural requirements, such as the choice of a valid method of transfer, among those referred to in Resolution CD/ANPD 19/2 (Article 9, II). But, in addition, it is up to the controller – who is the one who, ultimately, makes decisions related to the processing of personal data – to take some precautions, such as limiting the data to be transferred, obtaining authorisation (and, in certain cases, consent) from the data subject, and controlling the data transferred and the international recipient of such data.

For practical purposes, localisation of personal data means both keeping personal data within the borders of the jurisdiction in which it was collected and complying with the security and privacy requirements of the place where it is, or is to be, processed. There is a distinction to consider between data localisation and data residency, although the two expressions are sometimes used interchangeably. Data residency refers to where data is, or is to be, stored; data localisation, on the other hand, is the act of meeting the requirements of the place of residency.

Data localisation can be of three types:

  • the first, called “constriction”, is that in which data cannot be transferred across borders unless a copy is stored in the original jurisdiction;
  • the second, known as “mitigated”, requires, in addition to the local copy, that the data processing also be done locally; and
  • the third, “restrictive”, prohibits the sending of data abroad.

Many jurisdictions do not require strict compliance with the localisation of personal data, while others have requirements that oblige organisations to localise their data as accurately and reasonably as possible. But even where a jurisdiction does not require localisation, heavily regulated sectors, such as finance and healthcare, can adopt good practices, such as controllers establishing a “safe circle” within which data may be processed, including localisation, to additionally avoid regulatory and civil questions and litigation.

Locating personal data is clearly easier when the case involves a single controller based in a single country, and even more so when the infrastructure used is local, such as perfectly identified and established servers. It is more complicated to locate data in the case of cloud or multi-cloud computing, since the servers are accessed via the internet and may therefore be located anywhere on the globe. In this case, organisations that rely on cloud computing clearly have less visibility into where their data is handled and stored, as it is up to the cloud vendor to deal with such issues.

But it is good to remember that it is the controller – not the cloud provider – who must present information requested by the data subject, which includes knowing where their data is being processed or maintained. Therefore, it is always good and recommended that the controller obtain a “periodic statement” from the cloud provider about the location of the data stored in it.

Blocking statutes, also known as “block rules”, in the context of privacy and personal data, can refer to two things. The first is rules and practices that allow the restriction of access to, and use of, personal data in certain situations and under certain conditions, in order to ensure the protection of individuals’ privacy; the second is a law in one jurisdiction intended to prevent the application, in that jurisdiction, of a law of another jurisdiction.

In general, blocking statutes involve the following aspects.

  • Conditions – request from the data subject, in cases of contestation or doubt about the accuracy of the data.
  • Inadequacy or excessiveness – for data that is no longer in accordance with the purposes for which it was collected.
  • Compliance – to meet legal or regulatory requirements.
  • Rights of data subjects – including the two rights below.
  • Right to block – to request that their data be blocked in certain situations.
  • Right to information – data subjects must be informed about the status of their data and the reasons for the blocking.
  • Impact on availability – blocked data cannot be used for processing, which can affect services and products.
  • Data review – blocking can be temporary (while the legality of the processing or the accuracy of the data is investigated).
  • Procedure for blocking – data subjects must follow specific procedures to request blocking, usually through the channels of the companies that process the data.
  • Security – entities implementing blocking statutes must ensure that data is stored securely and that access to it is restricted.
  • Monitoring and auditing – controllers must regularly monitor and audit their blocking practices to ensure compliance and the protection of personal data.

Within the scope of the LGPD, blocks can be applied in two ways, in short:

  • at the request of the data subject, as a right inherent to that condition (Article 18, IV, LGPD); and
  • as a penalty imposed on the controller (Article 52, V, LGPD).

In any case, there are situations in which blocking may not be applicable, such as data necessary for compliance with legal obligations or for the defence of rights in legal proceedings (LGPD, Article 7, II and VI).

In addition to the recent rule approved by the ANPD (Resolution CD/ANPD 19/2, or International Data Transfer Regulation), some concepts and guidelines have gained ground in Brazil regarding the international transfer of personal data.

One of these developments is the clarification of what “adequate level of protection” means: for the ANPD, this means equivalence in the level of protection of personal data; that is, the regulator will seek a protection structure similar, or reasonably comparable, although not necessarily identical, to the one existing in Brazilian territory.

Another welcome novelty is that the regulator has adopted the practice of reciprocity, which means that, in terms of international data transfer, it will prioritise countries that offer reciprocal treatment to Brazil. Finally, another favourable point is the growing acceptance of the concept of “responsible international transfer”; although there are minimum guidelines for this practice, it will always be up to the controller to make a conscious decision on the matter, remaining liable, of course, for any excesses and non-compliance that are observed.

Lopes Pinto, Nagasse Advogados

Rua Helena, 235, 4th floor
Vila Olímpia
São Paulo – SP
04552-050
Brazil

+55 11 2665 9200

+55 11 98311 0108

contato@lopespinto.com.br

www.lopespinto.com.br/
