Recent Developments in South Korean TMT Regulations: Data Privacy, AI and Platform Regulations
Overview
Recent developments in South Korea’s technology, media and telecommunications (TMT) sector reveal three significant trends: increased regulatory oversight of foreign businesses, establishment of a comprehensive legal framework for AI, and refinement of the regulatory regime governing online platforms.
Firstly, there has been continued strengthening of regulatory oversight over foreign businesses operating in South Korea. In 2024, the Personal Information Protection Commission (PIPC) imposed substantial administrative penalties on several foreign entities, including Meta, AliExpress, and the Worldcoin Foundation. Similarly, the Korea Communications Commission (KCC) has demonstrated its commitment to robust enforcement against foreign IT companies, both by pursuing amendments to the Telecommunications Business Act to increase potential fines for in-app payment violations by platform operators such as Google and Apple, and by taking administrative actions against entities such as Twitch and Telegram.
Secondly, South Korea has made substantial progress in establishing a comprehensive legal framework for AI. Following the European Union’s adoption of the AI Act in early 2024, South Korea became the second jurisdiction worldwide to enact comprehensive AI legislation with the passage of the Framework Act on Artificial Intelligence Development and Establishment of a Foundation for Trustworthiness (the “AI Framework Act”) by the National Assembly on 26 December 2024. Further, various government agencies, including the PIPC and the Telecommunications Technology Association (TTA), an organisation affiliated with the Ministry of Science and ICT (MSIT), have issued AI-related guidelines within their respective domains.
Thirdly, South Korea is refining its regulatory framework for online platforms. The MSIT is pursuing a dual approach of self-regulation coupled with targeted regulation through amendments to the Telecommunications Business Act, while the Korea Fair Trade Commission (KFTC) is considering introducing a regulatory framework that would establish a post-facto presumption mechanism for certain violations by dominant platform operators, rather than pre-designating such operators.
This article examines these three significant developments in detail and provides guidance to help foreign businesses affected by these regulatory changes ensure compliance with South Korea’s evolving TMT regulatory framework.
Strengthening of regulatory oversight on foreign businesses
Personal information protection
South Korea has significantly enhanced its data protection enforcement regime, particularly concerning foreign businesses. This shift was driven by the second major amendment to the Personal Information Protection Act (PIPA) in 2023, which introduced two crucial changes: a transition from criminal penalties to economic sanctions, and a refined method for calculating administrative penalties, shifting the basis from revenue attributable to violations to total revenue, subject to any exclusions substantiated by the data controller. These changes have enabled the PIPC to impose more targeted and significant penalties throughout 2024. This enforcement trend is expected to continue in 2025.
Notable enforcement actions against foreign businesses in 2024 include the PIPC’s substantial administrative penalties imposed on Meta, AliExpress and the Worldcoin Foundation, as noted above.
In particular, the PIPC’s press release for the Meta enforcement action sent a clear message to foreign business operators that “foreign businesses operating global services must comply with Korean PIPA requirements”, emphasising the “non-discriminatory application of PIPA to global companies serving Korean users”. This marked a decisive shift towards more rigorous enforcement of data protection regulations against foreign entities.
Another point of particular concern is the regulator’s proactive approach to emerging technologies, especially AI. From November 2023 to March 2024, the PIPC conducted preliminary compliance inspections of businesses developing or deploying large language models (LLMs) or LLM-based services. This initiative, which included major international players such as OpenAI, Google, Microsoft and Meta, led to comprehensive improvement recommendations being issued on 27 March 2024. The PIPC has explicitly committed to ongoing monitoring of “AI technological and industrial changes, including AI model advancement and proliferation of open-source models”, signalling sustained regulatory attention in this rapidly evolving sector.
The IT sector
In 2024, South Korea’s IT regulatory authorities significantly strengthened their oversight of global technology companies, with both the MSIT and the KCC taking decisive enforcement actions and pursuing legislative changes to enhance their regulatory framework.
The KCC’s enforcement activities in 2024 demonstrated its commitment to robust oversight of foreign IT companies. On 23 February 2024, the regulator imposed administrative penalties and fines of approximately KRW450 million (approximately USD310,000) on Twitch for multiple violations, including unjustified restrictions on streaming quality, unilateral discontinuation of VOD services, and failure to implement adequate technical measures against illegal content distribution. In another significant action, on 7 November 2024, the KCC formally mandated Telegram to designate a youth protection officer and implement comprehensive measures for monitoring and managing content that could be harmful to young people, marking an enhanced focus on content moderation obligations for foreign messaging platforms.
The regulatory scope extended beyond content and service issues to encompass system reliability. Following the 2022 nationwide disruption of Korea’s largest messenger service due to a data centre fire, the Framework Act on Broadcasting Communications Development was amended to impose enhanced disaster management obligations on major online service providers. In the first enforcement action under these new provisions, the MSIT imposed an administrative fine of KRW5 million (approximately USD3,400) on Meta in November 2024 for failing to report an Instagram service outage that occurred in March of that year.
Looking ahead, the Korean regulators continue to strengthen their oversight mechanisms. The KCC is pursuing amendments to the Telecommunications Business Act that would double the maximum administrative penalty for forced in-app payment practices from 3% to 6% of relevant revenue. This proposed enhancement, particularly targeting practices by companies like Google and Apple, reflects South Korea’s determination to maintain regulatory effectiveness comparable to other major jurisdictions, including the European Union.
Developments related to artificial intelligence regulations
Establishment of the AI Framework Act
As major economies worldwide continue to develop their AI regulatory frameworks, South Korea emerged as a frontrunner by enacting comprehensive AI legislation in 2024. Following the EU’s adoption of the AI Act in early 2024, South Korea became the second jurisdiction globally to establish a comprehensive AI regulatory framework with the passage of the AI Framework Act by the National Assembly on 26 December 2024. The AI Framework Act is scheduled to take effect one year after its promulgation.
The legislative process reflected a careful balance between promoting innovation and ensuring adequate safeguards. While the government and industry stakeholders advocated for an innovation-focused approach with minimal regulation, civil society groups emphasised the need for robust safety measures and the protection of public interests. The resulting framework primarily focuses on promoting South Korea’s AI industry while incorporating specific regulatory measures, particularly for high-impact AI systems that parallel the EU AI Act’s “high-risk AI” concept.
The AI Framework Act defines several key concepts as follows:
Artificial intelligence (AI) system
An AI-based system that produces inferences, recommendations, decisions, or other outputs affecting real or virtual environments for given objectives, with varying levels of autonomy and adaptability.
High-impact AI
AI systems that may significantly impact or pose risks to human life, physical safety, or fundamental rights, operating in sectors specified by the AI Framework Act, including:
Generative AI
AI systems that produce various outputs, including text, sound, images, and videos, based on the structure and characteristics of the input data.
AI business
Entities engaged in business related to the AI industry, including “AI development businesses”, which develop and provide AI systems, and “AI user businesses”, which offer products or services utilising AI systems provided by AI development businesses.
Furthermore, the AI Framework Act establishes several key obligations for ensuring AI safety and trustworthiness as follows:
Transparency requirements
AI businesses providing AI-based products or services must:
Safety requirements
For AI systems exceeding computational thresholds specified by the Enforcement Decree, AI businesses must implement, and submit to the MSIT the results of:
High-impact AI verification
AI businesses must pre-screen whether their AI systems or related products and services qualify as high-impact AI, and may request confirmation of high-impact AI status from the MSIT if necessary.
Other requirements for AI businesses regarding high-impact AI
AI businesses must implement the following measures to ensure the safety and trustworthiness of high-impact AI. The MSIT may establish and announce specific details and recommend compliance with these measures:
High-impact AI impact assessment
AI businesses providing products or services using high-impact AI should endeavour to assess the potential impact on people’s fundamental rights in advance.
Domestic representative
AI businesses without a domestic address or place of business that meet certain thresholds for user numbers, revenue, etc, as specified by the Enforcement Decree, must designate in writing a domestic representative with an address or place of business located in South Korea and report this to the MSIT. The domestic representative will handle:
Any violations by the domestic representative will be deemed actions of the foreign AI business operator.
In addition, the MSIT is authorised to investigate potential violations upon discovery, report or complaint, and may order the cessation of violations or corrective measures when violations are confirmed.
Notably, the AI Framework Act explicitly provides for extraterritorial application, extending its scope to activities conducted outside South Korea that affect the domestic market or users. Foreign businesses providing AI services to the Korean market, even from abroad, should therefore carefully review and ensure compliance with the Act’s requirements.
AI-related regulatory trends
Beyond the AI Framework Act, South Korea has been developing a comprehensive regulatory ecosystem for AI through various laws and guidelines. The amended PIPA, for instance, now explicitly addresses AI-related privacy concerns, providing data subjects with specific rights regarding automated decision-making, including the right to object and request explanations when automated systems make decisions that significantly affect individual rights in areas such as credit lending and employment. Similarly, the Credit Information Use and Protection Act grants data subjects the right to request an explanation and object to automated evaluation results from credit assessment entities.
Various government agencies have issued AI-related guidelines throughout 2023 and 2024. The MSIT, through its affiliated organisations, such as the National Information Society Agency (NIA) and TTA, has published the “Guide for Developing Trustworthy Artificial Intelligence” (March 2024). The PIPC has been particularly active, issuing several significant documents, including the “Guidelines on Rights of Data Subjects Regarding Automated Decisions” (September 2024), the “Guidelines for Processing Public Personal Information for AI Development and Services” (July 2024), the “Synthetic Data Generation Reference Model” (May 2024), and the “AI Privacy Risk Management Model” (December 2024).
In the financial sector, the Financial Services Commission (FSC) has established a comprehensive regulatory framework through its series of guidelines addressing AI development, implementation and security. Additionally, sector-specific guidance has emerged from other regulatory bodies, with the Ministry of Food and Drug Safety issuing guidelines regarding AI medical devices, and the Ministry of Culture, Sports and Tourism providing guidance on generative AI copyright issues.
Furthermore, the KCC is currently developing two key initiatives: the “AI User Protection Act” and guidelines on generative AI services.
Developments related to online platform regulations
Legislative discussions regarding online platform regulations continue to evolve in South Korea, with three regulatory authorities pursuing distinct approaches. The MSIT is advancing a self-regulation-focused policy for online platforms classified as “value-added telecommunications service providers” while introducing targeted regulations to enhance regulatory clarity. Meanwhile, the KFTC is developing platform-specific competition regulations similar to the EU’s Digital Markets Act (DMA), and the KCC is pursuing user protection measures akin to the EU’s Digital Services Act (DSA).
MSIT’s dual approach: self-regulation and targeted regulation
The MSIT has primarily promoted self-regulation in the online platform market, emphasising public-private co-operation while addressing ecosystem issues without impeding innovation. On 28 June 2024, the MSIT proposed amendments to the Telecommunications Business Act to establish legal frameworks for self-regulation by value-added telecommunications service providers and their associations, with government support measures. This bill is currently under parliamentary review.
In parallel with this self-regulatory approach, the MSIT is also pursuing “targeted regulation” through additional amendments to the Telecommunications Business Act. The ministry is currently determining specific regulatory areas, subjects, and prohibited conduct types that would be subject to such targeted oversight.
KFTC and KCC platform regulation initiatives
The KFTC initially announced plans on 19 December 2023 to enact the “Platform Fair Competition Promotion Act” to effectively regulate abuse of market dominance. The original proposal involved pre-designating certain businesses as “dominant platform operators” and monitoring four main types of violations: self-preferencing, tying, multi-homing restrictions, and most-favoured-nation requirements.
However, in September 2024, after reviewing input from industry stakeholders, experts and relevant agencies, the KFTC modified its approach. Rather than adopting a pre-designation system through special legislation similar to the EU DMA, the KFTC decided to pursue amendments to the Monopoly Regulation and Fair Trade Act. This revised approach would establish a post-facto presumption mechanism for certain violations by dominant platform operators, enabling enhanced sanctions based on retrospective assessment rather than pre-designation.
Separately, the KCC announced in May 2023 its plans to pursue the “Platform User Protection Act” (tentative name) to protect platform users from illegal content and privacy infringements. This initiative parallels the EU’s DSA and represents another significant development in Korea’s evolving platform regulation landscape that warrants continued monitoring by stakeholders.
Considerations for foreign businesses
The recent developments in South Korea’s TMT regulation framework have significant implications for foreign businesses. Understanding and implementing appropriate compliance measures is particularly crucial, as these regulations explicitly emphasise their application to foreign entities affecting the Korean market.
In the data privacy realm, foreign businesses should pay special attention to the PIPC’s strengthened enforcement stance, as evidenced by the substantial penalties imposed on global companies in 2024. Companies should establish robust compliance mechanisms for personal data processing.
For AI service providers, the AI Framework Act’s explicit extraterritorial application requires careful attention. Foreign companies meeting certain thresholds must designate a domestic representative in South Korea and ensure compliance with various obligations, including transparency requirements for AI-generated content and specific measures for high-impact AI systems. Particularly noteworthy is the requirement to assess whether their AI services qualify as high-impact AI, and implement appropriate safety and trustworthiness measures. Companies should also closely monitor the development of pending legislation, including the KCC’s “AI User Protection Act” and its guidelines on generative AI services.
Regarding platform regulations, foreign businesses should prepare for the evolving regulatory landscape shaped by multiple authorities. This includes continuous monitoring of the MSIT’s targeted regulation approach, preparing for the KFTC’s enhanced platform competition oversight, and ensuring compliance with the KCC’s user protection requirements. Close attention should be paid to the legislative progress of various proposed regulations, including the “Platform Fair Competition Promotion Act” and the “Platform User Protection Act”.