Indonesia’s New Child Digital Protection Law Sparks Debate Over Clarity and Enforcement

In a significant move to safeguard children in the digital realm, Indonesia has introduced a new government regulation on child digital protection, signed by President Prabowo Subianto on March 28, 2025. The policy, which sets a minimum age limit for social media usage and requires electronic system providers (ESPs) to shield users younger than 17 from harmful content, has been hailed as a step forward. However, digital rights advocates and child protection experts have raised concerns over its lack of clarity, potential loopholes, and the absence of robust enforcement mechanisms. As the regulation awaits full implementation within the next two years, questions linger over whether it can balance child safety with data privacy and be applied effectively in practice.

A Bold Step for Child Safety Online

The new regulation targets a pressing issue in Indonesia, where millions of children access social media and digital platforms daily, often encountering content deemed harmful, such as pornography, violence, or material that could cause psychological harm or addiction. Under the policy, both public and private ESPs must ensure their products, services, and features are safe for users younger than 17. This includes conducting self-assessments to evaluate the risk their platforms pose to underage users—categorizing them as high or low risk—and submitting results to the Communications and Digital Ministry for verification.

Global tech giants operating in Indonesia, such as Google and ByteDance (the parent company of TikTok), have expressed support for the regulation and signaled readiness to comply. Instagram, owned by Meta, has also rolled out its Teen Accounts feature globally for users aged 13-17, incorporating protective tools and parental monitoring options. The feature is set to launch in Indonesia this month, aligning with the country’s push for safer digital spaces for minors.

While the intent behind the regulation is widely welcomed, its specifics have drawn scrutiny. Critics argue that the policy’s broad strokes leave too much room for interpretation, potentially undermining its effectiveness. The requirement for ESPs to self-assess risks, for instance, has sparked debate over accountability and transparency.

Concerns Over Self-Assessment and Data Privacy

Digital rights group Southeast Asia Freedom of Expression Network (SAFEnet) has cautioned that allowing ESPs to evaluate their own platforms could lead to underreporting of risks. Nenden Sekar Arum, executive director of SAFEnet, emphasized the need for independent oversight, stating, “Ideally, an independent party should be involved, even though the ministry will carry out a verification afterward.”

Further complicating matters are provisions related to age verification, which require ESPs to collect sensitive personal data from children. While the regulation mandates that such data be deleted immediately after verification—unless retention is permitted by law—Nenden warned that many platforms often store user credentials for identification purposes. This raises significant concerns about data protection gaps, especially in a country where privacy laws are still evolving.

The overlap between the new regulation and existing laws, such as the Electronic Information and Transactions (ITE) Law and the Personal Data Protection (PDP) Law, adds another layer of complexity. Wahyudi Djafar, executive director of the Institute for Community Studies and Advocacy (ELSAM), described the policy as “rigid” due to its lack of detailed technical guidance. He pointed out inconsistencies, such as the inclusion of “professional or work-related information” as part of children’s personal data under the regulation—a categorization that conflicts with the PDP Law and seems irrelevant given that minors under 17 are prohibited from working under Indonesia’s 2002 Child Protection Law.

Digital profiling is another contentious issue. While the regulation prohibits profiling for personalized services or market development targeting children, it includes a clause allowing such practices with user consent. Wahyudi noted that this loophole could undermine efforts to protect children from targeted advertising, urging that it be addressed in forthcoming technical regulations. “This should prevent advertising that targets children, but the consent loophole creates inconsistency,” he said.

Implementation Challenges and Enforcement Gaps

The regulation will not be fully implemented until the Communications and Digital Ministry issues a decree, expected within two years of the signing. This transition period is intended to give platforms time to align their operations with the new requirements. However, the delay also fuels uncertainty about how effectively the policy will be enforced once active.

Diyah Puspitarini of the Indonesian Child Protection Commission (KPAI) welcomed the regulation as a significant advancement but highlighted the absence of clear provisions on law enforcement’s role. “It would be very beneficial if this regulation clearly outlined the role of law enforcement institutions, such as the police’s cybercrime division,” she said. Without such clarity, holding non-compliant platforms accountable or addressing violations swiftly could prove challenging.

Diyah also stressed the need for greater public awareness and government communication about the regulation. Many Indonesians, she noted, remain unaware of the policy and its implications for their children’s online safety. This lack of awareness could hinder the regulation’s impact, especially if parents and guardians are not equipped to support or monitor their children’s digital activities.

The Broader Context: Digital Literacy as a Priority

Beyond legislative measures, experts argue that improving digital literacy is critical to protecting children online—a need that the regulation does not directly address. Diyah emphasized that digital literacy in Indonesia lags behind the rapid pace of technological advancement, leaving children vulnerable to online risks. She called for collaborative efforts between the government, private sector, and civil society to equip young users with critical skills to navigate digital spaces safely.

Indonesia’s push for child digital protection comes amid growing global concern over the impact of social media and digital platforms on young users. Studies worldwide have linked excessive screen time and exposure to harmful content with mental health issues, cyberbullying, and even addiction among children. In Southeast Asia, where smartphone penetration is high and internet access continues to expand, governments are increasingly grappling with how to regulate digital environments without stifling innovation or infringing on privacy.

In Indonesia, the stakes are particularly high. With a population of over 270 million, including a significant youth demographic, the country faces unique challenges in balancing technological progress with child safety. The new regulation represents a critical test of the government’s ability to address these challenges, but its success will depend on how well ambiguities are resolved and enforcement mechanisms are defined.

Public and Industry Reactions

Public sentiment toward the regulation appears mixed. While some parents and child advocacy groups have expressed cautious optimism, others worry that the policy may not go far enough in addressing real-world risks. A recent YouGov survey indicated that many Indonesian parents support age restrictions for social media, reflecting a broader desire for stricter controls over children’s online activities. However, without effective communication and education campaigns, such support may not translate into meaningful change at the household level.

Industry responses have been largely positive, with major ESPs signaling their willingness to adapt. Instagram’s introduction of Teen Accounts, for instance, demonstrates a proactive approach to aligning with local regulations. Yet, smaller platforms or those with fewer resources may struggle to meet the policy’s requirements, potentially creating an uneven playing field in the digital market.

Looking Ahead: A Policy in Progress

As Indonesia moves toward full implementation of its child digital protection regulation, the coming years will be crucial in determining its effectiveness. Resolving ambiguities around self-assessment, data privacy, and digital profiling will be essential to ensuring that the policy achieves its goal of safeguarding young users without creating unintended consequences. Equally important will be the government’s ability to foster digital literacy and public awareness, empowering communities to play an active role in child online safety.

For now, the regulation stands as a promising yet imperfect framework—one that reflects the complexities of regulating the digital world in a rapidly evolving society. Whether it can deliver on its promise to protect Indonesia’s youngest citizens remains an open question, one that stakeholders across government, industry, and civil society will need to address collaboratively in the months and years ahead.
