- Ages 3 to 5 years.
- Ages 6 to 9 years.
- Ages 10 to 12 years.
- Ages 13 to 15 years.
- Ages 16 to under 18 years.
The Tunas Government Regulation restores the prerogative of supervision to parents through Article 9, which differentiates the consent mechanism according to the child's age and level of independence:
- Children under 17 years of age (opt-in): The platform is strictly prohibited from opening account access before obtaining explicit parental permission. The PSE must obtain approval within 24 hours; if it does not, access must be closed. This is a form of strict gatekeeping to protect children who are not yet psychologically mature.
- Children aged 17 to under 18 years (opt-out): At this transitional age toward adulthood, permission is deemed granted unless parents express rejection within 6 hours. This mechanism gives adolescents greater autonomy while keeping them under the parental safety net.
This system forces platforms to abandon box-ticking age verification and shifts the burden of verification onto their technology.
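The two-tier consent mechanism described above can be modeled as a simple decision rule. This is only an illustrative sketch of the logic as summarized in this article, not an official or authoritative implementation of the regulation; the function name, status labels, and parameter shapes are all hypothetical.

```python
from typing import Optional

# Windows as summarized above (Article 9 of the Tunas regulation):
OPT_IN_WINDOW_HOURS = 24   # under 17: approval must arrive within 24 hours
OPT_OUT_WINDOW_HOURS = 6   # 17 to under 18: rejection window of 6 hours

def consent_status(age: int,
                   hours_elapsed: float,
                   parental_response: Optional[bool]) -> str:
    """Model the account-access state for a minor.

    parental_response: True = approved, False = rejected, None = no response yet.
    Returns "open", "pending", or "closed".
    """
    if age >= 18:
        return "open"      # adults are outside the regulation's scope
    if age < 17:
        # Opt-in regime: access may not be opened without explicit approval.
        if parental_response is True:
            return "open"
        if parental_response is False or hours_elapsed > OPT_IN_WINDOW_HOURS:
            return "closed"   # no approval within 24 hours: access must be closed
        return "pending"      # still awaiting consent inside the window
    # Opt-out regime (17 to under 18): deemed granted unless rejected in time.
    if parental_response is False and hours_elapsed <= OPT_OUT_WINDOW_HOURS:
        return "closed"       # timely rejection revokes the deemed consent
    return "open"
```

For example, a 16-year-old with no parental response after 30 hours would be `"closed"`, while a 17-year-old with no rejection after 10 hours would be `"open"`.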
Article 5 of the Tunas Government Regulation radically shifts the burden of proving risk from the public onto digital platforms. Electronic System Operators (ESOs) are now required to conduct self-assessments of their products' risk profiles, especially regarding addiction.
The government no longer tolerates algorithms designed predatorily to trigger behavioral changes and dependence. Risk assessments include exposure to pornography, violence, exploitation of children as consumers, and psychological and physiological health problems.
"This is not just a new policy, it's a change of habit, a change of behavior... which definitely requires time and effort, including efforts to combat addiction which may not be easy or comfortable for both children and parents."
This regulation affirms that children's personal data is a subject of protection, not a commodity. Article 19 strictly prohibits platforms from profiling children by default for the purpose of offering products or advertisements.
In addition, Article 18 prohibits the collection of children's precise location (geolocation) data without a compelling and narrowly limited reason. This is reinforced by the principle in Article 8: the best interests of the child must be prioritized above the commercial interests of the company. In other words, the financial gains of Big Tech must not come at the cost of the privacy and physical safety of Indonesian children.
The Government provides no room for PSEs to ignore these rules. Article 38 stipulates progressive administrative sanctions designed to compel compliance:
- Written Warning: As an initial warning for minor or unintentional violations.
- Administrative Fine: Financial sanctions targeting the profits of stubborn platforms.
- Temporary Suspension: Termination of services on specific features or products proven to be in violation.
- Access Termination (Blocking): This is the "Nuclear Option"—permanently terminating platform access within the Indonesian territory for Electronic System Providers that continue to defy the nation's legal sovereignty.
The presence of Government Regulation Number 17 of 2025 is a significant step, but regulation is only half the battle. The success of this rule heavily relies on our courage as parents and the community to participate in safeguarding, monitoring, and reprimanding platforms that refuse to comply.
The state has chosen its path: delaying access until children are truly ready. The question now is, are we as a society ready to change our digital habits, or will we allow algorithms to continue dictating the future of the nation's next generation? Let's wait until the children are ready.