Abstract
This article offers a tripartite framework of “law,” “community governance,” and “industry leadership” for studying how the regulation of child Influencers can be framed. It provides a brief overview of key issues pertaining to the regulation of children and online fame, especially tensions around definitions and scope, which can limit the jurisdiction of the law. It also offers a stocktake of child Influencer regulation in the Asia Pacific region. These data are based on a recently published report from the Influencer Ethnography Research Lab, which comprehensively reviews how 18 countries in the Asia Pacific region provide for the regulation of Influencers at large. The framework is guided by the analysis of industry and government policy documents and provides insights on categories of hard and soft governance; extrinsic and intrinsic regulation; and areas of governance including consumer, well-being, tax and advertising, labor, ethics, culture, and modesty.
The tripartite: law, community governance, and industry leadership
In Australia, the Office of the Children’s Guardian (2024, emphasis mine) stipulates regulations for “when and how babies, children and young people are employed for their modeling or acting services.” However, the Law Society of NSW Journal asserts that regulation to protect child Influencers and children on family social media channels still “lags far behind” (Woods, 2023, emphasis mine). While formal, legal regulation is still in progress—despite appeals and recommendations from scholars in academia (e.g., Livingstone et al., 2017, Waldo, 2023–2024)—community governance by avid social media followers fills an important gap.
As sociologist Alex Turvy and I have noted (Turvy & Abidin, 2025): Community governance is usually reactive, not preemptive, in response to controversy or crisis. During these periods, users recognize a “deviant” act as norm-breaking, label it as a violation of community standards, spotlight it in discussions, and escalate it for action via “soft governance.”
In other words, community governance relies on followers being first on the scene, the eyes on the ground, who constantly albeit voluntarily direct our—the public’s, the media’s, the government regulators’—attention to gaps in governance, and spotlight where further work may be needed.
While legal regulation is still a work in progress, and community governance is doing its best, industry leaders are stepping up. To name an example: The Australia Influencer Marketing Council (AiMCO, 2024b) “is the foremost Australian industry body that brings together the expertise of a diverse collective of industry professionals, marketers and content creators who are committed to elevating Influencer marketing best practice, campaign measurement and industry knowledge.” Its resources include the “Influencer Marketing Code of Practice,” “Ad Disclosure Guide,” “Guide to Gifting,” and the like, all co-produced with active and working professionals who support the Influencer industry (AiMCO, 2024a). But beyond Australia, what are the benchmarks for regulating child Influencers in the region?
Regulating children in the Asia Pacific influencer markets
Between 2021 and 2023, the Influencer Ethnography Research Lab conducted a comprehensive review of how 18 countries in the Asia Pacific region provide for the regulation of Influencers through government and law, industry and guidelines, and para-government and para-industry bodies. The countries reviewed include Australia, Cambodia, China, Hong Kong, India, Indonesia, Japan, Laos, Macau, Malaysia, New Zealand, Philippines, Singapore, South Korea, Taiwan, Thailand, and Vietnam, with the special inclusion of the United Arab Emirates (UAE) as a major market in the neighboring regions. We adopted a bilingual approach for almost all the countries, surveying:
(a) Publicly available data that noted government policies and regulations pointing to legal frameworks that apply to Influencer marketing.
(b) Publicly available documents from companies in the Influencer industry that noted best practice, community guidelines, and other forms of formal and informal governance.
(c) Press and news articles that have provided up-to-date overviews of Influencer cultures, emerging trends, and important issues.
(d) The academic scholarship that has taken stock of Influencer regulation in each local market.
Our efforts culminated in an expansive report, “Benchmarking Influencer Regulations in the Asia Pacific” (Abidin & Hong-Phuc, 2023), comprising both hard governance, where rules arise from formal legal regulation, and soft governance, where guidelines instruct on best practice or rely on persuasion but are not binding (cf. Maggetti, 2015).
Borrowing from the life sciences, these laws and guidelines can also be thought of as forms of intrinsic regulation and extrinsic regulation: the former when the Influencer industry self-regulates through its stakeholders to restore the status quo, and the latter when external intervention is warranted to bring the industry back under control. In the following sections, we consider a selection of examples of regulations that may apply to children in the Influencer industry.
Defining “child labor”
In the report, we looked into how different country markets define “child labour” (Abidin & Hong-Phuc, 2023, pp. 175–177). In general, we observed that (1) most jurisdictions focused on age-based benchmarking, generally prohibiting formal work for children under 13 or 15 years old, and stipulating graduated allowances for children between 14 and 16 or 18 years old; (2) some jurisdictions considered the types of work conducted and the environment of the workplace, delineating for instance industrial and non-industrial settings, safe and hazardous sectors, and light and heavy work; (3) some jurisdictions specified the duration of the work and the time of day when it took place, stipulating limits to working hours and prohibitions on work during school hours. A brief summary of how “child labor” is broadly defined across the 18 markets is below, accurate as at the time of writing:
Australia: Regulated by state and territory laws, generally prohibiting employment of children below 13 and placing restrictions on children aged 13–15.
Cambodia: Prohibits employment of children under 15, with some exceptions allowing for light work.
China: Prohibits recruitment of minors under 16, with some exceptions provided for by the State (He, 2016).
Hong Kong: Prohibits employment below age 13, with restrictions for ages 13–15.
India: Prohibits employment of children below 14 and restricts adolescents (14–18) from hazardous work.
Indonesia: Minimum employment age is 15, but children aged 13–15 can engage in light work.
Japan: Children under 18 may be employed subject to strict restrictions on working hours and work on rest days, a prohibition on night work, and restrictions on engaging in dangerous and harmful jobs; however, children under 15 shall not be employed as workers. Children under 13 may exceptionally be employed in motion picture production and theatrical performance with permission (Ministry of Foreign Affairs of Japan, n.d.).
Laos: Prohibitions on child labor, with limitations and penalties. Regulations on working hours and hazardous sectors for children.
Macau: Restrictions on types of work for children under 16.
Malaysia: Employment of children under 15 is restricted, with exceptions for certain types of work.
New Zealand: Employment agreements for individuals under 18 with assistance. Restrictions on work hours during school and penalties for non-compliance.
Philippines: Children below 15 are generally prohibited from employment.
Singapore: Minimum age for employment is 13, with restrictions for those below 15.
South Korea: Minimum age for employment is 15.
Taiwan: Employment is generally prohibited for children under 15.
Thailand: Prohibits employment of children under 15.
Vietnam: Prohibits employment of anyone under 15.
UAE: Prohibits employment of children under 15, with some exceptions for light work.
Primary focus of online protection for children
In the report, we scoped the primary focus of online protection for children, and where data were available, we reviewed the specific law, act, or guideline that stipulates the online or digital well-being issues that the government is focused on (Abidin & Hong-Phuc, 2023, pp. 177–185). In general, we observed that (1) most countries focus on privacy acts and the protection of personal data, with specific prohibitions on the collection of personal data from children; (2) age-based prohibitions begin as young as 10 years old and end as old as 20 years old; (3) some countries consider “online harm,” although definitions are vague; where specified, they refer to cyberbullying, online harassment, mental well-being, and sexual exploitation. A brief summary of our findings across the 18 markets is below, accurate as at the time of writing:
Australia: The Australian Privacy Principles (APPs) and Privacy Act 1988 ensure organizations only collect the personal data of minors when necessary. The eSafety Commissioner specifically focuses on cyberbullying and other online harms that target minors.
Cambodia: While there is no specific digital privacy law for children, the Law on Suppression of Human Trafficking and Sexual Exploitation does offer some protections against online child exploitation.
China: The Cybersecurity Law and Cyber Protection of Children’s Personal Data restrict collecting personal data from minors under 14 without parental consent.
Hong Kong: The office of the Privacy Commissioner for Personal Data provides guidelines for handling personal data involving children, while the Personal Data (Privacy) Ordinance (PDPO) mandates that data collection must be necessary and not excessive. Consent is often sought from parents for children.
India: The draft Personal Data Protection Bill, 2019 designates children as data principals with certain rights, and entities interacting with children online must ensure higher standards of data protection and seek parental consent.
Indonesia: The Child Protection Law considers any harm to the child’s mental well-being as abuse, which could extend to online harm.
Japan: The Child Protection Law and the Act on the Protection of Personal Information (APPI) consider any harm to the child’s mental well-being and privacy as abuse, which could extend to online harm, although specifics about children are not detailed.
Laos: Information on legal provisions is not readily accessible, although inroads are being made in partnership with UNICEF, which advocates for children’s rights in digital spaces (Fukami, 2024).
Macau: The Personal Data Protection Act (PDPA) does not specify different regulations for children, but organizations usually seek parental consent when dealing with minors.
Malaysia: The PDPA (2010) generally requires consent for data collection, with added emphasis on obtaining guardian consent for minors. Similarly, the Child Act 2001 concerns the protection and welfare of children.
New Zealand: The Privacy Act 2020 governs the collection of personal information, and its fair-collection principles take particular account of information collected from children and young persons.
Philippines: The Data Privacy Act seeks consent for data collection, especially important when dealing with minors.
Singapore: The PDPA (2012) is general, and organizations often have stricter guidelines when dealing with data of minors to ensure protection and seek parental consent.
South Korea: Specific laws govern online content that might be harmful to minors, require parental consent for websites collecting data from those under 14, and provide guidelines for protecting children and adolescents in internet-based media.
Taiwan: Under the PDPA, businesses need consent to collect, process, or use the data of individuals under 20, making it unique in specifying an age limit higher than the usual “minor” definition.
Thailand: The PDPA stipulates that parental consent is required for collecting data from children under the age of 10.
Vietnam: The Law on Children (2016) emphasizes the protection of children’s privacy rights in digital environments, including stipulations against online harassment, abuse, and exploitation.
UAE: While there is no specific federal law on children’s online privacy, various emirates have regulations to ensure children’s safety, which can extend to their online presence.
Regulations and guidelines specific to children in the Influencer industry
In reviewing the legal provisions, para-government initiatives, and industry guidelines, our report surfaced some regulations and guidelines that address children in the Influencer industry and in commercial social media spaces more specifically (Abidin & Hong-Phuc, 2023, pp. 185–188). A selection of examples includes:
Australia: The Australian Competition and Consumer Commission’s (ACCC) latest interim report for the Digital Platforms Services Inquiry has acknowledged key issues relevant to kidfluencers, including privacy concerns and possible labor exploitation issues. Concerns have been raised in Australia regarding a current lack of protections for young Australian content creators.
China: The Administration of Internet Livestreaming E-commerce Information, Content, and Services stipulates that all livestreamers must be over 16 to operate without parental consent.
Japan: Ohisama, a firm specializing in social insurance and labor, states that child YouTubers only fall under the jurisdiction of the Labor Standards Act if a company technically employs them as YouTubers. If they are unaffiliated with a company and only make revenue directly through YouTube’s systems, it can be assumed that they would not be breaking the law.
Korea: The Korea Communications Commission (KCC) published the Guideline for the Protection of Children (under age 14) and Adolescents (under 19) to protect children and adolescents who appear in internet-based personal media broadcasting. Content creators and MCN channels should avoid sexualizing children and should not insert advertisements in content. Children should not livestream at night (22:00–06:00 hrs) nor broadcast for three straight hours without breaks.
Conclusion
Words matter, because definitions matter, because the potential scope and applications of the law are intertwined with permissible interpretations of the law. The tripartite of legal governance by government, community governance by followers, and industry governance by stakeholders continues to be important, as platform governance by platforms is not universal and is subject to the whims and fancies of the owners and CEOs of big corporations that are ultimately motivated by profit. Furthermore, while some platforms have been making inroads into improving the well-being of their underaged users, and especially children entangled in the Influencer industry, their efforts are limited to recommendations, best practice, and community guidelines rather than legal jurisdiction with more concrete penalties and consequences for violators. But put together, all of these stakeholders can constitute a form of patchwork governance (cf. McDonnell, 2017) to fill the voids—even if temporarily—left by another party until improvements are made; these will be a collective attempt to take shared ownership and responsibility for the next generation, who will live with the consequences of our generational actions and inaction, long after we are gone.
Acknowledgements
The author thanks IERLab Research Assistant Naomi Robinson for assistance with referencing work.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The production of this Special Issue was supported by an Australian Research Council Discovery Early Career Researcher Award (DE190100789), held at the Influencer Ethnography Research Lab (IERLab) at Curtin University, and Strategic Research Investment from the Faculty of Humanities and Research Office at Curtin University.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
