Abstract
At the TikTok & Children Symposium, the TikTok Cultures Research Network initiated a dialogue with TikTok Trust & Safety personnel to learn about their provisions and priorities for young people. The industry fireside chat was attended by scholars of TikTok cultures and intended to facilitate research agendas with real-world applications, enabling researchers to respond to market issues as they unfold. Such cross-sector conversations provide a rare opportunity to glean insight into backend processes, query how decision-making is structured to consider other stakeholders, and learn about how big tech companies may navigate competing priorities and negotiate tensions. In the edited transcript that follows, the dialogue focuses on timely issues pertaining to minor safety and well-being, platform design and user diversity, and industry-academic partnerships. The dialogue closes with a selection of Q&As from the session, focusing especially on design changes, API access, age-gating, and the balance between universality and regionality.
Background & introductions
On 8 May 2023, the TikTok Cultures Research Network (TCRN) hosted its seventh event titled the ‘TikTok & Children Symposium’, in conjunction with the Australian Research Council (ARC) Centre of Excellence for the Digital Child. The event opened with an industry fireside chat led by TCRN Founder Professor Crystal Abidin, with Ms Claire Gartland, Youth Safety & Wellbeing Global Product Policy Lead, Trust & Safety Global Product Policy (as of May 2023) and Ms Kathryn Grant, Outreach & Partnerships Manager, Research Partnerships Lead, TikTok Trust & Safety (as of May 2023). Below is an edited transcript of the conversation.
Welcome Kathryn and Claire! It is really good to see you both again. Perhaps for those of us who are not familiar with what it is like to work in industry, would you like to tell us a bit about your portfolio, who you are, and what you do with TikTok?
I mentioned that I lead our Youth Safety and Wellbeing team. This team is staffed with global experts in adolescent development, education and children's rights, who really consider how youth may be uniquely affected by content, interactions, and platform design features in ways that are developmentally different from experiences for adults. Our team's North Star is to mitigate psychological and physical risks and promote healthy expression and identity development, really working to ensure that all of our product policies are adapted to young people's best interests, unique developmental life stages, as well as globally diverse experiences. We will talk about this a little bit more shortly, but we are structured within the Product Policy Department as what we call a ‘policy horizontal’, which means we are organized to collaborate broadly across all product policy categories – such as exploitation and abuse, harassment and bullying, violent extremism, and misinformation – to really ensure that subject matter expertise is embedded into all aspects of the policies that we develop. Over to you, Kathryn.
Concretely, we tend to break down our portfolios on this team into topical, regional, or program roles. That means we have folks on our team who focus on a specific issue in terms of building partnerships in that space, or a specific region, or even a specific market, in terms of places where we really need to support the community on our platform. My role is a global program role, so I lead research partnerships, and that is something that we think about as a whole global effort. When we talk about bringing external expertise into the company, there are many different ways that we do that. Principally, I would say that it involves consultations, in which we try to bring in academic experts, but also folks with lived experience dealing with some of the issues that manifest online.
We also have Content and Safety Advisory Councils across the different regions that we operate in, and I will say there is at least one Youth Safety and Wellbeing Expert on each of those. What they tend to help us do is really localize our approach which is very important to how we think about safety on the platform.
We also run the Transparency and Accountability Centers. Those are physical spaces where people can come and learn about how the platform operates, what the opportunities are on TikTok, how our content moderation works, how our recommendation system works. We announced those right before the pandemic hit, and so it has been a little while coming. But right now we have sites in Los Angeles and Dublin that are open for visitors and we will have one in Singapore soon – we are excited to get that off the ground as well. 1
Here are a couple of youth safety-related things that I will mention as part of my portfolio; we have some partnerships that I think are worth highlighting. One is with the Tech Coalition which, for those who are not familiar, is an industry group that brings different companies together to fight against online child sexual exploitation and abuse. That has been a really meaningful place to have a role. I co-chair the research working group there, and that gives us an opportunity to bring in experts to talk to different companies at once. It also has been a path to providing funding for some really important research projects through the Tech Coalition Safe Online Fund. The other partnership that I will mention is with the Digital Wellness Lab that is based out of Boston Children's Hospital. They tend to focus on issues like problematic interactive media use, like screen time, and they look at different types of online platforms – entertainment platforms, social media, gaming – and how young people are spending their time online, whether it is quality time or simply a large quantity of time. They have been really helpful advisors to us as we continue to think about some of the tools and features that I think Claire will probably speak to more in a little bit.
I have also been spending a lot of time thinking about or supporting our work to develop and deploy a research API and internal processes to make sure that we have a really defensible approach to the research that we take on either internally or with partners. I will leave it there, but I am really excited to have this conversation.
Balancing platform-led safety and empowerment via ‘developmentally optimal’ experiences
But today, I think it would be quite tricky for anyone to claim that TikTok is still just a space for ‘children’ and for ‘kids’. We have seen that evolution in the demographic on the research side, but we have also always battled tensions with moral panics – like needing to ‘save the children’ – all the way to the other end of the spectrum, where we talk about agency, rights, and the ability to have digital literacy skills. That is what it looks like to us. As for you folks on the platform end, how do you feel that your management of policy or the overview of dealing with ‘children’ and ‘minors’ may have evolved over time?
As I mentioned earlier, we are structured as a policy ‘horizontal team’. We work quite closely with ‘vertical teams’. They are focused on things like exploitation and abuse, harassment and bullying, sexually suggestive content, and mental health. All of these teams really take a developmental approach from the start, while thinking about how we can scale those protections to protect our most vulnerable users while also balancing expression, privacy, and empowerment considerations for older users as well. We also have a dedicated product team within the Trust and Safety organization that is focused on minor safety across all aspects of the platform, whether it is the ‘For You’ feed, Live, Direct Messaging, or Comments – you name it. They are really focused on ensuring that we are enforcing these policies at a global scale, implementing proactive strategies to detect and prevent things like child sexual exploitation and abuse, as well as really leaning into promoting youth-centered product design. And you can see that in different things like Default Settings – higher privacy default settings that are really focused on youth safety by design – as well as some wellbeing tools that I will talk about a little bit later on, in terms of some updates we have made to screen time management and family pairing.
But again, we really think about this not just as protecting young people from potential risks, but as shaping their experiences online, while of course having back-end protections to make sure that really bad actors are not interacting with young people, and violative content is removed. And this approach is really reflected in our recently updated Community Guidelines. You may have seen that these went into effect last month on April 21st [2023]. As you know, these guidelines establish a code of conduct for using the platform, and they are informed by international legal frameworks, industry best practices, input from our community, safety and public health experts, and of course, as Kathryn noted, our regional Safety Advisory Councils as well. As part of the recent updates that came into effect last month, we introduced a new section dedicated to youth safety and wellbeing, which I think really reflects this sort of across-the-board ‘horizontal approach’ that we take to considering young people's best interests across all of the policies and product interventions we have in place.
That section details things like our minimum age requirements – as most folks know, you need to be 13 or older to have an account – as well as youth-specific content policies, which include things like child sexual abuse material, youth physical abuse, bullying, dangerous activities and challenges, exposure to overtly mature themes, and more. And really, a major part of what we include in this update is the efforts we take to ensure young people receive developmentally optimal experiences on TikTok, while also allowing them the choice to create the experiences that are right for them based on their own developmental stage. So this includes upstream risk prevention strategies such as limiting access to certain product features; as you might know, you cannot use Direct Messaging unless you are 16 or older on the app, and you cannot go Live unless you are 18 or older.
We have also developed content levels that sort content by levels of thematic comfort and I think this is an area where you see we are really baking in youth safety from the start while also allowing for content that might be appealing to older audiences to be shown in the ‘For You’ feed, when that is appropriate based on the age of that account holder. We also have more restrictive default privacy settings and we make content created by someone under the age of 16 ineligible for the ‘For You’ feed, while again also allowing older teens and adults to really have that full experience again, scaling that based on the maturity levels and comfort levels of our user population. Kathryn, let me turn it over to you to speak a little bit about this from a research perspective as well.
As I mentioned briefly in my introduction, one of my core responsibilities is to focus on building the right internal processes that support research here at TikTok. What that means in terms of youth safety and well-being is that as we grow and build our research function, we know that we also have to think about protecting young people who might be subjects of research. [For issues like] children who are using our platform [while aged] under-18, we need to think very carefully about how we use their data, if we interact with them more directly to conduct research, having all the right protections in place that I am sure this audience is very familiar with. It has been a really deliberate and thoughtful effort to ensure that we have the right resources and guidance in place before undertaking research, and before we leapt into the world of research partnerships more fully. We are thinking about making sure that our work adheres to the best practices in ethical research, responsible data handling, and then also the best way to articulate those expectations to potential partners.
The other thing that is on my mind that fits in this section is thinking about how we prioritize what research we take on. I think the safety of young people online is always going to be really high on the list of our priorities, given the sort of higher responsibility we feel to young people using our platform, and also given the demographic of people who use TikTok, who tend to be younger, though we do see a broadening of that as we grow and reach new communities. But I will say that as we prioritize that, it is always going to be a balance between staying nimble and [being] responsive to new trends and issues as they come up, but also remaining focused on long-term priorities. There are, of course, high harm potential areas that we have to continually think about and improve on, and then also persistent challenges that are really tough for the industry to handle; so we need to continuously evaluate the space and see if things are changing, and how we can adapt to be supportive of the young people who use TikTok. I will leave it there, but we do try to keep evolving as we see the challenges evolve, for sure.
New initiatives and partnerships
I know from our back-end conversations that you are working on many exciting projects, and I wonder if in relation to these two points, you can tell us a bit about what is coming up on the horizon for your teams? As you know, for the rest of today, in the five sessions where we will be hosting our panelists, we are going to be focused on five themes, they are: ‘care’, ‘parenting’, ‘concerns’, ‘play’, and ‘regulation’. So, is there any one of these themes you might want to respond to, so that you can give us a bit of a teaser on what is coming up in your departments?
Research also shows that being more aware of how we spend our time can help us be more intentional about the decisions we make. So, with that in mind, we are also prompting teens to set a daily screen time limit if they opt out of that 60-min default and spend more than 100 min on TikTok in a day. This really builds on previous efforts that we have rolled out to encourage teens to enable screen time management. We have found that those healthy use nudges have actually increased the use of our screen time tools by 234%. We are also starting to send teens weekly inbox notifications that recap their overall amount of screen time. These nudges are not trying to be overly prescriptive or to decide what is developmentally optimal for each user; rather, they are trying to encourage healthy use and more intentionality in how young people engage with the platform.
In terms of ‘parenting’, we have a tool called Family Pairing 4 , 5 which folks may be familiar with; that is TikTok's version of what we call ‘parental controls’. On TikTok, this allows parents and guardians to link their accounts with those of their teens and set up a variety of privacy and safety controls, such as screen time limits, setting a teen's account to private, deciding who can comment on their videos, as well as a range of other features. We see this as really enhancing our suite of safety tools, and it complements our work to provide greater access to product features as users reach key milestones for digital literacy. It is also part of our continued work toward providing parents with a better ability to guide their teens' online experience while allowing time to educate about online safety and digital citizenship. Our design philosophy for Family Pairing is always centered on balancing parental supervision with teens’ privacy and autonomy needs, while recognizing that different teens develop and mature at different rates. It is really about putting control in their hands – empowering teens and parents to set the types of limits that are right for them, based on offline conversations and their unique needs. But one of the key parts of this design philosophy is that teens have to actively agree to link their accounts to their parents through Family Pairing, and can unlink at any time. [This is] taking a really ‘privacy-first’ approach to designing Family Pairing, so things like private videos stay private, parents do not get access to Direct Messaging, and a range of other privacy considerations.
In terms of looking ahead, we are going to continue investing in these wellbeing product features – looking at areas like social comparison and other screen time issues, really ensuring that this is both global and research-backed, and developmentally informed – to continue empowering young people, their parents, and all users, in fact, with tools to create the experiences that are right for them. I will leave it there and turn it over to Kathryn.
As you know, we have committed to delivering a research API globally. Currently, it is open for application to US-based academics, but we are working to expand that further. 6 That site 7 will update as we go and move to the next spaces as we go along. That data sharing mechanism will include public data on videos, accounts, and comments. We see that as one way that we developed [a response] ahead of this regulation, but we do think it is part of our offering to be responsive to it. We are also working through the best mechanisms to share data with vetted researchers under the DSA. We are really grateful again, as we think about partnerships and how we work through these sometimes tricky intellectual problems of how you share data and protect privacy and everything, that we are in close touch with the European Digital Media Observatory, among others. It is just a really exciting time to have a really committed, innovative team here – and with our colleagues on other platforms – to think through new options and potential tools that TikTok could bring to the table.
To sum up, I have been spending a lot of time working on the research API and helping make sure that we tested it and that we improve as we go. As we roll that out to more markets, I hope we will see a lot of interest and see it get put to good use particularly to benefit young people online. That is what is on my mind in the regulation space, and hopefully it is useful to the audience here.
Research partnerships and academic-industry collaboration
On that note of being quite open to this conversation about your research access to platform data, as well as Claire's mention that your policy is actually all research-backed, I want to be a bit cheeky – we have a room full of academics who are very invested in TikTok. We do qualitative research; every time you launch a new feature, we are probably among the first people to know because we are on the platform, and we need to study the impact straight away. So, if you had a wish list for researchers and the types of research that we would do, or the types of knowledge that we could get from our very diverse perspectives – as you know, we have got scholars from six continents in attendance today – tell us a bit about what you would wish out of the ‘genies’ from us in terms of research projects, progress, concepts, findings.
I think in terms of the theme of ‘parenting’ that we touched on earlier, Family Pairing is a tool that we offer globally and one that we are continuing to iterate on to make sure that it is really serving the needs of our community. In that vein, we really see an opportunity for more research into regional and cultural variances in the parent-child relationship, and how this manifests in navigating online experiences and safety controls. We hear quite a bit about the ‘American style’ or an ‘American approach’ to parental supervision and parental controls, and we recognize that this is very different in different cultures – in particular, understanding what this looks like in different regions, perhaps where children drive household technology adoption and are often more advanced in digital literacy development than older generations. I think that would be really helpful for us and our product team colleagues as we continue building out these kinds of tools that empower parents to set some limits over their teens’ experience, while also building opportunities and bridges for offline conversations as well. Again, really understanding what that looks like at a regional and cultural level, and where those differences may lie. Another theme, speaking broadly, is safety by design considerations; I know this is something that governments around the world, and we internally, are thinking quite a bit about.
I think, Crystal, you had mentioned at the start of the conversation that we often hear folks really focused on sort of the risks and harms posed to young people online, but there are also many great benefits for young people when it comes to accessing technology and building online communities. We would welcome more research into best practices for empowering young people to access those benefits through safety by design strategies and, overall, how we should be striking balances between safety and expression, and ensuring that we honor and empower young people to exercise their digital human rights online. I will leave it there and turn it over to Kathryn, who I am sure has a long wishlist in this regard.
There is another thing that I will mention, since this is such a great, incredible group of researchers here. For TikTok, the data sharing space and more substantive research collaboration is still really new; in some ways, it is my ‘baby’ here at TikTok, and I really want it to thrive. And so to do that, I would really like the research community overall to feel empowered to give us feedback and hopefully be willing to try out different models of collaboration with us. I know that for that to work, we have to take responsibility for earning researchers’ trust. So to that end, making sure the tools we provide – such as the research API, when you are able to access it – is useful and reliable, and show that we can be responsive to research findings as we evolve our approach over time. Those things are really top of mind for me. I think we are really open to feedback. Once the API rolls out, 8 I would love to work with TikTok Cultures Research Network and do a demo, so that people know what is available and what you are in for with that data access. That kind of opportunity would be really exciting.
The last thing I will mention is something that comes up a lot for us: more youth-led research. That, to me, is very interesting and could be an area where partnering with folks like you all could make a lot of sense, just in terms of how to best set that up and make sure it is successful […] I am very excited to continue to collaborate and get feedback, and make sure that what we are able to provide in terms of research partnerships is really a win-win for everybody involved.
Q&A: Design changes
We have a question specifically for Claire: Do you have an example of the way that the design of TikTok has changed already due to child safety concerns?
Q&A: API access
Q&A: Age-gating
For me – and Claire, you can also chime in – I think what we are thinking about here is the trade-off in terms of privacy. Are there ways that we can verify users’ age without being invasive? That is a challenge that we can certainly partner with the research community to answer; and also, if we have to make a trade-off, what are our users [and] their parents most comfortable with? I think those are huge questions where having an evidence-based approach is really important. Claire, feel free to chime in, because I know you are thinking about this a lot as well.
Q&A: Universality & regionality
I think the other thing is, when we talk about the reasons for us to work with external partners on research, one really good indicator that is helpful for us is that the research partner would bring a new perspective. When we think about how to add the most value for research, that is a really important thing for us – whether it is a different region, a different culture, or a new methodological approach, things like that are really important to have a well-rounded research program here.
Parting words
Footnotes
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by a Strategic Research Investment Package at Curtin University, and an Australian Research Council Discovery Early Career Researcher Award (DECRA; DE190100789).
Notes
Author biographies
Crystal Abidin is Founder of the TikTok Cultures Research Network, and Associate Investigator at ARC Centre of Excellence for the Digital Child.
Claire Gartland is the former Youth Safety & Wellbeing Global Product Policy Lead, Trust & Safety Global Product Policy Department at TikTok.
Kathryn Grant is the Outreach & Partnerships Manager, Research Partnerships Lead, at TikTok Trust & Safety.
