Abstract
Children’s digital playgrounds have evolved from commercialized digital spaces such as websites and games to include an array of convergent digital media consisting of social media platforms, mobile apps, and the internet of toys. In these digital spaces, children’s data is shared with companies for analytics, personalization, and advertising. This article describes children’s digital playgrounds as a data assemblage involving commercial surveillance of children, ages 3–12. The privacy sweep is used as a method to follow the personal information traces that can be expected to be disclosed through typical use of two children’s digital playgrounds: the YouTube Kids app and Fisher-Price Smart Toy plush animal and companion app. To trace the data flows, privacy policies and other publicly available documents were analyzed using political economy and privacy informed indicators. This article concludes by reflecting upon the dataveillance and commercialization practices that trouble the privacy rights of the child and parent when data assemblages in children’s digital playgrounds are surveillant.
Introduction
In 2012, Google Inc. filed a patent application for a “device … perhaps in the form factor of a doll or toy … to control one or more media devices” (DeVaul and Aminzade, 2012). If the technology from the patent application materializes, it would allow a young child (or other user) to interact with a playful interface to operate a device like a smart television. Google’s patent application illustrates some of the emergent possibilities encompassing children’s digital playgrounds that incorporate the internet of things (IoT) and the internet of toys (IoToys). The International Telecommunication Union (ITU) describes the IoT “as a global infrastructure for the information society, enabling advanced services by interconnecting (physical and virtual) things based on existing and evolving interoperable information and communication technologies” (ITU, 2013: 1). Expanding this definition from things to play-related interfaces, Holloway and Green (2016: 506) specify that the IoToys is “where toys not only relate one-on-one to children but are wirelessly connected to other toys and/or database data”.
Children’s digital playgrounds thus encompass both an emergent app economy and ecosystem (Burroughs, 2017) and the IoToys. Within this milieu, an array of play-related websites, games, toys, and apps are scaffolded upon the internet to engage children and their caregivers in disclosing data or personal information for analysis and use by corporations or other actors. This may include sharing biometric data (voice, facial image) or personal information (name, age). Although it may seem futuristic, the big data analytics of the app economy and IoToys is operational today, with the valuation of the IoToys market expected to reach $11.3 billion by 2020 (2015 Juniper Research forecast, cited in Mascheroni and Holloway, 2017).
This article expands upon earlier research on digital playgrounds (Chung and Grimes, 2006; Grimes and Shade, 2005), which refer to the commercialized digital spaces where marketers leverage the data that young people share in websites and games. We extend the idea of the digital playground to include convergent digital media consisting of social media platforms, mobile apps, and the connected contexts of the IoToys designed for young people ages 3–12. The privacy implications for children in this age range who may encounter these digital playgrounds are pressing to examine because they are the group excluded from many social media platforms as users (e.g., Facebook, Twitter, Snapchat). In place of social media sites, children’s apps and IoToy usage are readily emerging as the data production and processing hub for commercially relevant information about this very young consumer age group. Although privacy legislation, notably the Children’s Online Privacy Protection Act (COPPA) in the US, deters youngsters under 13 from creating accounts with many websites and services, their data is still collected via parentally controlled accounts or apps (Federal Trade Commission, 1998–2015).
This article takes a social constructivist view of technology1 to examine the emerging data assemblage of technologies and practices that facilitate the surveillance of youngsters. We analyze YouTube Kids and the Fisher-Price Smart Toy as part of the data assemblage of children’s digital playgrounds. YouTube Kids was selected for analysis because of the prominence of Google in generating value from our digital activities; what Vaidhyanathan (2012) refers to as “the Googlization of everything”, which now extends to the youth market. Announcing the launch of YouTube Kids, Google described it as “the first Google product built from the ground up with little ones in mind” (Google Blog, 2015). In addition to YouTube Kids, the 2012 patent application by Google makes clear that toy-like devices could be developed as part of the IoToys. We thus include the Fisher-Price Smart Toy in our analysis as one example of a plush toy that is currently available and which augments apps, games, websites and other interfaces that constitute the IoToys.
To analyze YouTube Kids and the Fisher-Price Smart Toy, we introduce the privacy sweep, as a method used by privacy enforcement agencies to bring public awareness, compliance, and regulatory cooperation with privacy legislation. As we argue, problematic privacy risks and the commercialization of play are evident in the current configuration and nodes of children’s digital playgrounds. In our discussion and conclusion, we present five reflections from the data assemblage that emerge from our privacy sweep: (1) parents are acting as data proxies and supervisors for children; (2) personalization is proposed as the benefit for disclosing data; (3) promotional culture and advertising is present in children’s digital playgrounds; (4) digital playgrounds exhibit algorithmic opaqueness; and (5) the positioning of organizations must be considered in relation to data stewardship.
Digital playgrounds as data assemblage
A data assemblage refers to a heterogeneous network and socio-technical system “whose central concern is the production, management, analysis and translation of data and derived information products for commercial, governmental, administrative, bureaucratic or other purposes” (Kitchin and Lauriault, 2018: 8). Kitchin and Lauriault describe a data assemblage as intertwined and overlapping apparatuses (e.g., political economy, marketplaces, subjectivities and communities) involved in producing data. They link the concept of the data assemblage to various theories, which aid us to interrogate the idea; for instance, foundational to the data assemblage is Foucault’s theory of the dispositif and the power/knowledge relationship, which can explain how data infrastructures are “always inscribed in … power” (p. 9).
Within this article, actor network theory, as a branch of the social construction of technology, assists us to trace a data assemblage. Building upon studies of the production of scientific knowledge and technologies, Latour (2007) described an increasing traceability of activities through digitally mediated technologies. Emerging data assemblages, including those produced within children’s digital playgrounds, can be viewed as sites where research is needed, reinforcing Kitchin’s call for a “pressing need for case studies that trace out the sociotechnical arrangements of whole assemblages” (2014: 188).
Earlier digital playgrounds provide a foundation to understand the commercialized and surveilled data assemblage associated with the current design of digital playgrounds, including platforms, apps, and the IoToys. NeoPets, launched in 1999, is an example of a popular digital playground. Designed as an online community for children to create and nourish a digital pet, the site generated revenue from immersive advertising™ (a term they trademarked), a practice derived from product placement techniques in broadcasting (Grimes and Shade, 2005). NeoPets ushered in an array of digital playgrounds including virtual worlds, gaming (massively multiplayer online games), and interactive websites featuring stories and contests. As Grimes documents, toy companies, children’s media companies, and private and public television networks established virtual worlds as extensions of their popular brands and toylines (2015a: 111), with many garnering “significant profits through the sale of tie-in toys, media crossovers, and other ancillary products” (2015b: 128). Advertising strategies were integrated into the sites and games (“advergames”), including product placement and micro-transactions (“pay to play”), raising “a new set of questions in terms of how players’ labor and intellectual property become commodified in the process” (2013: 394).
Big data has since shaped digital playgrounds, configuring family dynamics, escalating marketing techniques via data-driven advertising techniques like programmatic advertising, and heightening concerns over how datafication intensifies surveillance and commodification, while lessening children’s privacy rights. Throughout their life cycle, children and young people are intrinsically enveloped in an infrastructure of digital surveillance, from cradle through their school years (Leaver, 2017; Shade and Singh, 2016). Different types of digital surveillance are implicated, including self-surveillance, social and participatory surveillance, and familial surveillance. Children are also, as Lupton and Williamson argue, “configured as algorithmic assemblages as the result of these [datafied] practices, with the possibility that their complexities, potentialities and opportunities may be circumscribed” (2017: 787).
Taylor and Rooney highlight how digital technologies increasingly blur the boundaries of private and public lives. Using the example of the Wi-Fi-connected Hello Barbie talking doll, an instance of the IoToys, they remark that, “the family perimeter has become porous and commercial entities are encroaching further into children’s lives, using surveillance techniques that are increasingly difficult for children and families to resist” (2017: 6–7). Digitally connected devices rely on information that can be transmitted “from the sanctity of the family home” (6) to remote databases managed by corporately owned data analytics firms. Children’s digital play is thus suffused with a surveillant assemblage, wherein an abstracted notion of the child—consisting of demographic characteristics and discrete pieces of information—becomes valuable to corporations and marketers (Rooney, 2012). As Haggerty and Ericson (2000) describe, a surveillant assemblage occurs when a data double can be constructed from “a series of discrete flows” which are “reassembled” to represent a person (605). As we detail in the conclusion of this article, while children and young people engage with an array of digital technologies throughout their life cycle, the persistent commercialization and datafication of their play raise ethical tensions and privacy concerns about whether the parent can maintain control over their child’s digital identity (Berman and Albright, 2017).
The privacy sweep as a method to trace the surveillant data assemblage of the digital playgrounds
Two illustrative nodes of children’s digital playgrounds that demonstrate the commercialization, datafication, and the surveillant assemblage that we describe in this article are the YouTube Kids app and the Fisher-Price Smart Toy line.
One challenge in researching the data assemblage of digital playgrounds with these types of technologies is that companies collect and use data in ways difficult for ordinary users to discern. For this reason, we draw from the method of a privacy sweep, developed by privacy enforcement agencies, to begin to interrogate the flow of data. Theoretically, our use of the privacy sweep is informed by actor network theory and the idea that our digital activities are increasingly traceable (Latour, 2007). To develop indicators for the privacy sweep, we draw from the work of the Global Privacy Enforcement Network (GPEN), which has developed a range of questions and prompts. Adapting a selection of the GPEN indicators assists us to examine what personal information is collected about identifiable individuals who interact with digital playground technologies. In our analysis, we are also informed by political economy, which is identified by Kitchin (2014) as an apparatus to interrogate the broader context in which a data assemblage emerges. Exploring how personal information is handled in children’s digital playgrounds enables us to explore the context of commercial surveillance. The following sub-sections outline previous GPEN privacy sweeps and introduce the indicators we use in this paper for analysis.
Previous GPEN privacy sweeps
The GPEN involves collaboration between 47 countries that regulate privacy in various ways (GPEN, n.d.). In 2013, GPEN adopted the language of a privacy sweep to refer to a rapid, but systematic and distributed, effort by privacy enforcing agencies (PEAs) to assess digital technologies for privacy risks. As described by Bennett (2015), a privacy sweep is “a broad cross-national research exercise designed mainly to recreate the consumer experience and to assess the transparency of personal information practices against a common set of indicators” (2). GPEN sweeps have established privacy relevant indicators to allow for rapid examination of privacy policies (2013), mobile apps (2014), apps and websites for children (2015), the IoT (2016), and educational applications for school aged youth (2017) (Office of the Privacy Commissioner of Canada (OPC), 2016a; Information Commissioner’s Office of the United Kingdom (ICO), 2016a).
Developing indicators and selecting texts for analysis
Table 1. Privacy sweep indicators.a
aIndicator 1 adapted from Kitchin (2014). Indicators 2–3 are largely quoted from the OPC (2016b) with both omissions and additions made by the authors.4
Table 2. YouTube Kids and Family Link: Indicator 2.
Table 3. Fisher-Price Smart Toy: Indicator 2.
Indicator one is informed by Kitchin’s (2014) attention to the political economy considerations of a data assemblage. Mosco (1996) described political economy of communication as the study of power relations in the “production, distribution and consumption of resources” (25). Here we extend this to consider the broader context of children’s toys, which situates personal information as an increasingly valuable commodity. To examine the political-economic context surrounding YouTube Kids and the Fisher-Price Smart Toy, we analyzed an array of texts including blog posts, popular press pieces, and websites.
Indicators two and three begin with an exploration of what personal information is collected, used, and disclosed in children’s digital playgrounds. Our privacy sweep’s attention to the collection, use and disclosure of personal information relies upon the notice and consent model of privacy. To consider personal information relevant indicators for YouTube Kids and Fisher-Price Smart Toy, we examined the privacy policies, homepages, and associated online documents (see Google Family Link, n.d.; Smart Toy, n.d.-b, n.d.-c; YouTube Kids, n.d.; YouTube Kids Parental Guide, 2018 a-d). Recognizing that privacy policies and associated documents can be changed, or sometimes even removed from the web, we archived the materials using the Internet Archive.3 For the Fisher-Price Smart Toy, we also analyzed a vulnerability report from a cybersecurity firm that detailed specific elements of personal information collected about young people through the toy that were vulnerable to hacking (Stanislav, 2016). Reviewing these documents and focusing on personal information facilitates an exploration of what types of profiles may be compiled on young people.
In planning our methodological steps for analysis, it is important to note that we were constrained by the Terms of Use for the Fisher-Price Smart Toy. The Smart Toy Terms of Use document outlines that it does not permit the “use of the Smart Toy App for purposes of comparison with or benchmarking against products or services made available by third parties” (n.d.-c). Although this clause seems unduly restrictive, we chose to adhere to the Smart Toy Terms of Use by focusing our analysis on texts and various materials that are available outside of the app. Table 1 introduces the indicators we utilized and subsequently we share our results.
Results of a privacy sweep of nodes in digital playgrounds
YouTube Kids
Indicator 1: What political economy elements surround the implementation of the digital playground node and its privacy policy?
Google launched YouTube Kids, “a safer version of YouTube” (YouTube Official Blog, 2015) in 2015, enabling children and parents to browse a curated “family friendly” version of their prolific site, featuring popular children’s videos and new content. YouTube’s global head of family and learning remarked that navigation was simple and visually attractive for children, “with big, colorful, swipeable buttons designed for little pudgy fingers” (Hamedy, 2015). Parental controls were built into the app: a timer to regulate screen time, an option to turn off and regulate the sound on videos, a method to regulate search options to pre-selected videos chosen by parents, and a mode to communicate product feedback (Google Blog, 2015). Two years after its launch, Google boasted the app received “30 billion views and over 8 million weekly active viewers” (Google Blog, 2017).
Children’s media brands have a presence on YouTube Kids through their channels. These include Mattel (Barbie, Hot Wheels, Fisher-Price products), Hasbro, Disney, Crayola, Lego, Nintendo and Nerf. Marketing approaches include animated shows featuring toys as characters, children’s play (e.g., toy competitions—races with Hot Wheels, Nerf wars), and “how to” videos (crafts/DIY projects). A popular genre of video blurring the line between kid-generated and commercial productions is the “unboxing video”, where commercial products are literally unwrapped from their packaging, constructed, and played with. Popular unboxing channels can attract billions of views and generate significant revenues for their creators (Marsh, 2016).
Algorithms mediate the content that children view. In their instructions for search, Google details that, “Videos in search results are selected by our algorithm without human review. We’ve taken a number of precautions to ensure that families searching in YouTube Kids will see results that are appropriate for younger audiences” (YouTube Kids Parental Guide, 2018c). Algorithms recommend and prompt children to click on popular content, shaping “new expectations about narrative structure and informational environments” (LaFrance, 2017). During our use of YouTube Kids, it was readily apparent that typing a connected toy’s name, such as the Fisher-Price Smart Toy, cues up numerous unboxing and toy review videos for a young person to consume. In this manner the IoToys is both represented through content in video player apps, and enacted through data collection and analytics.
Depending on how children sign in to YouTube Kids, there are two relevant privacy policies. The most common is for the child/parent to sign in to YouTube Kids and adhere to relevant portions of the Google Privacy Policy. The other, launched in 2017, allows children under the age of 13 to create their own Google account, and sign in via Family Link. Family Link, available only in the US, enables parents to create Google accounts for their children under the age of 13, thus remaining COPPA compliant. To create a child’s Google account, a parent must read the Family Link Disclosure for parents, including both their Privacy Notice for Google Accounts Managed with Family Link, and the Google Privacy Policy, and give consent by authorizing a “small fee” on their credit card. Until this consent is provided, “Google will not knowingly collect, use or disclose your child’s personal information unless you’ve provided this consent” (Google Family Link, n.d.-c).
A dedicated webpage provides detail on the privacy policies (YouTube Kids Parental Guide, 2018d), with the page linking to the Family Link page (Google Family Link, n.d.-a). In addition, Google’s Privacy Policy (n.d.) provides detailed information from the YouTube Kids Privacy Policy that may be applicable.
Indicator 2: Do privacy communications adequately explain how personal information is collected, used, and disclosed?
For YouTube Kids (See Table 2): Information is collected for internal operations: spam/abuse prevention, content license protection, determining language, and improving services. Information is also collected to offer personalized content, provide contextual advertising and frequency capping. Children cannot share personal information with third parties or make data publicly available.
Family Link specifies that information is collected to: communicate important notices; to “provide, maintain, develop, and improve our products, content, and services”; to authenticate/identify the child’s identity; and “to protect Google and our users.” Information may be used internally for auditing, data analysis, and research. Explicitly stated is that the privacy policy does not apply to any third party apps and websites that the child may use.
Indicator 3: Are users fully informed about how personal information collected by the device is stored and safeguarded?
YouTube Kids: No, but Google’s Privacy Policy states that information, including personal information, may be stored locally on one’s personal device through browser web storage and application data caches. A section on Information Security in the Google Privacy Policy describes encryption via secure sockets layer (SSL); two-step verification; Chrome Safe Browsing; and the restriction of access to personal information to those Google employees, contractors, and agents who need this information to process it for Google. It states that these agents adhere to “strict contractual confidentiality obligations and may be disciplined or terminated if they fail to meet these obligations.”
Family Link: Not explicitly; local storage on the child’s device may be deployed.
Fisher-Price Smart Toy
Indicator 1: What political economy elements surround the implementation of the digital playground node and its privacy policy?
Fisher-Price is a toy company founded in 1930 and well known for iconic children’s toys, including the Little People line and Chatter Telephone (Fisher-Price, n.d.-a). Fisher-Price merged with toy giant Mattel in 1993, and in the IoToys market, they are currently “teamed up” with Smart Toy, a limited liability company (LLC), to release plush monkey, bear, and panda toys (Fisher-Price, n.d.-a; Smart Toy, n.d.-a).
Smart Toy was originally founded by designer Carly Gloge and programmer Isaac Squires through their 2013 crowdfunded Kickstarter campaign to bring the Ubooly smart toy (a plush smart toy which housed a mobile phone or tablet to act as the interactive face and the interface for playtime with a child) to market (Kickstarter, n.d.; Smart Toy, n.d.-a; Ubooly, n.d.). Ubooly was bought by “Cartwheel Kids, LLC in 2014” (Watry, 2015) and rebranded to “Smart Toy.” Seeking collaborators, there were “discussions with entertainment partners, including Disney, about the possibility of incorporating Smart Toys’ platform into the companies’ brands” (Hutchins, 2015). Incorporation of the platform was achieved in 2015 with Fisher-Price; the Ubooly Kickstarter page positioned their “Monkey, Bear and Panda” as Ubooly’s “siblings.” The Fisher-Price Smart Toys now include the plush animals, a companion app, and smart cards that provide different programs or activities, such as playing a game or making up a story. The product’s homepage describes Smart Toy as “an interactive learning friend with all the brains of a computer, without the screen” (Fisher-Price, n.d.-b), geared for children aged 3–8 (Smart Toy, n.d.-d). Features of the Fisher-Price Smart Toy are highlighted for children and parents through video clips (Fisher-Price, n.d.-b) and through unboxing and review videos accessible through YouTube Kids.
Indicator 2: Do privacy communications adequately explain how personal information is collected, used, and disclosed?
The Fisher-Price Smart Toy privacy policy and terms of use documents each consist of a webpage, with the Fisher-Price and Smart Toy logos at the top (Smart Toy, n.d.-b, n.d.-c). Both the privacy policy and terms of use distinguish Smart Toy as its own LLC. From the terms of use document:

Welcome to the world of Smart Toy®. Our toys allow kids to have conversations and go on magical adventures with their toys. A Smart Toy® will function without any connectivity, but pairing it to your home’s Wi-Fi connection via the parent app (“Smart Toy App”) will allow the toy to automatically update with new content, know your child’s name and receive firmware updates.
Indicator 3: Are users fully informed about how personal information collected by the device is stored and safeguarded?
The Fisher-Price Smart Toy privacy policy emphasizes where data may be stored by the Smart Toy Service, without specific information about securing any data stored on the toy or about how the toy may transmit data to the app or to Smart Toy for service delivery. In terms of locating data, the policy states that Smart Toy is in the US, and “you consent to the processing and transfer of information in and to the U.S. and other countries.” For European users, the policy also specifies that data “may be disclosed to overseas recipients located … outside of the European Economic Area” and without similar data protection provisions. The privacy policy does not contain any details about how data will be safeguarded (e.g., the use of encryption or secure transmission protocols) from the toy as a device that interfaces with an app or backend analytics through an API.
The lack of information in the privacy policy about the interface between the toy, the app, and Smart Toy is troubling given the past security vulnerabilities uncovered by Stanislav (2016). Specifically, Stanislav “discovered that the [Fisher-Price Smart Toy] programmers had designed the bear’s backend to use unsecured application programming interfaces (APIs)—portions of code that allow pieces of software to interact—enabling an attacker to learn profile information about registered children, such as names, dates of birth, genders, and spoken languages” (Hackett, 2016).
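To make the class of weakness Stanislav describes concrete for readers unfamiliar with APIs, the following minimal Python sketch contrasts an unauthenticated profile endpoint with a token-checked variant. All identifiers, field names, and tokens here are invented for illustration; this does not reproduce the actual Smart Toy backend.

```python
# Hypothetical illustration only: the profile fields, IDs, and tokens
# are invented and do not reflect the real Smart Toy service.

PROFILES = {
    "child-001": {"name": "Alex", "birth_year": 2012, "language": "en"},
}

def insecure_get_profile(child_id):
    """Unsecured endpoint: returns a child's profile to ANY caller
    who supplies (or enumerates) an ID -- no authentication check."""
    return PROFILES.get(child_id)

# token -> the child ID that token is authorized to access
VALID_TOKENS = {"parent-token-abc": "child-001"}

def secured_get_profile(child_id, bearer_token=None):
    """Secured variant: the caller must present a token authorized
    for this specific child ID before any data is returned."""
    if VALID_TOKENS.get(bearer_token) != child_id:
        return None  # reject unauthenticated or unauthorized requests
    return PROFILES.get(child_id)

# An attacker enumerating IDs succeeds against the unsecured endpoint,
# while the secured variant rejects the same request without a token.
leaked = insecure_get_profile("child-001")
blocked = secured_get_profile("child-001")
authorized = secured_get_profile("child-001", bearer_token="parent-token-abc")
```

The sketch shows why an unsecured API is equivalent to publishing the profile database: the only "secret" protecting a child's name, birth date, and language is a guessable identifier.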
Discussion and conclusion
Children’s digital playgrounds consist of convergent digital media embedded into domestic accoutrements. Our privacy sweep of the data relevant resources pertaining to YouTube Kids and the Fisher-Price Smart Toy reveals many nuances of how personal information is collected, stored, and used. Apps and the IoToys bring to the children’s digital playground new gadgets replete with dataveillance and commercialization that trouble the privacy rights of the child and parent.
When considering the Fisher-Price Smart Toy and YouTube Kids, it is important to identify that some networked linkages are being established through content. Smart toys intersect with YouTube Kids through unboxing videos and similar influencer marketing videos. YouTube Kids has come under criticism from US digital rights organizations who filed complaints to the Federal Trade Commission about the overt commercialization and deceptive advertising practices embedded in the site, especially influencer marketing (Campaign for a Commercial-Free Childhood and Center for Digital Democracy, 2015; Campaign for a Commercial-Free Childhood et al., 2016). Central to these critiques is that digital forms of entertainment are evading the rules of broadcast and cable television with respect to children’s marketing. In the US, the Federal Communication Commission (FCC) prohibits “host-selling”, referring to the practice of advertisers pitching their products during shows using a character in the program; in 2004 this was extended to include websites (FCC, 2017). Another troubling aspect is that many ads are covert.
From the privacy sweep and our analysis of the homepages, privacy policies and related materials for YouTube Kids and Fisher-Price Smart Toy, five issues emerge that highlight the power dynamics inscribed through the data assemblages of children’s digital playgrounds. Our use of political economy and personal information relevant indicators in our privacy sweep aid us to begin to sketch some of the characteristics of the surveillant data assemblage of children’s digital playgrounds. First, parents are tasked to act as data proxies and supervisors for their children. Second, personalization of interactive experiences is positioned as a benefit for children and parents as end-users who disclose data. Third, promotional culture and advertising is tied to data disclosure in digital playgrounds. Fourth, the algorithms that power digital playgrounds remain opaque for parents. Fifth, there are patterns in data stewardship for digital playgrounds that parents, or other interested parties, may wish to examine.
1. Parents as data proxies and supervisors
From the privacy sweep, it is evident that corporations are tasking parents with the large responsibility of managing and overseeing digital playground accounts used by children under the age of 13. Privacy policies make clear that the parent typically acts as the data proxy for their child. The parent stands in for the young person to type their personal details, and to agree to the terms and conditions, or privacy policy. Holloway and Green describe that with the IoToys companies “shift responsibility … to parents” (2016: 507). Our analysis demonstrates multiple ways in which the parent becomes the responsible party or supervisor of the account, not only in everyday life but also for the backend processing of the data.
For YouTube Kids, the parental guide specifies that adults can “set a timer to limit how much time a child spends on the app. Other controls include blocking content, turning off search, clearing history, pausing history, and other features.” Furthermore, from Google’s Family Link (n.d.-b), the parent can “manage apps, keep an eye on screen time, and set a bedtime for your child’s device.” Parental supervision through the Fisher-Price Smart Toy app is slightly different in that it is also mediated by the toy. As described in the product FAQ “The Smart Toy® parent app … allows parents to trigger activities including daily helpers like teeth brushing …” (Smart Toy, n.d.-d). When the parent selects an activity in the app, the child is encouraged through the voice of the toy to perform the desired activity, such as playing a game. Parental supervision also extends to advertising contexts through potential backend analytics on data provided through personalized accounts.
2. Personalization of interactive experiences
Analysis of both YouTube Kids and Fisher-Price Smart Toy documents demonstrates the personalization of interactive experiences. For YouTube Kids, parents can create a profile for each child user, which includes their name and an avatar image. For Fisher-Price Smart Toy, parents can share details like the child’s name, which can be spoken back to the child by the toy. Individualized accounts are promoted for each unique child who is “like a Snowflake” (Smart Toy, n.d.-e).
The privacy policies relevant to YouTube Kids and the Fisher-Price Smart Toy reveal further details about how personalization is enacted. For Google’s Family Link accounts, a child’s information is used “to offer them tailored content, such as more relevant app recommendations or search results” (Google Family Link, n.d.-a). With Fisher-Price Smart Toys data may be used, “to monitor and analyze usage and trends and demographic information, and to personalize and improve the Service, our technology and our users’ experiences” (Smart Toy, n.d.-b). While the privacy policy does not make clear what data relevant to a child’s learning will be collected, the product’s website states that parents will be able to use the companion app to “gain insights on how your child is learning” (Smart Toy, n.d.-e).
Data sharing with external companies, beyond the known trusted brand, is a possibility with both YouTube Kids and the Fisher-Price Smart Toy. Tracing this sharing of data through the privacy sweep aids in better understanding the surveillant data assemblage that emerges from the IoToys and the promotional and advertising activities that may ensue.
3. Promotional culture and advertising
Montgomery et al. (2017) enumerate eight key trends in the commercialization of children’s media and the commodification of content that trouble privacy rights for the parent and child. These include: cross-platform media; data-driven marketing practices; personalization; location and “path to purchase” targeting; continuous, integrated measurement; the internet of things; ubiquitous data collection; and the dissolving boundaries between school, home, social, and personal lives.
The implications of advertising, based on data and information collected from children’s use of digital playgrounds, remain a major consideration. Privacy policies for both Smart Toy and YouTube Kids contain language limiting the possibility of behavioural or personalized advertisements. Smart Toy explicitly states that “information will not be used for any behavioral advertising” (Smart Toy, n.d.-b). The Google Family Link account specifies that “Google will not serve personalized ads to your child, which means ads will not be based on information from your child’s account,” but notes that information such as search queries or “general locations (such as city or state)” may be used to generate and serve advertisements. Although Smart Toy states it will not conduct behavioural advertising, it “may use third parties to provide analytics services and advertising services” (n.d.-b). Whether parents understand the differences between behavioural, personalized, and other forms of advertisements that are generated and served from their children’s data in children’s digital playgrounds remains an avenue for future research.
4. Algorithmic opaqueness
In addition to potential confusion about advertising, parents may also be unsure of how the algorithms operate in digital playgrounds. The algorithms that offer recommendations for content in YouTube Kids, or enable a toy like the Fisher-Price Smart Toy to function, may remain opaque to parents in various ways. Some attention to the operation of algorithms has surfaced through news stories detailing the presence of disturbing content on YouTube Kids (LaFrance, 2017; Maheshwari, 2017). In relation to the content that is recommended to young people, LaFrance (2017) notes that YouTube shares some information about how its algorithms operate. A YouTube spokeswoman was, however, unwilling to answer LaFrance’s questions about how the data inputs differ between the algorithms for YouTube Kids and the original YouTube site (LaFrance, 2017).
Algorithms are also closely guarded for the Fisher-Price Smart Toy app. The Terms of Use document for the Fisher-Price Smart Toy includes a clause that restricts users from attempting to “reverse engineer or otherwise attempt to … discover any source code, underlying ideas or algorithms of the Smart Toy app” (n.d.-c). Additionally, the previously cited restriction against use of the app for comparison or benchmarking may restrict researchers, parents, or other interested parties from examining the inputs of personal information and other data disclosures that inform the algorithms of the Smart Toy app.
5. Patterns in data stewardship
The YouTube Kids app and Fisher-Price Smart Toy represent two distinct possibilities in data stewardship within digital playgrounds. In the case of YouTube Kids, data is collected and held by Google, a dominant technology company that holds massive amounts of data about individuals and, increasingly, families. In contrast, data collected through the Fisher-Price Smart Toy resides with Smart Toy LLC, a company at arm’s length from Fisher-Price.
These two models for data stewardship raise several considerations for parents. In the case of YouTube Kids, parents may wish to consider the array of data which Google collects on their child or the larger household over time. For the Fisher-Price Smart Toy, the data is collected by Smart Toy LLC (Smart Toy, n.d.-b). The legal partnership between Fisher-Price and Smart Toy has privacy implications for parents and their children. In this case, it is important to identify that while the toy comes from a trusted brand, a startup company is the primary organization collecting and managing the data for the companion app. Parents need to be aware that the data collected through YouTube Kids or the Fisher-Price Smart Toy could serve as a financial asset for the companies involved.
Data assemblages are emerging through the toys, apps, and playful interfaces that comprise children’s digital playgrounds. As data is collected and used, platforms, apps, and the IoToys raise alarming ethical tensions and privacy concerns, including device security (Chaudron et al., 2017; Electronic Privacy Information Center, 2016; Forbrukerradet, 2016). Can parents maintain control over their child’s communicative practices and digital identity? For the child, what are the mediating factors when their playground is branded and influenced by commercial forces? As this article highlights, the dataveillance properties within the panoply of children’s digital playgrounds are becoming embedded in the everyday practices of child’s play, with implications for early commercial profiling and targeting of youngsters.
Acknowledgments
We thank the anonymous reviewers as well as colleagues who attended the Data Power 2017 conference, who provided helpful feedback on this work. Thanks as well to Rianka Singh at the Faculty of Information, University of Toronto, for her initial research on YouTube Kids.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The eQuality Project, a partnership grant funded by the Social Sciences and Humanities Research Council of Canada under grant #895-2015-1002.
