On April 8th, 2018 a coalition of 20 child advocacy, privacy and consumer groups filed a complaint asking the US Federal Trade Commission to investigate the Google-owned video site for alleged violations of children’s online privacy laws. The primary argument in the complaint appears to be that based on the content of YouTube, the service should be considered “child-directed” and that based on YouTube’s practice of hiring tens of thousands of “content moderators” it is likely that the site has actual knowledge that the content on the service is directed to, appeals to and is used by children under 13.
From the information presented in the complaint, there seems to be some merit in this argument, but I’d like to explore some of the points that were not highlighted in the report and dig into the implications in the context of YouTube’s use in the school environment.
If You Are Under 13 – Go Away
YouTube’s terms make a common statement – that “the Service is not intended for children under 13” – and the complaint includes Google/YouTube account signup screenshots showing that a user is denied an account if they input a birthday indicating they are under 13. While this is a very common clause, it may run contrary to the FTC’s answer to question D.4 of the COPPA FAQ:
4. I run a site that I believe may fall within the FTC’s sub-category of a website directed to children but where it is acceptable to age-screen users. Can I age-screen and completely block users who identify as being under age 13 from participating in any aspect of my site?
No. If your site falls within the definition of a “Web site or online service directed to children” as set forth in paragraph (1) of 16 C.F.R. § 312.2, then you may not block children from participating altogether, even if you do not intend children to be your primary target audience. Instead, what the amended Rule now permits you to do is to use an age screen in order to differentiate between your child and non-child users. You may decide to offer different activities, or functions, to your users depending upon age, but you may not altogether prohibit children from participating in a child-directed site or service.
Additionally, YouTube only age-screens during the account creation process, not during unauthenticated browsing of child-directed content.
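To make the FAQ’s distinction concrete, here is a minimal sketch of the difference between blocking under-13 users (what the signup screenshots show) and the age-screen-and-differentiate approach the FTC describes. The function names and the two “experience” labels are hypothetical illustrations, not anything from Google’s actual implementation.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def age_on(birthday: date, today: date) -> int:
    """Compute a user's age in whole years as of `today`."""
    years = today.year - birthday.year
    # Subtract one year if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (birthday.month, birthday.day):
        years -= 1
    return years

def signup_decision(birthday: date, today: date) -> str:
    """An age screen that differentiates users rather than rejecting them.

    Per FTC COPPA FAQ D.4, a child-directed service may not block
    under-13 users outright; it must instead offer them an
    age-appropriate experience (e.g. limited features, no behavioral ads).
    """
    if age_on(birthday, today) < COPPA_AGE_THRESHOLD:
        return "child-experience"
    return "full-experience"
```

The point of the sketch is the return value of the second branch: under the FAQ’s reading, a child-directed service’s age screen should route an under-13 user into a restricted experience, whereas the signup flow described in the complaint simply refuses to create the account.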
Prevalence of YouTube as a “plug-in” in EDTech tools
One of the recent changes to COPPA was the expansion to cover the use of “plug-ins” and other 3rd party data collection tools (COPPA FAQ A.5). While direct use of the YouTube website and mobile apps is common in school districts, the integration of YouTube into popular EDTech tools marketed to school districts is also very common. In many cases these tools are clearly child-directed and may have actual knowledge of age (e.g. by a user identifying a grade level). Examples of popular EDTech tools that integrate with YouTube include:
- Khan Academy, FlipGrid, EDPuzzle, PlayPosit, SMARTNotebook, PowToon, Schoolbox, etc.
- Most Learning Management Systems (e.g. Canvas, Blackboard, Moodle etc.)
- Microsoft Office, Google Docs, Slides, Classroom
YouTube Settings in GSuite for Education
YouTube, while not one of the “core” GSuite for Education tools, occupies an unusual middle ground. It is classified as an “additional service” not covered by the EDU terms, but while most of the other additional services have no management settings, YouTube has a significant number of settings in the GSuite Admin control panel that exist specifically to support filtering and viewing of YouTube content in schools.
YouTube settings in G Suite allow administrators to restrict which YouTube videos are viewable, which videos might show up as recommendations, and which videos are returned in YouTube search results for signed-in Apps users. These restrictions can be applied by DNS settings, HTTP header injection, or Chromebook policies. Admins can grant the ability to approve restricted videos to verified teachers. Because of the core Classroom service, Google knows which users are students and which are verified teachers, based on membership in the special Google Groups for Business group teachers_classroom.
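Of the mechanisms listed above, the HTTP header approach is the easiest to illustrate: a filtering proxy on the school network adds a YouTube-Restrict header (value Strict or Moderate) to requests bound for YouTube hosts. The header name and values come from Google’s restricted-mode documentation; the host list and the proxy plumbing here are simplified assumptions, not a complete implementation.

```python
# Hosts that a filter would typically intercept; an illustrative subset,
# not Google's authoritative list.
YOUTUBE_HOSTS = {"www.youtube.com", "m.youtube.com", "youtubei.googleapis.com"}

def inject_restrict_header(host: str, headers: dict, level: str = "Strict") -> dict:
    """Return a copy of `headers` with a YouTube-Restrict header added
    for YouTube hosts.

    `level` is "Strict" or "Moderate", matching the two G Suite
    filtering tiers discussed above. Non-YouTube hosts pass through
    unmodified.
    """
    if level not in ("Strict", "Moderate"):
        raise ValueError("level must be 'Strict' or 'Moderate'")
    out = dict(headers)
    if host.lower() in YOUTUBE_HOSTS:
        out["YouTube-Restrict"] = level
    return out
```

The DNS variant works analogously: the school resolver answers queries for www.youtube.com with a CNAME pointing at a restricted-mode endpoint, so no per-request header manipulation is needed.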
Admins can apply one of 4 permission policies in the YouTube settings, configurable for each organizational unit. While Google does not use this analogy, I believe it would be reasonable to characterize “Strict” as G-rated and “Moderate” as PG-rated.
This has been Google’s approach to schools and YouTube since 2015. From 2011 to 2015, Google had a solution for managing YouTube in schools called YouTube for Schools, which offered curated playlists of 400,000 educational videos and allowed schools to make network settings that limited content to YouTube EDU videos. YouTube for Schools was not dependent on a school being a GSuite for Education customer.
For me, the important thing about Google’s treatment of YouTube in the GSuite environment is its answer to how a school provides “restricted” access to YouTube content: the school must provide the child with a YouTube account. Restrictions aside (and those restrictions only apply on the school’s network, Chromebooks, or other managed environments, and typically not when the student signs in from home or from an unmanaged device), this account is basically the same as any other consumer YouTube account. So the answer to a parent who might be concerned about what their child might view on YouTube is to give that child an account where they can publish on YouTube.
GSuite Admin Notice and Consent when Enabling YouTube
If a school wants to enable YouTube for any student, it must get the parent’s permission. In 2016, Google added a consent screen with very detailed wording and two required check-boxes to make very clear what a school Google Admin’s responsibilities are when turning on a service not covered by the “core” GSuite agreement.
However, it appears that an early 2018 update to the Admin panel may have introduced a bug that causes this consent screen not to be displayed to the admin in 4 out of 5 use cases, depending on the click-path followed to enable the service (bug reported to Google as case #15553563 on 4/17/18).
Google’s practice of requiring the school to obtain parental permission is also worth dissecting. The shift in the last year from “under 13” to “18 or under” would seem to indicate an acceptance that both COPPA and FERPA apply to use in the school/GSuite for Education context. The precedent for schools obtaining permission for data uses that fall outside one of the FERPA exceptions is well understood; however, using the school to obtain permission under COPPA may be more problematic. The FTC’s COPPA FAQ lays out specific, narrow cases where a school might provide consent on behalf of a parent, and the rule provides a mechanism for interested parties to file a written request for Commission approval of parental consent methods not currently enumerated in 16 C.F.R. § 312.5(b). See 16 C.F.R. § 312.12(a). Currently, delegating the responsibility of verifying the parent and collecting the parental consent on behalf of the vendor is not one of the approved methods, and this method does not necessarily allow the parent to inspect data or prevent further collection. Additionally, the original COPPA final rule includes language that COPPA does not prevent schools from acting as intermediaries between the vendor and the parent; in this example, however, the consent does not get sent to Google, so the school is acting as the endpoint, not the intermediary, for consent between the parent and Google.
What does Google Say about Collection in non-core services for EDU accounts?
In Google’s response to Senator Franken’s list of questions about data collection in GSuite for Education, Google responded that…
“We have worked to restrict use of K12 student personal information: our GAFE Privacy Notice promises not to use any K12 student personal information to target ads, including in services outside the GAFE core services. In some products, like Search and Maps, we have removed the ads altogether for signed in K12 GAFE users, while other services may show only contextual ads to these users. Contextual ads are selected without reference to any user profile or other personal information about the user”
However, in January 2017, Mississippi AG Jim Hood sued Google, claiming that the company gathers personal data on students who use YouTube. The claim was based on limited testing in which investigators logged on to a laptop with a student’s educational e-mail address and password and made some queries on YouTube. They then logged out, went to a different browser, logged in again, and [YouTube] “started shooting ads at us dealing with the same query that that child had put in.”
Specifically in YouTube
In the online help for YouTube, Google says that “the ads you see while watching YouTube videos are tailored to your interests and based on your Google Ad Settings, the videos you’ve viewed, and whether you’re signed in or not.” When you’re logged in, [..the following] anonymous signals [emphasis mine] may determine which ads you see:
- Types of videos you’ve viewed
- The apps on your device and your use of apps
- Websites you visit
- Anonymous identifiers associated with your mobile device
- Previous interactions with Google’s ads or advertising services
- Your geographic location
- Age range
- YouTube video interactions
Whether you are signed in or not, the ads you see are based on the content of the videos you’ve viewed.
It is unclear why Google refers to these signals as “anonymous”. Without a better understanding of that, and of if or how this differs for EDU customers, I would be hard pressed to call this anything other than behavioral advertising.
Testing YouTube Data Collection
Determining exactly what data YouTube collects on a user is challenging. A packet capture provides some information, but not the whole story. The list below shows the DoubleClick domain connections for a user not logged in to any Google service (including YouTube), using the Firefox browser with history and cache cleared, and accessing a single YouTube page.
The same domains were observed for a user logged in to a GSuite service with YouTube enabled, in the Firefox browser accessing a single YouTube page. Google provides the ability for a site to signal to DoubleClick that it is “child-directed” by sending a specific tag (tfcd). This tag does not appear to be in use for GSuite users.
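This kind of test can be reproduced by exporting a HAR capture from the browser’s network tools and scanning it for DoubleClick hosts and the tfcd query parameter. The sketch below assumes a standard HAR 1.2 structure; the function names are my own, and the tfcd parameter is DoubleClick’s documented “tag for child-directed treatment”.

```python
from urllib.parse import urlsplit, parse_qs

def doubleclick_domains(har: dict) -> set:
    """Collect the DoubleClick hostnames contacted in a HAR capture."""
    hosts = set()
    for entry in har.get("log", {}).get("entries", []):
        host = urlsplit(entry["request"]["url"]).hostname or ""
        if "doubleclick" in host:
            hosts.add(host)
    return hosts

def has_child_directed_tag(url: str) -> bool:
    """True if an ad-request URL carries tfcd=1, DoubleClick's
    'tag for child-directed treatment' signal."""
    return parse_qs(urlsplit(url).query).get("tfcd") == ["1"]
```

Running the second check across every ad-request URL in a capture is how one would confirm the observation above that the child-directed tag is absent for GSuite users.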
YouTube appears to collect user viewing and search history by default for GSuite for Education users who have the YouTube service enabled. There is no administrator setting to disable the collection of viewing and search history.
- Google should clarify what data is collected by YouTube, DoubleClick, and any 3rd-party networks from GSuite for Education accounts. The preferred behavior would be similar to that of Google Search for a user signed in to a GSuite for Education account, where Google specifically states that data is not collected and ads are not shown to the EDU user (see this example).
- Google should clarify what is meant by the “anonymous” signals that drive YouTube ads, how this definition aligns with the generally accepted understanding of “anonymous”, and whether, even if anonymous, the practice still meets the definition of “behavioral advertising”.
- Google should develop a method for restricting and white-listing YouTube content that is not dependent on the parent consenting to the creation of a YouTube account for their child.
- 3rd-party EDTech tools that embed YouTube videos in their applications, and/or provide a feature to do so, should review their privacy policies, make any necessary updates to provide appropriate notice of YouTube as a “plugin” service, and consider whether it is possible to send the child-directed tag.
- The practice of EDTech providers offloading the responsibility of collecting parental consent on their behalf represents a challenging and arguably gray legal area. This needs to be part of the ongoing conversation among regulators, vendors, advocates, and educators about the interaction between COPPA and FERPA.