Confusing COPPA Terms on YouTube Lead to More Questions for the FTC
In September, Google paid a landmark fine of $170 million to the Federal Trade Commission after an extensive investigation found that its YouTube platform had violated the Children’s Online Privacy Protection Act (COPPA).
In its statement, the FTC maintained that YouTube knowingly exploited data collected from children to serve its advertising agenda. As a result, the video platform has undergone several major changes to its infrastructure to curtail similar issues in the future.
Now, YouTube has turned to the FTC to request further clarification on confusing COPPA terms and how they’ll be implemented.
“Currently, the FTC’s guidance requires platforms to treat anyone watching primarily child-directed content as children under 13,” YouTube said in a statement. “This does not match what we see on YouTube, where adults watch favourite cartoons from their childhood or teachers look for content to share with their students.”
This leaves many content creators and influencers wondering where they stand with YouTube, and whether they can continue to monetize their channels.
What is COPPA?
COPPA was passed in 1998 to protect children under the age of 13 across the Internet. Managed by the Federal Trade Commission, the Act specifies what needs to be included in privacy policies, requires websites to collect parental consent before gathering data from underage users, and outlines the responsibilities that website owners are required to uphold to maintain children’s safety and privacy online.
The FTC Ruling & Aftermath
YouTube agreed to pay $170 million in fines to the FTC and New York State for violating COPPA. The settlement found that the media company had illegally collected personal data from children and used it to serve personalized ads.
Even with the penalty, FTC Commissioner Rohit Chopra was unsatisfied with the amount of the settlement. “Financial penalties need to be meaningful or they will not deter misconduct,” he said in a statement. Alphabet, Google’s parent company, is slated to generate $161 billion in revenue this year, making $170 million feel more like a slap on the wrist.
As a result of the settlement, Google agreed to overhaul its platform to change the way children can interact with content. However, YouTube’s approach has caused a stir.
To comply with COPPA, YouTube creators are now asked whether their content serves an audience of children before they upload videos. If the content is geared towards kids, YouTube won’t collect data without consent from a parent, and won’t use behavioral targeting to show children relevant ads.
For content creators who rely on ad revenue to keep their channels going, this will be a huge hit. Currently, several of YouTube’s highest earners create content geared towards both children and adults.
New YouTube Policy Coming Soon
But moving forward, bigger infrastructure changes are on the way. Videos that are marked for children will no longer have a comments section, effectively decimating engagement and community on those channels. Customization options, like end screens and info cards, will no longer be available to YouTubers who create content for children. And lastly, users who are subscribed to channels that publish videos for kids will not be notified when a new video is uploaded, and those videos will not appear in YouTube search or in recommended videos.
With each of these factors in the mix, YouTubers whose audiences include children will essentially be isolated from the rest of YouTube. Users will not be able to discover their content in search, interact with the creators in comments, or navigate to related content from their channels.
YouTube will also deploy its own algorithmic monitoring tool as a backup measure to locate mislabeled content—videos that serve a young audience but aren’t labeled as such during the upload process. If YouTube finds a video that it believes caters to kids, the creator will not be able to appeal the decision. The FTC also noted that it would begin targeting content creators who mislabel their videos, fining them up to $42,000 per video.
The new policy will go into effect on January 1st, 2020.
Pushback & Criticism
These new policies could be quite dire for content creators. One of the biggest concerns is that many videos blur the line between content that caters to children and content that caters to adults. This is particularly true for the gaming niche, one of the most popular categories on YouTube. According to Tubular, 15% of all content on YouTube falls under the gaming umbrella. And in 2018 alone, users watched 50 billion hours of YouTube gaming videos.
In one video, YouTuber Matt Patrick (who runs a channel called Game Theorists) questioned whether some of the most popular YouTube channels would be able to survive 2020, stating that the policy could eliminate creators that YouTube users have been following since they were young. “If you’re a content creator, you’ve got to be nervous about what 2020 is bringing to your life, and to your channel,” he said in the video, which has amassed nearly 320,000 views to date.
The following is an excerpt from a COPPA staff report, which outlines how the FTC determines what content is considered child-directed:
Others are worried about YouTube’s content-scouring algorithms, concerned that they could be unfairly penalized with no way to appeal an algorithmic decision.
In an interview with The Verge, Dan Eardley, a YouTuber who reviews collectible toys, said:
“Creators are being held directly responsible by the FTC. So if the FTC decides that [we] are indeed targeting children, we’ll be fined. That is frightening. It’s especially scary because the verbiage of ‘kid directed’ vs ‘kid attractive’ isn’t very clear. It’s hard to know if we’re in violation or not.”
New Harassment Policy
To add insult to injury, YouTube recently added a new harassment policy to its community guidelines to penalize videos that insult users based on specific identities like race, gender, and sexual orientation. The new policy applies immediately to “content that maliciously insults someone based on protected attributes,” and covers malicious comments as well.
Although this seems like a step in the right direction, users weren’t happy with the changes, citing distrust in YouTube’s ability to judge fairly what violates those terms, as well as the platform’s move towards additional levels of censorship.
What users have seen across social media platforms that adjust their policies is that such censorship often comes at the cost of creative content that doesn’t actually cross the line. This announcement, coupled with the COPPA ruling, has left many creators unsure of the future.
YouTube Seeks Further Clarification from FTC
Due to the immense new layers of limitations on content creators, YouTube has officially turned to the FTC for further clarification. Essentially, it wants to understand how its creators can best comply with COPPA and the FTC ruling. As previously mentioned, it’s very difficult to determine which content is child-directed and which is not.
Gray-area content leaves many of those creators unsure of where they stand on a platform that has always supported them. Because creators (rather than YouTube itself) will be held liable in the future, YouTube advised content creators to consult a lawyer to determine how the ruling will impact their channels moving forward, and whether their videos should, in fact, be labeled for children.
“Ultimately, we can’t provide legal advice,” YouTube said. “We’re unable to confirm whether or not your content is Made for Kids. That decision is up to you, taking into consideration these factors.”
In its comments to the FTC, YouTube addressed gray-area content, saying, “Creators of such videos have also conveyed the value of product features that wouldn’t be supported on their content. For example, creators have expressed the value of using comments to get helpful feedback from older viewers. This is why we support allowing platforms to treat adults as adults if there are measures in place to help confirm that the user is an adult viewing kids’ content.”