In September 2019, YouTube was fined by the Federal Trade Commission (FTC) for violating the Children's Online Privacy Protection Act (COPPA), which prohibits websites from collecting personal information from users under the age of 13. To comply with the law, YouTube has overhauled its uploading system. This change has not only affected content creators, who now face greater apprehension about what they upload, but also viewers, whose engagement is limited on videos marked as made for children.
Content creators must now indicate whether each video is made for viewers over or under the age of 13. Videos identified as made for viewers under 13 run limited ads, which means that certain YouTubers may lose revenue on some of their videos.
These videos are also restricted in YouTube's recommendations, meaning fewer people will see them. The FTC has issued guidelines for determining whether content is child-directed, but YouTubers have complained that most of their videos fall into a grey area, making it hard to tell which category applies. If a video is falsely labeled, the YouTuber can face a fine of up to $42,000 per video in violation.
Viewers also lose some features previously available on these videos. Any video labeled as aimed at viewers under 13 can no longer receive comments and can no longer be pulled down into the mini-player for multitask viewing.
The FTC is reviewing its COPPA rules in response to changes in technology and to examine the effectiveness of COPPA as a whole.