YouTube and its trouble with the COPPAs
If you’ve been following the social media buzz in recent days, you will no doubt have heard lots of people discussing YouTube, COPPA and how it’s the zombified remains of the ‘Adpocalypse’. So what the heck is going on?
COPPA is the Children’s Online Privacy Protection Act, a US law that governs how websites collect personal information from children. In finer-grained detail, it states that personal information about children under 13 may only legally be collected, knowingly or unknowingly, with their parents’ permission. This has caused a lot of consternation for many websites, most notably social media platforms, but also any that collect personal information via forms and tracking methods; many simply prevent children under 13 from using their services rather than bear the cost of meeting the legal stipulation.
That’s where YouTube comes in. Since its creation in 2005, the platform has been growing not only its content but also one of the most complicated, mysterious and downright judgemental discovery and advertising algorithms in the business. There have been multiple controversies surrounding the algorithm in the last decade, often with the platform’s content creators citing a lack of communication and intentional subterfuge on YouTube’s part as the reason their videos are suddenly less, or far more, popular. This begins to make more sense when you realise that there are only two realistic ways of making money on the platform: sponsorship, something reserved for the true upper echelon of each demographic or interest niche; or ad revenue, essentially a cut of the money generated from each advertisement run on each video on a creator’s channel, something available to basically any user but with a low yield per view. As such, a lot of YouTube content creator culture is based around attempting to please the algorithm, as a better relationship with the platform’s discovery system means more views for individual videos and the channel in general, which in turn means more ad revenue.
Earlier this year, YouTube agreed to pay the Federal Trade Commission a $170 million settlement after an investigation into whether it had violated COPPA.
The upshot of this, beyond YouTube’s embarrassment and the fine, is that beginning in 2020 all content creators will be asked to confirm, when they upload a video, whether it is ‘directed’ at children. They’ll also have to do this for each of their previously uploaded videos, and in fact for their whole channel.
‘Directed’ at children in this case means children are the primary or intended audience of a video, or a significant secondary audience according to the video’s analytics.
Some of the things that lead YouTube to consider a video ‘directed’ at children include: characters, celebrities, or toys that appeal to children, notably animated or cartoon characters; activities that appeal to children; and algorithmic evidence that a video’s audience is primarily, or substantially, under 13.
This leaves content creators in a bit of a tough situation. Any video they mark as directed at children will lose many of the features they rely on, not only for making money (personalised paid adverts will not be run) but also key engagement indicators and generators such as the comments section, pop-up information cards and much more. These videos would likely produce little revenue for the channel, meaning the incentive to create children’s content could almost disappear overnight.
However, what happens if creators just don’t mark their videos as being for children? Should this be seen as a wanton act, there is the overarching possibility of being sued by the Federal Trade Commission, which is not pleasant. However, the likelihood of the FTC trawling for content itself is incredibly small; after all, YouTube has been operating without considering COPPA since its very beginning and was only brought up on it this year, after 14 years! The likeliest outcome is YouTube policing the content itself, with the ability to pull advertising from, or shut down, channels that don’t comply with the law. Unfortunately, this raises another habitual issue of the YouTube community.
Many content creators have a distinct lack of faith in YouTube’s approach to assessing its content. Like many large social media companies, it uses code and algorithms to scan videos for things like bad language, violence and hate speech, and to remove or demonetise them as the need arises. It’s likely a similar program will be used to detect content directed at children. The problem is, it doesn’t work. Depending on who you ask, the current algorithms are either far too lenient or incredibly strict, with opinion changing on a weekly basis; what most agree on is that they are more often than not completely off the mark and ineffectual. All of this bodes badly for the moderation of videos and their intended audiences, and could compel many desperate creators either to chance their arm at not marking their content in accordance with COPPA, or to jump to another platform where they can more easily reach their target audience.
Whatever happens come early 2020, we will likely see a severe impact on children’s content made for YouTube, but only time will truly tell the impact of COPPA on the whole platform’s advertising potential.