Parents, we need to talk about YouTube. With over a billion users, YouTube’s audience includes nearly one-third of all people on the internet. By any measure, the site’s an excellent entertainment resource—for adults. For kids, it’s problematic. We’re not talking about mature videos, intended for adults, that children might accidentally stumble onto while looking for something to watch; we’re talking about disturbing, shocking videos purposely created for children. Some of those videos are capable of traumatizing children, and if you allow your kids to browse the site unsupervised, you’re taking a significant risk.

Here’s a basic overview of the problem: YouTube uses various algorithms to match search terms to appropriate videos. Type in a search term like “how to cut a dog’s hair,” and the site will provide a list of (relatively) high-quality instructional videos; if one of those videos is subpar, poor user ratings will eventually drive it out of the search results. While YouTube (and Google, which owns YouTube) keeps its search algorithm a secret, we know that likes, dislikes, video length, and exact-match keywords play a significant role. That last point is crucial: if a video’s title exactly matches the keywords you type into the search bar, it has a better chance of showing up in your search results.
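To see why exact-match titles are so easy to exploit, here’s a toy sketch of a naive exact-match scorer. This is purely our own illustration—YouTube’s real ranking algorithm is proprietary and far more sophisticated—but it shows the basic incentive keyword-stuffed titles are chasing:

```python
# Toy illustration only: NOT YouTube's actual algorithm, which is secret.
# It shows why a naive exact-match ranker rewards keyword-stuffed titles.

def exact_match_score(query: str, title: str) -> int:
    """Count how many words from the search query appear verbatim in the title."""
    title_words = title.lower().split()
    return sum(1 for word in query.lower().split() if word in title_words)

query = "how to cut dog hair"
honest_title = "Grooming Your Golden Retriever at Home"
stuffed_title = "cut dog hair how to cut dog hair cut golden retriever hair dog barber"

print(exact_match_score(query, honest_title))   # 0 -- no query word appears verbatim
print(exact_match_score(query, stuffed_title))  # 5 -- every query word appears
```

Under a scorer like this, the keyword-stuffed title wins every time, even though the honest title describes a better video—which is exactly the behavior exploitative channels depend on.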
For the most part, the system works great—for adults.
Children, however, don’t search for content the way adults do. When younger kids look for videos on YouTube, they often type in a few simple keywords and click on the first interesting clip that comes up. They don’t know how to like or dislike videos, and they don’t mind sitting through longer content. They also don’t recognize video titles that flagrantly game YouTube’s search algorithm. That makes them easy prey for content creators.

For instance, if an adult searches “how to cut a dog’s hair” and finds a video titled “cut dog hair how to cut dog hair cut golden retriever hair dog barber,” the adult will probably recognize that the video is trying to trick people into clicking; a toddler wouldn’t draw the same conclusion.

That brings us to the problem. YouTube’s algorithm currently rewards videos that steal copyrighted characters, use crass titles, and contain shocking content. If the clips keep young kids clicking, they’re valuable—regardless of whether the content itself is harmful.

Take a look at this clip: https://youtu.be/lfwxfQoobiA

Titled “Disney Pixar Coco 2 Miguel Hector Wrong Heads Finger family Nursery Rhymes song,” it’s a relatively harmless (if slightly disturbing) example of the problem. It’s clearly designed for extremely young viewers, and at over 10 minutes long, it has probably made decent money for its creator (the YouTube channel Super Story). With that said, it steals copyrighted content and purposely exploits YouTube’s algorithm to do so. The video currently has more than 3.6 million views, and Super Story has dozens of similar videos, most with tens of thousands of views.

Another video (which we won’t link here and have reported to YouTube) has a similarly exploitative title, but with a lewd keyword hashtag. It contains violent, bizarre content, including Mickey Mouse fighting while dressed up as various Marvel superheroes. It’s over 35 minutes long.
While these examples are strange, we’re just scratching the surface. Other clips include explicit sexual content, bad language, and depictions of violent acts. We’re not linking those videos for obvious reasons. Unfortunately, they’re fairly easy to find, and thanks to YouTube’s current algorithm, they’re extremely profitable for their creators. For a much more detailed look at the content algorithm issues, check out this excellent piece from James Bridle on Medium.
YouTube claims to have taken steps to curb the problem.
In August 2017, the site said it would no longer allow creators to make money from videos that “made inappropriate use of family-friendly characters,” and three months later, YouTube announced stricter controls for videos aimed at young children. The company claimed the new controls had been in development for some time and were not introduced in response to widespread media coverage.

However, the problem hasn’t disappeared. While researching this article, we easily found over a dozen disturbing videos clearly marketed toward young children. Some were over a year old; some had millions of views. We reported the offensive links to YouTube and won’t link to those clips in this article, but parents who want to understand the extent of the issue can do so by adding a few offensive terms to kid-friendly search strings (for instance, “finger family” or “nursery rhymes”).

This isn’t to say that YouTube isn’t taking action, but thousands of these videos exist, and they don’t disappear from search results after they’re demonetized. While YouTube’s strategy could eventually curb the problem, it relies on volunteer moderators and adults flagging offensive videos before kids get a chance to see them. For parents, that’s not good enough.
To keep your kids safe, here’s what you need to do.
The obvious answer is to prevent kids from using YouTube and to thoroughly monitor screen time until kids are old enough to understand how to use the website responsibly.
More importantly, parents should establish clear limits on screen time for preteens.
“Set a time where [your child] can utilize their devices, preferably a time where the parent is available to check in on what they’re doing,” says Gretchen Campbell, a licensed professional counselor. “Establish ground rules … Kids know when they’re viewing something that their parent wouldn’t approve of and will always test the limits when given the opportunity.”

Campbell recommends establishing clear consequences for broken rules, noting that those consequences should relate directly to screen time; that helps kids connect the bad habit to the consequence. For instance, if a child watches a video without telling a parent, the parent might take away screen time for a day or a week.

To make sure younger kids don’t stray into dangerous parts of the web, parents should check browser histories and use mobile apps designed to limit access. “Parents can monitor a child’s access to inappropriate material by downloading apps such as Screen Time Parental Control,” says Támara Hill, a licensed child and adolescent therapist specializing in trauma. “These apps make it possible for parents to monitor and locate content on iPads and cellphones.” These apps aren’t completely foolproof, but they can serve as an additional deterrent.

“If parents decide to do this, it is okay to discuss with the child why these steps are being taken,” Hill adds. “Kids need to know there are dangers online parents are responsible for preventing.”
Another crucial tip: Don’t wait to establish guidelines.
If kids are old enough to use a smartphone or tablet, they’re old enough to follow the rules.

“If you don’t allow kids to educate you to their social media worlds, how can we prevent the dangers of it? We can’t.” —Gretchen Campbell, licensed professional counselor