It’s our natural instinct to help kids grow into responsible digital citizens—and that means helping parents, too. Kinzoo’s “In the Wild” series was created so our founder, Sean Herman, can share the latest news, resources, and knowledge from beyond our zoo gates.
“The stories told in this piece are eerily familiar to me, as I had a similar experience with my own daughter. It’s this type of unfiltered, inappropriate content that inspired Kinzoo, and the proactive approach we’ll take to monitor content on our upcoming app. Many platforms, like YouTube Kids, have good intentions, but errors like this really highlight parents’ need to intervene.”
On YouTube Kids, Startling Videos Slip Past Filters
It was a typical night in Staci Burns’s house outside Fort Wayne, Ind. She was cooking dinner while her 3-year-old son, Isaac, watched videos on the YouTube Kids app on an iPad. Suddenly he cried out, “Mommy, the monster scares me!”
When Ms. Burns walked over, Isaac was watching a video featuring crude renderings of the characters from “PAW Patrol,” a Nickelodeon show that is popular among preschoolers, screaming in a car. The vehicle hurtled into a light pole and burst into flames.
The 10-minute clip, “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized,” was a nightmarish imitation of an animated series in which a boy and a pack of rescue dogs protect their community from troubles like runaway kittens and rock slides. In the video Isaac watched, some characters died and one walked off a roof after being hypnotized by a likeness of a doll possessed by a demon.
“My initial response was anger,” said Ms. Burns, a nurse, who credits the app with helping Isaac to learn colors and letters before other boys his age. “My poor little innocent boy, he’s the sweetest thing, and then there are these horrible, horrible, evil people out there that just get their kicks off of making stuff like this to torment children.”
Parents and children have flocked to Google-owned YouTube Kids since it was introduced in early 2015. The app’s more than 11 million weekly viewers are drawn in by its seemingly infinite supply of clips, including those from popular shows by Disney and Nickelodeon, and the knowledge that the app is supposed to contain only child-friendly content that has been automatically filtered from the main YouTube site.
But the app contains dark corners, too, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms.
In recent months, parents like Ms. Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes. Many have taken to Facebook to warn others, and share video screenshots showing moments ranging from a Claymation Spider-Man urinating on Elsa of “Frozen” to Nick Jr. characters in a strip club.
Malik Ducard, YouTube’s global head of family and learning content, said that the inappropriate videos were “the extreme needle in the haystack,” but that “making the app family friendly is of the utmost importance to us.”
While the offending videos are a tiny fraction of YouTube Kids’ universe, they are another example of the potential for abuse on digital media platforms that rely on computer algorithms, rather than humans, to police the content that appears in front of people — in this case, very young people.
And they show, at a time when Congress is closely scrutinizing technology giants, how rules that govern at least some of the content on children’s television fail to extend to the digital world.
When videos are uploaded to YouTube, algorithms determine whether or not they are appropriate for YouTube Kids. The videos are continually monitored after that, Mr. Ducard said, a process that is “multilayered and uses a lot of machine learning.” Several parents said they expected the app to be safer because it asked during setup whether their child was in preschool or older.
Mr. Ducard said that while YouTube Kids may highlight some content, like Halloween videos in October, “it isn’t a curated experience.” Instead, “parents are in the driver’s seat,” he said, pointing to the ability to block channels, set usage timers and disable search results.
Parents are also encouraged to report inappropriate videos, which someone at YouTube then manually reviews, he said. He noted that in the past 30 days, “less than .005 percent” of the millions of videos viewed in the app were removed for being inappropriate.
“We strive,” he added, “to make that fraction even lower.”
Holly Hart of Gray, Tenn., said she was recently reading while her 3-year-old daughter was in the room when she noticed that Disney Junior characters in the video her daughter was watching started “turning into monsters and trying to feed each other to alligators.” An image previewing a recommended video showed the characters in a provocative pose.
“It was an eye-opener for me,” said Ms. Hart, who had downloaded the app because it was being used at the local elementary school.
Not all of the inappropriate videos feature cartoons. Alisa Clark Wilcken of Vernal, Utah, said her 4-year-old son had recently seen a video of a family playing roughly with a young girl, including a scene in which her forehead is shaved, causing her to wail and appear to bleed.
Most of the videos flagged by parents were uploaded to YouTube in recent months by anonymous users with names like Kids Channel TV and Super Moon TV. The videos’ titles and descriptions feature popular character names and terms like “education” and “learn colors.”
They are independently animated, presumably to avoid copyright violations and detection. Some clips uploaded as recently as August have millions of views on the main YouTube site and run automatically placed ads, suggesting they are financially lucrative for the makers as well as YouTube, which shares in ad revenue. It is not clear how many of those views came on YouTube Kids.
One video on YouTube Kids from the account Subin TV shows the “PAW Patrol” characters in a strip club. One of them then visits a doctor and asks for her cartoon legs to be replaced with long, provocative human legs in stilettos. The account’s description says, “Video created with the purpose of learning and development of children!”
The account that posted the video seen by Ms. Burns’s son is named Super Ares TV and has a Facebook page called PAW Patrol Awesome TV. Questions sent there were mostly ignored, though the account did reply: “That’s a Cute character and video is a funny story, take it easy, that’s it.”
The Super Ares TV account seems to be linked to a number of other channels targeting children with cartoon imitations, based on their similar channel fonts, animation style and Greek mythology-inspired names, from Super Hermes TV and Super Apollo TV to Super Hera TV.
A Super Zeus TV account included a link to a shopping site called SuperKidsShop.com, which is registered in Ho Chi Minh City, Vietnam. A call to the phone number listed in that site’s registration records was answered by a man who declined to identify himself. He said that his partners were responsible for the videos and that a team of about 100 people worked on them. He said he would forward email requests for comment to them. Those emails went unanswered.
Dr. Michael Rich, a pediatrics professor at Harvard Medical School and the director of the Center on Media and Child Health, said such videos brought up a host of issues for children. “It’s just made that much more upsetting by the fact that characters they thought they knew and trusted are behaving in these ways,” he said.
Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, argued that inappropriate videos on YouTube Kids showed hazards of today’s media reality.
“Algorithms are not a substitute for human intervention, and when it comes to creating a safe environment for children, you need humans,” Mr. Golin said. His group and the Center for Digital Democracy filed a complaint with the Federal Trade Commission in 2015 accusing YouTube Kids of deceptive marketing to parents based on inappropriate videos.
Using automation for online advertising has turned Google into a behemoth worth more than half a trillion dollars. The company has faced a new wave of criticism in the past year for lacking human oversight after its systems inadvertently funded fake news sites and hateful YouTube videos and most likely sold election-related ads to accounts affiliated with the Russian government.
Google has largely defended its errors by pointing to the enormous amount of content it hosts, including more than 400 hours of content uploaded to YouTube every minute.
Disney and Nickelodeon, mainstays of children’s programming, work with YouTube Kids to introduce children to their characters. But they are also aware that their content can be mixed in with disturbing knockoffs.
“Nickelodeon creates its characters and shows to entertain kids, so we share the same concern as parents about the unsuitable nature of some of the videos being served to them,” said David Bittler, a spokesman for the Viacom-owned network.
A Disney spokesman said YouTube Kids had assured the company that it was “working on ways to more effectively and proactively prevent this type of situation from occurring.”
Some parents have taken to deleting the app. Others, like Ms. Burns, still allow its use, just on a more limited, supervised basis.
“This is a children’s application — it’s targeted to children,” said Crissi Gilreath, a mother of two in Oklahoma, “and I just can’t believe that with such a big company they don’t have people whose job it is to filter and flag.”
This article was originally published by The New York Times on November 4, 2017.
Photo Credits: fizkes / Shutterstock