YouTube is big on AI. It’s been promoting the use of generative artificial intelligence for years now, calling on creators to embrace this ‘new era’ of content-making where humans and AI work together to speed up and streamline video production.
But the very tools tech bros developed for this ‘new era’ are now being used to breathe new life into one of YouTube’s most expensive controversies.
Elsagate. Sparked by a single 2016 article from The Guardian about a trend of disturbing animated videos aimed at kids, Elsagate snowballed into sweeping, years-long scrutiny of child safety on YouTube, scrutiny that eventually led to a $170 million Federal Trade Commission fine and mass demonetization of kids’ content. Under fire from creators, viewers, advertisers, and regulatory agencies, YouTube quashed most Elsagate content, and while there have been scattered resurgences since, their scale has remained small.
Until now. WIRED today reported a new flush of Elsagate-esque content on YouTube, and this time, the people making it are using AI.
Like the OG Elsagate vids, these feature kid-bait characters like Minions, Thomas the Tank Engine, and even Elsa herself. There are cute cats and dogs, too. Except those dogs and cats are likely to be the subjects of horrific transformations, and Elsa, again just like the old days, is likely to be kitted out in lacy lingerie, with a swollen pregnant belly. In one video, a Minion is morphed by slime, then sneaks up behind a child and chews them to a meaty pulp.
The channel that posted that last video is Go Cat, which had nearly 25,000 subscribers and more than 7 million lifetime views. It promoted itself as “a fun and exciting YouTube channel for kids,” and its banner was an AI-ified Thomas the Tank Engine with searing red eyes. “Every episode is filled with imagination, colorful animation, and a surprising story of transformation waiting to unfold,” its bio read. “Whether it’s a funny accident or a spooky glitch, each video brings a fresh new story of transformation for kids to enjoy!”
After WIRED pinged YouTube about Go Cat and other similar channels, YouTube terminated two that depicted AI-generated animal violence and suspended monetization of three more. At the time, Go Cat was not terminated, but it’s now showing a 404 error, so either its owner or YouTube deleted it within the last few hours.
“A number of videos have also been removed for violating our Child Safety policy,” a YouTube spokesperson told the outlet. “As always, all content uploaded to YouTube is subject to our Community Guidelines and quality principles for kids—regardless of how it’s generated.”
As for preventing further content like this, YouTube simply said these videos are against its TOS and that it’s enforcing guidelines “using a combination of both people and technology.”
The real story here is that–not to give any credit to the original Elsagate ‘creators’–back in the day, if people wanted to make this content, they had to draw and/or 3-D animate it themselves, a process that took significant time. That time requirement may have naturally constrained the amount they could make.
Now, though, all people have to do is load up Midjourney or its ilk, type a prompt, and push a button. We already knew there was a deluge of AI slop on YouTube that’s gathering both clicks and revenue for uploaders. It was inevitable that the sort of people who made Elsagate would want to use the same generators to effortlessly pump out their ‘content.’
In its response to WIRED, YouTube brought up the Youth Digital Wellbeing Initiative, which it formed in March with nearly 20 kids’ content creators and distribution companies, including Pinkfong, WildBrain, The Wiggles, and Cocomelon owner Moonbug. The initiative’s goal is to “actively [limit] the reach of low-quality content.”
“We want younger viewers to not just have a safer experience but also an enriching one,” a YouTube spokesperson told WIRED. “To support this, we partnered with experts to create a set of quality principles for kids and family content meant to help guide creators in creating quality content for kids and reduce the amount of content that is low quality, regardless of how it was created.”
But the fact that this AI-generated Elsagate content popped up a month after the initiative was founded makes us wonder what sort of systems YouTube is putting in place to limit the reach of this content. Go Cat had 7 million views by the time it was shut down, and it only came to YouTube’s attention because of WIRED’s investigation. (This is a recurring theme with problematic content on YouTube.)
If YouTube wants to keep pushing AI and child safety, it has to figure out how to balance the two. And soon.