The Never-ending Elsagate Problem

In mid-2016, news outlets began publishing articles about a channel directed at children called ‘Webs and Tiaras.’

The channel featured people dressed up as characters such as Spider-Man, Elsa and the Joker. The videos had no dialogue, just background music over the scripted actions.

With no dialogue to translate, the videos could be distributed throughout the world, and their reach only grew larger.

The characters featured on Webs and Tiaras (Credit: YouTube)

Webs and Tiaras certainly wasn’t the only channel out there either, because as soon as it came to light, others began copying its style, hoping for a taste of some YouTube views.

Some channels began taking things a step too far and showed the characters in violent or NSFW situations. For example, some would portray Elsa giving birth, while others would show Peppa Pig burning down a house while the homeowners were inside.

Peppa Pig was a whole other rabbit hole in and of itself. People began uploading seemingly innocent extracts of the cartoon, but hidden inside, at random timestamps, were gore clips and other things that children should never have been exposed to.

These videos were sneaking past YouTube’s moderation and quickly ended up on YouTube Kids, the separate platform intended to host only videos that are suitable for children.

One of the thumbnails during the original Elsagate (Credit: YouTube)

The backlash quickly grew, and in the summer of 2017 the term ‘Elsagate’ was coined to describe this disturbing internet rabbit hole.

These channels were not small: some had millions of subscribers, and even more views. They flew under the radar for a long time before being discovered, but when they were, celebrities and internet users alike called for action, and YouTube could no longer ignore them.

YouTube responded by announcing new guidelines that attempted to combat the Elsagate content. Creators were no longer allowed to monetise videos that made inappropriate use of family-friendly characters, and the platform cracked down on channels featuring actual children in an attempt to stop this from ever happening again.

But skip forward to 2025 and the disturbing “trend” has once again resurfaced.

Content creator Swanaenae, who makes videos on disturbing internet subjects, said: “I found out about Elsagate in 2017 when it was originally starting to get a lot of news coverage. That’s where I found out about it and the seeds were planted in my mind that I knew this existed.

“But it was two years ago that I actually decided to cover it and was like, oh this stuff wasn’t completely nuked off the platform in 2017. It’s still here and it’s still prevalent.”

Just this year, new Elsagate content has come to light.

A disturbing channel called Cute Cat AI started being recommended widely, and although it looked innocent enough on the surface, a closer look revealed that was far from the truth.

The owner of the channel was using AI-generated gore to draw in viewers. By using hashtags like ‘cat’, ‘cute’, and ‘catlover’, they further increased their reach.

Some thumbnails featured dead cats with their insides exposed. In others, the cat was portrayed as pregnant, with kittens inside of a bloody stomach.

It may not have looked all that realistic, but it was certainly disturbing.

One YouTube commenter, AoiUsagiOtoko, said: “These give me a bigger feeling of dread than the old Elsa Spiderman stuff did. The gore is so… not realistic but it’s DETAILED and the imagery is legit horrific to look at.”

A thumbnail featured on Cute Cat AI (Credit: YouTube)

And their sick attempts to draw in viewers worked: their most-viewed video had over 1.7 million views before being taken down. For an average YouTube channel, that would mean thousands in ad revenue, so they have likely walked away with a good amount of cash in their pocket and no care for who they may have traumatised.

Although these videos had been up for months by the time they were widely discussed, YouTube eventually took action and removed them from the Cute Cat AI channel. The remaining problem is that it certainly isn’t the only channel of its kind out there.

In a recent investigation, Swanaenae found even more videos similar to this, and he spoke about what YouTube are doing wrong.

He said: “One issue is the lack of YouTube actually cracking down on these. According to YouTube guidelines, fetish content and stuff like that isn’t allowed on the platform, but when you report it, [nothing happens].

“There was this one example, there was literal uncensored porn of characters getting sexualised and I reported it, and the videos just got age restricted, not fully taken down. I get that certain content is meant to get age restricted but shouldn’t straight-up hentai be removed from the platform completely?”

Another YouTube commenter, CAPES4CHRIST, agrees. They said: “The fact that these kinds of channels can rack up hundreds of thousands, if not millions of views, uncontested, with zero consequences, never getting taken down, is absolutely sickening.

“I’ve had my eye on Elsagate stuff before it became mainstream and it’s both surprising and not the least bit surprising that the content presented is getting even more graphic, and uploaders are getting more brazen.

“I know most of these channels make their money by preying on feelings of shock and confusion, as well as morbid curiosity that children may not be able to regulate, getting views this way.

“I know YouTube is never kind to people who [talk about this]. They’ll host whatever degeneracy these people upload and rake in adsense money, but heaven forbid you discuss it. When YouTubers make videos [covering this], they risk age restriction of their content, if not their channel’s outright deletion.”

YouTube failed to respond publicly to the new wave of Elsagate content, choosing instead to remove the videos silently, likely in an attempt to avoid public backlash.

But people are starting to believe that YouTube may be unsafe for kids – a conclusion that the company certainly wants to avoid.

Swanaenae said: “If I’m being completely honest, I think it’s only going to get worse. I hate to admit that but I do think it’s going to get worse, because I think it’s already getting worse.

“Realistically, the best thing that we, parents and people with kids in their lives, can do is just to make sure that the young ones in our lives aren’t stumbling upon this content.

“I know that’s hard to do because a majority of kids are on the internet and they’re watching YouTube content, and it’s hard to monitor them 24/7. But I guess the best thing we can do is try our best to monitor them.”

You only need to be 13 or older to use the main YouTube website; anyone younger is directed to the YouTube Kids platform.

All of this content is out there, on both platforms, so unless YouTube decide to take a solid stance and publicly denounce this type of content, monitoring your children is the best way to keep them from stumbling upon it.

If you discover any of this content, you can find out how to report it to YouTube here.
