We recently covered the story of the Rose Tinted Spectrum YouTube channel being taken down for seemingly no good reason before thankfully being reinstated, but we've done some digging since then and discovered that YouTube's heavy-handed content guidelines are causing all kinds of issues for video game historians on the platform.
We were alerted to problems with YouTube's draconian censorship system by game historian and regular Time Extension contributor John Szczepaniak, who has been a victim of its takedown policy in the past. Szczepaniak highlighted cases relating to a trio of content creators – Majuular, The Cellar / Taigen Moon and GoodBadFlicks – all of whom had seen their videos removed or demonetised not because of copyright claims, but because of their subject matter.
For example, Majuular's excellent study of FMV horror games was hit because it included footage of Phantasmagoria 2; the creator notes in the comments that YouTube flagged the video as "inappropriate for ads" and that the section relating to Sierra's game had to be removed.
"When talking about a game, I always like to have a balanced discussion on the cultural significance of certain aspects of that game, even when those subjects are controversial," says Majuular. "I was careful to give ample warning, censor, and provide context, but the algorithm cares not for such things." Meanwhile, Taigen Moon's superb deconstruction of the John Hurt-starring erotic FMV title Tender Loving Care was taken offline because it featured adult content, while GoodBadFlicks had a video taken down for the same reason – something the channel owner refutes.
"It seems to me like there's also an increasing censorial stance at YouTube, demonetising or striking down analytical videos exploring adult games or games with adult themes," says Szczepaniak. "Not shock-bait, but insightful content with historical value. YouTube's policies now basically make it impossible to have any sort of intellectual discussion."
Unlike streaming services such as Netflix, Disney+ and Amazon Prime Video – which carry shows, documentaries and movies for all ages – YouTube has a somewhat trickier job on its hands moderating content, as its library is almost entirely user-generated. Some would argue that YouTube has a duty to protect younger viewers from harmful content, but the issue seems to be that the video-sharing service is dealing with so much footage that it cannot possibly review everything properly – so it has turned to AI to do the job.
"I wouldn't say that YouTube is cracking down harder than before," says Taigen Moon. "Actually, I wouldn't mind if that were the case, because at least then it would be consistent. The problem since forever has been that YouTube can't possibly moderate all of the content uploaded to the site, and even if they could employ enough people to manually review everything, 'standards and practices' so often become blurry when you're talking about art. Therefore, they rely on bots that they've trained to look for certain things in the visuals or transcript, and leave it up to some machine learning algorithm."
The uptake of AI has accelerated during 2024, and 2025 is unlikely to be any different, despite repeated goofs by AI-based search engines and the like. Taigen Moon has been unconvinced for some time now. "One of the reasons I've been sceptical of AI as soon as it became the buzzword was because I've been seeing firsthand the reliability of these machine 'learning' algorithms and how utterly useless they are on a large scale. It became obvious through insider comments and observation that the algo had become so complex and sensitive to change that YT's actual engineers didn't really know how it worked or how to predict its results. It's like if you gave control of the site to a 6-month-old baby who can hit buttons."
Volume is one issue, but what caused YouTube to take such a stance against mature discussion on its platform in the first place? Taigen Moon traces it back to around 2017 when the Wall Street Journal accused PewDiePie – easily the largest creator on YouTube at the time – of espousing far-right views. "Because the only way YouTube had found to monetize the platform was ads, and advertisers were pulling out, YouTube went through a phase called the 'adpocalypse.' Content guidelines became a lot stricter and, as usual, were dealt with inconsistently."
Simply put, YouTube needs advertisers to make money, and if there's one thing advertisers don't like, it's controversy – especially as their ad might be playing during the video, which would suggest some kind of complicity with its content and views. YouTube already has an age gate system, but it doesn't seem to have much faith in it, given the examples we've listed so far. YouTube also requires every content creator to declare whether a video is aimed at children – which should surely leave the door open for grown-up content, but clearly doesn't.
Taigen Moon chooses not to monetise videos on YouTube and, therefore, was only caught by the system because "there was a nipple in the first segment of the video." Despite following YouTube's guidelines on nudity, the AI bot "saw a nipple and decided to shoot," according to the content creator. "They lie and tell you a human reviewed it, but you know they didn't."
So, where does this leave content created purely for older viewers? "I don't make videos for a child audience," says Taigen Moon. "95% of my viewership is in the 18-35 age range, so why am I beholden to the standards of children's content? Why can't I label my channel an 'older audiences' one and be done with it? There's nothing insidious or politically driven about YouTube's behaviour in particular; they're just trying not to embarrass themselves."
Taigen Moon adds that YouTube's policy has had a long-lasting impact on content creation. "I censored my Harvester video in a few places by blurring sexual things and colour-filtering some blood green. As of now, nothing bad has happened; I just don't want a repeat of the last time I uploaded a feature-length video, and it got bodied."
Another person who has fallen foul of YouTube's policies is Devin Monnens, who produced an "experimental documentary" called Contra vs. Contra in 2007, which gained praise from Henry Lowood, one of the leading experts on machinima. "In Contra vs. Contra, Devin Monnens mixes Konami's Contra and historical footage related to the Iran-Contra affair to create a piece that mixes video art and political commentary," said Lowood when he uploaded the video to the Internet Archive in 2008. "A provocative use of videogame footage."
The reason YouTube gave for its recent takedown was graphic violence, which Monnens' video does indeed contain. However, he makes the very valid point that video games like Mortal Kombat also depict this kind of violence (admittedly, it's not real) and don't seem to get punished. (If you were being overly cynical, you could point out that videos of Mortal Kombat fatalities promote a product made by a company which advertises on YouTube.)
Monnens explains that when your video is hit with such a claim, you have the right to appeal – but those appeals more often than not fall on deaf ears. "I spent way longer than I needed to writing an appeal, only to have it rejected in less than five minutes. There is no way it was reviewed by a human; I'm guessing it's done through AI. This video has been up since 2007 and had fewer than 800 views. And suddenly, now it's on their radar? I'm pretty sure the scene they took objection to had bodies of Nicaraguan civil war victims juxtaposed with footage of Contra enemies being killed in slow motion. The footage was from a trailer of a documentary on the Iran Contra affair. The purpose was to show the real-life cost of war."
The situation has clearly concerned Monnens, who introduced us to Super Columbine Massacre RPG!! creator Danny Ledonne to hear his thoughts on the topic. "YouTube and other sites are over-correcting for political reasons after the moral panic about algorithms radicalizing socially isolated individuals," says Ledonne. "So we are all being collectively punished for a set of websites that cannot create R-rated versions of themselves, so they insist on effectively making their entire platform PG-13. Content platforms should get much smarter about making access safe and age-appropriate, but also comprehensive in nature so adults can use them without being constantly infantilized with 'community standards' child daycare sensibilities." It's worth noting that video games are not the only area of YouTube impacted by this – even videos about history are getting hit.
To cut a long story short, YouTube is throwing the baby out with the bathwater when it comes to policing content on its platform. "It's tempting to argue YouTube is anti-video game and pro-war, when the reality may be closer to these videos being victim to a set of content policy rules programmed into an AI that then arbitrates the process in a resulting system that unfairly removes content out of efficiency," concludes Monnens. "It is incredibly angering and a dangerous eroding of artistic expression."