Imagine being a student and one of your classes is on video game history; the reading material includes interviews with developers and old games mags to analyse, and the homework assignments are to play specific classic games. Oh, and the school has a library of 200 highly acclaimed examples, freely accessible. Sounds like a dream, right? Or an indulgent fantasy from a Saturday morning cartoon you might have watched in the early 1990s. But it is, in fact, reality!
What if we then told you that in this reality, the students were not playing the games, did not particularly want to play them, and the professors teaching these classes - from all around the world - were grappling with the challenge of enthusing their students and getting them to engage with the past? This was the subject of a talk given by Doctor Victor Navarro, co-authored with Doctor Beatriz Pérez-Zapata, which generated a lot of discussion among the other teachers in attendance.
With all our recent coverage of the History of Games 2024 conference, comprising over 60 distinct talks, you might be asking yourself: why? What is the point of all this research? Why organise a complicated array of overlapping panels over three days? The conference can't have been for you, the retro enthusiasts and mainstream readers, because, as we've shown, the rest of the press didn't give a damn. But equally, it wasn't meant to be an enclosed bubble, either.
Doctor Nick Webber, one of the co-chairs of the conference, explained:
There is an expectation, in the UK at least, that academics have what's called impact. So that's recognising the fact that we take public money, and so we should be publicly accountable for our work. So we produce work that makes change in the world. One of the things we put quite a lot of thought into, is how we make that happen.
Before we get into the nitty-gritty of teaching students about game history, we need to establish that the pursuit of all this enlightenment is - ultimately - intended to benefit the wider world, including the modern games industry. "One of the things I'm involved in is the Historical Games Network," elaborates Dr Webber, adding:
We bring together game developers, people who work in the cultural sector - typically in cultural heritage - and academics, to talk about history and games and the relationship between them. That produces an event every three months, which is publicly open, anyone can attend. No fee, you just sign up and join the Zoom. We normally have between 50 and 60 people join. We also have a blog with guest posts - academics, game developers, some in the heritage sector, all thinking about how history works in relation to games or in their work. We act as a broad public audience; members of the public come to the events, asking questions, and we connect them with people.
So, there's engagement with the industry and developers, and a desire for public engagement too. Academia is not some unreachable or unfathomable other world. Which brings us to the conference and professors attempting to educate students about games. Not every delegate was a teacher of games studies or history; some were PhD or post-doctoral students sharing their current research. But many were directly involved in teaching the developers of tomorrow. In some instances, former developers were now lecturing! Besides which, this vast repository of knowledge would at some point be available to others who teach, through academic texts, books, and research papers.
In short: all these studies are seeping back into the industry, influencing the games you might play in the future.
Understanding how creativity arises makes sense, right? According to GamesIndustry.biz, "Video games to pass $300bn revenue, 3.8 billion players by 2030." The cost of development and the value of the market have grown to the point where we need solid foundational knowledge of methods and practices, and also of the history, evolution, and genealogy of various games, genres, and stylistic choices - whether the topics are mechanical (gravity in platformers, UI optimisation) or thematic, such as standardising new non-linear narrative frameworks rather than forcibly shoe-horning in those found in older media like film and books. Narratives do not have to be linear - the assumption that they do is something better education can correct.
Basically, games are expensive, so you need to know what works, what fails, why it fails, and how games are uniquely special - they are capable of more than any previous human creative endeavour. Games are interactive and can incorporate visual art, sound, music, writing, acting, architecture, physics, economics, cooperation, competitiveness, sports, marksmanship, history, fantasy, folklore, abstract surrealism, exploration, geography, weather, mathematics, theology, politics, sociology... Quite literally, the summation of every conceivable human thought in the history of the species, presented as something to be touched and manipulated.
This particular topic was discussed at length over those three days, with Dr Webber adding:
We teach about the history of these things as context, to understand what [one is] doing. Sometimes yes, there's that traditional, historical, 'You learn about the past so you don't repeat its mistakes.' But actually, the kind of talk you gave for example, where you're making those connections, between different influences, and different processes of design, and different kinds of aesthetics, those are important.
Your author's talk was on forgotten lineages - conducting interviews with developers to see how they were influenced. Multiple examples were covered in 45 minutes, but to pick just one: The Tower of Druaga was savaged by American critics - IGN scored it 3/10 and called it worthless - and yet Yui Tanimura, a director on multiple Soulsborne games, openly revealed it was the direct influence for From Software's messaging system, where players help each other. Not the game itself, but the notebooks players kept in arcades to aid each other in finishing Druaga. It highlighted how an organically formed external component can be incorporated into later games.
This prompted a lot of discussion, because it exemplified the importance of understanding "history" not just as an abstract collection of dates and events, but as a contextual framework for the zeitgeist of specific eras. In other words, not just what happened, but why it happened, how it affected people, and how it caused other things to happen later. It was gratifying to have professors come up afterwards, extol the virtues of what was presented, and declare that these ideas would inform their curriculum.
One of the questions I was asked in the auditorium, by David ten Cate, was: "For those teaching games history, how can they best incorporate these forgotten though influential lineages?" It was a good question, because it gets to the core of all this: how to make games history relevant to students. My spontaneous response was the importance of conveying context. It's one thing to play an old game off GOG or Steam, but any specific title would have existed in an ecosystem of supporting magazines and press coverage, adjacent game releases, specific controllers, display types, and physical environments. If you're making games, you need to know all of this - you need to know the context, and not just the good games but the bad ones too.
The challenge of making students aware of what came before informed Dr Navarro and Dr Pérez-Zapata's talk, titled RetroStream: A Project in Innovation to Teach Video Game History. In turn, the responses from delegates attending the talk covered a complex set of issues.
"I've been teaching the history of video games for four years now," began Dr Navarro, adding:
I was teaching on the general culture of video games before that. When I got to my new university I was assigned a full 10 week course on the history of videogames. My first impression was that students - most of them want to learn the history of games, but they haven't experienced any of it yet. My colleague, a teacher with a second group, agreed that students were not playing games. To use a contemporary reference from pop culture: it was like teaching botany to the Fremen on Arrakis in Dune. So we needed people to play. Otherwise they cannot connect historical facts with actual games.
This resonated strongly with several teachers present, and aligned with other talks as well. But everyone was keen to stress that this was not a criticism of the 18-year-old students in the classes - it was an inevitable result of the media saturation the world exists in today. The thoughts the professors expressed were born not of negativity, but of a passion for games and the desire to ignite that same passion in their students.
"We wanted people to play, to follow the classes, to grasp how games used to feel," says Dr Navarro, before adding:
How games work, and how games evolved, and to look at 'retro video games' - whatever that means - as leading things, and not just things from the past. We got a small budget, €1000 for the project. But there is no magical solution for the cultural problems we have now with young people. Because they are over-saturated by a media landscape where they have a million options, and everything is kind of samey, and everything is bombarding them all the time. I'm not blaming them, I'm saying the media is so over-saturated that going back to proper retro games is not so tempting, even if they want to get into development.
With a budget to build a library, the professors set to work, with plans to use Steam, GOG, and Evercade, given the low cost of acquiring retro titles digitally through these platforms. They negotiated with students, curating the list based on what interested them. But it was a process, since the goal was to educate and expand their comprehension - so if students suggested the latest Resident Evil, teachers would suggest the original Alone in the Dark, to witness the genre's origins and better appreciate how it has evolved over time. If students said BioShock, teachers suggested the original System Shock, and so on.
"We had to trim the selection a lot," admits Dr Navarro. "We had to negotiate a lot. Because everybody was trying to go to things they already knew about. Like Doom. Or, very contemporary franchises like Bioshock. We made the cut-off point to be the year 2000. And they kept proposing games from 2015 or 2017. Because everything is retro to an 18-year-old student. <room laughs>"
What was interesting was the emphasis on the library being "legal" - emulation wasn't used, apart from one very specific exception. To put this into context, recall how the VGHF declared that 87% of all games pre-2010 were unavailable via legal means. Meaning the universities teaching games have to rely on the available 13%, or spend huge sums acquiring legacy hardware and software that is prone to failure. One might think that, given this was for study, there would be "fair use", and emulation could be employed. Your author asked about this, but apparently that's not the case! It was a shocking revelation that copyright, intended to safeguard creativity, is actually preventing the education of future artists.
"We do not have 'fair use' - that is an American legal preset!" declares Dr Webber, adding:
In the UK we have 'fair dealing' - which is more restrictive than fair use. Copyright has a massive effect on our ability to preserve games, and that affects the games we get to use in the classroom. There's constant discussion of how we preserve games, given we're likely to lose particular kinds of hardware, and so emulation is probably the most effective way to preserve games. But of course it's typically interpreted as breach of copyright. The law is not very clear on this. A lot of the time it is not worth pursuing copyright infringement, unless it's reaching standards of criminal copyright. Education does not do that. Typically most developers are really happy, actually, for you to use their games. Particularly independent developers, because they're just really excited about the fact someone cares about their game enough, to think about how it's going to shape the next generation of developers. But some of the larger developers are very hostile to re-use of their ideas, re-use of their work, and so on. It can be years negotiating quite complicated agreements about whether or not you can use those materials.
So let's recap: you're a student in the above class, there are 200 classic games to play, and marked assignments to explore specific titles, write about them, and also explore the authors behind the games. As was described, this last point presents its own challenges, since first-year students are part of a cultural space where video games are seen as simply appearing, as if from the ether. There's still a lack of understanding of how they're made and who makes them, unless it's a figurehead such as Hideo Kojima, et al. Part of the course requires students to connect the games with previous and subsequent titles, put them in a chronology or genealogy, and then cite availability and sources, while writing reference lists and a bibliography.
This sort of holistic approach sounds logical. To make better games one needs to understand all facets of the industry - not just one's favourite games, but how these came to exist. Who made them. Yet there's resistance, with some students believing that old games are irrelevant to modern development.
Doctor Bruno de Paula, who teaches at University College London, recounted some personal experiences:
There is a desire to join the industry, but they have a certain idea of what the industry looks like. So the things that we are doing, in terms of history, has nothing to do with what they expect. They say: 'This is not going to help me to learn Unreal or Unity, because that's what games are about!' <room laughs> This is the disconnection I see in my teaching. I did a games module, and we had a survey. One of the comments was: 'Please give us less things about history, and more things that help us with our projects.' I was trying to give them a foundation on what games are and how to look at game design - but that was not relevant to them. They wanted to sit down and spend five hours doing Unreal Engine.
This is a challenge that universities around the world face. The above two are in Spain and the UK respectively, but Dr Carl Therrien described a similar situation at Université de Montréal in Canada. They have an incredible games laboratory, with a collection of 80 hardware platforms and over 7000 games!
Though as Dr Therrien revealed:
The sad news is that we have to force students to check it out. Marked assignments, that sort of thing. So similar observations. They don't have a point of entry yet. This is not a value judgement of the younger generation. If they have no way to grasp it, of course they won't go into 7000 games and pick one randomly and just discover them. In the next version of my course on videogame history, I will ask them to go to the lab with a specific list of games we've pre-selected. On top of playing the game, they will have to take pictures to document the material aspects and present these elements to their classmates. This means I give half the semester to presentations. They are responsible for bringing passion into it, and for sharing the passion with their class. I'm thinking this will backfire in so many ways! But I'm willing to try it. Because if I tell them: 'This is a great game from the past, believe me I've played it, it's amazing!' This doesn't work anymore, at all.
Sitting in that room, viewing the slides, your author found all of these accounts shocking. If you're reading this article, spend a moment recalling your own life at 18 and how video games intersected with it. Such thoughts prompted the following question at the talk's end:
Thinking back to my days in education, and looking at today, where students are expected to play video games, I'm thinking: what a time to be alive! <room laughs> All I wanted to do was play games. I'm astonished students today do not play games, do not want to play games. I don't want to be critical of your students, but... Some students even said that playing older games, there was no relevance to making games. Why would they be in a class, ostensibly to learn how to make games, and yet have no interest in them?
This opened up the complicated topic of societal norms, cultural evolution, monoculturalism, etc. Put simply: the things a society makes, finds interesting, shares, and attributes value to. Think about your own recollections again, be they from the 1980s, 1990s, or even early 2000s. Games were always a fringe activity. So much has been written about Sony making games "mainstream" with PlayStation, but actually, they were still outside mainstream thought. Games were on the margins of culture.
You only need to look at TV news reports twisting themselves into knots trying somehow to connect games to violence, over and over - that doesn't happen when a cultural product is normalised and truly accepted by the mainstream. Comics, Dungeons & Dragons, rock music, and countless other things were stigmatised as unhealthy, treated as disposable ephemera, understood only by those embedded within them. Our time with video games was from a different era, far removed from today where comic book movies are now blockbusters.
Dr Navarro gave a great statement on how media consumption has changed and why students of games are not playing games:
I've been pondering it for a long time. Part of it will be the institutionalisation of video games. For you and me, playing video games for a living, or playing games as a student, seems like a dream. Because we grew up in a moment where video games were kept out of culture. Another part is that the media ecology, at the minute, is so vast and so over-saturated, that things lose meaning. Susana Tosca has been writing about media and desire, her book Sameness and Repetition in Media is free, and the saturation we are facing is killing desire for a lot of people. I can feel it myself. I had a lot of platforms - but I cancelled Netflix, cancelled everything else, and stuck to just two, because I was saturated. I'm an adult and can plan my own media consumption. Students have not yet been trained to do so.
Given all that's been said, what, then, are they playing? What games are, in fact, motivating students to enter education to pursue a career in development? Speaking with the various professors at the conference, two titles came up repeatedly: Minecraft and Fortnite. It was said that some students even tried to argue Minecraft represented retro games and should be on the curriculum (this was rejected). Darren Berkland, who gave a talk on developer logs and the importance of students keeping logs during their courses, also cited Call of Duty. By all accounts, it is a small selection of recent games motivating enrolment, with students wanting to create facsimiles of them. The stated goal of every professor was to break students free of this restrictive mindset: firstly, because no one starts out creating such ambitious works - experience is needed - but mainly to encourage originality and new ideas.
Teachers today face an impossible scenario, almost unimaginable to those of us who grew up in the '80s, '90s, and early 2000s. AI and ChatGPT have destroyed the value of essays as a form of evaluation, since anyone can now generate bodies of text. Young people are so bombarded with media distractions that video games do not hold the same meaning for them that they once held for us. Developers, in turn, are hiring young graduates, then trying to create and sell titles released into that same over-saturated environment, coupled with exponentially rising costs. It would sound like a dystopia, were it not for luminaries such as the aforementioned teachers, each of them passionate and dedicated, trying to awaken in a new generation the same creativity that originally drew them to games all those decades ago.
May their protégés someday rise to enthral us all.