Azhdarchid

Why can't you be more like your older brother?

I genuinely don't remember how I saw this post, but it kind of lodged in my mind.

just as an outsider with an interest in video game development, and coming from the film industry, it really feels like the game industry has no standardized workflow. every new project gets a bespoke production system with a lot of uncertainty about tasks, roles and who should be doing what

— mtsw (@mtsw.bsky.social) January 11, 2025 at 3:07 PM

I'm sure the OP here doesn't mean it like that, but these kinds of inquiries, in my mind, always have the tenor of a disappointed parent scolding a wayward younger child: "Why can't you be more like your older brother?"

The older brother, in this analogy, is cinema. Why don't video games have a standardized, regimented division of labor and working methodology? Why can't we be more like movies?

I think the answer here is actually really complex; it's about both the mediums themselves and about their differing histories.

Labor (im)mobility

A defining feature of the US movie industry1 is that movies are largely made by short-term workers.

Every movie starts with putting a crew together2. People who are attached to a film project will, then, also put their own crews together; it's crews all the way down. The film's director or producer might pick a director of photography, but that DP will then typically go find camera operators and other lighting/camera department people they like. Notably, on a film set, management and leadership roles are also filled by these short-term workers.

Film workers come on to a project as soon as their expertise becomes relevant, and they leave when it no longer is. A producer or director might stay with the project throughout its entire life cycle, but most people are only going to be around for principal photography – a period of time measured in months, not years.

At the extreme end, some people on a film set will be there only for that single day. Actors doing cameos or bit parts. People with highly specialized jobs – if the movie only has one brief scene with horses, you might need a whole horse crew... for one or two days of shooting.

What this means is that, in film, the ability to slot cleanly into a very well-defined role is extremely valuable. People will do multiple jobs over the course of a year. People spend a lot of time looking for the next job, or finding people to join them on a given job. Understanding what a role means very specifically is a requirement for all steps of this.

This system of standardization developed not because of the unions alone, or the studios alone, but from both sides.3 It benefits both the labor unions and management. The Hollywood guilds like this standardization of roles because it protects their specific niches, and because it creates a very well-defined 'class' that can be unionized.

IATSE can know with a very high degree of certainty whether someone is doing the kind of labor that it represents, and so the collective bargaining agreement is easier to enforce; there aren't a lot of grey areas in film production, by design.

But, conversely, the studios also like this system, because it makes labor interchangeable – easy to get and easy to replace. The temporary, project-based nature of labor limits risk for studios; labor relations exist only as long as they are needed, and then they go away. You can replace people; you can schedule projects more easily, because you don't have to work around the availability of most workers.

Which is to say, basically everyone with any power in the film industry wants this degree of extremely specialized, extremely well-defined job titles and roles. There's a very strong pressure to make it so.

Video games are simply not made like this. They are made, to a much greater extent, by salaried workers who enter into long-term employment with studios. People will work on multiple projects with the same employer. Terms of employment are not predictable, but most people will stick around with a given employer for at least a year.

The reasons why this labor model was adopted are historical at this point, but it is now ingrained and self-reinforcing; it has to do with how video games are made and with how they are funded.

Video games do employ contractors, of course; but the creative leadership and management on a given project is almost always made up of salaried people with a long-term attachment to the studio. Importantly, video game projects tend to staff up gradually, and carry over staff from past projects; unless you're spinning up a new studio, you never deal with the problem of bringing together a crew of strangers who have never worked together.

(It is, of course, notable that spinning up new studios is notoriously fraught and difficult.)

Besides that, the video games industry is largely not unionized. Even in spots where it is, the unions don't have the same incentives that the movie unions have! Most unionized studios are unionized in a 'one big union' model, where one union represents everyone – from QA to art to narrative. Those unions don't really have strong reasons to push for standardizing a division of labor; it doesn't help protect their turf in the same way.

Ultimately, labor standardization is work that has to be done by people, and the games industry never really had the same incentives to pursue it. Institutionally, universities and unions play a major role in doing this work in cinema. They're the ones that hold all the institutional knowledge, after all, since the movie studios themselves don't employ anybody long-term.

Besides the relative absence of unions, 'video game colleges' also never really acted towards this standardization. Part of this is a chicken and egg problem – there is no standard for them to promote – but part of this is also that 'game dev education' is itself a very inconsistent and spotty endeavor, and what exactly one of those programs is trying to teach students can vary wildly from program to program. Many don't really seem to try to teach much about the process of video game production, or how a video game gets made end to end.

Assets, process, product

Setting aside all those historical factors, however, it's worth considering that the category of 'video game' is just broader and more diverse than the category of 'film'.

A movie is fundamentally made up of shots that are arranged in a linear sequence by an editor. Those shots are the individual assets that make up the bulk of the finished product, and they are all produced basically the same way – you get a crew on set and you shoot.4

There are, of course, infinite individual factors that make every film shoot its own special and unique thing. I'm not trying to suggest that making movies is simple or easy. But, the process of making basically any movie can always be conceptualized with the same fundamental building blocks: You write a script, you shoot, you edit those shots together into scenes.

Video games simply don't operate like that. There is no universal organizing principle that holds for every video game. Terms like 'level' or 'quest' can mean wildly different things across different projects. The relationship between organizing parts of the product and organizing parts of the production process can change depending on the project.

In a film, every scene is made up of shots, and every shot has a specific shooting day and a specific set. Everything can be traced to where it comes from, in the same way.

In a video game, a 'level' might be a section in the story or in game progression, or it might be more like a location that gets reused and revisited over and over again. The actual asset associated with a conceptual 'level' might be a tile map, or might be a 3D space (which can be constructed using different techniques – CSG versus mesh modeling), or it might be code that procedurally generates the level as the player will experience it.

Assets can have different hierarchical relationships. The environmental art in one game might be exclusive to a level, and so 'below' the level in the asset hierarchy. In other games, artists make reusable 'tilesets' that are then used to dress up multiple levels.

The difference between those workflows is not arbitrary, but rather a function of the game's specific needs; an open-world game with dozens of similar-looking dungeons is a different problem than a linear story-driven game where each level is a specific singular place.

And of course, some games don't even have levels. There's no universal organizing principle that you can use to understand a video game as an artifact, not in the same way that you can organize a film in terms of shots or scenes.

Different projects have such divergent needs, priorities, and goals that it's not reasonable to think that one standard of division of labor and hierarchy could meet all of them.

In the games industry, where we do try to standardize, we end up undoing standardization at the local level to cope with specific project needs. An example of this is Agile methodology, which is pervasive in the games industry; but if you've worked in games for a while, you become aware that different studios 'do Agile' completely differently.

Most studios are inventing a production practice that they can live with and then covering it up with a fiction of Agile that makes whatever workers are really doing legible to management; I call this practice 'Bastard Agile'.5

A classic example is writers working loosely to meet deadlines (as writers have always done) but attaching individual writing assets to 'tickets' and 'sprinting' them so production can keep track of what they're doing.6

Different projects can have different priorities which give rise to different orders of operation or definitions of who blocks whom. Narrative in one project might be the starting point; in another project it might come in the middle of the process. The needs of different projects imply different interdisciplinary hierarchies and relationships, in a way that is generally not true of movies.

Could video game productions standardize?

I think the answer is... probably not. The first question is just, what standard are you supposed to adopt? There are as many ways of working as there are studios.

There actually are forces that work towards labor and workflow standardization in the games industry, but they're counteracted by other forces that keep these systems in flux. Things like GDC, game development university programs, and academic works on game development all try to create clear, stable definitions.

But there are also incentives to shift those definitions. Studios face no pressure to refrain from adjusting or tweaking processes to their benefit, as we see with Bastard Agile.

And individual workers, too, have worked against standardization. Everyone in games narrative has seen a phenomenon over the last 10 years where many studios have been seemingly hollowed out of writers.

Do they no longer employ writers? No; it's just that nobody at those places wants the low-prestige, put-upon job title of 'writer' any more. So all of these people have become 'narrative designers.' In effect, 'narrative designer' is becoming an umbrella term that holds a continuum from ICs who mainly write things like cinematic scripts and barks... to ICs who do design work but don't write.

Note that in this case, the industry wants to standardize but it's workers who are muddying the waters to pursue their own labor and economic interests! As I've said, I don't think that broad unionization of the games industry would necessarily bring about this kind of standardization, because the incentives for labor are very different, and the unionization model being adopted is different.

Process and progress

Film production is a prototypically linear process. You start by defining the parameters of the project (writing, development). In pre-production, you then generate a plan – the shooting script, script breakdown, shooting schedule. In principal photography, you generate the raw material that the project will be made out of; you shoot until you get all of the footage you need. Then, this raw material is put together in editing and post-production.

Deviating from this linear model is usually considered a process failure (e.g., reshoots). There are clear on-ramps and off-ramps for different participants in this process; the actors show up at the start of principal photography, and they go away when it wraps. The writer is heavily involved in development and maybe pre-production; after that, they might visit the set a few times, but they're generally expected to move on.

It's worth noting that much of the perception that film productions run 'like clockwork' comes from the fact that most of the meandering, chaotic, unpredictable aspects of creative work are pushed outside of principal photography. Scripts get rewritten multiple times, and many a movie has languished for years in 'development hell'. A lot of the work that directors and actors engage in happens in preparation ahead of actual shooting. Movies often change substantially in the edit; and of course post-production is often about addressing changes.

These discussions often ignore that there's more to moviemaking than being on set. The on-set process is extremely consistent, strict, and regimented exactly because every single day on set is so brutally expensive. It's worth it to spend weeks in pre-production to save a day of shooting.

Video games are just not subjected to the same economics.7 Whether you're in pre-production or production, you're paying basically the same labor cost. More than that, it's impossible to really prepare for a game production to the extent that a film production can prepare. Games are dynamic entities made up of dozens of systems, meanings, images, and narratives. How they interact is unpredictable until you can observe it in situ.

So it's harder to truly 'solve' a game on paper before you start production, and studios have less incentive to do it.

There's no equivalent, in video games, to the editing step; you can't rearrange your raw materials after the fact. If you want the game to be different from what it is, you have to go and remake assets and change code. Movie studios will often re-edit movies based on feedback from test screenings; having to go to reshoots, by contrast, is generally taken as a negative sign.

In video games, the equivalent always necessitates redoing assets or changing significant aspects of the game. It's like you always have to reshoot things.

This very naturally leads video games towards a more iterative process. You make a chunk of the game, check your work, adjust, and continue. This process is unpredictable; it's impossible to really know how long it will take. It also has baked in inefficiencies – people can be blocked, thrash8 happens, and so on.

This in turn pushes studios towards longer-term relationships with workers, because for most disciplines you can't quite be sure of when someone's involvement in a project will end. This, in turn, encourages people to wear multiple hats or cross interdisciplinary boundaries to justify these longer terms of employment.

Should the industry standardize?

Here's the thing about history: it is extremely difficult to prove a counterfactual. There's a widely held assumption that the heavy standardization of the film industry was 'good for movies', whatever you want that to mean.

I think this is a very intuitive assumption! But also we don't have a counterexample. We don't have a means to peer into the timeline where the movie industry didn't develop in that way. Maybe their version of The Substance is way better; who can say?

Standardization, it bears saying, largely developed because it benefits the studios and the unions as institutions; any purported benefit to individual workers is a happy accident. I can definitely see reasons why people enjoy working in this system! But I can also see counterarguments.

Career progression is very linear and people have few opportunities to broaden their skills outside their discipline.9 Interdisciplinary hierarchies are more or less set in stone, and some disciplines simply have more prestige than others, across the breadth of the industry. There's a reason some Oscars get announced by a celebrity on stage and some Oscars get handed out off the air.

And even if we take it as a given that standardization is beneficial in film, there's no guarantee that it would be in video games. We've seen, in the games industry, standardization happen to an extent within a family of studios; big publishers often try to regularize workflows and roles across the studios they own.

There doesn't seem to be much evidence that this solves any of the significant problems that plague video game productions – crunch, thrash, waste, and so on. And it's not meant to; what standardization does is make the individual studios more legible to the structure of upper management above them, but that legibility does not seem to lead to particularly better decision-making.

Process rigidity has had some high-profile failures, such as the infamous teething issues that ensued when EA decided that all its internal studios should use Frostbite.

Ultimately, I don't think solving the games industry's organizational and labor problems is going to come from adopting a model of labor division invented in the mid-20th century to make a completely different class of product.

  1. Not all film industries operate this way, but many do; this piece focuses on the mainstream American film industry because, frankly, that's what people making these comparisons are familiar with.

  2. Hence the near-universal trope in heist movies, which, many a film critic has pointed out, is a self-conscious metaphor for moviemaking.

  3. For more on the development of the Hollywood division of labor during the studio system era and right after the collapse of the studio system, see Janet Staiger's sections of The Classical Hollywood Cinema (1985).

  4. This of course doesn't apply to animation, and VFX increasingly breaks assumptions of this process. It's worth noting that the very regimented and clear workflow of a film set only really applies within the bounds of one type of film-making, and things are different in other sectors.

  5. I am told this is extremely widespread, also, in the software industry.

  6. For more on this, see Laura Michet on why you shouldn't make game writers sprint.

  7. Worth noting, of course, that people do tend to run a much tighter ship in those parts of the game dev process where the economics are like shooting a film – like recording VO or doing performance capture.

  8. In the games industry, 'thrash' is generally defined as what happens when you have to throw out work because something changed that you weren't expecting.

  9. In video games, it is extremely common to enter the industry through one discipline and eventually land elsewhere. For a long time, QA was traditionally an 'entry point' discipline. It's worth noting that QA workers themselves have worked to fight against this and to have their expertise recognized as more than just 'entry-level' work.

#video games