It takes a team to make a video game. Unfortunately, given the state of video game staff credits, it can be difficult, if not impossible, to know who was actually on that team.




Hundreds of people can toil for years to make a single game, often putting in 80-hour workweeks or more. And unless you’re a creative director or one of the few other people who do interviews at E3 or guest on a podcast, the game’s credits roll is probably going to be the only place that work gets acknowledged. Recognizing the hard work that people do is important, both for their careers and their own sense of accomplishment. For the rest of us, having a record of who worked on what is massively valuable historical knowledge. That’s why it’s so frustrating that video game credits are such a mess.

I began the research for this story hoping to get some understanding of video game credits. After looking through the credits of hundreds of games, I’ve found a lawless and unstructured world where nothing makes sense. Rules change from game to game for barely any identifiable reason. Discrepancies in game credits range from minor style issues all the way up to deleted names. Credits are a weird, often harsh world, and there’s been little progress in improving them.

Until a few years ago, I didn’t pay that much attention to video game credits. To me, they were just a waste of time that I watched because there was a chance something was hidden at the end.



That all changed when I decided I wanted to learn more about the developers who worked on games I loved. Credits are an amazing starting point for learning about developers, and I relied on them heavily as I started doing deep dives into what happened to the teams that developed classic games like Prince of Persia: The Sands of Time and Uncharted 2.


After doing this for a few games, I started to notice a strange thing: Every credit sequence I looked at was completely different from the last. What I first read as a shared, standard feature of games—white text on a black background—turned out to be radically different from one game to the next. The place I was searching for answers just left me with more and more questions. Why does Knights of the Old Republic put the leads at the start of the credits, while Jedi Outcast spreads them out over the list of names? Why does Morrowind separate out the people who worked on the character art versus world art, while Metroid Prime just gives everyone who worked on any sort of visuals a generic “artist” title? It was hard to even find two sets of credits that had the same structure.

From the layout of names to the terminology used, there’s no real consistency, even between the most similar of games. Super Mario Maker 2 and Super Mario Odyssey are both platformers starring Mario developed by Nintendo’s internal teams for the Switch. The games share character models, enemy designs, and sound effects, each one clearly adhering to strict internal branding guidelines. And yet if you finish Super Mario Maker 2, developed by Nintendo’s Kyoto team, you’ll see the director’s name come first in the credits. But in Super Mario Odyssey, developed by the Tokyo team, it’s the 119th name in an alphabetical list.



These are two extreme examples, but the manner in which names are sorted is usually one of the more noticeable differences between any two sets of game credits. For games like Far Cry 5 and God of War, names are separated into disciplines, so programmers and artists get their own sections. Other games, like Battlefield V and Call of Duty: WWII, put every name on the development team into a giant alphabetical list. And if a game does go with the grouped-by-discipline style, which group goes in what order also changes massively from game to game.

Once you start comparing job titles, you can find even more differences. Star Wars: Battlefront 2 breaks down who worked on what to a very specific level of detail, noting which engineers handled vehicles and which handled the user interface. Ubisoft games tend to go into such granular detail that you can find who sang alto and who sang tenor on the soundtrack of Assassin’s Creed Odyssey. Other games, like Borderlands 3, have absolutely no details about who on the team did what. A “tester” at one company might be “quality assurance” at another. Similar minor title differences exist for almost every imaginable role. Across the globe, job titles can change completely; Japanese game studios use the English word “planner” to mean the same thing as a “designer” in America.


The credits sequence from Call of Duty: Modern Warfare 3. Screenshot: Activision (YouTube)

Even if a company establishes a style for its credits, that doesn’t mean it’ll stay that way. After Jason West and Vince Zampella were fired from Call of Duty: Modern Warfare studio Infinity Ward, leading to the resignations of many of its other employees, Sledgehammer Games was brought in to tag-team with what was left of Infinity Ward on the next game. When Modern Warfare 3 released in 2011, the credits were an alphabetically organized mishmash of names from both companies, with no information on who did what or even who worked where. (Coincidentally, one of the first names listed is a then-unknown Guy Beahm, now better known as Twitch megastar Dr Disrespect.) It’s a far cry from the detailed credits of Modern Warfare 2, and Infinity Ward only switched back to listing individual job titles this year with the reboot of the series.



The development studio Insomniac organizes names alphabetically, with a twist. For Marvel’s Spider-Man, names are sorted by last name from A to Z, but in Song of the Deep they’re sorted by first name, Z to A. By varying the order, Insomniac avoids the problem of someone with a name like Aaron Aaronson always being the first name people see. Credits might also list company pets, children born during development, and developers who have passed away; ever since Visual Concepts’ studio art director Alvin Cardona died in late 2012, every NBA 2K game has ended with a dedication to him.

“Studios are more or less free to do whatever they want, with no consequences if they choose to ignore the standard.”

If you’re the type of person who likes to sit through the credits of movies, you may have noticed that film credit sequences are much more standardized, both in terms of their structures and the naming of different positions. Why is it that no matter the movie, you can always find the Best Boy and the Key Grip? Put simply, it’s because the people who make movies are unionized, and unions have rigid rules about how credits work.


The video game industry, mostly non-unionized, has no such standards. “Having accurate, verifiable credits isn’t part of the certification process for Apple or Steam or Nintendo or anywhere else, so studios are more or less free to do whatever they want, with no consequences if they choose to ignore the standard,” said game designer Ian Schreiber in an email.

Schreiber is a member of the International Game Developers Association and a representative of the organization’s credits special interest group, which is trying to create a standard for game credits. The credits SIG put out a guide in 2014, before Schreiber joined, but it’s hard to figure out which companies, if any, have adopted its policies. There’s no clear record of which studios take the guide seriously, and from looking at credits, it seems common for game studios to simply do whatever they want.

Even figuring out whose names go into the credits can be complicated. Perhaps a member of a development studio did only a little work on one of its games before moving to another project, or perhaps another left during a game’s development. Without rules, this often leads to undercrediting. It’s common to see developers talk about the frustration of not being credited on a game they worked on, or of being relegated to the “special thanks” section. This kind of undercrediting can cause massive problems, with developers lost to history as years of work are erased or misattributed.

Many studios have policies in place to only credit developers who were still employed by the studio when the game released. Over a thousand people who worked on Red Dead Redemption 2 didn’t make the official credits because of this kind of policy. Rockstar told Kotaku last year that it uses the game’s credits this way because it wants employees to stick it out and “get to the finish line,” essentially punishing those who leave the studio’s culture of crunch. The only confirmation we have that these people worked on the game is a page on Rockstar’s website with no details about what they actually did. Even this vague acknowledgement is better than the policies of many studios, which give some developers no official credit at all.


“It feels intentionally destructive,” said one developer who didn’t make it into the credits of a major triple-A game on which they worked for over two years. “It feels very very strange to not get credit for all the hard work—doing things on crazy deadlines, and sending things overseas and working with other teams,” they said. “It really sucks. It burns pretty bad.”

Even if someone is credited, that doesn’t guarantee they’ll stay in the credits for future releases. XSEED first published the Trails of Cold Steel role-playing games for the PlayStation 3 and PS Vita in 2015, crediting those who localized its text, like Brittany Avery and Thomas Lipschultz. But when it released ports of the games on PlayStation 4 this year, Avery and Lipschultz, both of whom had since left the company, had their names removed even as the words they wrote years before were still used. XSEED said it had a policy similar to Rockstar’s of not crediting anyone who does not currently work at the company. When the Crash Bandicoot trilogy was remastered for modern consoles in 2017, none of the original team at Naughty Dog who worked on the PlayStation 1 games had their names in the credits. This policy has seemingly changed since then, with later remakes published by Activision including the names of the original developers.

Then there are the developers whose work is almost always done in anonymity. Games take a lot of work, and each studio has limited time and people. Because of this, almost every major release uses outsourced labor to some degree. There are entire game studios, like XPEC and Virtuos, that exist not to make their own games but to help create assets for others. The oldest and most well-known of these studios is Tose, a Kyoto-based firm that has been doing anonymous game development since 1979. Tose has worked on (if not fully developed) thousands of games over the last 40 years, but until recently its name almost never appeared in the credits of the games it helped make.

Over the last decade, it’s become a lot more common to see the names of these support studios in credits. Games like Forza Horizon 4 and Monster Hunter: World give seemingly complete credits for support studios, but like everything in the world of credits, there’s still a large amount of inconsistency. In The Legend of Zelda: Breath of the Wild, support studios are listed, but no developers are named and no information is given for what part of the game these studios worked on.


“A lot of what you do and future opportunities are based off credits,” said Joshua Minette, a former quality assurance employee at the localization and support company GTL Media, now called GameScribes. “Generally as a third-party vendor, you don’t get [credits]. I have a list of somewhere around maybe 60 mobile and console games that in some shape or form I worked on. My name’s nowhere in the credits and I don’t believe our company name is anywhere in the credits.”

The open-world crafting game Terraria was an exception for Minette. After working on several different versions of it, he was offered a chance to be credited. It was a nice feeling, but it didn’t last. Minette left the company in 2016, and after the mobile version of Terraria was rebuilt, his name was removed completely. Minette only found this out earlier this year when he loaded up the game and couldn’t find his name anymore. “It’s a little bit heartbreaking,” he said. “This is bizarre, this is saddening, this hurts.”

Leaving developers out of credits doesn’t just hurt the people excluded. It can also make hiring processes harder. Schreiber, of IGDA’s credits SIG, said in an email that this process “leaves the door open to applicants falsely claiming credits on games they didn’t work on, and hoping no one will notice—and yes, any hiring manager will tell you that people do try this sometimes.”

Credits, or the lack thereof, have been a consistent problem since the earliest days of the video game industry. In the late 1970s, Atari was the dominant player in the console world, but didn’t want its developers’ names to be known, afraid that other companies would hire away the best talent. Employee Warren Robinett famously found a workaround, hiding the message “Created By Warren Robinett” in his Atari 2600 game Adventure. After Robinett left and Atari found out about the Easter egg, some at Atari tried to remove his name from the game, but ended up leaving it in.


The lack of credits, alongside other issues like poor pay and poor treatment, led many of Atari’s biggest developers to quit and form their own company, the first third-party video game publisher: Activision. Games released by Activision proudly put the names of designers like David Crane and Bob Whitehead everywhere from the box to the manual and even the cartridge label itself. Electronic Arts launched with a similar mission statement, building an early advertising campaign around the people who made its games and putting prominent credits on the front of the box. Credits quickly became the standard, even if they had no standards themselves.

The entire credits sequence from the original Legend of Zelda. Screenshot: Nintendo (VGMuseum)

In Japan, developers went longer without being named. During the 8-bit era, most big Japanese releases had no credits at all, and when they did, the names therein were often pseudonyms. For The Legend of Zelda, director Shigeru Miyamoto and composer Koji Kondo were billed as S. Miyahon and Konchan, respectively. Mega Man character designer Keiji Inafune was listed as Inafking even through the 16-bit era. Because of this, many of the developers of classic and historic games are still unknown today. The Castlevania series is one of the most influential of all time, but do you know who directed the first three games? The answer is probably Hitoshi Akamatsu—but after over 30 years, no one has ever been able to talk to him to 100 percent confirm that he led the games.



Over 10 years ago, Leigh Alexander wrote a similar story about credits for Kotaku, and little has changed since then. Too many publishers—even ones that were founded on the principle of properly crediting developers for their work—still use their games’ credits not as a historical record of the people who created the game, but as a reward and punishment system to keep people from quitting the company.

As a technical problem, undercrediting isn’t that hard to fix. Companies could come together and establish standard rules, or publicly adopt and improve the guide IGDA’s credits SIG put out five years ago. Employees can push for stronger policies, but they’re unlikely to make any headway without being able to bargain collectively. Looking at what happened with credits in the film industry, it’s clear the problem will only be completely solved if game development studios unionize and start setting rules that publishers must follow.


At their best, credits can be amazing to watch. They can show off the style of the studio while being entertaining and providing important historical information. It’s a shame, then, that major games still ship with such poor credits. Maybe in 10 years, when someone’s writing another Kotaku article, it won’t just be about the same problems.

Forest Lassman is a writer from Kansas. His favorite game of the 2010s is Pac-Man Championship Edition DX. Or one of the Trails of Cold Steel/Trails in the Sky games.