I remember it like it was yesterday. I was sitting on my friend’s couch with a huge lead; it was the final lap of the final race, and we were running for the championship in Mario Kart 64.

The course? Star Road.

I was miles ahead, but in an instant my kart flew off the track and I went from easily in first to fighting with no one but myself for last. To this day, everyone who was in the room with me at the time remembers my collapse as well as my description of that failure: “No Clutch.”

We called them the couch wars.

We played after school or on weekends. We’d play Goldeneye, or Mario Kart, or whatever we could get our hands on that we could all play together.

It could get loud, it would get rowdy, but it was always fun.

Today, however, it’s not nearly as easy to find games that offer local multiplayer support. In fact, with the notable exception of Nintendo, it’s almost gotten to the point on current-generation consoles where, unless you’re really into sports games, a second controller is a waste.

Many of today’s AAA titles offer multiplayer options: Dragon Age: Inquisition, Evolve, and Grand Theft Auto V, to name just a few. You just can’t play them in the same room as your friends; you need to play online.

How did this come to pass?

Rise of the Internets (or how we learned to talk smack to strangers on the web)

There are a number of theories as to the decline of local multiplayer modes in major releases. Mike Mika, designer of #IDARB for Xbox One, believes the industry’s move away from local multiplayer has more to do with the technical aspects of multiplayer gaming than anything else.

“[For example] on N64, we got four player Mario Kart [and] that worked pretty well. By this time, though, the horsepower of the N64 was getting taxed pretty hard. You effectively had to render a scene four times. It was painful…Games that supported local multiplayer, and especially 3D games, didn’t look as good or maintain a strong enough frame rate as single player games that used all the resources to drive a single display. Once network gaming took off it suddenly took the burden away from a single system. Now you had enough processing power to make the game look good and it could be multiplayer,” he said.

It’s a convincing argument. With every generation of console, developers are finding new and exciting ways to entertain us, which puts more and more strain on the processing power of the systems they’re developing for.

The simple fact is that developers want their games to stand out, to get good reviews, and to sell oodles of copies. Maximizing the output and look of their games is certainly not a bad way to do that. Fans expect a graphically superior gaming experience on their next-gen consoles. If developers can only bring that with online multiplayer, it’s understandable why they’d go that route.

It’s also been suggested that the fans themselves may have played a role in the trend away from local multiplayer modes.

Nick Madonna, founder of the PHL Collective and designer of the ridiculously addictive Clusterpuck 99 (a local multiplayer only game), weighed in and said he believes the community bears some responsibility for this shift. When asked what he felt was the biggest barrier for the development of local multiplayer modes for their games, he was unequivocal.

“Community backlash,” Madonna said. “I think local only is a choice the developer has to make and stand confidently by, even if the community is demanding online… if the developer has a design goal or reason for making a local multiplayer game, that’s his or her right and they don’t deserve to be blasted for it. I see this all the time on Steam forums and it’s not fair to the developer.”

Gamers of all stripes are certainly not afraid to voice their displeasure, particularly online. Developers invest countless hours into their games, and for most, it’s a labour of love. While it’s important to have a thick skin, things can obviously go too far sometimes, and developers deserve a little slack. Sometimes they feel the experience simply wouldn’t be as exciting online as it is locally, and that’s reason enough to keep it local only.

One interesting idea that’s been floated is that the gaming demographics have shifted considerably since the days of Atari and the NES, and that shift is what’s causing the move away from local multiplayer.

It goes something like this: the main consumers of the Atari 2600 and the NES were kids somewhere between the ages of 6 and 12. Since the time of those console releases, roughly 30-35 years have passed.

Now those consumers have grown up, may have families of their own, and have different responsibilities than they did when they were young. It’s not as easy to call over the neighbour kids to play video games when you need to take little Sarah to hockey practice and little Bobby to drama class all before you get home and make dinner.

Players want to hop online, play a game or two to decompress, then get back to their real lives.

Tommaso De Benetti, Community Manager for Housemarque, creators of Resogun, believes this demographic shift to be a contributing factor in the decline of local multiplayer.

“Because the average age of players is increasing, so meeting with friends is not always an option…[w]hen people start to have kids or work, and can only play late at night, having people over is not always an option.”

The simple truth is that it’s unlikely that there’s any one specific cause for the decline in local multiplayer offerings. Each developer I asked had a very different reason for this shift, which suggests that there are many contributing factors, as opposed to just a few.

It could be that all of the above (and perhaps others) are responsible in some way for this decline. Given that the major publishers are moving away from local multiplayer games, are the couch wars dead? The short answer is HECK NO. Indie developers have stepped up to fill this gap with fun, at times enraging, and fully immersive local multiplayer games.