Last week I talked about what sort of hardware changes the industry went through between 2009 and today. This week I want to go back and look at how this changing hardware has impacted our games. As before, I’ll be using PC games for a lot of my comparisons. It’s easier to see technological changes on a year-by-year basis on PCs, as opposed to console platforms which only make the leap to new technology once there’s been enough progress to warrant a fresh generation of machines.
The most obvious thing is that the overall visual complexity of our games hasn’t changed much. Sure, textures have gotten a little sharper and we can see more fine details. Game spaces have more clutter objects, so they feel more lived in. Draw distances are better, so we can see further. Faces have gotten slightly more natural and look a bit less like stiff plastic puppets. But these changes are more incremental than revolutionary. Compare footage of Assassin’s Creed II with footage of Assassin’s Creed Odyssey. I agree that the latest game looks better, but that change is nothing compared to the leaps we experienced in the past.
How Does it Look?
Nine years is a long time in terms of rendering technology. It took just eight years to get from 1992’s Wolfenstein 3D to the release of Deus Ex in 2000. In that time period, the world of video game graphics went through numerous stages of evolution. The blocky, flat-lit levels of Wolfenstein 3D gave way to the twisting dark mazes of Doom, which gave way to the complex three-dimensional spaces of Quake, which evolved into the garishly colored techno-industrial spaces of Unreal Tournament and Quake III Arena, which led to our crude early attempts at photorealism with games like Hitman and Counter-Strike.
In 1993, I went to Walmart because I’d heard a rumor that one of the employees had slipped the Doom demo onto one of the display machines. My computer at home couldn’t run the game, and I needed to see it for myself. It wasn’t even available to buy at Walmart. This wasn’t a shopping trip, it was a technology pilgrimage. I was going to see Doom for the same reason you might take a trip to see the new Tesla Model 3 or one of those scary robots from Boston Dynamics. Doom represented a massive leap forward in technology. While everyone was smart enough to realize that graphics were going to keep getting better, we were still caught off guard by how quickly the future had arrived.
The subsequent generations were similarly amazing. Each new game was greeted by people looking at the demo and saying, “I didn’t realize this was possible.” The new graphics continued to astound us, even when we were already expecting to be astounded. Sadly, this process couldn’t last. Eventually diminishing returns kicked in and the gains began to level off. Computers were still getting faster and programmers were still inventing brilliant new rendering tricks, but those tricks were having a smaller impact on the final appearance of the game. Different people will identify different points as the moment when the gains began to fall off, but I like to tag 2004 as the year when graphics technology stabilized. That was the final moment of astonishment. The one-two punch of Half-Life 2 and Doom 3 brought together bump mapping, facial animations, dynamic lights with moving shadows, unified lighting, per-pixel specular calculations, and a half dozen other improvements. They might look dated today, but these 2004 games look more like modern games than they look like the games of 2000.
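If you’ve never written rendering code, “per-pixel specular calculations” just means evaluating the shiny-highlight part of the lighting equation at every pixel instead of once per vertex, which is what lets highlights glide smoothly across bumpy surfaces. Here’s a minimal C sketch of the classic Blinn-Phong specular term that 2004-era engines built on. The vector type and function names are my own for illustration, not code from either game:

    #include <math.h>

    /* Three-component vector. The lighting code below assumes its
       inputs are already normalized (unit length). */
    typedef struct { float x, y, z; } Vec3;

    static float dot3(Vec3 a, Vec3 b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    static Vec3 normalize3(Vec3 v) {
        float len = sqrtf(dot3(v, v));
        Vec3 r = { v.x / len, v.y / len, v.z / len };
        return r;
    }

    /* Blinn-Phong specular term, evaluated once per pixel.
       n: surface normal (possibly perturbed by a bump map),
       l: direction from the surface point toward the light,
       v: direction from the surface point toward the camera,
       shininess: exponent controlling how tight the highlight is. */
    float specular_term(Vec3 n, Vec3 l, Vec3 v, float shininess) {
        Vec3 h = normalize3((Vec3){ l.x + v.x, l.y + v.y, l.z + v.z });
        float n_dot_h = dot3(n, h);
        return n_dot_h > 0.0f ? powf(n_dot_h, shininess) : 0.0f;
    }

Run that for every pixel on screen, for every light touching it, and you can see why this stuff eats GPU cycles.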
Developers have had to fight hard for every visual gain made since that point. Processor clock speeds have gradually leveled off, and even when we do make faster devices it’s harder for that additional power to translate into tangible improvements.
The point I’m getting at here is that the difference between Assassin’s Creed II and Assassin’s Creed Odyssey is less pronounced than the difference between Wolfenstein 3D and Doom. In the last nine years, developers have put a lot of effort into making games more detailed, but the impact for the player has been small. We’ve made smaller strides than we made in the year and a half between Wolfenstein 3D and Doom. In fact, a time traveler from the 1990s might be forgiven for thinking that these two different Assassin’s Creed games were really just the same game engine with different graphics settings.
I should clarify that I’m not knocking the hard work that graphics programmers have been doing. Pushing the envelope on graphics requires a lot of dedicated effort from talented people with very specialized knowledge. The problem isn’t a lack of talent or labor, it’s that things already look so good that making further gains is very difficult. Shadow of the Tomb Raider came out this year and it does look better than the 2013 reboot. Lara’s face is slightly more detailed and the foliage is thicker than in the previous games. That’s nice, but these changes are pretty subtle. You don’t start up the game for the first time and feel the surprise of seeing a new thing that’s never been possible before. I don’t think I’ve had a moment like that since Grand Theft Auto V released back in 2013.
On the other hand, I think developers have made huge artistic strides since 2009. Back then, we were still stuck in the depths of the brown age, when games were desaturated and everything was the color of mud, blood, and concrete dust. Again, if we compare Assassin’s Creed II to Assassin’s Creed Odyssey, we can see the latter is more colorful and vibrant, and offers better overall contrast in any given scene. The perceived complexity of the scene might be roughly the same, but the newer game is much more eye-catching and can stand up to hours of play without becoming visually monotonous. Today’s developers are much smarter about using color grading and full-screen effects. While I’m not crazy about games turning themselves into movies, I have to admit their cinematic parts have greatly improved. Today’s overall framing, lighting, blocking, and set design are vastly improved compared to what they were in 2009.
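For the curious, color grading is usually a cheap full-screen pass: every pixel’s color gets pushed through a curve or a lookup table that shifts the mood of the entire frame at once. Below is a toy C version of the lift/gamma/gain adjustment that most grading tools are built around. The function name and this particular formula variant are mine; every engine bakes this math slightly differently, often into a 3D lookup texture:

    #include <math.h>

    /* One color channel graded with a simple lift/gamma/gain model.
       Inputs and outputs are in the 0..1 range.
       lift raises the black point, gain scales the white point,
       and gamma bends the midtones between them. */
    float grade_channel(float c, float lift, float gamma, float gain) {
        c = lift + c * (gain - lift);                 /* remap black/white points */
        c = c < 0.0f ? 0.0f : (c > 1.0f ? 1.0f : c);  /* clamp back to 0..1 */
        return powf(c, 1.0f / gamma);                 /* bend the midtones */
    }

Crank the gain, drop the lift, and nudge gamma above 1.0 and you get the punchy, high-contrast look of Odyssey; squash those same numbers together and you’re back in the mud.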
But How Does it Play?
Of course, evolution isn’t just about looks. Back in the 1990s, each generation was also a major leap forward in terms of gameplay. The games didn’t just look better, they offered new and different kinds of systems for players to engage with. Wolfenstein 3D was a simple arcade-style shooter, while Deus Ex had interactive computers, security systems, inventory items, characters, dialogue, cutscenes, destructible objects, rudimentary physics, dynamic lighting that impacted gameplay, complex AI behaviors, branching story choices, and countless other features. A few years after that, we got proper physics simulation, which became a major component of gameplay in Half-Life 2. Technology made the game more interesting to play in addition to making it more interesting to look at.
Since then, new gameplay mechanics have been much rarer. Consider recent titles like Far Cry 5, Marvel’s Spider-Man, WATCH_DOGS, The Witcher 3: Wild Hunt, Red Dead Redemption 2, Assassin’s Creed Odyssey, Dishonored 2, Grand Theft Auto V, and Batman: Arkham Knight. Those are all very different games. You’ve got superhero stories, historical adventure, westerns, and gritty crime drama, and yet you can describe all of them as “cinematic story-based open world combat with occasional stealth sections and a dash of skill points and crafting.” Sure, sometimes the “combat” part means you’re shooting a gun and other times it’s a bow or magic spells, but the games still have a familiar rhythm of combat encounters, stealth sections, and overworld travel. Yes, some games are more cinematic and some are less so. Some are truly open, and some just allow you to choose the order in which you explore the encounter areas. Even so, you can see the underlying design philosophy at work.
A lot of people might gripe that this shows a lack of innovation in the industry, but I think we’re doing fine in terms of new systems. The AAA scene seems to have settled into this vague omni-genre, but if you look at the indie scene you’ll see people are inventing new ideas faster than anyone can play them. From the crazy bullet-time shenanigans of Superhot to the playable black hole of Donut County, there are lots of ideas out there that are too unconventional to build a AAA game around.
Rather than blaming developers for a lack of invention, I think the explanation for homogenized gameplay goes back to the advances we’ve had in graphics. Those fancy photorealistic models and detailed environments cost a fortune to produce. It takes a lot of money to fill a game world with that level of content, which means that games need to be marketable to a broad group of people. The familiar blend of cutscenes, combat, stealth, crafting, and leveling isn’t anyone’s favorite framework, but it seems to be acceptable to almost everyone.
Overall, it’s been a good decade. I’m really curious what happens next. It will be even harder to make graphical gains in the coming years, so I wonder what the new focus will be. Higher resolution? 120 fps gaming? Will VR finally take off? I know we’re all happy with how non-Bethesda games look at the moment, but I also know that the world isn’t done trying to sell us new hardware. Sooner or later the big companies will need to figure out how to get us to buy a new PC or the next generation of console.