5/25/2015

About the #downgrade thingy...

Last week there were two major gaming events in Poland. First was the launch of The Witcher 3: Wild Hunt on Tuesday. Second was Digital Dragons (the closest thing Poland has to a GDC) on Thursday and Friday. Lots of developers and media people attended, and there were many topics on everyone's tongues, but this is the one I actually debated during the afterparty, and I thought some of the facts were worth sharing.

Alright, so a lot of people, especially in the media, have been complaining about the graphics downgrade in The Witcher 3, and comparison compilations have been thrown back and forth to prove... something. Doesn't really matter. The downgrade quite obviously happened. But so what? It happens in most games.

What are these images trying to prove, with completely different lighting and scenes?

What many people don't understand is that for a developer, every project brings new challenges and new experiences. Nobody ever makes the same game twice, and very few sequels are made with a "let's do a copy with just a few tweaks here and there" mindset. When an AAA team is working on a game, they want it to be the best game they can make. And it almost always turns out to be very hard to achieve, because no matter how experienced they are, they're doing something that's never been done before. I'm sure that if you asked the developers of even the highest-praised titles (like Ocarina of Time with its 99 Metascore), they would tell you how much better the game could have been if they hadn't had to cut features or scale back the graphics.

I have experienced a downgrade of a game I was working on myself. To quote Tomek Gop from the talk we gave at Digital Dragons: the media demo from February 2014 was the best Lords of the Fallen has ever looked (more or less, don't hold me to the exact wording). And that's true. I remember exactly all the work we put into achieving that visual benchmark. I also remember all the reasons the rest of the game didn't live up to it.

Many people accused this screenshot of being overpainted,
but at that point Lords of the Fallen really looked like this.

Does that mean we lied in February 2014? That we deliberately misinformed the public? No. We worked hard towards achieving that level of graphics. We really believed the whole game would look this good (and it ended up looking not too shabby either, but that's not the point). Most probably, so did the guys at CD Projekt Red when they were pitching The Witcher 3 over the past years. So did From Software when they first showed Bloodborne.

So... why can the downgrade happen? The most common reason is hardware power. I know, yeah, consoles have fixed specs. Sounds like it's not rocket science. Well, it kinda is. There are dozens of parameters that affect what you can show to the player at the same time. It's a function of particles, triangles, pixels, streaming, POV, horizon and a whole bunch of memory management elements I can't even list. With often hundreds of people creating assets, it's virtually impossible to accurately predict how advanced the graphics can be. And in AAA everyone prefers to produce assets of higher quality, because it's easier and more efficient to cut down than to scale up.
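
To make that concrete, here's a toy frame-budget check in C++. This is purely my own illustration with made-up numbers (none of the costs below come from any real engine): the point is that the budget is one fixed number, while the costs competing for it are many and only roughly predictable.

    #include <cstdio>

    // Toy sketch: a frame budget at a target framerate, divided between
    // competing subsystems. All numbers are invented for illustration.
    int main() {
        const double targetFps = 30.0;
        const double frameBudgetMs = 1000.0 / targetFps; // ~33.3 ms per frame

        // Hypothetical per-frame cost estimates, in milliseconds.
        const double geometryMs  = 14.0; // triangles, draw calls, LODs
        const double lightingMs  = 9.0;  // shadows, post-processing
        const double particlesMs = 4.0;  // VFX
        const double streamingMs = 3.5;  // asset streaming overhead
        const double gameplayMs  = 5.0;  // AI, physics, scripts

        const double totalMs = geometryMs + lightingMs + particlesMs
                             + streamingMs + gameplayMs;

        std::printf("Budget: %.1f ms, estimate: %.1f ms\n", frameBudgetMs, totalMs);
        if (totalMs > frameBudgetMs) {
            // This is where the "downgrade" conversation starts:
            // something on the list has to get cheaper.
            std::printf("Over budget by %.1f ms.\n", totalMs - frameBudgetMs);
        }
        return 0;
    }

And every one of those estimates moves each time an artist adds an asset or a designer changes a scene, which is why nobody can lock them down two years in advance.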

You end up producing a vertical slice with these highest-quality assets. A small piece of the game that often has problems working properly on a console, so you take the PC with the best graphics card in the studio. Yes, the one worth your monthly salary. The one that nobody can afford yet. And it manages to run at 30-40 FPS, but you keep telling yourself it's alright, because it's not optimized yet. Because some of the LODs aren't loading properly yet. Because streaming isn't fully implemented yet. But you still believe it can all be crammed into the game, because you really want your game to look awesome.

And this is the build that you're showing to the media, explaining that it's a vertical slice or a work in progress or alpha or beta or whatever stage you're in, and you hope they will understand. What you mean to say is: this is how it looks now and we want it to look this good, because you wouldn't show something that looks like shit to the public, would you? But what media people seem to understand is: this is still an early version, so there's a lot of room for improvement and everything will look so much better on release! You can see some communication noise here, right? :)

It also matters that most of the recent games accused of a graphics downgrade were released on the newest generation of consoles. For the players, the consoles aren't new anymore. For developers, with a three-year development cycle, the consoles are very new. Many of the games released today were started before the specs of the new consoles were final, and way before the teams got their devkits.

It would be awesome if game development was as predictable as math.

And what about PC? Sure, a PC can be more powerful, but can your company spend the resources to make a completely different build for PC? Does it have a fanbase strong enough to wait for the PC version for up to a year and a half, like with GTA V? And lastly... how many PC players will actually own hardware strong enough to support your ultra-ultra settings? Optimizing for one setting level and the fixed specs of consoles is a bitch. Optimizing for 3 or 5 or more setting levels across every variant of a PC is a burning whorehouse. Most of the time the dev team doesn't have enough manpower to really handle that, and with the PC market being considerably smaller than the console markets, even the biggest companies won't invest enough manpower to do the PC version "right", unless it's a PC-centric game.
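
To illustrate the scale of that problem, here's another toy C++ sketch, again with numbers I invented, of how independent PC options multiply into a configuration matrix that someone has to tune and test:

    #include <cstdio>

    // Toy illustration: every independent PC option multiplies the number
    // of configurations that have to look good and run well. The counts
    // below are invented, but the multiplication is the real problem.
    int main() {
        const int qualityPresets = 5; // low, medium, high, very high, ultra
        const int resolutions    = 4; // 720p, 1080p, 1440p, 4K
        const int gpuVendors     = 2; // separate driver quirks per vendor
        const int vsyncModes     = 2; // on / off

        const int pcConfigurations = qualityPresets * resolutions
                                   * gpuVendors * vsyncModes;

        // A console is, roughly, a single row in this matrix.
        std::printf("Console targets to tune: 1\n");
        std::printf("PC configurations to worry about: %d\n", pcConfigurations);
        return 0;
    }

And that's before you count CPU tiers, RAM sizes and the long tail of driver versions.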

That brings us to another point - manpower and release dates. Sometimes the dev team just can't deliver. Again, there are lots of constraints here: availability of resources, deadlines, changing directions and a shitload of events or problems you couldn't possibly anticipate. A key person on your team might leave, and the rest struggle to cover for the loss. Or (sometimes even worse) someone else comes in their place with a completely different vision. Two or three years is a lot of time and a lot can happen. That's why the best game producers aren't the ones who can plan the whole three years of development from beginning to end, but the ones who can adjust the development to the current situation.

Reading all these accusations of ill will, where people compare two completely different screenshots to prove that they've been "lied to", makes me smile. And most of this butthurt comes from people who have never been in development, but feel that they are extremely close to it - the gaming journalists. And yes, of course this scenario is possible: a company deliberately showing everyone pretty candies and then shoving shit into the Blu-ray boxes. But is it likely? From what I've seen, I doubt it.

Gamedev is pretty unique. People want to make great games. Designers want systems that are fun for them. Writers want characters they like. Programmers want code that just flows flawlessly. Artists want visuals they can be proud of. And if for some reason that can't be delivered, they always aim for the second-best thing.