It's safe to say that, now more than ever, we're down in the muck talking about video game resolutions and frames per second on an essentially weekly basis. Whether it's producing headlines on the subject ourselves or answering questions about it on Twitter, gamers are exceptionally concerned with the quantifiable visuals in gaming right now.

It's been colloquially referred to as "Resolutiongate," and the term typically covers games playable on either the PlayStation 4 or Xbox One. The PC and Wii U get tossed into the fray every now and then, but this debate centers largely on the newest generation of machines from Microsoft and Sony.

Generally speaking, the PlayStation 4 has been able to "outperform" the Xbox One on multiplatform titles so far in this generation. The standard for the system seems to be 1080p at 60FPS. The Xbox One? Those numbers get a little fuzzier, but we've often seen games check in at the odd 900-something-p and 30FPS.

Does it matter, though?

We suppose that's the question a lot of gamers have been tossing around as the performance numbers for these new consoles roll in. Do the resolution and FPS marks for games matter?

Joey Davidson

I'd argue that they do, in fact.

I'm not sure where you stand on this, Eric, but I think that these point-blank statistics are often used when it comes time to argue about new video games because of their clear and tangible differences.

I'm not suggesting that one console is better than the other right now, so please don't infer that from what I'm about to say.

For someone on Twitter, in the comments below these posts and on message boards like NeoGAF, it's very easy to say that the PlayStation 4 is more powerful than the Xbox One because it delivers Tomb Raider at a higher frame rate.


Eric Frederiksen

Numbers don't lie, right? And bigger numbers are better, unless we're playing golf.


I love golf.


I know, right? But I think it's not as simple and clear as bigger numbers equaling a better gaming experience. Resolution and frame rate absolutely do matter, but they don't matter the same way in every game, and in many games they don't matter at all. Much of the time, they really only matter insofar as they establish that one game, console, video card, etc. is mathematically superior to the others.


So, they matter in the world of sheer empirical evidence, right? If I want distinct proof that my video game majigger is better than your video game majigger, I can slide the resolution and FPS numbers in front of you for scientific evidence.

At which point you'll just shut up, right? That will be the end of it.

But here's the thing, and I think a lot of gamers will take issue with me for saying this… I don't care about these numbers. I don't.

If the game looks nice and doesn't stutter while I play it, I don't care about the precise pixel count or frame rate. Does that make me a bad critic?


If they don't affect your experience, then I don't see why you should care about them. Since you mentioned it specifically: I played Tomb Raider on both the Xbox 360 and Xbox One. Loved it twice, got 100% twice. It looked great the first time, even better the second. I didn't feel the lower frame rate affected my enjoyment in the least.

Where these numbers matter is when the gameplay is affected. For lots of games, these things won't affect anything at all. I'd challenge you to find me an adventure game hurt by either of those statistics, for example.

But when you boil it down, playing a video game always means interacting with a computer, and anytime we're asked to be precise or fast, those numbers start to matter. Here are two examples:


In Forza Motorsport 5, Turn 10's racing series, frame rate is incredibly important. The game polls the controller for information about what you're doing at each frame, and determines how you interact with the game at each of those moments, in addition to figuring out the physics of the rubber on the road. Racing requires both fast reaction and sharp precision, and an uneven frame rate would be a huge problem, even if it were only jumping between the acceptable 30FPS and the optimal 60FPS.
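The math here is simple enough to sketch. If a game reads the controller once per rendered frame (as described above; we're not working from Turn 10's actual engine code), the frame rate directly caps how often your inputs are sampled:

```python
# Back-of-the-envelope sketch (our assumption, not real engine code):
# a game that polls input once per frame samples your controller at an
# interval set entirely by the frame rate.

def input_sample_interval_ms(fps: float) -> float:
    """Time between controller polls when input is read once per frame."""
    return 1000.0 / fps

# At 60FPS the game sees your stick position roughly every 16.7ms;
# at 30FPS only every 33.3ms -- double the worst-case input delay.
print(round(input_sample_interval_ms(60), 1))  # 16.7
print(round(input_sample_interval_ms(30), 1))  # 33.3
```

And an uneven frame rate means that interval keeps changing under your thumbs, which is exactly why a wobble between 30 and 60 would feel worse in a racer than a steady 30.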

The second example comes from a piece I read in one of Eurogamer's Digital Foundry columns, which any graphics nerd should be checking out.


Yes! Seconded, those features are brilliant.


Exactly, it's virtually required reading if this stuff matters to you.

The second example is Metal Gear Solid V: Ground Zeroes. When Digital Foundry compared the Xbox One and PlayStation 4 versions of the game, it found that sniping was ever so slightly more difficult on Xbox One as a result of its 720p image being upscaled to a 1080p display.
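It's worth spelling out why upscaling hurts precision aiming in particular. This is our own back-of-the-envelope arithmetic, not Digital Foundry's data, but the scale factors fall straight out of the resolutions:

```python
# Rough sketch (our numbers, not Digital Foundry's measurements):
# stretching a 1280x720 image onto a 1920x1080 display is a 1.5x
# non-integer scale, so fine detail can't map cleanly to whole pixels.

def scale_factor(src: int, dst: int) -> float:
    return dst / src

horizontal = scale_factor(1280, 1920)
vertical = scale_factor(720, 1080)

# A one-pixel-wide target in the 720p framebuffer spans 1.5 display
# pixels, so the scaler has to blend it with its neighbors -- softening
# exactly the kind of tiny, distant detail you line up a sniper shot on.
print(horizontal, vertical)  # 1.5 1.5
```

A native 1080p image skips that blending entirely, which is the PlayStation 4 version's small edge in that comparison.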


Right, yeah, those two examples pretty much serve my point. They're both interesting, and they both relate actual display to gameplay mechanics.

If you told me that, and I'm just pulling a game out randomly, BioShock Infinite was unplayable on the Xbox 360 but perfect on the PlayStation 3 thanks to frame rate dips, I'd play it on the PS3. That, to me, is when these statistics matter.

But, when it comes to Tomb Raider? Telling me that the Xbox One's version locks at 30FPS while the PlayStation 4's checks in at 60FPS is a non-factor. They're both stable, they both look good and my purchase choice will come down to things like multiplayer, friends who will be playing and even controller preference. In fact, whichever console I currently internally favor for a whole list of reasons will likely be my choice.

There's another argument floating around out there, and that's the notion of cinematic look. Cinemas show movies, at least in the days of physical film, at 24FPS. Some folks have suggested that 30FPS is better for cinematic games because it comes closer to the norm for film.

Well, Eric, true or false?


If it's a rock steady, locked frame rate that doesn't waver in the slightest, I could see that argument holding some water, and I can definitely see it mattering more in cut scenes than in actual gameplay. I acknowledge the logic of the train of thought, but I don't think there's a lot of evidence to support it.


With that said, I think if it does matter, it's going to depend on what the game is going for. If we're talking about Metal Gear Solid V levels of photorealism, then I think 60FPS is a good mark to shoot for. When it comes to a game whose visuals are more about the art direction than about the pores on a character's face, 30FPS is going to be fine.

You majored in film, right? So what do you think?


I think my parents never liked the idea of their son majoring in English and Film.

About this? I think there's a distinct look to 24FPS when compared to 30. 24 has a slight blur.

What people who make this argument aren't accounting for, though, is the flicker between frames in old cinemas. With analog film, the projector shows a frame of film and then a moment of black, over and over, as the shutter closes while the next frame is pulled into place. Aside from being bigger and louder, movies shown in theaters had that distinct flicker, and it gave them a very different feel from, say, your television at home.

If you want your games to look like old-fashioned films, add a flicker. That's the sound and look of the shutter opening and closing.

So, no, I don't buy the argument that 30FPS looks more cinematic. Even modern cinema doesn't look "cinematic" anymore, thanks to digital projectors. That flicker is all but dead. I think the term "cinematic" is thrown around regarding budget, characters, camera angles and shot composition. Those are all cinematic things that games can share with film.

Frame rates? Irrelevant.


So it sounds like we're mostly in agreement here. Frame rates and resolution kind of matter some of the time. That sound about right?


In the most non-committal way possible, yes.

It's like this for me: resolution and FPS are great tools for discussing the core graphical strength of the PlayStation 4 and the Xbox One. If how a game looks is what dictates your console purchases, these numbers are invaluable.

For me? I want all the games (a life choice my wife barely supports), so these numbers have very little effect on my console choice. If, however, they show me that a game is playable on the PlayStation 4 but terrible on the Xbox One, consider my platform choice for that game informed.

These numbers matter. They also matter right now because we're still trying to figure out exactly how good today's consoles are.


And the answer to that question – even with these numbers – is still way up in the air. The metaphorical ball hasn't even started to descend for the tip-off. Just last week, Microsoft announced DirectX 12, and the Xbox One apparently just received a development kit update that led one developer to say that, where they'd previously worried about getting their game to run at 1080p, they can now do so comfortably.

While we gripe about these consoles being unfinished, it's also cool to see that developers and manufacturers have the flexibility to make something better if it's busted. Companies have rushed products to market before, and in previous eras, Microsoft would've been out of luck like so many others. Now, though, they can make what are apparently substantial improvements to the development process and gamer experience.

Which console is better? They're both really great, and they're both still changing in substantial, meaningful ways. It might bear out that the PlayStation 4 really is a markedly better system over the years, but I think a more likely scenario is near-parity like we see with the Xbox 360 and PlayStation 3.


Right, and here's the only piece of evidence I can give regarding the long term lives of these consoles: the games we played at the onset of the 360 and PS3's cycles don't look nearly as good as the games that came out towards the end.

I bet that repeats with these new machines, and I love that idea.


Exactly. One of the real, tangible advantages console gaming has, independent of platform and regardless of what the PC Master Race might say, is watching developers get more and more comfortable with the hardware. The strengths and weaknesses of the platforms come into sharper focus over time, and the people making games get better at exploiting them. I don't see that changing.

Which is where we are right now.

We know that not every gamer will agree with what we've offered up here, but we'd like to know what you think. Do you care about these numbers? Is Resolutiongate a thing that matters to you?