Why frame rate and resolution matter


Fritz

I couldn't figure out where to post this since it concerns all platforms so I figured why not start a thread. It's a long read but for those interested, here you go...

SOURCE - Polygon

Why frame rate and resolution matter: A graphics primer

By Samit Sarkar on Jun 05, 2014 at 10:00 a.m.

Graphics has always been the foremost battleground in the console wars. The participants in those never-ending debates try to bring in objectivity by quoting numbers — and in the case of visual prowess, resolution and frame rate are the two figures most commonly cited.

But what do they even mean? What's the difference between 720p and 1080p, or between 30 frames per second and 60 frames per second — and is it an academic distinction, or a meaningful one? In other words, why should you care?

What are resolution and frame rate, anyway?

Let's begin with some basic definitions, for the uninitiated.
Frame rate

Standard video, such as film or broadcast television, consists of still images that are captured consecutively and played back in quick succession. A "frame" is a single one of those images, and the "frame rate" is a measure of frequency: how often the video is updated with a new frame. For video games, that translates to the frequency at which the game redraws the image on the screen. Frame rate is measured in frames per second (fps).

Refreshing the frame must happen very quickly in order for the player to perceive motion. How quickly, you ask? Movies are traditionally filmed and exhibited at 24 fps. Television in North America is broadcast at 30 fps or 60 fps. Developers we spoke with for this article indicated that 30 fps is a general baseline that game makers don't want to fall short of — below that threshold, things start to look choppy, like the audio from a skipping record.

A game's frame rate is separate from the screen it's being displayed on. Displays have their own frequency: the "refresh rate," or how often the device (such as a TV or monitor) refreshes its screen; this is measured in hertz (Hz), where 1 Hz is one cycle per second. Most modern TVs and monitors have a refresh rate of 60 Hz, so the optimal situation is for an image source (like a game console or Blu-ray player) to come in with a frame rate that evenly divides into 60. Think about it this way: A standard TV refreshing at 60 Hz would go through all 60 frames of a 60 fps feed in a single second — one frame every one-sixtieth of a second. The same TV would show each of the 30 frames in a 30 fps feed twice, one new frame every one-thirtieth of a second.
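
To put rough numbers on that, here is a small back-of-the-envelope sketch in Python (mine, not part of the article): a frame rate that divides evenly into 60 maps to a whole number of refresh cycles per frame, while anything else forces uneven frame pacing.

# Rough sketch (not from the article): refresh cycles per frame on a 60 Hz display.
# Frame rates that divide evenly into 60 give every frame the same number of
# refresh cycles; others (45 fps here, as an example) force uneven pacing.
REFRESH_HZ = 60

for fps in (60, 30, 45):
    cycles_per_frame = REFRESH_HZ / fps
    pacing = "even" if cycles_per_frame.is_integer() else "uneven (judder)"
    print(f"{fps:>2} fps -> {cycles_per_frame:.2f} refresh cycles per frame, {pacing}")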

Thus, games run into problems on most displays when they're not running at 30 or 60 frames per second. If a game ignores a screen's refresh rate and runs at whatever frame rate it can manage — a state known to PC gamers as "v-sync off" — it causes screen tearing (parts of multiple frames are visible at once). If you turn v-sync (vertical synchronization) on, which caps the frame rate at the display's refresh rate, it can cause stuttering and input lag whenever the frame rate falls below that refresh rate.
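
A simplified model of that stutter, assuming a double-buffered 60 Hz display (my own illustration, not the article's): with v-sync on, a finished frame can only be shown at the next refresh, so a frame that takes even slightly longer than 16.67 ms ends up being held on screen until the 33.33 ms mark.

import math

# Simplified model (double-buffered 60 Hz display assumed): a frame is shown
# at the first refresh after it finishes rendering, so render times are
# effectively rounded up to the next multiple of 16.67 ms.
REFRESH_MS = 1000 / 60

def displayed_frame_time(render_ms: float) -> float:
    """How long the frame actually occupies the screen with v-sync on."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (15.0, 17.0, 34.0):
    print(f"render {render_ms:.1f} ms -> shown for {displayed_frame_time(render_ms):.2f} ms")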

Resolution

The size of an image is known as its "resolution." Modern widescreen displays feature an aspect ratio of 16:9 (the width is 1.78 times the height), and resolution is literally a measurement of the width and height, in pixels, of an image. The minimum resolution that qualifies as "high definition" is 1280x720 ("720p" for short), while the higher-fidelity standard of most new HDTVs is 1920x1080 ("1080p"). Just as with megapixels in a digital camera, a higher pixel count — higher resolution — provides more detail in an image.

The "p" stands for "progressive," shorthand for "progressive scan," a technique in which a display draws every line of a single frame sequentially, from top to bottom, within the space of a single refresh cycle (one-sixtieth of a second). This is different from the previous method, "interlaced," where TVs would alternate drawing odd- and even-numbered lines of the image every one-thirtieth of a second. Progressive scan makes for a smoother image that's less susceptible to the flickering that interlaced video suffers from.

A 1080p image contains 2.25 times as many pixels as a 720p image. So it's notably tougher for a game to generate a 1080p image than a 720p image. The PlayStation 4, Wii U and Xbox One are all capable of outputting games in 1080p. Sony and Microsoft have actually been touting the availability of 1080p gaming since the start of the previous console generation; the PlayStation 3 and Xbox 360 could also do 1080p. But we're very early in the life cycle of the PS4 and Xbox One, and at this point, games that run in native 1080p are relatively rare — especially on the Xbox One, although that's an argument we'll skirt here.

Instead, some games are rendered at a sub-1080p resolution in order to maintain the visual fidelity of the game; the console then upscales the image to 1080p before it sends the picture to the TV. For example, the PS4 version of Watch Dogs runs at 900p (1600x900), while the Xbox One version runs at 792p (1408x792). Developers make those decisions depending on the game and console in question.
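
For the resolutions mentioned above, the raw pixel counts are simple arithmetic (width times height; the figures below are my own check, not the article's):

# Pixel counts for the resolutions mentioned above. 1080p has 2.25x the
# pixels of 720p; the Watch Dogs render targets fall in between.
resolutions = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),   # Watch Dogs on Xbox One
    "900p":  (1600, 900),   # Watch Dogs on PS4
    "1080p": (1920, 1080),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0%} of 1080p)")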

How do developers prioritize frame rate and resolution?

Naughty Dog's Cort Stratton is a senior programmer on Sony's ICE team, which develops graphics technology that is shared across Sony Computer Entertainment's first-party studios and with third-party developers. According to Stratton, frame rate and resolution are related, but it isn't as simple as them being inversely proportional.

Generally, resolution is entirely under the purview of the GPU. Stratton provided this simplified explanation: "The CPU sends the GPU a list of objects to draw and a resolution at which to draw them; the GPU hunkers down and runs the dizzyingly complex calculations to figure out the appropriate color at each pixel." For example, doubling the resolution wouldn't affect CPU performance, but it would require the GPU to pump out four times as many pixels.
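
Stratton's "four times as many pixels" is just geometry: doubling both the width and the height of the render target quadruples the number of pixels the GPU has to shade. A trivial check (the 960x540 base resolution is only a placeholder):

# Doubling both dimensions of the render target quadruples the pixel count.
# The base resolution here is just a placeholder for illustration.
base_w, base_h = 960, 540
ratio = (base_w * 2) * (base_h * 2) / (base_w * base_h)
print(ratio)   # -> 4.0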

"While it's true that rendering at higher resolutions is more work for the GPU, this only affects the overall frame rate if rendering is the main performance bottleneck. It's often possible to increase resolution (to a point!) without affecting the frame rate," Stratton explained. "So, it's certainly not as simple as a dial that developers can turn, with 'silky-smooth [first-person shooter]' at one end and 'jaggy-less 4K resolution' at the other."

Peter Thoman, who is known for making a Dark Souls mod that vastly improved the graphics in the game's Windows PC port, agrees.

"While increasing the resolution only increases GPU load, increasing the frame rate also increases the CPU load significantly," Thoman told Polygon. "So in cases which are CPU-limited (or limited on the GPU by some very specific factors), you might be able to increase resolution while not affecting (or only slightly affecting) frame rates."

While the impact of resolution on performance depends on a variety of factors, there's a concrete measure of the difference between frame rates. The "frame time" is the time it takes to execute a single frame, and is generally expressed in milliseconds. At 30 fps, developers have one-thirtieth of a second, or 33.33 milliseconds, to render each frame. Doubling the frame rate to 60 fps cuts the frame time in half to one-sixtieth of a second, or 16.67 milliseconds. It takes time to render everything on the screen — objects, particle effects like explosions, visual effects like antialiasing and more — so whatever the target frame rate is, the total rendering time can't exceed the corresponding frame time.
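
The budgets quoted above are just the reciprocal of the target frame rate, expressed in milliseconds:

# Frame-time budget: 1000 ms divided by the target frame rate. Everything
# rendered in a frame has to fit inside this window.
for target_fps in (30, 60):
    print(f"{target_fps} fps -> {1000 / target_fps:.2f} ms per frame")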

In Stratton's experience, developers decide on a frame rate target, not a resolution, and go from there. Simply making the game you want to make and seeing where the frame rate and resolution end up wouldn't be a good use of development resources. Although Stratton said he hasn't been personally involved in that decision-making process, he listed a few potential factors: hardware and engine technical limitations, a desire to keep up with competing games, a studio's history and the art department's creative vision.

"For all the games I've worked on, the frame rate target is the fixed point against which all other performance decisions are made," said Stratton. "If your target is 30 fps, then you cut whatever corners are necessary to hit 30 fps as consistently as possible — reduce the scene complexity, use cheaper/lower-quality algorithms or yes, render at a lower resolution." Stratton noted that he has heard of some developers choosing to drop from 60 fps to 30 fps rather than make cuts in visual quality. But usually, the frame rate target is "the line in the sand that doesn't get crossed."

Do developers care, and do they think others should?

When it comes to frame rate and resolution, Stratton told Polygon that his personal bare minimum is 720p at 30 fps, since "anything below those levels is actively unpleasant." However, he continued, "anything higher isn't necessarily a clear improvement." Asked to choose one or the other, he said he'd probably prefer a higher resolution than a higher frame rate, since he "[doesn't] personally place much value on frame rates above 30 fps."

This is a your-mileage-may-vary area of graphics because your personal preference is likely to depend on the kinds of games you play. The higher the frame rate, the lower the theoretical input lag (the delay, in milliseconds, between entering a command on a controller and seeing an action on the screen). According to an analysis of responsiveness originally published in Game Developer magazine, the real-world minimum input lag for a 60 fps console game is three-sixtieths of a second, or 50 milliseconds — a figure that is doubled to one-tenth of a second, or 100 milliseconds, for a 30 fps console game. That gap can make a difference in games that depend on very responsive controls.
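
Using the three-frame pipeline figure the article cites, the math behind those two numbers works out like this (a check of the quoted values, not a new measurement):

# Roughly three frames of pipeline latency between pressing a button and
# seeing the result on screen, per the figure cited above.
PIPELINE_FRAMES = 3

for fps in (60, 30):
    print(f"{fps} fps -> ~{PIPELINE_FRAMES * 1000 / fps:.0f} ms minimum input lag")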

"I acknowledge that ultra-low input latency has tangible benefits for certain genres of games (fighters, multiplayer shooters, etc.), but those aren't the games I tend to be playing," said Stratton.

"In most cases, I'd rather a team uses the extra 16 milliseconds per frame," he continued, referring to the difference in frame time between games at 60 fps and 30 fps, "to make the game world significantly more beautiful/fun/rich/complex/awesome, instead of wasting it all to make the game marginally more responsive to player input."

Thoman has a higher threshold for performance than Stratton's 30 fps, but his opinions are otherwise similar.

"What's more important to me depends on the genre. In anything but very reaction-based action games I can live with 45+ fps, and then try to increase [image quality] as much as possible while staying above that boundary," he said. "I'm generally not that hung up on frame rates above ~45 [fps]."

Even in the land of facts and figures, personal preferences play a major role in discussions about visual quality. Both Thoman and Stratton agreed that people argue over the importance of frame rate and resolution partly because they're trying to justify their investment in a particular console. But according to Thoman, these figures do matter beyond serving as ammunition in the console wars, and they should matter.

"I strongly believe that gamers should care about resolution and frame rate, because the former makes games look much better and the latter does that, and in addition makes them more playable," he said. "I'm always surprised when publishers claim that resolution differences don't matter — if that is the case, then why are most of them sending out screenshots rendered at 8K?"

While a higher-resolution image is ostensibly better than a lower-resolution one, it's more difficult to distinguish between two images when they're both in HD — especially considering how far most people sit from their TVs.

"In a side-by-side comparison, I don’t believe the average consumer can tell the difference between 720p and 1080p," said Barry Sandrew, PhD, a digital imaging expert who is the founder, chief technology officer and chief creative officer of Legend3D, a well-known stereoscopic 3D and visual effects firm. "If you get into the higher resolutions like 4K or 8K, there is an obvious difference, but the difference is best appreciated when sitting close to the screen and the TV is larger than 55 inches."

What does the future hold?

According to Stratton and Thoman, developers are able to wring more and more performance out of a particular hardware platform as they become increasingly familiar with it. In fact, Stratton explained, this is what keeps consoles relevant long after they've been outclassed by newer, more powerful computers.

"Uncharted: Drake's Fortune and The Last of Us were made by many of the same people, and run on the exact same hardware, yet [The Last of Us] looks significantly better than [Uncharted: Drake's Fortune]," said Stratton. "It's worth pointing out that the PlayStation 3's hardware is nearly 10 years old. Go look at PC games from 10 years ago and compare them to the PS3 games being released today. That's what years of experience with a fixed hardware platform gets you."

Stratton also noted that he didn't fully believe this was true before he was working with console hardware for a living. As a student, he figured that developers under-utilized a console's power early on in its life cycle in order to "give themselves some headroom to improve in later titles." But an experienced developer who visited his school disabused Stratton of that notion.

The individual explained that every game uses 100 percent of the system resources that the developers can access. "We can't add anything to the sequel without first improving our existing tech to make room for it," Stratton recalled the developer saying.

"Game developers are constantly learning, constantly optimizing, constantly swapping tricks with each other to make better use of each generation of hardware," Stratton continued. "The same is true at the platform level; first-party core tech teams like mine are tirelessly improving the performance of the OS and core rendering libraries, and passing the improvements on to developers."

Six months into the life of the PS4 and Xbox One, there aren't many games on either console that reach the holy grail of 1080p60. But it's possible, perhaps even likely, that developers will get there over time.

This also applies to any differences in power that may exist between the consoles. Compare the current situation to the previous console generation. Because the Xbox 360 was easier for developers to work with, games on that system often ran better than their PS3 counterparts during the early years of the generation. But game makers eventually figured out the idiosyncrasies of the PS3 hardware, and were able to eliminate significant disparities in performance.

In mid-November, just before the launch of the new consoles, we spoke with a developer of one major multiplatform title who told us that although the PS4 may be ahead of the Xbox One at this point, the systems are similar enough that — given enough time with both — the performance gap is "completely temporary."

"While I can't go into specifics," said Stratton, whose work includes writing the official rendering API for the PS4, "please rest assured that we're nowhere near hitting the full potential of next-gen consoles yet."
 
Good article. Personally, I could never tell the difference between 30 and 60 fps gaming. The differences in milliseconds don't seem to matter too much. Although, I absolutely hate screen tearing, and I try to leave V-Sync enabled at all times in my PC games. So, it does seem that 30 fps should be achieved as the bare minimum if you have a monitor that refreshes at 60 Hz.

I'm really in need of a new graphics card to handle higher resolutions, though.
 
I'm sure you can tell the difference between 30 and 60. You might not pay much attention to it, but I'm sure you can see it. It's a pretty considerable difference.
 

Can you see the difference? :lol I haven't been able to, for the years that I've been playing games.
 
I haven't read your entire post in its entirety, Fritz, but it seems like a compelling read. I've always been an advocate for 60fps gameplay; it's not a necessity but rather a luxury. There are some games that truly play differently at 60fps. Fighting games like Tekken, however, were always 60fps and can't operate any lower due to the precise controller response needed.

Here is Uncharted 3 in 30 fps, which approximates what the retail version of the game felt like:



Here is the same gameplay, kept by Naughty Dog as an internal development video, running at 60fps. Remember to toggle the vid to HD and hit fullscreen to get the effect; it's the first video.

https://www.eurogamer.net/articles/digitalfoundry-what-if-uncharted-3-ran-at-60-fps

You should CLEARLY be able to tell the difference between the two.
 
Can you see the difference? :lol I haven't been able to, for the years that I've been playing games.

Yeah, it's pretty noticeable when you go from a game running at 30, like Infamous Second Son, or even BF4 (which is occasionally at 60 but often dips), to something like Titanfall or Wolfenstein that have a locked 60. There's a smoothness and natural quality to it that's hard to explain, but once you notice it, it's missed in games playing at lower fps.
 
When it's competitive multiplayer, 60fps or more is mandatory.

When it's a story-driven game with slower-paced gameplay, 30fps is the best way to go, since it gives the feel of watching a movie, and having it sped up to 60 or more makes it feel like a cheap soap opera.

That's why I'm excited for The Order.
 

I used to feel the same, but playing MGS GZ on PS4 changed my mind. The game really looks incredible running at 60 fps.
 
Do the motion enhancers on HDTVs improve gameplay?

Movements are certainly smoother when they're enabled, so are they ideal for gaming? Some say it looks like a soap opera, but some TVs have custom settings to adjust it to the person's liking.
 

No. I'd recommend turning that off when gaming. Motion enhancement simply has the TV guess (poorly) what the frames between frames would look like, which is why it looks odd. If anything, it may hurt your gaming performance.
 
Depending on your TV it can cause a big input delay too.

What part does monitor response time play in frame rate? I know lower is better, but is 5ms good enough for FPS gaming?
 
Do the motion enhancers on HDTVs improve gameplay?

Movements are certainly smoother when they're enabled, so are they ideal for gaming? Some say it looks like a soap opera, but some TVs have custom settings to adjust it to the person's liking.

No. Generally, all those artificial enhancement features on modern TVs only deviate from calibration standards. I'd recommend turning everything off, minus sharpness, which should be kept at a bare minimum.

Depending on your TV it can cause a big input delay too.

What part does monitor response time play in frame rate? I know lower is better, but is 5ms good enough for FPS gaming?

Yes, it's good enough.
 
I should have said, is 2 or 1 ms noticeably better than 5?

Yes, 2 ms and 1 ms response times are definitely better, but nothing that dramatic. I had a Samsung monitor at 5 ms years ago and it was a HUGE upgrade over my old display, which had a response time of 10 ms, if I recall. When I bought my new LG, which had a 2 ms response time, it didn't make enough of a difference to write home about.

That being said, proper calibration, such as a 6500K color temperature, is a FAR more important aspect of the picture regardless of whether you're gaming or watching movies.
 