Game Development Stack Exchange is a question and answer site for professional and independent game developers. It's 100% free, no registration required.

It seems to me that it would be more logical, reusable and user-friendly to implement a flexible, responsive UI layout over a 3D or 2D scene, which could then run at any screen resolution.

Some modern games auto-detect screen resolution and adjust the game to that, but the option to change the resolution still remains in the settings.

Why is this option there?

My guess is that this is one way to let older machines improve performance by rendering fewer pixels and stretching the result across the screen, but surely there are better ways to improve performance (offering different texture and model quality settings, for example).

In addition to what's said here, I think some games also choose one of the lowest resolutions as being 'safe': nearly all hardware configurations will support it, so the game starts without crashing, and then the user can change the resolution in-game. –  MicroVirus yesterday
    
@MicroVirus I very much second this one; starting with a safe, likely-to-launch mode is a good plan on a highly variable platform. –  Vality 18 hours ago
"It seems to me that it would be more logical, reusable and user-friendly to implement flexible, responsive UI layout over a 3d or 2d screen" - It seems to you, but in practice it's often easier and cheaper to not do that. –  Superbest 17 hours ago
    
@Superbest: As a counterargument, World of Warcraft switched to a "fixed size logical UI"(ish). The UI is 768 "scaled pixels" high, and an aspect-appropriate width: wowwiki.com/UI_coordinates (usually 1024x768, 960x768, or 1229x768) –  Mooing Duck 14 hours ago
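A minimal sketch of the fixed-logical-height scheme described in this comment (the function names are my own; the only detail taken from the comment is the 768-pixel logical height):

```python
# Sketch of a "fixed logical height" UI coordinate system: the UI is
# always 768 logical pixels tall, and the logical width follows the
# screen's aspect ratio. Function names are illustrative, not WoW's API.

def logical_ui_size(screen_w, screen_h, logical_h=768):
    """Return (logical_w, logical_h) for a physical screen size."""
    logical_w = round(logical_h * screen_w / screen_h)
    return logical_w, logical_h

def to_screen(x, y, screen_w, screen_h, logical_h=768):
    """Map a point in logical UI coordinates to physical pixels."""
    scale = screen_h / logical_h
    return x * scale, y * scale
```

Because every UI element is laid out against the same 768-unit-high space, the layout code never has to know the physical resolution.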

7 Answers

So the user can choose the balance of quality versus performance. Some people would rather run higher graphics settings at a lower resolution, and others the opposite. Some computers can handle everything maxed out; some can't.

Notice that devices with identical hardware (PlayStation, Xbox, iPhone, etc.) usually don't offer graphics settings, because every user is running the same spec hardware.


There are already some good answers here, but to add one more: there is no completely reliable way to detect the correct screen resolution. The first approach is to simply not change it, leaving it at whatever desktop resolution the user runs. This is annoying: I know a number of people (some of whom have visual impairments) who run their desktop at a lower resolution to make things look larger, but still prefer games at the native resolution, where small text and fine details are rarer and less critical.

Secondly, you could look at the monitor's supported mode list, but this is also unreliable: some monitors do not provide it, some provide it incorrectly, and some report resolutions higher than native which they are able to down-sample. In the last case the game will waste resources and look awful; in the other two it may simply not work.
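To illustrate the defensive selection this implies, here is a hypothetical sketch (the mode list and desktop size would come from a platform API in practice; the helper names and the 1024x768 fallback are my own assumptions):

```python
# Hypothetical sketch of defensive resolution selection: prefer the
# desktop resolution if the (possibly unreliable) reported mode list
# contains it, otherwise fall back to a conservative "safe" mode.

SAFE_FALLBACK = (1024, 768)  # assumed widely supported, unlikely to crash

def pick_default_mode(reported_modes, desktop):
    modes = set(reported_modes)
    if desktop in modes:
        return desktop          # best guess: match the desktop
    if SAFE_FALLBACK in modes:
        return SAFE_FALLBACK
    # Last resort: the smallest reported mode, least likely to exceed
    # what the monitor can actually display.
    return min(modes) if modes else SAFE_FALLBACK
```

This only picks a starting point; per the answer, the user still needs a settings menu to override it.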

In a way, your question could apply to any setting a game provides: "Why should I ask the user what they want when I can already guess what would be best for them?" Why would I ask them about quality settings when I can detect their hardware? Why would I ask about FOV if I know what the game looks best with? Why should I let users turn off AA if their computers can handle it? This more general question highlights the real answer, which is twofold.

Your guesses are not always right. You will not always have correct hardware information, and you will not always be able to make a good guess. What if a user's graphics card is bugged, crashing the computer at a certain resolution, and you won't let them change it? Your game is now useless to them. Also, users like choice: perhaps someone prefers the fuzzy look of playing at a non-native resolution even though it is 'wrong', perhaps they have a CRT monitor that supports many resolutions where the optimum is not the highest, or perhaps they get a better refresh rate at a lower one.

In effect, you cannot accurately judge what the user *wants* from information on the computer alone, nor can you accurately judge what will work best. It is best to leave the decision to the user, who knows their own computer.

If I have missed anything, please leave a comment; I hope this is helpful. (Note that I only really cover PCs here; consoles are a different story, but have already been covered by others.)

+1; this particularly annoys me in games which do not give me a gamma slider (or at least a brightness setting). It's great that you found a look that you think makes your game look best on your monitors, but your monitor is not my monitor, and I @%&$ing hate dark environments where I can't see. I don't care if the black-on-black is "the point" of your stealth level, I want to see what I'm doing. –  Brian S 18 hours ago
    
"what if a users graphics card is bugged causing the computer to crash at a certain resolution and you will not let them disable it" ah, the good old days with D2 and the barbarian's battle cry that always crashed my computer :) –  Stop forgetting my accounts... 9 hours ago
    
You can pretty reliably detect the native resolution: the desktop resolution is a good indicator. Virtually everyone runs the desktop at native res, and it is at least a good starting point. However, that doesn't mean you shouldn't offer the option to change it. For my current project, full screen will only work at the native (desktop) resolution; if people want to lower it for whatever reason, they can either go windowed, or the game changes the back render target size and just lets the graphics card stretch it. –  Programmdude 2 hours ago
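A sketch of the stretch idea from this comment, assuming the game renders to a fixed-size target and computes the largest aspect-preserving (letterboxed) rectangle in the window for the GPU to stretch into; the function name is illustrative:

```python
# Compute the destination rectangle for stretching a fixed-size render
# target into a window while preserving aspect ratio (letterboxing).

def stretch_viewport(target_w, target_h, window_w, window_h):
    """Return (x, y, w, h): the largest aspect-preserving rectangle
    centered in the window for the render target to be stretched into."""
    scale = min(window_w / target_w, window_h / target_h)
    w = round(target_w * scale)
    h = round(target_h * scale)
    return (window_w - w) // 2, (window_h - h) // 2, w, h
```

The stretch itself would be done by the graphics card when blitting the render target, as the comment describes.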

There are a VAST number of reasons to allow the user to control the settings for their game.

  • MANY people have 2 (or more) monitors these days. The user should be able to determine which one to play the game on.
  • There are thousands of different devices a user could be using, and no way to reliably tell what setting would be optimal for every one.
  • Some users may prefer less than "optimal" settings for reasons ranging from better game performance, to poor eyesight (lower resolution = bigger text!).
  • Some users may prefer to play the game windowed, so they can have their chat boxes, walkthroughs, music player, or other programs visible at the same time as the game.

That said, many games DO auto-detect, and then set the initial settings to whatever it thinks is the best your machine can handle.

    
Excluding windowed mode, are there any games that let you choose which monitor to run on? I've been multi-monitor for a number of years and don't ever recall a full screen game giving me the option to run on something other than my primary display. –  Dan Neely 17 hours ago
    
I can't recall specifics, but I've had at least a few games take over my secondary monitor, and I've occasionally had games that changed my monitor's settings... If I've got my monitor's resolution set to something other than the default, well, there's probably a reason. –  aslum 16 hours ago
    
I've seen quite a few games with this as an option, usually called 'display adaptor' in the menu. EVE Online is one example. –  Ben 23 mins ago

That's because the cost and effect of texture quality, geometry detail and screen resolution are very hardware-dependent.

Texture quality usually does not have much impact on the speed of the rendering pipeline, as long as all textures are read from GPU memory. When not all textures fit into GPU memory, they need to be read from normal RAM, or even worse from the hard drive cache, which affects performance badly; in that situation, reducing geometry* and omitting expensive effects** won't help much. But when the execution speed of the rendering pipeline is the bottleneck, reducing texture resolution won't help much either.

Vertex shaders are usually unaffected by the output resolution. The only way to reduce the load on them is to reduce quality and quantity of the 3d models in the scene.

Pixel shaders, on the other hand, still scale linearly with the number of pixels on the screen, so reducing the screen resolution remains an important tool for improving performance. Halving both the horizontal and vertical resolution means only a quarter as many pixel shader invocations.
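The arithmetic in that last sentence, spelled out (a toy calculation, not engine code):

```python
# Fragment shading work is roughly proportional to pixel count, so
# halving each dimension quarters the per-frame shading cost.

def pixel_count(w, h):
    return w * h

full = pixel_count(1920, 1080)  # 2,073,600 pixels per frame
half = pixel_count(960, 540)    #   518,400 pixels per frame
ratio = full / half             # 4.0: a quarter of the shader calls
```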

In contrast to older CRT displays, modern LCD or plasma screens have a native pixel resolution. When they are fed a video stream at a different resolution, they need to interpolate. Some interpolate much better than others, which means that running them at a lower resolution doesn't reduce picture quality much, while other monitors look really bad when not run at their native resolution (the first LCD screen I owned used nearest-neighbor interpolation, which looked horrible; with my current screens, it's hard to tell when they aren't running at the correct resolution).
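A minimal sketch of nearest-neighbor scaling, the crude interpolation mentioned above: each source pixel is simply repeated, which is exactly what produces the blocky look.

```python
# Nearest-neighbor upscaling by an integer factor: every source pixel
# becomes a factor x factor block of identical pixels.

def nearest_neighbor_upscale(image, factor):
    """image is a list of rows; each row is a list of pixel values."""
    out = []
    for row in image:
        scaled_row = [px for px in row for _ in range(factor)]
        out.extend([scaled_row[:] for _ in range(factor)])
    return out
```

Better monitors use bilinear or more sophisticated filters instead, which blend neighboring pixels rather than duplicating them.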

The game engine cannot know how well the user's monitor interpolates, so it's better to leave the choice between reducing texture and geometry detail and reducing the screen resolution to the user.

*) OK, reducing geometry might help a bit because vertices also consume GPU memory.

**) unless, of course, omitting these effects means that some textures are no longer required

    
Just to nitpick: CRTs have a "dot pitch" which means that they really DO have a native resolution. The phosphors in a CRT have a decay time which means a CRT has an ideal refresh rate as well. –  Zan Lynx 19 hours ago

Predefined resolution settings not only make it possible to trade quality for performance; a universal approach that fits all resolutions, aspect ratios and DPIs is also just much harder to create.

If you offer only a few different resolutions, then it is the user's job to choose the best-looking one (especially when their resolution/ratio is unusual). If you make one universal design, you are responsible for making the game look perfect on all devices (which can be especially hard on mobile nowadays; not so much on iOS, but on the other platforms with their huge variety of devices).


Additional reason: Changing environments.

1080p is now standard, but wasn't 5 years ago. 4K is entering the picture, and how long until 6K/8K/1080K (or whatever) enters it too? What about gaming on an 8-year-old computer that tops out at 1280x720? 1024x768? 640x400?

My monitor supports 4K natively, but it would choke on the bandwidth required and max out at 30fps. Let me switch to 1080p (or lower) so that the game runs at 60fps and the text/HUD/etc. isn't microscopic.
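Rough numbers behind that tradeoff (back-of-the-envelope only; real link bandwidth also depends on color depth, blanking intervals and the cable's encoding):

```python
# Pixel throughput scales with both resolution and refresh rate.
# 4K has 4x the pixels of 1080p, so even 4K at 30fps pushes twice
# as many pixels per second as 1080p at 60fps.

def pixels_per_second(w, h, fps):
    return w * h * fps

uhd_30 = pixels_per_second(3840, 2160, 30)  # 4K at 30 fps
fhd_60 = pixels_per_second(1920, 1080, 60)  # 1080p at 60 fps
```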

(Additionally: Give me windowed / maximized windowed as well please so I can tab around easily lol)


Tradition.

Mostly. For the longest time, games supported only one or two resolutions, and for those it made sense to give the player the choice of which version of the game to play. Then, over time, more diverse monitor configurations arose, and people just added more and more resolutions to the menu. Finally, the number of possible configurations grew so large that resolution-independent game code became necessary; that's what we have today. APIs also changed: modern APIs allow you to create windows the exact size of the desktop and to check beforehand which resolutions are suitable, which wasn't always the case.

For simplicity's sake, I would suggest either using a fixed resolution for better artistic control over the game, or always using the native resolution for the best display quality.

If you want to give the player control over performance, give them control over things like anti-aliasing or shadow quality, not resolution. GPUs are very fast nowadays, and using a non-native resolution often decreases the perceived quality considerably, enough to make the performance gain rarely worth it.

Native resolution only provides the best quality if the GPU is fast enough to run it at relatively high quality settings. When it's not, turning more eye candy on and upscaling from a lower resolution often looks better than playing at native resolution but having to use low-quality textures, disable shadows, disable reflections, reduce the maximum distance objects are shown at, etc. This is why, as consoles get older, fewer and fewer games render at native; even in the current generation, some high-profile XBone games launched at 900p instead of 1080p. –  Dan Neely 21 hours ago
    
Performance isn't limited by the screen resolution anymore - Interesting! I didn't know this, that explains why my 1440p monitor doesn't make it any harder to run games at native resolution! But seriously, resolution is still a major factor, doubly so these days due to more and more fancy shader effects that have O(n) or worse runtime. –  Phoshi 19 hours ago
    
I must add firstly that not everyone uses an LCD; many very high quality screens are CRTs. Second, the APIs used to query available display modes are flaky at best and sometimes downright broken. In addition, I am very concerned about the claim that performance is not related to screen resolution: anti-aliasing and a number of other effects are very much affected by it. –  Vality 18 hours ago
    
-1. This answer is very opinionated and lacks any facts to back up some strange claims. –  Cypher 18 hours ago
    
I improved the wording a bit, since there seem to be a few misunderstandings. –  Mr. Beast 17 hours ago
