Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sat Feb 06, 2016 5:51 pm
by Brendan
Hi,
onlyonemac wrote:Brendan wrote:Incorrect is horrible (ranging from strobing/flickering to rolling to black screen). I'm not talking about "incorrect"; I'm talking about "perfectly correct but not the native resolution".
Perhaps this is not obvious, but I'm defining "incorrect resolution" with the most obvious meaning: the resolution does not match the resolution that the monitor is designed for, i.e. is not the native resolution. Referring to a "perfectly correct but not native resolution" is a self-contradicting statement; if it's not native then it is by definition incorrect.
"Incorrect" is "video timing not supported by display". If the native resolution was the only correct video mode then monitors would only support 2 video modes in the first place (its native resolution and 640*480).
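As an aside for anyone following along: the "native resolution" being argued about here is normally whatever the first detailed timing descriptor in the monitor's 128-byte base EDID block advertises. A rough sketch of extracting it (the function name and the hand-built sample bytes are illustrative, not from any real driver):

```python
def native_mode_from_edid(edid):
    """Read the native resolution from the first detailed timing
    descriptor (offset 54) of a 128-byte base EDID block."""
    d = edid[54:72]
    if d[0] == 0 and d[1] == 0:
        # A pixel clock of zero means this slot holds a display
        # descriptor, not a detailed timing descriptor.
        raise ValueError("first descriptor is not a detailed timing")
    hactive = d[2] | ((d[4] & 0xF0) << 4)  # low 8 bits + high 4 bits
    vactive = d[5] | ((d[7] & 0xF0) << 4)
    return hactive, vactive

# Hand-built descriptor bytes for a 1920x1080 native mode:
edid = bytearray(128)
edid[54] = 0x01                  # non-zero pixel clock
edid[56], edid[58] = 0x80, 0x70  # hactive = 0x780 = 1920
edid[59], edid[61] = 0x38, 0x40  # vactive = 0x438 = 1080
print(native_mode_from_edid(edid))  # (1920, 1080)
```

A "dodgy EDID" in the thread-title sense is exactly this data being wrong or shared between different panels, which is why the auto-detection argument matters.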
Assuming a modern monitor (that will still be obsolete before my OS is released), and assuming we're talking about my OS's graphics stack (with resolution independence and "fixed frame rate, variable quality") and not something irrelevant (existing Windows, Linux, etc); I very much doubt that anyone will be able to notice any difference between the monitor's native resolution and the monitor's second-best resolution (without their nose touching the screen and/or a magnifying glass).
onlyonemac wrote:Brendan wrote:Note that Windows typically has (had?) no resolution independence; so changing the video mode causes everything to be broken - things like desktop icons that are the wrong size (and laid out different to before), all the text/fonts being (physically) larger/smaller, applications that remember their previous window size that end up too small or too large for the screen, etc. Because of this, it's entirely possible for someone to use a "not native" resolution for years without knowing it, then hate it when someone sets the computer to the native resolution because everything is too small (and unreadable and gives them watery eyes just by looking at it).
I never said anything about the physical size on the screen of the text; of course one can scale the display to match the pixel pitch but if the pixel pitch is too low (because the monitor is running at a non-native resolution) and, more importantly, the pixels output by the video card do not match the display's physical pixels and the display is having to perform scaling on the input signal in the case of an LCD (because the monitor is running at a non-native resolution), then the text will still be unreadable (or close to unreadable).
Yes; if the resolution is too low then things get harder to read (regardless of whether it's the native resolution or not); and the difference between (e.g.) 800*600 and 1024*768 back in the 1990s was significant. For modern displays it's completely different - the size of a pixel is much smaller than the human eye can actually see and the difference between (e.g.) 2560*2048 and 1920*1600 is completely irrelevant.
Don't believe me? Search for "is 4K worth it" and I guarantee the search results will be a large number of pages that explain why modern high resolutions are completely pointless (until/unless the user sits so close that most of the picture is beyond their peripheral vision).
If anything; if I wanted to improve my OS for something that actually matters (rather than this completely irrelevant issue), I'd modify it to deliberately choose a lower resolution (that's still higher resolution than the user can see) just to reduce the power consumption/heat caused by trying to render pixels that don't make any difference.
onlyonemac wrote:Brendan wrote:Each user has to have a profile; which includes things like their user name, password; internationalisation (time zone, language and keyboard layout), accessibility options (if they are colour blind, have photosensitive epilepsy, deaf, etc), and permissions (whether they belong to various "user groups"). None of these settings have anything to do with any specific computer (and none of them are hardware settings/configuration) - if the same user logs into to 20 different computers (within the same cluster) then every different computer uses their settings.
Fair enough - so how about you then similarly give each computer an (optional) "profile"?
Why would I bother when it's unnecessary? If I could avoid it (if the vast majority of "human beings" supported auto-detection/auto-configuration) I wouldn't have user profiles either.
onlyonemac wrote:Brendan wrote:onlyonemac wrote:Changing the keyboard layout has at least as much potential for a user to misconfigure things as changing the monitor resolution does (and you keep insisting that you don't want any way for users to misconfigure things and make a mess).
I've been using "QWERTY" my entire life. If I happen to use a computer that has a "Dvorak" keyboard layout and the OS configures it as "Dvorak", then the OS has mis-configured the keyboard. Whatever is actually printed on the keys is irrelevant (unless you're learning how to type).
I'm not quite sure what you're getting at but what I mean is the simple case of a (dumb) user changing their keyboard layout to e.g. Dvorak and then being unable to change it back to QWERTY because they cannot find the correct keys. Likewise if they change the language.
Why would the user change their profile (assuming it was set correctly by the administrator when the administrator creates the new user's account); and why can't they use the mouse (or touchpad or touchscreen or whatever other input devices there are)?
Cheers,
Brendan
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sat Feb 06, 2016 7:58 pm
by Rusky
Brendan wrote:Don't believe me? Search for "is 4K worth it" and I guarantee the search results will be a large number of pages that explain why modern high resolutions are completely pointless (until/unless the user sits so close that most of the picture is beyond their peripheral vision).
There's a threshold resolution (in pixels-per-degree, thus taking into account pixel density and viewing distance) where the eye can no longer resolve individual pixels. A typical 1920x1080 monitor on someone's desk has not crossed that threshold. Maybe 4k is well beyond it, but it's definitely worth crossing.
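To put rough numbers on that threshold: 20/20 vision resolves about 1 arcminute, i.e. roughly 60 pixels per degree. A back-of-the-envelope sketch (the 24" panel size and 60 cm viewing distance are my assumptions):

```python
import math

def pixels_per_degree(h_pixels, screen_width_cm, distance_cm):
    """Horizontal pixels per degree of visual angle at a viewing distance."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_cm / (2 * distance_cm)))
    return h_pixels / fov_deg

# A 24" 16:9 panel is about 53.1 cm wide; assume 60 cm viewing distance.
print(pixels_per_degree(1920, 53.1, 60))  # ~40 PPD: below the ~60 PPD threshold
print(pixels_per_degree(3840, 53.1, 60))  # ~80 PPD: above it
```

So by this estimate a desk-distance 1080p monitor is still under the threshold while 4K is past it, which is the point being made above.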
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sat Feb 06, 2016 8:49 pm
by Brendan
Hi,
Rusky wrote:Brendan wrote:Don't believe me? Search for "is 4K worth it" and I guarantee the search results will be a large number of pages that explain why modern high resolutions are completely pointless (until/unless the user sits so close that most of the picture is beyond their peripheral vision).
There's a threshold resolution (in pixels-per-degree, thus taking into account pixel density and viewing distance) where the eye can no longer resolve individual pixels. A typical 1920x1080 monitor on someone's desk has not crossed that threshold. Maybe 4k is well beyond it, but it's definitely worth crossing.
Yes.
Now let's extrapolate from historical data to predict the hardware people will be using when my OS is released (a minimum of 10 years). For example, comparing "primary desktop resolution" in Steam's 2007 hardware survey (I couldn't find 2006) with Steam's most recent survey, you can see:
- In 2007, most people used 1024*768 and 1280*800
- In 2007, 1366*768 and 1920*1080 existed but few people used it
- In 2016, the number of people still using "2007 video modes" is virtually zero
- In 2016, most people use 1366*768 and 1920*1080
- In 2016, "4K" exists but few people use it
From this you can probably expect:
- In 2025, the number of people using "2016 video modes" will be virtually zero
- In 2025, most people will be using 4K
- In 2025, "8K or higher" will probably exist (mostly for marketing and not common sense, and/or to get "4K per eye stereoscopic") but few people will be using it
Essentially; even if something that hasn't happened for 10 years now actually does happen again in future and a dodgy monitor has the same PnP ID as another; everyone is going to be beyond that "threshold resolution (in pixels-per-degree)" and nobody is going to care.
Cheers,
Brendan
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sat Feb 06, 2016 11:47 pm
by Rusky
I guarantee people will care when their 4k display suddenly is no longer 4k. But really, enough about this one particular way things could go wrong. The point is that autoconfiguration in general is going to make mistakes sometimes, so it's worth leaving in workarounds.
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 12:59 am
by Brendan
Hi,
Rusky wrote:I guarantee people will care when their 4k display suddenly is no longer 4k. But really, enough about this one particular way things could go wrong. The point is that autoconfiguration in general is going to make mistakes sometimes, so it's worth leaving in workarounds.
Either auto-configuration works or someone (OS developer or hardware manufacturer) gets blamed and should be expected to fix their problem. No excuses. No "enabling".
Cheers,
Brendan
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 1:04 am
by Rusky
I think that approach is perfect for something like Apple where one entity has complete control over the hardware and the software. But for an OS where you expect to be able to throw all kinds of hardware together (which is especially true in your case, where you even have multiple types of CPUs and graphics cards and network interfaces) I remain unconvinced. Minimizing the amount of configuration that is strictly necessary is still an admirable goal, though, IMO.
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 2:55 am
by Brendan
Hi,
Rusky wrote:I think that approach is perfect for something like Apple where one entity has complete control over the hardware and the software. But for an OS where you expect to be able to throw all kinds of hardware together (which is especially true in your case, where you even have multiple types of CPUs and graphics cards and network interfaces) I remain unconvinced. Minimizing the amount of configuration that is strictly necessary is still an admirable goal, though, IMO.
I honestly can't think of anything significant where "per computer" configuration (as opposed to "per user" or "per cluster" settings) is actually needed.
Note that I'd be less opposed (but still opposed) to "per computer" configuration that can be left until after the OS's file system is up and running; but (until/unless there's native video drivers) choosing a video mode and setting up frame buffer/s is something that has to happen early during boot (while boot loader is still able to use firmware) and can't be left until after the kernel, networking and then distributed file system are running. Even with native video drivers I'd still want "configuration-less" video mode selection in case any of the many things that could go wrong during boot actually does go wrong (e.g. simply unplugging a network cable can prevent the native video driver from being started).
Cheers,
Brendan
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 4:15 am
by onlyonemac
Rusky wrote:The point is that autoconfiguration in general is going to make mistakes sometimes, so it's worth leaving in workarounds.
Thanks. That's the point I've been trying to make for the entire thread.
Brendan wrote:I honestly can't think of anything significant where "per computer" configuration (as opposed to "per user" or "per cluster" settings) is actually needed.
That's because you have too much faith in your auto-configuration algorithms, compounded by your persistent insistence on dismissing the small number of drastically affected users as insignificant.
Brendan wrote:Even with native video drivers I'd still want "configuration-less" video mode selection in case any of the many things that could go wrong during boot actually does go wrong (e.g. simply unplugging a network cable can prevent the native video driver from being started).
So the latest excuse (let's just ignore the fact that you've been coming out with so many arbitrary excuses over the course of the discussion) is that someone is just going to walk up to a computer and randomly unplug the network cable while the video driver is being started? I'll tell you now, that's going to happen far less frequently than users complaining about incorrect auto-configuration. Even a user who knows nothing about how to configure a computer system knows instinctively that unplugging something from the back of a computer while it is turned on is a bad idea, for whatever reason.
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 7:40 am
by Brendan
Hi,
onlyonemac wrote:Brendan wrote:I honestly can't think of anything significant where "per computer" configuration (as opposed to "per user" or "per cluster" settings) is actually needed.
That's because you have too much faith in your auto-configuration algorithms, compounded by your persistent insistence on dismissing the small number of drastically affected users as insignificant.
As I've explained repeatedly; "drastically affected" is an extreme exaggeration for multiple reasons.
onlyonemac wrote:Brendan wrote:Even with native video drivers I'd still want "configuration-less" video mode selection in case any of the many things that could go wrong during boot actually does go wrong (e.g. simply unplugging a network cable can prevent the native video driver from being started).
So the latest excuse (let's just ignore the fact that you've been coming out with so many arbitrary excuses over the course of the discussion) is that someone is just going to walk up to a computer and randomly unplug the network cable while the video driver is being started?
I can't understand how someone can think "I don't want any configuration. Even if there were native drivers, I don't want any configuration" constitutes an excuse of any kind. Nothing I've said has really changed since the beginning of this conversation - my position has always been "too unlikely, too irrelevant if it did happen, not worth wasting my time on, no configuration anyway". I also don't think it's my fault that I have to repeat myself in multiple different ways just because you have difficulty understanding simple things.
I said nothing about "while the video driver is being started".
The computer loads its boot code (mostly the minimum needed to get the file system up) and sets up "frame buffer" (partly to display error messages and partly because there may not be any native video driver); then does some security related stuff (wipe PCI config space, wipe IOMMU, setup "dynamic root of trust" for remote attestation); then does CPU detection, chooses a kernel and starts it, starts device manager, starts network card driver; then does remote attestation and tries to synchronise with the distributed file system. Any error that occurs during any of that (including "can't access network because cable was unplugged 2 days ago") means it can't access the distributed file system and can't load additional drivers (including the native video driver).
onlyonemac wrote:I'll tell you now, that's going to happen far less frequently than users complaining about incorrect auto-configuration is going to happen. Even a user who knows nothing about how to configure a computer system knows instinctively that unplugging something from the back of a computer while it is turned on is a bad idea, for whatever reason.
You do realise that every cable has 2 ends?
Cheers,
Brendan
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 12:16 pm
by Rusky
Brendan wrote:Note that I'd be less opposed (but still opposed) to "per computer" configuration that can be left until after the OS's file system is up and running; but (until/unless there's native video drivers) choosing a video mode and setting up frame buffer/s is something that has to happen early during boot (while boot loader is still able to use firmware) and can't be left until after the kernel, networking and then distributed file system are running. Even with native video drivers I'd still want "configuration-less" video mode selection in case any of the many things that could go wrong during boot actually does go wrong (e.g. simply unplugging a network cable can prevent the native video driver from being started).
I think the same users who complain about bad screen resolutions would also understand if they had to wait for the native driver to set the resolution, since VESA can't always handle the native resolution anyway. So auto-config up until any per-computer config gets loaded along with the native driver would be fine.
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 12:19 pm
by onlyonemac
Brendan wrote:onlyonemac wrote:Brendan wrote:I honestly can't think of anything significant where "per computer" configuration (as opposed to "per user" or "per cluster" settings) is actually needed.
That's because you have too much faith in your auto-configuration algorithms, compounded by your persistent insistence on dismissing the small number of drastically affected users as insignificant.
As I've explained repeatedly; "drastically affected" is an extreme exaggeration for multiple reasons.
I repeat, it's only an exaggeration until you're affected by it.
Brendan wrote:I also don't think it's my fault that I have to repeat myself in multiple different ways just because you have difficulty understanding simple things.
Perhaps it's you that has difficulty accepting that just because you're willing to put up with a blurry monitor doesn't mean that I or the rest of the world is.
Brendan wrote:The computer loads its boot code (mostly the minimum needed to get the file system up) and sets up "frame buffer" (partly to display error messages and partly because there may not be any native video driver); then does some security related stuff (wipe PCI config space, wipe IOMMU, setup "dynamic root of trust" for remote attestation); then does CPU detection, chooses a kernel and starts it, starts device manager, starts network card driver; then does remote attestation and tries to synchronise with the distributed file system. Any error that occurs during any of that (including "can't access network because cable was unplugged 2 days ago") means it can't access the distributed file system and can't load additional drivers (including the native video driver).
Alright, so then have your auto-configuration crap (nobody said auto-configuration was bad) and, just in case the network cable happens to have fallen out, fall back on that; otherwise use the overriding configuration of the native video driver (which can also auto-configure, with a configuration file to override the settings, just like everything else that will auto-configure but can be overridden). It's not like your OS is designed to work without a network connection anyway, so who cares if the network cable being unplugged stops the video driver from being able to read its configuration file? The file is optional anyway, isn't going to be needed unless the workstation has a "crappy" monitor attached, and the video driver can still initialise without it if the network connection fails.
Brendan wrote:onlyonemac wrote:I'll tell you now, that's going to happen far less frequently than users complaining about incorrect auto-configuration is going to happen. Even a user who knows nothing about how to configure a computer system knows instinctively that unplugging something from the back of a computer while it is turned on is a bad idea, for whatever reason.
You do realise that every cable has 2 ends?
Are you suggesting that the IT admins are now also so technically incompetent that they are going to unplug their end of the cable? Or the cleaning staff who clean the wiring closet and who have been told "don't touch anything in there without explicit permission"?
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 12:57 pm
by Owen
Brendan wrote:onlyonemac wrote:Brendan wrote:Note that Windows typically has (had?) no resolution independence; so changing the video mode causes everything to be broken - things like desktop icons that are the wrong size (and laid out different to before), all the text/fonts being (physically) larger/smaller, applications that remember their previous window size that end up too small or too large for the screen, etc. Because of this, it's entirely possible for someone to use a "not native" resolution for years without knowing it, then hate it when someone sets the computer to the native resolution because everything is too small (and unreadable and gives them watery eyes just by looking at it).
I never said anything about the physical size on the screen of the text; of course one can scale the display to match the pixel pitch but if the pixel pitch is too low (because the monitor is running at a non-native resolution) and, more importantly, the pixels output by the video card do not match the display's physical pixels and the display is having to perform scaling on the input signal in the case of an LCD (because the monitor is running at a non-native resolution), then the text will still be unreadable (or close to unreadable).
Yes; if the resolution is too low then things get harder to read (regardless of whether it's the native resolution or not); and the difference between (e.g.) 800*600 and 1024*768 back in the 1990s was significant. For modern displays it's completely different - the size of a pixel is much smaller than the human eye can actually see and the difference between (e.g.) 2560*2048 and 1920*1600 is completely irrelevant.
Don't believe me? Search for "is 4K worth it" and I guarantee the search results will be a large number of pages that explain why modern high resolutions are completely pointless (until/unless the user sits so close that most of the picture is beyond their peripheral vision).
If anything; if I wanted to improve my OS for something that actually matters (rather than this completely irrelevant issue), I'd modify it to deliberately choose a lower resolution (that's still higher resolution than the user can see) just to reduce the power consumption/heat caused by trying to render pixels that don't make any difference.
These articles always ignore a multitude of factors
- They're based upon the "normal" acuity of our vision. This is much worse than the vernier acuity of our eyes (which arises from our brain's reconstruction filter and the movement of our eyes during a saccade).
- They fail to account for the fact that pixels on most of our displays are rectangular, not infinitely small (ideal) sampling points. This means that there is aliasing in the produced signal due to the structure of the pixel.
- They fail to account for the fact that most methods of programmatically producing visual content are not accurately band limited, and therefore produce aliasing artifacts themselves.
For a simple example of this, take one of these allegedly better-than-human-vision displays (such as an Apple "Retina" display or one of the most modern smartphones) and draw a non-anti-aliased diagonal line. Observe that, holding it at a normal distance, you can indeed still observe that the line is aliased.
The end result of this is that for visually good results you will end up rendering at a resolution at least as great as that natively possessed by the display, in which case there is no point scaling down the image and throwing away detail only to scale it back up again in the display.
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 3:37 pm
by Brendan
Hi
onlyonemac wrote:Brendan wrote:As I've explained repeatedly; "drastically affected" is an extreme exaggeration for multiple reasons.
I repeat, it's only an exaggeration until you're affected by it.
And I repeat, you're basing your opinion on graphics stacks that are completely different (aren't resolution independent, don't anti-alias everything, do have everything aligned perfectly on pixel boundaries, don't do "fixed frame rate, variable quality", etc), and you're basing your opinion on ancient monitors (where pixels are relatively large) and not current/future monitors (where pixels are so small that it makes very little difference) which compounds the problem; and because of these things your opinion is misguided at best.
onlyonemac wrote:Brendan wrote:The computer loads its boot code (mostly the minimum needed to get the file system up) and sets up "frame buffer" (partly to display error messages and partly because there may not be any native video driver); then does some security related stuff (wipe PCI config space, wipe IOMMU, setup "dynamic root of trust" for remote attestation); then does CPU detection, chooses a kernel and starts it, starts device manager, starts network card driver; then does remote attestation and tries to synchronise with the distributed file system. Any error that occurs during any of that (including "can't access network because cable was unplugged 2 days ago") means it can't access the distributed file system and can't load additional drivers (including the native video driver).
Alright, so then have your auto-configuration crap (nobody said auto-configuration was bad) and then, just in case the network cable happens to have fallen out, you can use that, otherwise you can use the overriding configuration of the native video driver (which can also auto-configure, with configuration file to override the settings just like everything else that will auto-configure but can be overridden).
Adding hassle to the OS's device manager (to try to figure out which settings get used by which monitor on which video driver), every native video driver, documentation (including end user documentation and device driver developer documentation), administrators (when users change things) and end users (when they have to wait for the admin to readjust settings because things were changed); all for the sake of something that won't happen (without the ability to travel back in time) and/or won't matter much if it does happen. You're trying to maximise inconvenience for a large number of people, just to minimise inconvenience for nobody. It's incredibly stupid; and I suspect that the only reason you think this is a good idea in the first place is that it was a good idea for OSs that were designed 20+ years ago when everything (hardware and software) was completely different.
If you like it so much, why don't you write your own OS instead of trying so hard to make my OS worse?
onlyonemac wrote:Brendan wrote:onlyonemac wrote:I'll tell you now, that's going to happen far less frequently than users complaining about incorrect auto-configuration is going to happen. Even a user who knows nothing about how to configure a computer system knows instinctively that unplugging something from the back of a computer while it is turned on is a bad idea, for whatever reason.
You do realise that every cable has 2 ends?
Are you suggesting that the IT admins are now also so technically incompetent that they are going to unplug their end of the cable? Or the cleaning staff who clean the wiring closet and who have been told "don't touch anything in there without explicit permission"?
I'm suggesting that there are many reasons why a network connection may be down; and that deliberately choosing one unlikely reason just so that you can claim that a "throw away example" is unlikely is disingenuous.
Cheers,
Brendan
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 3:57 pm
by onlyonemac
Brendan wrote:onlyonemac wrote:Brendan wrote:As I've explained repeatedly; "drastically affected" is an extreme exaggeration for multiple reasons.
I repeat, it's only an exaggeration until you're affected by it.
And I repeat, you're basing your opinion on graphics stacks that are completely different (aren't resolution independent, don't anti-alias everything, do have everything aligned perfectly on pixel boundaries, don't do "fixed frame rate, variable quality", etc), and you're basing your opinion on ancient monitors (where pixels are relatively large) and not current/future monitors (where pixels are so small that it makes very little difference) which compounds the problem; and because of these things your opinion is misguided at best.
What you seem to be failing to recognise is this: when I ran my 1280x1024 LCD panel at 1024x768, each physical pixel on the display represented not one, not half, but an odd fraction of each pixel in the video signal. That means that, no matter how resolution-independent your operating system is, there will be some inherent blur introduced by the LCD not running at its native resolution. There is no way around that blur, because it is caused by the fact that the pixels that your OS is producing for display and the pixels that are being displayed do not have an integer correlation, meaning that some of the video signal pixels will "fall in the gaps", leading to significant loss of quality and thus legibility. No matter how good your font rendering is, you can't get around the blurry text issue when, for example, the cross-stroke on the letter "t" keeps falling in between two physical device pixels.
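That non-integer mapping is easy to make concrete. A small sketch (the helper name is mine) of how a single signal pixel lands on the panel when a 1024-wide image is stretched across a 1280-wide LCD:

```python
import math

def device_coverage(src_x, src_w=1024, dst_w=1280):
    """Return {device_pixel: fractional overlap} for one source pixel
    when a src_w-wide signal is stretched across a dst_w-wide panel."""
    scale = dst_w / src_w                      # 1.25 - a non-integer ratio
    left, right = src_x * scale, (src_x + 1) * scale
    return {d: min(right, d + 1) - max(left, d)
            for d in range(int(left), math.ceil(right))}

print(device_coverage(0))  # {0: 1.0, 1: 0.25}
print(device_coverage(1))  # {1: 0.75, 2: 0.5}
```

Source pixel 1's light is split 0.75/0.5 across two physical pixels: that split is exactly the blur being described, and no amount of resolution independence in the renderer can remove it once the panel does the scaling.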
Re: Dodgy EDIDs (was: What does your OS look like?)
Posted: Sun Feb 07, 2016 5:18 pm
by Brendan
Hi,
Owen wrote:These articles always ignore a multitude of factors
- They're based upon the "normal" acuity of our vision. This is much worse than the vernier acuity of our eyes (which arises from our brain's reconstruction filter and the movement of our eyes during a saccade).
- They fail to account for the fact that pixels on most of our displays are rectangular, not infinitely small (ideal) sampling points. This means that there is aliasing in the produced signal due to the structure of the pixel.
- They fail to account for the fact that most methods of programmatically producing visual content are not accurately band limited, and therefore produce aliasing artifacts themselves.
For a simple example of this, take one of these allegedly better-than-human-vision displays (such as an Apple "Retina" display or one of the most modern smartphones) and draw a non-anti-aliased diagonal line. Observe that, holding it at a normal distance, you can indeed still observe that the line is aliased.
Displays have multiple physical limitations; where if you do nothing to hide those limitations they become much more noticeable. This includes dithering (to hide low colour depth), "anti-aliasing" (to hide low resolution) and HDR techniques (to hide limited dynamic range). All of these tricks will be built into my video drivers - e.g. it will be impossible for normal software (GUI, application) to ask the video driver to render anything (e.g. a thin rectangle/"line") without anti-aliasing.
Note: Another trick is using motion blur (to hide monitor's frame rate); but I doubt I can support it.
Note that one of the reasons I posted my screenshots was because someone on IRC said "15-bpp/16-bpp looks bad and shouldn't ever be used", and I wanted to show that (even for an "almost pathological" case involving large smooth gradients) it's extremely hard to tell the difference between 15-bpp/16-bpp and 24-bpp when graphics is done right.
The resolution argument here is similar - people saying "anything that isn't the native resolution is bad", not so much because it is bad, but more because they rarely see graphics done right.
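For the dithering trick mentioned above, a minimal sketch of what "hiding low colour depth" looks like: quantising an 8-bit channel down to the 5 bits of a 15/16-bpp mode using a 4x4 ordered (Bayer) threshold matrix. The function name is illustrative:

```python
# 4x4 Bayer matrix; adding 0.5 and dividing by 16 spreads the
# thresholds evenly over [0, 1).
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def dither_to_5bit(value, x, y):
    """Quantise an 8-bit channel value (0-255) to 5 bits (0-31),
    using the pixel position to pick an ordered-dither threshold."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    level = value / 255.0 * 31.0
    return min(31, int(level + threshold))

# Averaged over a 4x4 tile, the dithered output approximates the
# original value far better than plain truncation would.
tile = [dither_to_5bit(100, x, y) for y in range(4) for x in range(4)]
print(sum(tile) / 16.0)  # ~12.19, vs the ideal 100/255*31 ~ 12.16
```

On a smooth gradient this is what makes 15/16-bpp banding hard to spot, which is the point of the screenshot comparison above.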
Owen wrote:The end result of this is that for visually good results you will end up rendering at a resolution at least as great as that natively possessed by the display, in which case there is no point scaling down the image and throwing away detail only to scale it back up again in the display.
Ironically; one of the tricks I'm planning (for "fixed frame rate, variable quality", if and only if the scene is changing too quickly for renderer to keep up) is to render graphics at a fraction of the video mode's resolution and scale it up after rendering/before blitting (to reduce per-pixel rendering overhead and meet the "vertical sync deadline"); and (if and only if it's a persistent "not enough time to render frame" problem) this includes changing the video mode to a lower resolution (during periods of user inactivity) so that I can remove the overhead of scaling up before blitting (essentially, offloading the work of scaling the image up to the monitor). Of course the reverse also applies (switching back to higher resolutions during user inactivity if/when there is enough time to render frames).
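The feedback loop for that "fixed frame rate, variable quality" idea might look something like this sketch (the function, thresholds and step sizes are my own illustrative choices, not Brendan's actual design):

```python
def choose_render_scale(last_frame_ms, budget_ms, scale,
                        lo=0.25, hi=1.0):
    """Adjust the fraction of native resolution to render at:
    shrink when the last frame missed the vsync budget, grow
    back when there is comfortable headroom."""
    if last_frame_ms > budget_ms:
        scale *= 0.9           # render fewer pixels next frame
    elif last_frame_ms < 0.7 * budget_ms:
        scale *= 1.05          # headroom: restore quality
    return max(lo, min(hi, scale))

# A frame that blew the 60 Hz budget (~16.7 ms) drops the scale...
print(choose_render_scale(20.0, 16.7, 1.0))  # 0.9
# ...and fast frames let it climb back (clamped at full resolution).
print(choose_render_scale(8.0, 16.7, 1.0))   # 1.0
```

The rendered image is then scaled up to the frame buffer before blitting; switching the actual video mode down, as described above, just offloads that final upscale to the monitor.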
Cheers,
Brendan