Hi,
onlyonemac wrote:
Brendan wrote:
Rusky wrote:And when the office is already full of monitors your OS arbitrarily decides are broken?
[...]
As I've already explained multiple times; the chance of it happening is zero, and the chance of a user actually caring if it does happen is also zero.
The chances are not zero: it's quite possible that the IT department in charge of an office might decide to get cheap monitors that will probably work fine under Windows, and when they do work fine there's no need to upgrade.
The chance of a company buying TVs instead of monitors is zero. The chance of a company buying all of them at the same time (rather than buying some one year, more the next year, etc., and ending up with many different monitors) is very small. The chance that a sane company would buy a pile of cheap trash TVs without testing one first and realising it's junk before buying the rest is also very small. The chance that a company would then not return them when they realise they suck on every OS is also tiny. The chance that a company would keep using cheap trash for 10 freaking years without replacing any of them is completely absurd.
onlyonemac wrote:Now let's suppose that your OS actually becomes popular enough that the IT department decides to switch, and now they've got an office full of "broken" monitors just because you left out the few lines of code it would take to provide the option for the IT department to override the OS's automatic detection of the monitor resolution.
It's not a few lines of code; it's punching a hole through the OS's permission system to let a user diddle with OS configuration they should never be allowed to touch, combined with some sort of "monitor preferences" dialog box in each GUI; all for the sake of faulty hardware that should never have been sold, won't exist by the time my OS is actually released, and won't make a significant difference even if it does still exist.
onlyonemac wrote:Ultimately the problem seems to be that you're regarding a blurry LCD as an almost insignificant issue when in reality it is a very big issue - I remember using LCDs like that before I went blind and they were usually pretty much unreadable.
Blur is an insignificant issue.
Because my OS is fully resolution independent (and apps/GUIs will be running in 3D), nothing will be perfectly aligned to pixel boundaries (and everything will be anti-aliased); and this will cause a little blur. On top of that there's "focal blur", where things that are out of focus are intentionally blurred more (which will subtly affect most things, because nothing will be exactly parallel with the screen and so a lot will be "slightly out of focus"). Basically; because the OS won't be a pathetic pile of **** (like Windows, Linux) there will be a certain amount of "inherent blur". Any additional blur (caused by monitor/TV scaling) isn't going to be very noticeable (it will be masked by the "inherent blur" anyway); unless the monitor is using a very low resolution in the first place (e.g. the obsolete crap that barely exists now and most certainly won't exist by the time my OS is released).
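To make the "inherent blur" point concrete, here's a minimal sketch (purely illustrative, not code from my OS; the coverage function and the numbers are made up) showing that coverage-based anti-aliasing of something that isn't aligned to pixel boundaries already smears a "sharp" edge across two pixels before any monitor scaling gets involved:

[code]
/* Illustrative only: a 1-pixel-wide white bar whose left edge lands at
 * x = 10.4 (not on a pixel boundary) gets spread across two pixels by
 * coverage-based anti-aliasing. */
#include <stdio.h>
#include <math.h>

/* Fraction of pixel [px, px+1) covered by the interval [x0, x1). */
static double coverage(int px, double x0, double x1)
{
    double lo = fmax((double)px, x0);
    double hi = fmin((double)(px + 1), x1);
    return (hi > lo) ? (hi - lo) : 0.0;
}

int main(void)
{
    double x0 = 10.4, x1 = 11.4;   /* bar not aligned to the pixel grid */

    for (int px = 9; px <= 12; px++) {
        printf("pixel %2d: intensity %3.0f%%\n",
               px, 100.0 * coverage(px, x0, x1));
    }
    /* Prints 0%, 60%, 40%, 0% -- the edge is already soft. */
    return 0;
}
[/code]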
On top of that; under perfectly normal scenarios (e.g. no native video driver, and either software rendering can't pump pixels fast enough or VBE/GOP doesn't support the monitor's native resolution) the OS might not use the monitor's native resolution even on perfectly good monitors (and, for the "software renderer not fast enough" case, even when the OS could have).
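As a rough sketch of the kind of fallback involved (assumed logic with made-up numbers, not the OS's actual code): with no native video driver the OS can only use the modes VBE/GOP happens to report, and it may also deliberately pick a lower resolution when software rendering can't sustain an acceptable frame rate at native resolution, letting the monitor's scaler do the rest:

[code]
/* Hypothetical mode-selection fallback; numbers and names are invented
 * for illustration. */
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>

struct video_mode {
    uint32_t width, height;
};

/* Assumed fixed fill rate the software renderer can sustain; a real OS
 * would measure or estimate this per machine. */
static double estimate_fps(const struct video_mode *m)
{
    const double fill_rate = 250e6;   /* 250 Mpixels/s, purely illustrative */
    return fill_rate / ((double)m->width * (double)m->height);
}

/* Pick the highest-resolution mode that the firmware (VBE/GOP) offers AND
 * that software rendering can drive at an acceptable frame rate. */
static const struct video_mode *choose_mode(const struct video_mode *modes,
                                            size_t count, double min_fps)
{
    const struct video_mode *best = NULL;

    for (size_t i = 0; i < count; i++) {
        if (estimate_fps(&modes[i]) < min_fps)
            continue;                         /* too slow to render */
        if (best == NULL ||
            (uint64_t)modes[i].width * modes[i].height >
            (uint64_t)best->width * best->height)
            best = &modes[i];
    }
    return best;          /* NULL: even the smallest mode is too slow */
}

int main(void)
{
    /* Modes the firmware happens to report (may not include native). */
    const struct video_mode modes[] = {
        { 1024, 768 }, { 1280, 1024 }, { 1920, 1080 }
    };
    const struct video_mode *m = choose_mode(modes, 3, 60.0);

    if (m != NULL)
        printf("using %ux%u\n", (unsigned)m->width, (unsigned)m->height);
    return 0;
}
[/code]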
Now...
From my perspective; most of this has already been adequately explained (by me) multiple times; and the only problem that actually exists is that Rusky is a troll that fails to listen and repeatedly regurgitates the same drivel over and over and over (all without admitting that his own OS is multiple orders of magnitude worse in every possible way).
Cheers,
Brendan