onlyonemac wrote:
    One parameter that I *do* know is useful for end users is the one to override the automatic video mode setting, without which my dodgy CRT would have been unusable.

Brendan wrote:
    Without which, your dodgy CRT will still be perfectly usable (just with slightly more blur).

onlyonemac wrote:
    It *was* unusable. At that resolution, the pixels were smaller than the phosphor dots, so apart from the text being too small to comfortably read, the letters were also not properly formed.

You're saying it failed to work on an OS that has none of the features intended to make faulty hardware bearable; and saying that my OS (with a "monitor description file" designed to work around the combination of "vendorID and productID not unique" and "EDID's timings are wrong" problems, and give the "least worst possible" result) will be far superior to the OS you tried?
Brendan wrote:
    Note that if you actually cared about a little blur (other than for the purpose of whining) you would've replaced your crappy "1366*768 native resolution only" display with something that has a better native resolution 5+ years ago when everyone else in the world was shifting to 1920*1200 (and wouldn't care now when people are shifting to 4K; and won't care when my OS is actually released, long after your piece of crud has suffered the same fate as all CRTs and become too dark even after you've set its brightness setting to maximum).

onlyonemac wrote:
    I don't remember mentioning anything about a 1366x768 display, but the CRT didn't need replacing because the monitor was fine once it was running at the correct resolution. Besides, it was my parents' old computer and that's all I had. A display is only "crappy" if your OS prevents the user from making proper use of it.

1366*768 is the highest resolution that the Olevia LT26HVE manual says it supports. Something is crappy if it fails to implement the relevant industry standards correctly; and no work-around (including end user config, and including a "carefully designed monitor description file") changes that.
Brendan wrote:
    It's fundamentally wrong because:
    - The minimum requirements are that software "just works" (ie. without any end user wankery).
    - Providing end user configuration means that the end user will misconfigure it and screw everything up, and cause a whole pile of "PEBCAK" bug reports.
    - Every single setting is a sign that the OS developer is an incompetent moron that failed to avoid the need for it.

onlyonemac wrote:
    First point: it's impossible to make software that "just works" to the extent that no workarounds will ever be needed.

It's "possible in theory" to make almost all hardware "just work". Realistically, there's a limited amount of time to write suitable device drivers, so a lot of hardware that could "just work" won't be supported. This is a huge issue that affects all OSs (including Windows) and that none of us can avoid.

In addition to the large number of devices that could "just work" but aren't supported, there will be a tiny number of devices that can't "just work" and also aren't supported. The tiny number of devices that can't "just work" are nothing more than an irrelevant distraction.
onlyonemac wrote:
    Second point: users will only need to worry about the configuration if their monitor doesn't work, whence they can follow a simple instruction like "please enter the native resolution of your monitor".

To describe a video timing correctly it's necessary to know the horizontal/vertical resolution, horizontal/vertical sync polarity, horizontal/vertical front porch times, horizontal/vertical back porch times, horizontal/vertical sync width times, the pixel clock frequency, and whether it's interlaced or not. Given that most OS developers seem to be too stupid to realise that horizontal/vertical resolution alone is completely inadequate (and that the majority of monitor manuals lack detailed information); what makes you think the average clueless user will be able to provide the necessary information?
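To make it obvious how much more than "native resolution" is involved, here's a sketch of what one complete video timing looks like (field and function names are illustrative, not from any real OS; the 640x480 @ 60 Hz numbers are the well-known VESA DMT values):

```python
from dataclasses import dataclass

@dataclass
class VideoTiming:
    """One complete video timing (names are illustrative, not from any OS)."""
    h_active: int        # visible pixels per line
    h_front: int         # horizontal front porch (pixels)
    h_sync: int          # horizontal sync width (pixels)
    h_back: int          # horizontal back porch (pixels)
    v_active: int        # visible lines per frame
    v_front: int         # vertical front porch (lines)
    v_sync: int          # vertical sync width (lines)
    v_back: int          # vertical back porch (lines)
    pixel_clock_hz: int  # pixel clock frequency
    h_sync_positive: bool
    v_sync_positive: bool
    interlaced: bool

    def h_total(self) -> int:
        return self.h_active + self.h_front + self.h_sync + self.h_back

    def v_total(self) -> int:
        return self.v_active + self.v_front + self.v_sync + self.v_back

    def refresh_hz(self) -> float:
        # pixel clock divided by total pixels per frame; interlacing
        # doubles the field rate for the same pixel clock
        rate = self.pixel_clock_hz / (self.h_total() * self.v_total())
        return rate * 2 if self.interlaced else rate

# VESA DMT 640x480 @ 60 Hz: 25.175 MHz pixel clock, negative sync on both axes
vga = VideoTiming(640, 16, 96, 48, 480, 10, 2, 33, 25_175_000,
                  False, False, False)
print(f"{vga.h_total()}x{vga.v_total()} total, {vga.refresh_hz():.2f} Hz")
```

Note how the actual refresh rate (59.94 Hz here, from an 800x525 total raster) can't even be computed from the resolution alone; every one of those fields changes what the monitor receives.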
onlyonemac wrote:
    Third point: it's not the OS developer's fault that some monitors report incorrect data.

If it's on the OS's supported hardware list but fails to work without end user hassle, then that is the OS developer's fault (either it should work or it shouldn't be on the list).
If it's not on the OS's supported hardware list, the OS developer is not responsible for something they don't support. In this case it's the user's fault for using unsupported hardware (they gambled, they lost).
Standards (e.g. EDID) make it easier for an OS to add devices to their supported hardware list (but only if the standards are implemented by the device correctly) - e.g. an OS developer can say "All monitors with correct EDID are on my supported hardware list".
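As a concrete illustration, "correct EDID" is cheap to check mechanically; a minimal sanity test for a 128-byte EDID base block might look like this (the function name is hypothetical, but the fixed 8-byte header and the mod-256 checksum are what the VESA EDID standard specifies):

```python
# Fixed 8-byte header that every EDID base block must start with.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_looks_valid(block: bytes) -> bool:
    """Minimal sanity check for a 128-byte EDID base block: the fixed
    header must be present, and all 128 bytes must sum to 0 mod 256
    (the final byte is a checksum chosen to make that true)."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# A synthetic block: header, 119 zero bytes, then the matching checksum byte.
fake = EDID_HEADER + bytes(119) + bytes([(-sum(EDID_HEADER)) % 256])
print(edid_looks_valid(fake))                 # True
print(edid_looks_valid(fake[:-1] + b"\x01"))  # False (bad checksum)
```

Of course this only tells you the block is well-formed, not that the timings inside it are truthful, which is exactly why a monitor that passes this check can still need a description-file override.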
Cheers,
Brendan