Dodgy EDIDs (was: What does your OS look like?)

Questions about which tools to use, bugs, the best way to implement a function, etc. should go here. Don't forget to check whether your question is answered in the wiki first! When in doubt, post here.
User avatar
Schol-R-LEA
Member
Posts: 1925
Joined: Fri Oct 27, 2006 9:42 am
Location: Athens, GA, USA

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Schol-R-LEA »

My point isn't that the problem isn't with the companies, or that enabling such idiocies is a good idea in any absolute sense; my point is that these idiocies are in fact SOP across the majority of the business world, and that having a rational IT framework is so rare that you are writing your OS for what is effectively a non-existent customer base. Your design is fine from a technical perspective, and ideally, your goals are solid and desirable ones. The problem is that you are expecting your customers to thank you for telling them they are doing things wrong, and that is never going to work even (or perhaps especially) if they are.
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
Ordo OS Project
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,
onlyonemac wrote:
Brendan wrote:And I repeat, you're basing your opinion on graphics stacks that are completely different (aren't resolution independent, don't anti-alias everything, do have everything aligned perfectly on pixel boundaries, don't do "fixed frame rate, variable quality", etc), and you're basing your opinion on ancient monitors (where pixels are relatively large) and not current/future monitors (where pixels are so small that it makes very little difference) which compounds the problem; and because of these things your opinion is misguided at best.
What you seem to be failing to recognise is this: when I ran my 1280x1024 LCD panel at 1024x768, each physical pixel on the display represented not one, not half, but an odd fraction of each pixel in the video signal. That means that, no matter how resolution-independent your operating system is, there will be some inherent blur introduced by the LCD not running at its native resolution.
Sigh. Forget about ancient video modes that nobody has cared about for 5+ years already (where everything will look blurry regardless of whether it's the native resolution or not, because the resolution is so pathetic in the first place). Start caring about something that won't be a vague story told by old men to their grandchildren in 2026 about how, "back in my day", we had to heat the iron up in a wood stove before ironing our clothes and computer displays had lower resolutions than a $2 wrist watch (and weren't even 3D/stereoscopic).
onlyonemac wrote:There is no way around that blur, because it is caused by the fact that the pixels that your OS is producing for display and the pixels that are being displayed do not have an integer correlation, meaning that some of the video signal pixels will "fall in the gaps" leading to significant loss of quality and thus legibility. No matter how good your font rendering is, you can't get around the blurry text issue when for example the cross-stroke on the letter "t" keeps falling in between two physical device pixels.
Resolution independence means that nothing being displayed will have an integer correlation anyway. "Everything 3D" means very little will be perfectly vertical or perfectly horizontal or perfectly parallel with the screen (which may not be flat anyway).

Focal blur means that things that aren't at the focal distance will be deliberately blurred more. If you put your text editor at a freaky/unusual angle; you're supposed to see something like this:
[Image] Note: The picture shows a photo of a page from a book, where the text in the centre of the photo is crystal clear (in focus), but gradually gets worse (more blurry) until it's very blurry at the top and bottom edges of the photo due to being completely out of focus.

"Fixed frame rate, variable quality" means that if the renderer can't keep up it will reduce detail/quality. I have no choice but to depend on software rendering (native drivers with full GPU support don't magically appear out of nowhere and I doubt NVidia is going to write them for me this week). With a software renderer the limiting factor is CPU processing time; so increasing the resolution just causes a decrease in rendering quality and ends up worse and not better. If you didn't spend $750+ on an Intel CPU in the last 12 months you can forget about using 1920*1200 video mode on your 5 year old monitor (unless you've got gigabit network and are able to dedicate multiple computers to rendering alone).


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,
Schol-R-LEA wrote:My point isn't that the problem isn't with the companies, or that enabling such idiocies is a good idea in any absolute sense; my point is that these idiocies are in fact SOP across the majority of the business world, and that having a rational IT framework is so rare that you are writing your OS for what is effectively a non-existent customer base. Your design is fine from a technical perspective, and ideally, your goals are solid and desirable ones. The problem is that you are expecting your customers to thank you for telling them they are doing things wrong, and that is never going to work even (or perhaps especially) if they are.
In 2009, Toyota manufactured cars (Prius) with a design flaw in the braking system. Therefore anyone driving a car in 2026 should be forced to carry a heavy anchor with them, just so car manufacturers don't need to care if the cars they produce have design flaws in the braking system.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
User avatar
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Rusky »

Brendan wrote:Note that one of the reasons I posted my screenshots was because someone on IRC said "15-bpp/16-bpp looks bad and shouldn't ever be used", and I wanted to show that (even for an "almost pathological" case involving large smooth gradients) it's extremely hard to tell the difference between 15-bpp/16-bpp and 24-bpp when graphics is done right.
Your 15-bpp image is very clearly less than 24-bpp: [image]

I suspect text in 3D that never exactly aligns with the display and can be blurred by depth-of-field effects will be similarly obvious compared to properly-rasterized text, even at a display's native resolution, and I guarantee it will be obvious at a non-native resolution.
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,
Rusky wrote:
Brendan wrote:Note that one of the reasons I posted my screenshots was because someone on IRC said "15-bpp/16-bpp looks bad and shouldn't ever be used", and I wanted to show that (even for an "almost pathological" case involving large smooth gradients) it's extremely hard to tell the difference between 15-bpp/16-bpp and 24-bpp when graphics is done right.
Your 15-bpp image is very clearly less than 24-bpp: [image]
For comparison: [image]. Also note that dithering works much better at higher resolutions.
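
To show what "done right" involves here (a rough sketch only, not my actual code): reduce an 8-bit channel to the 5 bits used by 15-bpp modes with a 4x4 ordered (Bayer) dither, so quantisation error is spread over an area instead of piling up into visible bands.

Code: Select all

#include <stdint.h>
#include <stdio.h>

/* 4x4 Bayer threshold matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Reduce one 8-bit channel to 5 bits with ordered dithering. The
   threshold varies per pixel, so an area of equal input quantises to
   a mix of neighbouring 5-bit levels instead of one banded level. */
static uint8_t dither_8_to_5(uint8_t v, int x, int y)
{
    int t = bayer4[y & 3][x & 3] / 2;   /* 0..7: one 5-bit step is 8 values */
    int q = (v + t) >> 3;               /* 0..32 */
    return (uint8_t)(q > 31 ? 31 : q);
}

int main(void)
{
    /* Dither a slow horizontal gradient. */
    for (int x = 0; x < 8; x++)
        printf("in=%3d out=%2d\n", 100 + x * 4,
               dither_8_to_5((uint8_t)(100 + x * 4), x, 0));
    return 0;
}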
Rusky wrote:I suspect text in 3D that never exactly aligns with the display and can be blurred by depth-of-field effects will be similarly obvious compared to properly-rasterized text, even at a display's native resolution, and I guarantee it will be obvious at a non-native resolution.
"Properly rasterized" depends on how you define it. I define it as "identical to creating a physical copy using the highest accuracy possible (e.g. a professional sign-writer using a brush to paint free-flowing calligraphy) then taking a photo at whatever zoom, resolution, angle and lighting conditions you want with a high quality camera".

The 3D rendering and focal blur are supposed to be noticeable, otherwise there's no point.

For prehistoric resolutions (1280*1024 and worse) the only thing that matters is the physical size and weight of the monitor for garbage disposal purposes. For modern display resolutions (e.g. ~1920*1600) I doubt anyone will be able to tell the difference at first glance; they would need to be specifically looking for it (and it certainly won't make their eyes water enough to stop them using the display). For future display resolutions (what I'm designing this for, and the only thing that actually matters) I very much doubt that anyone will be able to tell the difference without a 10+ meter wide screen, under sane viewing conditions (i.e. without their nose touching the screen).

Of course the entire "not native resolution" nonsense completely overlooks the fact that either you have no choice at all (no native video driver and you didn't get lucky with VBE) or that it's virtually non-existent (you do have a native video driver and/or got lucky with VBE, but couldn't get the native resolution because the monitor has not just one but two hardware bugs). It's like I'm saying "X * Y < 0.00001" (i.e. the chance of this nonsense mattering is so close to zero that it's completely pointless caring about it at all) and you keep trying to convince me that "X might be > 0.5", despite the fact that the value of X (how much "not native resolution" matters) and the value of Y (how likely it is that end user configuration would make any difference at all) are both completely irrelevant in isolation.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
onlyonemac
Member
Posts: 1146
Joined: Sat Mar 01, 2014 2:59 pm

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by onlyonemac »

Brendan wrote:The resolution argument here is similar - people saying "anything that isn't the native resolution is bad", not so much because it is bad, but more because they rarely see graphics done right.
Again what you're failing to realise is the significance of the "non-native" part. Nobody (except you) is saying that 1366x768 resolution looks bad in and of itself; what we are saying is that 1366x768 on a 1920x1080 LCD panel looks bad. "Graphics done right" might get around the "low" 1366x768 resolution, but it won't get around the blur that occurs when only every 1 in 3 (approximately) pixels aligns with the display's matrix.
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,
onlyonemac wrote:
Brendan wrote:The resolution argument here is similar - people saying "anything that isn't the native resolution is bad", not so much because it is bad, but more because they rarely see graphics done right.
Again what you're failing to realise is the significance of the "non-native" part. Nobody (except you) is saying that 1366x768 resolution looks bad in and of itself; what we are saying is that 1366x768 on a 1920x1080 LCD panel looks bad. "Graphics done right" might get around the "low" 1366x768 resolution, but it won't get around the blur that occurs when only every 1 in 3 (approximately) pixels aligns with the display's matrix.
Wrong.

Smooth areas (including gradients) are unaffected. Harsh edges that don't fall on exact pixel boundaries in the source image (and are blurred in the source image) get blurred a little bit more when scaled up by the monitor (but this minor additional blur is likely on par with error caused by sub-pixel format and gaps between pixels on a real 1366*768 monitor anyway).

Harsh edges that do fall on exact pixel boundaries in the source image are the pathological case. This pathological case doesn't exist for my OS for all the reasons I've repeatedly mentioned (resolution independence, 3D rendering, focal blur). It's existing OSs like Windows, where the pathological case dominates most graphics (everything except 3D games), that are responsible for your continued misguided ignorance.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
onlyonemac
Member
Posts: 1146
Joined: Sat Mar 01, 2014 2:59 pm

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by onlyonemac »

Brendan wrote:
onlyonemac wrote:
Brendan wrote:The resolution argument here is similar - people saying "anything that isn't the native resolution is bad", not so much because it is bad, but more because they rarely see graphics done right.
Again what you're failing to realise is the significance of the "non-native" part. Nobody (except you) is saying that 1366x768 resolution looks bad in and of itself; what we are saying is that 1366x768 on a 1920x1080 LCD panel looks bad. "Graphics done right" might get around the "low" 1366x768 resolution, but it won't get around the blur that occurs when only every 1 in 3 (approximately) pixels aligns with the display's matrix.
Wrong.

Smooth areas (including gradients) are unaffected. Harsh edges that don't fall on exact pixel boundaries in the source image (and are blurred in the source image) get blurred a little bit more when scaled up by the monitor (but this minor additional blur is likely on par with error caused by sub-pixel format and gaps between pixels on a real 1366*768 monitor anyway).
Fair enough. Probably more than the blur caused by sub-pixel format artefacts, but yes, for a perfect gradient you won't notice much of a difference. The point is that perfect gradients are neither the most important nor the most common graphic to represent correctly.
Brendan wrote:Harsh edges that do fall on exact pixel boundaries in the source image are the pathological case. This pathological case doesn't exist for my OS for all the reasons I've repeatedly mentioned (resolution independence, 3D rendering, focal blur). It's existing OSs like Windows, where the pathological case dominates most graphics (everything except 3D games), that are responsible for your continued misguided ignorance.
If that's your approach to graphics rendering, then I sincerely hope that you don't plan on displaying any text. If your OS has so much "focal blur" that additional monitor scaling blur is insignificant, then I wouldn't want to try reading text output by your OS. It is well known that unless text characters have well-defined shapes they are inherently illegible, and I believe that your rendering process may lead to such illegibility. I advise that you post some screenshots of your rendering output for some sighted users here to comment on the legibility thereof.
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
User avatar
Schol-R-LEA
Member
Posts: 1925
Joined: Fri Oct 27, 2006 9:42 am
Location: Athens, GA, USA

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Schol-R-LEA »

Brendan wrote: In 2009, Toyota manufactured cars (Prius) with a design flaw in the braking system. Therefore anyone driving a car in 2026 should be forced to carry a heavy anchor with them, just so car manufacturers don't need to care if the cars they produce have design flaws in the braking system.
This analogy doesn't fit the circumstances, as the flaws in the product are not the issue - the IT managers' irrational insistence on using the flawed product is. A better analogy (one for your policy, rather than the converse) would be a road that did not allow cars onto it if they had the capacity to exceed the maximum speed limit - that is to say, if the street in question had a maximum speed limit of 80 km/h, a barrier would prevent any cars with a higher maximum speed from entering the roadway.

While you could argue that this comparison is absurd - there are effectively no cars currently on the road that cannot go faster than 80 km/h, whereas only a handful of monitors have the problems described - you have to consider that when you look at all the components of a stock PC system, the odds that at least one component or peripheral will not be compliant rise drastically, especially when you begin looking at older hardware.

My point has less to do with the configuration policy itself than it does your expectations that clients will go out of their way to support your OS, rather than the other way around. Should they? If the benefits of your system are sufficient, then yes, of course they should. Will they? I wouldn't bet on it.

To put it another way, you are using a technical argument regarding a psychological problem.
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
Ordo OS Project
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,
onlyonemac wrote:
Brendan wrote:Harsh edges that do fall on exact pixel boundaries in the source image are the pathological case. This pathological case doesn't exist for my OS for all the reasons I've repeatedly mentioned (resolution independence, 3D rendering, focal blur). It's existing OSs like Windows, where the pathological case dominates most graphics (everything except 3D games), that are responsible for your continued misguided ignorance.
If that's your approach to graphics rendering, then I sincerely hope that you don't plan on displaying any text. If your OS has so much "focal blur" that additional monitor scaling blur is insignificant, then I wouldn't want to try reading text output by your OS. It is well known that unless text characters have well-defined shapes they are inherently illegible, and I believe that your rendering process may lead to such illegibility.
Obviously focal blur doesn't apply to things that are in focus (e.g. text you're actually trying to read) and exists to add realism to everything else in the scene (other windows in the foreground/background, etc).

With full anti-aliasing and "nothing on pixel boundaries", for text to be readable (when in focus) the only thing that really matters is the number of pixels per character. Let's look at 4 character sizes:
  • 6*8 pixel characters, which is roughly the lower limit (blurry but still readable) for native resolution
  • 8*10 pixel characters, which would be the lower limit (blurry but still readable) for the non-existent "not native resolution" case
  • 12*16 pixel characters, which is about what you'd want for native resolution
  • 16*20 pixel characters, which is about what you'd want for the non-existent "not native resolution" case
And 2 display sizes:
  • 380 mm * 210 mm (a "17-inch" 16:9 screen), which is fairly common for laptops
  • 1080 mm * 670 mm (a "50-inch" 16:10 screen), which is about the largest screen I'd consider sane for desktop use (and probably more suited to lounge room where the user is much further from the screen)
Let's also assume characters are 4 mm wide and 6 mm tall (because I measured the character size I've been using in my text editor for years and that's what I got).

From this we can calculate the resolution you'd need for each of the permutations (the arithmetic is also sketched as code after the lists below).

For native resolutions we get:
  • 6*8 pixel characters:
    • 380 mm * 210 mm screen: 570*280
    • 1080 mm * 670 mm screen: 1620*893
  • 12*16 pixel characters:
    • 380 mm * 210 mm screen: 1140*560
    • 1080 mm * 670 mm screen: 3240*1786
And for the non-existent "not native resolutions" we get:
  • 8*10 pixel characters:
    • 380 mm * 210 mm screen: 760*350
    • 1080 mm * 670 mm screen: 2160*1116
  • 16*20 pixel characters:
    • 380 mm * 210 mm screen: 1520*700
    • 1080 mm * 670 mm screen: 4320*2233
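
As promised, a quick sketch of the arithmetic in C (same assumed 4 mm * 6 mm characters; the character counts are just screen size divided by character size):

Code: Select all

#include <stdio.h>

/* Sketch of the arithmetic above: how many pixels does a screen need,
   if characters are 4 mm * 6 mm and we want a given number of pixels
   per character? (Assumed sizes, as in the post.) */

#define CHAR_W_MM 4.0
#define CHAR_H_MM 6.0

static void needed_resolution(double screen_w_mm, double screen_h_mm,
                              int char_w_px, int char_h_px)
{
    double cols = screen_w_mm / CHAR_W_MM;   /* characters per row */
    double rows = screen_h_mm / CHAR_H_MM;   /* rows of characters */
    printf("%4.0f x %4.0f mm, %2d*%2d px chars: need %d*%d\n",
           screen_w_mm, screen_h_mm, char_w_px, char_h_px,
           (int)(cols * char_w_px), (int)(rows * char_h_px));
}

int main(void)
{
    needed_resolution( 380, 210,  6,  8);   /* laptop, lower limit      */
    needed_resolution( 380, 210, 12, 16);   /* laptop, what you'd want  */
    needed_resolution(1080, 670,  6,  8);   /* huge screen, lower limit */
    needed_resolution(1080, 670, 12, 16);   /* huge screen, preferred   */
    return 0;
}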
Basically; for 2026 (what it's designed for) where I expect 4K resolutions to be common, this is fine even for the "not native resolution on huge monitor" case.

For today's hardware (e.g. 1366*768 on laptops, 1920*1200 on desktop) the resolution isn't quite enough for the huge monitor - it'd be "readable but blurry" but less than what you'd want.

Of course we can do the calculations in reverse - start from the resolution and calculate the maximum screen size (also sketched as code after this list). For the "what you'd want" number of pixels per character (12*16 pixels and 16*20 pixels) this gives:
  • 1366*768 with native resolution: 455 mm by 288 mm or smaller screen (roughly equivalent to a 21-inch laptop screen)
  • 1366*768 with non-native resolution: 341 mm by 230 mm or smaller screen
  • 1920*1200 with native resolution: 640 mm by 450 mm or smaller screen (roughly equivalent to a 30-inch desktop screen)
  • 1920*1200 with non-native resolution: 480 mm by 360 mm or smaller screen
  • 4096*2233 with native resolution: 1365 mm by 837 mm or smaller screen (roughly equivalent to a 64-inch lounge room screen)
  • 4096*2233 with non-native resolution: 1024 mm by 669 mm or smaller screen
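
And the reverse calculation, sketched the same way (same assumptions):

Code: Select all

#include <stdio.h>

/* The reverse calculation: given a resolution and a desired number of
   pixels per character, the largest screen (in mm) on which 4 mm * 6 mm
   characters still get that many pixels. (Same assumptions as above.) */
static void max_screen(int res_w, int res_h, int char_w_px, int char_h_px)
{
    printf("%4d*%4d with %2d*%2d px chars: %d mm by %d mm or smaller\n",
           res_w, res_h, char_w_px, char_h_px,
           (int)((double)res_w / char_w_px * 4.0),   /* columns * 4 mm */
           (int)((double)res_h / char_h_px * 6.0));  /* rows * 6 mm    */
}

int main(void)
{
    max_screen(1366,  768, 12, 16);   /* native resolution     */
    max_screen(1366,  768, 16, 20);   /* non-native resolution */
    max_screen(1920, 1200, 12, 16);
    max_screen(4096, 2233, 12, 16);
    return 0;
}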
onlyonemac wrote:I advise that you post some screenshots of your rendering output for some sighted users here to comment on the legibility thereof.
My "roadmap" looks like this:
  • 1 month implementing and optimising the scene renderer
  • 1 month researching and designing a font file format
  • 2 weeks to implement some sort of utility to allow me to create font files, and to create a font file that at least covers ASCII characters
  • 1.5 months adding text rendering to the scene renderer, including text layout and auto-kerning
This implies that I won't be able to post screen shots containing text before June. Screen shots showing "empty white windows" (with support for stereoscopy, "distortion free" curved/non-flat monitors, dynamic lighting/shadows and focal blur) will be next (about 1 month).


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,

My point was that (in general) providing end user configuration encourages (or fails to discourage) manufacturers from producing/selling dodgy hardware, and (over the long term) increases unnecessary end user hassle (the need to diddle with settings before something that should have worked properly actually does work properly); and that all OS developers (especially people like Microsoft, Apple and Linux, who are large enough to affect hardware manufacturers) should refuse to provide "end-user configuration as work-around for faulty products" for the sake of end users/consumers.
Schol-R-LEA wrote:My point has less to do with the configuration policy itself than it does your expectations that clients will go out of their way to support your OS, rather than the other way around. Should they? If the benefits of your system are sufficient, then yes, of course they should. Will they? I wouldn't bet on it.

To put it another way, you are using a technical argument regarding a psychological problem.
To avoid "device/s don't work because there isn't any device driver/s" I have to expect users to use supported hardware. I don't have any choice, and neither does any other OS developer. Whether users are willing to replace unsupported hardware or not is mostly out of my control - the only thing I can do is increase the benefits of using the OS (to make them more willing to replace unsupported hardware); but I'm already doing (or attempting to do) everything I possibly can to increase the benefits of using the OS (for its intended purpose) for multiple reasons.

I will continue to expect users to use supported hardware to avoid "slightly more blurry graphics" on faulty monitors that don't/won't exist (unless users would rather have "slightly more blurry graphics"). In theory I do have some choice here - I could choose to make multiple things worse, knowing that (even if the problem actually existed) it's still nothing in comparison to the unavoidable "lack of drivers" problem.

Basically you could summarise my position as:
  • The "slightly more blurry graphics on faulty hardware" problem won't exist by the time my OS is released (and is already extremely rare)
  • If it did exist; "slightly more blurry graphics" would be a minor issue anyway
  • If it did exist and wasn't a minor issue; adding end-user configuration/work-arounds won't make any real difference to the number of people willing/not willing to use my OS
  • If it did exist, wasn't a minor issue and did affect the number of people willing to use my OS; "end user work-around for faulty hardware" only cures short term symptoms and makes the problem worse in the long term; so I'd still refuse.
People can argue about each of these things in isolation (and have been); but even if I'm wrong about 3 of these things it makes no difference, and "end user configuration as work-around for slightly more blurry graphics on faulty hardware" is still a bad idea.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
onlyonemac
Member
Posts: 1146
Joined: Sat Mar 01, 2014 2:59 pm

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by onlyonemac »

Brendan wrote:
onlyonemac wrote:
Brendan wrote:Harsh edges that do fall on exact pixel boundaries in the source image are the pathological case. This pathological case doesn't exist for my OS for all the reasons I've repeatedly mentioned (resolution independence, 3D rendering, focal blur). It's existing OSs like Windows where the pathological cases dominates most graphics (everything except 3D games) that is responsible for your continued misguided ignorance.
If that's your approach to graphics rendering, then I sincerely hope that you don't plan on displaying any text. If your OS has so much "focal blur" that additional monitor scaling blur is insignificant, then I wouldn't want to try reading text output by your OS. It is well known that unless text characters have well-defined shapes then they are inherently illegible, and I believe that your rendering process may lead to such illegibility.
Obviously focal blur doesn't apply to things that are in focus (e.g. text you're actually trying to read) and exists to add realism to everything else in the scene (other windows in the foreground/background, etc).

With full anti-aliasing and "nothing on pixel boundaries", for text to be readable (when in focus) the only thing that really matters is the number of pixels per character. Let's look at 4 character sizes:
  • 6*8 pixel characters, which is roughly the lower limit (blurry but still readable) for native resolution
  • 8*10 pixel characters, which would be the lower limit (blurry but still readable) for the non-existent "not native resolution" case
  • 12*16 pixel characters, which is about what you'd want for native resolution
  • 16*20 pixel characters, which is about what you'd want for the non-existent "not native resolution" case
<snip>
The discussion isn't about how big the characters are; the discussion is about the fact that if your video pixels are falling in between the physical pixels due to a non-native resolution then your text won't be legible unless it is *very* large. At "normal" text sizes you'll find that it becomes significant to consider how many rows of pixels are falling "in the gaps" on the physical display.
Brendan wrote:My point was that (in general) providing end user configuration encourages (or fails to discourage) manufacturers from producing/selling dodgy hardware, and (over the long term) increases unnecessary end user hassle (the need to diddle with settings before something that should have worked properly actually does work properly); and that all OS developers (especially people like Microsoft, Apple and Linux, who are large enough to affect hardware manufacturers) should refuse to provide "end-user configuration as work-around for faulty products" for the sake of end users/consumers.
And my point is that, in general, one should always provide end-user configuration as a work-around for faulty products. Had Linux not offered that back when I was still using that "buggy" CRT monitor, I would never have switched from Windows to Linux because my frustration with the monitor running at a resolution where everything was blurry and flickering was enough to render the operating system practically unusable.
Brendan wrote:"end user configuration as work-around for slightly more blurry graphics on faulty hardware" is still a bad idea.
You can't keep insisting that:
  • The graphics are only "slightly more blurry" when multiple users here have commented otherwise
  • Providing end-user configuration is a bad idea, without justification and when multiple users here have commented otherwise
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
User avatar
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by Brendan »

Hi,
onlyonemac wrote:The discussion isn't about how big the characters are; the discussion is about the fact that if your video pixels are falling in between the physical pixels due to a non-native resolution then your text won't be legible unless it is *very* large. At "normal" text sizes you'll find that it becomes significant to consider how many rows of pixels are falling "in the gaps" on the physical display.
Erm. I have no idea what you're talking about now.

I was talking about upscaling (from a lower resolution up to the native resolution) and assuming bilinear scaling; where if a source pixel straddles 2 or more destination pixels it's distributed correctly to those destination pixels, causing some additional blur (and causing people to need/want more pixels per character to get a clearly defined/readable character).

You seem to be talking about downscaling (for some unknown reason) and assuming nearest neighbour scaling (for some unknown reason); where if a source pixel isn't a "nearest pixel" it gets ignored ("falls in between").
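
To be concrete about the upscaling case (a 1D sketch of linear interpolation, not actual scaler code): each destination pixel samples between the two nearest source pixels, so an edge that was sharp in the source is spread over an extra pixel or so.

Code: Select all

#include <stdio.h>

/* 1D linear-interpolation upscaler (a sketch of the bilinear case,
   not actual scaler code): each destination pixel samples between
   the two nearest source pixels. */
static double sample(const double *src, int src_n, double pos)
{
    int i = (int)pos;
    int i2 = (i + 1 < src_n) ? i + 1 : i;
    double frac = pos - i;
    return src[i] * (1.0 - frac) + src[i2] * frac;
}

int main(void)
{
    /* A harsh edge that is sharp in the 4-pixel source... */
    double src[4] = { 0.0, 0.0, 1.0, 1.0 };
    int dst_n = 6;

    /* ...upscaled to 6 pixels: the edge now straddles destination
       pixels, so it picks up a little extra blur.
       Output: 0.00 0.00 0.20 0.80 1.00 1.00 */
    for (int j = 0; j < dst_n; j++)
        printf("dst %d: %.2f\n", j,
               sample(src, 4, j * 3.0 / (dst_n - 1)));  /* map 0..5 to 0..3 */
    return 0;
}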
onlyonemac wrote:And my point is that, in general, one should always provide end-user configuration as a work-around for faulty products. Had Linux not offered that back when I was still using that "buggy" CRT monitor, I would never have switched from Windows to Linux because my frustration with the monitor running at a resolution where everything was blurry and flickering was enough to render the operating system practically unusable.
There are so many different reasons why graphics on Linux is "less than good" (especially on the ancient Linux that was around in the "CRT era" - it has improved since) that it's impossible for me to determine how your problem relates to anything. Note that I do vaguely remember trying to get Linux to work in the 1990s, and Xorg defaulting to a hideous video mode that abused VGA registers to get "800*600 with 16 colours", where the refresh rate was so bad (due to relying on VGA's pixel clock, which only goes up to about 30 MHz) that it was like it was designed specifically to cause seizures. Of course it's likely that your problem is different (and isn't the same as the "Xorg didn't use auto-configuration and had bad default configuration" problem that I remember).
onlyonemac wrote:You can't keep insisting that:
  • The graphics are only "slightly more blurry" when multiple users here have commented otherwise
  • Providing end-user configuration is a bad idea, without justification and when multiple users here have commented otherwise
I'm not sure if you've used the Internet before. How it works is that when people agree they say nothing, so conversations tend to be dominated by a vocal minority that disagrees.

Multiple users have commented that "slightly more blurry" is bad for existing OSs (where everything is aligned to pixel boundaries) on ancient/low resolution screens. They're correct, but it's completely irrelevant. Nobody can say that "slightly more blurry" is bad for the way I'll be rendering graphics on future/high resolution screens for the "faulty monitor that wouldn't have existed for 15 years at the time the OS is released" case.

But enough about me... Let's talk about your OS.

How does your OS currently handle video mode auto-selection (and ensure that the chosen video mode is actually visible/supported by the monitor)? How does your OS avoid the "user has to change a setting to get video to work but user can't see anything because video doesn't work" problem? How long did it take you to write all of the native video drivers so that you could avoid "not-native resolution" on all computers?


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
onlyonemac
Member
Posts: 1146
Joined: Sat Mar 01, 2014 2:59 pm

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by onlyonemac »

I am talking about upscaling.

Imagine this: you have a monitor 6 pixels high and a video signal 4 pixels high. The video signal consists of alternating black and white rows. This is what you'll get on the monitor: the first row will be completely black, the second row will be grey because it's halfway between the black and the white row, the third row will be white because it's in line with the end of the second row of the signal, the fourth row will be black again, the fifth row will be grey, and the sixth row will be white. Oops - our sharp black-and-white lines have become two black-to-white 3-pixel-high gradients end to end.

Now imagine this: your signal is 3 pixels high with a black row in the middle and the other two rows are white. It's rendered on a display 4 pixels high. The first row of the display will be white, the second row will be grey because it's halfway between the black row and the white row, the third row will also be grey because it's halfway between a black row and a white row, and the last row will be white. Oh dear, now the black line's disappeared completely and turned into a grey line of twice the thickness! This is what I mean by "falling in the gap", and had that black line been the cross-stroke on a letter "t" the letter would now be much harder to recognise (especially considering that other parts of the letter will probably remain black depending on their alignment with the display).

(Of course, these examples apply to any monitor and video signal whose dimensions are in the same ratio as in the example, and similar effects will occur with other non-integer ratios as well.)

The bottom line is, when you're upscaling by a non-integer ratio, you're going to get some blur.
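
To make the examples concrete (a rough 1D sketch using the "distribute each source row to the destination rows it straddles" arithmetic; a real monitor's scaler may weight things slightly differently, but the smearing is the same):

Code: Select all

#include <stdio.h>

/* Overlap-weighted 1D upscaler: each source row is distributed to the
   destination rows it straddles, weighted by how much it overlaps them. */
static void upscale(const double *src, int sn, double *dst, int dn)
{
    double scale = (double)dn / sn;   /* each source row covers 'scale' dst rows */
    for (int j = 0; j < dn; j++)
        dst[j] = 0.0;
    for (int i = 0; i < sn; i++) {
        double lo = i * scale, hi = (i + 1) * scale;
        for (int j = (int)lo; j < dn && j < hi; j++) {
            double a = lo > j ? lo : j;          /* overlap of source span */
            double b = hi < j + 1 ? hi : j + 1;  /* with destination row j */
            dst[j] += src[i] * (b - a);
        }
    }
}

static void show(const char *label, const double *v, int n)
{
    printf("%s:", label);
    for (int j = 0; j < n; j++)
        printf(" %.2f", v[j]);
    printf("\n");
}

int main(void)
{
    /* Example 1: 4 alternating black(0.0)/white(1.0) rows -> 6 rows.
       Output: 0.00 0.50 1.00 0.00 0.50 1.00 - two gradients, as described. */
    double a[4] = { 0, 1, 0, 1 }, ra[6];
    upscale(a, 4, ra, 6);
    show("4->6", ra, 6);

    /* Example 2: white, black, white -> 4 rows. The black line vanishes
       into two grey rows: 1.00 0.33 0.33 1.00 (the exact grey level
       depends on the filter, but the line is gone either way). */
    double b[3] = { 1, 0, 1 }, rb[4];
    upscale(b, 3, rb, 4);
    show("3->4", rb, 4);
    return 0;
}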
Last edited by onlyonemac on Tue Feb 09, 2016 2:43 pm, edited 1 time in total.
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
FallenAvatar
Member
Posts: 283
Joined: Mon Jan 03, 2011 6:58 pm

Re: Dodgy EDIDs (was: What does your OS look like?)

Post by FallenAvatar »

onlyonemac wrote:The bottom line is, when you're upscaling by a non-integer ratio, you're going to get some blur.
And Brendan has already addressed this in his post where he listed resolutions and sizes of characters and monitors. Just because "some" blurring has happened does not mean the characters will be illegible. They CAN become illegible unless you take some precautions (such as picking a font size where even when blurred the text is still legible, which Brendan has already stated he is looking into).
onlyonemac wrote:
  • The graphics are only "slightly more blurry" when multiple users here have commented otherwise
I would say a couple of vocal users (as in 2) have commented otherwise, with no proof and no work done to defend their position.

- Monk

P.S. I agree with Brendan's ideas, both that the blurred text isn't an issue and that user config is bad (in this case).