Standardized Representation of Colour
Posted: Thu Jun 04, 2009 5:48 am
Hi,
For my OS I want all graphics data to use a standardized representation for colours, where colours in this standardized representation are converted into a device's representation by the corresponding device driver (video card driver, printer driver, scanner driver, etc.).
Some systems of representing colours aren't able to represent all colours. For example, the RGB colours (or standardized "sRGB") used by monitors and video cards are inadequate for describing all colours. Wikipedia has a good picture illustrating this inadequacy (from the Wikipedia page on sRGB), where all colours outside the triangle can't be displayed correctly:
One thing to understand here is that even though a colour can't be displayed, colours still get mixed for things like alpha blending and anti-aliasing, and colours that can't be displayed may be mixed into colours that can be displayed. For example, consider this (256 * 128) picture:
If this is scaled down to an 8 * 4 picture (and then blown up again so you can see individual pixels) it becomes:
If the green band in the first picture was a colour that a monitor can't display, then the monitor would still be able to display all colours in the scaled-down image perfectly. However, if the green band is converted into a colour that the monitor can display before the scaling is done, then the resulting scaled-down image (which could have been displayed perfectly) would be wrong (e.g. the anti-aliased pixels would have the wrong hue).
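To make the ordering problem concrete, here's a small C sketch (the out-of-gamut green value is made up purely for illustration) that mixes a "green the monitor can't display" with white, once by mixing first and clamping to the displayable range afterwards, and once by clamping first:

Code:
#include <stdio.h>

/* Linear RGB triple; components outside 0.0..1.0 are outside the
 * monitor's gamut (a simplification for illustration). */
typedef struct { double r, g, b; } rgb_t;

/* Clamp a colour into the range the monitor can actually display. */
static rgb_t clamp(rgb_t c)
{
    if (c.r < 0.0) c.r = 0.0; else if (c.r > 1.0) c.r = 1.0;
    if (c.g < 0.0) c.g = 0.0; else if (c.g > 1.0) c.g = 1.0;
    if (c.b < 0.0) c.b = 0.0; else if (c.b > 1.0) c.b = 1.0;
    return c;
}

/* 50/50 mix of two colours (what scaling/anti-aliasing effectively does). */
static rgb_t mix(rgb_t a, rgb_t b)
{
    rgb_t out = { (a.r + b.r) / 2, (a.g + b.g) / 2, (a.b + b.b) / 2 };
    return out;
}

int main(void)
{
    /* Hypothetical "green the monitor can't display" - the negative red
     * and blue components put it outside the sRGB triangle. */
    rgb_t wide_green = { -0.20, 1.10, -0.05 };
    rgb_t white      = {  1.00, 1.00,  1.00 };

    rgb_t mix_then_clamp = clamp(mix(wide_green, white));
    rgb_t clamp_then_mix = mix(clamp(wide_green), white);

    printf("mix then clamp: %.3f %.3f %.3f\n",
           mix_then_clamp.r, mix_then_clamp.g, mix_then_clamp.b);
    printf("clamp then mix: %.3f %.3f %.3f\n",
           clamp_then_mix.r, clamp_then_mix.g, clamp_then_mix.b);
    return 0;
}

The two orderings give different results (0.400/1.000/0.475 vs 0.500/1.000/0.500); clamping first throws away the hue shift that the out-of-gamut green should have contributed, which is exactly the wrong-hue problem described above.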
Also, certain devices (e.g. digital cameras) have a far larger range of luminance than the human eye can handle. This is mostly so that software can adjust exposure after the picture has been taken (e.g. so that dark pictures can be made lighter without losing detail). Therefore I want to be able to represent a similar range of luminance (e.g. from absolute black to the brightness of the sun). This is also important for things like specular highlights in 3D models.
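As a rough illustration of why the extra range matters (the numbers here are made up), consider brightening a dark area after the fact; if the shades have already been squashed into 8 bits, nearby dark values collapse to the same number and no amount of brightening brings the detail back:

Code:
#include <stdio.h>

int main(void)
{
    /* Two nearby dark shades, as fractions of the sensor's maximum
     * brightness (values chosen only for illustration). */
    double dark_a = 0.0010;
    double dark_b = 0.0015;
    double exposure = 100.0;          /* "make the picture 100x lighter" */

    /* Wide-range representation: brighten first, the shades stay distinct. */
    printf("wide range: %.3f vs %.3f\n", dark_a * exposure, dark_b * exposure);

    /* 8-bit representation: quantise to 0..255 first, then brighten. */
    int a8 = (int)(dark_a * 255.0 + 0.5);
    int b8 = (int)(dark_b * 255.0 + 0.5);
    printf("8-bit range: %d vs %d (detail already lost: %s)\n",
           a8 * (int)exposure, b8 * (int)exposure,
           a8 == b8 ? "yes" : "no");
    return 0;
}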
My current thinking is to use something like HSV (Hue, Saturation and "Value"), where "Value" (brightness) is defined in lux and has a much larger range (e.g. from 0 lux to 65536 lux).
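A minimal sketch of what one pixel in such a format might look like (the field widths and names are my own guesses, just to make the idea concrete, not a fixed design):

Code:
#include <stdint.h>
#include <stdio.h>

/* Hypothetical standardized colour: HSV-like, but with "Value" stored as
 * an absolute brightness rather than a fraction of "full intensity". */
typedef struct {
    uint16_t hue;        /* which visible hue (encoding discussed below) */
    uint16_t saturation; /* 0x0000 = grey ... 0xFFFF = fully saturated */
    uint32_t value;      /* brightness as 16.16 fixed-point lux,
                            covering 0 lux up to 65536 lux */
} std_colour;

int main(void)
{
    printf("bytes per pixel: %zu\n", sizeof(std_colour));
    return 0;
}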
The main problem here is defining "hue". One idea I had was to use the equivalent wavelength of monochromatic light (e.g. 460 nm for blue, 520 nm for green, 700 nm for red, etc.) for some colours, and the ratio of blue to red for other colours (as there is no equivalent monochromatic light for purples). For example (with hue encoded as a 16-bit unsigned integer), 0x0000 could represent 460 nm light (blue), 0x2AAB could represent 495 nm light (cyan), 0x5555 could represent 520 nm light (green), and 0xAAAA could represent 700 nm light (red); then 0xD555 could represent an equal mixture of 460 nm light and 700 nm light (magenta), and 0xEAAA could represent a mixture where the 460 nm light is stronger than the 700 nm light (purple). This is good in that it's able to represent all visible hues, but I don't know how to combine colours - it's not a linear relationship; for example, 50% 500 nm light combined with 50% 600 nm light does not produce 550 nm light (I'd guess the correct result would be closer to 570 nm).
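Here's a small sketch of that hue encoding, decoding a 16-bit hue code into either a wavelength (for codes up to 0xAAAA) or a blue/red mixing fraction (for the purples above 0xAAAA). Only the anchor values above are from the description; interpolating linearly between them is my own assumption:

Code:
#include <stdint.h>
#include <stdio.h>

struct hue_anchor { uint16_t code; double nm; };

static const struct hue_anchor anchors[] = {
    { 0x0000, 460.0 },   /* blue  */
    { 0x2AAB, 495.0 },   /* cyan  */
    { 0x5555, 520.0 },   /* green */
    { 0xAAAA, 700.0 },   /* red   */
};

/* For codes <= 0xAAAA: return the equivalent monochromatic wavelength. */
static double hue_to_wavelength(uint16_t code)
{
    int i;
    for (i = 0; i < 3; i++) {
        if (code <= anchors[i + 1].code) {
            double t = (double)(code - anchors[i].code) /
                       (double)(anchors[i + 1].code - anchors[i].code);
            return anchors[i].nm + t * (anchors[i + 1].nm - anchors[i].nm);
        }
    }
    return anchors[3].nm;
}

/* For codes > 0xAAAA: return the fraction of 460 nm (blue) light in the
 * blue/red mixture, from just above 0.0 (red) towards 1.0 (blue). */
static double hue_to_blue_fraction(uint16_t code)
{
    return (double)(code - 0xAAAA) / (double)(0x10000 - 0xAAAA);
}

int main(void)
{
    printf("0x5555 -> %.0f nm\n",   hue_to_wavelength(0x5555));    /* 520 nm */
    printf("0xD555 -> %.2f blue\n", hue_to_blue_fraction(0xD555)); /* ~0.50  */
    printf("0xEAAA -> %.2f blue\n", hue_to_blue_fraction(0xEAAA)); /* ~0.75  */
    return 0;
}

Encoding hue this way keeps every visible hue representable, but (as noted above) it doesn't make mixing any easier, since equal parts of two wavelengths don't simply average to the wavelength halfway between them.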
Does anyone have any ideas or suggestions?
Thanks,
Brendan