Comment 9 for bug 180693

Troy James Sobotka (troy-sobotka) wrote :

There are a few problems here of course, but the bulk aren't entirely 24-bit's fault.

Hardware aside (TN LCD panels, for example), I believe that way, way back, computer graphics at SGI settled on 24-bit as 'true colour' based on median estimates that humans can distinguish roughly 17.3 million colours. Some estimates run as low as 10 million ( http://www.amazon.com/Business-Science-Industry-Applied-Optics/dp/0471452122 ).

The bigger problem is that those tones show up right next to each other, and the human eye 'adapts' to spot the differences. This means some people can differentiate adjacent tones even when a channel offers a little over 1000 values. The matter only gets more complicated when we consider that the human eye doesn't respond to colour linearly, being more sensitive to the red end of the spectrum than to the blue. This makes it extremely difficult to pin down a concrete 'how many colours' figure.

Ultimately, whatever hack / panacea we land on will require some degree of dithering, regardless of bit depth.
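To make the dithering point concrete, here's a minimal sketch in Python (the function names and the noise-based approach are mine, not from any particular driver or compositor): quantizing a smooth ramp to a few levels produces hard bands, while adding sub-step noise before quantizing trades those bands for fine grain that averages out to the true value.

```python
import random

def quantize(v, levels):
    """Map an 8-bit value (0-255) onto `levels` evenly spaced output values."""
    step = 255 / (levels - 1)
    return round(round(v / step) * step)

def quantize_dithered(v, levels):
    """Add +/- half-step noise before quantizing, so that over an area of
    pixels the *average* output approximates the true value instead of
    every pixel snapping to the same band."""
    step = 255 / (levels - 1)
    noisy = v + random.uniform(-step / 2, step / 2)
    return quantize(min(255, max(0, noisy)), levels)

# A smooth ramp quantized to 8 levels collapses into hard bands...
ramp = list(range(0, 256, 8))
banded = [quantize(v, 8) for v in ramp]
# ...while the dithered version varies pixel to pixel, hiding the band edges.
dithered = [quantize_dithered(v, 8) for v in ramp]
```

Real implementations use ordered (Bayer) or error-diffusion dithering rather than plain random noise, but the principle is the same: spend spatial resolution to buy back tonal resolution, which is why it helps at any bit depth.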