Looking at the Intel support forums (Windows), there are lots of reports of incorrect color depth being detected for displays, and users complaining that they can't select 10/12-bit color. There are also a few requests to disable dithering.
From one of the official support replies:
Our driver supports color depths of 8-bit or 12-bit via HDMI*.
Currently, if a 10-bit display is used, the driver will default to 8-bit with dithering, or to 12-bit if supported.
Please refer to the Deep Color Support of Intel Graphics white paper (page 11).
There is already a request to allow users to manually select the desired color depth via IGCC (Intel® Graphics Command Center), but this is a work in progress with no ETA; however, it is on our top-priority list.
The above doesn't impact the encoding/decoding capabilities of the graphics controller at all. HEVC 10-bit video encoding/decoding via hardware is supported by the graphics controller.
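If I'm reading that right, the fallback behaviour amounts to something like the sketch below (my own pseudo-logic in Python with made-up names, not Intel's actual driver code):

```python
# Hypothetical illustration of the fallback described above -- not Intel driver code.
def pick_output_depth(panel_bpc, sink_supports_12bit):
    """Return (output bpc, dithering enabled) for a panel advertising panel_bpc."""
    if panel_bpc >= 10 and sink_supports_12bit:
        return 12, False   # deep-color path: 12-bit signal, no dithering needed
    # Everything else gets an 8-bit signal with dithering enabled --
    # which is exactly the behaviour people are complaining about.
    return 8, True

print(pick_output_depth(10, sink_supports_12bit=False))  # -> (8, True)
print(pick_output_depth(10, sink_supports_12bit=True))   # -> (12, False)
```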
There was another post I saw, and I forget the link, but the support answer was that the driver automatically enables dithering for 8-bit and above and disables it for 6-bit, which seems backwards, as another poster pointed out. Is there a way to spoof or rewrite an EDID so the display is detected as 6-bit?
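For what it's worth, rewriting the EDID is at least mechanically possible: in an EDID 1.4 base block, byte 20 carries the panel's advertised bit depth in bits 6:4, so setting that field to 6 bpc and recomputing the checksum produces a block that should be read as a 6-bit panel. A rough Python sketch (my own, untested; whether the Intel driver then actually changes its dithering decision is exactly the open question):

```python
# Rough sketch: patch a 128-byte EDID 1.4 base block to advertise 6 bpc.
# Assumes a digital display and an EDID 1.4 block.

def patch_edid_to_6bpc(edid: bytes) -> bytes:
    block = bytearray(edid[:128])
    assert block[:8] == bytes.fromhex("00FFFFFFFFFFFF00"), "not an EDID base block"
    assert block[18] == 1 and block[19] == 4, "bit-depth field only exists in EDID 1.4"
    assert block[20] & 0x80, "analog input -- no bit-depth field to patch"
    block[20] = (block[20] & 0x8F) | (0b001 << 4)   # bits 6:4 = 001 -> 6 bits per colour
    block[127] = (-sum(block[:127])) & 0xFF         # recompute the checksum byte
    return bytes(block)

with open("edid.bin", "rb") as f:          # dump from the monitor first
    patched = patch_edid_to_6bpc(f.read())
with open("edid_6bpc.bin", "wb") as f:
    f.write(patched)
```

On Windows the patched block would normally be applied as an EDID override (e.g. with a tool like Custom Resolution Utility) rather than flashed to the monitor.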
Intel Deep Color White Paper - https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/deep-color-support-graphics-paper.pdf
From the white paper:
Most traditional laptop panels were internally 6 bpc (= 18bpp panels). This naturally means even a normal 24bpp image can show banding when viewed on an 18bpp panel. To avoid this, a process called "dithering" is applied, which is almost like introducing a small amount of noise to adjoining pixels. This creates variations in the 18-bit representation and hides color banding on such panels. Either the GPU's display HW or the panel itself might do this. When the panel does it, the source (the GPU display HW) is not aware of this, and the panel will advertise itself as a normal 8 bpc (24-bit) panel.
So I gather from this that the reason dithering is always enabled is as a failsafe against a poor 6-bit+FRC implementation, which in a way makes sense: there could be very cheap monitors out there that are advertised as 8-bit but are actually 6-bit+FRC (as we know), so dithering at the GPU side ensures a consistent 8-bit output across all monitors.
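The "small noise" trick from the white paper is easy to demonstrate: quantizing an 8-bit gradient straight down to 6 bpc leaves visible steps, while adding an ordered-dither (Bayer) offset before quantizing trades those steps for fine noise. This is my own toy illustration, not what the display hardware literally does:

```python
import numpy as np

# 8-bit horizontal gradient: 16 rows, 256 levels
grad = np.tile(np.arange(256, dtype=np.float32), (16, 1))

# Naive truncation to 6 bpc: 256 levels collapse into 64 hard steps -> visible banding
banded = (grad.astype(np.uint8) >> 2) << 2

# Ordered (Bayer 4x4) dithering: add a tiny position-dependent offset before
# quantizing so neighbouring pixels round in different directions
bayer = np.array([[ 0,  8,  2, 10],
                  [12,  4, 14,  6],
                  [ 3, 11,  1,  9],
                  [15,  7, 13,  5]], dtype=np.float32) / 16.0
threshold = np.tile(bayer, (grad.shape[0] // 4, grad.shape[1] // 4))
dithered = np.clip(grad / 4.0 + threshold, 0, 63).astype(np.uint8) << 2

print("unique levels, banded:  ", len(np.unique(banded)))    # 64 hard steps
print("unique levels, dithered:", len(np.unique(dithered)))  # same 64 values, but
# spatially mixed so the local average approximates the original 8-bit value
```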
The white paper also notes:
Windows* OS doesn't support more than 8 bpc for the desktop. This means that even if an application has more-than-8 bpc content, it will all be compressed to 8 bpc during the desktop window composition process, as shown in the figure in the white paper.
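To put numbers on that: a 10 bpc application buffer has 1024 levels per channel, but after 8 bpc composition only 256 survive, so roughly four source levels collapse into each output step (toy sketch, not the actual DWM path):

```python
import numpy as np

# A smooth 10-bit ramp (1024 levels) as an application might render it
ramp_10bit = np.arange(1024, dtype=np.uint16)

# What survives 8 bpc desktop composition: rescale 0..1023 down to 0..255
ramp_8bit = np.round(ramp_10bit * 255 / 1023).astype(np.uint8)

print(len(np.unique(ramp_10bit)))   # 1024 distinct levels in the source
print(len(np.unique(ramp_8bit)))    # 256 after composition: ~4 source levels per output step
```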
So perhaps all these applications that are starting to cause strain are designed with HDR in mind, but they are being dithered down to 8 bpc?
In many respects I can understand why dithering is enabled: it's much easier to force all displays to 8-bit than to risk an incorrectly detected monitor/color combination. However, I think advanced controls should still be available to the consumer, because dithering on a true 8-bit monitor adds extra noise where none is needed, which means I'm not getting what I pay for. Eye strain or not, dithering isn't needed if the monitor is detected correctly and the correct color range is selected.