Well, it turns off dithering for applications that want to work in 10-bit color depth.
In other words: currently I run Windows 7 with an ATI FireGL card (9 years old) that does not have dithering enabled for most apps. But when I use apps that request 10-bit color depth, like Capture One Pro (a photography app) or Opera, the card starts to dither heavily in order to approximate the 10-bit color that the 8-bit display cannot show natively.
Enabling 10-bit color support tells the driver that I use a 10-bit display and turns off dithering. Interestingly, this also works without an actual 10-bit screen.
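For anyone wondering what the dithering actually does here, a loose sketch of the general idea (not actual driver code, the function names and the use of random noise are just for illustration; real drivers use spatial/temporal patterns):

```python
# A 10-bit channel value is 0..1023, an 8-bit channel value is 0..255.
import random

def truncate_to_8bit(value_10bit: int) -> int:
    """Roughly what 'dithering off' means: drop the two extra bits."""
    return value_10bit >> 2

def dither_to_8bit(value_10bit: int) -> int:
    """Roughly what dithering does: the two dropped bits decide how often
    the pixel is rounded up, so the average over time/space approximates
    the 10-bit value, at the cost of the pixel value changing."""
    base = value_10bit >> 2          # 8-bit base value
    remainder = value_10bit & 0b11   # the 2 bits that do not fit (0..3)
    return min(255, base + (1 if random.random() < remainder / 4 else 0))

# Example: a 10-bit value that falls between two 8-bit steps.
v = 514                      # sits between 8-bit levels 128 and 129
print(truncate_to_8bit(v))   # always 128 -> stable pixel
samples = [dither_to_8bit(v) for _ in range(10000)]
print(sum(samples) / len(samples))  # ~128.5 -> pixel alternates between 128 and 129
```

That alternation between neighboring values is exactly the flicker that some of us react to.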
So I am currently very happy that I can now use some of the apps that I had not been able to use before because of said dithering in 10-bit mode.
I assume this is the same effect that others experience playing video games. Some games request a higher color depth and the video card resorts to dithering.
Now that the industry's next big thing is HDR and high contrast with flashy, vivid (and, may I say, mostly unnatural) colors, I think it makes sense that manufacturers use methods like dithering to stay competitive.
I hope that the industry either catches up with affordable 10-bit displays that do not dither, or that our voices get heard and devices like the E-ink laptop Sony is supposedly working on get bigger R&D budgets and are released sooner.
Sometimes I cannot believe that we are such a minority that cannot use new tech. I would assume that more people are already affected and just do not realize yet that this is the cause of their stress, headaches, eye strain, or bad sleep.
Also, does no employee at Apple, Microsoft, Sony, etc. have these issues with their own tech? Or are they so blinded by the next brighter, more colorful display meant to wow the world and stay ahead of the competition that topics like ergonomics and health get forgotten?