KM @JTL I am clearly out of my depth here, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"...and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting 10-bit into believing they didn't need to dither to reach it. So it sounds like the apps trusted the GPU and not the display?
IF that kind of trickery would even work universally, then rather than having to make hundreds or more individual panels report a false capability back to whatever is requesting that quality, my thought (again based on very shallow ideas and no programming ability, so it might be nonsense) was this: could we come at it from the other direction and extrapolate what his GPU setting was doing (fooling the requesting apps so they don't bother to dither, since they don't think they need to in order to reach 10 bits) into a higher-level, global behavior? Whatever it is in the OS that says "show the desktop and file system and everything else at this quality, and dither to get that quality if you can't natively do it" (just like the apps in his case, and it has to be happening somewhere in software, since even a bog-standard distro makes 6-bit laptop panels dither to more colors, right?) could be targeted there, so we get a system-wide change rather than going app by app.
Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that decides what quality to display the graphical DE in and triggers dithering to get there? Is it the GPU driver? Xorg? As far as I know laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?
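For what it's worth, my (possibly wrong) understanding is that the panel at least advertises its native bit depth to the driver through its EDID, and the driver/Xorg decide what to do from there. Here is a small sketch I can't really vouch for myself, just pieced together from reading: it assumes a Linux system that exposes EDID files under /sys/class/drm, and it prints what each connected panel claims about its own bit depth:

```python
#!/usr/bin/env python3
# Sketch: read each connected output's EDID from sysfs and print the
# bit depth the panel itself advertises (EDID 1.4, byte 20, bits 6:4).
# Paths assume a typical Linux DRM setup; older EDID versions may encode
# this differently.
import glob

DEPTHS = {0b001: "6-bit", 0b010: "8-bit", 0b011: "10-bit",
          0b100: "12-bit", 0b101: "14-bit", 0b110: "16-bit"}

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128:
        continue  # connector exists but nothing is plugged in
    video_input = edid[20]
    if video_input & 0x80:  # digital input
        depth = DEPTHS.get((video_input >> 4) & 0x07, "undefined")
    else:
        depth = "analog (no declared depth)"
    print(f"{path}: panel advertises {depth}")
```

If something like that says 6-bit for the laptop panel but the desktop still shows smooth gradients, then some layer in the driver/compositor stack must be doing the dithering, and that layer is exactly what I'm asking about.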