hpst @JTL I am clearly out of my depth here, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting 10-bit into believing they didn't need to dither to get it. So it sounds like the apps trusted the GPU and not the display?
I don't have the same software and hardware in front of me to test, so I don't know what his setup is doing.
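To make the "apps trusted the GPU" part concrete: on X11 a client can only ask the server/driver what depth and bits per channel its visual advertises; the panel's physical bit depth never appears in that answer. A minimal sketch, assuming an X11 session with libX11 available, of how a client would query that advertised depth:

```c
/* Minimal sketch (assumes X11 and libX11): ask the server what depth and
 * bits-per-channel its default visual advertises. A client only sees what
 * the driver reports here, never the panel's physical bit depth.
 * Build: cc query_depth.c -o query_depth -lX11
 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int screen = DefaultScreen(dpy);
    XVisualInfo tmpl;
    tmpl.visualid = XVisualIDFromVisual(DefaultVisual(dpy, screen));

    int n = 0;
    XVisualInfo *info = XGetVisualInfo(dpy, VisualIDMask, &tmpl, &n);
    if (info && n > 0) {
        /* depth = total bits of the visual (24, 30, ...);
         * bits_per_rgb = significant bits per channel the server claims */
        printf("default visual: depth %d, bits_per_rgb %d\n",
               info[0].depth, info[0].bits_per_rgb);
        XFree(info);
    }

    XCloseDisplay(dpy);
    return 0;
}
```

If the driver advertises a 30-bit visual (10 bits per channel), an application has no way to tell from this query whether the panel can actually show it, which would explain apps deciding they don't need to dither.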
hpst Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that decides what bit depth the graphical DE is displayed at and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know, laptop panels are "dumb" and aren't making that choice internally the way some desktop monitors can, right?
No. @deepflame's experiment is the first I've heard of this, but I'd need the same hardware in front of me to understand what it's doing and whether I can port it to something more universal.
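On the question of who triggers dithering: under X11 it is typically the GPU driver that dithers at scanout, and some drivers (nouveau, for instance) expose that choice as RandR output properties, while others keep it internal. A hedged sketch, assuming libXrandr is available, that just lists whatever properties each driver publishes so you can see whether a dithering knob exists at all; the property names mentioned in the comments are examples and are not guaranteed to be present:

```c
/* Hedged sketch (assumes X11 with libXrandr): list the RandR output
 * properties the driver publishes for each output. Drivers that expose a
 * userspace dithering control (nouveau, for example) tend to show
 * properties such as "dithering mode" / "dithering depth"; other drivers
 * may expose nothing at all.
 * Build: cc list_props.c -o list_props -lX11 -lXrandr
 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    for (int i = 0; res && i < res->noutput; i++) {
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (!out)
            continue;
        printf("%s:\n", out->name);

        int nprop = 0;
        Atom *props = XRRListOutputProperties(dpy, res->outputs[i], &nprop);
        for (int j = 0; j < nprop; j++) {
            char *name = XGetAtomName(dpy, props[j]);
            printf("  %s\n", name);  /* look for dithering-related names */
            XFree(name);
        }
        if (props)
            XFree(props);
        XRRFreeOutputInfo(out);
    }
    if (res)
        XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}
```

If nothing dithering-related shows up for your output, the control likely lives in the driver's own tooling or isn't exposed at all, which matches how panel-dependent this whole thing seems to be.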