I came across a thread I'd seen before, but this time I noticed the comment about color profiles and dithering. See post #5 from Yasamoka here: https://forums.guru3d.com/threads/how-to-disable-dithering-in-linux.387362/
"AMD GPU dithering is supposed to be happening only when you are using color profiles. In that case, it would use a 10-bit LUT and output an 8-bit signal with dithering enabled to eliminate gradient banding. Otherwise, there should be no dithering if you do not have a color profile loaded.
If you're using a color profile, you could stop dithering from being used by truncating color profile gamma curve output values from ~16-bit to 8-bit. I have code for this so if you're using a color profile, send it to me via PM and I'll convert it to 8-bit."
Does anyone know how to use this knowledge to test Intel graphics on Linux? Is there a way to disable color profiles entirely to see if it makes a difference? I know Redshift messes with profiles and gamma too.
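From what I've gathered so far, on X11 you might be able to rule profiles out like this (this is just my sketch, assuming xrandr and xcalib are installed; eDP-1 is my guess at a laptop panel's output name, check what `xrandr` actually lists on your system):

```shell
# Reset the gamma ramp to identity on the panel output
# (replace eDP-1 with whatever `xrandr` reports for your display)
xrandr --output eDP-1 --gamma 1:1:1

# Clear any vcgt/LUT calibration that a loaded ICC profile wrote
# into the video card's gamma ramp
xcalib -c
```

If dithering really only kicks in when a profile's LUT is loaded, the banding should come back after running these.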
How do I make sure no profile is loaded in Linux so I can test this theory? Laptop panels are all 6-bit as far as panelook.com says, so it seems I'd need 6-bit in and 6-bit out, if that's even possible. His comment says you do that by "truncating the gamma curve." If it works and I see banding, then as far as I understand dithering is off, and I could then see if it helps my eyes. I recall one person here saying calibrated color profiles HELPED them, but they made zero difference for me, so once again the info is muddy. I'd still like to try this if I can figure out how. All I can find are unanswered dead ends like this https://forums.geforce.com/default/topic/980918/how-to-disable-dithering-for-6-bit-lcd-panel-/ or the usual arguing to get glasses, use f.lux, etc. that nearly every eyestrain thread devolves into, with people trying to prove how "right" they are instead of giving useful info.
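For what it's worth, here's my understanding of the truncation Yasamoka describes, as a rough Python sketch (the function name and the 16-bit ramp representation are my assumptions, not his actual code): each ~16-bit gamma ramp entry gets quantized down to one of 256 levels, so every value in the ramp is exactly hittable by an 8-bit output signal and the driver has nothing left to dither.

```python
def truncate_lut_16_to_8(ramp):
    """Quantize 16-bit gamma ramp entries (0-65535) to 8-bit precision.

    Keeps only the high byte of each entry and re-expands it to 16 bits
    (multiplying by 257 maps 0 -> 0 and 255 -> 65535), so each value
    lands exactly on an 8-bit output level.
    """
    return [(v >> 8) * 257 for v in ramp]

# Values between two 8-bit steps collapse onto the lower step:
print(truncate_lut_16_to_8([0, 300, 32896, 65535]))
# -> [0, 257, 32896, 65535]
```

Presumably the 6-bit case would be the same idea with `>> 10` instead, but I have no idea whether the driver exposes that.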