If the EDID were respected and reliable, the FireGL/FirePro cards would not need an option to enable 10-bit color mode; the card would simply rely on the info it got from the screen, right?
I think enabling this "10-bit color" mode just tells the graphics card what to do when 10-bit color is requested: should it dither or not?
So I am not sure whether fooling the system into believing it has a 10-bit display attached solves the dithering issue.
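Just to verify what the panel actually advertises, here is a minimal sketch that reads the EDID from sysfs and decodes the declared bits per color. The connector path is only an example, and the byte layout assumes an EDID 1.4 digital display:

```c
/* Minimal sketch: decode the panel's declared bits per color from its EDID.
 * Assumes an EDID 1.4 digital display; the connector path below is only an
 * example -- check /sys/class/drm/ for the one matching your monitor. */
#include <stdio.h>

int main(void)
{
    const char *path = "/sys/class/drm/card0-DP-1/edid";  /* example path */
    unsigned char edid[128];
    FILE *f = fopen(path, "rb");

    if (!f) {
        perror(path);
        return 1;
    }
    if (fread(edid, 1, sizeof(edid), f) != sizeof(edid)) {
        fprintf(stderr, "short read from %s\n", path);
        fclose(f);
        return 1;
    }
    fclose(f);

    /* Byte 18/19: EDID version/revision. Byte 20: video input definition,
     * bit 7 = digital input, bits 6..4 = bits per primary color (EDID 1.4). */
    if (edid[18] == 1 && edid[19] >= 4 && (edid[20] & 0x80)) {
        static const char *bpc[] = { "undefined", "6", "8", "10",
                                     "12", "14", "16", "reserved" };
        printf("panel declares %s bits per color\n", bpc[(edid[20] >> 4) & 0x07]);
    } else {
        printf("not an EDID 1.4 digital display; bit depth not encoded here\n");
    }
    return 0;
}
```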
As a software engineer, I think the better approach would be to pin down when the problem first appeared.
My idea would be the following:
- get a setup that does not dither on an old Linux box
- update the graphics driver/kernel step by step until dithering starts.
- check the graphics driver source for DITHERING constants or other code changes that might have affected this (see the sketch below this list).
- revert the code change, compile the driver/kernel module and see if it helps.
I think this would help us see what is really going on. Again, I believe the issue is color depth (temporal dithering).
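To give an idea of what to grep for, here is a purely illustrative sketch of the kind of logic I would expect in the driver. The register bits and the helper are invented for the example; as far as I remember, the real code in the radeon/amdgpu display source is similar in spirit (program the output "FMT" bit depth, then either truncate or dither). The interesting part is the decision between truncating and dithering when the framebuffer depth exceeds what the panel reports:

```c
/* Purely illustrative sketch -- NOT actual driver code. Register bits and
 * the helper below are invented for the example; the real driver logic is
 * similar in spirit (program the output bit depth, then truncate or dither). */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define FAKE_FMT_TRUNCATE_EN         (1u << 0)   /* just drop the extra bits */
#define FAKE_FMT_SPATIAL_DITHER_EN   (1u << 8)   /* static noise pattern */
#define FAKE_FMT_TEMPORAL_DITHER_EN  (1u << 16)  /* flicker between levels */

/* Hypothetical helper: decide how to map the framebuffer depth onto the
 * bits-per-color the connector/EDID reported. */
static uint32_t program_output_depth(int fb_bpc, int panel_bpc, bool allow_dither)
{
    uint32_t reg = 0;

    if (fb_bpc <= panel_bpc)
        return reg;                               /* nothing to reduce */

    if (allow_dither)                             /* <-- the decision we care about */
        reg |= FAKE_FMT_SPATIAL_DITHER_EN | FAKE_FMT_TEMPORAL_DITHER_EN;
    else
        reg |= FAKE_FMT_TRUNCATE_EN;              /* plain truncation instead */

    return reg;                                   /* would go into a hw register */
}

int main(void)
{
    /* 8-bit framebuffer on a 6-bit panel, dithering allowed -> dither bits set. */
    printf("reg = 0x%08x\n", program_output_depth(8, 6, true));
    return 0;
}
```

If we can find the real equivalent of that allow_dither decision in the driver and force it off, that would be the revert candidate for the bisect steps above.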
We could do the same for Chrome. As people have pointed out, old versions of Chrome were working well. Maybe we can find old versions of Chromium (ideally portable builds) somewhere that we can test to see which version introduced the different rendering method that causes strain.
As Chromium is also open source, we could check for code changes here as well. I would assume the diffs could be massive, but it is worth a try. In the case of Chrome I think we are talking about an updated version of HarfBuzz for font rendering ( https://www.freedesktop.org/wiki/Software/HarfBuzz/ ).
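If we suspect HarfBuzz, a quick first check is which version the system library provides; hb_version_string() is part of the public HarfBuzz API. Keep in mind that Chrome/Chromium normally bundles its own HarfBuzz copy, so this only tells us about the system library, but it is useful when comparing old and new Linux setups:

```c
/* Print the HarfBuzz version of the system library.
 * Build e.g. with: cc hbver.c $(pkg-config --cflags --libs harfbuzz) -o hbver
 * Note: Chrome/Chromium usually ships its own bundled HarfBuzz, so this
 * reflects the system library only. */
#include <stdio.h>
#include <hb.h>

int main(void)
{
    printf("system HarfBuzz version: %s\n", hb_version_string());
    return 0;
}
```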