I've been looking into why I can tolerate temporal dithering on monitors, but not GPUs. Using samples taken with my capture card, I have analysed how many frames it takes for a pixel to change.
As with all my capture samples, I captured the Windows desktop only.
To do this, I decode the first frame of my captured footage and turn it into raw RGB values. I then decode the next frame and compare it with the first, then the next, and so on. I record how many frames it takes for each pixel of the first frame to change. Let's call this Time Till Change (TTC).
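For anyone who wants to reproduce this, here's a minimal sketch of the TTC calculation in Python, assuming the capture has already been decoded into raw RGB frames (how you decode them - ffmpeg, OpenCV, whatever - is up to you; this isn't my exact pipeline):

```python
import numpy as np

def time_till_change(frames):
    """frames: iterable of HxWx3 uint8 RGB arrays; the first frame is the reference.
    Returns an HxW array of TTCs; -1 means the pixel never changed."""
    it = iter(frames)
    reference = next(it)
    ttc = np.full(reference.shape[:2], -1, dtype=np.int32)
    for n, frame in enumerate(it, start=1):
        changed = np.any(frame != reference, axis=2)  # any RGB channel differs
        ttc[(ttc == -1) & changed] = n                # record the first change only
    return ttc
```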
The image below shows a small selection of the TTCs. It does not look very random, particularly that big blob of 1s and 2s on the bottom half.
I found that dithering seems to happen in blobs on the screen. Some blobs didn't dither at all over 90 frames. A TTC of -1 means the pixel's colour value never changed.
Below is a frequency histogram of the TTCs obtained for the above. If the TTCs were truly random there would be a smooth curve. The peaks and troughs show that, even in aggregate, the dithering method is not very random.
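The histogram itself is straightforward to build from the TTC array - a sketch using numpy and matplotlib (my actual plotting setup differs):

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_ttc_histogram(ttc):
    values = ttc[ttc >= 0]                 # drop the -1 "never changed" pixels
    counts = np.bincount(values)           # frequency of each TTC value
    plt.bar(np.arange(len(counts)), counts)
    plt.xlabel("TTC (frames)")
    plt.ylabel("pixel count")
    plt.show()
```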
The temporal dithering produced by GPUs is not particularly random; there are observable patterns. I have no way of testing how the dithering of a monitor works, but for now I presume monitors do a better job of producing a random dithering field and that is why they affect me less.
I repeated this test on both an RX570 and a GTX660 and saw similar patterns.
[Edited as I found a mistake in the image; I will probably find a few more if I look long enough]
Update 1
When doing the original testing I noticed a lot of the TTCs seemed to correlate with particular colours. To explore whether this was the case, I captured the grey colour palette below using a GTX660. The image was displayed using Google Chrome.
The resulting TTCs confirmed my suspicion that the temporal dithering frequency does depend on the colours being displayed. The taller the spike, the larger the TTC, i.e. the more frames it took for the pixel to change colour (dither). The results are fascinating. Some of the colours, including the background colour, did not dither at all (tested over 300 frames). Other colours had quite long TTCs, some quite short. The Z part of the labels on the graph shows the TTC for that pixel. In simple terms: some colours aren't being temporally dithered, some are dithering quickly and consistently, and others are dithering over a wide range of frames.
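Reading the per-colour TTCs is just a matter of sampling the TTC array (from the sketch above) at one pixel per swatch - the coordinates here are placeholders, since the real ones depend on where each colour lands in the captured frame:

```python
# One sample pixel per palette swatch; (y, x) values are illustrative only.
swatch_centres = {"background": (50, 50), "grey_1": (200, 300), "grey_2": (200, 500)}
for name, (y, x) in swatch_centres.items():
    print(f"{name}: TTC = {ttc[y, x]}")  # -1 => never dithered during the capture
```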
Next I looked at spatial dithering - where the colour values do not change over time, but neighbouring pixels are still dithered to different values. The graph below shows the Red sub-pixel values for the colour palette. Z-labels show the actual value for the pixel selected. It shows that the colours which were not temporally dithered are not spatially dithered either.
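Checking for spatial dithering amounts to asking whether a nominally flat swatch contains more than one sub-pixel value - a rough sketch, again with placeholder patch coordinates:

```python
import numpy as np

def is_spatially_dithered(frame, y0, y1, x0, x1, channel=0):
    """frame: HxWx3 uint8 RGB array; (y0:y1, x0:x1) selects one flat swatch.
    channel 0 = Red sub-pixel. More than one distinct value => spatial dithering."""
    patch = frame[y0:y1, x0:x1, channel]
    return np.unique(patch).size > 1
```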
I use my GTX660 every day without any discomfort; it's the only modern external GPU I can use without issue. I don't think it's special, I think I have simply adapted to it over 7 years of use. Some applications do cause me pain when using it though, namely the updated Steam library interface and the odd website. I think what's occurring here is that the colour palettes in use are producing different dithering frequencies that I'm not used to.
This could also explain why colour settings improve pain for some people. I will explore this next.
Update 2
I changed the colour setting for the GTX660 from the default to the other options available: RGB Full, RGB Limited, and YCC444. That screen should look familiar to Windows Nvidia users.
YCC444 and RGB Limited produced no change: same TTCs, same patterns. RGB Full was a different matter entirely. All the dark colours disappeared. I viewed it on my monitor to check it wasn't the capture card messing up - still no dark colours. I think this is what's called black crush, but I'm not at all familiar with it.
Stranger still, there was no dithering on RGB Full - none at all. Obviously this requires further investigation. I only captured the grey palette with RGB Full, so I'll be trying out more normal images.
Update 3
I tried out some other cards. First the GTX550: it used temporal dithering on both Full and Limited RGB, but neither used a random algorithm - TTCs were all between 1 and 3.
Next I tried an RX570. Its dithering was a bit more random than the GTX660's, but it was still affected by colour. Below are the TTCs for Limited RGB set to 8 bit, 1080i - I couldn't get the RX570 to send 1080p to the capture card, unfortunately. There were some differences when capturing 10 bit, but I think I have posted enough graphs in this thread already.
When set to YCC422 there was no dithering at all.