I just found this thread about oscilloscope + photodiode measurements, which has info about detecting temporal dithering:
In posts #22 and #25 he talks about dithering and says that by analyzing the frequency spectrum one could see whether the monitor panel's temporal dithering is active. He says it appears as a spike at 15 Hz. I don't really understand this yet, as I have yet to learn the very basics of frequency spectra and spectrum analysis, but if it's true, it might be an easy and perhaps reliable test for temporal dithering that we could add to our oscilloscope setup.
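For anyone who wants to try this on a scope capture: the idea would be to export the photodiode samples and look at the magnitude spectrum via an FFT, then check whether there's a spike near 15 Hz that stands out from the rest of the spectrum. Here's a minimal sketch in Python/NumPy. Note this is my guess at how the test would work, not something from the thread: the sample rate, capture length, and the simulated signal (a 60 Hz component plus a weak 15 Hz component standing in for FRC) are all made-up stand-ins for a real photodiode capture.

```python
import numpy as np

# Assumed capture parameters -- replace with your scope's actual export.
fs = 1000.0                      # sample rate in Hz (assumption)
duration = 2.0                   # seconds of capture (assumption)
t = np.arange(0, duration, 1 / fs)

# Simulated photodiode signal: a strong 60 Hz refresh component plus a
# weak 15 Hz component standing in for FRC dithering, plus some noise.
# With a real capture you'd load the exported samples here instead.
rng = np.random.default_rng(0)
signal = (1.0 * np.sin(2 * np.pi * 60 * t)
          + 0.2 * np.sin(2 * np.pi * 15 * t)
          + 0.05 * rng.standard_normal(len(t)))

# Magnitude spectrum; rfft returns only the non-negative frequencies.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

def peak_near(freq_hz, tol=1.0):
    """Largest spectral magnitude within +/- tol Hz of freq_hz."""
    band = (freqs > freq_hz - tol) & (freqs < freq_hz + tol)
    return spectrum[band].max()

noise_floor = np.median(spectrum)
print("15 Hz peak:", peak_near(15), "noise floor:", noise_floor)
print("60 Hz peak:", peak_near(60))
```

If the 15 Hz peak is many times the surrounding noise floor, that would (per the thread's claim) suggest temporal dithering is active. With a real capture you'd also want a capture long enough for decent frequency resolution (resolution is roughly 1/duration, so 2 s gives 0.5 Hz bins).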
Maybe this is clearer to some of you than it is to me? This is stuff that wasn't taught at school. I also don't understand how a refresh rate of 60 Hz would lead to only 30 Hz polarity inversion (shouldn't it be 60 Hz, too?). I have a feeling that understanding all this is vital for understanding the possible eye strain effects of temporal dithering.
It could also be that only FRC (the display hardware's own, internal dithering) appears as 15 Hz flicker, while external temporal dithering (graphics hardware, graphics driver) would be sent inside the normal 60 Hz video signal, thus becoming 30 Hz flicker(?). That would further mean the two could interfere with each other...