Assumption: on a 60 Hz display in 60 Hz mode, the dithering flicker has to be between 0 and 60 Hz.
I measured the low-frequency spectrum with my oscilloscope, averaged over several minutes, under both Windows 10 2015 LTSB and Manjaro Mate 17.1.11 (Linux). This is what I found:
Windows:
Manjaro Linux:
(You can right-click the pictures and open them in new tabs to view them at full size.)
The spike at 60 Hz is probably the screen's refresh rate, which surprisingly has an impact large enough for the oscilloscope setup to pick up. But what about those large bumps below 60 Hz: could they be the graphics card's temporal dithering, or even the monitor's FRC (not that I knew it had FRC)? The graphics card is an NVIDIA Quadro NVS 295. On Linux I use the open-source nouveau driver and made sure to turn off dithering via xrandr; `xrandr --prop` clearly reports that dithering is set to "off". The Windows NVIDIA drivers are generally said not to use any dithering by default, and they offer no options to control it. Someone claimed they enable it automatically when gamma or brightness is changed via the driver settings, but I did not change either.
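For anyone who wants to repeat the check: this is roughly how dithering is inspected and disabled with nouveau via xrandr. The output name `DVI-I-1` is a placeholder (take yours from `xrandr --prop`), and the exact property names/values can vary between driver versions.

```shell
# List all output properties and look for the dithering entries
# (nouveau typically exposes "dithering mode" and "dithering depth")
xrandr --prop | grep -i -A1 dithering

# Force dithering off on a specific output (replace DVI-I-1 with your output)
xrandr --output DVI-I-1 --set "dithering mode" "off"
```

Re-running `xrandr --prop` afterwards should show the property set to "off", as described above.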
The spikes at 14.26 Hz and 28.56 Hz are mysterious, too. Why would there be any flicker at such low frequencies? (28.56 Hz is roughly twice 14.26 Hz, so the second spike may simply be a harmonic of the first.)
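For reference, this is a minimal sketch of the kind of spectrum estimate behind plots like these, assuming the oscilloscope trace is available as a sampled array. The sample rate, record length, and amplitudes are made up, and a synthetic signal stands in for the real capture, with components placed at the frequencies discussed above.

```python
# Sketch: identify spectral peaks in a (here synthetic) flicker recording.
import numpy as np
from scipy.signal import welch, find_peaks

fs = 1000.0                          # assumed sample rate of the capture (Hz)
t = np.arange(0, 120.0, 1.0 / fs)    # two minutes of data
rng = np.random.default_rng(0)
samples = (0.02 * np.sin(2 * np.pi * 14.26 * t)    # mystery spike
           + 0.01 * np.sin(2 * np.pi * 28.56 * t)  # its apparent harmonic
           + 0.05 * np.sin(2 * np.pi * 60.0 * t)   # refresh-rate spike
           + 0.005 * rng.normal(size=t.size))      # sensor noise

# Welch's method averages periodograms over many overlapping segments --
# effectively the "averaged over several minutes" described above.
freqs, psd = welch(samples, fs=fs, nperseg=8192)

# Keep only prominent local maxima up to the 60 Hz refresh rate:
# anything at least 1/100th of the strongest peak.
idx, _ = find_peaks(psd, height=psd.max() / 100)
detected = [round(f, 2) for f in freqs[idx] if f <= 60.5]
print(detected)
```

With 8192-point segments at 1 kHz the bin spacing is about 0.12 Hz, so the printed peaks land within one bin of the injected 14.26, 28.56, and 60 Hz components.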
Any ideas what we are seeing here? Thoughts?