Yeah, I meant on Windows. The idea is simple: on a standard 60 Hz monitor, temporal dithering alternates frames at 30 Hz, which of course leads to 30 Hz flickering in parts of the screen.
I don't know exactly how dithering was invented, but I think the newer, "better" algorithms cause problems precisely because the dithering is temporal and only applied to certain parts of the display.
Long story short: if dithering is the problem, a higher refresh rate will also lead to a higher dithering frequency, always half the refresh rate of the display.
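To make the relationship concrete, here's a minimal sketch (my own illustration, not from any driver API): temporal dithering toggles a pixel between two values on alternating frames, so the perceived flicker frequency is half the panel's refresh rate.

```python
def dither_flicker_hz(refresh_hz: float) -> float:
    """Flicker frequency of 2-frame temporal dithering: half the refresh rate."""
    return refresh_hz / 2.0

# Common refresh rates and the resulting dither flicker frequency
for hz in (60, 120, 144, 240):
    print(f"{hz} Hz panel -> {dither_flicker_hz(hz):.0f} Hz dither flicker")
```

So going from 60 Hz to 120 Hz moves the flicker from 30 Hz up to 60 Hz, which is much harder for the eye to notice.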