martin It's never a bad idea to get your eyes checked, and I'd recommend that everybody gets a check-up.
What I will say is that, after a lot of time browsing, it seems to me the latest PC display drivers are using temporal dithering, because it hides banding and can make an 8-bit display look almost as good as a 10/12-bit one. To the majority of users the effect is imperceptible; however, there are various dithering algorithms in use (Apple is the worst for me, and I suspect it's their own in-house algorithm). The very nature of temporal dithering involves flicker and introduces noise, but from the manufacturers' point of view the positives obviously outweigh the negatives (no need to put 10/12-bit panels in any new tech, $profit).
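To illustrate why the flicker is inherent and not a side effect that could simply be tuned out, here's a rough sketch (plain Python, purely illustrative, not any vendor's actual driver code) of the basic idea: a 10-bit value is approximated on an 8-bit panel by alternating between the two nearest 8-bit levels over successive frames, so the time-average matches the target.

```python
# Purely illustrative sketch of frame-by-frame temporal dithering.
# A 10-bit source value (0-1023) is shown on an 8-bit panel (0-255)
# by alternating between the two neighbouring 8-bit levels so that
# the time-average matches the 10-bit target. The alternation itself
# is the flicker I'm talking about.

def temporal_dither_frames(value_10bit, num_frames=8):
    """Return the 8-bit level to display on each of `num_frames` frames."""
    exact = value_10bit / 4.0          # ideal 8-bit level, e.g. 513/4 = 128.25
    low = int(exact)                   # lower neighbouring 8-bit level
    high = min(low + 1, 255)           # upper neighbouring 8-bit level
    fraction = exact - low             # how often the higher level is needed

    frames = []
    error = 0.0
    for _ in range(num_frames):
        error += fraction
        if error >= 0.5:               # simple error accumulation decides which
            frames.append(high)        # of the two levels this frame gets
            error -= 1.0
        else:
            frames.append(low)
    return frames

# 10-bit value 513 sits a quarter of the way between 8-bit 128 and 129,
# so two frames in eight show 129 and the rest show 128 (average 128.25):
print(temporal_dither_frames(513))     # [128, 129, 128, 128, 128, 129, 128, 128]
```

Averaged over time it looks like the in-between shade, but the panel is physically toggling that pixel between two levels every few frames, which is exactly the kind of flicker and noise I mean.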
Consumers want the nicest, most vibrant graphics, and for that reason banding is a no-no. So we have no choice but to use technology with dithering baked in (in some cases perhaps at the GPU level) if we want to avoid poor image quality.
I am interested to know what research, if any, was done into the effects of temporal dithering. We are seeing movement now with PWM-free devices and DC-dimmed mobiles, but the very real flicker caused by dithering is not being addressed. Intel have suggested that an option to select colour depth is coming, but it's doubtful this will disable dithering.
I suppose an analogy could be that everybody else is happy with stereo sound, but we need mono.