I am sensitive to dithering and flickering. For example, I physically cannot use Windows 11 or iOS 26; my head starts to hurt.
The monitor is the same, the old AW2723DF.
First, I installed the LATEST Windows 10 Pro on the new computer, and I got a headache just like with Windows 11.
After that I installed the same Windows as on the old machine: the official Windows 10 Pro build, version 19045.4780.
It got better.
The only thing that's new is the computer itself: RTX 5080, Ryzen 7 9800X3D, ASUS TUF 850 Plus WiFi, G.Skill 6000 MHz CL28.
There's a tight, aching feeling in the back of my head.
I've already tried everything…
G-Sync is turned off in the NVIDIA Control Panel; RGB, 8-bit, full range is set.
The usual programs: novideo_srgb, ColorControl.
HAGS is disabled in Windows.
The HDR is turned off.
MPO is disabled.
I turned off the resizable BAR in the BIOS.
I turned off Clock Spread Spectrum in the BIOS.
I switched to an HDMI cable, and it got a little easier.
I slightly reduced the sharpness on the monitor and the brightness in the NVIDIA panel, and it got even better.
I plugged the DP cable into the iGPU. It feels easier, but my vision still swims.
But that doesn't work for me, because on the old computer I had zero headaches.
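For anyone retracing the steps above: the HAGS and MPO switches are registry values. A minimal sketch using the standard-library `winreg` module (Windows only, needs admin rights; the key paths and values here are the commonly documented ones, so verify them against your own build before applying):

```python
# Windows-only sketch: toggle HAGS and MPO via the registry (run as admin).
import winreg

def set_dword(path, name, value):
    """Write a REG_DWORD value under HKEY_LOCAL_MACHINE."""
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, path, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

# Hardware-Accelerated GPU Scheduling: 1 = off, 2 = on.
set_dword(r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers",
          "HwSchMode", 1)
# Multi-Plane Overlay: OverlayTestMode = 5 disables MPO.
set_dword(r"SOFTWARE\Microsoft\Windows\Dwm",
          "OverlayTestMode", 5)
```

A reboot is needed for both changes to take effect; deleting the values restores the defaults.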
Update.
I solved the problem.
The problem turned out to be the 5080, so I replaced it with my old 4070 Ti and everything is fine again. No headaches or strain anymore.
P.S.
Windows 11 22H2 (Build 22621.963), which some here called safe, actually causes headaches just like the other Windows 11 versions.
P.P.S.
I don't know how to live anymore. Just a few years ago, it was enough to surround myself with secure devices. The marketing b1stards have found a way to get to me: they started adding software dithering to Windows 11, iOS 26, and the 5000 series of video cards.
Update.
As you remember, I returned the 5080 a long time ago, and today I returned the 4090 as well. But I almost managed to solve it.
I won’t describe everything. Briefly:
I bought a monitor to replace the AW2723DF — ASUS ROG Strix XG27ACG. True 8-bit. About half of the load disappeared immediately; apparently these video cards amplify the monitor’s FRC. Therefore, I recommend selling your old monitor and choosing a true 8-bit monitor without FRC. THIS IS MANDATORY. Without this, there’s no point in even talking about further attempts. You can filter for this on DisplaySpecifications.
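That filtering step can be sketched in a few lines. The entries below just mirror what this post reports about each panel; double-check the real specs on DisplaySpecifications before buying anything:

```python
# Panels mentioned in this thread; bit depth / FRC as reported above.
# Verify every entry on DisplaySpecifications yourself.
monitors = [
    {"model": "AW2723DF",               "bits": 8,  "frc": True},
    {"model": "ASUS ROG Strix XG27ACG", "bits": 8,  "frc": False},
    {"model": "ASUS ProArt PA328QV",    "bits": 10, "frc": False},
]

def true_bit_depth(specs):
    """Keep only panels that reach their bit depth without FRC."""
    return [m["model"] for m in specs if not m["frc"]]

print(true_bit_depth(monitors))
```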
I set 8-bit in the NVIDIA Control Panel. Just in case, I disabled novideo_srgb. Banding appeared — which means everything worked as intended. Dithering is disabled both on the GPU and on the monitor (because the new monitor simply doesn’t have it). Unfortunately, about 20% of the strain remains — there’s no obvious pain, but there’s a sensation like a tight band around the head, so simply sitting in front of the 4090 is uncomfortable. That’s why I returned it to the store today.
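The banding test above is easy to reproduce with a gradient image. A small sketch that writes one as a binary PPM using only the standard library: on a true 8-bit panel with GPU dithering disabled, a 0–255 ramp should show clean, static steps, while shimmering or crawling noise inside the steps suggests dithering or FRC is still active somewhere in the chain.

```python
# Generate a horizontal 0-255 grayscale ramp as a binary PPM (P6).
WIDTH, HEIGHT = 1024, 256

def gradient_ppm(width=WIDTH, height=HEIGHT):
    header = f"P6 {width} {height} 255\n".encode()
    # One row of left-to-right gray values, repeated for every scanline.
    row = bytes(
        c for x in range(width) for c in (x * 255 // (width - 1),) * 3
    )
    return header + row * height

with open("gradient.ppm", "wb") as f:
    f.write(gradient_ppm())
```

View the file at 100% zoom with color management and any "image enhancement" features off, so nothing re-dithers the ramp before it reaches the panel.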
I don’t know what’s wrong with these cards. It feels like there’s some kind of physical, non-disableable dithering. Or maybe they’re so POWERFUL that they output something strange to the monitor that I can physically feel.
The adventure continues — for tomorrow I’ve ordered a 5070 Ti GIGABYTE.
I also wrote to NVIDIA support and am trying to get a CLEAR answer from them about what exactly is going on.
Update.
Yesterday I received the GeForce RTX 5070 Ti WINDFORCE SFF 16 GB (GV-N507TWF3-16GD). I spent around 12 hours with it in total.
Huge progress. It does not give me a headache. That means: no pain in the back of the head (visual strain), no tight “band” around the head, and no temple pain either.
However, one problem unfortunately remains — my vision still “floats.” When looking at the desktop, it feels like everything is glowing: flashes, loss of focus, slight blur. But the eyes themselves or the eye muscles don’t hurt. After about an hour, I adapt by roughly 80%. Because of this, I’m keeping the card, but I’m not rushing to sell the 4070 Ti yet.
By now you already understand how sensitive I am, and the only thing that can be stated for sure is this: you need a monitor without FRC — true 8-bit or true 10-bit. Removing FRC eliminated pain almost completely on two GPUs from different generations. GPU dithering and monitor FRC do not work together properly, so apparently some kind of interpolation occurs, which is what drives us crazy.
At the same time, Nvidia’s 50-series clearly causes tension for sensitive people in any case. It’s just unclear what exactly it is — some kind of physical, non-disableable dithering?
Why the 5070 Ti is relatively okay — I have a few theories:
Because of the new monitor with true 8-bit. Possibly even the old 5080 would have worked for me if I had upgraded the monitor first. But this does not explain why the 4090 causes tension and brain fog on the same true 8-bit monitor — which leads to the second theory.
GPU power. EMI modulation, all those frequencies, watts, etc., may amplify sensitivity. That would explain why the 4090 was bad for me, while the 4070 Ti and 4060 are perfectly fine.
(Minor influence) Some dependence on the manufacturer — they use different power layouts and components. Only the GPU chip itself is the same; the rest of the hardware differs.
P.S.
I plan to get a true 10-bit monitor for work, because work requires Windows 11 due to security updates; and as you already know, I cannot tolerate Windows 11 at all, I practically pass out immediately. Microsoft officially stated: buy a true 10-bit monitor and dithering will disappear on Windows 11. Well, we'll test that. I found a relatively cheap true 10-bit option, the ASUS ProArt Display PA328QV.
Update.
There’s good news and bad news.
I also spent some time on the 5070 Ti — the RTX 50 series is completely intolerable for people who are sensitive to dithering (color flickering).
Big “BUT” — I was being seriously dumb, and for some reason no one in the thread corrected me with: “Dude, forget the GPU, just output through the iGPU.”
And that turns out to be the solution.
I didn’t use this method before because I assumed there would be significant downsides.
In reality, it’s nothing but upsides:
The GPU stays idle all the time, ~25°C, and is only active in games.
Dithering is completely gone — literally zero. Everything is great again.
I tested BF6 on ultra settings in two modes:
– DisplayPort connected directly to the GPU
– DisplayPort connected to the motherboard (iGPU output)
The performance difference is about 1–2% FPS, around 1 ms of latency, with no stutters or instability at all.
I also ran the AC: Shadows benchmark — everything was excellent there as well.
The only so-called “downside” is that you can’t enable G-Sync, but for our community this flicker feature isn’t needed anyway, so it’s hardly a downside 🙂
So the topic is basically closed for now. I’m honestly exhausted after these weeks.
Main conclusions as of today:
The monitor must be without FRC — true 8-bit or true 10-bit. This immediately cuts the load in half, even on problematic GPUs.
You can buy any graphics card, but connect the display to the iGPU. If you’re worried that the iGPU might dither (which is unlikely), go with Intel — it’s easier to fully disable dithering there using software.