So I just got an RTX A4000 (Samsung VRAM) the other day. This GPU is similar to the RTX 3070, as both use the GA104 chip. I knew buying a new GPU was risky, but I wanted something with more power. Unfortunately, this card causes major eyestrain even with my Dell UP2720Q true 10-bit monitor.

When I installed the card I loaded Windows 10 20H2 (I have an installation of it on my Vaio SX14 laptop with an Intel iGPU that doesn't bother me, even though that laptop's screen uses PWM). Once I got it up and running I noticed the image on the monitor had changed: it was no longer "flat". My monitor is calibrated and I installed the color profile, but that did not improve the eyestrain or the strange "rounded-off-ish" image quality the RTX A4000 was producing. So naturally I blamed Windows 10 and moved over to my Windows 7 installation, only to find the A4000 doesn't have any Windows 7 drivers. I was able to mod the Windows Server 2012 R2 Nvidia 472.17 drivers with NVCleanstall and enable test signing mode in Windows 7, and the drivers installed just fine, but to my surprise the eyestrain was still there and the image still didn't look "flat". I then applied the disable-dithering registry mod for Nvidia GPUs, which didn't help at all, though I'm not surprised, since there shouldn't be any dithering with a 10-bit setup.

I then booted into my Linux installation, which normally doesn't give me any strain, and got the exact same type of eyestrain there too. It seems that whenever this RTX A4000 is installed I get the same eyestrain everywhere, as if it's a problem with the way the card outputs the image in general, not something specific to the operating system. I gave up and reinstalled my old Quadro K4200, and everything was immediately good and strain-free again.

I'm not sure whether the problem I have with this RTX A4000 is due to dithering, unless it's trying to dither on top of a 10-bit signal, which I'm not sure is even possible. The problem simply seems to be the way the GPU is outputting the image.
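On the Linux side, one thing worth checking is whether the proprietary driver reports any dithering at all; nvidia-settings exposes per-display dithering attributes that can be queried and overridden. A rough sketch of what I mean (the attribute names and values are from the NV-CONTROL documentation as best I recall, so verify with "nvidia-settings -q all" first, and the connector name is just an example):

```python
#!/usr/bin/env python3
# Rough sketch: query (and optionally force off) dithering through the proprietary
# driver's nvidia-settings tool. Attribute names/values are from memory
# (Dithering: 0 = auto, 1 = enabled, 2 = disabled); double-check on your system.
import subprocess
import sys

DPY = "DP-0"  # example connector; list yours with "nvidia-settings -q dpys"

def query(attr):
    result = subprocess.run(
        ["nvidia-settings", "-q", f"[dpy:{DPY}]/{attr}", "-t"],
        capture_output=True, text=True)
    return result.stdout.strip()

for attr in ("Dithering", "CurrentDithering", "DitheringMode", "DitheringDepth"):
    print(attr, "=", query(attr) or "(not reported)")

if "--disable" in sys.argv:
    # 2 should map to "disabled"; assignments typically do not persist across reboots
    subprocess.run(["nvidia-settings", "-a", f"[dpy:{DPY}]/Dithering=2"])
```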

I also tested with these other monitors in 8-bit-only mode and it was the same situation:

Dell UP3017 (8bit+FRC)

Dell U2720Q (8bit+FRC)

Dell U3219Q (8bit+FRC)

Is anyone here using an RTX 2070 Founders Edition or an RTX 4000 GPU without eyestrain? Those are the next two cards I want to try, but I'm worried they will cause the same eyestrain as the RTX A4000.

    Not that I have your card specifically, but I believe the source of the problem is the same. Welcome to the club. I can't say with 100% certainty that it's temporal dithering we're dealing with here, but I believe it's some form of it, yes. I can even imagine something that keeps the card from processing the full image in each frame, which boosts performance.

    Where I'm somewhat "unique" compared to the average population, and why I believe I personally get these symptoms, is my sensitivity to motion. Back in the CRT days I suffered terribly in front of 60 Hz CRT monitors and could easily "detect" flicker up to 85 Hz; only the introduction of 100 Hz and beyond solved my issues. I could also detect flickering on plasma displays when most people I know said they couldn't see anything.

    What we're dealing with here is not the usual flickering; it's different, but it is some form of image instability. All LCDs have an unstable image to some extent by definition, but this is simply too severe, and our eyes and brains are trying to protect us…

    It's really awful that they don't disclose that information. Even a manufacturer that only claims full sRGB coverage, without advertising any wider color space, could still be using temporal dithering to reach that 100% sRGB. Many 8-bit panels won't reach 100% on their own just due to quality control, manufacturing variance, etc…

    The big question I have is how we get in contact with the right people to find out.

    If some people here are based in the US, perhaps starting a class action lawsuit against Nvidia and other companies might be the way to get their attention. They're damaging your health, after all…

    The symptoms I get are a dizzy feeling, like there is motion on the screen, and the image feels nowhere near as sharp. Besides the eyestrain symptoms, the general image quality is worse: there is a layer of blurriness/lack of sharpness on all images, videos, etc. The Quadro K4200 produces a much better image than this new A4000, which is just shocking. Even the BIOS screen has this blurry effect on the boot logo and menu. There are some other A4000 BIOS revisions that people have uploaded to TechPowerUp, and I'm tempted to try one of them to see if a BIOS update can fix it.

    I was reading that GPUs after a certain generation started to use a form of memory compression, and there are people in the comments of this video saying that when they switched graphics cards they noticed a loss of image quality. I think maybe a bad memory compression algorithm plus some form of dithering is enabled at the hardware level of the A4000 to cheat its way to easy performance. https://www.youtube.com/watch?v=R1IGWsllYEo

    I wish there were a modern GPU that didn't resort to this trickery for extra performance. I plan on buying an RTX 4000 from Amazon, which is the Turing-based version of the A4000; it's slightly weaker but should be plenty powerful. If I get eyestrain I will just return it.

      Just curious, what color profile are you using in Windows? Also, is HDR turned on in Windows? Your symptoms seem to mirror mine when I come across a troublesome display: can't focus on anything in particular on the screen, nothing seems sharp, feeling woozy, and brain overload.

      Jack The Quadro K4200 produces a much better image than this new A4000 which is just shocking. Even the BIOS screen has this blurry effect on the boot logo and menu

      Sounds very similar to alleged issues with recent Nvidia cards, so you aren't alone.

      I realize this is more hypothetical than anything, but if I had this particular bad card in my possession I could probably deduce the potential cause relatively quickly.

      Jack I wish there was a modern GPU that didnt resort to this trickery for extra performance, i plan on buying a RTX 4000 from Amazon which is the Turing based version of the A4000, its slightly weaker but should be plenty powerful enough, if i get eyestrain i will just return it.

      Can't comment on modern Nvidia GPUs, but I've recently had good fortune with a certain Radeon Pro reference card. Frustratingly, this is not a blanket endorsement of all AMD cards because similar issues can exist there as well.

      https://ledstrain.org/d/1652-any-success-stories-upgrading-to-rtx-3000-series/2

      Jack This is an excellent comment! Memory compression could be exactly what's causing this as a side effect. They constantly improve the technique; many people had trouble even with older-generation cards, and perhaps as its "efficiency" increases it gets worse, so with the latest generation it's now above the tolerance threshold of a larger number of people.

      Would you consider posting this on the Nvidia forum? We should all start posting about it and try to expose this…

        From some searching it appears the A4000 has the GA104 chip, the same as the 3060 Ti.

        I can't really offer any help, but I have raised the blurry effect of my 3060 Ti compared to my 1660 Super with Nvidia. It is extremely noticeable to me, especially on a TN screen, slightly less so on an IPS, but either way that is only one of the problems with it.

        Changing the sharpness settings in the Nvidia Control Panel helps a bit, but it's not the same clear image that the 1660 Super offers, and the difference is most noticeable in 3D applications.

        I would like to think the blurriness is part of why it gives me migraines, but who knows; Nvidia says there is no difference in output between the two cards.

          HAL9000 If I had a card "A" that was good and a card "B" that was bad in front of me, I might be able to get some empirical data on what's actually going on.

          Perhaps we're also asking Nvidia the wrong questions, which could be why they claim there is no difference. In addition, I suspect I know some ways to potentially skip the normal customer support queue (which, as I mentioned before, is "siloed" from the engineering teams).

          I posted in the DIY oscilloscope PWM thread about dithering and how I've been able to put together a cheap setup capable of recording it: an iPhone SE on a tripod with a Carson MicroFlip microscope attached (https://imgur.com/Pb27cPv). https://www.youtube.com/watch?v=vh0vRJNnHy8 (this was a P24h-10 monitor). All my monitor recordings were done on a G4560 using the HD 610 iGPU in Windows 10 with ditherig running.
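          If anyone wants to go beyond eyeballing the clips, a rough way to quantify it is to measure how much each pixel changes between consecutive frames of a recording of a static test image: a clean signal should give near-zero differences, while FRC/temporal dithering shows up as a constant low-level shimmer. A quick sketch along those lines (assumes OpenCV, a tripod-steady clip, and a placeholder filename; camera noise sets the floor, so compare a dithering-on clip against a dithering-off clip rather than trusting absolute numbers):

          ```python
          # Rough sketch: estimate temporal shimmer in a recording of a static test image.
          import cv2
          import numpy as np

          cap = cv2.VideoCapture("panel_recording.mp4")  # placeholder filename
          prev = None
          diffs = []

          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
              if prev is not None:
                  # mean absolute change per pixel between consecutive frames
                  diffs.append(float(np.mean(np.abs(gray - prev))))
              prev = gray
          cap.release()

          print("frames compared:", len(diffs))
          print("mean frame-to-frame difference:", round(float(np.mean(diffs)), 3), "gray levels")
          print("max frame-to-frame difference:", round(float(np.max(diffs)), 3), "gray levels")
          ```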

          An interesting thing I've found is that it seems you can't rely on panel bit depth specifications to know whether or not the display dithers. I recorded very clear dithering on my AOC U27V3 monitor, which uses the PANDA LM270PF1L panel (https://imgur.com/DW1FYjx). This panel should be true native 8-bit, as it's also used in the AOC U2790VQ and Philips 276E8VJSB monitors, both of which are 10-bit (8-bit+FRC).

          https://www.displayspecifications.com/en/model/f8d41b4a

          https://www.displayspecifications.com/en/model/f02b1537

          I guess this makes sense, since the monitor's control board ultimately determines whether or not dithering is used. So slapping a budget control board that reaches 8-bit color by using 6-bit+FRC onto a native 8-bit panel will just make the 8-bit panel dither. Now the real question is: do the control boards that reach 10-bit through 8-bit+FRC also use 6-bit+FRC to cover the base 8-bit color range?
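          To make that concrete, this is roughly all FRC is doing: it approximates an in-between level by alternating the two nearest native levels over time so that the average comes out right. A toy sketch of the idea (real scalers use spatio-temporal patterns rather than a simple per-pixel toggle, so this only shows the basic principle):

          ```python
          # Toy model of FRC/temporal dithering: display a 10-bit target on an 8-bit panel
          # by alternating the two nearest 8-bit levels so that the time average matches.
          def frc_sequence(target_10bit, frames=8):
              ideal = target_10bit / 4.0    # 10-bit -> 8-bit scale (1024 / 256 = 4)
              low = int(ideal)
              frac = ideal - low            # fraction of frames that need the higher level
              err = 0.0
              out = []
              for _ in range(frames):
                  err += frac
                  if err >= 1.0:
                      out.append(low + 1)   # occasionally show the next level up
                      err -= 1.0
                  else:
                      out.append(low)
              return out

          seq = frc_sequence(515)           # 10-bit level 515 sits between 8-bit 128 and 129
          print(seq)                        # mix of 128s and 129s
          print(sum(seq) / len(seq) * 4)    # time average back on the 10-bit scale: 515.0
          ```

          The same mechanism applied one stage down, toggling between 6-bit levels to fake the 8-bit range, is exactly the stacked-FRC scenario I'm wondering about.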

            chahahc

            I posted in the DIY oscilloscope PWM thread

            Moved to a new thread for organizational reasons. All posts are preserved, and a redirect link is in the other thread.

            machala Yeah, I think this plays a part in the eyestrain, especially since there's a difference in general image quality when switching from a K4200 to an A4000. I can try posting there, but I doubt they will take it seriously unless enough people join in and complain.

            HAL9000 The exact revision is "GA104-875-A1", and it seems closest to the RTX 3070 Ti (GA104-400-A1) and the RTX 3070 Ti 16GB, which was never actually released. I wonder if all GA104-based GPUs will cause the same symptoms, or if moving up to a GA102-based GPU (RTX 3080, 3090, A5000) would be any better. It was reported here that an RTX 3090 gave @screengazer eyestrain, so it seems like the GA102 series may cause the same symptoms. https://ledstrain.org/d/1048-new-graphics-card-rtx-3090-gives-eyestrain

            https://www.techpowerup.com/gpu-specs/nvidia-ga104.g964

            https://www.techpowerup.com/gpu-specs/nvidia-ga102.g930

            It sounds like you're having a similar experience with your 3060 Ti as I'm having with my A4000. I tried raising the sharpness in the control panel and it helped a very small amount, but the card is still not usable for me.

            I have an RTX 4000 based on the TU104 GPU arriving soon, which comes with Samsung VRAM; I really hope it will be free of eyestrain. If that fails I'll try an RTX 5000 after that and see what happens. The 5700 XTs I've tried have been the worst so far, though.

            https://www.techpowerup.com/gpu-specs/nvidia-tu104.g854

            6 days later

            So I received my Quadro RTX 4000, and out of the box this card is certainly usable. The image quality is "different", not bad or worse, just different, though it still feels slightly less flat. It doesn't cause me eyestrain/dizziness, and I'm glad to finally be able to use a somewhat recent GPU.

            I saw several posts about a program called NvColorControl:

            https://ledstrain.org/d/1048-new-graphics-card-rtx-3090-gives-eyestrain/5

            https://ledstrain.org/d/1119-g-sync-laptop-zero-eye-strain-from-gaming-tons-from-regular-use-help/20

            https://www.avsforum.com/threads/2020-lg-cx%E2%80%93gx-dedicated-gaming-thread-consoles-and-pc.3138274/page-21#post-59699820

            I ran it with the settings "NvColorControl.exe 10 RGB 2" and the image now looks the same as it did on my Quadro K4200. From what I understand of Vinz80's instructions, the "2" sets the dithering state to disabled; if that's true, it would mean Nvidia is simply enabling dithering for everything, even when the monitor has a true 10-bit panel.

            I still need to test it under Windows 10 to see if NvColorControl helps, but on my Windows 7 installation it made a huge difference.
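            One practical note: if it turns out the setting doesn't survive a reboot or a driver reset (I haven't verified how persistent it is yet), it would be easy to have it re-applied at logon via a Task Scheduler task pointing at a small wrapper like this (the path is just a placeholder for wherever you keep the exe):

            ```python
            # Tiny wrapper to re-apply the NvColorControl settings at logon via Task Scheduler.
            # Path and persistence behaviour are assumptions; adjust to your own setup.
            import subprocess

            NVCC = r"C:\Tools\NvColorControl\NvColorControl.exe"  # placeholder path
            ARGS = ["10", "RGB", "2"]  # 10-bit, RGB output, dithering state "2" (disabled)

            subprocess.run([NVCC, *ARGS], check=True)
            ```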

              6 days later

              Jack Thanks for your report! This is a great find. What driver version are you using on your Windows 7 setup?

              Jack So how do you guys figure out whether a given card has Samsung RAM? It's not specified anywhere, and even if you have the card in hand, all the internal components are covered by the cooling setup, so you can't see them unless you take the heatsink off…?

              @degen For Windows 7 the last officially released Quadro driver is 441.66, but I used the oldest stable-branch version I could find, which is R410 U6 (412.29), released February 22, 2019. They also have an older one from the "New Feature Branch", R415 U2 (416.78), from November 13, 2018. My Windows 7 installation is from an ISO without SP1; I install SP1 and the minimal related updates manually, and I replace the outdated security certificates with ones from a Windows 10 install by exporting each one manually, since on an old Windows 7 ISO they will have expired. I never use Windows Update, to make sure nothing unwanted gets by.
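              To avoid clicking through the certificate import wizard for every single exported certificate, they can be batch-imported with certutil from an elevated prompt. A rough sketch, assuming the exported certificates were saved as .cer files in one folder (the folder path is just a placeholder):

              ```python
              # Rough sketch: batch-import exported .cer files into the local machine Root store.
              # Run from an elevated prompt; the folder path is a placeholder.
              import glob
              import subprocess

              CERT_DIR = r"C:\certs-from-win10"  # placeholder for wherever the exports were saved

              for cer in glob.glob(CERT_DIR + r"\*.cer"):
                  print("importing", cer)
                  subprocess.run(["certutil", "-addstore", "-f", "Root", cer], check=True)
              ```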

              @machala Usually it's listed in GPU-Z, and I'm pretty sure all of the Quadro RTX and RTX A series use only Samsung VRAM and not Micron, Elpida or Hynix. If it isn't listed in GPU-Z then you would have to take the heatsink/fan assembly apart to inspect the VRAM modules, which also gives you an opportunity to replace the thermal paste, since the stock paste from the factory is usually bad.

                Side note: my iGPU only reports DDR3 because it's using system memory. CPU-Z can show you who the manufacturer is on the SPD tab; in my case, Kingston.
