Jack

I ran it with the settings "NvColorControl.exe 10 RGB 2" and the image now looks the same as on my Quadro K4200.

1) Would you say the fonts become thinner and sharper after this?

2) Does it create a registry branch:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\nvlddmkm\DisplayDatabase\XXX_mon]
with the value:
"DitherRegistryKey"=hex:db,nn,nn... ? (See the sketch at the end of this post for a quick way to check.)

3) Do you only need to run NvColorControl.exe once, or every time you restart/log in, to keep the desired effect (the image looking the same as on your Quadro K4200)?

4) Does this setting get reset by actions like changing resolution, starting games, or sleeping?

5) Does your monitor support 10-bit color?

I still need to test it under Windows 10 to see if NvColorControl helps, but on my Windows 7 installation it made a huge difference.

6) As far as I know, the Quadro series drivers have a built-in option in the driver GUI that lets you enable and disable dithering without additional programs.

Do you see this GUI switch?

Is it possible that the default value of this dithering switch is different on the K4200 than on the A4000?
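
In case it helps with question 2, here's a quick read-only Python sketch for checking it; the subkey name under DisplayDatabase varies per monitor (that's the XXX_mon part), so it just enumerates everything and dumps any DitherRegistryKey value it finds:

# Read-only check for question 2. Uses the branch and value name quoted
# above; prints every monitor entry the Nvidia driver created.
import winreg

BASE = r"SYSTEM\CurrentControlSet\services\nvlddmkm\DisplayDatabase"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as db:
    i = 0
    while True:
        try:
            mon = winreg.EnumKey(db, i)  # e.g. the "XXX_mon" entries
        except OSError:
            break  # no more subkeys
        i += 1
        with winreg.OpenKey(db, mon) as mon_key:
            try:
                # REG_BINARY comes back as bytes
                data, _ = winreg.QueryValueEx(mon_key, "DitherRegistryKey")
                print(mon, "->", data.hex(","))
            except OSError:
                print(mon, "-> no DitherRegistryKey")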

2 months later

I have two workstations in different locations, and on both I had problems like the ones you describe. In both cases the problem was solved by changing the motherboard BIOS version.
My theory is that this comes down to poor compatibility between devices from different generations. For example, you have a 2018 motherboard and you put a 2022 video card into it; I suspect there are desynchronization issues there.
On the first workstation the BIOS was too new and not compatible with older cards, so I had to roll it back.
On the second one I had to update the BIOS to the latest version. After that my new 3070 worked fine: no eye pain, no graininess in pictures, no bad color reproduction.

    6 days later

    CepheiHR8938

    Ah, this is very interesting. I'm about to upgrade my system from an EVGA X79 FTW motherboard from 2011 running a BIOS from 2014; there was an update for it in 2018 for the Spectre vulnerability, but I never bothered to install it. I'll try updating to the latest BIOS and see if I notice any improvements. In a few days I'm switching my system to an EVGA Z490 Kingpin with an i9 10900KF, so it will be interesting to test the A4000 again with that setup, since the motherboard is more period-correct for this GPU.

      14 days later

      CepheiHR8938

      I finished my upgrade, and with the i9 10900K/EVGA Z490 Dark KP and Quadro RTX 4000 I can use Windows 10 2004 build 19041.572 completely comfortably, whereas on my older hardware it felt grainy and straining. Not only can I use it, it's more comfortable than my Windows 7/Quadro K4200 setup, though that might be because I'm using a 4K monitor and Windows 10 handles font scaling better; I'm not completely sure. I haven't tested the RTX A4000 yet since I'm a bit nervous it could sabotage the setup, so I ordered another NVMe SSD to clone the existing install onto before swapping in the A4000. I think you are onto something about poor compatibility between devices; my X79 motherboard may simply have been too old.

      I also noticed a big increase in audio quality after switching to the new Z490 motherboard, heard through my Schiit Modi Multibit USB DAC, which has an iFi Defender isolating the 5V line from the motherboard's USB power and feeding it from an Allo Shanti linear power supply. Since the 5V line is isolated, the audio quality shouldn't have improved or changed, but it did. On my X79 board, if I ever plugged a microphone into the 3.5mm jack on the back, it would emit a ton of electrical noise and people on Discord would always complain about it. So maybe some motherboards have poor separation of traces/components and suffer from noise, and maybe newer graphics cards are more sensitive to this as well, just like audio equipment and TVs/monitors.

      Old Setup: EVGA X79 FTW | Xeon E5-2690 | Quadro K4200 & Quadro RTX 4000 | Dell UP2720Q true 10-bit monitor. (Unusable with the new RTX A4000, and previously with Windows 10 2004.)

      New Setup: EVGA Z490 Dark KP | i9 10900K | Quadro RTX 4000 | Dell UP2720Q true 10-bit monitor. (Now usable with Windows 10 2004; will test with the RTX A4000 soon.)

        Jack On my X79 board, if I ever plugged a microphone into the 3.5mm jack on the back, it would emit a ton of electrical noise and people on Discord would always complain about it. So maybe some motherboards have poor separation of traces/components and suffer from noise

        Potential interactions between GPUs and motherboards are a complex subject, but sound quality issues caused by electrical interference with onboard audio are definitely a known thing.

        Jack

        The symptoms you describe are exactly what I experience in front of most modern monitors and laptop screens: this weird blurriness, and the image not looking completely flat. It's like the image is dirty. And it's not the anti-glare layer, since it's the same on glossy panels. I don't understand how they manage to make images look worse than ten years ago, but somehow they do it while selling GPUs for thousands of dollars. Something is really off.

        7 days later

        Jack Hello Jack, how is your new setup with the RTX 4000? It's been a few days since you got it. Did it help get rid of your problems? I'm planning to get an RTX 4000 myself based on your posts; I think it's the only option left. I tried a MacBook Pro 16 2021 and it was horrible. Now I'm writing from a Legion 5 Pro, same thing, it's barely usable.

        3 months later

        Just a long-term update: I can use the Quadro RTX 4000 8GB without eyestrain, so it seems some Turing-based cards can be good while others are bad, but the RTX A4000 16GB (roughly an RTX 3070 equivalent) is still absolutely unusable in Windows 10 2004, Windows 7, and Linux. I'm pretty much convinced at this point that the GPUs themselves are the problem; even after upgrading to an EVGA Z490 Dark and i9-10900K, the RTX A4000 is still unusable. I have tested this with multiple monitors over that time.

        Dell UP2720Q, Dell UP3218K, Dell UP3221Q, Dell UP3017, Dell U2720Q, Dell UP2716D, Dell U3223QE, Dell U2723QE, Dell U3219Q. None of these monitors made a difference with a problematic card; there was always strain compared to a known-good reference card like the Quadro K4200.

        List of graphics cards I tested:

        RTX 4000 (good), RTX A4000 (bad), 5700 XT (bad), RX 480 (bad), GTX 680 Classified (good), Quadro K4200 (good), Quadro 4000 (good), FirePro V5900 (good)

        For my other systems I'm just resorting to buying the most powerful Kepler-based GPUs, although I'm planning to test the Radeon Pro W5500 and W5700 soon as a final experiment. I also plan to test some cards on Windows Server 2016 and 2019, just to see if anything behaves differently. Hopefully a way to edit the GPU BIOS to disable this hardware-level dithering becomes available someday.

          Jack Just a long-term update: I can use the Quadro RTX 4000 8GB without eyestrain, so it seems some Turing-based cards can be good while others are bad, but the RTX A4000 16GB (roughly an RTX 3070 equivalent) is still absolutely unusable in Windows 10 2004, Windows 7, and Linux. I'm pretty much convinced at this point that the GPUs themselves are the problem; even after upgrading to an EVGA Z490 Dark and i9-10900K, the RTX A4000 is still unusable. I have tested this with multiple monitors over that time.

          Nice hearing from you again

          My research is inconclusive and not yet "done" (in part due to a lack of needed hardware), but I have reason to believe that a) per-unit variance may exist between otherwise identical GPUs, and b) certain firmware differences in both the motherboard and the GPU VBIOS may cause subtle interactions leading to issues that aren't easy to isolate.

          And that's before any alleged application/OS rendering issues.

          Nobody said it was going to be easy.


            Jack Just FYI, if you happen to be located in Europe/Germany, Amazon.de currently sells the W5700 for a cheap 300€.

              karut Amazon.de can ship internationally in some cases

                JTL but returning the card wouldn't be as easy of course


                  karut Probably not. Amazon generally has a good return policy, but I can imagine it might be complicated with international shipping.

                  JTL I agree. I think there are different VBIOS revisions on certain cards, and that using certain older motherboards with newer GPUs can produce these different interactions. I'll keep my bad RTX A4000 around and try to find a VBIOS posted on the internet to flash it with as an experiment; they all use Samsung VRAM and are reference models, so it should be pretty easy to do.

                  karut Thanks, but I'm located in the USA right now, so I can pick them up pretty cheap on eBay.

                  4 days later

                  I know this isn't a real solution, but it works nicely. If you have a motherboard with two PCIe slots, you can use an old Kepler-based GPU to drive your monitor and let the more powerful, eyestrain-inducing GPU handle high-performance applications or games. I have three extra Quadro K4200s, all safe for my eyes, so I tossed the problematic RTX A4000 into my secondary system and used the EnableMsHybrid registry trick (see the sketch below the links) to select it as the high-performance GPU; it runs applications wonderfully and is also amazing for games. An added bonus is that you keep the clear, crisp image of the safe Kepler GPU and can run certain games at 4K without that blurry dithered look. The image quality is superb this way. I'm using Windows 10 2004 19041.789; with safe GPUs I have no problem with this version of Windows 10. A lot of people from this thread use old Tesla GPUs with this trick, paired with either a weak primary GPU or an integrated GPU.

                  https://forum.level1techs.com/t/gaming-on-my-tesla-more-likely-than-you-think/

                  https://forum.level1techs.com/t/gaming-on-my-tesla-more-likely-than-you-think/171185/70
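
                  For anyone wanting to replicate the EnableMsHybrid trick: as I understand it from the level1techs threads above, you mark each GPU with an EnableMsHybrid DWORD under its display-adapter class key (reportedly 1 = high-performance GPU, 2 = power-saving GPU); treat those values as hearsay from the threads rather than anything official. Here's a small read-only Python sketch to find the right adapter key first (the write is left commented out on purpose):

                  # List the display-adapter driver keys so you can tell which
                  # numbered subkey belongs to which card before touching anything.
                  import winreg

                  # Standard display-adapter device class GUID.
                  CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                               r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

                  with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as cls:
                      i = 0
                      while True:
                          try:
                              name = winreg.EnumKey(cls, i)  # "0000", "0001", ...
                          except OSError:
                              break  # no more subkeys
                          i += 1
                          if not name.isdigit():
                              continue  # skip non-adapter entries like "Properties"
                          with winreg.OpenKey(cls, name) as adapter:
                              try:
                                  desc, _ = winreg.QueryValueEx(adapter, "DriverDesc")
                              except OSError:
                                  continue
                              print(name, desc)
                              # Once you know which index is which card, reopen it
                              # elevated with winreg.KEY_SET_VALUE and (per the
                              # threads, unverified by me):
                              # winreg.SetValueEx(adapter, "EnableMsHybrid", 0,
                              #                   winreg.REG_DWORD, 1)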

                    Jack Thanks for the two-GPU idea as a backup solution if everything else fails.

                    7 days later

                    I was reading about EDID information and came across a post saying that a graphics card will dither if it's connected to a monitor that reports a lower bit depth than the maximum the card can output. These new cards can apparently output 12-bit color, so even with a real 10-bit monitor the dithering stays active. I exported my monitor's EDID and have been trying to find out whether the bit depth is defined in it; I don't think it is, and if it isn't, that might explain why the graphics card dithers. My theory is that even if the EDID does define 10-bit, changing it to 12-bit would hopefully trick the GPU into turning off its dithering. Older Nvidia cards don't support 12-bit color, which leads me to believe that modding the monitor's EDID might trick new GPUs into disabling it.
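
                    While digging: if I'm reading the EDID 1.4 spec right, the declared color depth lives in byte 20 of the 128-byte base block (bits 6..4, digital inputs only: 0 = undefined, 3 = 10-bit, 4 = 12-bit), and byte 127 is a checksum that makes the whole block sum to 0 mod 256. Here's a minimal Python sketch of the inspect-and-patch step I'm describing; the file names are just placeholders:

                    # Inspect/patch the declared bit depth in an EDID 1.4 base block.
                    # Assumption: byte 20 = video input parameters (bit 7 set means
                    # digital input, bits 6..4 encode color depth); byte 127 is the
                    # checksum making the 128-byte block sum to 0 mod 256.
                    DEPTHS = {0: "undefined", 1: "6-bit", 2: "8-bit", 3: "10-bit",
                              4: "12-bit", 5: "14-bit", 6: "16-bit"}

                    edid = bytearray(open("monitor.bin", "rb").read())  # exported EDID
                    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID"

                    if edid[20] & 0x80:  # digital input
                        print("declared depth:", DEPTHS[(edid[20] >> 4) & 0x07])
                        edid[20] = (edid[20] & 0x8F) | (4 << 4)      # claim 12-bit
                        edid[127] = (256 - sum(edid[:127])) % 256    # fix checksum
                        open("monitor_12bit.bin", "wb").write(bytes(edid))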

                    So I was able to spoof my monitor's EDID to make it appear as a 12-bit monitor; this didn't help at all. I used the "ColorControl" program to make sure Nvidia dithering is disabled at the registry level, and there is still massive strain with the RTX A4000. https://github.com/Maassoft/ColorControl Maybe it will work once I get nvdithctrl running, but at this point I'm convinced this is a VBIOS problem.

                    I was just reading about how different generations of GPUs do rendering; there is a "Tiled Rendering mode" and an "Immediate Rendering mode".

                    Early in the development of desktop GPUs, several companies developed tiled architectures. Over time, these were largely supplanted by immediate-mode GPUs with fast custom external memory systems.

                    Major examples of this are:

                    PowerVR rendering architecture (1996): The rasterizer consisted of a 32×32 tile into which polygons were rasterized, processing multiple pixels in parallel. On early PC versions, tiling was performed in the display driver running on the CPU. In the Dreamcast console, tiling was performed by a piece of hardware. This facilitated deferred rendering: only the visible pixels were texture-mapped, saving shading calculations and texture bandwidth.

                    Microsoft Talisman (1996)

                    Dreamcast (powered by PowerVR chipset) (1998)

                    Gigapixel GP-1 (1999)[6]

                    Intel Larrabee GPU (2009) (canceled)

                    PS Vita (powered by PowerVR chipset) (2011)[7]

                    Nvidia GPUs based on the Maxwell architecture and later architectures (2014)[8] <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

                    AMD GPUs based on the Vega (GCN5) architecture and later architectures (2017)[9][10] <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

                    Intel Gen11 GPU and later architectures (2019)[11][12][13]

                    Examples of non-tiled architectures that use large on-chip buffers are:

                    Xbox 360 (2005): the GPU contains an embedded 10 MB eDRAM; this is not sufficient to hold the raster for an entire 1280×720 image with 4× multisample anti-aliasing, so a tiling solution is superimposed when running in HD resolutions and 4× MSAA is enabled.[14]

                    Xbox One (2013): the GPU contains an embedded 32 MB eSRAM, which can be used to hold all or part of an image. It is not a tiled architecture, but it is flexible enough that software developers can emulate tiled rendering.[15]
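
                    (As a sanity check on the Xbox 360 example above: 1280 × 720 pixels × 4 samples × 8 bytes, assuming 4-byte color plus 4-byte depth per sample, is about 28 MB, well over the 10 MB of eDRAM, hence the superimposed tiling.) To make the two modes concrete, here's a toy Python sketch of the tiled approach; real GPUs do this binning in fixed-function hardware, so treat every name and detail here as illustrative only:

                    # Toy illustration of tile binning, the core of a tiled renderer.
                    # Triangles are (x, y) vertex triples in pixel coordinates.
                    TILE = 32  # PowerVR-style 32x32 pixel tiles

                    def tiles_touched(tri, width, height):
                        # Conservative binning: every tile the bounding box overlaps.
                        xs = [v[0] for v in tri]
                        ys = [v[1] for v in tri]
                        x0, x1 = max(0, min(xs)) // TILE, min(width - 1, max(xs)) // TILE
                        y0, y1 = max(0, min(ys)) // TILE, min(height - 1, max(ys)) // TILE
                        return [(tx, ty) for ty in range(y0, y1 + 1)
                                for tx in range(x0, x1 + 1)]

                    def tiled_render(triangles, width, height):
                        bins = {}  # pass 1: bin each triangle into the tiles it may cover
                        for tri in triangles:
                            for tile in tiles_touched(tri, width, height):
                                bins.setdefault(tile, []).append(tri)
                        # Pass 2: shade one tile at a time; a 32x32 block of color+depth
                        # fits on-chip, so external memory is written once per tile.
                        for (tx, ty), tris in sorted(bins.items()):
                            print(f"tile ({tx},{ty}): {len(tris)} triangle(s)")

                    # An immediate-mode GPU instead rasterizes each triangle straight
                    # into the full framebuffer in external memory as it is submitted.
                    tiled_render([[(5, 5), (60, 10), (20, 50)]], width=128, height=128)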

                    https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)

                    https://en.wikipedia.org/wiki/Tiled_rendering

                    https://www.techpowerup.com/231129/on-nvidias-tile-based-rendering

                    https://www.kitguru.net/components/graphic-cards/joao-silva/nvidia-is-developing-a-new-multi-gpu-tiled-rendering-technique-for-turing-cards/

                      a month later

                      Jack
                      Thank you for not giving up trying to get to the truth.

                      Regarding my post CepheiHR8938
                      I must admit that I jumped to conclusions about the system with the 3070 in one of the locations. I didn't have time to do a proper test. I have now gone back there and found that the discomfort is still present, so I want to do more tests with different drivers and BIOS versions.
                      The case at the other location, where a 730 was replaced with a 1060 on the Ryzen 2600 platform, still stands: there the BIOS downgrade really did help.
                      I also want to share an interesting phenomenon from after I bought the new 3070. Before installing it, the system had a 1060. I did not scrub the old drivers with DDU or anything similar; I just plugged in the 3070, and the system worked fine with the drivers that had been installed for the 1060. No eye pain, and a great picture. Then Windows decided to update the drivers on its own, without my involvement, and everything got messed up. I was very angry and reinstalled the driver version I had before, but with the 3070 active I was not able to get back to the previous state where everything was fine.
                      This led me to believe that the problem lies more in software, and in the mode the driver selects depending on the video card. What exactly that is remains to be found out.

                      dev