Jack
Thank you for not giving up trying to get to the truth.

Regarding my post CepheiHR8938
I must admit I jumped to conclusions about the system with the 3070 at one of the locations. I didn't have time to do a proper test. I went back there and found that the discomfort is still present, so now I want to run more tests with different drivers and BIOS versions.
The case at the other location, where the 730 was replaced with a 1060 on the Ryzen 2600 platform, is still valid, and there the BIOS downgrade really did help.
I also want to share an interesting phenomenon after I bought a new 3070. Before installing the 3070, the system had a 1060. I did not scrub the old drivers out of the system with DDU or anything similar; I just plugged in the 3070, and the system worked fine with the drivers that had been installed for the 1060. No eye pain, and a great picture. Then Windows decided to update the drivers on its own, without my involvement, and everything got messed up. I was very angry and decided to reinstall the driver version that had been there before, but with the 3070 now active I was not able to get back to the previous state where everything was fine.
This led me to believe that the problem lies more on the software side, in the mode the driver operates in depending on the video card. What exactly that is remains to be found out.

I have seen it suggested that it is sometimes an improvement if you do all updates first, and the driver last.

Worth a try on a fresh partition or external drive.

I just got a Gigabyte M28U monitor, and it's true 8-bit. Maybe you want to try it. It works for me, at least it's the second day I'm using it. Can't add anything more right now.

    5 months later

    So something interesting happened. I decided to update the BIOS on my EVGA Z490 Dark motherboard to test whether a newer BIOS would give me any more overclocking headroom, and once I upgraded from version 1.07 (released 9/28/2020) to version 1.10 (released 11/11/2021), I started to experience eyestrain with my Quadro RTX 4000; even the BIOS screen had the strange smeared/glossy/blurry look that other bad cards I've tested had. I flashed the BIOS back to version 1.07 and everything went back to normal and is comfortable. I recommend anyone battling a bad card flash and test the entire range of BIOSes available for their motherboard.

    Maybe this could be caused by the GOP in the motherboard BIOS interacting with the GOP module in the GPU in different ways, with some combinations working better with certain models of GPU than others.

    https://winraid.level1techs.com/t/gop-update-and-extraction-tool-nvidia-only/91381/

    https://winraid.level1techs.com/t/amd-and-nvidia-gop-update-no-requests-diy/30917/923?page=36

    https://ledstrain.org/d/261-what-works-for-you-what-do-you-use-now-without-problems/187

    https://forums.guru3d.com/threads/display-port-gop-updater-guide-fix-blanking-screens-and-improve-monitor-compatibility.421417/

    My current Quadro RTX 4000 BIOS has UEFI version 5000B. I also found an older BIOS for my card that has UEFI version 5000A, and a newer one with version 5000D. I bet that if I wanted to use the newer Z490 Dark BIOS, I could update my UEFI GOP and it would be normal.

    And here is a screenshot of a Quadro K4200, a Kepler-based GPU, which is using UEFI version 1002F.

    In a case where it is not possible to make a bad GPU comfortable by testing various motherboard BIOS versions, it would certainly be worth trying different GOP versions in the GPU's vbios until a comfortable one is found.

      I'm not sure if it's worth experimenting with flashing away to find new combos that work, when you already have a combo that works today.

        Sunspark I have an identical second system I'm working on that uses the same model motherboard, so I'll work on making the RTX A4000 (Ampere-based) card comfortable there, and I'll be able to test all these possible combinations soon. My Quadro RTX 4000 currently has a waterblock on it, so it's not quick to swap out for testing in my primary system.

        @Jack I hope you find something interesting here, because I can say with 100% certainty now that the issue is baked into the goddamn motherboard, and it somehow infects every card that it touches. I have a system that was working for me for years without eyestrain; I fucked it up by installing a 970 from another system, and now it dithers horrifically. I have replaced everything in this system except for the motherboard, and the horrific dithering is still there.

          Jack

          I realize this isn't trivial, but if I were in the same situation I'd get a lossless capture card set up on another computer and actually compare the raw frames between a "good" and a "bad" firmware configuration, with the same GPU and motherboard in both (this can be done with my VideoDiff software). Very quickly this would lead to empirical evidence of what's wrong, which (hopefully) could be sent to the "right people" in the right places for (potentially) a solution.
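Not VideoDiff itself, but a minimal Python sketch of the kind of comparison I mean, assuming both captures of the same still image are saved as lossless PNGs (the file names below are just placeholders):

```python
# Minimal frame-diff sketch (not VideoDiff itself): compare two lossless
# captures of the same still image, one taken under the "good" firmware
# and one under the "bad" firmware. File names are placeholders.
import numpy as np
from PIL import Image

good = np.asarray(Image.open("capture_good_bios.png").convert("RGB"), dtype=np.int16)
bad = np.asarray(Image.open("capture_bad_bios.png").convert("RGB"), dtype=np.int16)

diff = np.abs(good - bad)            # per-pixel, per-channel difference
changed = np.any(diff > 0, axis=-1)  # mask of pixels that differ at all

print(f"differing pixels: {changed.sum()} of {changed.size}")
print(f"largest per-channel delta: {diff.max()}")
```

Running the same comparison on consecutive frames of a single capture is also how temporal dithering would show itself: pixels flipping from frame to frame even though the source image is static.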

            If you think you have a bad motherboard that is doing something to every card you plug in, then stop plugging cards into it. Get a new motherboard.

            BloodyHell619 What model motherboard do you have? Try flashing the original release BIOS version to it, and if the release BIOS doesn't fix it, flash a BIOS version with a release date as close as possible to the release date of the graphics card or the date of its vbios. If your motherboard model doesn't have many BIOS versions to choose from, then save a copy of your GPU's original vbios and use the GOP update tool to change the GOP version inside the vbios file, then flash it to see if that fixes the issue. Use "nvflash64.exe --version romnamehere.rom" to check which GOP version your vbios has, and select one that is the same size as the original to add into the vbios file.
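If you end up with a folder of vbios dumps to sort through, a rough helper like the sketch below can print each file's UEFI/GOP lines and its size side by side. It only uses the --version flag mentioned above; the folder name and the assumption that nvflash's report mentions UEFI/GOP on their own lines are mine and may need adjusting:

```python
# Rough helper sketch: run "nvflash64.exe --version <rom>" over a folder of
# saved vbios dumps and print any UEFI/GOP lines plus the file size of each.
# Assumptions: a "vbios_dumps" folder exists, and nvflash's report includes
# lines containing "UEFI" or "GOP"; adjust the filtering to the real output.
import subprocess
from pathlib import Path

for rom in sorted(Path("vbios_dumps").glob("*.rom")):
    report = subprocess.run(
        ["nvflash64.exe", "--version", str(rom)],
        capture_output=True, text=True
    ).stdout
    print(f"{rom.name}  ({rom.stat().st_size} bytes)")
    for line in report.splitlines():
        if "UEFI" in line or "GOP" in line:
            print(f"  {line.strip()}")
```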

            @JTL I do want to pick up a lossless capture card, but the only model that seems to be lossless is the DVI2PCIe. I wanted to check whether there are other lossless PCIe capture cards available, but if not I will pick up a DVI2PCIe to get some actual captures of a bad setup and a good setup.

            • JTL replied to this.

              Jack One thing the DVI2PCIe card can do that others cannot (at least easily) is emulate arbitrary EDID data from another display. Whether different EDID data has any bearing on what GPUs do is an open question, but I certainly consider leaving that avenue open for investigation a good thing.

              The only thing I dislike about it is that it requires either a Windows driver or a distribution-specific Linux driver, so it isn't ideal. VideoDiff currently works best under Linux, but optimizations for Windows could be done in the future.

              Sunspark If one could demonstrate the same GPU rendering "good" or "bad" output purely by virtue of firmware changes, and empirically measure the differences between them, I would certainly consider that to have potential for research.

                JTL Only with the caveat that there are other systems that are safe to fall back on should something go wrong, as is currently happening with BloodyHell619, who alleges that whatever the board touches turns to coal. They only have the one system. If Jack has multiple systems, then sure, why not.

                JTL I think the problem is not the picture itself or its structure; the problem is how those frames are delivered to the monitor. In my perception, in the configurations that cause eye discomfort I start to feel the screen flickering. The funny thing is that this flickering is not confirmed by the pencil test or by a camera. It seems to me that these tests only rule out flickering of the monitor backlight, not flickering of the rendering.

                  CepheiHR8938 I haven't seen anyone with issues with newer NVIDIA cards attempt any lossless capture of the display output, so if anyone is able to do it, it will be an interesting test.

                  5 months later

                  smilem The author of the topic just needs to read reviews of his monitor. They say the Dell UP2720Q is actually 8-bit + FRC: dithering (FRC) is forced in order to output 10-bit, even if the panel is a "True 10-bit" one. There are no such problems on the cheaper BenQ EW2780U.

                  Based on the declared number of reproducible shades, one could conclude that both monitors are capable of operating in 10-bit mode. But is that really the case? Not quite.

                  The Dell UP2720Q does this through dithering (FRC), and given the device's extended color gamut, this is a necessary measure.

                  The BenQ EW2780U is much simpler: its 10-bit mode exists only because of support for one of the HDR standards, but in reality no more than 8 bits are needed for high-quality color reproduction on its standard-gamut panel. That is exactly how the device's native mode is configured, and that is the mode you should stick with. In other words, it makes no sense to use 10-bit mode on the EW2780U.
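To make clear what FRC actually does, here is a toy sketch of temporal dithering: an 8-bit panel approximating a 10-bit level by alternating the two nearest 8-bit values across frames. The numbers and frame pattern are purely illustrative, not how any particular monitor implements it:

```python
# Toy illustration of FRC (temporal dithering): an 8-bit panel approximates a
# 10-bit target by alternating between the two nearest 8-bit levels so that
# the average over a short frame cycle lands on the target. Values and
# pattern are illustrative only.
import numpy as np

target_10bit = 601                        # a 10-bit level with no exact 8-bit match
target_in_8bit_units = target_10bit / 4.0  # 150.25 in 8-bit units

lo = int(np.floor(target_in_8bit_units))
hi = lo + 1
frac = target_in_8bit_units - lo          # fraction of frames that must show `hi`

frames = 4                                # a 4-frame cycle, as in 2-bit FRC
pattern = np.array([hi if i < round(frac * frames) else lo for i in range(frames)])

print("frame sequence (8-bit):", pattern)      # e.g. [151 150 150 150]
print("time-averaged level:", pattern.mean())  # 150.25, i.e. the 10-bit target
```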

                  • Jack replied to this.

                    smilem

                    Still testing some other motherboard platforms, but I was unable to get my RTX A4000 to work without strain on my Z490. I even tested it with a lossless capture card, and the card itself doesn't dither in the BIOS. If I swap over to a Quadro RTX 4000 or K4200, the strain is gone.

                    AlanSmith

                    That's the first review I've seen mention that the UP2720Q is 8-bit + 2-bit FRC; several other reviews, and Dell themselves, claim it's true 10-bit. In my case, though, the monitor is not the problem, since I have several configurations in which I can use this monitor strain-free. The EW2780U is 8-bit + 2-bit FRC, but that review is saying the +2-bit FRC only comes into play when displaying HDR. Are they trying to say the UP2720Q uses FRC even at 8-bit?

                      AlanSmith Wow, that's crazy, so both Dell and Eizo are lying about their "10-bit" panels being true 10-bit? I heard rumors that the UP2720Q uses the same panel as the CS2740, and the coincidence that both of these "10-bit" panels use forced dithering even in 8-bit mode seems to confirm that. Eizo themselves claim it's true 10-bit, though, or do you think they really are lying?
