RTX A4000 Causes Eyestrain With True 10-bit Monitor
I was reading about EDID information and came across a post saying that a graphics card will dither if it is connected to a monitor that reports a lower bit depth than the card can output at its maximum. These new cards can apparently output 12-bit color, so even with a real 10-bit monitor the dithering would still be active. I exported my monitor's EDID and have been trying to find out whether the bit depth is defined in it, and I don't think it is; if it isn't defined, that might explain why the graphics card is dithering. My theory is that even if the EDID does define the panel as 10-bit, changing it to 12-bit might trick the GPU into turning off its dithering. Older Nvidia cards don't support 12-bit color, which leads me to believe that modding the monitor's EDID could trick newer GPUs into disabling the dithering.
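For anyone who wants to check their own export: in EDID 1.4 the declared bits per color for a digital input live in the Video Input Definition byte at offset 20, and byte 127 is a checksum over the whole 128-byte base block, so any spoofed byte has to be re-checksummed too. Here is a minimal Python sketch of that check; the file and function names are just mine, only the byte layout comes from the EDID spec:

```
# edid_bitdepth.py - check the declared bit depth in an exported 128-byte
# EDID base block, and recompute the checksum if you edit it.
# Minimal sketch; only the EDID 1.4 byte layout is from the spec.

DEPTH_BITS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def declared_bit_depth(edid: bytes) -> str:
    assert len(edid) >= 128, "need at least the 128-byte base block"
    version, revision = edid[18], edid[19]
    vid = edid[20]                      # Video Input Definition byte
    if not (vid & 0x80):
        return "analog input - no bit depth field"
    if (version, revision) < (1, 4):
        return "EDID older than 1.4 - bit depth not encoded here"
    code = (vid >> 4) & 0b111
    return f"{DEPTH_BITS.get(code, '?')} bits per color" if code else "undefined"

def fix_checksum(edid: bytearray) -> bytearray:
    # Byte 127 must make the 128-byte block sum to 0 mod 256, so a spoofed
    # byte 20 needs the checksum recomputed or the display may reject the EDID.
    edid[127] = (-sum(edid[:127])) & 0xFF
    return edid

if __name__ == "__main__":
    import sys
    data = open(sys.argv[1], "rb").read()
    print(declared_bit_depth(data))
```

Run it against the exported binary (e.g. `python edid_bitdepth.py monitor.bin`) to see whether the bit depth is actually declared or left as "undefined".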
So I was able to spoof the EDID of my monitor to make it appear as a 12-bit monitor; this didn't help at all. I use the ColorControl program to make sure Nvidia dithering is disabled at the registry level, and there is still massive strain with the RTX A4000. https://github.com/Maassoft/ColorControl Maybe when I get nvdithctrl running it might work, but at this point I'm 100% convinced this is a VBIOS problem.
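If anyone wants to sanity-check what actually ended up in the registry after running ColorControl, here is a read-only Python sketch. I'm not assuming any particular value names (those differ between driver versions and tools); it just lists anything with "dither" in the name under the standard display adapter class key:

```
# dither_reg_check.py - list any registry value with "dither" in its name
# under the display adapter class keys in HKLM. Read-only sketch; the exact
# value names NVIDIA/ColorControl use are not assumed, we just search for them.
import winreg

DISPLAY_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def find_dither_values():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
        i = 0
        while True:
            try:
                sub = winreg.EnumKey(cls, i)   # "0000", "0001", ...
            except OSError:
                break
            i += 1
            try:
                key = winreg.OpenKey(cls, sub)
            except OSError:
                continue                        # e.g. protected "Properties" key
            with key:
                j = 0
                while True:
                    try:
                        name, value, _ = winreg.EnumValue(key, j)
                    except OSError:
                        break
                    j += 1
                    if "dither" in name.lower():
                        yield sub, name, value

if __name__ == "__main__":
    for sub, name, value in find_dither_values():
        print(f"{sub}\\{name} = {value}")
```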
I was just reading about how different generations of GPUs do rendering, and apparently there is a "tiled rendering mode" and an "immediate rendering mode".
Early in the development of desktop GPUs, several companies developed tiled architectures. Over time, these were largely supplanted by immediate-mode GPUs with fast custom external memory systems.
Major examples of this are:
PowerVR rendering architecture (1996): the rasterizer consisted of a 32×32 tile into which polygons were rasterized, processing multiple pixels in parallel. On early PC versions, tiling was performed in the display driver running on the CPU. In the Dreamcast console, tiling was performed by dedicated hardware. This facilitated deferred rendering: only the visible pixels were texture-mapped, saving shading calculations and texture bandwidth.
Microsoft Talisman (1996)
Dreamcast (powered by PowerVR chipset) (1998)
Gigapixel GP-1 (1999)[6]
Intel Larrabee GPU (2009) (canceled)
PS Vita (powered by PowerVR chipset) (2011)[7]
Nvidia GPUs based on the Maxwell architecture and later architectures (2014)[8] <<<
AMD GPUs based on the Vega (GCN5) architecture and later architectures (2017)[9][10] <<<
Intel Gen11 GPU and later architectures (2019)[11][12][13]
Examples of non-tiled architectures that use large on-chip buffers are:
Xbox 360 (2005): the GPU contains an embedded 10 MB eDRAM; this is not sufficient to hold the raster for an entire 1280×720 image with 4× multisample anti-aliasing, so a tiling solution is superimposed when running in HD resolutions and 4× MSAA is enabled.[14]
Xbox One (2013): the GPU contains an embedded 32 MB eSRAM, which can be used to hold all or part of an image. It is not a tiled architecture, but is flexible enough that software developers can emulate tiled rendering.[15]
https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)
https://en.wikipedia.org/wiki/Tiled_rendering
https://www.techpowerup.com/231129/on-nvidias-tile-based-rendering
Jack
Thank you for not giving up trying to get to the truth.
Regarding my post CepheiHR8938
I must admit that I jumped to conclusions about the system with the 3070 at one of the locations. I didn't have time to do a proper test. I went back there and found that the discomfort is still present, so now I want to do more tests with different drivers and BIOS versions.
The case at the other location, where a 730 was replaced with a 1060 on a Ryzen 2600 platform, is still valid, and the BIOS downgrade really did help there.
I also want to share an interesting phenomenon after buying a new 3070. Before installing the 3070, the system had a 1060. I did not scrub the old drivers with DDU or anything similar; I just plugged in the 3070 and the system worked fine with the drivers that had been installed for the 1060. No eye pain, and a great picture. Then Windows decided to update the drivers on its own, without my involvement, and everything got messed up. I was very angry and decided to reinstall the driver version I had before, but with the 3070 active, and I was not able to get back to the previous state where everything was fine.
This led me to believe that the problem lies more on the software side, in the mode the driver chooses depending on the video card. What exactly, though, remains to be found out.
I have seen it suggested that it sometimes helps to do all the updates first and install the driver last.
Worth a try on a fresh partition or an external drive.
I just got a Gigabyte M28U monitor, and it's true 8-bit. Maybe you want to try it. It works for me, at least it's the second day I'm using it. Can't add anything more right now.
So something interesting happened. I decided to update the BIOS on my EVGA Z490 Dark motherboard to test whether a newer BIOS would give me any more overclocking headroom. Once I upgraded from version 1.07, released on 9/28/2020, to version 1.10, released on 11/11/2021, I started to experience eyestrain with my Quadro RTX 4000; even the BIOS screen had the strange smeared/glossy/blurry look that other bad cards I've tested have had. I flashed back to version 1.07 and everything went back to normal and is comfortable. I recommend anyone battling a bad card to flash and test the entire range of BIOSes available for their motherboard.
Maybe this is caused by the motherboard firmware interacting with the GOP (Graphics Output Protocol) module in the GPU's VBIOS in different ways, with some combinations working better for certain GPU models than others.
https://winraid.level1techs.com/t/gop-update-and-extraction-tool-nvidia-only/91381/
https://winraid.level1techs.com/t/amd-and-nvidia-gop-update-no-requests-diy/30917/923?page=36
https://ledstrain.org/d/261-what-works-for-you-what-do-you-use-now-without-problems/187
My current Quadro RTX 4000 VBIOS has UEFI (GOP) version 5000B. I also found an older VBIOS for my card with UEFI version 5000A, and a newer one with version 5000D. I bet that if I wanted to use the newer Z490 Dark BIOS, I could update my UEFI GOP and it would be normal again.
And here is a screenshot of a Quadro K4200, a Kepler-based GPU, which is using UEFI version 1002F.
In a case where it is not possible to make a bad GPU comfortable by testing various motherboard BIOS versions, it would certainly be worth trying different GOP versions in the GPU's VBIOS until a comfortable one is found.
I'm not sure it's worth experimenting with flashing to find new combos that work when you already have a combo that works today.
Sunspark I have an identical second system I'm working on that uses the same model motherboard, so I'll work on making the RTX A4000 (Ampere-based) card comfortable there and will be able to test all these possible combinations soon. My Quadro RTX 4000 currently has a waterblock on it, so it's not quick to swap out for testing in my primary system.
@Jack I hope you find something interesting here, because I can now say with 100% certainty that the issue is baked into the goddamn motherboard and it somehow infects every card that it touches. I had a system that worked for me for years without eyestrain; I fucked it up by installing a 970 from another system, and now it dithers horrifically. I have replaced everything in this system except the motherboard and the horrific dithering is still there.
I realize this isn't trivial, but if I were in the same situation I'd set up a lossless capture card on another computer and actually compare the raw frames between a "good" and a "bad" firmware configuration with the same GPU and motherboard (this can be done with my VideoDiff software). That would quickly produce empirical evidence of what's wrong, which (hopefully) could be sent to the right people in the right places for (potentially) a solution.
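This isn't VideoDiff itself, but to give a rough idea of the comparison involved, here is a small Python sketch (file names are placeholders): capture the same completely static test image twice, one frame apart, from each setup and diff the frames. On a static source, any pixels that change between frames point to temporal dithering in the GPU's output.

```
# frame_delta.py - crude temporal-dithering check on two lossless captures of
# the SAME static test image, taken one frame apart. Not VideoDiff, just an
# illustration of the comparison it would make. Needs numpy and Pillow.
import numpy as np
from PIL import Image

def frame_delta(path_a: str, path_b: str) -> None:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    diff = np.abs(a - b)
    changed = np.any(diff > 0, axis=2)          # per-pixel: did anything move?
    total = changed.size
    print(f"changed pixels: {changed.sum()} / {total} "
          f"({100.0 * changed.sum() / total:.3f}%)")
    print(f"max per-channel delta: {diff.max()}")
    # A static source with 0 changed pixels means no temporal dithering in the
    # signal; a chunk of pixels toggling by +/-1 LSB between frames is the
    # classic spatio-temporal dithering signature.

if __name__ == "__main__":
    frame_delta("capture_frame1.png", "capture_frame2.png")
```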
If you think you have a bad motherboard that is doing something to every card you plug in, then stop plugging cards into it. Get a new motherboard.
BloodyHell619 What model motherboard do you have? Try flashing the original release BIOS version to it, and if that doesn't fix it, flash a BIOS version with a release date as close as possible to the release date of the graphics card or of its VBIOS. If your motherboard model doesn't have many BIOS versions to choose from, save a copy of your GPU's original VBIOS, use the GOP update tool to change the GOP version inside the VBIOS file, and flash it to see if that fixes the issue. Use "nvflash64.exe --version romnamehere.rom" to check which GOP version your VBIOS has, and pick one that is the same size as the original to add into the VBIOS file.
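If you end up checking a lot of dumps, here is a tiny Python wrapper around that command; it assumes nvflash64.exe is on your PATH and just greps the output for UEFI/GOP mentions, since the exact wording varies between nvflash builds:

```
# vbios_gop_version.py - run nvflash64 --version on a saved VBIOS dump and pull
# out any line that mentions UEFI or GOP. Sketch only: assumes nvflash64.exe is
# in PATH; the output format is not parsed, only searched.
import subprocess
import sys

def gop_lines(rom_path: str) -> list[str]:
    out = subprocess.run(
        ["nvflash64.exe", "--version", rom_path],
        capture_output=True, text=True, check=False,
    )
    text = out.stdout + out.stderr
    return [ln.strip() for ln in text.splitlines()
            if "uefi" in ln.lower() or "gop" in ln.lower()]

if __name__ == "__main__":
    for line in gop_lines(sys.argv[1]):
        print(line)
```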
@JTL I do want to pick up a lossless capture card, but the only model that seems to be lossless is the DVI2PCIe. I wanted to check whether there are other lossless PCIe capture cards available, but if not I will pick up a DVI2PCIe to get some actual captures of a bad setup and a good setup.
Jack One thing the DVI2PCIe card can do that others cannot (at least not easily) is emulate arbitrary EDID data from another display. Whether different EDID data has any bearing on what GPUs do is an open question, but I certainly consider leaving that avenue open for investigation a good thing.
The only thing I dislike about it is that it requires either a Windows driver or a distribution-specific Linux driver, so it isn't ideal. VideoDiff currently works best under Linux, but optimizations could be made for Windows in the future.
JTL Only with the caveat that there are other systems to fall back on should something go wrong, as is currently happening with BloodyHell619, who alleges that whatever the board touches turns to coal, and they only have the one system. If Jack has multiple systems, then sure, why not.
JTL I think the problem is not the picture itself or its structure; the problem is how the frames are delivered to the monitor. In the configurations where there is eye discomfort, I start to perceive flickering of the screen, but the funny thing is that this flickering is not confirmed by the pencil test or by a camera. As far as I can tell, those tests only rule out flickering of the monitor backlight, not flickering in the rendering.
CepheiHR8938 I haven't seen anyone with issues with newer NVIDIA cards attempt any lossless capture of the display output, so if anyone is able to do it, it will be an interesting test.
Any updates?
smilem The author of the topic just needs to read reviews of his monitor. They say the Dell UP2720Q is actually 8-bit + FRC: dithering (FRC) is forced in order to output 10-bit, even though it is sold as a "True 10-bit" panel. There are no such problems on the cheaper BenQ EW2780U.
Based on the declared number of reproduced shades, one could conclude that both monitors are capable of operating in 10-bit mode. But is that really the case? Not quite.
The Dell UP2720Q achieves it through dithering (FRC), and given the device's expanded color gamut this is a necessary measure.
The BenQ EW2780U is much simpler: its 10-bit mode exists only because of support for one of the HDR standards, but in reality its panel, with a standard color gamut, needs no more than 8 bits to provide high-quality color reproduction. That is exactly how the device's native mode is configured, and that is the mode you should stick with. In other words, there is no point in using 10-bit mode on the EW2780U.