Jack How are things going? Did you manage to get anything done?
RTX A4000 Causes Eyestrain With True 10-bit Monitor
I finished my upgrade, and with the i9-10900K/EVGA Z490 Dark KP and Quadro RTX 4000 I'm using Windows 10 2004 build 19041.572 completely comfortably, whereas on my older hardware it felt grainy/strainy. Not only can I use it, it's more comfortable than my Windows 7/Quadro K4200 setup, but this might be because I'm using a 4K monitor and Windows 10 handles font scaling better, I'm not completely sure. I haven't tested the RTX A4000 yet since I'm a bit nervous it could sabotage the setup, so I ordered another NVMe SSD to clone the existing install onto before swapping in the A4000. I think you are 100% onto something about poor device compatibility though; I think my X79 motherboard was just too old.
I also noticed a huge increase in audio quality after switching to the new Z490 motherboard, through my Schiit Modi Multibit USB DAC, which has an iFi Defender isolating the 5V line away from the motherboard's USB power and feeding it from an Allo Shanti linear power supply. Because the 5V line is isolated, the audio quality shouldn't have improved/changed, but it did. On my X79 board, if I ever plugged a microphone into the 3.5mm jack on the back, it would emit a ton of electrical noise and people on Discord would always complain about it. So maybe some motherboards have poor separation of traces/components and suffer from noise, and maybe newer graphics cards are more sensitive to this as well, just like audio equipment and TVs/monitors.
Old Setup: EVGA X79 FTW | Xeon E5-2690 | Quadro K4200 & Quadro RTX 4000 | Dell UP2720Q true 10-bit monitor. (Unusable with the new RTX A4000 and, previously, Windows 10 2004.)
New Setup: EVGA Z490 Dark KP | i9-10900K | Quadro RTX 4000 | Dell UP2720Q true 10-bit monitor. (Now usable with Windows 10 2004. Will test with the RTX A4000 soon.)
Jack On my X79 board, if I ever plugged a microphone into the 3.5mm jack on the back, it would emit a ton of electrical noise and people on Discord would always complain about it. So maybe some motherboards have poor separation of traces/components and suffer from noise
Potential interactions between GPUs and motherboards are a complex subject, but sound-quality issues caused by electrical interference with onboard audio are definitely a known thing.
The symptoms you describe are exactly what I experience in front of most modern monitors and laptop screens: this weird blurriness, and the image not looking completely flat. It's like the image is dirty. And it's not the anti-glare layer, since it's the same on glossy panels. I don't understand how they make images look worse than 10 years ago, but somehow they do it while selling GPUs for thousands of dollars. Something is really off.
Jack Hello Jack, how is your new setup with the RTX 4000? It's been a few days since you got it. Did it help get rid of your problems? I'm planning to get an RTX 4000 myself, based on your posts; I think it is the only option left. I tried a MacBook Pro 16 2021, and it was horrible. Now I'm writing from a Legion 5 Pro, same thing, it's barely usable.
Just a long-term update: I can use the Quadro RTX 4000 8GB without eyestrain, so it seems some Turing-based cards can be good or bad, but the RTX A4000 16GB (roughly an RTX 3070 equivalent) is still absolutely unusable in Windows 10 2004, Windows 7, and Linux. I'm pretty much convinced at this point that the GPUs themselves are the problem; even after upgrading to an EVGA Z490 Dark and i9-10900K, the RTX A4000 is still unusable. I have tested this with multiple monitors over the duration.
Dell UP2720Q, Dell UP3218K, Dell UP3221Q, Dell UP3017, Dell U2720Q, Dell UP2716D, Dell U3223QE, Dell U2723QE, Dell U3219Q. None of these monitors made a difference when used with a problematic card; there was always strain present compared to a known-good safe reference card like the Quadro K4200.
List of graphics cards I tested:
RTX 4000 (good), RTX A4000 (bad), 5700 XT (bad), RX 480 (bad), GTX 680 Classified (good), Quadro K4200 (good), Quadro 4000 (good), FirePro V5900 (good)
For my other systems I am just resorting to buying the most powerful Kepler-based GPUs. I'm planning to test the Radeon Pro W5500 and W5700 soon as a final test, and I also plan to try some cards on Windows Server 2016 and 2019 just to see if anything is different. Hopefully a way to edit the GPU BIOS to disable this hardware-level dithering becomes available someday.
Jack Just a long-term update: I can use the Quadro RTX 4000 8GB without eyestrain, so it seems some Turing-based cards can be good or bad, but the RTX A4000 16GB (roughly an RTX 3070 equivalent) is still absolutely unusable in Windows 10 2004, Windows 7, and Linux. I'm pretty much convinced at this point that the GPUs themselves are the problem; even after upgrading to an EVGA Z490 Dark and i9-10900K, the RTX A4000 is still unusable. I have tested this with multiple monitors over the duration.
Nice hearing from you again
My research is inconclusive and not yet "done" (in part due to a lack of needed hardware), but I have reason to believe that a) per-unit variance may exist in otherwise identical GPUs, and b) certain firmware differences in both the motherboard and the GPU VBIOS may cause subtle interactions that aren't easy to isolate.
And that's before any alleged application/OS rendering issues.
Nobody said it was going to be easy.
JTL I agree, I think there are different VBIOS revisions on certain cards, and that using certain older motherboards with newer GPUs can cause the different interactions. I'll keep my bad RTX A4000 around and try flashing it with VBIOS images posted on the internet as an experiment; they all use Samsung VRAM and are reference models, so it should be pretty easy to do.
karut Thanks, I'm located in the USA right now though, so I can pick them up pretty cheap on eBay.
I know this isn't a real solution, but it does work nicely. If you have a motherboard with two PCIe slots, you can use an old Kepler-based GPU to connect your monitor and use the more powerful GPU that causes eyestrain to handle high-performance applications or games. I have three extra Quadro K4200s, which are all safe for my eyes, so I just tossed the problematic RTX A4000 into my secondary system and used the EnableMsHybrid registry trick to be able to select it as the high-performance GPU; it runs applications wonderfully and is also amazing for games. An added bonus is that you get the clear, crisp image of the safe Kepler GPUs and can run certain games at 4K without that blurry dithered look. The image quality is superb this way. I'm using Windows 10 2004 19041.789; with safe GPUs I have no problem with this version of Windows 10. A lot of people from this thread use old Tesla GPUs with this trick, with either a weak primary GPU or an integrated GPU.
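For anyone who wants to try it, the EnableMsHybrid trick boils down to two registry DWORDs under the display-adapter class key. The `0000`/`0001` subkey indices below are placeholders; which index belongs to which card varies per system, so check the driver details in Device Manager first. A sketch, run from an elevated prompt:

```shell
:: EnableMsHybrid registry trick (sketch, run as Administrator).
:: 1 = power-saving GPU (the safe Kepler card driving the monitor)
:: 2 = high-performance GPU (the problematic card)
:: The 0000/0001 indices are examples; match them to your GPUs first.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableMsHybrid /t REG_DWORD /d 1 /f
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001" /v EnableMsHybrid /t REG_DWORD /d 2 /f
```

After a reboot, the per-application GPU preference shows up under Settings → Display → Graphics settings, the same as on a real hybrid laptop.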
https://forum.level1techs.com/t/gaming-on-my-tesla-more-likely-than-you-think/
https://forum.level1techs.com/t/gaming-on-my-tesla-more-likely-than-you-think/171185/70
I was reading about EDID information and came across a post saying that a graphics card will dither if it is connected to a monitor that reports a lower bit depth than the card's maximum output. These new cards can apparently output 12-bit color, so even with a real 10-bit monitor the dithering is still active. I exported my monitor's EDID and have been trying to find whether the bit depth is defined in it; I don't think it's there, and if it's not defined, that might explain why the graphics card is dithering. My theory is that even if the EDID does define 10-bit, changing it to 12-bit might trick the GPU into turning off its dithering. Older Nvidia cards don't support 12-bit color, so this leads me to believe that modding the monitor's EDID could trick new GPUs into disabling the dithering.
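For reference when digging through an exported EDID: in EDID 1.4, byte 20 is the Video Input Parameters bitmap, and for digital inputs (bit 7 set) bits 6-4 encode the per-component bit depth, where `000` means "undefined", which may be why no depth shows up in some exports. A minimal sketch (the toy 128-byte blob is made up for illustration):

```python
# Where per-component bit depth lives in an EDID 1.4 blob.
# Byte 20 = Video Input Parameters; for digital inputs (bit 7 set),
# bits 6-4 encode bit depth; 0b000 means "undefined".

DEPTHS = {0b000: None, 0b001: 6, 0b010: 8, 0b011: 10,
          0b100: 12, 0b101: 14, 0b110: 16, 0b111: None}

def bit_depth(edid: bytes):
    """Return bits per primary color, or None if undefined/analog."""
    assert edid[:8] == b'\x00\xff\xff\xff\xff\xff\xff\x00', "bad EDID header"
    vip = edid[20]
    if not vip & 0x80:            # analog input: no digital bit-depth field
        return None
    return DEPTHS[(vip >> 4) & 0b111]

# Toy blob with byte 20 = 0xB5 (digital input, 10-bit, DisplayPort):
edid = bytearray(128)
edid[:8] = b'\x00\xff\xff\xff\xff\xff\xff\x00'
edid[20] = 0xB5
print(bit_depth(bytes(edid)))     # 10
```

If `bit_depth` comes back `None` on a real export, the monitor genuinely leaves the depth undefined in the base block, matching the theory above.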
So I was able to spoof my monitor's EDID to make it appear as a 12-bit monitor; this didn't help at all. I use the ColorControl program to make sure Nvidia dithering is disabled at the registry level, and there is still massive strain with the RTX A4000. https://github.com/Maassoft/ColorControl Maybe when I get nvdithctrl running it might work, but at this point I'm 100% convinced this is a VBIOS problem.
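The spoof itself is a two-byte edit, assuming an EDID 1.4 base block: rewrite the depth bits in byte 20 and repair the block checksum (all 128 bytes must sum to 0 mod 256, or the driver may reject the override). A sketch on a made-up blob:

```python
# Patch an EDID 1.4 base block to advertise 12-bit color and fix its checksum.

def spoof_12bit(edid: bytes) -> bytes:
    block = bytearray(edid[:128])
    block[20] = (block[20] & 0b10001111) | (0b100 << 4)  # depth bits -> 12-bit
    block[127] = (-sum(block[:127])) % 256               # recompute checksum
    return bytes(block)

# Toy 10-bit digital EDID (byte 20 = 0xB5) with a valid checksum:
edid = bytearray(128)
edid[:8] = b'\x00\xff\xff\xff\xff\xff\xff\x00'
edid[20] = 0xB5
edid[127] = (-sum(edid[:127])) % 256

patched = spoof_12bit(bytes(edid))
print(hex(patched[20]))       # 0xc5 -> bits 6-4 = 100 = 12-bit
print(sum(patched) % 256)     # 0, checksum valid again
```

The patched blob can then be loaded as an EDID override; the result above suggests the dithering decision isn't driven by the advertised depth alone.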
I was just reading about how different generations of GPUs do rendering, and it is stated that there is a "tiled rendering" mode and an "immediate rendering" mode.
Early in the development of desktop GPUs, several companies developed tiled architectures. Over time, these were largely supplanted by immediate-mode GPUs with fast custom external memory systems.
Major examples of this are:
PowerVR rendering architecture (1996): the rasterizer consisted of a 32×32 tile into which polygons were rasterized across multiple pixels in parallel. On early PC versions, tiling was performed in the display driver running on the CPU. On the Dreamcast console, tiling was performed by a piece of hardware. This facilitated deferred rendering: only the visible pixels were texture-mapped, saving shading calculations and texture bandwidth.
Microsoft Talisman (1996)
Dreamcast (powered by PowerVR chipset) (1998)
Gigapixel GP-1 (1999)[6]
Intel Larrabee GPU (2009) (canceled)
PS Vita (powered by PowerVR chipset) (2011)[7]
Nvidia GPUs based on the Maxwell architecture and later architectures (2014)[8] <<<
AMD GPUs based on the Vega (GCN5) architecture and later architectures (2017)[9][10] <<<
Intel Gen11 GPU and later architectures (2019)[11][12][13]
Examples of non-tiled architectures that use large on-chip buffers are:
Xbox 360 (2005): the GPU contains an embedded 10 MB eDRAM; this is not sufficient to hold the raster for an entire 1280×720 image with 4× multisample anti-aliasing, so a tiling solution is superimposed when running in HD resolutions and 4× MSAA is enabled.[14]
Xbox One (2013): the GPU contains an embedded 32 MB eSRAM, which can be used to hold all or part of an image. It is not a tiled architecture, but is flexible enough that software developers can emulate tiled rendering.[15]
https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)
https://en.wikipedia.org/wiki/Tiled_rendering
https://www.techpowerup.com/231129/on-nvidias-tile-based-rendering
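The distinction the articles above draw is really about traversal order: immediate mode walks the framebuffer in scanline order per primitive, while a tiler bins work into small tiles (32×32 in the PowerVR example) and finishes each tile completely before moving on. A toy sketch of the two orders, not how any real GPU is implemented:

```python
# Toy illustration of immediate vs. tiled traversal order over a framebuffer.
# A tiler completes all pixels inside one tile before moving to the next,
# which is what lets it keep one tile's worth of raster state on-chip.

W, H, TILE = 8, 8, 4   # tiny framebuffer and tile size for readability

def immediate_order():
    """Scanline order: left to right, top to bottom across the whole screen."""
    return [(x, y) for y in range(H) for x in range(W)]

def tiled_order():
    """Tile by tile; within each tile, scanline order."""
    order = []
    for ty in range(0, H, TILE):
        for tx in range(0, W, TILE):
            for y in range(ty, ty + TILE):
                for x in range(tx, tx + TILE):
                    order.append((x, y))
    return order

# Both cover exactly the same pixels; only the visiting order differs.
assert set(immediate_order()) == set(tiled_order())
print(immediate_order()[:4])   # [(0, 0), (1, 0), (2, 0), (3, 0)]
print(tiled_order()[4:8])      # second row of the first 4x4 tile
```

Since the final pixel coverage is identical either way, any visible difference between tiled and immediate GPUs would have to come from something else in the pipeline (precision, dithering, compression), not the tiling itself.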
Jack
Thank you for not giving up trying to get to the truth.
Regarding my post CepheiHR8938
I must admit that I jumped to conclusions about the system with the 3070 in one of the locations; I didn't have time to do a proper test. I have now gone back there and found that the discomfort is still present. Next I want to do more tests with different drivers and BIOSes.
The case at the other location, replacing the 730 with a 1060 on the Ryzen 2600 platform, is still valid, and there the BIOS downgrade really did help.
I also want to share an interesting phenomenon after buying a new 3070. Before installing the 3070, the system had a 1060. I did not scrub the system of the old drivers with DDU or anything similar; I just plugged in the 3070, and the system worked fine with the drivers that had been installed for the 1060. No eye pain, and a great picture. Then Windows decided to update the drivers on its own without my involvement, and everything got messed up. I was very angry and decided to reinstall the driver version I had before, but with the 3070 active I was not able to get back to the previous state where everything was fine.
This led me to believe that the problem lies more on the software side, in the mode the driver runs in depending on the video card. But what exactly remains to be found out.
I have seen it suggested that it sometimes helps to do all updates first and install the GPU driver last.
Worth a try on a fresh partition or external drive.
I just got a Gigabyte M28U monitor, and it's true 8-bit. Maybe you want to try it. It works for me, at least it's the second day I'm using it. Can't add anything more right now.
So something interesting happened: I decided to update the BIOS on my EVGA Z490 Dark motherboard to test whether a newer BIOS would give me any more overclocking headroom. Once I upgraded from version 1.07 (released 9/28/2020) to version 1.10 (released 11/11/2021), I started to experience eyestrain with my Quadro RTX 4000; even the BIOS screen had the strange smeared/glossy/blurry look the other bad cards I've tested had. I flashed back to version 1.07 and everything returned to normal and is comfortable. I recommend anyone battling a bad card flash and test the entire range of BIOSes available for their motherboard.
Maybe this is caused by the GOP (Graphics Output Protocol) module in the motherboard BIOS interacting with the GOP modules on the GPU in different ways, with some combinations working better with certain models of GPU than others.
https://winraid.level1techs.com/t/gop-update-and-extraction-tool-nvidia-only/91381/
https://winraid.level1techs.com/t/amd-and-nvidia-gop-update-no-requests-diy/30917/923?page=36
https://ledstrain.org/d/261-what-works-for-you-what-do-you-use-now-without-problems/187
My current Quadro RTX 4000 VBIOS has UEFI GOP version 5000B; I also found an older VBIOS for my card with UEFI version 5000A, and a newer one with version 5000D. I bet if I wanted to use the newer Z490 Dark BIOS, I could update my UEFI GOP and it would be normal.
And here is a screenshot of a Quadro K4200, a Kepler-based GPU, which is using UEFI version 1002F.
In cases where testing various motherboard BIOS versions can't make a bad GPU comfortable, it would certainly be worth trying different GOP versions in the GPU's VBIOS until a comfortable one is found.