So contacting Asus was pretty pointless.

I replied to NVIDIA and sent them the dump. Waiting for their response. In the meantime though @HAL9000 I looked up both of your monitors, and they are both 6bit+FRC, right?

BENQ GW2480

LG WL2950S

Have you or anyone else had the opportunity to try these cards with a true 10bit display? Since these cards support true 10bit, there should technically be no dithering at 10bit by the card itself.

I also tried a few more of those dithering-disabling apps, and I discovered a few interesting things. This tool has a test image, which is pretty handy:

Calibration-tools

1- Changing the dithering setting does have an effect on the screen. You can clearly see the test image change.

2- Recording the screen with a screen recorder, though, shows absolutely no changes, so I guess these apps are only changing the monitor's dithering settings and are doing literally nothing to what is coming from the card.

3- The dithering setting in most of these apps has enabled/disabled/default options, and it was interesting that my monitor does some dithering by itself by default, because setting it to default was exactly like setting it to enabled.
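For anyone curious, here is a rough sketch of what that "recording shows no change" observation implies, in code. This is purely illustrative (synthetic frames, not real captures): temporal dithering on a static image should show up as consecutive frames that keep differing slightly, and a recorder that grabs the framebuffer before the GPU's output stage would never see it.

```python
# Illustrative sketch: detecting temporal dithering by diffing consecutive
# captured frames of a *static* image. Synthetic data stands in for a real
# screen capture here.

def frame_diff_ratio(frame_a, frame_b):
    """Fraction of pixels that differ between two equally sized frames."""
    changed = sum(1 for a, b in zip(frame_a, frame_b) if a != b)
    return changed / len(frame_a)

def looks_temporally_dithered(frames, threshold=0.01):
    """True if consecutive frames of supposedly static content keep changing."""
    return all(frame_diff_ratio(f1, f2) > threshold
               for f1, f2 in zip(frames, frames[1:]))

# A static mid-gray screen vs. one whose pixels toggle their LSB each refresh,
# which is how temporal dithering fakes an in-between shade.
static   = [[128] * 1000 for _ in range(6)]
dithered = [[128 + (i % 2)] * 1000 for i in range(6)]

print(looks_temporally_dithered(static))    # False
print(looks_temporally_dithered(dithered))  # True
```

If a screen recorder's captures of a static image come back identical while the panel visibly flickers, the recorder is most likely sampling before the stage that dithers.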

    8 days later

    How in hell are they making new GPUs that potentially make the display worse than old GPUs? It sounds completely crazy; can't they just keep things as they are when they are good?

    HAL9000 While I haven't explored what these memory dumps possibly contain, my guess is that the most one could deduce from such a memory dump is the internal settings of the GPU driver. If this is the case and the card is always "messing up" internally, I wouldn't be surprised if such a memory dump fails to yield useful results.

    BloodyHell619

    The BenQ should be 8bit; I may have gotten the model wrong, it's the GL2480 I have:

    https://www.displayspecifications.com/en/model/8b9f1a3f

    The LG is 6bit+FRC, yet it's OK on my old machines; it's very strange.

    BloodyHell619

    I'm not saying that was me, but I raised a ticket with them for a 1660S and a 3060 Ti. I want to RMA the 3060 and hope the replacement is not the same, as I think it is also faulty: it outputs a blurred image.

    I get that the end-contact people are just doing their jobs and will realistically be told by their engineering department what to say (if they are even looking at the logs). But there is a difference, and I can see it. I've even linked a few websites where people state the same (however, I have not linked this website, as, with no offence intended, I believe that to an outside observer we all appear to be 'crazy' people, or we just 'need to get our eyes tested / get glasses', which is quite depressing really, considering the issues we have and the things everyone tries just to be able to use these devices), which is why I hope my card is actually faulty.

      18 days later

      HAL9000 The BenQ should be 8bit; I may have gotten the model wrong, it's the GL2480 I have:

      https://www.displayspecifications.com/en/model/8b9f1a3f

      The LG is 6bit+FRC, yet it's OK on my old machines; it's very strange.

      Oh, damn. So a true 8bit won't help either; that only leaves a true 10bit as an option. I am interested, though, to see how things would be with an 8bit+FRC monitor. When you set it to 10bit (8bit+FRC), will the card and the monitor both be doing dithering? Maybe the monitor's dithering would take over, and it might be a better algorithm and cause less strain.

      HAL9000 I'm not saying that was me, but I raised a ticket with them for a 1660S and a 3060 Ti.

      Oh, yeah. I would say that was definitely your case.

      HAL9000 I get that the end-contact people are just doing their jobs and will realistically be told by their engineering department what to say (if they are even looking at the logs). But there is a difference, and I can see it. I've even linked a few websites where people state the same (however, I have not linked this website, as, with no offence intended, I believe that to an outside observer we all appear to be 'crazy' people, or we just 'need to get our eyes tested / get glasses', which is quite depressing really, considering the issues we have and the things everyone tries just to be able to use these devices), which is why I hope my card is actually faulty.

      That is so damn true. Everywhere I have posted, and in every post I saw on this subject, the first comments were always about us being crazy or imagining things. What was even more interesting is that I actually see quite a few posts where people are constantly requesting NVIDIA to enable dithering because of bad color banding 🙂 The damn thing IS enabled, lol, and people still have issues with banding.

      HAL9000 I want to RMA the 3060 and hope the replacement is not the same, as I think it is also faulty: it outputs a blurred image.

      I think it's the dithering that's causing the blurriness.

      Check this, Google's first definition of dithering.

      I actually suddenly thought of a weird theory regarding the blurriness. When I had the card in, whenever I played a very low-res video, one that would normally look very pixelated, the eyestrain was maddening. It was like my eyes were getting sucked out of their sockets. And the video seemed blurry, but rather smooth and not as pixelated as it should be. Could it be that the card is treating pixelation as banding?? And then doing dithering on it? Maybe DLSS might even have something to do with it, like it's doing a combination of DLSS and dithering to make low-res videos look better.
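      To make concrete what dithering is doing (a toy model only, not NVIDIA's actual algorithm): when a target shade falls between two levels the panel can show, temporal dithering alternates those two levels across refreshes so the time-average your eye perceives lands on the in-between value. That same averaging is a plausible reason hard pixel edges could end up looking "smoothed":

```python
# Toy model of temporal dithering (illustrative only, not NVIDIA's algorithm):
# alternate the two nearest displayable levels so their time-average
# approximates a shade the panel can't show directly.

def temporal_dither(target, frames=60):
    """Return a per-refresh sequence of displayable levels for `target`."""
    lo = int(target)                     # nearest displayable level below
    hi = lo + 1                          # nearest displayable level above
    on = round((target - lo) * frames)   # refreshes spent on the higher level
    return [hi] * on + [lo] * (frames - on)

# A shade one quarter of the way between 128 and 129:
seq = temporal_dither(128.25, frames=60)
perceived = sum(seq) / len(seq)
print(perceived)  # 128.25 -- the eye averages the flicker into the shade
```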

      If you still have the card in, I would be really interested in a test of this. My eyes are too fried right now to do it myself; maybe I'll attempt it some other time. Try opening a rather low-resolution video that seems pixelated and compare how it looks on the 3060 vs the 1660.

      HAL9000 I want to RMA the 3060 and hope the replacement is not the same, as I think it is also faulty: it outputs a blurred image.

      Let us know if the replaced card makes any difference.

        BloodyHell619 @HAL9000 I don't see why messing around with various monitors would do any good if, hypothetically, the output stage of the GPU is so severely compromised that messing with color profiles or drivers is just patching over the problem, making it "less terrible" as opposed to what should be an "untainted" output.

        Speaking of which, does anyone know the cheapest Nvidia card that has a "bad" output? Especially in these times with the supply chain crisis. I currently don't own any, and if I ever get my test rig up and running again I might want to have one for empirical comparisons and research.

          JTL I don't see why messing around with various monitors would do any good if, hypothetically, the output stage of the GPU is so severely compromised that messing with color profiles or drivers is just patching over the problem, making it "less terrible" as opposed to what should be an "untainted" output.

          Yeah, it really sucks. But what other choice is left? NVIDIA impudently lies and says the card is not doing any dithering! People are calling us crazy. Reading around, I am seeing that AMD and Intel cards are also dithering and are just as bad. And there is literally no other option on the market; we'd literally have to give up gaming. I even think the PS5 and Xbox do it -.-

          This is an interesting and rather sad post. It makes it clear that none of the dithering-disabling tweaks and apps will disable it. This is probably why NVIDIA refuses to admit the card is doing dithering: it could be that it's not even fixable with a driver.

          https://hub.displaycal.net/forums/topic/eliminating-temporal-dithering-for-sensitive-folks/

          JTL Speaking of which, does anyone know the cheapest Nvidia card that has a "bad" output? Especially in these times with the supply chain crisis. I currently don't own any, and if I ever get my test rig up and running again I might want to have one for empirical comparisons and research.

          If I am correct, the 2060 should be the cheapest troublemaker.

            BloodyHell619 Yeah, it really sucks. But what other choice is left? NVIDIA impudently lies and says the card is not doing any dithering! People are calling us crazy. Reading around, I am seeing that AMD and Intel cards are also dithering and are just as bad. And there is literally no other option on the market; we'd literally have to give up gaming. I even think the PS5 and Xbox do it -.-

            My work computer has a Radeon Pro W5500, and at least my unit has no dithering by default either in BIOS or under Linux with the amdgpu driver.

            I certainly do think more R&D is needed here.

              BloodyHell619 This is an interesting and rather sad post. It makes it clear that none of the dithering-disabling tweaks and apps will disable it. This is probably why NVIDIA refuses to admit the card is doing dithering: it could be that it's not even fixable with a driver.

              I have seen that post before, and I don't think that simplification is entirely accurate; it's more complex than a simple "yes/no" question.

              ryans Also, you can see temporal dithering on an e-ink display here -- notice the dancing dots.

              Yes, my eInk display (a Dasung, not an eReader) makes temporal dithering very visible as dancing dots, and eInk gives me symptoms at least as fast, if not faster, than other monitors when the dithering is obvious. Dithering is particularly bad on the Mac generally, especially in Apple programs like Preview, or if Night Shift is on (extra whole-screen dithering), or if I highlight something (extra dithering in the highlight). Desktop versions of Microsoft and Adobe programs (old versions at this point) aren't nearly as bad as Apple Preview. I tried to post some videos here; it's hard to capture a good video, but Mac dithering on eInk is very visible in person, and it looks like writhing worms:

              https://ledstrain.org/d/1203-win10-finally-caught-up-to-me-and-i-might-be-losing-my-career-because-of-it/32

              I haven’t used eInk much with my current “best” Windows 7 setup, other than to figure out that I prefer my CCFL monitor to the eInk.

              a month later

              JTL Not sure if you mentioned this anywhere, but did you test Windows with your Radeon GPU as well?

                devilgrove No.

                a) Don't want to mess with my work computer

                b) Considering other alleged rendering issues with recent Windows 10, an "unusable" result could potentially be a red herring.

                BloodyHell619 Same as you, I get eye strain with GPU cards like the RTX 3050, 3060 Ti, 3070, and 2060. Then I changed my processor from a Ryzen 3500X to an i5-12400 and switched the display HDMI from the RTX 3070 to the motherboard HDMI, so I am using the Intel iGPU. And voila, my eye strain is gone, even when I play a game. In the game options I can still choose to use the RTX 3070, but without eye strain. Hope this can help you.

                  13 days later

                  I spent $799 on a new NVIDIA 3080 because there is a setting in the Linux NVIDIA control panel that is supposed to disable temporal dithering. It turns out that setting does nothing; I still got really bad eye strain.

                  Here is a video showing the temporal dithering with the NVIDIA control panel's temporal dithering setting set to "disabled". This was filmed with my iPhone 5s:

                  https://www.youtube.com/watch?v=u1uSl7vE7EU

                  14 days later

                  ludwig

                  Thanks so much, this is pretty life-saving, and it's also huge proof that NVIDIA is blatantly lying about the cards not doing any dithering. I wish we could somehow make everyone realize this and force NVIDIA to do something about it.
                  Any idea if the same would be possible if I bought a Ryzen CPU with integrated graphics? I don't wanna change both my MB and CPU for this. If that is true, I might actually go and buy a new CPU; totally worth it. I am constantly reminded how big a mistake it is to buy a CPU without integrated graphics, like my Ryzen 3600XT. It is such a handy thing to have when you get into trouble.

                  There is one more problem with this, though: since the MB only has one HDMI output, you can't use a multi-monitor setup, right?

                  "Any cheap monitor suggestions that would neither have PWM nor drive me blind with Temporal Dithering?"

                  Luckily for your wallet, temporal dithering is a software (OS/device driver) thing, not a hardware thing. I feel your pain on the equipment cost.

                    "huge proof that NVIDIA is blatantly lying about the cards not doing any dithering"
                    They are probably talking about rendering a 3D scene. All cards dither in 2D rendering.
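                    For context, here is a minimal sketch of the *spatial* kind of dithering, using a 2x2 ordered (Bayer) pattern. This is a textbook illustration, not any vendor's actual implementation; real drivers use larger matrices, and temporal variants reshuffle the pattern every frame, which is the flicker discussed in this thread.

```python
# Minimal ordered (Bayer) dithering sketch -- the spatial cousin of temporal
# dithering. Banding is hidden by quantizing each pixel against a
# position-dependent threshold instead of a fixed one.

BAYER_2X2 = [[0, 2],
             [3, 1]]

def ordered_dither(value, x, y, step=4):
    """Quantize `value` to multiples of `step` using a Bayer threshold."""
    threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4  # in (0, 1)
    return int(value / step + threshold) * step

# A flat shade of 130 quantized to steps of 4 (displayable levels 128, 132):
tile = [ordered_dither(130, x, y) for y in range(2) for x in range(2)]
print(tile)           # [128, 132, 132, 128] -- a checkerboard of the two levels
print(sum(tile) / 4)  # 130.0 -- the spatial average recovers the shade
```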

                    ludwig So I am using the Intel iGPU. And voila, my eye strain is gone, even when I play a game. In the game options I can still choose to use the RTX 3070, but without eye strain.

                    Makes sense that the GPU cards do dithering in 2D rendering that your eyes don't like.

                    dev