ryans Have you considered trying MX Linux, just to see what Linux does for your strain?

From my (limited) understanding, if the card output is so severely compromised that even the BIOS screen has visual issues, it raises much harder questions than can be answered by merely trying other drivers or operating systems.

Also check out my thread on binocular vision. It is possible the "flicker" caused by temporal dithering is triggering a pre-existing binocular vision dysfunction.

In my opinion, if the problem is from the card itself (which it probably is), it makes sense to first isolate that and mitigate the issue on a technical level (such as by obtaining a GPU known not to have dithering issues), and only then worry about compensating for any remaining visual issues.

    ryans

    Oh, I see. I thought it was something that happened all the time. I still doubt I have BVD. Both my eyes see perfectly on their own, and I think I got a sort of BVD test every time I went to have them checked.

    JTL

    Totally agree with both your statements.

    You cannot believe the amount of time and effort I put into changing the settings of my Samsung TU7000 TV back when I had it and was suffering from its PWM. Every time, I felt I had found a good color and brightness setting that would take the pain away. I even managed to tune it to the point where I could no longer capture the black lines running down the screen on camera. But still, every time, after a few hours of use the pain was back.

    I just sent tickets to both ASUS and NVIDIA asking for a way to disable dithering. I hope I get something from them. Have any of you done so? If so, what response did you get?

      BloodyHell619

      NVIDIA tells me my 3060 is not dithering and that its output is exactly the same as my 1660's, based on the log dumps I sent them. Yet the output is noticeably blurry, with all the other associated symptoms, even with my newest color management settings.

        JTL

        They requested kernel memory dumps. The link they sent me no longer works, but it should be these:

        https://nvidia.custhelp.com/app/answers/detail/a_id/5149/kw/logs

        https://nvidia.custhelp.com/app/answers/detail/a_id/4755/~/manually-forcing-a-system-crash-using-a-keyboard

        which may actually be

        https://docs.microsoft.com/en-us/windows-hardware/drivers/debugger/kernel-memory-dump

        but the link they sent me with their instructions no longer exists. These memory dumps were approximately 900 MB for each card and had to be shared with NVIDIA.
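
        In case anyone else gets asked for the same thing: here is roughly what those two instructions boil down to, as a minimal Python sketch of the Microsoft-documented registry settings (assumptions: Windows, a USB keyboard, an elevated prompt, and a reboot afterwards; the usual caveats about editing the registry apply; this is not NVIDIA's exact procedure, which is no longer online):

            # Enable a kernel memory dump and allow forcing a crash from the keyboard.
            import winreg

            # CrashDumpEnabled = 2 selects "Kernel memory dump"
            # (0 = none, 1 = complete, 2 = kernel, 3 = small, 7 = automatic).
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                                r"SYSTEM\CurrentControlSet\Control\CrashControl",
                                0, winreg.KEY_SET_VALUE) as key:
                winreg.SetValueEx(key, "CrashDumpEnabled", 0, winreg.REG_DWORD, 2)

            # USB keyboards use kbdhid; PS/2 keyboards use i8042prt\Parameters instead.
            # After rebooting, hold the right Ctrl key and press Scroll Lock twice to
            # trigger the crash and write the dump to %SystemRoot%\MEMORY.DMP.
            with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE,
                                  r"SYSTEM\CurrentControlSet\Services\kbdhid\Parameters") as key:
                winreg.SetValueEx(key, "CrashOnCtrlScroll", 0, winreg.REG_DWORD, 1)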

          HAL9000

          I think I am about to receive the exact same crap they gave you too.

          This should be exactly what they told you, right?

          I could swear they are lying about something.

          My eyes hurt so bad, I don't even wanna put the card back in again to take the dump lol. I think I am going to end up damaging my PCI slots, riser, and power cable pins from putting the card in and out so many times lol

          So contacting ASUS was pretty pointless.

          I replied to NVIDIA and sent them the dump. Waiting for their response. In the meantime, though, @HAL9000, I looked up both your monitors, and they are both 6-bit+FRC, right?

          BENQ GW2480

          LG WL2950S

          Have you or anyone else had the opportunity to try these cards with a true 10-bit display? Since these cards support true 10-bit output, technically there should be no dithering at 10-bit by the card itself.

          I also tried a few more of those dithering-disabling apps, and I discovered a few interesting things. This tool has a test image, which is pretty handy:

          Calibration-tools

          1- Changing the dithering setting does have an effect on the screen. You can clearly see the test image change.

          2- Recording the screen with a screen recorder, though, shows absolutely no changes, so I guess these apps are only changing the dithering settings of the monitor and are doing nothing to what is coming from the card (but see the note after this list).

          3- The dithering setting in most of these apps has Enabled/Disabled/Default, and it was interesting that my monitor is apparently doing some dithering by itself by default, because setting it to Default looked exactly like setting it to Enabled.
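
          A note on point 2: as far as I understand it, screen recorders capture the framebuffer before the scanout stage where the GPU applies its dithering, so a recording showing no change doesn't prove the card's output is unchanged. For what it's worth, on Linux the proprietary NVIDIA driver exposes its own per-display dithering state through nvidia-settings, which queries the driver itself rather than the monitor. A minimal sketch ("DP-0" is a placeholder; list your displays with nvidia-settings -q dpys):

              # Query and set the NVIDIA driver's own dithering state on Linux.
              import subprocess

              DPY = "DP-0"  # assumption: replace with your actual connector name

              def query(attr: str) -> str:
                  """Read one nvidia-settings attribute for the display, tersely."""
                  out = subprocess.run(
                      ["nvidia-settings", "-t", "-q", f"[DPY:{DPY}]/{attr}"],
                      capture_output=True, text=True, check=True)
                  return out.stdout.strip()

              # CurrentDithering reports what the driver is actually doing right
              # now, regardless of what any tweaking app claims.
              for attr in ("Dithering", "CurrentDithering",
                           "DitheringMode", "DitheringDepth"):
                  print(attr, "=", query(attr))

              # Dithering: 0 = Auto, 1 = Enabled, 2 = Disabled.
              subprocess.run(["nvidia-settings", "-a", f"[DPY:{DPY}]/Dithering=2"],
                             check=True)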

            8 days later

            How in hell are they making new GPUs that potentially make the display worse than old GPUs did? It sounds completely crazy; can't they just keep things as they are when they are good?

            HAL9000 While I haven't explored what these memory dumps might contain, my guess is that the most one could deduce from such a dump is the internal settings of the GPU driver. If that is the case and the card is always "messing up" internally, I wouldn't be surprised if such a memory dump fails to show anything useful.

            BloodyHell619

            The BenQ should be 8-bit. I may have gotten the model wrong; it's the GL2480 I have:

            https://www.displayspecifications.com/en/model/8b9f1a3f

            The LG is 6-bit+FRC, yet it's OK on my old machines, which is very strange.

            BloodyHell619

            I'm not saying that was me, but I raised a ticket with them for a 1660S and a 3060 Ti. I want to RMA the 3060 and hope that the replacement is not the same, as I think this one is also faulty: it outputs a blurred image.

            I get that the front-line support people are just doing their jobs and will realistically be told by their engineering department what to say (if they are even looking at the logs). But there is a difference; I can see it, and I've even linked a few websites where people report the same thing. (I have not linked this website, because, with no offence intended, I believe that to an outside observer we all appear to be 'crazy' people who just 'need to get our eyes tested / get glasses', which is quite depressing really, considering the issues we have and the things everyone tries just to be able to use these devices.) This is why I hope my card is actually faulty.

              18 days later

              HAL9000 The BenQ should be 8-bit. I may have gotten the model wrong; it's the GL2480 I have:

              https://www.displayspecifications.com/en/model/8b9f1a3f

              The LG is 6-bit+FRC, yet it's OK on my old machines, which is very strange.

              Oh, damn. So a true 8-bit won't help either; that only leaves a true 10-bit as an option. I am interested, though, to see how things would be with an 8-bit+FRC monitor. When you run it at 10-bit (8-bit+FRC), will the card and the monitor both be doing dithering? Maybe the monitor's dithering would take over, and it might be a better algorithm and cause less strain.

              HAL9000 I'm not saying that was me, but I raised a ticket with them for a 1660S and a 3060 Ti.

              Oh, yeah. I would say that was definitely your case.

              HAL9000 I get that the front-line support people are just doing their jobs and will realistically be told by their engineering department what to say (if they are even looking at the logs). But there is a difference; I can see it, and I've even linked a few websites where people report the same thing. (I have not linked this website, because, with no offence intended, I believe that to an outside observer we all appear to be 'crazy' people who just 'need to get our eyes tested / get glasses', which is quite depressing really, considering the issues we have and the things everyone tries just to be able to use these devices.) This is why I hope my card is actually faulty.

              That is so damn true. Everywhere I have posted, and in every post I have seen on this subject, the first comments were always about us being crazy or imagining stuff. What was even more interesting is that I see quite a few posts where people are constantly requesting that NVIDIA enable dithering because of bad color banding 🙂 The damn thing IS enabled lol, and people still have issues with banding.

              HAL9000 I want to RMA the 3060 and hope that the replacement is not the same, as I think this one is also faulty: it outputs a blurred image.

              I think it's the dithering that's causing the blurriness.

              Check this: Google's first definition of dithering.

              I actually suddenly thought of a weird theory regarding the blurriness. When I had the card in, whenever I played a very low-res video, one that would normally look very pixelated, the eyestrain was maddening. It was like my eyes were getting sucked out of their sockets. And the video seemed blurry, yet rather smooth and not as pixelated as it should be. Could it be that the card is treating pixelation as banding and then dithering it? Maybe DLSS even has something to do with it, like it's doing a combination of DLSS and dithering to make low-res videos look better.
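
              To make the "blur" idea concrete, this is all temporal dithering (FRC) does: a pixel that can only show whole levels flickers between the two nearest levels on successive frames, so that the time average lands on an in-between target. A toy sketch with illustrative numbers only (this is the general technique, not NVIDIA's actual algorithm):

                  # Approximate a fractional level by alternating whole levels
                  # across frames (the essence of FRC / temporal dithering).
                  target = 100.25   # desired level, between two representable steps
                  frames = 8
                  error = 0.0
                  shown = []
                  for _ in range(frames):
                      error += target - int(target)      # carry the fractional part
                      if error >= 1.0:
                          shown.append(int(target) + 1)  # overshoot this frame
                          error -= 1.0
                      else:
                          shown.append(int(target))      # undershoot the rest
                  print(shown)                # [100, 100, 100, 101, 100, 100, 100, 101]
                  print(sum(shown) / frames)  # 100.25, the target, on average over time

              That per-pixel flicker is exactly the "dancing dots" some of us can see, and across a noisy low-res image it could plausibly read as smoothing or blur.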

              If you still have the card in, I would be really interested in a test of this. My eyes are too fried right now to do it myself; maybe I'll attempt it some other time. Try opening a rather low-resolution video that seems pixelated and compare how it looks on the 3060 vs the 1660.

              HAL9000 I want to RMA the 3060 and hope that the replacement is not the same, as I think this one is also faulty: it outputs a blurred image.

              Let us know if the replaced card makes any difference.

              • JTL replied to this.

                BloodyHell619 @HAL9000 I don't see why messing around with various monitors would do any good if, hypothetically, the output stage of the GPU is so severely compromised. Messing with color profiles or drivers is just patching over the problem, making it "less terrible", as opposed to what should be an "untainted" output.

                Speaking of which, does anyone know the cheapest NVIDIA card that has a "bad" output, especially in these times of supply chain crisis? I currently don't own any, and if I ever get my test rig up and running again I might want one for empirical comparisons and research.

                  JTL I don't see why messing around with various monitors would do any good if, hypothetically, the output stage of the GPU is so severely compromised. Messing with color profiles or drivers is just patching over the problem, making it "less terrible", as opposed to what should be an "untainted" output.

                  Yeah, it really sucks. But what other choice is left? NVIDIA impudently lies and says the card is not doing any dithering! People are calling us crazy. Reading around, I am seeing that AMD and Intel cards are also dithering and are just as bad. And there literally is no other option on the market. We'd literally have to give up gaming. I even think the PS5 and Xbox do it -.-

                  This is an interesting and rather sad post. It makes it clear that none of the dithering-disabling tweaks and apps will actually disable it. This is probably why NVIDIA refuses to admit the card is doing dithering; it could be that it's not even fixable with a driver.

                  https://hub.displaycal.net/forums/topic/eliminating-temporal-dithering-for-sensitive-folks/

                  JTL Speaking of which, does anyone know the cheapest NVIDIA card that has a "bad" output, especially in these times of supply chain crisis? I currently don't own any, and if I ever get my test rig up and running again I might want one for empirical comparisons and research.

                  If I am correct, the 2060 should be the cheapest troublemaker.

                  • JTL replied to this.

                    BloodyHell619 Yeah, it really sucks. But what other choice is left? NVIDIA impudently lies and says the card is not doing any dithering! People are calling us crazy. Reading around, I am seeing that AMD and Intel cards are also dithering and are just as bad. And there literally is no other option on the market. We'd literally have to give up gaming. I even think the PS5 and Xbox do it -.-

                    My work computer has a Radeon Pro W5500, and at least my unit shows no dithering by default, either in the BIOS or under Linux with the amdgpu driver.
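
                    For anyone who wants to check their own AMD box: under Linux, connector properties are visible through xrandr, and some radeon/amdgpu driver builds expose a "dither" output property there. A quick exploratory sketch (the property's name and availability vary by driver version and hardware, so treat its absence as inconclusive):

                        # Look for a dithering-related output property on any connector.
                        import re
                        import subprocess

                        props = subprocess.run(["xrandr", "--props"],
                                               capture_output=True, text=True,
                                               check=True).stdout
                        for line in props.splitlines():
                            if re.search(r"dither", line, re.IGNORECASE):
                                print(line.strip())

                        # If a "dither" property exists on, say, DP-1, it can be forced off:
                        # subprocess.run(["xrandr", "--output", "DP-1",
                        #                 "--set", "dither", "off"], check=True)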

                    I certainly do think more R&D is needed here.

                      BloodyHell619 This is an interesting and rather sad post. It makes it clear that none of the dithering-disabling tweaks and apps will actually disable it. This is probably why NVIDIA refuses to admit the card is doing dithering; it could be that it's not even fixable with a driver.

                      I have seen that post before, and I don't think that simplification is entirely accurate; it's more complex than a simple "yes/no" question.

                      ryans Also, you can see temporal dithering on an e-ink display here -- notice the dancing dots.

                      Yes, my eInk display (a Dasung, not an eReader) makes temporal dithering very visible as dancing dots, and eInk gives me symptoms at least as fast as, if not faster than, other monitors when the dithering is obvious. Dithering is particularly bad on the Mac generally, especially in Apple programs like Preview, or if Night Shift is on (extra whole-screen dithering), or if I highlight something (extra dithering in the highlight). Desktop versions of Microsoft and Adobe programs (old versions at this point) aren't nearly as bad as Apple Preview. I tried to post some videos here; it's hard to capture a good video, but Mac dithering on eInk is very visible in person, and it looks like writhing worms:

                      https://ledstrain.org/d/1203-win10-finally-caught-up-to-me-and-i-might-be-losing-my-career-because-of-it/32

                      I haven’t used eInk much with my current “best” Windows 7 setup, other than to figure out that I prefer my CCFL monitor to the eInk.

                      a month later

                      JTL Not sure if you mentioned this anywhere, but did you test Windows with your Radeon GPU as well?

                      • JTL replied to this.

                        devilgrove No.

                        a) Don't want to mess with my work computer

                        b) Considering other alleged rendering issues with recent Windows 10, an "unusable" result could potentially be a red herring.

                        BloodyHell619 Same as you, I get eye strain with GPU cards like the RTX 3050, 3060 Ti, 3070, and 2060. Then I changed my processor from a Ryzen 3500X to an i5-12400 and switched the display HDMI cable from the RTX 3070 to the motherboard's HDMI port, so I am using the Intel iGPU. And voilà, my eyestrain is gone, even when I play a game. In the in-game options I can still choose to use the RTX 3070, but without eye strain. Hope this can help you.
