BloodyHell619

Have you tried setting Windows ICC profiles with the 3070? (I assume you're on Windows.)

I've found that with a 2060, and yesterday with a 3060ti, setting Windows to sRGB and applying the 'default' profiles helps.

I'm now playing with the sRGB clamp using the link above as well.

I used my 3060ti yesterday for approx 1.5 hours on a VA screen that doesn't work for me even with my 'good' hardware - yet with the ICC / sRGB settings in Windows and the Nvidia tool, I feel less 'disabled' today than I have in the past. (Previously, when I started this testing, I would have a very bad migraine headache for 1-3 days afterwards.) This is the first time in a month I've used the 3060ti, as it gives me bad symptoms. The 2060 is also not as bad with these settings.

Neither is immediately usable the way the 1660s is, though.
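For readers following along: an sRGB clamp of the kind linked above (tools like novideo_srgb) loads a matrix into the GPU that maps sRGB content into the monitor's native gamut, so a wide-gamut panel stops oversaturating colors. A minimal sketch of the underlying colorimetry, assuming hypothetical P3-like monitor primaries for illustration (a real tool reads them from the monitor's EDID):

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a linear RGB -> XYZ matrix from chromaticity coordinates."""
    cols = []
    for x, y in primaries:
        # Each primary (x, y) becomes an XYZ column, scaled below.
        cols.append([x / y, 1.0, (1 - x - y) / y])
    M = np.array(cols).T
    xw, yw = white
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    # Scale each column so the three primaries sum to the white point.
    S = np.linalg.solve(M, white_xyz)
    return M * S

# sRGB primaries and the D65 white point (standard values).
SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
# Hypothetical wide-gamut (P3-like) monitor primaries - an assumption here.
MONITOR = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
D65 = (0.3127, 0.3290)

# Clamp matrix: linear sRGB -> XYZ -> linear monitor RGB.
clamp = np.linalg.inv(rgb_to_xyz_matrix(MONITOR, D65)) @ rgb_to_xyz_matrix(SRGB, D65)

# Pure sRGB red needs *less* monitor red plus a little green on a wide-gamut panel.
print(clamp @ np.array([1.0, 0.0, 0.0]))
```

The clamp leaves white untouched (both spaces share the D65 white point) and scales back the primaries. Note it changes colors, not the dithering stage itself, which may be why it only partially helps.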

    HAL9000

    Yes, I am pretty sure that my settings are set to sRGB and default, but as for the ICC profiles, where can I get these ICC profiles from? They are monitor specific, right? If so, do you think there would even be a profile for a monitor as old as mine?

I hope this is not true, but I think your relief is because you haven't been behind it for long. For me, it took about 4 days behind the PC before the symptoms started heavily appearing.

I'm having almost the same feelings with my old card right now as I had with the 3070ti. I am just hoping I did not mess any settings up or fuck up my eyes, and that they are just strained or the monitor has become faulty.

    I hope you still feel relief after a few days of use, that would be some real hope. Oh, and what monitor are you using with the 3060ti?

BloodyHell619 but isn't it reasonably possible that throughout a day, at any given moment, one of our eyes might be stressed more than the other, and the brain would do eye suppression to prevent it from further fatigue or even damage?

I don't think someone with good binocular vision ever does eye suppression - only if you have problems. What these flickering displays/lights do is disrupt the suppression you previously used.

        BloodyHell619

With the 2060 and 3060 cards I get symptoms within minutes of use, the same for various Intel-graphics laptops, and the symptoms get worse even after I stop usage and persist for days. Mobile phones also affect me, and I can't use Apple laptops (Intel).

I also stopped testing the 3060, as it got to be too much to essentially suffer every day, and started using a 2060, which I found mostly the same symptom-wise. But then I changed the color management; this has helped, though it hasn't solved the issue.

(Not with the 1660s though - this was always fine to use on the same driver version, and I never had to set any color management settings.)

I have a Benq GW2480 and an LG 29 ultrawide WL2950S. Both are fine to use with my 1660s; neither is immediately fine to use with the 2060/3060.

As for the color profiles, I got them for my monitors from the manufacturer's site. If your screen is very old this may not be possible, but I've also tried the default ICC sRGB profile and the default WCS profile.

My theory is that applications are overriding what Windows is doing, as on the 1660 there were some applications (games) that I could not use because they gave me the same symptoms, while others were fine. But at the same time, the graphics hardware is also doing something. I can see the colors change when clamping sRGB with the Nvidia tool, and when not using it I've also seen the colors seem to change when launching apps (games). An example: since setting the color profiles, the 2060 is much more usable for me at the 'desktop', whereas before it wasn't. I'm thinking that forcing the settings in Windows is overriding / reducing the card's effect on colors etc.

This may be using a 'feature' on the RTX cards that the GTX card doesn't support / use, as I only had issues on my 1660 with certain modern games - Hell Let Loose on the 1660 causing me the same symptoms that 'anything' on the 3060 does, for example.

I can't prove this though 🙁

        ryans Have you considered trying MXLinux, just to see what Linux does for your strain?

From my (limited) understanding, if the card output is so severely compromised that even the BIOS screen has visual issues, it raises much more difficult questions than merely trying other drivers or operating systems.

Also check out my thread on binocular vision. It is possible the "flicker" caused by temporal dithering is triggering a pre-existing binocular vision dysfunction.

In my opinion, if the problem is from the card itself (which it probably is), it makes sense to first isolate that and mitigate the issue on a technical level (such as by obtaining a GPU known not to have dithering issues), and then worry about possible compensation for visual issues afterwards.

          ryans

Oh, I see. I thought it was something that happened all the time. I still doubt I have BVD. Both my eyes see perfectly on their own, and I think I did get a sort of BVD test every time I went to have them checked.

          JTL

          Totally agree with both your statements.

You cannot believe the amount of time and effort I put into changing settings on my TU7000 Samsung TV back when I had it and was suffering from its PWM. Every time, I felt I had found a good color/brightness setting that would take the pain away. I even managed to tune it to a point where I could no longer capture on camera the black lines that were running down the screen. But still, every time, after a few hours of use the pain was back.

I just sent tickets to both ASUS and NVIDIA asking for a way to disable dithering. I hope I get something from them. Have any of you done so? If so, what response did you get?

            BloodyHell619

Nvidia tells me my 3060 is not dithering and has exactly the same output as my 1660, based on the log dumps I sent them. Yet the output is noticeably blurry, with all the other associated symptoms, even with my newest color management settings.

              JTL

              They requested kernel memory dumps, the link they sent me no longer works but it should be these:

              https://nvidia.custhelp.com/app/answers/detail/a_id/5149/kw/logs

https://nvidia.custhelp.com/app/answers/detail/a_id/4755/~/manually-forcing-a-system-crash-using-a-keyboard

              which may actually be

              https://docs.microsoft.com/en-us/windows-hardware/drivers/debugger/kernel-memory-dump

but the link they sent me with their instructions no longer exists. These memory dumps were approx 900 MB for each card and had to be shared with Nvidia.

                HAL9000

I think I am about to receive the exact same crap they gave you too.

                This should be exactly what they told you, right?

                I could swear they are lying about something.

My eyes hurt so bad, I don't even wanna put the card back in again to take the dump lol. I think I am going to end up damaging my PCI slots, riser, and power cable pins putting the card in and out so many times lol

                So contacting Asus was pretty pointless.

I replied to NVIDIA and sent them the dump. Waiting for their response. In the meantime, though, @HAL9000 I looked up both your monitors, and they are both 6bit+FRC, right?

                BENQ GW2480

                LG WL2950S

Have you or anyone else had the opportunity to try these cards with a true 10bit display? Since these cards support true 10bit, technically there should be no dithering at 10bit by the card itself.
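For reference, FRC ("frame rate control") fakes extra bit depth by flickering each pixel between adjacent levels the panel can natively show, so the time-average lands on the target value. A toy simulation, assuming a simple first-order error-feedback scheme (actual panels use proprietary, more structured patterns):

```python
def frc_frames(target, native_bits=6, frames=60):
    """Temporally dither a [0, 1] target level onto a lower-bit-depth panel."""
    levels = (1 << native_bits) - 1    # 63 steps on a 6-bit panel
    out, err = [], 0.0
    for _ in range(frames):
        want = target * levels + err   # carry the rounding error forward
        q = round(want)                # nearest level the panel can show
        err = want - q
        out.append(q / levels)
    return out

# A level between two 6-bit steps: the panel flickers between 20/63 and 21/63,
# but the time-average converges on the target. This flicker is the temporal
# component some people appear to be sensitive to.
seq = frc_frames(0.325)
print(min(seq), max(seq), sum(seq) / len(seq))
```

So a "true 10bit" chain only avoids dithering if every stage - source, card output, and panel - really runs at 10 bits; any FRC stage reintroduces temporal flicker.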

I also tried a few more of those dithering-disabling apps, and I discovered a few interesting things. This tool has a test image, which is pretty handy:

                Calibration-tools

                1- Changing the dithering setting does have an effect on the screen. You can clearly see the test image change.

                2- Recording the screen with a screen recorder though shows absolutely no changes, so I guess these apps are only changing the dithering settings of the monitor and are literally doing nothing to what is coming from the card.

3- The dithering setting in most of these apps has enabled/disabled/default options, and it was interesting that some dithering is happening by default, because setting it to default was exactly like setting it to enabled.
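Point 2 is testable in principle: on static content, two lossless captures should be bit-identical unless something temporal is happening. A synthetic sketch of that comparison (assumption: the capture taps the signal after the dithering stage - most software recorders read the framebuffer before the output stage, which would explain seeing no changes):

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(img, bits, temporal_noise=False):
    """Quantize a float image to `bits`; optionally add temporal dither first."""
    levels = (1 << bits) - 1
    if temporal_noise:
        # Fresh noise each call stands in for a per-frame dither pattern.
        img = img + rng.uniform(-0.5, 0.5, img.shape) / levels
    return np.clip(np.round(img * levels), 0, levels)

static = rng.uniform(size=(64, 64))  # a static "screen" of gray levels

# Without temporal dithering, two successive frames are bit-identical.
a, b = quantize(static, 8), quantize(static, 8)
print("undithered frames differ:", np.any(a != b))

# With temporal dithering, successive frames of the SAME content differ.
c, d = quantize(static, 8, temporal_noise=True), quantize(static, 8, temporal_noise=True)
print("dithered frames differ:  ", np.any(c != d))
```

To apply this test to the real output stage, the frames would have to come from a capture card on the monitor cable rather than from a software screen recorder.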

                  8 days later

How in hell are they making new GPUs that potentially make the display worse than old GPUs did? It sounds completely crazy; can't they just keep things as they are when they are good?

HAL9000 While I haven't explored what these memory dumps might contain, my guess is that the most one could deduce from such a memory dump is the internal settings of the GPU driver. If this is the case, and the card is always "messing up" internally, I wouldn't be surprised if such a memory dump fails to gather useful results.

                  BloodyHell619

The Benq should be 8bit - I may have gotten the model wrong; it's the GL2480 I have:

                  https://www.displayspecifications.com/en/model/8b9f1a3f

The LG is 6bit+FRC, yet it's OK on my old machines; it's very strange.

                  BloodyHell619

I'm not saying that was me, but I raised a ticket with them for a 1660s and a 3060ti. I want to RMA the 3060, and I hope the replacement is not the same, as I think it is also faulty since it outputs a blurred image.

I get that the end-contact people are just doing their jobs and realistically will be told by their engineering department what to say (if they are even looking at the logs), but there is a difference - I can see it. I've even linked a few websites where people state the same. (However, I have not linked this website as, with no offence intended, I believe to the outside observer we all appear to be 'crazy' people - or people who just 'need to get our eyes tested / get glasses' - which is quite depressing really, considering the issues we have and the things everyone tries in order to be able to use these devices.) This is why I hope my card is actually faulty.

                    18 days later

HAL9000 The Benq should be 8bit - I may have gotten the model wrong; it's the GL2480 I have:

                    https://www.displayspecifications.com/en/model/8b9f1a3f

The LG is 6bit+FRC, yet it's OK on my old machines; it's very strange.

Oh, damn. So a true 8bit won't help either; that only leaves a true 10bit as an option. I am interested, though, to see how things would be with an 8bit+FRC monitor. When you put it on 10bit (8bit+FRC), will the card and the monitor both be doing dithering? Maybe the monitor's dithering might take over, and it might be a better algorithm and cause less strain.

                    HAL9000 I'm not saying that was me, but I raised a ticket with them for a 1660s and a 3060ti.

                    Oh, yeah. I would say that was definitely your case.

                    HAL9000 I get that the end contact people are just doing their jobs and realistically will be told by their engineering department what to say (if they are even looking at the logs) but there is a difference, I can see it, I've even linked a few websites where people state the same (however I have not linked this website, as with no offence intended I believe to the outside observer we all appear to be 'crazy' people - or we just 'need to get our eyes tested / get glasses' - which is quite depressing really considering the issues we have and the things everyone tries to be able to use these devices)- which is why I hope my card is actually faulty.

That is so damn true. Everywhere I have posted, and in every post I saw on this subject, the first comments were always about us being crazy or imagining stuff. What was even more interesting is that I actually see quite a few posts where people are constantly requesting NVIDIA to enable dithering because of bad color banding 🙂 The damn thing IS enabled lol, and people still have issues with banding.
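Both camps are reacting to the same trade-off: dithering hides quantization steps (banding) by turning them into noise. A small illustration using plain white-noise dither (GPUs use more structured spatial or temporal patterns):

```python
import numpy as np

rng = np.random.default_rng(42)

def quantize(ramp, bits, dither=False):
    """Quantize a [0, 1] ramp to `bits`, optionally adding dither noise first."""
    levels = (1 << bits) - 1
    noise = rng.uniform(-0.5, 0.5, ramp.shape) if dither else 0.0
    return np.round(ramp * levels + noise).clip(0, levels) / levels

def longest_flat_run(x):
    """Length of the longest constant stretch - a proxy for visible banding."""
    best = run = 1
    for i in range(1, len(x)):
        run = run + 1 if x[i] == x[i - 1] else 1
        best = max(best, run)
    return best

ramp = np.linspace(0.0, 1.0, 4096)         # a smooth gradient, like a sky

banded = quantize(ramp, 6)                 # 6-bit: wide flat bands
dithered = quantize(ramp, 6, dither=True)  # bands broken up into noise

print("longest flat run, no dither:", longest_flat_run(banded))
print("longest flat run, dithered: ", longest_flat_run(dithered))
```

The undithered ramp shows long flat bands; the dithered one breaks them up at the cost of pixel-level noise - which, if re-rolled every frame, is exactly the temporal flicker discussed in this thread.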

HAL9000 I want to RMA the 3060 and hope that the replacement is not the same as I think it is also faulty as it outputs a blurred image.

                    I think it's the dithering that's causing the blurriness.

Check this - Google's first definition of dithering.

I actually suddenly thought of a weird theory regarding the blurriness. When I had the card in, whenever I played a very low-res video - one that would normally look very pixelated - the eyestrain was maddening. It was like my eyes were getting sucked out of their sockets. And the video seemed blurry, but rather smooth, and not as pixelated as it should be. Could it be that the card is treating pixelation as banding, and then dithering it? Maybe DLSS might even have something to do with it - like it's doing a combination of DLSS and dithering to make low-res videos look better.

If you still have the card in, I would be really interested in a test of this. My eyes are too fried right now to do it myself; maybe I'll attempt it some other time. Try opening a rather low-resolution video that seems pixelated and compare how it looks on the 3060 vs the 1660.

HAL9000 I want to RMA the 3060 and hope that the replacement is not the same as I think it is also faulty as it outputs a blurred image.

                    Let us know if the replaced card makes any difference.

                    • JTL replied to this.

BloodyHell619 @HAL9000 I don't see why messing around with various monitors would do any good if, hypothetically, the output stage of the GPU is so severely compromised; messing with color profiles or drivers is just patching over the problem, making it "less terrible" as opposed to what should be an "untainted" output.

Speaking of which, does anyone know the cheapest Nvidia card that has a "bad" output? Especially in these times, with the supply chain crisis. I currently don't own any, and if I ever get my test rig up and running again I might want one for empirical comparisons and research.

JTL Don't see why messing around with various monitors would do any good if, hypothetically, the output stage of the GPU is so severely compromised; messing with color profiles or drivers is just patching over the problem, making it "less terrible" as opposed to what should be an "untainted" output.

Yeah, it really sucks. But what other choice is left? NVIDIA impudently lies and says the card is not doing any dithering! People are calling us crazy. Reading around, I am seeing that AMD and Intel cards are also dithering and just as bad. And there is literally no other option on the market. We'd literally have to give up gaming. I even think the PS5 and Xbox do it -.-

This is an interesting and rather sad post. It makes it clear that none of the dithering-disabling tweaks and apps will disable it. This is probably why NVIDIA refuses to admit the card is doing dithering; it could be that it's not even fixable with a driver.

                        https://hub.displaycal.net/forums/topic/eliminating-temporal-dithering-for-sensitive-folks/

JTL Speaking of which, does anyone know the cheapest Nvidia card that has a "bad" output? Especially in these times, with the supply chain crisis. I currently don't own any, and if I ever get my test rig up and running again I might want one for empirical comparisons and research.

                        If I am correct, the 2060 should be the cheapest troublemaker.

                        • JTL replied to this.

BloodyHell619 Yeah, it really sucks. But what other choice is left? NVIDIA impudently lies and says the card is not doing any dithering! People are calling us crazy. Reading around, I am seeing that AMD and Intel cards are also dithering and just as bad. And there is literally no other option on the market. We'd literally have to give up gaming. I even think the PS5 and Xbox do it -.-

My work computer has a Radeon Pro W5500, and at least my unit has no dithering by default, either in the BIOS or under Linux with the amdgpu driver.

                          I certainly do think more R&D is needed here.

BloodyHell619 This is an interesting and rather sad post. It makes it clear that none of the dithering-disabling tweaks and apps will disable it. This is probably why NVIDIA refuses to admit the card is doing dithering; it could be that it's not even fixable with a driver.

I have seen that post before, and I don't think that simplification is entirely accurate - it's more complex than a simple "yes/no" question.
