Batou Try writing a stern email to Nvidia support. Although I doubt it's going to do much.

Nouveau has limited support for newer Nvidia cards.

I have an EVGA GTX 1070 FE that I only tried under Linux. Might put it back in one of these days and try the chroma subsampling tests along with my 560.

    Batou No eyestrain under Win10. FWIW, I also tried a friend's Zotac 1070 with the same results.

    Really? I couldn't cope with my Gigabyte 1070 or 1080 cards.

    I don't want to entertain the thought that some might be ok and not others.

      JTL I made a thread on Nvidia Linux forum. Got nowhere. Any idea what email would get their attention?

      AgentX20 All these card manufacturers stick pretty closely to Nvidia's reference design. They "innovate" mostly in cooling and looks. I doubt this has anything to do with the board manufacturers; they all use the same silicon from Nvidia, and it's hard to mess up the rest.

      • JTL replied to this.

        Batou I'm beginning to wonder if the people who have "bad" GTX 970's vs "good" 970's simply have a different VBIOS from Nvidia that changed something...

        I've seen the VBIOS versions of "good" cards, now to find someone with a "bad" 970 and compare.
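        On Linux there's no GPU-Z, but a VBIOS image can usually be dumped from sysfs for comparison. A sketch: the PCI address below is an example (find yours with `lspci`), it needs root, and reading the ROM of the GPU currently driving the display can fail on some setups.

        ```shell
        # Locate the card's PCI address (e.g. 01:00.0)
        lspci | grep -i vga

        # Dump the video BIOS via sysfs (run as root; address is an example)
        cd /sys/bus/pci/devices/0000:01:00.0
        echo 1 > rom              # unlock the ROM file for reading
        cat rom > /tmp/vbios.rom  # copy the image out
        echo 0 > rom              # lock it again

        # The version string is embedded in the image
        strings /tmp/vbios.rom | grep -i version
        ```

        Two dumps from a "good" and a "bad" card can then be diffed byte-for-byte (`cmp -l`) or just compared by version string.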

        7 days later

        Batou Forgot to mention: @Slacor is mainly a Linux user, so that screenshot was probably Linux. I'll get him to confirm.

        I might test my 560 and 1070 on both Windows and Linux with different drivers and compare.

        Batou That looks perfect to me. Is this under Linux or Win10? If under Linux, which driver version are you using and what's the display? Thanks!

        Under Linux, that's correct. I think the version is 384.90 with 750 Ti.
        By display do you mean monitor?

        ViewSonic VA2455sm

        Graphics:  Card: NVIDIA GM107 [GeForce GTX 750 Ti]
                   Display Server: X.Org 1.18.4 drivers: nvidia (unloaded: fbdev,vesa,nouveau)
                   Resolution: 1920x1080@60.00hz
                   GLX Renderer: GeForce GTX 750 Ti/PCIe/SSE2 GLX Version: 4.5.0 NVIDIA 384.90
        • JTL replied to this.

          Slacor I also know using a different GPU output (HDMI, DVI, VGA) can be different in terms of the output colorspace.

          Nvidia over HDMI has historically had problems where the colors looked "washed out" (limited-range RGB output). Under Windows this was easy to fix; no idea about Linux though.
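          Under Linux with the proprietary driver, recent `nvidia-settings` versions expose a color-range control for exactly this. A sketch, assuming the attribute exists on your driver version; the output name and the 0 = Full mapping are assumptions to verify against the query output first:

          ```shell
          # List connected displays and any color-range attributes the driver exposes
          nvidia-settings -q dpys
          nvidia-settings -q all 2>/dev/null | grep -iE 'colorrange|colorspace'

          # If present, force full-range RGB on a given output (name is an example)
          nvidia-settings --assign "[DPY:HDMI-0]/ColorRange=0"
          ```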

          a month later

          Decided to try my desktop again.

          EVGA GTX 1070 Founders Edition, Debian 8.8, MATE desktop environment.

          Seems OK at first glance after using it for an hour; even played some CS:GO under Windows 7.

          Connected by DVI to BenQ XL2720Z (27" 6+2 bit FRC TN panel, which should in theory be "OH GOD AWFUL")

          I wonder how much of this is my vision "healing" from the cannabinoid treatments; then again, I managed to briefly use the card with Ubuntu in July 2016 when I got it. The reason I didn't report on my tests before was that I was getting headaches from ALL large monitors (while using my 15" MBP was fine). It turned out my vision was improving and I needed to stop wearing my glasses, as they were too strong.

          Surprisingly, the image looks less "noisy" than an ASUS R9 270X with the "amdgpu" driver, and indeed others have complained about that:

          https://www.reddit.com/r/AMD_Linux/comments/6qoq1o/amdgpu_driver_how_to_disable_temporal_dither/

          I tried modifying the kernel source myself to disable the dithering, but either a) I didn't edit the right function (I haven't tried kernel debugging with GDB yet), or b) it's done in hardware or the VBIOS.
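          For anyone retrying this, narrowing down the candidate functions with grep before patching is the easy first step. A sketch against a kernel source tree; the directory layout varies a lot between kernel versions (the newer DC display code only landed around 4.15):

          ```shell
          # List AMD GPU source files that mention dithering
          grep -rli dither drivers/gpu/drm/amd/ drivers/gpu/drm/radeon/

          # Show the matches with line numbers to pick a function to patch
          grep -rni dither drivers/gpu/drm/amd/display/ | head -n 20
          ```

          If a patched function turns out never to be reached at modeset time, the dithering may indeed be set up by the VBIOS before the driver loads, which matches option b) above.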

          Here's the YUV chroma sampling test with the above GTX 1070 configuration and monitor.

          https://imgur.com/a/Eo273

          3 months later

          JTL
          When I watch this on E Ink, I can see the areas that will cause the camera to go into the pattern before it happens; confirmation of both methods. I would get rid of that background, the perimeter is dithering hell.

          • JTL replied to this.

            ShivaWind Oh, that's the default desktop background of Debian.

            Good point though

            a month later

            just Trying out Lubuntu + a 1060 GPU. Is anyone here swearing by Linux these days? Which distro/drivers? Still able to disable dithering?

              reaganry I'd avoid the later 9xx and 10xx cards. 😐

              I swear, at least the 1070 dithers by default even in BIOS before any OS loads!
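              For what it's worth, on the proprietary Linux driver the dithering state is at least queryable, and on many driver versions it can be forced off per display. A sketch: the attribute names exist on recent drivers, but the enum values (commonly reported as 0 = auto, 1 = enabled, 2 = disabled) are an assumption to check against the query output:

              ```shell
              # Show each display's dithering attributes (names/values vary by driver)
              nvidia-settings -q dpys
              nvidia-settings -q all 2>/dev/null | grep -i dither

              # If exposed, force dithering off on one output (DPY name is an example)
              nvidia-settings --assign "[DPY:DP-0]/Dithering=2"
              ```

              This only controls what the driver programs; if the dithering really is enabled by the VBIOS before any OS loads, the driver setting may not help.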

                JTL Even some 9x0's are set up like this. I have two Dell XPS systems with theoretically identical GTX9x0 (960?) cards and one dithers HORRIFICALLY and one doesn't. That could be the Intel chips, but WOW.

                • JTL replied to this.

                  Gurm If you're using the output port on the Nvidia card, it's 99.5% the Nvidia card's fault. If the Intel graphics is disabled in the BIOS, it's 100% the Nvidia card's fault.

                  Wanna take some GPU-Z VBIOS dumps for me and send them my way? 😉 (Find someone else to do it if you don't want to hurt your eyes. I understand)

                  • Gurm replied to this.

                    JTL Well it's an XPS so I think everything runs through the Intel chip anyway in this configuration?

                    • JTL replied to this.

                      Gurm Are you implying it's a laptop?

                      Gurm Uh oh.

                      I know the dithering is 100% the Nvidia GPU with a bad 9xx or 10xx card on a desktop, but I wonder how the situation is with laptops, since, as you know, the final output goes through the Intel GPU regardless of whether the Nvidia card is being used.

                      My brother has an HP ZBook 15 G3 with Intel graphics and an Nvidia Quadro M1000M (from the 9xx generation). What's interesting is that by default it uses Optimus, but there's a BIOS option to use only the Nvidia card. I tried that option, and in Device Manager the Intel card appeared completely "disconnected" from the running system, although since the machine had the Optimus driver installed, Windows just loaded the standard VGA driver and refused to use the Nvidia one.
