Slacor I also know that different GPU outputs (HDMI, DVI, VGA) can differ in terms of output colorspace.

Nvidia over HDMI has had problems in the past where the colors would look "washed out", because the driver defaults to limited-range (16-235) RGB output over HDMI. Under Windows this was easy to fix; no idea about Linux, though.
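Under Linux the usual fix is forcing full-range RGB on the HDMI output; the exact knob depends on the driver. A rough sketch (the display names "HDMI-0" / "HDMI-1" are placeholders for whatever your system actually reports):

```shell
# Proprietary Nvidia driver: force full-range RGB (0 = Full, 1 = Limited).
# List your actual display names first with: nvidia-settings -q dpys
nvidia-settings --assign "[dpy:HDMI-0]/ColorRange=0"

# Rough equivalent on the Intel driver, via the "Broadcast RGB" output property
xrandr --output HDMI-1 --set "Broadcast RGB" "Full"
```

Whether these settings persist across reboots varies; people often put them in an autostart script.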

a month later

Decided to try my desktop again.

EVGA GTX 1070 Founders Edition, Debian 8.8, MATE desktop environment.

Seems OK at first glance after using it for an hour; even played some CS:GO under Windows 7.

Connected by DVI to a BenQ XL2720Z (27" 6-bit + 2-bit FRC TN panel, which in theory should be "OH GOD AWFUL")

I wonder how much of this is my vision "healing" from the cannabinoid treatments; then again, I managed to briefly use the card with Ubuntu in July 2016 when I got it. The reason I didn't report on my tests before is that I was getting headaches from ALL large monitors (while using my 15" MBP was fine). It turns out my vision was improving and I needed to stop wearing my glasses, as they were too strong.

Surprisingly, the image looks less "noisy" than an ASUS R9 270X using the "amdgpu" driver, and indeed others have complained about that driver's dithering:

https://www.reddit.com/r/AMD_Linux/comments/6qoq1o/amdgpu_driver_how_to_disable_temporal_dither/

I tried modifying the kernel source myself to disable the dithering, but either a) I didn't edit the right function (I haven't tried kernel debugging with GDB yet) or b) it's done in hardware or the VBIOS.

Here's the YUV chroma subsampling test with the above GTX 1070 configuration and monitor.

https://imgur.com/a/Eo273

3 months later

JTL
When I watch this on E Ink, I can see the areas that will cause the camera to go into the pattern before it happens; confirmation of both methods. I would get rid of that background, though; the perimeter is dithering hell.

  • JTL replied to this.

    ShivaWind Oh, that's the default desktop background of Debian.

    Good point though

    a month later

    just Trying out Lubuntu Linux + a 1060 GPU. Is anyone here swearing by Linux these days? Which distro/drivers? Still able to disable dithering?
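    For what it's worth, the proprietary Nvidia driver exposes a per-display dithering control through nvidia-settings. A sketch, assuming a display named "DVI-I-1" (substitute whatever nvidia-settings reports on your machine):

```shell
# List connected display names (e.g. DVI-I-1, HDMI-0); they vary per setup
nvidia-settings -q dpys

# Dithering values: 0 = auto, 1 = enabled, 2 = disabled
nvidia-settings --assign "[dpy:DVI-I-1]/Dithering=2"
```

    Given the reports in this thread of dithering happening before any OS loads, this driver-level switch may not help on every card.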

      reaganry I'd avoid the later 9xx and 10xx cards. 😐

      I swear, at least the 1070 dithers by default even in the BIOS, before any OS loads!

        JTL Even some 9x0s are set up like this. I have two Dell XPS systems with theoretically identical GTX 9x0 (960?) cards, and one dithers HORRIFICALLY and one doesn't. That could be the Intel chips, but WOW.

        • JTL replied to this.

          Gurm If you're using the output port on the Nvidia card, it's 99.5% the Nvidia card's fault. If the Intel graphics are disabled in the BIOS, it's 100% the Nvidia card's fault.

          Wanna take some GPU-Z VBIOS dumps for me and send them my way? 😉 (Find someone else to do it if you don't want to hurt your eyes. I understand)

          • Gurm replied to this.

            JTL Well it's an XPS so I think everything runs through the Intel chip anyway in this configuration?

            • JTL replied to this.

              Gurm Are you implying it's a laptop?

              Gurm Uh oh.

              I know the dithering is 100% the Nvidia GPU with a bad 9xx or 10xx card on a desktop, but I wonder what the situation is with laptops, since, as you know, the final output goes through the Intel card regardless of whether the Nvidia card is being used.

              My brother has an HP ZBook 15 G3 with Intel graphics and an Nvidia Quadro M1000M (from the 9xx series). What's interesting is that by default it uses Optimus, but in the BIOS there is an option to use only the Nvidia card. I tried said option, and in Device Manager the Intel card seemed completely "disconnected" from the running system; however, since it had been running the Optimus driver, Windows just loaded the standard VGA driver, refusing to use the Nvidia one.

              11 days later

              ryans Won't work. Some 9xx and all known 10xx cards dither even within the BIOS setup.

                12 days later

                  JTL And this is why I'd like to hear some first-hand reports about the AMD Vega cards. I know more recent AMD cards have a bad reputation, but I have a 6950 here that is perfectly good, so they can make "good" cards, from my perspective.

                  2 months later

                  Another data point...

                  I normally run a Gigabyte 970 G1 Gaming card with no problems at all. Having failed with 1070, 1080, and 980 Ti cards, I realised that the 980 uses the same chipset and was released at the same time as the 970. I thought to myself... maybe I can get a slight performance uplift?

                  I'm now trialing a Gigabyte 980 G1 Gaming card. It's very early days, but I think I'm detecting low levels of discomfort with it.

                  Both my good 970 and this suspect 980 are 1.0 versions from the same manufacturer, released around the same time with supposedly the same chipset.

                    AgentX20 Check the VBIOS using GPU-Z?

                    https://www.techpowerup.com/gpuz/

                    If you could find a bad 970 (SOMEONE NEEDS TO DO THIS), it would be interesting to compare the VBIOS version of the "good 970" against the "bad 970". Then I have some more ideas on top of that.
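                    As a starting point for such a comparison, a byte-level diff of two dumps shows whether (and where) the images differ at all. A minimal sketch in Python; the file names are placeholders, not actual dumps from this thread:

```python
def diff_dumps(path_a, path_b, max_report=10):
    """Byte-diff two VBIOS dumps (e.g. ROM files saved from GPU-Z).

    Returns (total differing byte count, first few differing offsets).
    """
    with open(path_a, "rb") as f:
        a = f.read()
    with open(path_b, "rb") as f:
        b = f.read()
    n = min(len(a), len(b))
    diffs = [i for i in range(n) if a[i] != b[i]]
    # Any length mismatch counts as trailing differences
    extra = abs(len(a) - len(b))
    return len(diffs) + extra, diffs[:max_report]

if __name__ == "__main__":
    import sys
    count, first = diff_dumps(sys.argv[1], sys.argv[2])
    print(f"{count} differing bytes; first offsets: {[hex(o) for o in first]}")
```

                    If the dumps differ only in a handful of bytes, those offsets would be the place to start digging with a VBIOS editor.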

                    Yeah, I'll check out the 980 BIOS versions and see if there's anything there.

                    9 days later

                    OK - so I've now tried two Gigabyte G1 Gaming GTX980 cards (both v1.0 hardware).

                    They differ on the memory front - one is Hynix and one Samsung. Both have very early BIOS versions.

                    Both give me low levels of eye strain, which is hugely annoying since this is supposedly the SAME chip as my "works-great" 970 card.

                    I'll spend some time shortly and push both cards up to the newest BIOS version in case that helps. Both updates report that they "improve compatibility with some monitors". Quite what that means, I dunno. But I suspect it's probably a bad thing for folks like us here. Still worth a crack.
