Hi all,
This is a fascinating thread! I came across it after googling for some cryptic Nvidia xorg.conf parameters.

I have been struggling with Nvidia's Linux drivers for quite some time (8 months or so). My main issue is that I could never get graphics looking "right" under Linux; by "right" I mean the same as under Windows 10. I could never put my finger on what was wrong, but the text just didn't look the same. This caused me a lot of eyestrain, and I could never use my computer under Linux the way I can under Windows. Even after installing Windows fonts, the text rendering wasn't good, and I spent a lot of time messing with fontconfig and the like, but fontconfig was definitely not the issue. I did eventually figure out what was wrong, but I still haven't been able to fix it.

I'll cut right to the chase: open this test PNG in your default image viewer (or a web browser), make sure it's displayed at 100% (no scaling), and look at the bottom two lines. If they're crisp and readable, your Nvidia driver is performing fine. If they're fuzzy, either your display doesn't support 4:4:4 chroma mode, or you're running Nvidia under Linux.

Test: http://i.rtings.com/images/test-materials/2017/chroma-444.png

Under Windows 10, that test pattern looks absolutely PERFECT on my screen: the text is sharp and the bottom two lines are perfectly readable. Under Linux, the test pattern looks like 4:2:2 chroma subsampling:

http://i.rtings.com/images/reviews/ju7100/ju7100-text-chroma-4k-30hz-large.jpg
Pictures of problems: http://www.rtings.com/tv/learn/chroma-subsampling
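If it helps to see why subsampling wrecks small colored text specifically: in 4:2:2, the two chroma channels are stored at half the horizontal resolution, so a one-pixel-wide colored stroke gets averaged into its neighbor. A toy sketch in Python (made-up pixel values, not real conversion code):

```python
# Toy illustration of 4:2:2 chroma subsampling on one scanline.
# Each pixel is (Y, Cb, Cr); luma (Y) is kept per pixel, but chroma is
# averaged over each horizontal pair of pixels, then duplicated back.

def subsample_422(line):
    out = []
    for i in range(0, len(line), 2):
        pair = line[i:i + 2]
        cb = sum(p[1] for p in pair) / len(pair)
        cr = sum(p[2] for p in pair) / len(pair)
        out.extend((p[0], cb, cr) for p in pair)
    return out

# A one-pixel-wide colored stroke on a gray background: luma is flat,
# so ALL the edge detail lives in the chroma channels.
line = [(128, 128, 128), (128, 90, 240), (128, 128, 128), (128, 128, 128)]
result = subsample_422(line)
# The stroke's chroma has now bled into its neighbor: pixels 0 and 1
# share the averaged (Cb, Cr) = (109.0, 184.0), so the edge goes fuzzy.
```

Black-on-white text survives because it's pure luma; the test pattern's colored-on-colored lines are pure chroma, which is exactly what gets thrown away.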

I have a 4K Samsung HDR display connected to my Nvidia GTX 1060 over an HDMI 2.0 cable. I have tried all kinds of xorg.conf parameters, and I just can't get it to display colors the way Windows 10 does. I've tried different cables (certified 4K UHD cables), but to no avail. The issue is not hardware but software.

I've tried disabling dithering and enabling Force Composition Pipeline, but neither helps. I'm at a loss as to what could be wrong, and I suspect only Nvidia can fix this.
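For anyone who wants to try the same things, this is roughly what I used (a sketch only; the section identifiers and the exact metamodes string depend on your setup):

```
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    # What the nvidia-settings GUI calls "Force Composition Pipeline":
    Option     "metamodes" "nvidia-auto-select +0+0 {ForceCompositionPipeline=On}"
EndSection
```

Dithering I toggled at runtime from nvidia-settings (there's a per-display Dithering attribute; `nvidia-settings -q all` lists the exact attribute and display names on your system). Neither change made any visible difference for me.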

Has anyone here experienced this issue?
Thanks.


    JTL Well... I'm not sure. "Crisp" is a hard thing to compare, too. I looked at the example bad(?) image; perhaps this one looks better with the 750 Ti?
    It seems ok to me.

    [screenshot]

      JTL Those options are the same as the ones you get from the nvidia-settings app. I've tried both the RGB and YCbCr444 settings without much success; they both look the same on my screen under Linux... and don't look correct. Under Windows 10, RGB looks absolutely perfect (I haven't tested YCbCr444 under Win10).

      Slacor That looks perfect to me. Is this under Linux or Win10? If under Linux, which driver version are you using and what's the display? Thanks!

        Batou Also, what model of 1060 do you have (which brand)? Do you have eyestrain under Windows 10?

        Just collecting data.

          JTL It's an EVGA GTX 1060 6GB model. No eyestrain under Win10, just Linux. FWIW, I also tried a friend's Zotac 1070 with the same results.
          The fault is with Nvidia's Linux binary driver. There's nothing you can do to fix it; only they can fix it at this point.

            Batou Try writing a stern email to Nvidia support. Although I doubt it's going to do much.

            Nouveau has limited support for newer Nvidia cards.

            I have an EVGA GTX 1070 FE that I only tried under Linux. Might put it back in one of these days and try the chroma subsampling tests along with my 560.

              Batou No eyestrain under Win10. .... FWIW, I also tried friend's Zotac 1070 with same results.

              Really? Me, I couldn't cope with my Gigabyte 1070 or 1080 cards.

              I don't want to entertain the thought that some might be ok and not others.

                JTL I made a thread on Nvidia Linux forum. Got nowhere. Any idea what email would get their attention?

                AgentX20 All these card manufacturers stick pretty closely to Nvidia's reference design; they "innovate" mostly in cooling and looks. I doubt this has anything to do with board manufacturers. They all use the same silicon from Nvidia, and it's hard to mess up the rest.


                  Batou I'm beginning to wonder if the people who have "bad" GTX 970s vs. "good" 970s simply have a different VBIOS from Nvidia that changed something...

                  I've seen the VBIOS versions of "good" cards, now to find someone with a "bad" 970 and compare.

                  7 days later

                  Batou Forgot to mention: @Slacor is mainly a Linux user, so that screenshot was probably Linux. I'll get him to confirm.

                  I might test my 560 and 1070 on both Windows and Linux with different drivers and compare.

                  Batou That looks perfect to me. Is this under Linux or Win10? If under Linux, which driver version are you using and what's the display? Thanks!

                  Under Linux, that's correct. I think the version is 384.90 with 750 Ti.
                  By display do you mean monitor?

                  ViewSonic VA2455sm

                  Graphics:  Card: NVIDIA GM107 [GeForce GTX 750 Ti]
                             Display Server: X.Org 1.18.4 drivers: nvidia (unloaded: fbdev,vesa,nouveau)
                             Resolution: 1920x1080@60.00hz
                             GLX Renderer: GeForce GTX 750 Ti/PCIe/SSE2 GLX Version: 4.5.0 NVIDIA 384.90

                    Slacor I also know that using a different GPU output (HDMI, DVI, VGA) can make a difference in the output colorspace.

                    Nvidia with HDMI has had problems in the past where the colors would look "washed out". Under Windows this was easy to fix; no idea about Linux, though.
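                    The classic "washed out" HDMI look happens when the GPU sends limited-range ("TV") levels over a link the display interprets as full range. The fix is just expanding 16-235 back to 0-255; as a sketch of the math (my own illustration, not driver code):

```python
# "Washed out" = limited-range 8-bit levels (16..235) shown on a
# display expecting full range (0..255): black renders as dark gray
# and white as light gray. This is the expansion step that's missing:

def limited_to_full(v):
    v = min(max(v, 16), 235)            # clamp to the legal limited range
    return round((v - 16) * 255 / 219)  # 219 = 235 - 16

print(limited_to_full(16), limited_to_full(235))  # limited black/white -> 0 255
```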

                    a month later

                    Decided to try my desktop again.

                    EVGA GTX 1070 Founders Edition, Debian 8.8, MATE desktop environment.

                    Seems ok at first glance after using it for an hour; I even played some CS:GO under Windows 7.

                    Connected by DVI to BenQ XL2720Z (27" 6+2 bit FRC TN panel, which should in theory be "OH GOD AWFUL")

                    I wonder how much of this is my vision "healing" from the cannabinoid treatments; then again, I managed to briefly use the card in July 2016 with Ubuntu when I got it. The reason I didn't report on my tests before was that I was getting headaches from ALL large monitors (while using my 15" MBP was ok). It turns out my vision was improving and I needed to stop using my glasses, as they were too strong.

                    Surprisingly, the image looks less "noisy" than an ASUS R9 270X with the "amdgpu" driver, and indeed others have complained:

                    https://www.reddit.com/r/AMD_Linux/comments/6qoq1o/amdgpu_driver_how_to_disable_temporal_dither/

                    I tried modifying the kernel source myself to disable the dithering, but either (a) I didn't edit the right function (I haven't tried GDB kernel debugging yet) or (b) it's done in hardware or in the VBIOS.
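                    For what it's worth, the reason drivers dither at all: a 6+2-bit FRC panel only has 6 real bits per channel, so 8-bit values get quantized, and noise is added to hide the resulting banding. A toy sketch of that trade-off (my own illustration, not driver code):

```python
# Quantizing 8-bit values to 6 bits, with and without a simple ordered
# dither. Dithering trades banding for noise -- which is exactly the
# "noise" some of us find fatiguing.

def to_6bit(v):
    return (v >> 2) << 2  # drop the two low bits: visible banding

def to_6bit_dithered(v, x):
    # Tiny hypothetical 1-D ordered dither: add a position-dependent
    # offset (1 or 3) before truncating, so neighboring pixels average
    # out close to the original value.
    threshold = (x * 2 + 1) % 4
    return min(255, (min(255, v + threshold) >> 2) << 2)

gradient = list(range(100, 108))
plain    = [to_6bit(v) for v in gradient]            # flat bands of 100s and 104s
dithered = [to_6bit_dithered(v, x) for x, v in enumerate(gradient)]  # alternating noise
```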

                    Here's the YUV chroma sampling test with the above GTX 1070 configuration and monitor.

                    https://imgur.com/a/Eo273

                    3 months later

                    JTL
                    When I watch this on e-ink, I can see the areas that will cause the camera to go into the pattern before it happens; confirmation of both methods. I would get rid of that background, though: the perimeter is dithering hell.


                      ShivaWind Oh, that's the default desktop background of Debian.

                      Good point, though.
