pocupine Let me clarify:

  1. You aren't ever going to get #ffffff. It might seem like you ought to be able to, but you won't: even if that's what you specify, it's not what you'll get, because color profiles futz with it. Now, it might be a reasonable supposition that if you SPECIFY #ffffff, the card shouldn't dither, right? But you'll find that many cards simply offer no way to turn off the FRC. And FRC isn't selective... it's often applied as a final-stage filter (although not always), after all other modifications (including the color profile) have been applied. (See the sketch after this list.)

  2. You would be correct that IF a card ever output #000000 or #FFFFFF, the monitor at that point should NOT dither it. But again - many do. Plasma TVs, for example, can be told to display pure black or pure white and will instead produce a rolling color output. Partly this is power saving, partly it's to prevent image retention, and partly it's to squeeeeeeeze more colors out of the device than it can actually produce.

  3. Many devices screw with the final output color, even if that color is white or black, based on the device's own adjustments. Long gone are the days when you could say "display pure black" and have the monitor simply NOT LIGHT those pixels, relying on the backlight for brightness adjustments.
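
To make that concrete, here is a toy Python sketch of the idea. The white-point scale factors and the 6-bit panel depth are invented for illustration, not taken from any real card; the point is that the calibration stage moves #ffffff off pure white, and the FRC stage then has something to dither:

```python
# Toy model: a calibration LUT runs before the panel's temporal
# dithering (FRC). All numbers are invented for illustration.

def white_point_lut(r, g, b, scale=(1.00, 0.99, 0.97)):
    """Color-profile stage: a slightly warmer white point scales channels down."""
    return tuple(round(c * s) for c, s in zip((r, g, b), scale))

def frc_frames(value_8bit, panel_bits=6, n_frames=4):
    """FRC stage: approximate an 8-bit level on a 6-bit panel by
    alternating between the two nearest 6-bit levels over time."""
    step = 2 ** (8 - panel_bits)             # 8-bit -> 6-bit: step of 4
    lo = value_8bit // step                  # nearest panel level below
    frac = (value_8bit % step) / step        # how far toward the next level
    hi_frames = round(frac * n_frames)       # frames spent at the higher level
    hi = min(lo + 1, 2 ** panel_bits - 1)
    return [hi if i < hi_frames else lo for i in range(n_frames)]

r, g, b = white_point_lut(255, 255, 255)     # "pure white" goes in...
print((r, g, b))                             # (255, 252, 247): no longer white
print(frc_frames(b))                         # [62, 62, 62, 61]: the panel flickers
```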

So while I agree with you in principle, I think that in practice the ability to say "it's pure white, it's DEFINITELY NEVER DITHERED" is limited.

    Gurm So when I view a white page (again, so you'll understand, white means #ffffff), my GPU doesn't output #ffffff to my monitor? Please provide sources that support this.

      pocupine You don't need sources. Set your background to white, then go into your video card's control panel, yank on a slider, and watch the screen output change... this is 100% the case for every video card. There is a software or hardware layer doing various fuckeries with the output before it goes to the monitor. I agree with you that IF you really sent #FFFFFF to the monitor it SHOULD display it without dither (although the monitor has software that does its own fuckeries). But good luck making that happen.

      Additionally, WINDOWS has multiple abstraction layers before the output makes its way to the screen. Color profiles, color correction, gamma... all applied at the driver level, the card hardware level, the Windows API level, the screen composition level... then again at the input and output stages of the display itself.
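
      One of those layers is easy to poke at directly: on Windows the driver exposes a per-channel gamma ramp (a LUT) through the Win32 GDI API, and every pixel you draw, #ffffff included, is pushed through it on the way out. A minimal sketch of reading it, using only the standard library:

      ```python
      # Read the driver's gamma ramp via Win32 GDI (Windows only, stdlib only).
      import ctypes

      gdi32 = ctypes.windll.gdi32
      user32 = ctypes.windll.user32
      user32.GetDC.restype = ctypes.c_void_p
      user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
      gdi32.GetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

      hdc = user32.GetDC(None)                  # device context for the whole screen
      ramp = (ctypes.c_ushort * (3 * 256))()    # R, G, B ramps, 256 entries each
      if gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
          # Entry 255 of each channel is what an input of 255 becomes.
          # 0xFFFF means the LUT is identity at the top end; anything else
          # means even "pure white" is remapped before it leaves the card.
          print("top of ramp (R, G, B):", ramp[255], ramp[511], ramp[767])
      user32.ReleaseDC(None, hdc)
      ```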

      Windows 10 added ANOTHER layer, called "composition spaces". OS support for this began with the Anniversary Update, which is why many of us find builds 1607+ unusable.

        Do you think Windows 10 introduced dithering, and that's why a lot of people are having trouble with it, even on monitors that use 8 bits per pixel?

        Do you have any suggestions on how I can test whether my eyestrain is due to dithering?
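
        (One rough check, short of specialist gear: photograph a static mid-gray patch twice in quick succession with a fixed-exposure camera on a tripod and diff the frames. A sketch assuming numpy and pillow are installed; the filenames are placeholders:)

        ```python
        # Diff two photos of an unchanging image: temporal dithering shows
        # up as widespread small per-pixel differences between frames.
        import numpy as np
        from PIL import Image

        a = np.asarray(Image.open("frame1.png").convert("L"), dtype=np.int16)
        b = np.asarray(Image.open("frame2.png").convert("L"), dtype=np.int16)

        diff = np.abs(a - b)
        changed = (diff > 2).mean() * 100   # threshold of 2 absorbs sensor noise
        print(f"{changed:.1f}% of pixels changed between frames")
        # A static, undithered image should give ~0%; widespread small
        # differences suggest the GPU or panel is modulating pixels over time.
        ```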

        4 days later

        Why not try contacting NVIDIA through official channels and asking them about dithering on 9xx/10xx cards? DevTalk and other forums aren't going to get us anywhere, and @si_edgey put the Temporal Dithering discussion on the NVIDIA forum in the wrong category, for starters.

        https://nvidia.custhelp.com/app/ask

        If enough people contact them, maybe they will notice.

        I have been speaking to Nvidia and was escalated to a member of the driver team. He has not promised anything, but he has submitted a feature enhancement request to try to have dithering options included in future drivers. I have written back to ask for clarification of what has changed in the 10xx-series hardware (and potentially the later 9xx cards as well), so I will report back.

        He was actually very compassionate and (as expected) had never heard of this being an issue before. I would recommend that everyone get in touch with Nvidia and start making a noise about it so they are more likely to listen!

          Enclosed is a response from NVIDIA support

          Hi,
          My name is Derek and your support request was escalated to me for review. I have read through the history and wanted to let you know that this feedback is something that I have passed on to our driver development teams.

          Currently there aren't any options or controls that I am aware of to let you turn it off.

          I was going to suggest you inquire on the developers forums but it looks like you are already aware of them. That would be the best place to ask about the possibility of being able to disable it.

          Derek-


            JTL I have attached my response

            Hi Derek

            The problem with the developer forums as others have noted is that NVIDIA employees don't really read the forum, let alone the ones responsible for driver and/or VBIOS development.

            It would be interesting to see if temporal dithering is enabled in hardware on the later 900- and 1000-series cards. I am running Linux, which does allow you to disable dithering in the driver, yet with a high-speed camera I can see what appears to be dithering on a BenQ 8-bit VA-panel monitor over DVI (tested with an EVGA GTX 1070), and as others have noted it can cause eyestrain and headaches that are absent without the dithering.

            I appreciate your help though.

            Here is his response

            Sorry I couldn't be more help. Passing on this sort of user feedback is all I can really do.

            Derek-

            10 days later

            Hi all,
            This is a fascinating thread! I came across it after googling for some cryptic Nvidia Xorg.conf parameters.

            I have been struggling with Nvidia's Linux drivers for quite some time (8 months or so). My main issue is that I could never get graphics looking "right" under Linux; by right, I mean the same as under Windows 10. I could never put my finger on what was wrong, but the text just didn't look the same. This caused me a lot of eyestrain, and I could never use my computer under Linux the way I can under Windows. Even when I installed Windows fonts, the text rendering was not good, and I spent a lot of time messing with fontconfig etc. But fontconfig was definitely not the issue. I did figure out what was wrong, but I still haven't been able to fix it.

            I'll cut right to the chase: open this test PNG file in your default image viewer (or a web browser), make sure it's displayed at 100% (no scaling), and look at the bottom two lines. If they're crisp and readable, your Nvidia driver is performing fine. If the two lines are fuzzy, it could be that your display doesn't support 4:4:4 chroma mode, or it could be that you're running Nvidia under Linux.

            Test: http://i.rtings.com/images/test-materials/2017/chroma-444.png

            Under Windows 10, that test pattern looks absolutely PERFECT on my screen: text is sharp, and the bottom two lines are perfectly readable and look just right. Under Linux, the test pattern looks like 4:2:2 chroma subsampling:

            http://i.rtings.com/images/reviews/ju7100/ju7100-text-chroma-4k-30hz-large.jpg
            Pictures of problems: http://www.rtings.com/tv/learn/chroma-subsampling
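
            If you want to see why single-pixel colored text smears like that, you can fake 4:2:2 in software: keep luma at full resolution, halve the chroma planes horizontally, then stretch them back. A sketch assuming pillow is installed and the chroma-444.png test image above is saved locally:

            ```python
            # Simulate 4:2:2 chroma subsampling on the chroma-444.png test
            # image. Luma detail survives, but color edges smear across
            # neighbouring pixels.
            from PIL import Image

            img = Image.open("chroma-444.png").convert("YCbCr")
            y, cb, cr = img.split()
            w, h = img.size

            def halve_chroma(plane):
                # Downsample horizontally, then stretch back: the information
                # loss is what a 4:2:2 link does to color detail.
                return plane.resize((w // 2, h), Image.BILINEAR).resize((w, h), Image.BILINEAR)

            out = Image.merge("YCbCr", (y, halve_chroma(cb), halve_chroma(cr))).convert("RGB")
            out.save("chroma-422-simulated.png")   # compare against the original
            ```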

            I have a 4K Samsung HDR display, and it's connected to my Nvidia GTX 1060 over an HDMI 2.0 cable. I have tried all kinds of Xorg.conf parameters, and I just can't get it to display colors the same way that Windows 10 does. I've tried different cables (certified 4K UHD cables), but to no avail. The issue is not hardware but software.

            I've tried disabling dithering and enabling Force Composition Pipeline, but that does not help. I'm at a loss as to what could be wrong, and I suspect only Nvidia can fix this.
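
            (For reference, a sketch of the sort of knobs involved, driven through nvidia-settings; "DVI-I-1" is a placeholder display name, list yours with `nvidia-settings -q dpys`, and the attribute values shown here may differ between driver releases:)

            ```python
            # Query and set the NVIDIA Linux driver's dithering attributes
            # via the nvidia-settings CLI. Display name is a placeholder.
            import subprocess

            DPY = "DVI-I-1"   # placeholder: list real names with `nvidia-settings -q dpys`

            def nv(*args):
                return subprocess.run(["nvidia-settings", *args],
                                      capture_output=True, text=True).stdout

            print(nv("-q", f"[dpy:{DPY}]/Dithering"))          # 0=auto, 1=enabled, 2=disabled
            print(nv("-q", f"[dpy:{DPY}]/CurrentDithering"))   # what the driver actually applies
            nv("-a", f"[dpy:{DPY}]/Dithering=2")               # request dithering off
            ```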

            Has anyone here experienced this issue?
            Thanks.


              JTL Well... I'm not sure. Crisp is a hard thing to compare. I looked at the example bad(?) image; perhaps this one looks better with the 750 Ti?
              It seems ok to me

              [screenshot of the test pattern as rendered]

                JTL Those options are the same as the ones you get from the nvidia-settings app. I've tried both the RGB and YCbCr444 settings without much success. They both look the same on my screen under Linux... and don't look correct. Under Windows 10, RGB looks absolutely perfect (I haven't tested YCbCr444 under Win10).

                Slacor That looks perfect to me. Is this under Linux or Win10? If under Linux, which driver version are you using, and what's the display? Thanks!

                  Batou Also, what model of 1060 is it (i.e., from what brand)? Do you have eyestrain under Windows 10?

                  Just collecting data.

                    JTL It's an EVGA GTX 1060 6GB model. No eyestrain under Win10, just Linux. FWIW, I also tried a friend's Zotac 1070 with the same results.
                    The fault is with Nvidia's Linux binary driver. There's nothing you can do to fix it; only they can fix it at this point.
