degen Speaking of Nvidia drivers, this isn't related to dithering, but I remember that on Windows, Nvidia had a bug where, with the GTX 5xx and most 6xx cards, any driver newer than 314.22 up until 337 (I think) would freeze the system so completely that even the power button stopped working.

Lost a couple of Windows installs to it a few years ago 🙁

Just thought it was an interesting anecdote regarding Nvidia driver quality.

Trying a manually focused camera with manual settings pointed at a monitor to see what results I get.

Camera: Canon 80D
GPU: EVGA GTX 560 (released in late 2010, purchased in May 2012)
Monitor: BenQ GW2760HS using DisplayPort

https://www.youtube.com/watch?v=QV_qadkkRio https://www.youtube.com/watch?v=m3Fu6fgTsI8

I wonder how much of it is actual dithering and how much is moiré interference patterns?
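If I can pull individual frames out of that footage, one crude check — just a sketch of the idea, with plain lists standing in for real decoded camera frames — is to diff consecutive frames of a static screen area. An undithered output should diff to roughly zero, while temporal dithering keeps changing from frame to frame:

```python
# Rough idea only: a real script would decode the camera video into frames
# (e.g. with an image library); here, flat lists of pixel values stand in.

def mean_frame_diff(frames):
    """Average absolute per-pixel change between consecutive frames."""
    total = n = 0
    for prev, cur in zip(frames, frames[1:]):
        for p, c in zip(prev, cur):
            total += abs(p - c)
            n += 1
    return total / n

static = [[128] * 4] * 5                                      # no dithering
dithered = [[125, 129, 125, 129], [129, 125, 129, 125]] * 3   # FRC-style flicker

print(mean_frame_diff(static))    # 0.0
print(mean_frame_diff(dithered))  # 4.0
```

Moiré would show up as a spatial pattern in single frames, whereas this only catches temporal changes, so the two effects should be separable this way.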

    Gurm There is definitely something that changed in the output of the 10x0 series. It happened, actually, midway through the 9x0 generation. I had no problem at all with an MSI GTX 970, which was introduced in 2015 I suspect... whereas the Zotac and EVGA models (which were the "lower power consumption" variety that only needed one power header) were instantly problematic.

    Pretty much matches my findings.

    Idea I had

    if we can find a GTX 970 submodel (as in MSI/EVGA/Gigabyte/whatever GTX 970) that doesn't seem to have dithering and another that does, we might be able to dump the VBIOS to see what, if anything, changed. Might even be possible to downgrade it as well.

    For what it's worth, my 'good' 970 is a Gigabyte G1 Gaming edition:
    GPU: GM204-200-A1
    BIOS reported as: 84.04.2F.00.80

    EDIT: I'm running the latest Nvidia drivers without any hassles.

    I think my MSI 970 is fine. It doesn't seem to be any worse than my Asus 650 Ti Boost. I'm using old drivers (368.81, though I would like to try a pre-10xx release) and Windows 7.

    Video Chipset Codename: GM204-200/D17U-20
    Video BIOS Version: 84.04.84.00.29
    Video Chipset Revision: A1

    I bought this 970 very late into its manufacturing lifespan. They had already switched to Elpida memory, which always happens after the reviews are done and such.

    It does require the 2 power cables like the other MSI 970s.

    Alright.

    Now I think a good idea is to find a bad 970, dump its BIOS (use GPU-Z), and compare the versions and the files.

    That would show whether it's the BIOS or something deeper.
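    Once two dumps exist, comparing them doesn't need anything fancy. A minimal Python sketch — the filenames and the stand-in bytes below are made up for illustration; real GPU-Z ROM dumps would be the actual inputs:

```python
# Hedged sketch, not an official tool: byte-by-byte diff of two VBIOS dumps
# to see where a "good" and a "bad" 970 ROM differ.

def diff_roms(path_a, path_b, limit=16):
    """Return (offset, byte_a, byte_b) for up to `limit` differing bytes."""
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    if len(a) != len(b):
        print(f"size mismatch: {len(a)} vs {len(b)} bytes")
    return [(i, x, y) for i, (x, y) in enumerate(zip(a, b)) if x != y][:limit]

# Tiny demo with stand-in data so the sketch runs on its own:
with open("good.rom", "wb") as f:
    f.write(bytes([0x55, 0xAA, 0x40, 0x00]))
with open("bad.rom", "wb") as f:
    f.write(bytes([0x55, 0xAA, 0x41, 0x00]))

for off, x, y in diff_roms("good.rom", "bad.rom"):
    print(f"0x{off:06x}: {x:02x} -> {y:02x}")  # prints 0x000002: 40 -> 41
```

    If the two ROMs differ only in serial/version fields, that would point at the driver or silicon rather than the BIOS.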

    Good luck

    If someone gets eyestrain from reading black text on white background then it should hint that the problem is not with dithering, right? Because any screen can show black and white without dithering.

    JTL Does the pattern occur even on white background? It's hard to see from the video.

      pocupine White isn't really white, I assumed everyone knew that but perhaps not. Black text on a white background means that you've got massive amounts of FRC happening, might as well be a beige or purple background at that point. All three subpixels are firing to make white. Also "white" is a relative term, depending on gamma, color profiles, etc. so the assumption that there's no dithering with a white pixel is ... 100% incorrect.

        Gurm White, as in the color #ffffff or rgb(255,255,255), should not require any dithering. And black, as in the color #000000 or rgb(0,0,0), should not require it either.

        Dithering occurs when a monitor can't display a certain color, or more precisely a combination of RGB values (which correspond to the amount of light allowed to pass through the RGB subpixels). There's no reason at all that any display using 2 bits per pixel or more would need to dither for the two black and white values I mentioned.

        I assumed everyone knew that but perhaps not.

        EDIT: Just to clarify, it is not possible to dither to reach the highest or lowest values a pixel can have. If a monitor can't display rgb(255,255,255) without dithering, then it can't display it with dithering either. This holds exactly only for the two extremes, rgb(255,255,255) and rgb(0,0,0).
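        To make the argument concrete, here is a rough simulation of FRC-style temporal dithering — my own sketch, not any vendor's actual algorithm — showing that mid-range values flicker between two panel levels while 0 and 255 land exactly on a displayable level and stay constant:

```python
# Illustrative FRC sketch: an 8-bit value is shown on a panel with fewer
# levels by alternating between the two nearest displayable levels, so the
# time-average approximates the target. Extremes need no alternation.

def frc_frames(target, panel_bits=6, frames=60):
    """Simulate `frames` refreshes of an 8-bit `target` on a panel_bits panel."""
    levels = (1 << panel_bits) - 1        # 63 steps on a 6-bit panel
    exact = target * levels / 255         # target in panel units
    lo, hi = int(exact), min(int(exact) + 1, levels)
    frac = exact - int(exact)             # fraction of frames `hi` must show
    out, acc = [], 0.0
    for _ in range(frames):
        acc += frac
        if acc >= 1.0 - 1e-9:             # time to show the upper level
            acc -= 1.0
            out.append(hi * 255 / levels)
        else:
            out.append(lo * 255 / levels)
    return out

for v in (255, 128, 0):
    print(v, "->", sorted(set(round(s, 1) for s in frc_frames(v))))
# 255 and 0 each use a single constant level; 128 flickers between two.
```

        Of course, as pocupine says, this only holds if #ffffff actually reaches the panel unmodified.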

        • Gurm replied to this.

          pocupine Let me clarify:

          1. You aren't ever going to get #ffffff, it might seem like you ought to be able to, but you aren't. Even if that's what you specify, it's not what you'll get. Color profiles futz with that. Now, it might be a reasonable supposition that if you SPECIFY #ffffff, then the card shouldn't dither, right? But you'll find in many cards there is simply no way to turn off the FRC. And FRC isn't selective... often it's applied as a final stage filter (although not always), after all other modifications (including color profile) have been applied.

          2. You would be correct that IF a card ever output #000000 or #FFFFFF, the monitor at that point should NOT dither it. But again - many do. Plasma TVs, as an example, can be told to display pure black or pure white, and will instead do a rolling color output. Partly this is power saving, partly it's to prevent image retention, partly it's to squeeeeeeeze more colors out of the device than it can actually produce.

          3. Many devices screw with the final output color, even if that color is white or black, based on the adjustments of the device. Long gone are the days when you can say "display pure black" and have the monitor simply NOT LIGHT those pixels... and rely on the backlight for lighting adjustments.

          So while I agree with you in principle, I think that in practice the ability to say "it's pure white, it's DEFINITELY NEVER DITHERED" is limited.

            Gurm So when I view a white page (again, so you'll understand, white means #ffffff), my GPU doesn't output #ffffff to my monitor? Please provide sources that support this.

              pocupine You don't need sources. Set your background to white, then go into the control panel for your video card and yank on a slider and watch the screen output change... this is 100% the case for every video card. There is a software or hardware layer doing various fuckeries with the output before it goes to the monitor. I agree with you that IF you really sent #FFFFFF to the monitor it SHOULD display it without dither (although the monitor has software that does other fuckeries). But good luck making that happen.

              Additionally, WINDOWS has multiple abstraction layers before the output makes its way to the screen. Color profiles, color correction, gamma... all applied at the driver level, the card hardware level, the Windows API level, the screen composition level... then again at the input and output phases of the display unit.

                Windows 10 added ANOTHER layer, called "composition spaces". The OS support for this began in the Anniversary Update (build 1607), which is why many of us find builds 1607+ to be unusable.

                Do you think Windows 10 introduced dithering and that's why a lot of people are having trouble with it, even on monitors that are using 8-bits per pixel?

                Do you have any suggestion on how I can test if my eyestrain is due to dithering?

                4 days later

                Why not try contacting NVIDIA through official channels and ask them about dithering on 9xx/10xx cards? DevTalk and other forums aren't going to get us anywhere. For starters, @si_edgey put the Temporal Dithering discussion on the NVIDIA forum in the wrong category.

                https://nvidia.custhelp.com/app/ask

                If enough people contact them, maybe they will notice.

                I have been speaking to Nvidia and was escalated to a member of the driver team. He has not promised anything, but has submitted a feature enhancement request to try to have dithering options included in future drivers. I have written back to ask for clarification of what has changed in the 10xx series hardware (and potentially later 9xx cards as well), so I will report back.

                He was actually very compassionate about the issue and (as expected) had never heard of this being an issue before. I would recommend that everyone get in touch with Nvidia to start making a noise about it so they are more likely to listen!

                  Enclosed is a response from NVIDIA support

                  Hi,
                  My name is Derek and your support request was escalated to me for review. I have read through the history and wanted to let you know that this feedback is something that I have passed on to our driver development teams.

                  Currently there aren't any options or controls that I am aware of to let you turn it off.

                  I was going to suggest you inquire on the developers forums but it looks like you are already aware of them. That would be the best place to ask about the possibility of being able to disable it.

                  Derek-

                  • JTL replied to this.