Gurm Nice. Looks like I need to get one. Thanks.

a month later

KM Just an update.

I reverted to Debian 8 and tried the older Nvidia driver (367, I think) and still get some strain there.
To be honest, I never really used this card much except for GPU-accelerated computing (think password cracking and stuff), so I never really used its display outputs until recently, apart from briefly installing an OS.

I am using an EVGA GTX 1070 reference edition purchased in July 2016.

I switched to a BenQ GW2760HS (27" VA) to rule out monitor dithering as well.
I have a 560 I can swap in, so I will try that tonight.

I could also a) contact EVGA and RMA the card (the warranty runs until 2019), or b) look into the Nvidia Quadro cards from ~2015 that I noticed are still being manufactured on Newegg. I should figure out what generation those are from.

There is definitely something that changed in the output of the 10x0 series. It actually happened midway through the 9x0 generation. I had no problem at all with an MSI GTX 970, which was introduced in 2015 I suspect... whereas the Zotac and EVGA models (the "lower power consumption" variety, which only needed one power header) were instantly problematic. The display actually LOOKED FLICKERY, leading me to suspect that temporal dithering is on by default in hardware for those chips.

    Gurm Yeah. I can see the pixels shifting on a static image :/

    Not really any harm in trying to RMA the card; since it was $600, might as well?

    Could also be a VBIOS thing.

    For reference, here is the model of 1070 I have.

    http://secure1.ncix.com/products/?sku=131857

    Dithering is actually really obvious to see once you know what to look for. Glad that others see it too.


      degen Speaking of Nvidia drivers, this isn't related to dithering, but I remember that on Windows, Nvidia had a bug where, with the GTX 5xx and most 6xx cards, any driver newer than 314.22 (up until 337, I think) would freeze the system so badly that even the power button stopped working.

      Lost a couple of Windows installs to it a few years ago 🙁

      Just thought it was an interesting anecdote regarding Nvidia driver quality.

      Trying a manually focused camera with manual settings held up to a monitor to see what results I get.

      Camera: Canon 80D
      GPU: EVGA GTX 560 (released in late 2010, purchased in May 2012)
      Monitor: BenQ GW2760HS using DisplayPort

      https://www.youtube.com/watch?v=QV_qadkkRio
      https://www.youtube.com/watch?v=m3Fu6fgTsI8

      I wonder how much of it is actual dithering and how much is moiré interference patterns?

        Gurm There is definitely something that changed in the output of the 10x0 series. It actually happened midway through the 9x0 generation. I had no problem at all with an MSI GTX 970, which was introduced in 2015 I suspect... whereas the Zotac and EVGA models (the "lower power consumption" variety, which only needed one power header) were instantly problematic.

        Pretty much matches my findings.

        An idea I had:

        If we can find a GTX 970 sub-model (as in an MSI/EVGA/Gigabyte/whatever GTX 970) that doesn't seem to have dithering and another that does, we might be able to dump the VBIOS of each and see what, if anything, changed. It might even be possible to downgrade it as well.

        For what it's worth, my 'good' 970 is a Gigabyte G1 Gaming edition:
        GPU: GM204-200-A1
        BIOS reported as: 84.04.2F.00.80

        EDIT: I'm running the latest Nvidia drivers without any hassles.

        I think my MSI 970 is fine. It doesn't seem to be any worse than my Asus 650 Ti Boost. I'm using old drivers (368.81; I would like to try a pre-10x0 release at some point, however) and Windows 7.

        Video Chipset Codename: GM204-200/D17U-20
        Video BIOS Version: 84.04.84.00.29
        Video Chipset Revision: A1

        I bought this 970 very late in its manufacturing lifespan. They had already switched to Elpida memory, which always happens after the reviews are done and such.

        It does require two power cables like the other MSI 970s.

        Alright.

        Now I think a good next step is to find a bad 970, dump its BIOS (using GPU-Z), and compare the versions and the files.

        That would show whether it's the BIOS or something deeper.
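
        For the file comparison, something like this would do (a minimal Python sketch; the .rom file names are made up and just stand in for whatever you export from GPU-Z):

        ```python
        # Compare two VBIOS dumps byte-by-byte and report where they disagree.
        # "good_970.rom" / "bad_970.rom" are hypothetical GPU-Z exports.
        good = open("good_970.rom", "rb").read()
        bad = open("bad_970.rom", "rb").read()

        print(f"good: {len(good)} bytes, bad: {len(bad)} bytes")

        # Offsets where the two images differ.
        diffs = [i for i in range(min(len(good), len(bad))) if good[i] != bad[i]]
        print(f"{len(diffs)} differing bytes")
        for off in diffs[:32]:  # show just the first few
            print(f"  offset 0x{off:06x}: good=0x{good[off]:02x} bad=0x{bad[off]:02x}")
        ```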

        Good luck

        If someone gets eyestrain from reading black text on a white background, then that should hint that the problem is not dithering, right? Because any screen can show black and white without dithering.

        JTL Does the pattern occur even on a white background? It's hard to see from the video.

          pocupine White isn't really white, I assumed everyone knew that but perhaps not. Black text on a white background means that you've got massive amounts of FRC happening; it might as well be a beige or purple background at that point. All three subpixels are firing to make white. Also, "white" is a relative term, depending on gamma, color profiles, etc., so the assumption that there's no dithering with a white pixel is... 100% incorrect.

            Gurm White, as in the color #ffffff or rgb(255,255,255), should not require any dithering. And black, as in the color #000000 or rgb(0,0,0), should not require it either.

            Dithering occurs when a monitor can't display a certain color, or more precisely a combination of RGB values (which correspond to the amount of light allowed to pass through the RGB subpixels). There's no reason at all that any display using 2 bits per pixel or more would need to dither for the two black and white values I mentioned.

            I assumed everyone knew that but perhaps not.

            EDIT: Just to clarify, it is not possible to dither toward the highest or lowest values a pixel can have. If a monitor can't display rgb(255,255,255) without dithering, then it can't display it with dithering either. This holds completely only for the two extreme values rgb(255,255,255) and rgb(0,0,0).
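
            To illustrate, here's a minimal Python sketch of FRC-style temporal dithering (the frame count is arbitrary): an in-between level is faked by alternating the two nearest real levels across frames, while 0 and 255 map exactly and never alternate:

            ```python
            # FRC sketch: approximate a fractional level on an integer-only panel
            # by alternating floor/ceil so the time average approaches the target.
            def frc_frames(target: float, frames: int = 6) -> list[int]:
                lo = int(target)
                hi = min(lo + 1, 255)
                frac = target - lo          # how often the higher level must show
                out, err = [], 0.0
                for _ in range(frames):
                    err += frac
                    if err >= 0.5:          # simple error accumulation
                        out.append(hi)
                        err -= 1.0
                    else:
                        out.append(lo)
                return out

            print(frc_frames(127.5))  # [128, 127, 128, 127, 128, 127] -> flickers
            print(frc_frames(255.0))  # [255, 255, 255, 255, 255, 255] -> steady
            print(frc_frames(0.0))    # [0, 0, 0, 0, 0, 0]             -> steady
            ```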


              pocupine Let me clarify:

              1. You aren't ever going to get #ffffff; it might seem like you ought to be able to, but you aren't. Even if that's what you specify, it's not what you'll get. Color profiles futz with that. Now, it might be a reasonable supposition that if you SPECIFY #ffffff, the card shouldn't dither, right? But you'll find that in many cards there is simply no way to turn off the FRC. And FRC isn't selective... often it's applied as a final-stage filter (although not always), after all other modifications (including the color profile) have been applied.

              2. You would be correct that IF a card ever output #000000 or #FFFFFF, the monitor at that point should NOT dither it. But again - many do. Plasma TVs, as an example, can be told to display pure black or pure white, and will instead do a rolling color output. Partly this is power saving, partly it's to prevent image retention, partly it's to squeeeeeeeze more colors out of the device than it can actually produce.

              3. Many devices screw with the final output color, even if that color is white or black, based on the device's adjustments. Long gone are the days when you could say "display pure black" and have the monitor simply NOT LIGHT those pixels... and rely on the backlight for lighting adjustments.

              So while I agree with you in principle, I think that in practice the ability to say "it's pure white, it's DEFINITELY NEVER DITHERED" is limited.
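
              To make point 1 concrete, here's a minimal sketch (Python, invented numbers) of FRC running as the final stage, after a profile pass: the application asked for 255, but by the time FRC sees the value it's fractional, so it dithers:

              ```python
              # A stand-in colour-profile remap followed by a last-stage FRC pass.
              def profile_pass(v: float) -> float:
                  return v * 0.96                   # invented profile adjustment

              def frc_pass(v: float, frames: int = 4) -> list[int]:
                  lo, hi = int(v), int(v) + 1       # two nearest real panel levels
                  n_hi = round((v - lo) * frames)   # frames showing the higher level
                  return [hi] * n_hi + [lo] * (frames - n_hi)

              v = profile_pass(255.0)               # 244.8 -- "white" is no longer clean
              print(frc_pass(v))                    # [245, 245, 245, 244] -> dithered white
              ```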

                Gurm So when I view a white page (again, so you'll understand, white means #ffffff), my GPU doesn't output #ffffff to my monitor? Please provide sources that support this.

                  pocupine You don't need sources. Set your background to white, then go into the control panel for your video card and yank on a slider and watch the screen output change... this is 100% the case for every video card. There is a software or hardware layer doing various fuckeries with the output before it goes to the monitor. I agree with you that IF you really sent #FFFFFF to the monitor it SHOULD display it without dither (although the monitor has software that does other fuckeries). But good luck making that happen.

                  Additionally, WINDOWS has multiple abstraction layers before the output makes its way to the screen. Color profiles, color correction, gamma... all applied at the driver level, the card hardware level, the Windows API level, the screen composition level... then again at the input and output phases of the display unit.
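
                  As a sketch of the slider experiment (Python, with made-up slider math; real drivers build a similar 256-entry look-up table): once you nudge the contrast slider, the 255 you specified is no longer the 255 the monitor receives:

                  ```python
                  # Driver-style 1D LUT: contrast scales around mid-grey, brightness shifts.
                  def build_lut(brightness: float = 0.0, contrast: float = 1.0) -> list[int]:
                      lut = []
                      for v in range(256):
                          out = (v - 127.5) * contrast + 127.5 + brightness * 255
                          lut.append(max(0, min(255, round(out))))
                      return lut

                  neutral = build_lut()
                  dimmed = build_lut(contrast=0.9)   # user yanked the contrast slider

                  print(neutral[255], neutral[0])    # 255 0  -> untouched pipeline
                  print(dimmed[255], dimmed[0])      # 242 13 -> #ffffff never reaches the panel
                  ```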

                  Windows 10 added ANOTHER layer, called "composition spaces". The OS support for this began with the Anniversary Update, which is why many of us find builds 1607+ to be unusable.
