For what it's worth, my 'good' 970 is a Gigabyte G1 Gaming edition:
GPU: GM204-200-A1
BIOS reported as: 84.04.2F.00.80

EDIT: I'm running the latest Nvidia drivers without any hassles.

I think my MSI 970 is fine. It doesn't seem to be any worse than my Asus 650 Ti Boost. I'm using old drivers (368.81, though I'd like to try a pre-10xx release) and Windows 7.

Video Chipset Codename: GM204-200/D17U-20
Video BIOS Version: 84.04.84.00.29
Video Chipset Revision: A1

I bought this 970 very late into its manufacturing lifespan. They had already switched to Elpida memory, which always seems to happen after the reviews are done.

It does require the 2 power cables like the other MSI 970s.

Alright.

Now I think a good next step is to find a bad 970, dump its BIOS (use GPU-Z), and compare both the version and the actual files.

That would show whether the problem is in the BIOS or something deeper.
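
If anyone does that, something like this (a rough sketch, untested; the .rom filenames are just placeholders for the GPU-Z dumps) would show whether two dumps differ and roughly where:

    import hashlib

    def sha256(path):
        # Hash the whole ROM so identical dumps are obvious at a glance.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def diff_offsets(path_a, path_b, limit=20):
        # List the first byte offsets where the two BIOS images differ.
        a = open(path_a, "rb").read()
        b = open(path_b, "rb").read()
        if len(a) != len(b):
            print(f"size differs: {len(a)} vs {len(b)} bytes")
        offsets = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
        print(f"{len(offsets)} differing bytes")
        for off in offsets[:limit]:
            print(f"  0x{off:06X}: {a[off]:02X} -> {b[off]:02X}")

    print("good:", sha256("good970.rom"))
    print("bad: ", sha256("bad970.rom"))
    diff_offsets("good970.rom", "bad970.rom")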

Good luck

If someone gets eyestrain from reading black text on a white background, then that should hint that the problem is not dithering, right? Because any screen can show black and white without dithering.

JTL Does the pattern occur even on a white background? It's hard to tell from the video.

    pocupine White isn't really white, I assumed everyone knew that but perhaps not. Black text on a white background means that you've got massive amounts of FRC happening, might as well be a beige or purple background at that point. All three subpixels are firing to make white. Also "white" is a relative term, depending on gamma, color profiles, etc. so the assumption that there's no dithering with a white pixel is ... 100% incorrect.

      Gurm White, as in the color #ffffff or rgb(255,255,255), should not require any dithering. And black, as in the color #000000 or rgb(0,0,0), shouldn't require it either.

      Dithering occurs when a monitor can't display a certain color, or more precisely a combination of RGB values (which correspond to the amount of light allowed to pass through the RGB subpixels). There's no reason at all that any display that uses 2-bits per pixel or more would need to dither for those two black and white values that I mentioned.

      I assumed everyone knew that but perhaps not.

      EDIT: Just to clarify, it is not possible to dither toward the highest or lowest values a pixel can have. If a monitor can't display rgb(255,255,255) without dithering, then it can't do it with dithering either. This is only strictly true for the two extreme values, rgb(255,255,255) and rgb(0,0,0).
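
      A toy model of FRC (just my own sketch, not how any particular panel or driver actually implements it) shows the point: an in-between value gets approximated by flickering between the two nearest native levels, while 0 and 255 are already native levels with nothing beyond them to alternate with.

          import random

          def frc_average(target, n_frames=2000, bits=6):
              # Toy FRC model: a 6-bit panel natively shows 64 levels spread across
              # the same range as 8-bit (0..255). An in-between 8-bit value is faked
              # by alternating between the two nearest native levels over many frames.
              native = [round(i * 255 / (2**bits - 1)) for i in range(2**bits)]
              lo = max(v for v in native if v <= target)
              hi = min(v for v in native if v >= target)
              frac = 0 if hi == lo else (target - lo) / (hi - lo)
              frames = [hi if random.random() < frac else lo for _ in range(n_frames)]
              return sum(frames) / n_frames

          print(frc_average(128))  # ~128: the eye averages the flicker toward the target
          print(frc_average(255))  # exactly 255: already a native level, nothing to dither
          print(frc_average(0))    # exactly 0: same, and there is nothing below 0 to mix in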

      • Gurm replied to this.

        pocupine Let me clarify:

        1. You aren't ever going to get #ffffff, it might seem like you ought to be able to, but you aren't. Even if that's what you specify, it's not what you'll get. Color profiles futz with that. Now, it might be a reasonable supposition that if you SPECIFY #ffffff, then the card shouldn't dither, right? But you'll find in many cards there is simply no way to turn off the FRC. And FRC isn't selective... often it's applied as a final stage filter (although not always), after all other modifications (including color profile) have been applied.

        2. You would be correct that IF a card ever output #000000 or #FFFFFF, the monitor at that point should NOT dither it. But again - many do. Plasma TVs, as an example, can be told to display pure black or pure white, and will instead do a rolling color output. Partly this is power saving, partly it's to prevent image retention, partly it's to squeeeeeeeze more colors out of the device than it can actually produce.

        3. Many devices screw with the final output color, even if that color is white or black, based on the adjustments of the device. Long gone are the days when you can say "display pure black" and have the monitor simply NOT LIGHT those pixels... and rely on the backlight for lighting adjustments.

        So while I agree with you in principle, I think that in practice the ability to say "it's pure white, it's DEFINITELY NEVER DITHERED" is limited.
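
        To illustrate point 1 with a toy sketch (my own numbers, invented for the example; real drivers do this in hardware LUTs, not Python): an adjustment applied before the FRC stage can turn a requested 255 into a fractional value, and that fractional value is what the dithering stage then has to approximate.

            import random

            BRIGHTNESS = 0.97   # pretend a driver slider / profile pulls the output down a bit

            def adjust_stage(value):
                # Profile and slider adjustments run before dithering and usually land
                # between integer levels, even when the application asked for 255.
                return value * BRIGHTNESS

            def frc_stage(value):
                # A final-stage FRC filter doesn't know you wanted pure white; it just
                # alternates between the two nearest integer levels from frame to frame.
                lo = int(value)
                hi = min(lo + 1, 255)
                frac = value - lo
                return hi if random.random() < frac else lo

            adjusted = adjust_stage(255)                      # 255 -> 247.35
            print([frc_stage(adjusted) for _ in range(12)])   # a flickering mix of 247s and 248s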

          Gurm So when I view a white page (again, so you'll understand, white means #ffffff), my GPU doesn't output #ffffff to my monitor? Please provide sources that support this.

            pocupine You don't need sources. Set your background to white, then go into the control panel for your video card and yank on a slider and watch the screen output change... this is 100% the case for every video card. There is a software or hardware layer doing various fuckeries with the output before it goes to the monitor. I agree with you that IF you really sent #FFFFFF to the monitor it SHOULD display it without dither (although the monitor has software that does other fuckeries). But good luck making that happen.

            Additionally, WINDOWS has multiple abstraction layers before the output makes its way to the screen. Color profiles, color correction, gamma... all applied at the driver level, the card hardware level, the Windows API level, the screen composition level... and then again at the input and output stages of the display itself.

            Windows 10 added ANOTHER layer, called "composition spaces". The OS support for this began with the Anniversary Update, which is why many of us find builds 1607+ to be unusable.

              Do you think Windows 10 introduced dithering and that's why a lot of people are having trouble with it, even on monitors that are using 8-bits per pixel?

              Do you have any suggestion on how I can test if my eyestrain is due to dithering?
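
              For example, would differencing two consecutive captures of a completely static image be a valid test? Something like this (just a sketch, with placeholder filenames; it assumes the frames come from a capture device or a locked-off camera, since an ordinary screenshot is presumably grabbed before the output stage where the dithering would happen):

                  import numpy as np
                  from PIL import Image

                  # Two consecutive frames of a completely static image, captured outside
                  # the PC (capture card or locked-off camera); filenames are placeholders.
                  a = np.asarray(Image.open("frame1.png").convert("RGB"), dtype=np.int16)
                  b = np.asarray(Image.open("frame2.png").convert("RGB"), dtype=np.int16)

                  diff = np.abs(a - b)
                  changed = (diff > 0).any(axis=2)
                  print(f"{changed.mean() * 100:.2f}% of pixels changed between frames")
                  print("largest per-channel change:", int(diff.max()))
                  # A static, undithered source should give close to 0% changed pixels;
                  # temporal dithering tends to show up as widespread small +/-1 flips
                  # (a camera adds sensor noise, so a capture card is the cleaner test).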

              4 days later

              Why not try contacting NVIDIA through official channels and ask them about dithering on 9xx/10xx cards? DevTalk and other forums aren't going to get us anywhere, and @si_edgey put the Temporal Dithering discussion on the NVIDIA forum in the wrong category, for starters.

              https://nvidia.custhelp.com/app/ask

              If enough people contact them, maybe they will notice.

              I have been speaking to Nvidia and was escalated to a member of the driver team. He has not promised anything but has submitted a feature enhancement request to try and have dithering options included in future drivers. I have written back to try to get clarification of what has changed in the 10xx series hardware (and also later 9xx cards potentially) so will report back.

              He was actually very compassionate about the issue and (as expected) had never heard of this being an issue before. I would recommend that everyone get in touch with Nvidia to start making a noise about it so they are more likely to listen!

                Enclosed is a response from NVIDIA support

                Hi,
                My name is Derek and your support request was escalated to me for review. I have read through the history and wanted to let you know that this feedback is something that I have passed on to our driver development teams.

                Currently there aren't any options or controls that I am aware of to let you turn it off.

                I was going to suggest you inquire on the developers forums but it looks like you are already aware of them. That would be the best place to ask about the possibility of being able to disable it.

                Derek-

                • JTL replied to this.

                  JTL I have attached my response

                  Hi Derek

                  The problem with the developer forums as others have noted is that NVIDIA employees don't really read the forum, let alone the ones responsible for driver and/or VBIOS development.

                  It would be interesting to see whether temporal dithering is enabled in hardware on the later 900- and 1000-series cards. I am running Linux, which does allow you to disable dithering in the driver, yet with a high-speed camera I still notice what appears to be dithering on a BenQ 8-bit VA panel monitor over DVI (tested with an EVGA GTX 1070), and as others have noted it can cause eyestrain and headaches that are absent without the dithering.

                  I appreciate your help though.

                  Here is his response

                  Sorry I couldn't be more help. Passing on this sort of user feedback is all I can really do.

                  Derek-

                  10 days later

                  Hi all,
                  This is a fascinating thread! I came across it after googling for some cryptic Nvidia Xorg.conf parameters.

                  I have been struggling with Nvidia's Linux drivers for quite some time (8 months or so). My main issue is that I could never get graphics looking "right" under Linux. By right, I mean the same as under Windows 10. I could never put my finger on it as to what was wrong but the text just didn't look the same. This caused me a lot of eyestrain and I could never use my computer under Linux as I can under Windows. Even when I installed Windows fonts, the text rendering was not good and I spent a lot of time messing with fontconfig etc. But fontconfig was definitely not the issue. I did figure out what was wrong but I still haven't been able to fix it.

                  I'll cut right to the chase: open this test PNG file in your default image viewer (or a web browser), make sure it's set to 100% (no scaling) and look at the bottom two lines. If they're crisp and readable, your Nvidia driver is performing fine. If the two lines are fuzzy, it could be that you don't have a display that supports 4:4:4 chroma mode or it could be that you're running Nvidia under Linux.

                  Test: http://i.rtings.com/images/test-materials/2017/chroma-444.png

                  Under Windows 10, that test pattern looks absolutely PERFECT on my screen. Text is sharp, and the bottom two lines are perfectly readable and look just right. Under Linux, the test pattern looks like 4:2:2 chroma subsampling:

                  http://i.rtings.com/images/reviews/ju7100/ju7100-text-chroma-4k-30hz-large.jpg
                  Pictures of problems: http://www.rtings.com/tv/learn/chroma-subsampling
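
                  To see what I mean, here is a rough way to fake 4:2:2 on the test pattern yourself and compare it with what your screen actually shows (a quick sketch only: it assumes Pillow and NumPy, an even image width, and that you've saved the test PNG locally; a real TV's subsampling and upscaling are more involved):

                      import numpy as np
                      from PIL import Image

                      img = Image.open("chroma-444.png").convert("YCbCr")
                      y, cb, cr = [np.asarray(c, dtype=np.float32) for c in img.split()]

                      def subsample_422(chan):
                          # 4:2:2 keeps full luma resolution but halves the horizontal
                          # chroma resolution: average each pair of samples, then repeat.
                          paired = chan.reshape(chan.shape[0], -1, 2).mean(axis=2)
                          return np.repeat(paired, 2, axis=1)

                      bands = [y, subsample_422(cb), subsample_422(cr)]
                      out = Image.merge("YCbCr", [Image.fromarray(b.astype(np.uint8)) for b in bands])
                      out.convert("RGB").save("chroma-422-simulated.png")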

                  I have a 4K Samsung HDR display and it's connected to my Nvidia GTX 1060 over an HDMI 2.0 cable. I have tried all kinds of Xorg.conf parameters and I just can't get it to display colors the same way that Windows 10 does. I've tried different cables (certified 4K UHD cables) but to no avail. The issue is not hardware but software.

                  I've tried disabling dithering and enabling Force Composition Pipeline, but that does not help. I'm at a loss as to what could be wrong, and I suspect only Nvidia can fix this.

                  Has anyone here experienced this issue?
                  Thanks.

                  • JTL replied to this.
                  • JTL likes this.