I'm currently investigating whether the OS or motherboard drivers have any bearing on this. They SHOULDN'T, but since the cables and monitors are the same...

KM My thoughts are we must buy and test current cards to see if they work.

Unfortunately, at some point someone has to buy expensive equipment and see if it works.
The best way would be to have the companies and people who review hardware for a living check whether a given card is bad or not.

Problem is, we've yet to determine what reliably causes the issue. PWM and dithering are the big candidates, among others. But then what causes the dithering? The motherboard? The graphics unit? The monitor? Some combination of the above?

There have been some ideas already that haven't really panned out, so I'm interested in hearing any ideas that I could set up / host.

In any case, the GeForce GTX 750 Ti works well for me (dithering disabled under Linux) with an 8-bit monitor 🙂
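
For anyone who wants to reproduce the "dithering disabled under Linux" part: with the open-source nouveau driver, dithering is exposed as xrandr output properties. A minimal sketch, assuming the monitor is on an output named DVI-I-1 (the name is just an example; check yours with xrandr --prop):

    # list outputs and their properties, including "dithering mode" / "dithering depth"
    xrandr --prop

    # force dithering off on the output driving the monitor
    xrandr --output DVI-I-1 --set "dithering mode" "off"

On the proprietary NVIDIA driver the equivalent knob lives in nvidia-settings instead.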

    a month later

    I recently bought an MSI GeForce GT 710 and use it with Windows 10. It seems usable so far. I wonder if one of those forced upgrades is going to break anything. Probably just a matter of time.

    KM, no it turns out that you just have to buy one of the NICE ones. Zotac and EVGA were both dreadful on my eyes, but MSI is just fine! (It also lights up red and has wicked-looking heatpipes...)

    I don't have any of these cards, but I can comment on a feature that might be worth considering.

    For HTPC use, the 960 is the only one of the 950, 960, 970 and 980 that has both HDMI 2.0 and hardware H.265 (HEVC) encoding and decoding.

    10 days later

    This is interesting reading. I recently built a home computer with an 8-bit ViewSonic VA2855SMH monitor and an EVGA GTX 560 1GB card. This computer causes me eyestrain, whereas the same setup with a crappy old Asus GT 610 card doesn't.

    @Gurm - did you adjust to the EVGA card in the end? I would love to test an MSI GTX 560, for example, but the problem is that testing 'just to see' can leave me with a migraine for several days! That's what stops me experimenting too much; I just can't handle the migraines any more.

    I think I'll pick up an MSI 750 Ti to see if that changes things...

    I went from a GTX 650 -> 970 hooked up to my plasma TV. At first I thought the 970 looked a little "noisier", but since I haven't had any symptoms, I've chalked that up to me looking for a change when I switched cards.

    I got a reminder recently of how vital the GPU is in my eye-friendly chain when I hooked up a 3rd-gen Apple TV to my plasma TV and started getting all the classic symptoms within 30s. With my 970, viewing time on that TV, which is my most eye-friendly screen, is limited only by the natural fatiguing of my eyes over 4-5 hours. (More and more it looks like dithering is more problematic for me than PWM. My iPhone 6s, which has no PWM, becomes problematic within 15-30s, which is much worse than many PWM screens I use briefly out of necessity.)

    Thanks for the info @degen - out of interest, what brands were your 650 and 970? And yes, it sounds like temporal dithering is at least part of your problem - it is my entire problem!

    @Slacor - this forum is really great, a brilliant way of communicating. Quick idea - would it be possible to have some custom info from each user below their posts? This could include what they believe their problem to be (blue light / PWM / temporal dithering sensitivity), their current working setup, and maybe even a link to a post detailing their solution so far. The reason is that a monitor solution for someone with PWM sensitivity would be different to the solution for someone with TD sensitivity, so it would be easier to focus on the solutions that apply to you.

      si_edgey would it be possible to have some custom info from each user below their posts?

      If you hover over a username, it displays info about that user. You can "write something about yourself" there by going to your Profile (Username -> Profile).

        Slacor Very useful feature. I just added my usable settings to my profile.

        9 days later

        I am starting to think that the MSI GeForce 970 has a different architecture from the EVGA and Zotac ones. Both of those were cheaper cards in spite of being clocked higher. Did NVIDIA do a mid-generation change on the Maxwell chips to release higher-clock-capable chips with less cooling? Some of the Maxwells feel as bad to me as the Pascals (10x0) in this regard.

        I can order an MSI, but would love to figure this out.

        My new GeForce GT 710 is not as good as I initially thought. Looking at the screen for longer periods didn't feel as good as with my old Quadro card. I blamed Windows updates, and for a few days I tried Linux, which has its own still-unsolved eyestrain issues. But even after installing Windows 7 SP1 my eyes didn't relax. So I put my Quadro card back in - much better instantly.

        Even Quadro cards aren't 100% immune to this problem. I run a Quadro K420 at work, and when I updated to the Windows 10 Anniversary Update my machine instantly became harder to look at. I downgraded immediately. There are so many ways for the drivers to utterly f*$# with our eyes. 🙁

        I also have a Quadro K4200 at work, and together with a Dell U2412M it was a mess. Not as bad as Intel integrated graphics, but not good enough to work all day with.

        Currently I'm testing this setup with a Dell U2415 monitor. It seems to be better.

        What a nightmare. So many variables, so many interactions, and we know so little. This is going to take years and years to sort out.

        a year later

        I had an idea recently for a possible method to detect "good" GTX 970s vs. bad GTX 970s, and possibly convert the bad ones into good ones.

        I mentioned it in an email to @Gurm, but I thought I would share it here anyway:

        I also had another idea regarding why some 970s might be "good" and others "bad" in terms of dithering, plus a way to test it and possibly turn bad ones into good ones.

        Since the dithering seems to happen no matter what driver or OS is used, I'm inclined to think it's a VBIOS thing, and that something got changed in newer VBIOS versions that affects more recently manufactured cards.

        What would be a REALLY GOOD way to test is a system with one "known good" card and one "known bad" card, where you compare the VBIOS versions; if they differ, that could be the smoking gun. But we can go further. The nouveau driver under Linux lets you load a custom VBIOS for the GPU to run. If it manages to load that VBIOS into GPU RAM, it should take over the card's functions, and if the card then works without dithering, you know it's the VBIOS. Even if that doesn't work, a "hotswap" VBIOS might not be enough, in which case you can flash the GPU under DOS. That carries the risk of bricking the card - though not the good one, because you're only overwriting the VBIOS on the bad card.
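
        A rough sketch of how that could look in practice, assuming the "known good" VBIOS has been dumped to a file called good_card.rom (the filename is mine, and I haven't tried this on a 970): nouveau can read a VBIOS image from /lib/firmware through its NvBios config option, and the permanent flash would go through NVIDIA's nvflash tool.

            # hotswap test: have nouveau run the good VBIOS instead of the card's own ROM
            cp good_card.rom /lib/firmware/
            # then add this to the kernel command line and reboot:
            #   nouveau.config=NvBios=good_card.rom

            # permanent flash (risky - this is the step that can brick the bad card)
            nvflash --save backup.rom    # back up the current VBIOS first
            nvflash -6 good_card.rom     # -6 overrides the PCI subsystem ID mismatch check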

        Another way to test: I know some people run their gaming OS in a virtual machine (think VMware, KVM), and these hypervisors with PCIe passthrough might be able to do a custom VBIOS hotswap as well.

        This is a good read:
        https://wiki.archlinux.org/index.php/PCI_passthrough_via_OVMF#UEFI_.28OVMF.29_Compatability_in_VBIOS

        An updated VBIOS can be used in the VM without flashing. To load it in QEMU:
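
        The actual command didn't survive the quote, but if I'm reading that wiki page right it comes down to pointing QEMU's vfio-pci device at the ROM file (the PCI address 01:00.0 and the path are placeholders for your own):

            # pass the GPU through with a custom VBIOS image instead of the on-card ROM
            qemu-system-x86_64 ... -device vfio-pci,host=01:00.0,romfile=/usr/share/vbios/patched.rom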


        And does anyone actually know whether Windows 10 is doing dithering of its own, or something else? I remember back circa fall 2015, running Windows 10 (1507) on this laptop, getting strain with Chrome and not knowing why; it turned out to be the blurry font rendering caused by DirectWrite. This is also why I want to get a lossless video capture card, to capture some empirical data.
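
        For what it's worth, here's the kind of empirical test I have in mind once a capture card is in the chain, sketched with ffmpeg and ImageMagick (the V4L2 device path is an assumption; any lossless capture chain would do): display a completely static image on the suspect machine, grab two consecutive frames, and diff them.

            # grab two consecutive frames from the capture device
            ffmpeg -f v4l2 -i /dev/video0 -frames:v 2 frame%d.png

            # count differing pixels (AE = absolute error); on a static source a
            # nonzero count means the signal is changing frame to frame, i.e.
            # temporal dithering (or noise somewhere in the capture chain)
            compare -metric AE frame1.png frame2.png diff.png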
