degen Unfortunately I can't test the basic display driver, as it won't recognise my capture card; I might see if there is some way around it. It does feel somewhat different to the Intel driver.

EDIT: I was able to test the Windows basic driver on a UHD 630 (i7 8700K): no dithering. Also no dithering on that GPU in normal use.

Edward I've seen a few posts like yours recently, and I feel pretty much the same as I did back in 2012 when this started for me: frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Either way, what should we be doing?

Thanks to this site and others, I have learned more about displays and graphics in the last 5 years than ever before. I share your frustrations, however. Everything in my gut tells me that temporal dithering is the issue for most people here. PWM is very easy to rule out (simply don't use a screen that uses PWM!), but then it gets more complicated with pixel inversion (and at this point my knowledge runs out).

If I have a good device and a bad device using the same HDMI cable and the same monitor, then the techie in me says "OK, it's something to do with the hardware/software in this machine" (the monitor has been ruled out as the cause). So at this point we're looking at the 'point of failure' being:

• VBIOS (mobo/GPU), which may default to temporal dithering; this would explain why even POST screens cause discomfort on new tech
• The OS itself (W10 composition spaces)
• OS drivers (which could be set to temporal dither)

• VBIOS is locked down and generally inaccessible to changes, AFAIK
• OS: maybe a regedit or fix could disable any OS tomfoolery
• OS drivers: Ditherig / NVIDIA registry keys, which aren't 100% reliable

Reading into the patents concerning temporal dithering has been somewhat revelatory to me. It does cause flicker, regardless of how minuscule it's claimed to be or how well it "fools the eyes" into seeing the extra colors. It is snake oil for supposed '10-bit' displays which are in fact 8-bit, or a way to guarantee uniformity of color on any panel. The latter makes sense from a customer-satisfaction perspective, but flicker is flicker.
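
To make the flicker point concrete, here is a toy sketch (Python, purely illustrative, not any vendor's actual algorithm) of how FRC fakes an 8-bit level on a 6-bit panel by alternating between the two nearest levels, i.e. by flickering:

    # Fake 8-bit value 130 on a 6-bit panel (levels 0-63)
    target = 130
    lo, hi = target // 4, target // 4 + 1   # nearest 6-bit levels
    frac = (target % 4) / 4                 # fraction of frames shown at 'hi'

    err, frames = 0.0, []
    for _ in range(8):                      # simple error-accumulation schedule
        err += frac
        frames.append(hi if err >= 1 else lo)
        if err >= 1:
            err -= 1

    print(frames)                          # [32, 33, 32, 33, ...] - the flicker
    print(sum(frames) / len(frames) * 4)   # ~130.0: the average matches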

    diop So for a solution we need to engage perhaps:

    • display manufacturers
    • VBIOS writers
    • OS writers
    • OS driver writers (does that mean OEMs for integrated/discrete GPUs, and basically Linux guys?)
    • I/O standards groups (for HDMI, DP, VGA, DVI, etc.)

    That seems like a far more complex approach than what has been achieved so far. PWM was solved by display tech alone; the same goes for blue light.

    Things get more complex when we add that different sufferers are sensitive to different technological issues, and possibly different combinations of issues.

    I really want to know what people think the best next steps are.

    diop Reading into the patents concerning temporal dithering has been somewhat revelatory to me.

    On a related note, I have figured out that I can test whether temporal dithering creates a flicker effect by seeing if the dithering pattern repeats, and how many frames it takes to repeat. I will have a go at it when my workload is a bit lower.
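
    Roughly what I have in mind (a Python sketch, untested, assuming the capture has been saved as a numbered PNG sequence of a static test image, so any frame-to-frame change should come from dithering):

        import numpy as np
        import imageio.v3 as iio   # pip install imageio

        # Losslessly captured frames of a static test image
        frames = [iio.imread(f"frame_{i:04d}.png").astype(np.int16) for i in range(60)]

        for k in range(1, len(frames)):
            diff = np.abs(frames[k] - frames[0])
            print(f"frame {k}: {np.count_nonzero(diff)} pixels differ (max delta {diff.max()})")
            if not diff.any():   # identical to frame 0 again: the pattern has cycled
                print(f"dither pattern repeats with a period of {k} frames")
                break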

    3 years later

    Seagull I think I've probably adapted to dithering patterns produced by different combinations of monitor/card. Hence, any changes produce discomfort as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.

    Hope it's OK to bump this old thread, but I'm beginning to suspect this could be the issue for some of us. Do you still think this theory holds, @Seagull?

      ryans

      Yes, for many years it has been key to how I manage my symptoms. Above a certain intensity of eyestrain/migraines I will abandon a device, because it is unlikely I will be able to adapt to it, but with most devices I will try to get used to them. I've never been able to adapt to an IPS LCD device, but flicker-free TN LCDs and OLEDs now take me only a few weeks or months to become usable.

      It has been harder than it sounds, though. A lot of my eye strain symptoms are linked to diet; I have a lot of food intolerances which trigger eyestrain and migraines. It's only since finding those intolerances and managing them that adapting to bad devices has become an option. I need to keep my body, and therefore my brain, as healthy as possible to be able to adapt to new devices.

        a year later

        Following @aiaf's project in the thread "I disabled dithering on Apple silicon + Introducing Stillcolor macOS M1/M2/M3" (which I believe was partially inspired by the information in this thread), I purchased a Blackmagic Design UltraStudio Recorder 3G and downloaded ffmpeg to perform my own testing for temporal dithering and try to contribute some more data to the community here. However, a few things are unclear, and I want to ensure I am collecting valid data.
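
        For reference, the capture step looked roughly like this (a sketch: it assumes an ffmpeg build compiled with DeckLink support, which the standard downloads usually lack, and the device name is whatever the first command reports):

            # List the DeckLink capture devices that ffmpeg recognises
            ffmpeg -f decklink -list_devices 1 -i dummy

            # Capture a few seconds losslessly (FFV1) for frame-by-frame analysis
            ffmpeg -f decklink -i "UltraStudio Recorder 3G" -t 5 -c:v ffv1 capture.mkv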

        First, I observed that several GPUs (specifically NVIDIA) on both macOS and Windows exhibit temporal dithering when I use a DisplayPort-to-HDMI cable, but do not when using a straight HDMI cable from the GPU's HDMI output port. I tried an older HDMI cable as well as an "Ultra High Speed" cable, but it seemed to make no difference. I read in another thread that older HDMI cables may effectively reduce color depth via their limited bandwidth, but otherwise I was unsure whether this is the expected outcome.

        Second, I am beginning to question whether the GPU/OS will send the same kind of output to the capture card as to a normal display. After inspecting the EDID of the capture card, it appears to advertise itself as a "non-RGB color display" taking YCbCr input. On macOS at least, I suspect the capture card was being detected as a television (if I am interpreting the output of AllRez correctly). On Windows, I noticed that when both a normal display and the capture card were connected to my NVIDIA RTX 3070, dithering was evident on the capture card side, and ColorControl (with automatic settings) appeared to indicate that dithering was active only on the capture card and not on the normal display.

        I don't know of a way to determine dithering status in an equivalent way on macOS.  Basically, I am concerned that we may be comparing apples to oranges with this method.

        Lastly, some of the results simply do not make sense. A capture from my AMD Radeon Pro WX 5100 (on both macOS High Sierra and Monterey via OpenCore) exhibited no temporal dithering, while the picture on normal monitors seemed (subjectively) very unstable, and I strongly suspect temporal dithering was in effect. On the flip side, my original NVIDIA GT 120 exhibits (again, subjectively) the most stable picture of all the GPUs I tested on a normal monitor, but its capture output seemed to show the most significant temporal dithering of all samples. At this point, I am starting to wonder if slower/deeper temporal dithering may be easier on the eyes than the barely-perceptible kind, because our eyes and brains can actually make sense of it.

        Any thoughts would be appreciated.  Here is the parsed EDID data for the Blackmagic capture card:

        Block 0, Base EDID:
          EDID Structure Version & Revision: 1.3
          Vendor & Product Identification:
            Manufacturer: BMD
            Model: 0
            Serial Number: 1 (0x00000001)
            Made in: week 52 of 2012
          Basic Display Parameters & Features:
            Digital display
            Maximum image size: 71 cm x 40 cm
            Gamma: 2.50
            Non-RGB color display
            First detailed timing is the preferred timing
          Color Characteristics:
            Red  : 0.6396, 0.3447
            Green: 0.2910, 0.6347
            Blue : 0.1630, 0.0927
            White: 0.2880, 0.2958
          Established Timings I & II:
            DMT 0x04:   640x480    59.940476 Hz   4:3     31.469 kHz     25.175000 MHz
          Standard Timings: none
          Detailed Timing Descriptors:
            DTD 1:  1920x1080   60.000000 Hz  16:9     67.500 kHz    148.500000 MHz (800 mm x 450 mm)
                         Hfront   88 Hsync  44 Hback  148 Hpol P
                         Vfront    4 Vsync   5 Vback   36 Vpol P
            DTD 2:  1920x1080   50.000000 Hz  16:9     56.250 kHz    148.500000 MHz (800 mm x 450 mm)
                         Hfront  528 Hsync  44 Hback  148 Hpol P
                         Vfront    4 Vsync   5 Vback   36 Vpol P
            Display Product Name: 'BMD HDMI'
            Display Range Limits:
              Monitor ranges (GTF): 50-60 Hz V, 15-45 kHz H, max dotclock 80 MHz
          Extension blocks: 1
        Checksum: 0x01
        
        ----------------
        
        Block 1, CTA-861 Extension Block:
          Revision: 3
          Basic audio support
          Supports YCbCr 4:4:4
          Supports YCbCr 4:2:2
          Native detailed modes: 2
          Video Data Block:
            VIC   5:  1920x1080i  60.000000 Hz  16:9     33.750 kHz     74.250000 MHz (native)
            VIC   4:  1280x720    60.000000 Hz  16:9     45.000 kHz     74.250000 MHz (native)
            VIC   6:  1440x480i   59.940060 Hz   4:3     15.734 kHz     27.000000 MHz (native)
            VIC  20:  1920x1080i  50.000000 Hz  16:9     28.125 kHz     74.250000 MHz (native)
            VIC  19:  1280x720    50.000000 Hz  16:9     37.500 kHz     74.250000 MHz (native)
            VIC  21:  1440x576i   50.000000 Hz   4:3     15.625 kHz     27.000000 MHz (native)
            VIC  32:  1920x1080   24.000000 Hz  16:9     27.000 kHz     74.250000 MHz (native)
            VIC  33:  1920x1080   25.000000 Hz  16:9     28.125 kHz     74.250000 MHz (native)
            VIC  34:  1920x1080   30.000000 Hz  16:9     33.750 kHz     74.250000 MHz (native)
            VIC  31:  1920x1080   50.000000 Hz  16:9     56.250 kHz    148.500000 MHz (native)
            VIC  16:  1920x1080   60.000000 Hz  16:9     67.500 kHz    148.500000 MHz (native)
            VIC   3:   720x480    59.940060 Hz  16:9     31.469 kHz     27.000000 MHz
            VIC  18:   720x576    50.000000 Hz  16:9     31.250 kHz     27.000000 MHz
            VIC   1:   640x480    59.940476 Hz   4:3     31.469 kHz     25.175000 MHz
          Audio Data Block:
            Linear PCM:
              Max channels: 8
              Supported sample rates (kHz): 48 44.1 32
              Supported sample sizes (bits): 24
          Speaker Allocation Data Block:
            FL/FR - Front Left/Right
            LFE1 - Low Frequency Effects 1
            FC - Front Center
            BL/BR - Back Left/Right
            FLc/FRc - Front Left/Right of Center
          Vendor-Specific Data Block (HDMI), OUI 00-0C-03:
            Source physical address: 0.0.0.0
          Vendor-Specific Data Block, OUI 00-00-00:
          Colorimetry Data Block:
            BT2020YCC
            BT2020RGB
          HDR Static Metadata Data Block:
            Electro optical transfer functions:
              Traditional gamma - SDR luminance range
              SMPTE ST2084
              Hybrid Log-Gamma
            Supported static metadata descriptors:
              Static metadata type 1
        Checksum: 0x3a  Unused space in Extension Block: 84 bytes
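
        For anyone wanting to parse their own device: the dump above is the output format of the edid-decode utility, so given the raw EDID bytes saved to a file (the bmd_edid.bin name below is just an example), it can be reproduced with:

            # edid-decode accepts raw binary or hex EDID dumps on stdin
            edid-decode < bmd_edid.bin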

          macsforme It will definitely NOT send the same signal to the capture card as to the monitor unless you spoof the EDID. You can use BetterDisplay to download the monitor's EDID. I don't have a capture card myself, but it should be possible to spoof the EDID.

            macsforme I might have more to add later, but one reason I went for the DVI2PCIe is you can program an arbitrary EDID binary to the card, and it remains persistent.

            AFAIK that isn't (easily) possible for the Blackmagic cards.

              async but it should be possible to spoof the EDID.

              Depending on what "layer" the spoofing is done at, I wouldn't be surprised if that has an effect on the results.

              Safest is a capture card that has its own non-volatile storage for programming an arbitrary EDID binary independent of the host system. From the perspective of the target device, it just sees an arbitrary monitor connected.

              macsforme

              Very interesting work!

              I would be careful about making assessments about dithering by eye, though. An LCD monitor may have its own native dithering, and will have LCD inversion artefacts occurring alongside any dithering from the GPU. Then there is your brain's processing of the visual signals from your eyes: you could be more or less sensitive to some dithering algorithms, and if you are used to a particular dithering algorithm from a GPU you used often, you may have been desensitised to it. Combining these factors makes it difficult to make a reliable assessment by eye, and may be why we as a community can't agree on what is a good/safe setup.

              The next step could be to create a system to test image stability on the monitor itself. Many people, myself included, have used sensors to measure a single predominant frequency of monitor flicker. If this could be expanded to measuring a broad band of flicker frequencies, it could be used to determine whether the monitor's image has become more or less stable, without human subjectivity.

                Seagull if there were a capture device similar to a monitor calibration device, it would be fairly easy to create a workflow that tests different things.

                A bit more tricky, but in theory it would be possible to create an iPhone app that can do it. It's kind of strange that no one has created an iPhone app for decent monitor calibration either. It would probably need to run the test patterns at a few different frame rates to make sure it captures everything, though.

                macsforme

                So I am in the same situation and have the same BMD capture device. I can only capture dithering from my RTX laptop when ColorControl dithering is set to "Auto: 8-bit Temporal", and never when ColorControl dithering is set to disabled. I am of the opinion that I need to find something like the DVI2PCIe, or some EDID device that goes before the BMD capture device in the signal chain; otherwise I'm comparing apples to oranges, as you said. I am also using a higher-bandwidth HDMI-to-HDMI cable that in theory should allow dithering to occur.

                I posted some EDID devices here, though I'm not sure if they are useful:
                https://ledstrain.org/d/2589-products-to-try-or-avoid-pwm-flicker-and-temporal-dithering-testing/185

                https://www.black-box.de/en-de/page/25843/Resources/Technical-Resources/Black-Box-Explains/multimedia/EDID-Emulation

                Do you still see dithering from your video analysis when you disable dithering inside color control?

                  async It will definitely NOT send the same signal to the capture card as to the monitor unless you spoof the EDID. You can use BetterDisplay to download the monitor's EDID. I don't have a capture card myself, but it should be possible to spoof the EDID.

                  This suggestion guided me in the right direction. I was able to spoof the EDID of one of my real monitors (with a macOS display EDID override file) onto the BMD capture device, and it was still able to work and capture video. What I ultimately discovered was that, regardless of EDID spoofing, my AMD Radeon Pro WX 5100 will only output a signal with temporal dithering if the capture device is connected at boot time. Disconnecting and reconnecting the capture device causes the output to no longer have temporal dithering until the next reboot… strange! Regardless, this resolves the main discrepancy I encountered, where the WX 5100 GPU did not seem to have temporal dithering while the visual experience suggested otherwise. The fact that temporal dithering was detected only via DisplayPort and not via HDMI on all GPUs I tried (with the exception of my 2015 MacBook Pro 15" with an AMD dGPU, which exhibited it on both) is another interesting discrepancy which may yield helpful insights.
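
                  In case it helps anyone else, the override mechanism is a property list placed at a path like /Library/Displays/Contents/Resources/Overrides/DisplayVendorID-<vid>/DisplayProductID-<pid>, where the hex vendor/product IDs must match what the capture device reports (and newer macOS versions may require relaxing SIP to install it). A minimal sketch, with the donor monitor's EDID elided:

                      <?xml version="1.0" encoding="UTF-8"?>
                      <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
                      <plist version="1.0">
                      <dict>
                          <!-- Name macOS will show for the overridden display -->
                          <key>DisplayProductName</key>
                          <string>BMD EDID Override</string>
                          <!-- Base64 of the donor monitor's raw EDID bytes -->
                          <key>IODisplayEDID</key>
                          <data>AP///////wA…</data>
                      </dict>
                      </plist>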

                  JTL I might have more to add later, but one reason I went for the DVI2PCIe is you can program an arbitrary EDID binary to the card, and it remains persistent.

                  If you mean the $1,400 DVI2PCIe Duo… ouch. 🙁 The Blackmagic device was $70 (used) on eBay, plus the cost of the Thunderbolt cable.

                  Seagull I would be careful about making assessments about dithering by eye though.

                  A valid point, and I completely agree. Correct me if I'm wrong, but my impression was that empirically measuring temporal dithering from a physical LCD panel requires sophisticated cameras/tools which are out of reach for most people. As far as visual assessment goes, I try to limit my reliance on it to getting a general sense of image stability, and have that form the basis for further empirical investigation.

                  photon78s Do you still see dithering from your video analysis when you disable dithering inside color control?

                  I never got that far, as it looked like I needed to reboot my workstation to make ColorControl work fully, and I was unable to do so at the time. However, I expect I would have reached the same results that you did. My main concern for that workstation was whether dithering is off on the normal display (which is what ColorControl seemed to indicate), and I had secondary concerns that the BMD capture device might not be getting the same output that a standard monitor would (which now seem mostly resolved, at least on macOS).

                  @DisplaysShouldNotBeTVs, are you willing to share your method for disabling temporal dithering on the AMD GPU on the MacBookPro11,5, as you've mentioned elsewhere? Now that I have a means of reliably measuring temporal dithering, I want to start trying to disable it, leveraging all possible methods.

                    macsforme spoof the EDID

                    do you have the EDID of a "safe" and an "unsafe" monitor? With CRU you can save the settings to a file, for example.

                      simplex do you have the EDID of a "safe" and an "unsafe" monitor? With CRU you can save the settings to a file, for example.

                      I have two displays (different models) connected to this Mac Pro, both of which I consider "safe", although they are 6-bit+FRC and so not ideal. Both generally seem to exhibit temporal dithering roughly equally, the degree of which varies based on the GPU currently installed. I plan to post my findings soon, but so far all GPUs exhibited temporal dithering detectable with the ffmpeg time-blend method from the post I linked above. Some outputs were (subjectively) more or less comfortable than others.
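
                      For anyone who hasn't seen that post, the time-blend step is an ffmpeg filter pass along these lines (a sketch; the exact parameters in the original post may differ):

                          # On a lossless capture of a static image, blend each frame with
                          # the previous one in 'difference' mode and amplify the result;
                          # any non-black pixels mean the frame changed, i.e. dithering
                          ffmpeg -i capture.mkv -vf "tblend=all_mode=difference,lutyuv=y=val*16" -c:v ffv1 dither_visualized.mkv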

                        macsforme empirically measuring temporal dithering from a physical LCD panel requires sophisticated cameras/tools

                        I've seen people do it with high-frame-rate phone cameras, but as I said before, I think the best way to do it would be a cheap optical sensor attached to a microphone jack, oscilloscope software to record the sensor's input, and a Fourier transform to show all the different flicker frequencies present.
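
                        The analysis side could be a short script (a Python sketch using the sounddevice library; it assumes a photodiode or phototransistor wired into the line-in/mic jack, which is AC-coupled, so very slow flicker would be attenuated):

                            import numpy as np
                            import sounddevice as sd   # pip install sounddevice

                            FS, SECONDS = 48000, 5   # audio-rate sampling covers flicker up to 24 kHz

                            # Record the optical sensor plugged into the line-in/mic jack
                            sig = sd.rec(int(FS * SECONDS), samplerate=FS, channels=1)
                            sd.wait()
                            sig = sig[:, 0] - sig[:, 0].mean()   # drop DC (the steady brightness level)

                            # Fourier transform: show all flicker frequencies present, not just one peak
                            spectrum = np.abs(np.fft.rfft(sig))
                            freqs = np.fft.rfftfreq(len(sig), d=1 / FS)
                            for i in np.argsort(spectrum)[-10:][::-1]:
                                print(f"{freqs[i]:8.1f} Hz   amplitude {spectrum[i]:.2f}")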

                        2 months later

                        macsforme I plan to post my findings soon

                        The results of my investigation into temporal dithering on Intel-based macOS are as follows. This testing pertained to external outputs only (not built-in laptop displays), given that a capture card was used. The machines tested included a 2009 Mac Pro, a 2015 15-inch Retina MacBook Pro (iGPU-only and iGPU/AMD dGPU variants), a 2012 13-inch MacBook Pro (non-Retina), and a 2009 15-inch MacBook Pro. The GPUs tested in the Mac Pro included the original NVIDIA GT 120, an NVIDIA GT 640, an NVIDIA GTX TITAN (Kepler), an NVIDIA GTX 1080 FE, and an AMD Radeon Pro WX 5100.

                        Notably, the macOS version and firmware version of each machine did not appear to make any difference in the outcome, in the cases where several were tested. Most of the HDMI output testing was done with a standard HDMI cable, but I repeated several of the tests with a high-speed HDMI cable, and it seemed to make no difference on my hardware.

                        2009 Mac Pro - All NVIDIA cards have temporal dithering on their DisplayPort outputs, and none on their HDMI outputs. The AMD WX 5100 has temporal dithering on its DisplayPort outputs (the only type of output it has), but only when plugged in at boot time (not after hot-plugging the monitor). Having multiple monitors plugged in and hot-plugging them seemed to produce inconsistent dithering results.

                        2009 MacBook Pro & 2012 MacBook Pro - The former has an NVIDIA dGPU while the latter has an Intel iGPU; no temporal dithering was detected on the mini-DisplayPort outputs of either.

                        2015 15-inch Retina MacBook Pro - The iGPU-only model (MacBookPro11,4) does not exhibit temporal dithering on either mini-DisplayPort or HDMI outputs. The model with the AMD dGPU (MacBookPro11,5) does exhibit temporal dithering on both mini-DisplayPort and HDMI outputs.

                          macsforme The model with the AMD dGPU (MacBookPro11,5) does exhibit temporal dithering on both mini-DisplayPort and HDMI outputs.

                          Does this change if you try the Psychtoolbox kext + Octave commands dither-disable method I posted? I've already confirmed this method affects flicker system-wide, as I can see noticeable small changes to colors and reduced flicker on an old TN monitor I tried it on. (Unfortunately, that monitor also has its own FRC on 8-bit signals, and macOS seems incapable of forcing a 6-bit connection.)

                          So I'm curious whether the PTB kext method actually stops the "GPU-level" temporal dithering entirely, or only "reduces" it, before I go and search for a true 8-bit monitor.
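
                          For reference, the relevant Octave call is Psychtoolbox's display-configuration dithering subfunction (a minimal sketch, assuming Psychtoolbox is installed and the PsychtoolboxKernelDriver kext is loaded; the override only works on GPUs the kext can drive, i.e. AMD):

                              % 0 requests dithering off; the call returns the previous setting
                              screenId = max(Screen('Screens'));
                              oldDither = Screen('ConfigureDisplay', 'Dithering', screenId, 0);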

                            dev