Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?

Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect analysis.
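
In case it's useful for your testing, here's roughly how I'd diff two captured frames to see where the pixels actually change (just a sketch, assuming the frames are exported as lossless PNGs and numpy/Pillow are installed; the file names are placeholders):

import numpy as np
from PIL import Image

# Load two consecutive frames from the lossless capture (placeholder names).
a = np.asarray(Image.open("frame_0001.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("frame_0002.png").convert("RGB"), dtype=np.int16)

# Biggest per-channel change at each pixel; temporal dithering is usually +/- 1 LSB.
diff = np.abs(a - b).max(axis=2)
changed = diff > 0
print("changed pixels:", round(changed.mean() * 100, 2), "%")
print("largest step:", int(diff.max()))

# Save a mask of where the changes are, to see whether they cluster in
# gradients or also show up in flat color regions.
Image.fromarray(changed.astype(np.uint8) * 255).save("diff_mask.png")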

    Wallboy

    I am pretty sure the card was capturing losslessly; it had a lot of encoding options. I can't remember which setting I used, but it probably had lossless in its name, and it created some huge files for only a few seconds of capture as well. I don't have the computer with the card or the data to hand, but I might collect it from my office this month if I have time and space in my travel bag.

    I can't say about where dithering happens the most as I was only capturing the windows desktop.

    2 months later

    @Seagull I wonder how we square these results with the "consensus", which seems to be that GTX 6xx is more comfy than Intel. That is my experience too (and also that ditherig doesn't help with symptoms). I find the Microsoft Basic Display Adapter more comfy than Intel, but what is the cause if Intel is not dithering with ditherig running (or even when it's not, as these results indicate)?

      degen

      Could be that the older 6xx cards used a more random dithering algorithm. That's how I explain to myself how the dithering of my 6-bit monitor is OK but that of a modern card causes discomfort. I'll see if I can test the basic display driver.

      degen We don't know what we're doing. We need to understand compute-side and display-side technology better, and we need to analyse each item individually and in combination to get a better idea. So far we're at: PWM often bad, dithering (compute side) often bad, FRC (display side) often bad, sub-pixel rendering often bad, brightness & contrast often important, environmental lighting often important, blue light bad... what else? Backlight patterns, inversion patterns, display input/output technologies (HDMI vs DP vs VGA), response times?

      I've seen a few posts like yours recently. And I feel pretty much the same as I did back in 2012 when this started for me - frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

      There are obviously smart people on here. But we make terribly slow, haphazard and often lucky progress.

        degen Unfortunately I can't test the basic display driver as it won't recognise my capture card; I might see if there is some way around it. It does kind of feel different to the Intel driver.

        EDIT: I was able to test the Windows basic driver on a UHD 630 (i7-8700K): no dithering. Also no dithering on that GPU in normal use.

        Edward I've seen a few posts like yours recently. And I feel pretty much the same as I did back in 2012 when this started for me - frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

        I have learned more about displays and graphics in the last 5 years thanks to this site and others than ever before. I share your frustrations, however. Everything in my gut tells me that temporal dithering is the issue for most people here. PWM is very easy to rule out (simply don't use a screen that uses PWM!) but then it gets more complicated with pixel inversion (at this point my knowledge runs out).

        If I have a good device and a bad device using the same HDMI cable and the same monitor, then the techie in me says "OK, it's something to do with the hardware/software in this machine" (the monitor has been ruled out as the cause). So at this point we're looking at the 'point of failure' being:

        • VBIOS (mobo/GPU), which may default to temporal dithering - hence why we get discomfort even on POST screens with new tech
        • The OS itself (W10 composition spaces)
        • OS drivers, which could be set to temporal dither

        And how much control we have over each:

        • VBIOS - locked down and generally inaccessible to change, AFAIK
        • OS - maybe a regedit or fix could disable any OS tomfoolery
        • OS drivers - Ditherig/Nvidia reg keys, which aren't 100%

        Reading into the patents concerning Temporal Dithering has been somewhat revelatory to me. It does cause flicker, regardless of how minuscule it's claimed to be or how well it can "fool the eyes" into seeing the extra colors. It is snake oil for supposed '10-bit' displays, which are in fact 8-bit. Or it's there to guarantee uniformity of color on any panel, which makes sense from a customer satisfaction perspective, but flicker is flicker.
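
        To make the flicker concrete, here's a toy illustration of the principle (my own sketch, not taken from any patent or vendor's actual algorithm): a '10-bit' level that falls between two 8-bit steps is approximated by alternating the two nearest 8-bit values over time, so the average is right but the pixel never stops changing.

        # Toy example only - not any vendor's real dithering algorithm.
        target_10bit = 402                # desired level on a 0-1023 scale
        target_8bit = target_10bit / 4    # = 100.5, which an 8-bit panel cannot show

        frames = [100 if i % 2 == 0 else 101 for i in range(8)]
        print(frames)                     # [100, 101, 100, 101, ...] -> 1 LSB flicker
        print(sum(frames) / len(frames))  # 100.5 -> the "extra" color only exists as an average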

          diop So for a solution we perhaps need to engage:

          • display manufacturers
          • VBIOS writers
          • OS writers
          • OS driver writers (does that mean OEMs for integrated/discrete GPUs, and basically Linux guys?)
          • I/O standards groups (for HDMI, DP, VGA, DVI, etc.)

          That seems like a far more complex approach than what has been achieved so far. PWM was solved by display tech alone, and the same goes for blue light.

          Things get more complex when we add that different sufferers are sensitive to different technological issues, and possibly different combinations of issues.

          I really want to know what people think the best next steps are.

          diop Reading into the patents concerning Temporal Dithering has been somewhat revelatory to me.

          On a related note, I have figured out that I can test whether temporal dithering creates a flicker effect by checking if the dithering pattern repeats, and how many frames it takes to repeat. I will have a go at it when my workload is a bit lower.
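
          Roughly what I have in mind (a sketch only, assuming a lossless capture of a static image exported to numbered PNGs; the file names and frame count are placeholders):

          import numpy as np
          from PIL import Image

          # Load a run of consecutive frames from the capture (placeholder names).
          frames = [np.asarray(Image.open(f"frame_{i:04d}.png").convert("RGB"))
                    for i in range(1, 61)]

          # If the dithering cycles with period P, frame N should equal frame N+P exactly.
          for period in range(1, len(frames)):
              if all(np.array_equal(frames[i], frames[i + period])
                     for i in range(len(frames) - period)):
                  print("pattern repeats every", period, "frame(s)")
                  break
          else:
              print("no repetition found within", len(frames), "frames")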

          3 years later

          Seagull I think I've probably adapted to dithering patterns produced by different combinations of monitor/card. Hence, any changes produce discomfort as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.

          Hope it's OK to bump this old thread, but I'm beginning to suspect this could be the issue for some of us. Do you still think this theory is true, @Seagull?

            ryans

            Yes, for many years it has been key to how I manage my symptoms. Above a certain intensity of eyestrain/migraines I will abandon a device because it is unlikely I will be able to adapt to it, but for most devices I will try to get used to the device. I've never been able to adapt to an IPS LCD device, but flicker-free TN LCDs and OLEDs now only take me a few weeks/months to become usable.

            It has been harder than it sounds, though. A lot of my eye strain symptoms are linked to diet; I have a lot of food intolerances which trigger eyestrain and migraines. It's only since finding those intolerances and managing them that adapting to bad devices has been an option. I need to keep my body, and therefore my brain, as healthy as possible to be able to adapt to new devices.

              a year later

              Following @aiaf's project in the thread "I disabled dithering on Apple silicon + Introducing Stillcolor macOS M1/M2/M3" (which I believe was partially inspired by the information in this thread), I purchased a Blackmagic Design UltraStudio Recorder 3G and downloaded ffmpeg to perform my own testing for temporal dithering and try to contribute some more data to the community here.  However, a few things are unclear and I want to ensure I am collecting valid data.
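
              For context, the analysis step I have in mind looks something like the following (a sketch under my own assumptions, not anyone's established workflow): dump a short run of frames from a lossless recording with ffmpeg and then compare them pixel by pixel. The paths and frame count are placeholders, and ffmpeg needs to be on the PATH.

              import os
              import subprocess

              # Extract the first ~60 frames of a lossless clip recorded from the card
              # into PNGs, which preserve the exact pixel values for comparison.
              os.makedirs("frames", exist_ok=True)
              subprocess.run(
                  ["ffmpeg", "-i", "capture.mov", "-frames:v", "60", "frames/%04d.png"],
                  check=True,
              )
              # The PNGs can then be diffed frame-to-frame to look for temporal dithering.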

              First, I observed that several GPUs (specifically NVIDIA) on both macOS and Windows exhibit temporal dithering when I use a DisplayPort to HDMI cable, but do not exhibit dithering when using a straight HDMI cable from the HDMI output port of the GPU.  I tried an older HDMI cable as well as an "Ultra High Speed" cable, but it seemed to make no difference.  I read in another thread that older HDMI cables may effectively reduce color depth via their limited bandwidth, but otherwise I was unsure whether this is the expected outcome.

              Second, I am beginning to question whether the GPU/OS will send the same kind of output to the capture card as to a normal display.  After inspecting the EDID of the capture card, it appears to advertise itself as a "non-RGB color display" taking YCbCr input.  On macOS at least, I suspect the capture card was being detected as a television (if I am interpreting the output of AllRez correctly).  On Windows, I noticed that when both a normal display and the capture card were connected to my NVIDIA RTX 3070, dithering was evident on the capture card side, but ColorControl (with automatic settings) appeared to indicate that dithering was only active on the capture card and not on the normal display.

              I don't know of a way to determine dithering status in an equivalent way on macOS.  Basically, I am concerned that we may be comparing apples to oranges with this method.

              Lastly, some of the results simply do not make sense.  A capture from my AMD Radeon Pro WX 5100 (on both macOS High Sierra and Monterey via OpenCore) exhibited no temporal dithering, while the picture on normal monitors seemed (subjectively) very unstable and I strongly suspect temporal dithering is in effect.  On the flip side, my original NVIDIA GT120 exhibits (again, subjectively) the most stable picture of all the GPUs I tested on a normal monitor, but the capture output seemed to show the most significant temporal dithering of all samples.  At this point, I am starting to wonder if slower/deeper temporal dithering may be easier on the eyes than the barely-perceptible kind because our eyes and brains can actually make sense of it.

              Any thoughts would be appreciated.  Here is the parsed EDID data for the Blackmagic capture card:

              Block 0, Base EDID:
                EDID Structure Version & Revision: 1.3
                Vendor & Product Identification:
                  Manufacturer: BMD
                  Model: 0
                  Serial Number: 1 (0x00000001)
                  Made in: week 52 of 2012
                Basic Display Parameters & Features:
                  Digital display
                  Maximum image size: 71 cm x 40 cm
                  Gamma: 2.50
                  Non-RGB color display
                  First detailed timing is the preferred timing
                Color Characteristics:
                  Red  : 0.6396, 0.3447
                  Green: 0.2910, 0.6347
                  Blue : 0.1630, 0.0927
                  White: 0.2880, 0.2958
                Established Timings I & II:
                  DMT 0x04:   640x480    59.940476 Hz   4:3     31.469 kHz     25.175000 MHz
                Standard Timings: none
                Detailed Timing Descriptors:
                  DTD 1:  1920x1080   60.000000 Hz  16:9     67.500 kHz    148.500000 MHz (800 mm x 450 mm)
                               Hfront   88 Hsync  44 Hback  148 Hpol P
                               Vfront    4 Vsync   5 Vback   36 Vpol P
                  DTD 2:  1920x1080   50.000000 Hz  16:9     56.250 kHz    148.500000 MHz (800 mm x 450 mm)
                               Hfront  528 Hsync  44 Hback  148 Hpol P
                               Vfront    4 Vsync   5 Vback   36 Vpol P
                  Display Product Name: 'BMD HDMI'
                  Display Range Limits:
                    Monitor ranges (GTF): 50-60 Hz V, 15-45 kHz H, max dotclock 80 MHz
                Extension blocks: 1
              Checksum: 0x01
              
              ----------------
              
              Block 1, CTA-861 Extension Block:
                Revision: 3
                Basic audio support
                Supports YCbCr 4:4:4
                Supports YCbCr 4:2:2
                Native detailed modes: 2
                Video Data Block:
                  VIC   5:  1920x1080i  60.000000 Hz  16:9     33.750 kHz     74.250000 MHz (native)
                  VIC   4:  1280x720    60.000000 Hz  16:9     45.000 kHz     74.250000 MHz (native)
                  VIC   6:  1440x480i   59.940060 Hz   4:3     15.734 kHz     27.000000 MHz (native)
                  VIC  20:  1920x1080i  50.000000 Hz  16:9     28.125 kHz     74.250000 MHz (native)
                  VIC  19:  1280x720    50.000000 Hz  16:9     37.500 kHz     74.250000 MHz (native)
                  VIC  21:  1440x576i   50.000000 Hz   4:3     15.625 kHz     27.000000 MHz (native)
                  VIC  32:  1920x1080   24.000000 Hz  16:9     27.000 kHz     74.250000 MHz (native)
                  VIC  33:  1920x1080   25.000000 Hz  16:9     28.125 kHz     74.250000 MHz (native)
                  VIC  34:  1920x1080   30.000000 Hz  16:9     33.750 kHz     74.250000 MHz (native)
                  VIC  31:  1920x1080   50.000000 Hz  16:9     56.250 kHz    148.500000 MHz (native)
                  VIC  16:  1920x1080   60.000000 Hz  16:9     67.500 kHz    148.500000 MHz (native)
                  VIC   3:   720x480    59.940060 Hz  16:9     31.469 kHz     27.000000 MHz
                  VIC  18:   720x576    50.000000 Hz  16:9     31.250 kHz     27.000000 MHz
                  VIC   1:   640x480    59.940476 Hz   4:3     31.469 kHz     25.175000 MHz
                Audio Data Block:
                  Linear PCM:
                    Max channels: 8
                    Supported sample rates (kHz): 48 44.1 32
                    Supported sample sizes (bits): 24
                Speaker Allocation Data Block:
                  FL/FR - Front Left/Right
                  LFE1 - Low Frequency Effects 1
                  FC - Front Center
                  BL/BR - Back Left/Right
                  FLc/FRc - Front Left/Right of Center
                Vendor-Specific Data Block (HDMI), OUI 00-0C-03:
                  Source physical address: 0.0.0.0
                Vendor-Specific Data Block, OUI 00-00-00:
                Colorimetry Data Block:
                  BT2020YCC
                  BT2020RGB
                HDR Static Metadata Data Block:
                  Electro optical transfer functions:
                    Traditional gamma - SDR luminance range
                    SMPTE ST2084
                    Hybrid Log-Gamma
                  Supported static metadata descriptors:
                    Static metadata type 1
              Checksum: 0x3a  Unused space in Extension Block: 84 bytes

                macsforme It will definitely NOT send the same to the capture card as to the monitor unless you spoof the EDID. You can use BetterDisplay to download the monitor EDID. I don't have a capture card myself, but it should be possible to spoof the EDID.

                  macsforme I might have more to add later, but one reason I went for the DVI2PCIe is you can program an arbitrary EDID binary to the card, and it remains persistent.

                  AFAIK that isn't (easily) possible for the Blackmagic cards.

                    async but it should be possible to spoof the EDID.

                    Depending on what "layer" the spoofing is done at, I wouldn't be surprised if it has an effect on the results.

                    Safest is a capture card that has its own non-volatile storage to program an arbitrary EDID binary independent of the host system. From the perspective of the target device, it just sees an arbitrary monitor connected.

                    macsforme

                    Very interesting work!

                    I would be careful about making assessments about dithering by eye, though. An LCD monitor may have its own native dithering and will have LCD inversion artefacts occurring alongside any dithering from the GPU. Then there is your brain's processing of the visual signals from your eyes: you could be more or less sensitive to some dithering algorithms, and if you are used to a particular dithering algorithm from a GPU you used often, you may have been desensitised to it. Combining these factors makes it difficult to make a reliable assessment by eye, and may be why we as a community can't agree on what is a good/safe setup.

                    The next step could be to create a system to test image stability on the monitor itself. Many people, myself included, have used sensors to measure a single predominant frequency of monitor flicker. If this could be expanded to measuring a broad band of flicker frequencies, it could be used to determine whether the monitor's image has become more or less stable without human subjectivity.
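
                    As a rough sketch of what I mean (assuming you can log brightness samples from a light sensor at a known rate; the log file name and sample rate here are just assumptions):

                    import numpy as np

                    fs = 2000.0                              # assumed sensor sample rate, Hz
                    samples = np.loadtxt("sensor_log.txt")   # one brightness reading per line

                    # Remove the steady (DC) component and look at the whole flicker
                    # spectrum rather than a single dominant frequency.
                    samples = samples - samples.mean()
                    spectrum = np.abs(np.fft.rfft(samples))
                    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)

                    # Report the strongest flicker components.
                    for i in np.argsort(spectrum)[-5:][::-1]:
                        print(f"{freqs[i]:7.1f} Hz  amplitude {spectrum[i]:.1f}")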

                      Seagull if there exists a capture device similar to a monitor calibration device, it would be fairly easy to create a workflow that tests different things.

                      A bit more tricky, but in theory it would be possible to create an iPhone app that can do it. It's kind of strange that no one has created an iPhone app for decent monitor calibration either. It would probably need to run the test patterns at a few different frame rates to make sure it captures everything, though.

                      macsforme

                      So I am in the same situation and have the same BMD capture device, and I can only capture dithering from my RTX laptop when ColorControl dithering is set to "Auto: 8-bit Temporal", never when ColorControl dithering is set to disabled. I am of the opinion that I need to find something like the DVI2PCIe or some EDID device that goes before the BMD capture device in the signal chain, otherwise I'm comparing apples to oranges as you said. I am also using a higher-bandwidth HDMI to HDMI cable that in theory should allow dithering to occur.

                      I posted some EDID devices here and not sure if they are useful:
                      https://ledstrain.org/d/2589-products-to-try-or-avoid-pwm-flicker-and-temporal-dithering-testing/185

                      https://www.black-box.de/en-de/page/25843/Resources/Technical-Resources/Black-Box-Explains/multimedia/EDID-Emulation

                      Do you still see dithering from your video analysis when you disable dithering inside ColorControl?

                        async It will definitely NOT send the same to the capture card as to the monitor unless you spoof the EDID. You can use BetterDisplay to download the monitor EDID. I don't have a capture card myself, but it should be possible to spoof the EDID.

                        This suggestion guided me in the right direction. I was able to spoof the EDID of one of my real monitors (with a macOS display EDID override file) to the BMD capture device, and it was still able to work and capture video. What I ultimately discovered was that, regardless of EDID spoofing, my AMD Radeon Pro WX 5100 will only output a signal with temporal dithering if the capture device is connected at boot time. Disconnecting and reconnecting the capture device causes the output to no longer have temporal dithering until the next reboot… strange! Regardless, this resolves the main discrepancy I encountered regarding the WX 5100 GPU not seeming to have temporal dithering while the visual experience suggested otherwise. The fact that temporal dithering was only detected via DisplayPort and not via HDMI on all GPUs I tried (with the exception of my 2015 MacBook Pro 15" with an AMD dGPU, which exhibited it on both) was another interesting discrepancy which may yield helpful insights.

                        JTL I might have more to add later, but one reason I went for the DVI2PCIe is you can program an arbitrary EDID binary to the card, and it remains persistent.

                        If you mean the $1,400 DVI2PCIe Duo… ouch. 🙁 The Blackmagic device was $70 (used) on eBay, plus the cost of the Thunderbolt cable.

                        Seagull I would be careful about making assessments about dithering by eye though.

                        A valid point, and I completely agree. Correct me if I'm wrong, but my impression was that empirically measuring temporal dithering from a physical LCD panel requires sophisticated cameras/tools which are out of reach for most people. As far as visual assessment, I try to limit my reliance on it to getting a general sensation of image stability, and have that form the basis for further empirical investigation.

                        photon78s Do you still see dithering from your video analysis when you disable dithering inside ColorControl?

                        I never got that far, as it looked like I needed to reboot my workstation to make ColorControl work fully, and I was unable to do so at the time. However, I expect I would have reached the same results that you did. My main concern for that workstation was that dithering is off on the normal display (which was what ColorControl seemed to indicate), and then I had secondary concerns that the BMD capture device may not be getting the same output that a standard monitor would (which now seem mostly resolved, at least on macOS).

                        @DisplaysShouldNotBeTVs, are you willing to share your method for disabling temporal dithering on the AMD GPU on the MacBookPro11,5, as you've mentioned elsewhere? Now that I have a means of reliably measuring temporal dithering, I want to start trying to disable it, leveraging all possible methods.

                          dev