hpst I'm guessing dithering might just be another additive factor for some people, like PWM etc.

However, we can resolve that issue by using a PWM-free monitor. Temporal dithering doesn't have an off switch and, as the results indicate, is being used on practically all modern devices. We also know that dithering involves flickering pixels, so on that basis alone I still believe it's potentially a major aggravator. Reverse-engineering PCoIP would hopefully show the different types of pixel activity between a good and a bad device, as would the current capture work by @Seagull .

    diop Just bumping to ask @Seagull whether any further tests have been done?

    Would anybody know how to attempt to capture PCoIP pixel data? 😛

    4 months later

    Seagull
    If you send the videos to shiva wind at proton mail, I can take a video of what the dithering looks like on E-ink. It will be a secondary confirmation of the method.

      ShivaWind

      I will do this, though it won't be till January. Also, the files are very large as they are captured losslessly, certainly too large for email. I will find another transfer method.

      2 months later

      Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?

      Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect analysis.
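      If it helps, a quick way to generate both kinds of test image for a capture run is sketched below (a sketch only, assuming numpy and Pillow are available; resolution and file names are arbitrary):

      # Sketch: a smooth gradient and a flat mid-gray test image. Dithering is
      # typically most visible on values that fall between representable levels
      # (the gradient) and may be absent on a flat, exactly-representable color.
      import numpy as np
      from PIL import Image

      W, H = 1920, 1080

      # Horizontal black-to-white gradient, 8-bit.
      gradient = np.tile(np.linspace(0, 255, W, dtype=np.uint8), (H, 1))
      Image.fromarray(gradient, mode="L").convert("RGB").save("gradient.png")

      # Flat mid-gray; any value expressible exactly in 8 bits works.
      flat = np.full((H, W, 3), 128, dtype=np.uint8)
      Image.fromarray(flat, mode="RGB").save("flat_gray.png")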

        Wallboy

        I am pretty sure the card was capturing losslessly; it had a lot of encoding options. I can't remember what setting I used, but it probably had lossless in its name. It created some huge files for only a few seconds of capture as well. I don't have the computer with the card or data to hand, but I might be collecting it from my office this month if I have time and space in my travel bag.

        I can't say about where dithering happens the most as I was only capturing the windows desktop.

        2 months later

        @Seagull I wonder how we square these results with the "consensus", which seems to be that GTX 6xx is more comfy than Intel. That is my experience (and also that ditherig doesn't help with symptoms). I find the Microsoft Basic Display Adapter is more comfy than Intel, but what is the cause if Intel is not dithering with ditherig running (or even when it's not running, as these results indicate)?

          degen

          Could be that the older 6xx cards used a more random dithering algorithm. That's how I explain to myself why the dithering of my 6-bit monitor is OK but that of a modern card causes discomfort. I'll see if I can test the basic display driver.

          degen We don't know what we're doing. We need to understand compute side and display side technology better. And we need to analyse each item individually and in combination to get a better idea. So far we're at: PWM often bad, dithering (compute side) often bad, FRC (display side) often bad, sub-pixel rendering often bad, brightness & contrast often important, environmental lighting often important, blue light bad...what else? Backlight patterns, inversion patterns, display input/output technologies (HDMI vs DP vs VGA), response times?

          I've seen a few posts like yours recently. And I feel pretty much the same as I did back in 2012 when this started for me - frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

          There are obviously smart people on here. But we make terribly slow, haphazard and often lucky progress.

            degen Unfortunately I can't test the basic display driver as it won't recognise my capture card. I might see if there is some way around it. It does kind of feel different to the Intel driver.

            EDIT: Was able to test the Windows basic driver on a UHD 630 (i7 8700K): no dithering. Also no dithering on that GPU in normal use.

            Edward I've seen a few posts like yours recently. And I feel pretty much the same as I did back in 2012 when this started for me - frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

            I have learned more about displays and graphics in the last 5 years thanks to this site and others than ever before. I share your frustrations, however. Everything in my gut tells me that temporal dithering is the issue for most people here. PWM is very easy to rule out (simply don't use a screen that uses PWM!) but then it gets more complicated with pixel inversion (at this point my knowledge runs out).

            If I have a good device and a bad device using the same HDMI cable and the same monitor, then the techie in me says "OK, it's something to do with the hardware/software in this machine" (the monitor has been ruled out as the cause). So at this point we're looking at the 'point of failure' being:

            • VBIOS (mobo/GPU), which may default to temporal dithering - hence why we get discomfort on new tech even on POST screens
            • The OS itself (W10 composition spaces)
            • OS drivers, which could be set to temporal dither

            And where we stand on each:

            • VBIOS - locked down and generally inaccessible to changes, AFAIK
            • OS - maybe a regedit or fix could disable any OS tomfoolery
            • OS drivers - ditherig / Nvidia registry keys, which aren't 100%

            Reading into the patents concerning temporal dithering has been somewhat revelatory to me. It does cause flicker, regardless of how minuscule it's claimed to be or how it can "fool the eyes" into seeing the extra colors. It is snake oil for supposed '10-bit' displays, which are in fact 8-bit. Or it's there to guarantee uniformity of color on any panel, which makes sense from a customer satisfaction perspective, but flicker is flicker.
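            For what it's worth, the basic trick is easy to demonstrate with a toy sketch (numpy assumed): alternate a pixel between the two nearest 8-bit levels and the time-average lands on an in-between '10-bit' value, but the instantaneous value is flickering the whole time.

            # Sketch: how temporal dithering / FRC fakes a 10-bit level on an
            # 8-bit panel. The average matches the target, but the pixel flickers.
            import numpy as np

            # Target: 10-bit code 801, i.e. 200.25 on the 8-bit scale.
            lo, hi = 200, 201
            frames = 240                                         # 4 seconds at 60 Hz
            seq = np.where(np.arange(frames) % 4 == 3, hi, lo)   # 201 on every 4th frame

            print("time-averaged level:", seq.mean())            # 200.25 -> looks "10-bit"
            print("level changes per second:", np.count_nonzero(np.diff(seq)) / 4)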

              diop So for a solution we need to engage perhaps:

              • display manufacturers
              • VBIOS writers
              • OS writers
              • OS driver writers (does that mean OEMs for integrated/discrete GPUs, and basically Linux guys?)
              • I/O standards groups (for HDMI, DP, VGA, DVI, etc.)

              That seems like a far more complex approach than what has been achieved so far. PWM was solved by display tech alone, and the same goes for blue light.

              Things get more complex when we add that different sufferers are sensitive to different technological issues, and possibly different combinations of issues.

              I really want to know what people think the best next steps are.

              diop Reading into the patents concerning temporal dithering has been somewhat revelatory to me.

              On a related note, I have figured out I can test for temporal dithering creating a flicker effect by seeing if the dithering pattern repeats, and how many frames it takes to repeat. I will have a go at it when my workload is a bit lower.
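              In case it's useful, the check can be done mechanically: compare each captured frame against the frame k steps later and look for the smallest k at which the whole sequence lines up again. A sketch, assuming the capture has already been exported as lossless frames (the file names are placeholders) and the desktop content itself was static:

              # Sketch: find the repeat period of a temporal dithering pattern
              # from losslessly captured frames of a static desktop.
              import glob
              import numpy as np
              from PIL import Image

              paths = sorted(glob.glob("frames/frame*.png"))   # placeholder names
              frames = [np.asarray(Image.open(p).convert("RGB")) for p in paths]

              n = len(frames)
              for k in range(1, n):
                  # Pattern repeats with period k if every frame equals the frame k later.
                  if all(np.array_equal(frames[i], frames[i + k]) for i in range(n - k)):
                      print(f"dithering pattern repeats every {k} frame(s)")
                      break
              else:
                  print(f"no exact repeat found within {n} frames")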

              3 years later

              Seagull I think I've probably adapted to dithering patterns produced by different combinations of monitor/card. Hence, any changes produce discomfort as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.

              Hope it's ok to bump this old thread but I'm beginning to suspect this could be the issue for some of us. Do you still think this theory is true @Seagull ?

                ryans

                Yes, for many years it has been key to how I manage my symptoms. Above a certain intensity of eyestrain/migraines I will abandon a device because it is unlikely I will be able to adapt to it, but for most devices I will try to get used to the device. I've never been able to adapt to an IPS LCD device, but flicker-free TN LCDs and OLEDs now only take me a few weeks/months to become usable.

                It has been harder than it sounds, though. A lot of my eye strain symptoms are linked to diet; I have a lot of food intolerances which trigger eyestrain and migraines. It's only since finding those intolerances and managing them that adapting to bad devices has been an option. I need to keep my body, and therefore my brain, as healthy as possible to be able to adapt to new devices.

                  a year later

                  Following @aiaf's project in the thread "I disabled dithering on Apple silicon + Introducing Stillcolor macOS M1/M2/M3" (which I believe was partially inspired by the information in this thread), I purchased a Blackmagic Design UltraStudio Recorder 3G and downloaded ffmpeg to perform my own testing for temporal dithering and try to contribute some more data to the community here. However, a few things are unclear and I want to ensure I am collecting valid data.
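                  For context, the capture/analysis pipeline I have in mind is roughly the sketch below: capture a few seconds losslessly from the DeckLink input, decode it back to raw frames, and count how many pixels change between consecutive frames of a static desktop. It assumes an ffmpeg build with DeckLink support and that the device enumerates under the name shown; both are assumptions to adjust for your own setup.

                  # Sketch: lossless capture from a Blackmagic/DeckLink input via ffmpeg,
                  # followed by a simple "do consecutive frames differ at all?" check.
                  # Device name, resolution and paths are placeholders.
                  import subprocess
                  import numpy as np

                  DEVICE = "UltraStudio Recorder 3G"   # see `ffmpeg -sources decklink`
                  W, H, N = 1920, 1080, 120            # resolution and frames to inspect

                  # 1) Capture ~5 s losslessly (FFV1 in Matroska).
                  subprocess.run(["ffmpeg", "-y", "-f", "decklink", "-i", DEVICE,
                                  "-t", "5", "-c:v", "ffv1", "capture.mkv"], check=True)

                  # 2) Decode the first N frames to raw RGB on a pipe.
                  dec = subprocess.run(["ffmpeg", "-i", "capture.mkv", "-frames:v", str(N),
                                        "-f", "rawvideo", "-pix_fmt", "rgb24", "-"],
                                       check=True, capture_output=True)
                  usable = (len(dec.stdout) // (H * W * 3)) * H * W * 3
                  frames = np.frombuffer(dec.stdout[:usable], dtype=np.uint8).reshape(-1, H, W, 3)

                  # 3) On a static desktop, any changed pixels point to temporal dithering.
                  for i in range(1, len(frames)):
                      changed = np.count_nonzero(np.any(frames[i] != frames[i - 1], axis=-1))
                      print(f"frame {i}: {changed} changed pixels")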

                  First, I observed that several GPUs (specifically NVIDIA) on both macOS and Windows exhibit temporal dithering when I use a DisplayPort to HDMI cable, but do not exhibit dithering when using a straight HDMI cable from the HDMI output port of the GPU.  I tried an older HDMI cable as well as an "Ultra High Speed" cable, but it seemed to make no difference.  I read in another thread that older HDMI cables may effectively reduce color depth via their limited bandwidth, but otherwise I was unsure whether this is the expected outcome.

                  Second, I am beginning to question whether the GPU/OS will send the same kind of output to the capture card as to a normal display. After inspecting the EDID of the capture card, it appears to advertise itself as a "non-RGB color display" taking YCbCr input. On macOS at least, I suspect the capture card was being detected as a television (if I am interpreting the output of AllRez correctly). On Windows, I noticed that when both a normal display and the capture card were connected to my NVIDIA RTX 3070, dithering was evident on the capture card side, but ColorControl (with automatic settings) appeared to indicate that dithering was only active on the capture card and not on the normal display.

                  I don't know of a way to determine dithering status in an equivalent way on macOS.  Basically, I am concerned that we may be comparing apples to oranges with this method.

                  Lastly, some of the results simply do not make sense.  A capture from my AMD Radeon Pro WX 5100 (on both macOS High Sierra and Monterey via OpenCore) exhibited no temporal dithering, while the picture on normal monitors seemed (subjectively) very unstable and I strongly suspect temporal dithering is in effect.  On the flip side, my original NVIDIA GT120 exhibits (again, subjectively) the most stable picture of all the GPUs I tested on a normal monitor, but the capture output seemed to show the most significant temporal dithering of all samples.  At this point, I am starting to wonder if slower/deeper temporal dithering may be easier on the eyes than the barely-perceptible kind because our eyes and brains can actually make sense of it.
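                  One way to put numbers on that hunch might be to measure, per pixel, both how large the temporal swings are ("depth") and how often pixels change ("rate"), and then compare GPUs on those two axes. A sketch, assuming the captured frames are already loaded as a uint8 numpy array of shape (frames, height, width, 3):

                  # Sketch: crude "depth vs. rate" statistics for temporal dithering,
                  # computed from a stack of lossless frames (uint8, shape N x H x W x 3).
                  # A slow/deep pattern should show larger swings but a lower change rate
                  # than a fast/shallow, barely-perceptible one.
                  import numpy as np

                  def dither_stats(frames: np.ndarray) -> dict:
                      f = frames.astype(np.int16)              # avoid uint8 wraparound
                      diffs = np.abs(np.diff(f, axis=0))       # per-frame, per-pixel change
                      changed = diffs.max(axis=-1) > 0         # did any channel change?
                      return {
                          "mean_depth": float(diffs[diffs > 0].mean()) if diffs.any() else 0.0,
                          "max_depth": int(diffs.max()),
                          "change_rate": float(changed.mean()),   # fraction of pixels changing per frame
                      }

                  # e.g. print(dither_stats(frames)) for each GPU's capture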

                  Any thoughts would be appreciated.  Here is the parsed EDID data for the Blackmagic capture card:

                  Block 0, Base EDID:
                    EDID Structure Version & Revision: 1.3
                    Vendor & Product Identification:
                      Manufacturer: BMD
                      Model: 0
                      Serial Number: 1 (0x00000001)
                      Made in: week 52 of 2012
                    Basic Display Parameters & Features:
                      Digital display
                      Maximum image size: 71 cm x 40 cm
                      Gamma: 2.50
                      Non-RGB color display
                      First detailed timing is the preferred timing
                    Color Characteristics:
                      Red  : 0.6396, 0.3447
                      Green: 0.2910, 0.6347
                      Blue : 0.1630, 0.0927
                      White: 0.2880, 0.2958
                    Established Timings I & II:
                      DMT 0x04:   640x480    59.940476 Hz   4:3     31.469 kHz     25.175000 MHz
                    Standard Timings: none
                    Detailed Timing Descriptors:
                      DTD 1:  1920x1080   60.000000 Hz  16:9     67.500 kHz    148.500000 MHz (800 mm x 450 mm)
                                   Hfront   88 Hsync  44 Hback  148 Hpol P
                                   Vfront    4 Vsync   5 Vback   36 Vpol P
                      DTD 2:  1920x1080   50.000000 Hz  16:9     56.250 kHz    148.500000 MHz (800 mm x 450 mm)
                                   Hfront  528 Hsync  44 Hback  148 Hpol P
                                   Vfront    4 Vsync   5 Vback   36 Vpol P
                      Display Product Name: 'BMD HDMI'
                      Display Range Limits:
                        Monitor ranges (GTF): 50-60 Hz V, 15-45 kHz H, max dotclock 80 MHz
                    Extension blocks: 1
                  Checksum: 0x01
                  
                  ----------------
                  
                  Block 1, CTA-861 Extension Block:
                    Revision: 3
                    Basic audio support
                    Supports YCbCr 4:4:4
                    Supports YCbCr 4:2:2
                    Native detailed modes: 2
                    Video Data Block:
                      VIC   5:  1920x1080i  60.000000 Hz  16:9     33.750 kHz     74.250000 MHz (native)
                      VIC   4:  1280x720    60.000000 Hz  16:9     45.000 kHz     74.250000 MHz (native)
                      VIC   6:  1440x480i   59.940060 Hz   4:3     15.734 kHz     27.000000 MHz (native)
                      VIC  20:  1920x1080i  50.000000 Hz  16:9     28.125 kHz     74.250000 MHz (native)
                      VIC  19:  1280x720    50.000000 Hz  16:9     37.500 kHz     74.250000 MHz (native)
                      VIC  21:  1440x576i   50.000000 Hz   4:3     15.625 kHz     27.000000 MHz (native)
                      VIC  32:  1920x1080   24.000000 Hz  16:9     27.000 kHz     74.250000 MHz (native)
                      VIC  33:  1920x1080   25.000000 Hz  16:9     28.125 kHz     74.250000 MHz (native)
                      VIC  34:  1920x1080   30.000000 Hz  16:9     33.750 kHz     74.250000 MHz (native)
                      VIC  31:  1920x1080   50.000000 Hz  16:9     56.250 kHz    148.500000 MHz (native)
                      VIC  16:  1920x1080   60.000000 Hz  16:9     67.500 kHz    148.500000 MHz (native)
                      VIC   3:   720x480    59.940060 Hz  16:9     31.469 kHz     27.000000 MHz
                      VIC  18:   720x576    50.000000 Hz  16:9     31.250 kHz     27.000000 MHz
                      VIC   1:   640x480    59.940476 Hz   4:3     31.469 kHz     25.175000 MHz
                    Audio Data Block:
                      Linear PCM:
                        Max channels: 8
                        Supported sample rates (kHz): 48 44.1 32
                        Supported sample sizes (bits): 24
                    Speaker Allocation Data Block:
                      FL/FR - Front Left/Right
                      LFE1 - Low Frequency Effects 1
                      FC - Front Center
                      BL/BR - Back Left/Right
                      FLc/FRc - Front Left/Right of Center
                    Vendor-Specific Data Block (HDMI), OUI 00-0C-03:
                      Source physical address: 0.0.0.0
                    Vendor-Specific Data Block, OUI 00-00-00:
                    Colorimetry Data Block:
                      BT2020YCC
                      BT2020RGB
                    HDR Static Metadata Data Block:
                      Electro optical transfer functions:
                        Traditional gamma - SDR luminance range
                        SMPTE ST2084
                        Hybrid Log-Gamma
                      Supported static metadata descriptors:
                        Static metadata type 1
                  Checksum: 0x3a  Unused space in Extension Block: 84 bytes
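                  Side note: the "Non-RGB color display" line above comes from bits 4-3 of the feature-support byte at offset 0x18 of the base EDID block, which in EDID 1.3 encode the display colour type. A quick sketch for checking that directly from a raw EDID dump ("edid.bin" is a placeholder for however you export it):

                  # Sketch: decode the fields behind "Digital display" and
                  # "Non-RGB color display" from the first 128-byte EDID block.
                  # "edid.bin" is a placeholder for however the raw EDID is exported.
                  edid = open("edid.bin", "rb").read(128)

                  digital = bool(edid[0x14] & 0x80)          # video input definition, bit 7
                  display_type = (edid[0x18] >> 3) & 0b11    # feature support byte, bits 4-3
                  names = {0b00: "monochrome", 0b01: "RGB color",
                           0b10: "non-RGB color", 0b11: "undefined"}   # EDID 1.3 meanings

                  print("digital input:", digital)
                  print("display type:", names[display_type])
                  print("block checksum OK:", sum(edid) % 256 == 0)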

                    macsforme It will definitely NOT send the same output to the capture card as to the monitor unless you spoof the EDID. You can use BetterDisplay to download the monitor's EDID. I don't have a capture card myself, but it should be possible to spoof the EDID.

                      macsforme I might have more to add later, but one reason I went for the DVI2PCIe is that you can program an arbitrary EDID binary to the card, and it remains persistent.

                      AFAIK that isn't (easily) possible for the Blackmagic cards.
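                      If you do edit an EDID binary before programming it, keep in mind that every 128-byte block has to sum to 0 mod 256, so the checksum byte needs recomputing. A small sketch (file names are placeholders):

                      # Sketch: re-patch the checksum byte of each 128-byte block after
                      # editing an EDID binary (EDIDs are whole multiples of 128 bytes).
                      # File names are placeholders.
                      data = bytearray(open("edid_edited.bin", "rb").read())

                      for start in range(0, len(data), 128):
                          payload_sum = sum(data[start:start + 127])
                          data[start + 127] = (-payload_sum) % 256   # each block must sum to 0 mod 256

                      open("edid_fixed.bin", "wb").write(data)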

                        dev