It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.

One thing I was thinking: would it be worth testing a single card in different states, e.g. during the POST/BIOS menu, the Windows desktop, and a Linux desktop, just for further comparison, to see whether these cards dither constantly or only when loading specific software?

Also, this could be used on games consoles and anything else that uses HDMI, correct? E.g. a good Xbox One versus an updated (bad) one, or a good DVD/media player versus a bad one. The link may not just be between PCs but between other devices too.

    diop

    I was thinking the same thing re: the BIOS screen. I'm not 100% sure it'll capture consoles, as there have been some forum posts about problems with this brand and capturing consoles, though that might have been resolved.

    diop It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.

    Unfortunately for both of us, since I recall you hoping dithering was the answer as much as I do, it's not looking like it is given @Seagull's results. So far it seems to have no bearing at all on his symptoms.

      hpst

      I'm not so sure it has no bearing on my symptoms. I think I've probably adapted to the dithering patterns produced by different combinations of monitor and card; hence, any change produces discomfort, as I'm no longer seeing the same pattern. I will test this at some point by seeing whether I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.

        Seagull

        I wish it were just a matter of adapting. I have tried to push through symptoms and it only got worse. I've never adapted to ANY device that was painful. I've used devices that looked weird, or that I expected to be a problem based on some factor like a glossy screen, that turned out fine. But nothing functionally uncomfortable ever got better. I have seen others say they were able to adapt, though. Trying not to be negative, and I know spatial dithering hasn't been ruled out yet, but it's feeling like temporal dithering at least isn't as much of a factor as I had hoped. A true major factor will be night-and-day noticeable when identified. I'm guessing dithering might just be another additive factor for some people, like PWM etc.


          diop One thing I was thinking: would it be worth testing a single card in different states, e.g. during the POST/BIOS menu, the Windows desktop, and a Linux desktop, just for further comparison, to see whether these cards dither constantly or only when loading specific software?

          Yes

          hpst I'm guessing dithering might just be another additive factor for some people, like PWM etc.

          However, we can resolve that issue by using a PWM-free monitor. Temporal dithering doesn't have an off switch and, as the results indicate, is being used on practically all modern devices. We also know that dithering involves flickering pixels, so on that basis alone I still believe it's potentially a major aggravator. Reverse-engineering PCoIP would hopefully show the different types of pixel activity between a good and a bad device, as would the capturing @Seagull is currently doing.
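
          To make "flickering pixels" concrete: if the card really captures losslessly, two successive frames of a static desktop should be byte-identical on a non-dithering device and differ only by tiny steps on a dithering one. A minimal sketch of that check in Python (assuming Pillow and NumPy, and that the frames are exported as lossless PNGs; the file names are made up):

              # Compare two successive lossless captures of a static desktop.
              # "frame0.png"/"frame1.png" are hypothetical placeholder names.
              import numpy as np
              from PIL import Image

              a = np.asarray(Image.open("frame0.png").convert("RGB"), dtype=np.int16)
              b = np.asarray(Image.open("frame1.png").convert("RGB"), dtype=np.int16)

              diff = np.abs(a - b)                # per-channel change between frames
              changed = np.any(diff > 0, axis=2)  # True wherever any channel flickered

              print(f"{changed.sum()} of {changed.size} pixels changed")
              print(f"max per-channel step: {diff.max()}")  # temporal dithering is typically +/-1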

            diop Just bumping, wondering @Seagull if any further tests have been done?

            Would anybody know how to attempt to capture PCoIP pixel data? 😛

            4 months later

            Seagull
            If you send the videos to shiva wind at proton mail, I can take a video of what the dithering looks like on E-ink. It will be a secondary confirmation of the method.

              ShivaWind

              I will do this, though it won't be till January. Also, the files are very large, as they are captured losslessly; certainly too large for email. I will find another transfer method.

              2 months later

              Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?

              Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect analysis.
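
              If the captures are raw RGB, the kind of check I'd run for the "where" question is a per-pixel variance map across the five frames: dithered regions light up while truly static pixels stay at zero. A YUV 4:2:0 conversion would smear exactly the subtle per-pixel steps we're looking for, so lossless matters. A rough sketch, assuming five lossless PNG frames (hypothetical names again):

                  # Map where pixel values change across a short capture.
                  import numpy as np
                  from PIL import Image

                  frames = np.stack([
                      np.asarray(Image.open(f"frame{i}.png").convert("RGB"), dtype=np.float32)
                      for i in range(5)
                  ])

                  # Per-pixel temporal variance; zero means the pixel never changed.
                  var_map = frames.var(axis=0).max(axis=2)
                  print(f"{(var_map > 0).mean():.2%} of pixels show temporal activity")

                  # Brighter = more flicker; compare gradient regions against flat fills.
                  vis = (255 * var_map / max(var_map.max(), 1e-9)).astype(np.uint8)
                  Image.fromarray(vis).save("flicker_map.png")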

                Wallboy

                I am pretty sure the card was capturing losslessly; it had a lot of encoding options. I can't remember which setting I used, but it probably had 'lossless' in its name. It created some huge files for only a few seconds of capture as well. I don't have the computer with the card or the data to hand, but I might collect it from my office this month if I have time and space in my travel bag.

                I can't say where dithering happens the most, as I was only capturing the Windows desktop.

                2 months later

                @Seagull I wonder how we square these results with the "consensus", which seems to be that GTX 6xx is more comfy than Intel. That is my experience (also that ditherig doesn't help with symptoms). I find the Microsoft Basic Display Adapter more comfy than Intel, but what is the cause, if Intel is not dithering with ditherig running (or even without it, as these results indicate)?

                  degen

                  Could be that the older 6xx cards used a more random dithering algorithm. That's how I explain to myself how the dithering of my 6-bit monitor is OK while that of a modern card causes discomfort. I'll see if I can test the basic display driver.

                  degen We don't know what we're doing. We need to understand compute-side and display-side technology better, and we need to analyse each factor individually and in combination to get a better idea. So far we're at: PWM often bad, dithering (compute side) often bad, FRC (display side) often bad, sub-pixel rendering often bad, brightness and contrast often important, environmental lighting often important, blue light bad... what else? Backlight patterns, inversion patterns, display input/output technologies (HDMI vs DP vs VGA), response times?

                  I've seen a few posts like yours recently, and I feel pretty much the same as I did back in 2012 when this started for me: frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

                  There are obviously smart people on here. But we make terribly slow, haphazard and often lucky progress.

                    degen Unfortunately I can't test the basic display driver, as it won't recognise my capture card. I might see if there is some way around it; it does kind of feel different to the Intel driver.

                    EDIT: I was able to test the Windows basic driver on a UHD 630 (i7 8700K): no dithering. Also no dithering on that GPU in normal use.

                    Edward I've seen a few posts like yours recently, and I feel pretty much the same as I did back in 2012 when this started for me: frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

                    I have learned more about displays and graphics in the last 5 years thanks to this site and others than ever before. I share your frustrations, however. Everything in my gut tells me that temporal dithering is the issue for most people here. PWM is very easy to rule out (simply don't use a screen that uses PWM!) but then it gets more complicated with pixel inversion (at this point my knowledge runs out).

                    If I have a good device and a bad device using the same HDMI cable and the same monitor, then the techie in me says "OK, it's something to do with the hardware/software in this machine" (the monitor has been ruled out as the cause). So at this point we're looking at the 'point of failure' being:

                    VBIOS (mobo/GPU), which may default to temporal dithering, hence the discomfort on new tech even on POST screens
                    The OS itself (W10 composition spaces)
                    OS drivers (which could be set to temporal dither)

                    VBIOS is locked down and generally inaccessible to changes, AFAIK
                    OS - maybe a regedit or fix could disable any OS tomfoolery
                    OS drivers - Ditherig/Nvidia reg keys, which aren't 100% reliable (one way to hunt for such keys is sketched below)

                    Reading the patents concerning temporal dithering has been somewhat revelatory to me. It does cause flicker, regardless of how minuscule it's claimed to be or how well it "fools the eyes" into seeing the extra colors. It is snake oil for supposed '10-bit' displays that are in fact 8-bit, or it's there to guarantee uniformity of color across panels, which makes sense from a customer-satisfaction perspective, but flicker is flicker.
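
                    To illustrate the "fool the eyes" mechanism: an 8-bit panel fakes a 10-bit level by alternating the two nearest 8-bit codes over successive frames, so the time-average lands in between. A toy sketch of that idea (not any vendor's actual algorithm):

                        # Toy temporal-dithering/FRC illustration, not a vendor algorithm.
                        # Showing 10-bit level 402 on an 8-bit panel: 402/4 = 100.5, which
                        # sits between 8-bit codes 100 and 101, so the pixel must flicker.
                        target = 402 / 4.0          # desired level on the 8-bit scale
                        base = int(target)          # 100
                        frac = target - base        # 0.5, the part dithering has to fake

                        frames, err = [], 0.0
                        for _ in range(8):          # an 8-frame repeating pattern
                            err += frac
                            if err >= 1.0:
                                frames.append(base + 1)
                                err -= 1.0
                            else:
                                frames.append(base)

                        print(frames)                     # [100, 101, 100, 101, ...]
                        print(sum(frames) / len(frames))  # time-average ~100.5

                    The eye integrates the average, but the pixel itself toggles every frame, which is exactly the flicker being described.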

                      diop So for a solution we would perhaps need to engage:

                      • display manufacturers
                      • VBIOS writers
                      • OS writers
                      • OS driver writers (does that mean OEMs for integrated/discrete GPUs, and basically Linux guys?)
                      • I/O standards groups (for HDMI, DP, VGA, DVI, etc.)

                      That seems like a far more complex approach than what has been achieved so far. PWM was solved by display tech alone; the same goes for blue light.

                      Things get more complex when we add that different sufferers are sensitive to different technological issues, and possibly different combinations of issues.

                      I really want to know what people think the best next steps are.

                      dev