jasonpicard The monitor has no bearing on the capture card itself. If it works the way I think it does, the GPU signal feeds directly into the capture card, or into a splitter that also feeds the monitor.

I could see VERY bad quality cables interfering with the results. I once had an HDMI splitter that would cause "snow sparkles" on the image without dropping out.

jasonpicard

As @JTL says, the monitor doesn't affect it. The output from the GPU goes directly into the capture card as if it were a second monitor. I don't use any kind of splitter.

It would be great if eventually we could compile a 'dither-free' database, similar to the TFTCentral flicker-free database.

One thing I was thinking - would it be worth testing a single card in different states, e.g. during the POST/BIOS menu, the Windows desktop, and a Linux desktop, to see whether these cards dither constantly or only when specific software is loaded?

Also, this could be used on games consoles and anything else that outputs HDMI, correct? E.g. a good Xbone and an updated (bad) Xbone, or a good DVD/media player and a bad one. The link may not just be between PCs but between other devices too.

    diop

    I was thinking the same thing re: the BIOS screen. I'm not 100% sure it'll capture consoles, as there have been some forum posts about problems with this brand and capturing consoles, though that might have been resolved.

    diop It would be great if eventually we could compile a 'dither-free' database, similar to the TFTCentral flicker-free database.

    Unfortunately for both of us (I recall you hoping dithering was the answer as much as I did), it's not looking like it is, given @Seagull's results so far. It seems to have no bearing at all on his symptoms.

      hpst

      I'm not so sure it has no bearing on my symptoms. I think I've probably adapted to the dithering patterns produced by different combinations of monitor and card; hence, any change produces discomfort because I'm no longer seeing the same pattern. I will test this at some point by seeing whether I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.

        Seagull

        I wish it was just a matter of adapting. I have tried to push through symptoms and it only got worse; I've never adapted to ANY device that was painful. I've used devices that looked weird, or that I expected to be problems based on some factor like a glossy screen, that turned out fine, but nothing genuinely uncomfortable ever got better. I have seen others say they were able to adapt, though. Trying not to be negative, and I know spatial dithering hasn't been ruled out yet, but it's feeling like temporal dithering at least isn't as much of a factor as I had hoped. A true major factor will be night-and-day noticeable when identified. I'm guessing dithering might just be another additive for some people, like PWM etc.

        • diop replied to this.

          diop One thing I was thinking - would it be worth testing a single card in different states, e.g. during the POST/BIOS menu, the Windows desktop, and a Linux desktop, to see whether these cards dither constantly or only when specific software is loaded?

          Yes

          hpst I'm guessing dithering might just be another additive for some people, like PWM etc.

          However, we can resolve that issue by using a PWM-free monitor. Temporal dithering doesn't have an off switch and, as the results indicate, is in use on practically all modern devices. We also know that dithering involves flickering pixels, so on that basis alone I still believe it's potentially a major aggravator. Reverse-engineering PCoIP would hopefully show the different types of pixel activity between a good and a bad device, as would the current capturing by @Seagull.
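          To make the "pixel activity" idea concrete, something like this is what I'd run on a pair of captured frames (a rough Python/OpenCV sketch; the file names are invented, and it assumes the frames were exported losslessly as PNGs):

          ```python
          # Rough sketch: compare two consecutive captured frames of a STATIC image.
          # On a static desktop, any pixel that changes between frames is flicker;
          # temporal dithering typically shows up as +/-1 deltas across much of the screen.
          import cv2
          import numpy as np

          a = cv2.imread("frame_0001.png").astype(np.int16)  # hypothetical file names
          b = cv2.imread("frame_0002.png").astype(np.int16)

          diff = np.abs(a - b)                 # per-pixel, per-channel delta
          changed = np.any(diff > 0, axis=2)   # pixels that changed in any channel

          print("changed pixels: %.2f%%" % (100.0 * changed.mean()))
          print("max channel delta:", diff.max())
          ```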

            diop Just bumping - wondering, @Seagull, if any further tests have been done?

            Would anybody know how to attempt to capture PCoIP pixel data? 😛

            4 months later

            Seagull
            If you send the videos to shiva wind at proton mail, I can take a video of what the dithering looks like on E-ink. It will be a secondary confirmation of the method.

              ShivaWind

              I will do this, though it won't be till January. Also, the files are very large as they were captured losslessly, certainly too large for email. I will find another transfer method.

              2 months later

              Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?

              Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect analysis.
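              If you can share them, this is roughly the check I'd run to see where the flicker concentrates (a rough Python sketch, assuming five lossless PNG frames of the same static image; file names invented):

              ```python
              # Sketch: per-pixel activity map over a short burst of frames of a static
              # image. Bright areas in the output are pixels that change frame to frame,
              # i.e. where temporal dithering (if any) is happening.
              import cv2
              import numpy as np

              frames = [cv2.imread("frame_%04d.png" % i).astype(np.float32) for i in range(1, 6)]
              stack = np.stack(frames)                          # shape: (5, H, W, 3)

              activity = stack.max(axis=0) - stack.min(axis=0)  # per-pixel range over time
              heatmap = activity.max(axis=2)                    # strongest channel per pixel

              # Dither deltas are usually tiny (+/-1), so scale up for visibility.
              cv2.imwrite("activity_map.png", np.clip(heatmap * 64, 0, 255).astype(np.uint8))
              ```

              Re: 4:2:0 - chroma subsampling stores colour at half resolution in both directions, effectively averaging it over 2x2 blocks, so it could easily smear out single-pixel dither deltas. A true RGB/4:4:4 capture path would matter for this kind of analysis.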

                Wallboy

                I am pretty sure the card was capturing losslessly; it had a lot of encoding options. I can't remember which setting I used, but it probably had 'lossless' in its name. It created some huge files for only a few seconds of capture as well. I don't have the computer with the card or the data to hand, but I might collect it from my office this month if I have time and space in my travel bag.

                I can't say where dithering happens the most, as I was only capturing the Windows desktop.
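                (For context on the file sizes: uncompressed 24-bit 1080p is 1920 × 1080 × 3 ≈ 6.2 MB per frame, so at 60 fps that's roughly 373 MB per second, i.e. gigabytes for even a short clip. That's assuming the capture was 1080p60; I'd have to check the exact settings I used.)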

                2 months later

                @Seagull I wonder how we square these results with the "consensus", which seems to be that GTX 6xx is more comfy than Intel. That is my experience (also that ditherig doesn't help with symptoms). I find the Microsoft Basic Display Adapter more comfy than Intel, but what is the cause, if Intel is not dithering with ditherig running (or even without it, as these results indicate)?

                  degen

                  Could be that the older 6xx cards used a more random dithering algorithm. That's how I explain to myself why the dithering of my 6-bit monitor is OK but that of a modern card causes discomfort. I'll see if I can test the basic display driver.
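                  As a toy illustration of what I mean (definitely not how any real driver implements it), both patterns below average out to the same in-between level; the ordered one just flickers in a fixed repeating beat, while the random one gives the eye no pattern to lock onto:

                  ```python
                  # Toy example: faking the value 100.25 on a display that can only show
                  # integers, by flipping pixels between 100 and 101 over successive frames.
                  import numpy as np

                  rng = np.random.default_rng(0)
                  H, W, FRAMES = 4, 4, 8

                  # Ordered: a fixed repeating phase pattern - neighbouring pixels flicker
                  # in lockstep, producing a regular spatial/temporal beat.
                  phase = (np.arange(H)[:, None] + np.arange(W)[None, :]) % 4
                  ordered = [100 + ((phase + t) % 4 == 0).astype(int) for t in range(FRAMES)]

                  # Random: each pixel independently rounds up with 25% probability per
                  # frame - same average, but no repeating pattern.
                  random_frames = [100 + (rng.random((H, W)) < 0.25).astype(int) for _ in range(FRAMES)]

                  print("ordered mean:", np.mean(ordered))        # exactly 100.25
                  print("random mean: ", np.mean(random_frames))  # ~100.25
                  ```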

                  degen We don't know what we're doing. We need to understand compute-side and display-side technology better, and we need to analyse each item individually and in combination to get a better idea. So far we're at: PWM often bad, dithering (compute side) often bad, FRC (display side) often bad, sub-pixel rendering often bad, brightness and contrast often important, environmental lighting often important, blue light bad... what else? Backlight patterns, inversion patterns, display input/output technologies (HDMI vs DP vs VGA), response times?

                  I've seen a few posts like yours recently, and I feel pretty much the same as I did back in 2012 when this started for me: frustrated, struggling, and not clear on which technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?

                  There are obviously smart people on here. But we make terribly slow, haphazard and often lucky progress.

                    degen Unfortunately I can't test the basic display driver, as it won't recognise my capture card. I might see if there is some way around it. It does feel somewhat different to the Intel driver.

                    EDIT: I was able to test the Windows basic driver on the UHD 630 of an i7-8700K: no dithering. Also no dithering on that GPU in normal use.

                    dev