KM

Essentially yes. 'LodePNG' turns each .png frame into a big blob of colour values. I then compare each value of one frame with the next and record any values which dither. Slightly worryingly, each pixel has four colour values, which I presume are RGBA, A being Alpha (transparency). This is a worry, as the Alpha ranges between 0 and 2 when really it should always be zero. So potentially it's not giving me entirely the right colour values.
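A rough sketch of that comparison step (hypothetical helper name; assumes each decoded frame is a flat list of 8-bit values in RGBA order, which is LodePNG's default 32-bit decode layout):

```python
# Sketch of the per-frame comparison, assuming frames are flat RGBA lists
# as produced by LodePNG's default 32-bit decode.
def count_dithering_pixels(frame_a, frame_b):
    """Count pixels whose RGB components differ between two frames.

    Alpha (every fourth value) is skipped, since the capture should be
    fully opaque and any alpha variation is likely noise.
    """
    assert len(frame_a) == len(frame_b)
    changed = 0
    for i in range(0, len(frame_a), 4):
        if frame_a[i:i + 3] != frame_b[i:i + 3]:  # compare R, G, B only
            changed += 1
    return changed

# Two 2-pixel frames: the second pixel's blue channel flips by one step.
f1 = [10, 20, 30, 0, 40, 50, 60, 0]
f2 = [10, 20, 30, 0, 40, 50, 61, 0]
print(count_dithering_pixels(f1, f2))  # → 1
```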

The capture card software can output .tga, which I think will be better, but I'll need to sort out decoding those.

On a slightly more positive note, I'm still thinking of hardware solutions to temporal dithering, and one has promise. I took some of the raw video from a GTX660 and compressed it. I then retested it and found the level of dithering reduced by about 98-99%. So now I am wondering if there are any capture cards with a pass-through that'll output with compression. The company that made the capture card I have makes such a device, but it only works through their software; something USB-powered that just works would be a lot more convenient.
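That result would make sense if lossy compression is quantising away single-LSB flicker. A minimal sketch of the effect (the quantise step here is a crude stand-in for what a lossy codec does, not how any specific encoder works):

```python
# Count how many values differ between two frames (flat lists of 8-bit values).
def changed(frame_a, frame_b):
    return sum(a != b for a, b in zip(frame_a, frame_b))

# Crude stand-in for lossy compression: snap values to a coarser grid,
# which swallows single-LSB temporal dithering.
def quantise(frame, step=4):
    return [v - v % step for v in frame]

f1 = [100, 101, 102, 103]
f2 = [101, 100, 103, 102]  # every value flickers by one LSB
print(changed(f1, f2))                      # → 4
print(changed(quantise(f1), quantise(f2)))  # → 0
```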

    Seagull Really does beg the question: if not temporal dithering, why do so many people get discomfort with Intel? Perhaps I should try the test again with something like a web browser open and see if the results change.

    I was hoping for something more definitive, but it's not surprising it's another mixed bag. I have tried all the refresh rates available to me and none have made a difference. I don't believe 59 vs 60 could possibly take something from comfortable to painful. It is just illogical. Here's to hoping something stands out or clicks. I appreciate you doing this.

    Seagull I could probably make something to do that but it might cause

    a) excruciating input lag
    b) blurriness of the image (blurry fonts cause eyestrain)

    Seagull What kind of monitor do you use when testing? Wouldn't you generate a different result with different wires and different monitors as well? Also, how long has the monitor been on? Is it in a cool basement or on the main floor? Temperature can make a major difference to monitor performance. I think VA is the worst for this and needs 30 minutes to reach a normal level for testing.

      jasonpicard The monitor has no bearing on the capture card itself since, if it works the way I think it does, the GPU signal feeds directly into the capture card, or into a splitter which also feeds the monitor.

      I could see VERY bad quality cables interfering with the results. I once had an HDMI splitter that would cause "snow sparkles" on the image without dropping out.

      jasonpicard

      As @JTL says the monitor doesn't affect it. The output from the GPU goes directly into the capture card as if it were a second monitor. I don't use any kind of splitter.

      It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.

      One thing I was thinking - would it be worth testing a single card in different states e.g. During POST/BIOS menu, Win OS Desktop, Linux OS Desktop just for further comparison to see if there is dithering on these cards constantly or when loading specific software.

      Also, this could be used on games consoles etc., anything that uses HDMI, correct? E.g. a good xbone and an updated (bad) xbone? A good DVD/media player and a bad DVD/media player. The link may not just stretch between PCs but to other devices.

        diop

        I was thinking the same thing re the BIOS screen. I'm not 100% sure it'll capture consoles, as there have been some forum posts about problems with this brand and capturing consoles, though that might have been resolved.

        diop It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.

        Unfortunately for both of us, since I recall you hoping dithering was the answer as much as I do, it's not looking like it is, given @Seagull's results so far. So far it seems to have no bearing at all on his symptoms.

          hpst

          I'm not so sure it has no bearing on my symptoms. I think I've probably adapted to the dithering patterns produced by different combinations of monitor/card; hence any change produces discomfort, as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.

            Seagull

            I wish it was just a matter of adapting. I have tried to push through symptoms and it only got worse. I've never adapted to ANY device that was painful. I've used devices that looked weird, or that I expected to be problems based on some factor like a glossy screen etc., that turned out fine. But nothing functionally uncomfortable ever got better. I have seen others say they were able to adapt, though. Trying not to be negative, and I know spatial dithering hasn't been ruled out yet, but it's feeling like temporal dithering at least isn't as much of a factor as I had hoped. A true major factor will be night-and-day noticeable when identified. I'm guessing dithering might just be another additive for some people, like PWM etc.


              diop One thing I was thinking - would it be worth testing a single card in different states e.g. During POST/BIOS menu, Win OS Desktop, Linux OS Desktop just for further comparison to see if there is dithering on these cards constantly or when loading specific software.

              Yes

              hpst I'm guessing dithering might just be another additive for some people like PWM etc.

              However, we can resolve that issue by using a PWM-free monitor. Temporal dithering doesn't have an off switch and, as the results indicate, is being used on practically all modern devices. We also know that dithering involves flickering pixels, so on that basis alone I still believe it's potentially a major aggravator. Reverse-engineering PCoIP would hopefully show the different types of pixel activity between a good and a bad device, as would the current capturing by @Seagull.

                diop Just bumping, wondering @Seagull if any further tests have been done?

                Would anybody know how to attempt to capture PCoIP pixel data? 😛

                4 months later

                Seagull
                If you send the videos to shiva wind at proton mail, I can take a video of what the dithering looks like on E-ink. It will be a secondary confirmation of the method.

                  ShivaWind

                  I will do this, though it won't be till January. Also, the files are very large as they are captured losslessly, certainly too large for email. I will find another transfer method.

                  2 months later

                  Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?

                  Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect the analysis.
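                  If the card does convert, 4:2:0 matters a lot for this kind of analysis: one U/V pair is shared across each 2x2 pixel block, so a one-LSB colour change in a single pixel can be averaged and rounded away before it ever reaches the frame comparison. A rough illustration, assuming full-range BT.601 coefficients (real capture paths may use different matrices or ranges):

```python
def rgb_to_yuv(r, g, b):
    # Full-range BT.601 conversion (an assumption; hardware may use BT.709).
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128
    return y, u, v

# A flat grey 2x2 block, then the same block with one pixel's blue
# raised by a single dithering step.
block_a = [(128, 128, 128)] * 4
block_b = [(128, 128, 128)] * 3 + [(128, 128, 129)]

def subsampled(block):
    # 4:2:0 keeps per-pixel luma but shares one averaged U/V pair per
    # 2x2 block, then quantises everything to 8-bit integers.
    ys = [round(rgb_to_yuv(*p)[0]) for p in block]
    u = round(sum(rgb_to_yuv(*p)[1] for p in block) / 4)
    v = round(sum(rgb_to_yuv(*p)[2] for p in block) / 4)
    return ys, u, v

print(subsampled(block_a) == subsampled(block_b))  # → True: the dither step vanishes
```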

                    dev