Hi all,

I have bought a capture card to test for dithering, specifically the Blackmagic DeckLink Mini Recorder 4K. It'll arrive next week, and I'll update you when I get some results. Hopefully no one chimes in telling me I've bought the wrong card; it looks like it can do lossless capture at 1080p 60Hz.

@Everyone For testing I have access to several GTX 660s, a GTX 1060, 1st/6th/8th gen Intel i3/i5/i7 chips, a GT 720, and Quadro 295/310 cards. Assuming this all works and you want me to test a card, I can, but you'll need to post it to me in the UK, unless it's a cheap card I can afford to buy.

@JTL This is mostly directed at you, since you were/are going to do this.
I intend to use VLC to play back the captured video and save several frames as .png using VLC's snapshot facility; apparently this is lossless, though I'm not sure if it's sufficient. From there, I have a C++ library that'll decode the .png into colour values that I can use to compare each frame and spot any differences. If you have a better analysis method you can share with me, that'd be great. I can of course send you the video files I capture for your own analysis, though they'll be large files, even with only a few seconds of video.

So the card arrived today, and I've spent the afternoon experimenting. Fortunately, getting it to work wasn't too hard. The results so far are largely what I expected. For testing, I captured a second of uncompressed 8-bit QuickTime YUV using Blackmagic's software, of which five consecutive frames were analysed. The video was of the Windows 10 desktop with the default W10 wallpaper. The program I wrote tests only for temporal dithering; I will make something for spatial dithering, but that's a slightly trickier task.

Initial results:

| Card | Model | Driver | Port | Ditherig setting | Result |
|------|-------|--------|------|------------------|--------|
| Intel integrated graphics | i5 6400 | | DisplayPort (DP) | 'No Dithering' | No temporal dithering detected |
| Intel integrated graphics | i5 6400 | | DP | 'Spatial (default)' | No temporal dithering detected |
| Intel integrated graphics | i5 6400 | | DP | 'Spatial Temporal 1' | Temporal dithering detected |
| GT 210 | | not sure which | HDMI | | Temporal dithering detected |
| GTX 1060 | | latest | HDMI | | Temporal dithering detected |
| GTX 660 | | 388.13 | DP | | Temporal dithering detected |
| GT 720 | | 341.74 | HDMI | | Temporal dithering detected |
| AMD FirePro W700 | | latest | | | Temporal dithering detected |

I am somewhat confident these results are right. I was surprised that ditherig/Intel actually worked, as I've had discomfort using a setup like this, but the results match ditherig's settings. Also no surprise that the 1060 dithered.

There's quite a lot for me to test: different ports/computers, recording text, different software, cards and operating systems. I'll keep you all updated.

edit by: @Slacor to include table

    Here's an unexpected result:

    GTX660, using DP, driver: 388.13 = Temporal dithering detected.

    This is a card I've used a lot, with no more or less strain than anything else. But it does go some way to explaining the peculiar experiences I've had. I cannot use any other 600 series cards, so I wonder if I've just adjusted to this one's particular dithering pattern, having used it for so long. It conflicts a bit with the GT720 being free of dithering, which I am going to double check!

    edit: the GT720 and 210 both dither; I think I just confused some of the video files.

      Seagull

      Thanks for doing this. Lots of good info in just a few days. Do you know when you'll have the spatial testing done? I have a feeling this will just be more depressing ambiguity, based on your results so far. Until we have spatial results too, it's too soon to say "it's not dithering". But the fact that a safe setup has it, and one that is painful does not, doesn't bode well for temporal dithering being a root cause.

      Also, do you have a way to test a Linux distro? Personally, that's very relevant.

        hpst

        I can try it; Linux is free, after all. I have no experience with Linux though, so you'll need to write me an idiot's guide for installing specific drivers or making any changes you want.

        Just tested: AMD firepro W700, latest drivers = Temporal dithering detected.

        I also realised I must have mixed up some of the video files for the Nvidia GT720; I re-tried them and they all dither. So far the only non-temporally-dithering option I've found is the Intel integrated graphics, with or without ditherig.exe. So now I'm wondering why the Intel integrated graphics causes me problems in setups that were otherwise fine. It seems to default to a 59Hz refresh rate, which is a bit odd, as everything else sets my monitors to 60Hz.

          Seagull

          Re Linux: I'm not great at it, just a user who can follow directions. I'd be interested to see what the default setups of common distros (Ubuntu, Fedora, Manjaro, etc.) show in your tests. I have tried to disable dithering in Xubuntu and a few others using the Linux section of the link you posted, and also tried adjusting the default drivers in config files. There were no obvious improvements, and some changes even broke things, but having proof of whether it's actually turning dithering off would be good.

          I have only used laptops for ages, so that limits my testing. I have tried changing the refresh rate to whatever options the distro offered (usually 59, 60, Auto... sometimes there is a 120), but none have ever made a visual or strain difference. It's hard to believe 1Hz, as in your example, would make a difference.

          So am I understanding the update so far correctly: you can only turn dithering off with ditherig.exe on Intel integrated graphics? But those setups strain you either way, even on integrated? And on those same devices with an Nvidia card with dithering enabled (where testing shows dithering actually exists), you have no strain?

            hpst

            All the Nvidia cards bar the GTX660 I use every day cause me strain, and all the Nvidia cards had temporal dithering. They don't all strain me equally though: other 600 series cards strain me, but I'd say at half the intensity of the 1060 I tried. I think the temporal dithering on the GTX 600 series is perhaps a bit subtler, enough that I've been able to adjust to it over the six years I've had it. However, I have found no actual evidence that it is subtler.

            No temporal dithering on Intel integrated graphics. The only way to make it dither temporally appears to be enabling that feature in ditherig.exe's options. I have one office PC setup that I can use with Intel integrated graphics, but another PC at home that I can't. Right now, I'm wondering about the Intel graphics defaulting to 59Hz on my monitors. I don't think I can perceive any difference that small, but it does seem possible some of my monitors misbehave because they're optimised for 60Hz input. No point trying that today though; my eyes are pretty sore from looking at bad graphics cards all day.

            It really does beg the question: if not temporal dithering, why do so many people get discomfort with Intel? Perhaps I should run the test again with something like a web browser open and see if the results change.

              How do you test the frames? Do you check if all bits are identical?

                KM

                Essentially, yes. 'LodePNG' turns each .png frame into a big blob of colour values. I then compare each value of one frame with the corresponding value in the next, and record any values which dither. Slightly worryingly, each pixel has four colour values, which I presume are RGBA, A being alpha (transparency). This is a worry, as the alpha ranges between 0 and 2, but really it should always be zero. So potentially it's not giving me entirely the right colour values.

                The capture card software can output .tga, which I think will be better, but I'll need to sort out decoding those.

                On a slightly more positive note, I'm still thinking about hardware solutions to temporal dithering, and one has promise. I took some of the raw video from a GTX660 and compressed it. I then retested it and found the level of dithering reduced by about 98-99%. So now I'm wondering if there are any capture cards with a pass-through that'll output with compression. The company that made my capture card makes such a device, but it only works through their software; something USB-powered that just works would be a lot more convenient.

                  Seagull Really does beg the question, if not temporal dithering, why do so many people get discomfort with intel? Perhaps I should try the test again with something like a web browser open and see if the results change.

                  I was hoping for something more definitive, but it's not surprising it's another mixed bag. I have tried all the refresh rates available to me and none have made a difference. I don't believe 59 vs 60 could possibly take something from comfortable to painful. It's just illogical. Here's hoping something stands out or clicks. I appreciate you doing this.

                  Seagull I could probably make something to do that but it might cause

                  a) excruciating input lag
                  b) blurriness of the image (blurry fonts cause eyestrain)

                  Seagull What kind of monitor do you use when testing? Wouldn't you get a different result with different cables and different monitors as well? Also, how long has the monitor been on? Is it in a cool basement or on the main floor? Temperature can make a major difference to monitor performance. I think VA is the worst for this and needs 30 minutes to reach a normal level for testing.

                    jasonpicard The monitor has no bearing on the capture card itself, since, if it works the way I think it does, the GPU signal feeds directly into the capture card, or into a splitter which also feeds the monitor.

                    I could see VERY bad quality cables interfering with the results. I once had an HDMI splitter that would cause "snow sparkles" on the image without dropping out.

                    jasonpicard

                    As @JTL says the monitor doesn't affect it. The output from the GPU goes directly into the capture card as if it were a second monitor. I don't use any kind of splitter.

                    It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.

                    One thing I was thinking: would it be worth testing a single card in different states, e.g. during the POST/BIOS menu, the Windows desktop, and a Linux desktop, for further comparison, to see whether these cards dither constantly or only when running specific software?

                    Also, this could be used on games consoles etc., anything that uses HDMI, correct? E.g. a good Xbone and an updated (bad) Xbone? A good DVD/media player and a bad one. The link may not just stretch between PCs but to other devices.

                      diop

                      I was thinking the same thing re the BIOS screen. I'm not 100% sure it'll capture consoles, as there have been some forum posts about problems with this brand and capturing consoles, though that might have been resolved.

                      diop It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.

                      Unfortunately for both of us, since I recall you hoping dithering was the answer as much as I do, it's not looking like it is, given @Seagull's results so far. So far it seems to have no bearing at all on his symptoms.

                        hpst

                        I'm not so sure it has no bearing on my symptoms. I think I've probably adapted to the dithering patterns produced by different combinations of monitor and card; hence any change produces discomfort, as I'm no longer seeing the same pattern. I will test this at some point by seeing whether I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I'm fairly confident this is the case.

                          dev