I have bought a capture card
Seagull What kind of monitor do you use when testing? Wouldn't you get a different result with different cables and different monitors as well? Also, how long has the monitor been on? Is it in a cool basement or on the main floor? Temperature can make a major difference to monitor performance. I think VA is the worst for this and needs 30 minutes to reach a normal level for testing.
jasonpicard The monitor has no bearing on the capture card itself, since if it works the way I think it does, the GPU signal is fed directly into the capture card, or into a splitter which also feeds the monitor.
I could see VERY bad quality cables interfering with the results. I once had an HDMI splitter that would cause "snow sparkles" on the image without dropping out.
As @JTL says the monitor doesn't affect it. The output from the GPU goes directly into the capture card as if it were a second monitor. I don't use any kind of splitter.
It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.
One thing I was thinking - would it be worth testing a single card in different states, e.g. during the POST/BIOS menu, the Windows desktop, and a Linux desktop, just for further comparison, to see whether these cards dither constantly or only when loading specific software?
This could also be used on games consoles etc. - anything that uses HDMI, correct? E.g. a good xbone and an updated (bad) xbone, or a good DVD/media player and a bad one. The link may not just be between PCs but other devices as well.
diop It would be great if eventually we could compile a 'dither-free' database similar to the tftcentral flicker-free db.
Unfortunately for both of us, since I recall you hoping dithering was the answer as much as I do, it's not looking like it is, given @Seagull's results so far. It seems to have no bearing at all on his symptoms.
I'm not so sure it has no bearing on my symptoms. I think I've probably adapted to dithering patterns produced by different combinations of monitor/card. Hence, any changes produce discomfort as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.
I wish it were just a matter of adapting. I have tried to push through symptoms and it only got worse. I've never adapted to ANY device that was painful. I've used devices that looked weird, or that I expected to be problems based on some factor like a glossy screen, that turned out fine. But nothing functionally uncomfortable ever got better. I have seen others say they were able to adapt though. Trying not to be negative, and I know spatial dithering hasn't been ruled out yet, but it's feeling like temporal dithering at least isn't as much of a factor as I had hoped. A true major factor will be night-and-day noticeable when identified. I'm guessing dithering might just be another additive for some people, like PWM etc.
hpst I'm guessing dithering might just be another additive for some people like PWM etc.
However, we can resolve that issue by using a PWM-free monitor. Temporal dithering doesn't have an off switch and, as the results indicate, is being used on practically all modern devices. We also know that dithering involves flickering pixels, so on that basis alone I still believe it's potentially a major aggravator. Reverse-engineering PCoIP would hopefully show the different types of pixel activity between a good and a bad device, as would the current captures by @Seagull.
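To illustrate the mechanism (a toy Python sketch, numbers purely hypothetical): a level that can't be represented in 8 bits gets approximated by alternating the two nearest levels on successive frames, so the pixel flickers even on a completely static image.

    # Toy illustration: a target level of 100.5 can't be shown in 8 bits, so the
    # panel/GPU alternates the two nearest levels (100 and 101) on successive frames.
    # The time-average matches the target, but the pixel itself flickers every frame.
    target = 100.5
    frames = [100 if i % 2 == 0 else 101 for i in range(120)]   # roughly 2 seconds at 60 Hz
    print("average shown:", sum(frames) / len(frames))          # -> 100.5
    print("frame-to-frame change:", frames[1] - frames[0])      # -> 1 (the flicker)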
Seagull
If you send the videos to shiva wind at proton mail, I can take a video of what the dithering looks like on E-ink. It will be a secondary confirmation of the method.
Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?
Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect analysis.
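Something along these lines is roughly what I have in mind for the frame comparison (a minimal Python sketch, assuming the frames were exported as lossless PNGs; the frame_0.png ... frame_4.png names are just placeholders):

    # Sketch: compare consecutive captured frames pixel by pixel.
    # Any non-zero difference on a static desktop image would suggest temporal
    # dithering (assuming the capture itself is lossless and nothing on screen moved).
    import numpy as np
    from PIL import Image

    frames = [np.asarray(Image.open(f"frame_{i}.png"), dtype=np.int16) for i in range(5)]

    for i in range(len(frames) - 1):
        diff = np.abs(frames[i + 1] - frames[i])
        changed = np.count_nonzero(diff.max(axis=-1))   # pixels that changed in any channel
        total = diff.shape[0] * diff.shape[1]
        print(f"frame {i} -> {i + 1}: {changed}/{total} pixels changed, max delta {diff.max()}")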
I am pretty sure the card was capturing losslessly; it had a lot of encoding options. I can't remember what setting I used, but it probably had lossless in its name. It created some huge files for only a few seconds of capture as well. I don't have the computer with the card or data to hand, but I might be collecting it from my office this month if I have time and space in my travel bag.
I can't say where dithering happens the most, as I was only capturing the Windows desktop.
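If the files turn up again, one way to double-check the lossless question from above would be to inspect the stream's pixel format, e.g. with ffprobe (this assumes ffmpeg/ffprobe is installed; capture.mkv is a placeholder file name). A chroma-subsampled format like yuv420p would already have smeared out some of the per-pixel dithering detail.

    # Sketch: ask ffprobe for the codec and pixel format of the capture file.
    # rgb24 / bgr0 style formats keep per-pixel detail; yuv420p means chroma was
    # subsampled and some single-pixel dithering information is already gone.
    import subprocess

    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "default=noprint_wrappers=1",
         "capture.mkv"],                     # placeholder file name
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)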
@Seagull I wonder how we square these results with the "consensus", which seems to be that GTX 6xx is more comfy than Intel. That is my experience (also that ditherig doesn't help with symptoms). I find the Microsoft Basic Display Adapter more comfy than Intel, but what is the cause if Intel is not dithering with ditherig running (or even when it's not running, as these results indicate)?