I'm not so sure it has no bearing on my symptoms. I think I've probably adapted to dithering patterns produced by different combinations of monitor/card. Hence, any changes produce discomfort as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.
I have bought a capture card
I wish it was just a matter of adapting. I have tried to push through symptoms and it only got worse. I've never adapted to ANY device that was painful. I've used devices that looked weird, or that I expected to be a problem based on some factor like a glossy screen, etc., that turned out fine. But nothing functionally uncomfortable ever got better. I have seen others say they were able to adapt, though. Trying not to be negative, and I know spatial dithering hasn't been ruled out yet, but it's feeling like temporal dithering at least isn't as much of a factor as I had hoped. A true major factor will be night-and-day noticeable when identified. I'm guessing dithering might just be another additive factor for some people, like PWM, etc.
hpst I'm guessing dithering might just be another additive factor for some people, like PWM, etc.
However, we can resolve that issue by using a PWM-free monitor. Temporal dithering doesn't have an off switch and, as the results indicate, is being used on practically all modern devices. We also know that dithering involves flickering pixels, so on that basis alone I still believe it's potentially a major aggravator. Reverse-engineering PCoIP would hopefully show the difference in pixel activity between a good and a bad device, as would the current capture work by @Seagull.
Seagull
If you send the videos to shiva wind at proton mail, I can take a video of what the dithering looks like on E-ink. It will be a secondary confirmation of the method.
Do you mind sharing those 5 frames you captured? I'm guessing you're testing for differences between each frame? I'm curious where dithering is happening the most. I'm guessing on images with smooth transitions/gradients. But what about flat color images?
Also, do these cards capture lossless/raw? Or do they apply some sort of YUV 4:2:0 conversion? I wonder how this would affect analysis.
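For anyone who wants to try the per-frame comparison being discussed here, a minimal sketch, assuming the capture has been exported as lossless PNG frames (the filenames and the 64-pixel tile size are placeholders; requires Pillow and NumPy):

```python
# Compare two consecutive captured frames of a static desktop and report
# where pixels changed between them.
import numpy as np
from PIL import Image

def load_frame(path: str) -> np.ndarray:
    """Load a frame as an HxWx3 int16 array so differences cannot overflow."""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)

a = load_frame("frame_000.png")  # placeholder filenames
b = load_frame("frame_001.png")

diff = np.abs(a - b)              # per-channel absolute difference
changed = (diff > 0).any(axis=2)  # True wherever any channel changed

print(f"changed pixels: {changed.sum()} / {changed.size} ({100.0 * changed.mean():.3f}%)")
print(f"max per-channel step: {diff.max()}")  # temporal dithering is typically +/- 1 LSB

# Crude activity map: count changed pixels per 64x64 tile, to see whether
# gradients flicker more than flat-colour regions.
h, w = changed.shape
tile = 64
tiles = np.add.reduceat(
    np.add.reduceat(changed.astype(int), np.arange(0, h, tile), axis=0),
    np.arange(0, w, tile), axis=1)
print(tiles)  # larger numbers = more pixel activity in that region of the screen
```

If the tiles covering gradients light up while the flat-colour tiles stay at zero, that would point at dithering concentrating on smooth transitions, as suspected above.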
I am pretty sure the card was capturing losslessly; it had a lot of encoding options. I can't remember what setting I used, but it probably had "lossless" in its name. It created some huge files for only a few seconds of capture as well. I don't have the computer with the card or data to hand, but I might be collecting it from my office this month if I have time and space in my travel bag.
I can't say about where dithering happens the most as I was only capturing the windows desktop.
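One way to check what a capture file actually contains is to ask ffprobe (part of FFmpeg) for the codec and pixel format of its video stream. A minimal sketch, assuming ffprobe is installed and using a placeholder filename:

```python
# Report the codec and pixel format of the first video stream, to see whether
# the recording kept full RGB or was converted to subsampled YUV.
import json
import subprocess

def video_stream_info(path: str) -> dict:
    """Return codec name and pixel format of the first video stream."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)["streams"][0]

info = video_stream_info("capture.avi")  # placeholder filename
print(info["codec_name"], info["pix_fmt"])
# Something like "rawvideo bgr24" or "ffv1 rgb24" suggests a lossless RGB capture;
# "h264 yuv420p" means chroma was subsampled, which can smear +/-1 LSB dither steps.
```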
@Seagull I wonder how we square these results with the "consensus", which seems to be that GTX 6xx is more comfy than Intel. That is my experience (also that ditherig doesn't help with symptoms). I find the Microsoft Basic Display Adapter is more comfy than Intel, but what is the cause if Intel is not dithering with ditherig running (or even when it's not running, as these results indicate)?
degen We don't know what we're doing. We need to understand compute side and display side technology better. And we need to analyse each item individually and in combination to get a better idea. So far we're at: PWM often bad, dithering (compute side) often bad, FRC (display side) often bad, sub-pixel rendering often bad, brightness & contrast often important, environmental lighting often important, blue light bad...what else? Backlight patterns, inversion patterns, display input/output technologies (HDMI vs DP vs VGA), response times?
I've seen a few posts like yours recently. And I feel pretty much the same as I did back in 2012 when this started for me - frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?
There are obviously smart people on here. But we make terribly slow, haphazard and often lucky progress.
degen Unfortunately I can't test the basic display driver, as it won't recognise my capture card. I might see if there is some way around it; it does kind of feel different to the Intel driver.
EDIT: I was able to test the Windows basic driver on a UHD 630 (i7-8700K): no dithering. Also no dithering on that GPU in normal use.
Edward I've seen a few posts like yours recently. And I feel pretty much the same as I did back in 2012 when this started for me - frustrated, struggling, and not clear on what technologies are even relevant to my situation. Does this sound true: the problem is more complex than our currently available means of investigating it? Whether it is or not, what should we be doing?
I have learned more about displays and graphics in the last 5 years thanks to this site and others than ever before. I share your frustrations, however. Everything in my gut tells me that temporal dithering is the issue for most people here. PWM is very easy to rule out (simply don't use a screen that uses PWM!) but then it gets more complicated with pixel inversion (at this point my knowledge runs out).
If I have a good device and a bad device using the same HDMI cable and the same monitor, then the techie in me says "OK, it's something to do with the hardware/software in this machine" (the monitor has been ruled out as the cause). So at this point we're looking at the 'point of failure' being:
- VBIOS (mobo/GPU), which may default to temporal dither, hence why we get discomfort on new tech even on POST screens
- The OS itself (W10 composition spaces)
- OS drivers (which could be set to temporal dither)
And what can be done about each:
- VBIOS is locked down and generally inaccessible to make changes, AFAIK
- OS - maybe a regedit or fix could disable any OS tomfoolery
- OS drivers - ditherig / Nvidia reg keys, which aren't 100%
Reading into the patents concerning Temporal Dithering has been somewhat revelatory to me. It does cause flicker, regardless of how minuscule it's claimed to be or how it can "fool the eyes" into seeing the extra colors. It is snake oil for supposed '10-bit' displays, which are in fact 8-bit. Or it's used to guarantee uniformity of color on any panel. Which makes sense from a customer-satisfaction perspective, but flicker is flicker.
diop So for a solution we need to engage perhaps:
- display manufacturers
- VBIOS writers
- OS writers
- OS driver writers (does that mean OEMs for integrated/discrete GPUs, and basically Linux guys?)
- I/O standards groups (for HDMI, DP, VGA, DVI, etc.)
That seems like a far more complex approach than what has been achieved so far. PWM was solved by display tech alone; the same goes for blue light.
Things get more complex when we add that different sufferers are sensitive to different technological issues, and possibly different combinations of issues.
I really want to know what people think the best next steps are.
diop Reading into the patents concerning Temporal Dithering has been somewhat revelatory to me.
On a related note, I have figured out I can test whether temporal dithering creates a flicker effect by seeing if the dithering pattern repeats, and how many frames it takes to repeat. I will have a go at it when my workload is a bit lower.
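A minimal sketch of that repetition test, assuming a sequence of losslessly captured frames of a static image saved as PNGs (the filenames and the 60-frame window are placeholders):

```python
# Find the smallest offset k at which the captured frame sequence repeats,
# i.e. frame[i] == frame[i+k] for every i in the window.
import numpy as np
from PIL import Image

paths = [f"frame_{i:03d}.png" for i in range(60)]  # e.g. one second at 60 Hz
frames = [np.asarray(Image.open(p).convert("RGB")) for p in paths]

def repetition_period(frames):
    """Smallest offset k such that the sequence repeats; None if no repeat found."""
    n = len(frames)
    for k in range(1, n):
        if all(np.array_equal(frames[i], frames[i + k]) for i in range(n - k)):
            return k
    return None

period = repetition_period(frames)
if period is None:
    print("pattern did not repeat within the captured window")
elif period == 1:
    print("all frames are identical: no temporal dithering visible")
else:
    print(f"dithering pattern repeats every {period} frames")
```

A period of 1 means nothing is changing frame to frame, while a short period (e.g. 2 or 4 frames) would indicate a cyclic dithering pattern, and therefore a flicker at a fraction of the refresh rate.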
Seagull I think I've probably adapted to dithering patterns produced by different combinations of monitor/card. Hence, any changes produce discomfort as I'm no longer seeing the same pattern. I will test this out at some point by seeing if I can adjust to an uncomfortable card. I know I can adjust to uncomfortable smartphones, so I am fairly confident this is the case.
Hope it's OK to bump this old thread, but I'm beginning to suspect this could be the issue for some of us. Do you still think this theory is true, @Seagull?
Yes, for many years it has been key to how I manage my symptoms. Above a certain intensity of eyestrain/migraines I will abandon a device because it is unlikely I will be able to adapt to it, but for most devices I will try to get used to them. I've never been able to adapt to an IPS LCD device, but flicker-free TN LCDs and OLEDs now only take me a few weeks/months to become usable.
It has been harder than it sounds, though. A lot of my eye strain symptoms are linked to diet; I have a lot of food intolerances which trigger eyestrain and migraines. It's only since finding those intolerances and managing them that adapting to bad devices has been an option. I need to keep my body, and therefore my brain, as healthy as possible to be able to adapt to new devices.