Eyestrain solved after 6 years and multiple panels - LG 32gk850g
In my mind it's not about brightness with respect to dithering; it's about movement. Watch @degen's videos. You can see movement everywhere because the E-ink panel makes it visible, unlike an LCD. If you were reading a book where the white background and black letters were always moving a little bit and changing shades, it would be tiring and hard to focus on. I think this is what dithering is like for us, but on a much larger scale, since even what looks like a solid white background is really a bunch of wiggly points rather than a solid area. Compound that with fonts, images, video, etc., and it's constant movement, and that's exhausting for the eyes. All that wiggling, movement, and changing of colors isn't obvious to the naked eye but strains the visual system nonetheless. It's like "seeing" a solid white paper and, when zooming in, realizing it's thousands of maggots crawling all over each other. We may not recognize that movement when zoomed out, but the eye muscles and brain do, and they are filtering it. We are already filtering out so much when looking at LCDs on a macro scale that this might just be the straw that breaks the camel's back.
Obviously this is entirely speculation. The dithering IS there, but it might not bother us at all... or it might be the root cause, and all the other stuff like PWM/eye accommodation might just be additive factors or triggered by it. People were positive PWM was the root cause as well. But we need to rule this out if nothing else, and no other obvious theory currently offers a definitive way to test it.
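To make the "wiggly points" idea concrete, here is a minimal sketch of ordered dithering (in Python, purely illustrative, not any vendor's actual algorithm): quantizing an 8-bit colour value for a 6-bit panel using a 2x2 Bayer matrix. A "solid" area comes out as a fine checkerboard of two adjacent panel levels; temporal dithering (FRC) additionally flips that pattern every frame, which is the movement being speculated about.

```python
# Illustrative sketch only: spatial ordered dithering of an 8-bit
# value down to a 6-bit panel with a 2x2 Bayer threshold matrix.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # thresholds 0..3

def dither_to_6bit(value_8bit, x, y):
    """Quantize an 8-bit value (0-255) to 6 bits (0-63), using the
    pixel position to decide whether to round up or down."""
    scaled = value_8bit * 63 / 255          # ideal 6-bit level, fractional
    base = int(scaled)
    frac = scaled - base
    threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4
    return min(base + (1 if frac > threshold else 0), 63)

# A "solid" 8-bit grey of 200 becomes a mix of two adjacent 6-bit
# levels across neighbouring pixels instead of one uniform level:
levels = {dither_to_6bit(200, x, y) for x in range(2) for y in range(2)}
```

Averaged over the pattern, the panel approximates the requested shade, but no two neighbouring pixels are guaranteed to match, which is exactly the "maggots" effect described above.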
Currently I work on this roughly nine-year-old CCFL-backlit laptop running Windows 7, and I am pretty sure dithering is a big thing for me. My eyes can tell after a few seconds (or even right away) whether it is comfy or not.
I do not have proof like color-banding screenshots or such, but I can say for sure that setting the graphics driver to "enable 10-bit color" made me able to work with Opera and Capture One Pro again.
Otherwise I got bad eye strain after a few minutes, like most of us do.
Running the current version of Manjaro Linux gives me strain again, so I am pretty sure it is software-based in my case. That tells me it should be possible to turn it off in Linux as well. The good thing here is that it is open source, so it is possible to modify the source...
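On the closed-source side, there is at least one driver on Linux that exposes dithering directly. As a query-first sketch (attribute names, target syntax, and enum values vary by driver version, so verify everything against your own `-q` output before assigning anything), the proprietary NVIDIA driver offers per-display dithering controls through `nvidia-settings`:

```shell
# Sketch; assumes the proprietary NVIDIA driver under an X session.
# 1. See which dithering-related attributes your driver exposes:
nvidia-settings -q all 2>/dev/null | grep -i dither

# 2. List display (DPY) target names, e.g. DP-0 or eDP-0:
nvidia-settings -q dpys

# 3. Try disabling dithering for that display. The commonly
#    reported enum is 0 = auto, 1 = enabled, 2 = disabled;
#    confirm against the output of step 1 first:
nvidia-settings --assign "[DPY:DP-0]/Dithering=2"
```

The open-source Intel/AMD drivers handle dithering inside the kernel DRM code instead, which is where modifying the source would come in.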
PWM is its own beast, and I found a laptop with CCFL and PWM bad as well. I also own a 10.1″ PixelQi panel that uses PWM, and it is not as good as I hoped it would be.
Also @Gurm stated in a different thread that Destiny 1 is fine on his Xbox One whereas Destiny 2 is not.
( https://ledstrain.org/d/358-xbox-one/11 ). I am no expert here, but like him I think the graphics card changes its graphics mode and enables dithering. I would suspect it is the color depth...
deepflame I am no expert here, but like him I think the graphics card changes its graphics mode and enables dithering. I would suspect it is the color depth...
From another non-expert's point of view, it DOES seem a logical theory. And since we are never going to get every manufacturer and software maker to agree to turn dithering off or give us a switch... tricking things into believing their needs can be met without dithering seems like the only sweeping approach possible. You have reproducible relief within those apps... now we need to figure out how to get the OS to react like the apps do and make this a system-wide trick. I don't know if this is a plausible idea or not... tricking things into believing they don't need dithering, versus somehow blocking it.
hpst Yeah, I think it makes sense, as every manufacturer is all about HDR, high contrast, sharpness, blah blah... And I think the hardware cannot completely deliver this yet, so they need to help with software voodoo like dithering.
(Yes, blue light and PWM are their own things, but I think we are in the HDR era now, where dithering/FRC plays a big role.)
hpst I didn't even know that forcing 10-bit color could stop dithering. I was assuming that forcing 10-bit color would make it try to dither a 6-bit panel up to 10 bits and fail. Or in the case of an 8-bit panel it might work, using dithering to fake a 10-bit image, but that's not what we want.
It's kind of complicated, but my approach was just reverse engineering and editing the driver code to stop dithering altogether. Harder than it sounds.
JTL So do you think @deepflame's experience is a one-off specific to his/her hardware and apps? Or is it logically reproducible in general, the idea being that the system requests a certain color depth, and that lying to the system (as his setup seems to be doing by claiming a 10-bit panel in a setting even though the hardware really isn't) would work on a macro scale? How would we proceed to determine this, and how would we extrapolate it to a larger solution, like telling the whole system the panel is high-bit, hoping for the same response the apps are giving, with no dithering requested?
JTL If you know any way I could check this on a 6-bit panel running a Linux distro, please let me know: some way to trick the OS into thinking the panel is 8-bit/10-bit etc. to see if this stops the 6+2 dithering crap. This could be a real breakthrough, tactics-wise. I just don't know where Linux makes this request... like, is it in the various drivers? In @deepflame's case it sounds like the individual apps are requesting 10-bit and accepting being told by that software setting that the panel is 10-bit, JUST for those app windows, thus not needing to dither (I am not sure if it's a Windows graphics setting or what). It's probably a lot harder than I am imagining... but it would be amazing if this were a viable way to get around dithering for now.
Adding my experiences to the wider discussion here. I'm A-OK with my Plasma TV, and I'm OK with my Dell 2407/2410 or similar CCFL screens on an Nvidia 970 (and AMD 6950), as well as my 2013 MacBook Pro's IPS screen.
Otherwise, several new LCD TVs, most iPads and iPhones (haven't tried OLED), various laptops, several new computer monitors (high refresh, low blue, AMVA, IPS), an Xbox One, and 980, 980 Ti, 1070, and 1080 cards all give me eye strain. At this point I'm thinking, wow, it's both the screens AND the hardware that's driving them. Realising it was also the video cards was a really sad moment. But then you add the new versions of Chrome and Firefox and Prepar3D Flight Sim V4, which ALSO give me the same eye strain, and the whole situation gets twice as convoluted, and frustrating.
Now, I really don't know what to make of it. Screens do it. Video cards do it. And now windows (part or full screen) can do it too. For the life of me I cannot decide if it's blue light, PWM, dithering or something else. Right now, my money's on dithering or maybe something else. And you know it's damned expensive buying gear to TRY only to have to on-sell it at a loss. Where I live we don't have easy try-before-you-buy or money back returns (except Apple - which I've taken up on a couple of iPad tests).
And yes I've had my eyes tested, blah blah blah, and I counter that by saying - "I can watch/use my older hardware for hours with no problems at all - so it's not my eyes!"
Personally, all I really want right now is to be able to go out and buy a more powerful GPU for my flight sims - I can live with 24" screen(s). But then if I ever change job... I guess I'll need to buy this MacBook Pro 2013 that I'm happily using - from my current employers.
AgentX20 I wonder what's going on there... whether it's app/site dithering or something else. On my super-old laptop, which is my only safe device, those apps/sites don't strain me, even though I can definitely tell they "look" different in some odd way now. Some sort of smooth/shiny/glowing effect that wasn't there before. Maybe my laptop is too old to do the dithering tricks, or to do them as intensely, so it's just not bothering me. I don't have ANY modern hardware that doesn't suck, so I can't even compare sites/apps, as everything on them strains.
hpst I just don't know where Linux makes this request... like, is it in the various drivers?
Monitors can report their bit depth as part of the EDID info they send to the computer. It can be faked. However, I'm not sure whether this approach keeps blocking/faking the true EDID for as long as the system is running. And it is a lot to read up on. I tried reading monitor EDIDs some years ago and saw that my Dell U2515H reported true 8-bit, while the BenQ EW2440L didn't report anything about its color capabilities. Whether software/drivers always respect that info is not clear. Some cards probably ignore it and always dither.
https://wiki.archlinux.org/index.php/kernel_mode_setting#Forcing_modes_and_EDID
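For reference, the declared bit depth lives in byte 20 of an EDID 1.4 blob, the "Video Input Parameters" byte (bit 7 = digital input, bits 6-4 = bits per colour). Here is a small sketch of decoding it; the minimal blob at the end is hypothetical, for illustration only. On Linux the real blob is typically readable from a path like /sys/class/drm/card0-eDP-1/edid (the exact connector name varies per machine).

```python
# Sketch: decode the declared panel bit depth from a raw EDID blob
# (EDID 1.4 "Video Input Parameters" byte at offset 20).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

# EDID 1.4 colour-depth codes in bits 6-4 of byte 20.
DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def declared_bit_depth(edid: bytes):
    """Return the bits-per-colour the display claims, or None if the
    input is analog or the field is left undefined (as with the BenQ
    mentioned above)."""
    if len(edid) < 21 or edid[:8] != EDID_HEADER:
        raise ValueError("not an EDID blob")
    video_input = edid[20]
    if not video_input & 0x80:          # bit 7 clear -> analog input
        return None
    return DEPTHS.get((video_input >> 4) & 0x07)  # bits 6-4

# Hypothetical minimal 128-byte blob: header plus byte 20 = 0xA0
# (digital input, depth code 0b010 = 8 bits per colour).
fake = bytearray(128)
fake[:8] = EDID_HEADER
fake[20] = 0xA0
print(declared_bit_depth(bytes(fake)))  # -> 8
```

A display that leaves bits 6-4 at zero declares no depth at all, so the driver is free to guess, and, per the Arch wiki link above, the kernel can be handed a replacement blob if the real one needs to be overridden.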
KM @JTL I am clearly out of my depth, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting it into believing they didn't need to dither to get those 10 bits. So it sounds like the apps trusted the GPU and not the display?
IF trickery would even work universally, rather than having to get hundreds or more individual panels to report false capabilities back to whatever is requesting that quality, my thought (again based on very shallow ideas and no programming ability, so it might be nonsense) was this: could we come from the other direction and extrapolate what his GPU setting was doing (fooling the requesting apps so they don't bother to dither, since they don't think they need to in order to achieve the 10 bits) to a higher-level, global behavior? Whatever it is in the OS that says "show the desktop and file system and everything else at this quality, and dither to get that quality if you can't natively do it" (just like the apps in his case; it has to be happening somewhere in software, since even a bog-standard distro makes 6-bit laptop panels dither to more colors, right?) would then be fooled the same way, and we would get a system-wide change rather than going app by app.
Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that decides what quality to display the graphical DE in and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know, laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?
hpst @JTL I am clearly out of my depth, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting it into believing they didn't need to dither to get those 10 bits. So it sounds like the apps trusted the GPU and not the display?
I don't have the same software & hardware to test in front of me so I don't know what his setup is doing.
hpst Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that decides what quality to display the graphical DE in and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know, laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?
No. @deepflame's experiment is the first I've heard of this, but I'd need the same hardware in front of me in order to understand what it's doing and whether I can port it to be more universal.