Eyestrain solved after 6 years and multiple panels - LG 32gk850g
AgentX20 I wonder what's going on there... if it's app/site dithering or something else. On my super old laptop, which is my only safe device, those apps/sites don't strain me even though I can definitely tell they "look" different in some odd way now. Some sort of smooth/shiny/glowing effect that wasn't there before. Maybe my laptop is too old to do the dithering tricks, or to do them as intensely, so it's just not bothering me. I don't have ANY modern hardware that doesn't suck so I can't even compare sites/apps, as everything on them strains.
hpst I just don't know where linux makes this request... like is it in the various drivers?
Monitors can report their bit depth as part of the EDID info they send to the computer. It can be faked. However, I'm not sure if this approach keeps blocking/faking the true EDID for as long as the system is running. And it is a lot to read. I tried reading the monitor EDID some years ago and saw that my Dell U2515H reported true 8-bit, while the BenQ EW2440L didn't report anything regarding its color capabilities. Whether software/drivers always respect that info is not clear. Some cards probably ignore it and always dither.
https://wiki.archlinux.org/index.php/kernel_mode_setting#Forcing_modes_and_EDID
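For what it's worth, the bit depth a panel claims is encoded in byte 20 of an EDID 1.4 block, and the kernel exposes the raw EDID under /sys/class/drm. A minimal sketch to dump what each connected display reports (python3 assumed; EDID 1.3 panels usually just come back as "undefined", which may be what the BenQ was doing):

```python
# Rough sketch: decode the reported bits-per-color from each connected
# monitor's EDID on Linux. Byte 20 of an EDID 1.4 block encodes the bit
# depth for digital inputs; EDID 1.3 blocks don't, so "undefined" is common.
import glob

DEPTHS = {0b000: "undefined", 0b001: "6 bpc", 0b010: "8 bpc", 0b011: "10 bpc",
          0b100: "12 bpc", 0b101: "14 bpc", 0b110: "16 bpc", 0b111: "reserved"}

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128:                 # empty file -> nothing connected here
        continue
    video_input = edid[20]              # "video input definition" byte
    if video_input & 0x80:              # bit 7 set -> digital input
        depth = DEPTHS[(video_input >> 4) & 0x07]
        print(f"{path}: digital input, reported depth: {depth}")
    else:
        print(f"{path}: analog input, no bit depth field")
```

That only tells you what the panel claims, not whether the driver actually honors it.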
KM @JTL I am clearly out of my depth, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting it into believing they didn't need to dither to get those 10 bits, so it sounds like the apps trusted the GPU and not the display?
IF trickery would even work universally, rather than having to make hundreds or more of individual panels report false capability back to whatever is requesting that quality, my thought (again based on very shallow ideas and no programming ability, so it might be nonsense) was: could we come from the other direction and extrapolate what his GPU setting was doing (fooling the requesting apps so they don't bother to dither, since they don't think they need to in order to achieve the 10 bits) into a higher-level, global behavior? So that whatever it is in the OS that says "show the desktop and file system and everything else at this quality, and dither to get that quality if you can't natively do it" (just like the apps in his case, and which has to be happening somewhere in software, since a bog-standard distro makes 6-bit laptop panels dither to more colors, right?) gets changed system-wide rather than app by app.
Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that says what quality to display the graphical DE in and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?
hpst @JTL I am clearly out of my depth, but in @deepflame's situation he said a software setting on the GPU was set to "10-bit mode"... and even though the laptop's panel wasn't 10-bit, this fooled the apps requesting it into believing they didn't need to dither to get those 10 bits, so it sounds like the apps trusted the GPU and not the display?
I don't have the same software & hardware to test in front of me so I don't know what his setup is doing.
hpst Is that just obviously stupid, and am I too ignorant of how everything comes together for this to even be possible? What is it in Linux, for example, that says what quality to display the graphical DE in and triggers dithering to get it? Is it the GPU driver? Xorg? As far as I know laptop panels are "dumb" and aren't making that choice internally like some desktop monitors can, right?
No. @deepflame's experiment is the first I've heard of this, but I'd need the same hardware in front of me in order to understand what it's doing and if I can port it to be more universal.
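One thing that can be checked on any box without special hardware is whether the driver exposes a dithering control through Xorg at all: dump the RandR output properties and look for anything dither-related. Rough sketch (python3 and an X session assumed; the property names, if any, vary by driver, and many drivers expose nothing):

```python
# Rough sketch: list any dithering-related output properties the GPU driver
# exposes to Xorg. Runs `xrandr --prop` and greps the dump for "dither".
# Whether such a property exists at all depends entirely on the driver.
import subprocess

dump = subprocess.run(["xrandr", "--prop"], capture_output=True, text=True).stdout

output_name = None
for line in dump.splitlines():
    if line and not line[0].isspace():
        output_name = line.split()[0]          # e.g. "eDP-1 connected ..."
    elif "dither" in line.lower():
        print(f"{output_name}: {line.strip()}")
```

If something shows up, it can usually be flipped with `xrandr --output <name> --set <property> <value>`; if nothing shows up, any dithering decision is being made below the RandR layer, in the driver or the panel itself.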
JTL I believe it is a Thinkpad W500? If you don't need one in awesome condition, they're under $100 on eBay. If nobody else would pitch in a few bucks toward it, I'd be willing to just order it for you myself if you believe it could at least rule this in or out as an option, since it, along with your work, are the only actionable ideas I have seen. I can eat beans for a while if it could even possibly lead to a breakthrough.
JTL Sorry for sounding pushy or overeager. I'm desperate, as you know, so I tend to get intense and get a burst of "hope energy" when I think there is a possibility in the darkness. If you know anyone else who would be good to approach for this particular angle, please let me know. I am still interested in contributing to the card as well, since any solution is a godsend... and I am surely missing something about this idea, as things are never as easy as toilet logic makes them sound.
If the EDID were respected and reliable, then the FireGL/FirePro cards would not have the option to enable 10-bit color mode. If it were reliable, the card would just rely on the info it got from the screen, right?
I think enabling this "10-bit color" mode just tells the graphics card what to do when 10-bit color is requested: should it dither or not.
So I am not sure if fooling the system into believing it has a 10-bit display attached solves the dithering issue.
For me, as a software engineer, the better approach would be to pin down when the problem first appeared.
My idea would be the following:
- get a setup that does not dither on an old Linux box
- keep updating the graphics driver/kernel until dithering starts.
- check the graphics driver source for DITHERING constants or other code changes that might have affected this (rough sketch below).
- revert the code change, recompile the driver/kernel module, and see if it helps.
I think this would help us see what is really going on. Again, I think the issue is color depth (temporal dithering).
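To make the third step less painful, the driver tree can be grepped for anything dithering-related first, so only those files need to be diffed between the good and the bad versions. Rough sketch (python3; the tree path is a placeholder for wherever the kernel/driver source is checked out):

```python
# Rough sketch: walk a checked-out kernel/driver tree and list every source
# file that mentions dithering, so the diff between a "good" and a "bad"
# driver version can be narrowed to a handful of files.
import os
import re

TREE = "/path/to/linux/drivers/gpu/drm"      # placeholder checkout location
PATTERN = re.compile(r"dither", re.IGNORECASE)

for root, _dirs, files in os.walk(TREE):
    for name in files:
        if not name.endswith((".c", ".h")):
            continue
        path = os.path.join(root, name)
        with open(path, errors="ignore") as f:
            hits = [lineno for lineno, text in enumerate(f, 1) if PATTERN.search(text)]
        if hits:
            print(f"{path}: {len(hits)} hit(s), first at line {hits[0]}")
```

From there, something like `git log -p -- <file>` on the suspicious files between the two versions should show what changed.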
We could do the same for Chrome as well. As people have pointed out, old versions of Chrome were working well. Maybe we can find old versions of Chromium (ideally portable builds) somewhere that we can test, and see which version introduced the different rendering method that causes strain.
As Chromium is also open source, we could check for code changes here as well. I would assume the diffs could be massive, but it is worth a try. In the case of Chrome, I think we are talking about an updated version of HarfBuzz for font rendering ( https://www.freedesktop.org/wiki/Software/HarfBuzz/ ).
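If we manage to collect a handful of old builds, a manual bisect keeps the number of versions anyone has to sit through small. Rough sketch (python3; the version list below is just a placeholder for whatever builds we actually find, oldest assumed fine and newest assumed bad):

```python
# Rough sketch: manual bisect over whatever old Chromium builds we can find.
# The version list is a placeholder; fill it with real builds, oldest first.
# Assumes the oldest listed build feels fine and the newest causes strain.
versions = ["49.0", "55.0", "60.0", "65.0", "70.0", "72.0"]   # placeholder numbers

good, bad = 0, len(versions) - 1
while bad - good > 1:
    mid = (good + bad) // 2
    answer = input(f"Does Chromium {versions[mid]} cause strain? [y/n] ").strip().lower()
    if answer == "y":
        bad = mid        # strain already present here, look earlier
    else:
        good = mid       # still fine, the change came later
print(f"The rendering change landed between {versions[good]} and {versions[bad]}")
```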
AgentX20 I'm OK with my Dell 2407/2410 or similar CCFL screens
I heard that the 2410 has PWM and the 2407 does not. Can you confirm that? Are you OK with PWM on the 2410 if it has it?
deepflame If the EDID were respected and reliable, then the FireGL/FirePro cards would not have the option to enable 10-bit color mode. If it were reliable, the card would just rely on the info it got from the screen, right?
In theory.
A good friend of mine had one of those off-brand Korean TVs whose EDID claimed it supported a 1920x1080 resolution, but from memory it could only scale a 1280x720 image up to 1920x1080.
deepflame Would you be willing to work on this? My primary need is desktop Linux, but I am not a power user or programmer. I have a desperate and urgent need for a solution because of my life situation, and I will lose a once-in-a-lifetime chance at getting a life back under me if I can't use a damn computer anymore. At this point I don't even care what distro. Is there anything we could do to help? If it's a funding issue I know I would put in some and hope others can too. Personally I think we should crowdfund this in general so it's someone's primary focus. We have a few very smart people here, but there isn't any full-time work being done on this.
JTL Oh, I hope you didn't think that was a dig at you in any way! You are one of the smart people I was referencing... and I knew you didn't have the time or energy to work on this full time. I wish I had the knowledge and skills, or knew someone we could crowdfund payment to so it's their job to solve this. I hate having theories and hopes with no real way to act on them directly, and it seems any attempt to get outsiders to help, like the Intel thread, gets shut down or times out. It took forever to get them to even interact... then they did and did a lot of testing, but not the one thing most likely to be our problem, and said basically "can't reproduce, not a problem".
JTL Hmm, not sure if I understand. I meant that if the EDID were respected then there would be no need for the driver to offer the 10-bit mode. It is offered because the EDID info is not reliable and the card cannot tell whether it is really connected to a 10-bit screen. Hence there is an option to turn 10-bit mode on explicitly.
So yeah, your example shows that the EDID is not reliable. Guess this is what you wanted to say?
hpst Would you be willing to work on this?
Well, I think it would be interesting to find out. Not sure if I have much time though, and I do not want to feel the pressure of someone whose life depends on this on my back. Hope you understand me correctly here. I am glad to help, but I would assume that there are other possibilities as well.
E.g. currently I work with Windows 7. If I had to do development work I would rather work on a Linux machine (easier to use Git, Ruby, Node, ...). So my idea would be to use my Windows machine to connect through SSH to a Linux machine (or my Mac) and do the development stuff there.
Hmm, OK, I guess your problem would also be the code editor. Well, I use Vim in the terminal, so this would not be an issue for me. However, you could also use SMB or something to expose the Linux files over the network and then use Sublime Text or whatever you prefer on Windows. Or you can run a Linux virtual machine with VMware or VirtualBox.
deepflame I guess I should have said I strongly WANT Linux rather than NEED it. I simply need a browser and text editor to work. For me, at least, no modern OS and hardware is strain-free, so if dithering is our problem and it can be shut off then I'd just as soon do that in Linux and keep using it, since it removes other issues I don't want to deal with like walled gardens, expensive hardware, spyware OS nonsense, etc.
I had only asked because you mentioned being a SWE and brought Linux up, so I assumed it was your preference. About the pressure... right now we don't have any organized effort toward this, so anything done is a positive, if nothing else to rule out a theory. I mentioned funding because it could motivate someone to work in a more dedicated manner, where they aren't squeezing it in between all the other demands on their time.