Eyestrain when switching from Windows to Linux
I just went back to the Nouveau driver and disabled dithering through xrandr, and made a .xprofile in my HOME folder to run the commands that disable it on login for both my displays:
xrandr --output HDMI-1 --set "dithering mode" "off"
xrandr --output DVI-I-1 --set "dithering mode" "off"
and I logged out and back in, and I THINK dithering might be disabled now using this open-source driver. I'm not getting any strain so far. I can focus on things without my head pounding.
Not going to say I found a solution for Nvidia just yet. I'm not prone to placebo type effects, but we'll see after a few hours...
It's just too bad this driver runs like crap compared to the proprietary one.
Nouveau driver + disabling dithering in it has FIXED the problem for me without question. Everything now feels "still" and calm. No more of that nearly imperceptible shimmering or intensity to the color.
Something must definitely be broken with the proprietary Nvidia drivers. When I open nvidia-settings and set dithering to disabled, as soon as I close nvidia-settings it saves the .nvidia-settings-rc with dithering re-enabled again. So I'm pretty damn sure something is borked. I'm gonna go digging in the nvidia-settings source code to see if I can figure it out.
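In the meantime, a possible workaround (just a sketch, I haven't verified it; the attribute name and values are from the nvidia-settings attribute docs as I remember them: Dithering 0 = auto, 1 = enabled, 2 = disabled) would be to skip the GUI entirely and assign the attribute per display from the command line at login:
nvidia-settings --assign "[dpy:HDMI-1]/Dithering=2"
nvidia-settings --assign "[dpy:DVI-I-1]/Dithering=2"
That way the broken .nvidia-settings-rc never enters the picture.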
Can't believe I spent so long fussing with the font settings when they weren't the problem at all. I should have known when I was still getting eyestrain watching Twitch/YouTube videos.
So PLEASE, whoever is in a similar situation as mine with Linux + Nvidia: try the Nouveau driver and disable dithering through that. And use the commands in my above post in a .xprofile file for persistence across reboots. Replace HDMI-1, etc. with your correct connected displays, which you can see when you run: xrandr --props. You can then check with the same command afterwards to see if it's disabled.
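For reference, a minimal ~/.xprofile would just contain the two commands from my earlier post (swap in your own output names from xrandr --props):
# ~/.xprofile - runs at X session login
xrandr --output HDMI-1 --set "dithering mode" "off"
xrandr --output DVI-I-1 --set "dithering mode" "off"
And to verify afterwards:
xrandr --props | grep -i "dithering mode"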
Wallboy I had a similar experience some time ago with my Quadro NVS 295. The official proprietary driver introduced some eye strain, even though dithering was disabled in the driver program. Nouveau was much better. However, there was still some small eye strain that just wasn't there in Windows 7. Maybe one day we can reveal the differences with a capture card. I'm not sure if temporal dithering is enabled on my card by default, since xrandr would only say "auto" instead of on or off; this differs between cards. I tend to think the driver itself is easier on the eyes, rather than the dithering options. It may be worth a try to turn dynamic dithering on and see whether the eye strain comes back.
Wallboy Nouveau driver + disabling dithering in it has FIXED the problem for me without question. Everything now feels "still" and calm. No more of that nearly imperceptible shimmering or intensity to the color.
Would you say it's turned your machine from unusable to usable? It may be an idea to use it for some time (a week or two) to be sure it's good for long-term use.
@Gleb had similar issues using his Intel Mac and found a way to disable dithering; it then became usable and much more comfortable.
This is adding to the theory that the issue is temporal dithering.
The bad side to all this is that gamers and companies think temporal dithering is a good idea, almost essential, because nobody in 2019 wants to see banding.
Yes, it's usable now. I will for sure take a week or two with it like this to really make sure it's all good. There are a few downsides to this driver, though, such as the performance not feeling as good: the UI isn't quite as snappy, and the mouse kind of lags in Firefox when HW rendering is force-enabled. Also, the fan always spins now, since I'm guessing the proprietary driver is the only one that can control the "Fan Stop" mode.
But for the next week, I'm not messing with it anymore. My head feels like it's been in a microwave for the last week. Gotta let the eyes/brain repair.
Did anybody try to shoot slow-motion video and compare a good and a bad OS? (Many recent smartphones support such a feature at 240 fps.)
Update: it may be worth extracting frames from each video and comparing consecutive frames within the same video.
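A rough sketch of how that could be done with ffmpeg and ImageMagick (file names here are placeholders):
mkdir -p frames
ffmpeg -i slomo.mp4 frames/%05d.png          # dump every frame as a PNG
compare -metric AE frames/00001.png frames/00002.png diff.png   # count differing pixels, write a visual diff
In theory, consecutive frames of a still desktop should differ by ~0 pixels without temporal dithering, though camera sensor noise will add some baseline difference.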
Glad this has finally been brought up. I've been experiencing this annoying issue for YEARS, since around 2011 when I first started using Linux. This, in fact, has been keeping me from having Linux as my main OS, which is something you have probably heard before. The worst thing about it is its elusive nature, which often leads to fruitless arguments and criticism on the part of so-called Linux savants who are perfectly immune to and ignorant of the issue. I don't think they all just have the right hardware/software that renders it imperceptible; I think it's individual after all, and to our great misfortune people like us are an exceedingly small minority. Because of that, our feedback mostly results in generic advice ("check your GPU", "change the font") or, worse yet, unfounded criticism and taunting. Wake up, people: this isn't a placebo, it's a REAL goddamn problem, which, in fact, never existed on Windows. And it's not just annoying; I reckon it's pretty hazardous too, as my sight suddenly deteriorated years back when I finally began to feel weird while using Linux.
Yes, this is NOT a placebo, but it may still take some time for you to acknowledge its existence, especially if you're new to Linux and don't expect anything like that. I've seen some sparse forum threads where people struggle to research this ugly phenomenon and provide solutions, mostly without success. Here is a good example: https://forums.freebsd.org/threads/eye-strain-from-certain-video-modes-drivers.53468/
The suspected culprits thus are typically the following:
- Temporal dithering
- Pulse-width modulation
- Backlighting
- Hardware acceleration
I could also add that it may have nothing to do with Nvidia or proprietary drivers per se; I've experienced this with ATI, too. I'm not sure, but something tells me it could have something to do with the kernel or the *nix display manager, X or otherwise. Yes, it doesn't happen just on Linux, but on FreeBSD, too. Maybe even on macOS; I can't really tell, I've never used it.
Anyhow, I still believe we are on the exact same page. I'm not sure I can help with any technical advice, but I can try to describe the issue as best I can, that is, how I personally perceive it:
- Seeming inability to focus sight at one spot
- Perceived "fuzziness" of all screen elements
- Constant tension in temple-side ocular muscles (lateral rectus)
- Tingling and "wet" sensation in the forehead and temples
And if you go back to Windows the tension immediately subsides as the screen suddenly becomes crisp and "calm".
To sum it up: you can't see it, only FEEL it, which makes taking screenshots of it pretty useless. It also has nothing to do with our sight, GPU, monitors, or refresh rate. And, of course, it would never happen in a VM. Been there, done that.
Hope this minute testimony of mine will assist all of us in the long run.
Pudentane Not all versions of Windows, but yes, Linux kind of has two modes: "basic", where the drivers are barely good enough to put the screen into the correct resolution, and "bleeding edge", where they use hardware-accelerated vector surfaces to render the letter "Q" just because they're available.
Both of these present problems. The "basic" mode probably calls some generic awful "show me SVGA 1600x1200" without providing any extra information that might put the monitor into the correct rendering/polarity/sync mode. I've seen this a LOT on NUC or small-form-factor Linux machines. These are used primarily for compute, and nobody ever logs into them. It looks really dreadful, even to people without our problem.
The other problem, however, is even worse because "normal" people just see that it's lightning fucking fast and don't care that it has full motion antialiasing and temporal dithering which are probably ok for video games but not text.
The likely culprits you mentioned line up well with what we look at here. It very likely has far more to do with dithering and rendering (what you call hardware acceleration we call "rendering artifacts") than with PWM or the lighting, but some people are sensitive to those, also.
The good news is that people in here have had SOME luck turning off those advanced features in certain Linux distros. I know this forum isn't easy to search, so hopefully they chime in here.
I'm a Windows 7 user. I recently tried Ubuntu 18.04 on my old machine, which I hadn't used for a long time (i5-3470 and GT 210), and it was nice. I tested it for a couple of hours learning new things on Linux, so I thought I might make this my OS for the internet and dual-boot W7 for my gaming library (I like old games). I remember that in the GPU options the installed driver was not from Nvidia, so I changed it to the native Nvidia one.
Next, I didn't install it straight away on my main PC; instead I just ran it without installing, to test. And STRANGELY, I felt eye strain within just 10 minutes at the desktop!? I noticed the driver here wasn't Nvidia either; I couldn't change it, because I would have to install the OS first, and I have no free HDD for now.
Could the driver make that kind of difference? Or does a different PC/GPU + Ubuntu just give eye strain, with nothing to be done about it?
Yeah I'm still around.
Basically, the following day after finding the Nouveau driver to be OK, I got mad at its performance compared to the proprietary driver. So I decided to try a completely different branch of distros. I had tried various Debian-based ones thus far, so I went with an Arch-based distro to see if that would make any difference. Ended up going with Manjaro KDE. Unfortunately, with the Nvidia proprietary driver, the same eyestrain was present there.
At this point I was burned out from trying different things and I just went back to Windows. I installed VirtualBox and then installed Manjaro KDE in it. Absolutely no eyestrain in VBox + fullscreen at native res. Can work in it for hours. Of course the performance in VBox isn't nearly as good as an actual install, but it was OK enough to mess around in and learn Linux for the time being.
Last week I was reading through the 970 thread and noticed the discussion about different vBIOSes, and I went ahead and installed the newest vBIOS for my cards (84.04.36.00.F1 / MSINV316H.186), though I haven't gotten around to actually testing whether it makes a difference. I'm expecting it won't.
I was also researching how to actually go about measuring what the few of us are seeing, since we can't SEE the problem, but rather "feel" it. I don't know much about analog electronics, but through research I found we would probably need a photodetector + oscilloscope to measure the monitor: the same hardware used by monitor review sites to measure pixel response time. This would likely show any sort of dithering or other imperceptible flashing problems. Unfortunately it isn't exactly cheap: around $300 for an amplified detector + around $150 for an oscilloscope. Another problem is, even if these devices did measure a difference between certain video drivers + OSes, what would it really solve other than proving the problem truly does exist?
Wallboy Another problem is, even if these devices did measure a difference between certain video drivers + OSes, what would it really solve other than proving the problem truly does exist?
Proving there has been a change (e.g. no dithering/spatial dithering in an older OS/driver and the jump to temporal or similar in a new OS/driver) is a big step. Our personal experiences carry more weight when complemented with hard evidence.
Also, the fact remains that dithering is still the only component that isn't user-controllable. I can buy a PWM-free monitor and make sensible hardware choices, but those are all pointless if dither/rendering is the real issue.
The Intel Linux drivers are open source. You can rebuild the drivers to your own taste and spec if you're rolling your own distro or just want to experiment. There is probably a way to build a dither-free Intel driver right now; all the code is online, we just need the right guru to help us out.
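As a starting point (just a pointer to where to look, not a recipe), the kernel side of the Intel driver lives in drivers/gpu/drm/i915 in the Linux source tree, and a grep at least shows where the dithering decisions are made:
git clone --depth 1 https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
grep -rni "dither" linux/drivers/gpu/drm/i915/    # list every dithering-related code path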
Intel has a large presence on social media, and the graphics driver team are around. I recently reached out to a member from Intel with my concerns re: dithering and being unable to use my NUC. I didn't want to flood /r/Intel, so I sent PMs, and got a response.
Hi - thanks so much for reaching out to me about ledstrain.org. I'm sorry to hear about your eyestrain issue, and hope you can find a solution. I'm currently on medical leave, so unfortunately I can't do any work right now. Please send the same message to X who is covering for me while I'm out; and he can connect with the right folks.
...and then a further comment:
P.S. And thank you personally for the link to that site. I'm off work due to visual issues which make it difficult for me to look at screens. Still in the diagnosis phase though.
The irony wasn't lost on me....
Wallboy I believe you can't capture pixel flicker with that setup. I tried it with the setup described in our oscilloscope thread (BPW34) and it barely showed anything. Holding it against the screen (a still image, or text for example) won't work well enough. I found almost no difference at all while turning Nvidia's temporal dithering on and off in Linux (and even when it's off, the official driver still produces some eye strain). The best way should be to intercept the video signal between monitor and graphics card. It isn't cheap either: we need a lossless capture device (card or external) and a fast enough PC. Such a purchase should ideally be planned so that any device can be captured (also gaming consoles and other external devices), not just the PC's own graphics card. And then we need a way to compare captured video frames one by one to see what exactly changes between frames. Comparing screenshots (taken from the lossless capture device, not the PC) should work, too.
I didn't even realize there was an oscilloscope thread. You mentioned it barely showed anything, but that may be because the photodiode you were using wasn't amplified. I was talking to people who measure monitor response times, and they said any photodetector for measuring light sources such as a monitor really does need to be amplified, otherwise the current is too low to show anything meaningful on an oscilloscope.
I did also look into measuring the signal going into the monitor, but those HDMI analyzers cost a fortune. I'm going to dig into the Linux DRM/DRI code to see if there are any debugging flags I could enable to possibly dump each frame sent to the GPU somehow. Problem is, I'm sure this happens in Nvidia's proprietary binaries, so it may not be possible.
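One flag that does exist is the DRM core's debug mask. It won't dump frames, but it logs modesetting activity, which might at least show when and where dithering gets configured (only for drivers that go through DRM/KMS, so probably not the Nvidia blob):
echo 0x1f | sudo tee /sys/module/drm/parameters/debug    # enable verbose DRM core/driver/KMS logging
dmesg -w                                                 # follow the kernel log live
The same can be set at boot with drm.debug=0x1f on the kernel command line.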
Gurm The likely culprits you mentioned line up well with what we look at here. It very likely has far more to do with dithering and rendering (what you call hardware acceleration we call "rendering artifacts") than with PWM or the lighting, but some people are sensitive to those, also.
I'm not even sure it's dithering anymore. I've played with nvidia-settings: disabled it, used 6 and 8 bpc (though my display only supports 6 bpc), changed modes to static, dynamic, temporal, etc. None of it made any difference. Disabling dithering altogether clearly DOES take effect (I tested it on a gradient and saw obvious banding in off mode), but it didn't do away with the eye strain. I've used every mode extensively to rule out the placebo factor, and it's the same thing every time. My ocular muscles were tense the whole time; I couldn't relax them. It could be hardware acceleration coupled with something else, and that something could be rooted deep in *nix systems themselves. I'm guessing either some kernel presets or the display manager is at fault. The truth is, we don't really know whom to blame: Nvidia, ATI, or the open-source devs.
Incidentally, if anyone wants to test dithering in on and off modes feel free to check the gradient picture below:
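If the picture doesn't load for you, ImageMagick can generate an equivalent test gradient (the size is arbitrary):
convert -size 1024x256 gradient:black-white gradient.png    # horizontal black-to-white ramp
Banding on this image with dithering off, and smooth transitions with it on, at least confirms the setting is actually taking effect.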
Gurm The good news is that people in here have had SOME luck turning off those advanced features in certain Linux distros. I know this forum isn't easy to search, so hopefully they chime in here.
Definitely better than nothing. Only, again, I'm not so sure it's all about disabling stuff; maybe something needed to maintain proper hardware acceleration is missing altogether, for instance. We need to know exactly what we are dealing with here.
Luki99 Could the driver make that kind of difference? Or does a different PC/GPU + Ubuntu just give eye strain, with nothing to be done about it?
I can't really account for your issue, but I'm guessing it's not at all exotic, and what you're facing is probably the same problem as mine; and no, installing the proprietary driver shouldn't help in this situation. The bottom line is, there should be NO eye strain from an LCD display apart from faulty PWM or backlighting, and those would be present in Windows, too. Not even a low refresh rate should cause it. Also, it's not Ubuntu-specific at all; it should happen in absolutely ALL *nix systems, not Linux alone. A few years ago I tried out FreeBSD and it was exactly the same thing there, if not worse. All these systems have something in common, and like I said earlier, that something could be shared kernel presets or display manager settings.
Wallboy Ended up going with Manjaro KDE. Unfortunately, with the Nvidia proprietary driver, the same eyestrain was present there.
Seriously, don't waste your time. It's present on ALL *nix OSes. I've never tried Ubuntu or Manjaro, but I have tried plenty of other distros. I've seen some threads before where people claim there is less eye strain on Arch and Manjaro, but your very experience doesn't quite live up to those claims. Currently I'm on openSUSE, which is the distro of my choice, and I'm having the same problem there. And why shouldn't I, really? It's all the same Linux, and the same problem that nobody attends to, due to the small number of related bug reports and the lack of any solid evidence, since this thing is too delicate to investigate and only a select few people are affected by it. Who cares about them, right? I'd say we need to build a whole goddamn community to draw more attention. I see no other solution. That is, if we really care about it and wish to use Linux in the future.
Wallboy installed VirtualBox and then installed Manjaro KDE in it. Absolutely no eyestrain in VBox + fullscreen at native res. Can work in it for hours.
Naturally: it's using your host OS's video engine, which can't be overridden. You can only change your guest machine's resolution, which is merely scaling on your side. But it's still a good way to prove it's not font- or theme-related, at least.
Wallboy I was also researching how to actually go about measuring what the few of us are seeing, since we can't SEE the problem, but rather "feel" it. I don't know much about analog electronics, but through research I found we would probably need a photodetector + oscilloscope to measure the monitor.
If it's really possible I expect it to register something like this:
It looks (feels) like high-frequency tearing, actually. And of course, it has nothing to do with anti-aliasing and hinting; I used to disable all that and the problem was still there. And no, it doesn't feel like flashing akin to a low refresh rate; it's some kind of rapid motion which your eyes can't focus on. You don't know what's there, but your eyes do, and they simply can't stop looking for a complete and still shape, which is why they are constantly at work, and that leads to strain. That's as far as my theory goes.
Wallboy Another problem is, even if these devices did measure a difference between certain video drivers + OSes, what would it really solve other than proving the problem truly does exist?
Like @diop said already, we would know what exactly to complain about and could provide solid evidence. The devs would know what to work on. The real question is: WHICH devs? We would have no choice but to contact both Nvidia and Nouveau. Perhaps even the Linux kernel folks, because I seriously believe it could be something inherent in the system core itself, present before any driver is installed.
I found this thread interesting: https://forum.manjaro.org/t/disable-dithering-for-intel-graphics/79774/30
They talk about fonts and dithering (agaaaain), but there are still some good points in between.