• AbstractOS
  • Eyestrain when switching from Windows to Linux

Wallboy If you're still here, I have something involving checking VBIOS versions you might want to try.

    Yeah I'm still around.

    Basically, the day after finding the Nouveau driver to be OK, I got fed up with its performance compared to the proprietary driver. So I decided to try a completely different branch of distros. I had tried various Debian-based ones thus far, so I went with an Arch-based distro to see if that would make any difference. Ended up going with Manjaro KDE. Unfortunately with the Nvidia proprietary driver, the same eyestrain was present there.

    At this point I was burned out trying different things and I just went back to Windows. I installed Virtualbox and then installed Manjaro KDE on it. Absolutely no eyestrain in Vbox + fullscreen at native res. Can work in it for hours. Of course the performance of Vbox isn't nearly as good as the actual install. But it was ok enough to mess around in and learn Linux for the time being.

    Last week I was reading through the 970 thread and noticed the discussion about different vBIOS versions, so I went ahead and installed the newest vBIOS for my cards ( 84.04.36.00.F1 / MSINV316H.186 ). I haven't gotten around to actually testing whether it makes a difference, though. I guess I'm expecting it won't.

    I was also researching how to actually go about measuring what the few of us are seeing, since we can't SEE the problem, but rather "feel" it. I don't know much about analog electronics, but through research I found we would probably need to use a photodetector + oscilloscope to measure the monitor. The same hardware used by monitor review sites to measure pixel response time. This would likely show any sort of dithering or other imperceptible flashing problems. Unfortunately it isn't exactly cheap. Around $300 for an amplified detector + around $150 for an oscilloscope. Another problem is, even if these devices did measure a difference between certain video drivers + OSes, what would it really solve other than proving the problem truly does exist?
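For anyone who eventually gets a trace out of such a setup: once the oscilloscope samples can be exported, finding a flicker frequency is just a matter of looking at the spectrum. A rough sketch in Python (the 120 Hz ripple, amplitude and sample rate here are made-up numbers, not a measurement):

```python
import math

def dominant_frequency(samples, sample_rate):
    """Return the strongest nonzero frequency bin (Hz) via a naive DFT."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]  # drop the DC (steady brightness) component
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(centered[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(centered[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n

# Synthetic stand-in for an amplified photodiode trace:
# steady backlight plus a small 120 Hz ripple, sampled at 2 kHz for 0.5 s.
rate = 2000
n = 1000
trace = [1.0 + 0.05 * math.sin(2 * math.pi * 120 * t / rate) for t in range(n)]
print(dominant_frequency(trace, rate))  # 120.0
```

A real trace would be noisier and the peak wouldn't land exactly on a bin, but the same idea applies; an FFT library would replace the naive loop for long captures.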

      Wallboy Another problem is, even if these devices did measure a difference between certain video drivers + OSes, what would it really solve other than proving the problem truly does exist?

      Proving there has been a change (e.g. no dithering/spatial dithering in older OS/driver and the jump to temporal or similar in a new OS/driver) is a big step. Our personal experiences carry more weight when complemented with hard evidence.

      Also, the fact remains that dithering is still the only component that isn't user controllable. I can buy a PWM-Free monitor and make sensible hardware choices, which are all pointless if dither/rendering is the real issue.

      The Intel Linux drivers are open-source. You can rebuild the drivers to your own taste and spec, whether you're rolling your own distro or just want to experiment. There is probably a way to build a dither-free Intel driver right now; all the code is online, we just need the right guru to help us out.

      Intel has a large presence on social media, and the graphics driver team are around. I recently reached out to a member from Intel with my concerns re: dithering/being unable to use my NUC. I didn't want to flood /r/Intel so I sent PMs - and got a response.

      Hi - thanks so much for reaching out to me about ledstrain.org. I'm sorry to hear about your eyestrain issue, and hope you can find a solution. I'm currently on medical leave, so unfortunately I can't do any work right now. Please send the same message to X who is covering for me while I'm out; and he can connect with the right folks.

      ..and then a further comment

      P.S. And thank you personally for the link to that site. I'm off work due to visual issues which make it difficult for me to look at screens. Still in the diagnosis phase though.

      The irony wasn't lost on me....

        Wallboy I believe you can't capture pixel flicker with that setup. I tried it with the setup described in our oscilloscope thread (BPW34) and it barely showed anything. Holding it against the screen (a still image, or text for example) won't work well enough. I found almost no difference at all while turning Nvidia's temporal dithering on and off in Linux (and even when it's off, the official driver still produces some eye strain).

        The best way should be to intercept the video signal between the monitor and the graphics card. It isn't cheap either. We need a lossless capture device (card or external) and a fast enough PC. Such a purchase should ideally be planned so that any device can be captured (also gaming consoles and other external devices), not just the PC's own graphics card. And then we need a way to compare captured video frames one by one to see what exactly changes between frames. Comparing screenshots (taken from the lossless capture device, not the PC) should work, too.
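The "compare captured frames one by one" step is the easy part once the capture exists. A minimal sketch, assuming the frames have already been decoded into rows of RGB tuples (a real pipeline would read them from the lossless capture device):

```python
def diff_frames(frame_a, frame_b):
    """List the pixels that changed between two frames of a supposedly
    static image. Frames are rows of (r, g, b) tuples."""
    changed = []
    for y, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for x, (pa, pb) in enumerate(zip(row_a, row_b)):
            if pa != pb:
                changed.append((x, y, pa, pb))
    return changed

# Two tiny synthetic "captures": one pixel flips between two
# near-identical colours, the signature temporal dithering would leave.
f1 = [[(100, 100, 100), (200, 200, 200)],
      [(100, 100, 100), (200, 200, 200)]]
f2 = [[(100, 100, 100), (200, 200, 200)],
      [(101, 100, 100), (200, 200, 200)]]
print(diff_frames(f1, f2))  # [(0, 1, (100, 100, 100), (101, 100, 100))]
```

On a static desktop, any nonempty diff between consecutive captured frames is exactly the kind of invisible-but-present change we're trying to prove exists.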

          I didn't even realize there was an oscilloscope thread. You mentioned it barely showed anything, but that may be because the photodiode you were using wasn't amplified. I was talking to people who measure monitor response times and they said any photodetector for measuring light sources such as a monitor really does need to be amplified, otherwise the current is too low to show anything meaningful on an oscilloscope.

          I did also look into measuring the signal going into the monitor, but those HDMI analyzers cost a fortune. I'm going to dig into the Linux DRM/DRI code to see if there are any debugging flags I could enable to possibly dump each frame sent to the GPU somehow. The problem is I'm sure this happens in Nvidia's proprietary binaries, so it may not be possible.
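For the DRM side, the stock kernel does at least have a debugging knob: the drm.debug module parameter is a bitmask of log categories. It logs modesetting activity rather than dumping frames, and it can't see inside Nvidia's proprietary blob, but it's free to try. A small helper to build the mask; the bit values below are taken from the kernel's drm_print.h and are worth double-checking against your kernel version:

```python
# Category bits for the drm.debug module parameter (from drm_print.h;
# verify against your kernel version).
DRM_DEBUG_BITS = {
    "core":   0x01,
    "driver": 0x02,
    "kms":    0x04,   # modesetting: where plane/CRTC programming is logged
    "prime":  0x08,
    "atomic": 0x10,
    "vblank": 0x20,
}

def drm_debug_mask(*categories):
    """Combine category names into the value to write to
    /sys/module/drm/parameters/debug (or pass as drm.debug= on boot)."""
    mask = 0
    for c in categories:
        mask |= DRM_DEBUG_BITS[c]
    return mask

mask = drm_debug_mask("kms", "atomic")
print(f"echo {mask:#x} | sudo tee /sys/module/drm/parameters/debug")
```

With that set, the kernel log (dmesg) shows what display state the DRM layer programs, which at least tells us what the open-source side is doing.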

          Gurm The likely culprits you mentioned line up well with what we look at here. It very likely has far more to do with dithering and rendering (what you call hardware acceleration we call "rendering artifacts") than with PWM or the lighting, but some people are sensitive to those, also.

          I'm not even sure it's dithering anymore. I've played with nvidia-settings: disabled it, used 6 and 8 bpc (though my display only supports 6 bpc), changed modes to static, dynamic, temporal etc. None of that made any difference, except that disabling dithering altogether DOES take effect: I've tested it on a gradient and clearly saw banding in off mode, but that didn't do away with the eye strain. I've used every mode extensively to rule out the placebo factor and it's the same thing every time. My ocular muscles were tense all the way; I couldn't relax them. It could be hardware acceleration coupled with something else, and that something could be rooted deeply in *nix systems themselves. I'm guessing either some kernel presets or the display manager's fault. The truth is, we don't really know whom to blame: Nvidia, ATI or open-source devs.
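For anyone who wants to repeat that experiment from a script instead of the GUI: the dithering controls are NV-CONTROL attributes, so they can be assigned with nvidia-settings on the command line. A sketch; the [dpy:DP-0] target name and the value meanings (Dithering: 0 = auto, 1 = enabled, 2 = disabled; DitheringMode: 0 = auto, 1 = dynamic 2x2, 2 = static 2x2, 3 = temporal) should be verified on your own system with `nvidia-settings -q dpys` and `nvidia-settings -q Dithering`:

```python
import shlex

def dithering_cmds(display="DP-0", dithering=2, mode=0):
    """Build the nvidia-settings assignments for one output.

    The [dpy:...] target name is an assumption -- check
    `nvidia-settings -q dpys` for the names on your system."""
    target = f"[dpy:{display}]"
    return [
        ["nvidia-settings", "--assign", f"{target}/Dithering={dithering}"],
        ["nvidia-settings", "--assign", f"{target}/DitheringMode={mode}"],
    ]

# Print the shell commands to force dithering off on DP-0.
for cmd in dithering_cmds("DP-0", dithering=2):
    print(shlex.join(cmd))
```

Scripting it makes A/B testing less tedious and keeps a record of exactly which mode was active during each session.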

          Incidentally, if anyone wants to test dithering in on and off modes feel free to check the gradient picture below:

          Gurm The good news is that people in here have had SOME luck turning off those advanced features in certain Linux distros. I know this forum isn't easy to search, so hopefully they chime in here.

          Definitely better than nothing. Then again, I'm not so sure it's all about disabling stuff; maybe something is missing altogether, to maintain proper hardware acceleration for instance. We need to know exactly what we are dealing with here.

          Luki99 Could driver make that kind of difference? Or just different PC/different gpu+ubuntu is just giving eye strain and nothing can be done about it?

          I can't really account for your issue, but I'm guessing it's not at all exotic and what you're facing is probably the same problem as mine, and no, installing the proprietary driver shouldn't help in this situation. The bottom line is, there should be NO eye strain from an LCD display apart from faulty PWM or backlighting, and those would be present in Windows, too. Not even a low refresh rate. Also, it's not Ubuntu-specific at all; it should happen in absolutely ALL *nix systems, not Linux alone. A few years ago I tried out FreeBSD and it was exactly the same thing there, if not worse. All these systems have something in common, and as I said earlier, that something could be shared kernel presets or display manager settings.

          Wallboy Ended up going with Manjaro KDE. Unfortunately with the Nvidia proprietary driver, the same eyestrain was present there.

          Seriously, don't waste your time. It's present on ALL *nix OSes. I've never tried Ubuntu or Manjaro, but I have tried plenty of other distros. I've seen some threads before where people claim there is less eye strain on Arch and Manjaro, but your own experience doesn't quite live up to those claims. Currently I'm on openSUSE, which is the distro of my choice, but I'm having the same problem there. And why shouldn't I, really? It's all the same Linux, and the same problem that nobody attends to, due to the small number of related bug reports and the lack of any solid evidence, since this thing is too delicate to investigate and only a select few people are affected by it. Who cares about them, right? I'd say we need to build a whole goddamn community to draw more attention. I see no other solution. That is, if we really care about it and wish to use Linux in the future.

          Wallboy installed Virtualbox and then installed Manjaro KDE on it. Absolutely no eyestrain in Vbox + fullscreen at native res. Can work in it for hours.

          Naturally: it's using your host OS's video engine, which can't be overridden. You can only change your guest machine's resolution, which is merely scaling on your side. But it's still a good way to prove it's not font- or theme-related, at least.

          Wallboy I was also researching how to actually go about measuring what the few of us are seeing, since we can't SEE the problem, but rather "feel" it. I don't know much about analog electronics, but through research I found we would probably need to use a photodetector + oscilloscope to measure the monitor.

          If it's really possible I expect it to register something like this:

          It looks (feels) like high-frequency tearing, actually. And of course, it has nothing to do with anti-aliasing and hinting; I used to disable all of that and the problem was still there. And no, it doesn't feel like the flashing of a low refresh rate; it's some kind of rapid motion which your eyes can't focus on. You don't consciously see it, but your eyes do, and they simply can't stop looking for a complete and still shape, which is why they are constantly at work, and that leads to strain. That's as far as my theory goes.

          Wallboy Another problem is, even if these devices did measure a difference between certain video drivers + OSes, what would it really solve other than proving the problem truly does exist?

          Like @diop said already, we would know exactly what to complain about and could provide solid evidence. The devs would know what to work on. The real question is: WHICH devs? We would have no choice but to contact both Nvidia and Nouveau. Perhaps even the Linux kernel folks, because I seriously believe it could be something inherent in the system core itself, present before any driver is even installed.

          I found this thread interesting: https://forum.manjaro.org/t/disable-dithering-for-intel-graphics/79774/30

          They talk about fonts and dithering (agaaaain), but there are still some good points in-between.

            kammerer I have seen that thread before but I currently have nothing to add at this time.

            Pudentane Well it definitely has something to do with the OS. I would never have thought of something like that in my life! Same machine but different OS, and eye strain ...

            kammerer But if you could add some comments from your side, it could lend weight and draw attention to this issue.

            Sure thing, I will try.

            Update: I decided to go back to hybrid video mode on my HP ZBook 15, which has two GPUs: Intel HD and Nvidia Quadro. I've been using the latter up till now, but have switched to Intel HD to see the difference. It felt a bit less straining, but it could be a placebo this time. Anyhow, that's not what I wanted to report. I decided to record the Windows and openSUSE video outputs with my phone, and here is what I discovered, to my great surprise:

            Windows: https://youtu.be/szdwtDhVegI

            openSUSE: https://youtu.be/55Qn0uBkvCY

            I didn't expect that at all. I do recall doing the same when I was still using Nvidia, but I didn't detect any issues then. So I'm thinking the reasons for eye strain can differ between GPUs as well. I haven't played with brightness or xrandr yet; I will do that a bit later and post another update. But at least from the looks of it, I have the same brightness levels in both setups.

              Pudentane Linux/MacOS has always been worse than Windows for me, however now I am at the point where I can use no modern OS/PC in comfort. Do you use Windows 10 and experience discomfort? Also, do you have any known eye conditions? (e.g. amblyopia/strabismus)

              This issue which many devs seem to think is just a pesky little ‘bug’ is affecting people’s health. I can’t overstate the seriousness of this issue, however I’m sure you’re well aware of the discomfort we’re all going through.

              I just feel like, in the right hands, this problem could be patched within 24 hours.

              Another way to measure differences is using the PCoIP protocol, which transmits a pixel-perfect remote desktop to the client. When I tried a cloud gaming trial (Shadow PC) I suddenly developed the same symptoms on my good machine as on a bad one. The cloud machine was the latest W10 with Nvidia Quadro graphics. Others here noticed that when streaming using Steam there are symptoms streaming from a bad machine but no problems from an older build. IMO it's all about the pixels.

                diop Linux/MacOS has always been worse than Windows for me, however now I am at the point where I can use no modern OS/PC in comfort.

                Are you using a laptop? Which panel is it, TN or IPS? It matters, because this ZBook of mine has the worst possible TN panel, which alone causes me plenty of discomfort. However, I can distinguish between those things: a crappy panel is something I can still live with, but the eye strain on Linux is a total no-go, and I've had it on all kinds of displays. It's always there. I can't work like that, but I seriously want to leave Windows. I'm tired of their experiments on people.

                diop Do you use Windows 10 and experience discomfort?

                I'm still on 8.1. However, when I first got this laptop it had W10 preinstalled and no, I didn't detect any visual problems other than those related to the panel.

                diop Also, do you have any known eye conditions? (e.g. amblyopia/strabismus)

                I do have a little problem, which I believe I got from Linux years ago when I was still unaware of all this. I think it's antimetropia/anisometropia: one eye is plus, the other is minus. Maybe it was just a coincidence, but I repeat, my sight got worse while I was still using Linux (Debian), and because of that I had to move back to Windows, because I couldn't take it any more. However, since it doesn't bother me when I'm on Windows, I presume it hardly matters? Not to say I shouldn't go see an ophthalmologist, but that's not going to change how things are on *nix systems, is it?

                diop This issue which many devs seem to think is just a pesky little ‘bug’ is affecting people’s health. I can’t overstate the seriousness of this issue, however I’m sure you’re well aware of the discomfort we’re all going through.

                Obviously, the low-profile nature of the issue, a number of collateral factors and the exceedingly small number of people affected by it lead to disbelief and negligence on the part of devs; therefore it is our duty to press its urgency.

                diop I suddenly developed the same symptoms on my good machine as on a bad one.

                It happened to me once too, back on Windows 7 with a GeForce GT240. Something in the Nvidia settings went off and, weirdly enough, the screen started looking just like Linux, with the same level of fuzziness, which soon enough made my eyes tense. I DON'T KNOW what happened, but when I rebooted the machine things were back to normal. It has never happened again since.

                I think we should also contact the Xorg devs; I have a hunch it may have something to do with this. Also, I will try using Wayland instead and see if the issue is present there. I honestly doubt Nvidia and ATI are to blame here. If there are any IRC users around, feel free to join #xorg, #xorg-devel and #wayland.

                Pudentane Are those videos the same computer? The second one looks like the PWM banding you can sometimes see with a quick phone camera test. I don't know if that is what you mean by "I didn't expect that", but if it is, then that implies openSUSE is using PWM at whatever brightness setting that is, where Windows is not. I have never heard of that happening.

                I can tell you xrandr gamma and brightness settings did nothing for me. Nor did using PWM-free displays. My comfortable computer has banding like that visible on a phone camera as well... AND I am not entirely sure that test is accurate, because if you rotate the phone the banding rotates with it, which makes me think it's the phone's rolling shutter rather than the screen flickering. So far it seems you are just discovering the things everyone does when they start trying different settings... that nothing seems to matter for the actual strain.

                  vaz Are those videos the same computer?

                  Same laptop.

                  vaz AND I am not entirely sure that test is accurate, because if you rotate the phone the banding rotates with it, which makes me think it's the phone's rolling shutter rather than the screen flickering.

                  No, that thing is pretty consistent; have no doubts about it. I also noticed it happens on brighter backgrounds; on darker ones it's all good. But you probably know that anyway.

                  Update: Switched to hybrid mode, using the Nvidia Quadro now. No banding or anything weird registered with the phone; the picture looks still. However, I'm sure the eye strain will be coming back pretty soon. So it's not PWM after all. I'm wondering, though, if the eye strain with Intel HD could be caused by PWM, whereas with Nvidia it's something else. Is it possible that we are dealing with different issues here which produce similar side effects?

                  Update: Strain confirmed. I still feel uncomfortable on Linux and I wake up with strained ocular muscles. Since it's definitely not PWM, or dithering for that matter, it must be pixel-related after all. I can only add one more thing: it seems that things on Windows are slightly more detailed in general; on Linux it looks bleached out, as it were. Please take a look at the following picture in both OSes and tell me if you can see any difference:

                  It appears a tad more visible on Windows, and on Linux I have to move my head all the way down to see the pattern clearly, otherwise it's practically invisible. Yes, I've got a crappy TN panel with the poorest viewing angles you can imagine, but how come they vary between OSes? That doesn't make a lot of sense. And no, it's got nothing to do with brightness and contrast; they're more or less the same, and adjusting them doesn't make a whole lot of difference anyway. It could be the gamma, all right, as tweaking that helped somewhat, but to my understanding there should be no such difference in default settings, albeit in different systems. I think something is really off here.
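On the gamma point: xrandr's --gamma reshapes the per-channel lookup table (roughly output = input^(1/gamma)), so the on-screen contrast between two nearly equal pixel values changes with the gamma setting. That alone could explain a faint pattern being visible under one OS's default ramp and not another's. A toy illustration (the grey levels here are arbitrary):

```python
def apply_gamma(value, gamma):
    """Map a normalized pixel value (0..1) through a gamma ramp,
    the same shape of transform xrandr --gamma installs in the LUT."""
    return value ** (1.0 / gamma)

# Two near-identical grey levels, standing in for a faint on-screen pattern.
a, b = 0.80, 0.82
for g in (1.0, 1.2):
    step = apply_gamma(b, g) - apply_gamma(a, g)
    print(f"gamma {g}: visible step between the greys = {step:.4f}")
```

The absolute numbers don't matter; the point is that the step size differs between gamma settings, so two OSes with different default ramps really can render the same picture with different visibility.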

                  a month later

                  It's been some months since I used my PC. I'm currently trying Linux distros on my Windows-7-known-good Quadro NVS 295 setup, but the Linux eye strain seems to be everywhere and can't be turned off. That's independent of the browsers, which introduce additional eye strain.
                  It seems the installers are OK, but as soon as I boot from HDD into the X session, it starts to hurt.
                  I tried Ubuntu Mate 18.04, 19.10, Deepin 15.11, Manjaro 19 Architect so far.

                  It's my left eye that hurts after 1-2 minutes looking at the desktop.

                    I connected my ODROID-C2, which is an ARM board similar to the Raspberry Pi, runs Armbian and has a GPU totally different from Nvidia, Intel, or AMD (and Raspberry). Normally I use it as a server without any monitor attached. Eye strain within 5 minutes. I looked at the Xfce screen for 15 minutes and now have a borderline headache and tense eye muscles. I think there's something entirely wrong with the way Linux talks to the monitor, independent of what GPU is used.

                      KM Try Mint also. It's the only Linux that does not give me eye strain. Currently I use Mint 19 LTS.

                      KM, I too had a similar experience with booting from installers vs. from hard disk!
