Linux - investigating eye strain problem

Hi all!

I noticed half a year ago that I get eye strain and headaches from Linux, no matter which distro or desktop environment I used.

The distro where I first noticed it is Pop!_OS. I had this OS already, and in the beginning I didn't have any problem with it; the strain started about half a year ago. BTW, I'm using KDE Plasma as my desktop environment.

So I had an idea: why not install Pop!_OS again from the same USB I used for the original install half a year ago, and then check what the differences are.

So I did it. Now I have the Pop!_OS from before the eye strain problem and the Pop!_OS from after it.

My PC has an NVIDIA RTX 3060 and an i9, without HDMI.

One thing to mention up front: I had already disabled the NVIDIA dithering from the NVIDIA X Server Settings, which helped, so that is excluded from the comparison.
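
For anyone wanting to script that instead of clicking through the GUI: I believe the same dithering toggle is exposed as an attribute, but I'm writing the names from memory, so double-check with "nvidia-settings -q all | grep -i dither" first. Something like:

nvidia-settings --assign "[dpy:DP-0]/Dithering=2"

where 2 means disabled (0 = auto, 1 = enabled), and DP-0 is just an example output name; use whatever name xrandr reports for your monitor.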

The first thing I noticed between the two installs is the NVIDIA driver, which was newer on the eye-strain OS. So I downgraded the NVIDIA driver to the same version as the OS without eye strain, and I still had eye strain.
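
A quick way to read the installed driver version on both installs, for anyone comparing:

nvidia-smi --query-gpu=driver_version --format=csv,noheader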

The second thing I noticed is the kernel version. The old one is 6.2.6 and the new one is 6.8. But I assumed the kernel version isn't related to how the screen shows things; I thought it only affects the deep parts of Linux.
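
For the kernel side of the comparison, these show the running kernel and the kernel images available on disk:

uname -r
ls /boot/vmlinuz-*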

These two differences are the only ones I saw between the installs. I tried changing display settings, font settings, and other related things, but nothing helped with the eye strain.

So now I’m consult with you guys, what can be the thing which can cause the eye strain in your opinion? Now that I have this two os, I can test it and see if it’s really help or not.

    For me it often happens when something is using the GPU. The solution currently for me is Xfce + Xfce's compositing disabled (in Settings - Window Manager Tweaks) + Firefox as web browser + font antialiasing disabled (not sure if this is necessary) + to avoid anything that uses the GPU. With the current AMD Ryzen 5600 GPU Firefox is OK for me. Some GPU applications can be OK (like in my case Firefox), but you'd have to test each and every one. You can quickly see when something is using the GPU by installing nvtop (also works for AMD). For the problematic GPU applications, using their "software rendering" option (if existing) is not always helping.
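
    For reference, the same compositing switch can also be flipped from a terminal. I'm writing the property path from memory, so verify it first with "xfconf-query -c xfwm4 -l":

    xfconf-query -c xfwm4 -p /general/use_compositing -s false
    sudo apt install nvtop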

    The Plasma desktop, for example, is GPU-accelerated. I can only speculate that in between the different OS versions something was added that caused pixel flicker that causes eye strain. No one here really knows what's going on in such cases. We'd need a graphics driver expert, but we don't have one here in this forum yet for Linux.

    Perhaps it's your GPU-accelerated web browser that is causing the eye strain after it received an update, because nowadays the thing we look at most is usually a web browser. Or whatever GPU-accelerated applications you primarily use.
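
    If you want to test the browser theory specifically, Firefox's hardware acceleration can be switched off in about:config. These are the prefs I would start with (they may not cover every GPU path, so treat them as a starting point, not a guaranteed fix):

    layers.acceleration.disabled = true
    gfx.webrender.software = true

    Then restart the browser.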

      KM I'll give Xfce + all the settings you mentioned a shot, thanks for the suggestion!

      How can we be sure that Xfce will always be safe? I mean, at some point KDE stopped being safe for us, so couldn't the same thing happen with Xfce?
      How did you know that KDE is GPU-accelerated while Xfce is not? Are there other desktop environments that are like Xfce?

      I thought that on Linux it would be simple to solve our eye strain, because it's a system we build with our own hands, unlike Windows. I'm disappointed by how much of our Linux systems we can't really control.

      So if Xfce turns out to be safe for me, you're right and it's related to GPU acceleration. How can we solve it so we can use KDE again?

      DisplaysShouldNotBeTVs so how can I investigate whether it's in the kernel part? All we know is that I had kernel version 6.2.6 and now I have 6.8.0.

        twomee Are there other desktop environments that are like Xfce?

        I've usually had good luck with the MATE desktop environment in the past. It's very similar to Xfce, as both use the "panels" approach to the desktop, and I remember that it had a toggle to disable the compositor entirely.
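
        If I remember right, that MATE toggle maps to a gsettings key roughly like the one below, though the schema name is worth confirming on a current MATE version:

        gsettings set org.mate.Marco.general compositing-manager false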

        FWIW, on my old 2012 Windows laptop with Intel HD 4000:

        • MATE desktop + version of arch linux from around mid-2019 was usable (back when I used this, I was very productive on the laptop). Back then I used it combined with swapping out the default compositor with some modern fork of the 2000s-style Compiz+Emerald compositor — forgot which. Other times I used it without any compositor at all. Both were pretty good. Not sure about the "default" compositor as I didn't use it for long before changing it due to wanting to be able to more heavily customize the UI.

        • Windows 8.1 + ditherig.exe is very usable (laptop is old enough to actually get real improvement from ditherig). I am also very productive today using this setup and Win8.1 is also working great to remote desktop into Macs without strain via the NoMachine app.

        • Ubuntu 22.04 with the default GNOME desktop is NOT usable! (I can see it dither with my own eyes)

        This doesn't really say much about modern laptops, but I've always found MATE much more "easy to read" than GNOME even going back to the era before I knew that my vision problems could be connected to display output.

        Some findings:

        I found that the problem I have is related to the kernel. I took the OS with kernel 6.2, updated it to 6.8, and the eye strain appeared. I tried to investigate the kernel config and found there is a value CONFIG_DRM_ACCEL which is enabled starting from kernel 6.6.10. There are more parameters related to this DRM, but I think this is the main issue. I read on the internet that DRM (Direct Rendering Manager) is the kernel piece that connects the PC to the GPU to draw the video on the screen. It seems it uses acceleration now. I tried to disable hardware acceleration but it didn't change anything, I still got eye strain.
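
        For anyone who wants to check their own kernel: on Ubuntu/Pop!_OS the build config ships in /boot, so you can list the DRM acceleration switches like this:

        grep -i 'CONFIG_DRM' /boot/config-$(uname -r) | grep -i ACCEL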

        I also tried Wayland, and it seems that although I was on the safe kernel, the 6.2 version, I still got eye strain. I think the investigation of Wayland vs X11 will be more complicated because it's a much bigger topic. But I believe it uses acceleration the same way the new kernel versions do.
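
        By the way, an easy way to confirm which session type you are really running, because some login screens fall back silently:

        echo $XDG_SESSION_TYPE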

        That does not sound good. Having mentioned my "good" setup, I should add I still use the older kernel 6.1.0, which comes with Debian Bookworm.

          KM when I configured one of my crypto mining PCs, which runs Linux, it felt OK for the short amount of time I viewed its output. That one is a Ryzen 7950X iGPU and was also using the 6.10 kernel. So is the 6.10 kernel the latest that is safe?

          twomee wrote that 6.2 was good for him but 6.8 isn't anymore and that he suspects the change for the worse occurred in 6.6.10.

          Some new findings:

          I tried kernel version 6.3 and I felt a little eye pain compared to kernel version 6.2.

          After that I tried kernel version 6.4, and from this version I felt a big jump in the eye pain.

          So the big problematic version for me is kernel 6.4.

          The safe one for me is up to version 6.2.
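
          For anyone who wants to repeat this test on an Ubuntu-based distro: one common way, as far as I know, is to download the .deb packages of a specific version from https://kernel.ubuntu.com/mainline/ into an empty folder, install them, and then pick that kernel in the boot menu. Roughly:

          sudo dpkg -i *.deb
          uname -r   # after rebooting into the chosen entry, to confirm the version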

          So from what I've been reading in other forum posts, more and more of this GPU acceleration in web browsers and just "in the OS" is happening? I can use Windows 10 with the latest 22H2 update for many hours, but on this same hardware I have tried many different "popular" versions of Linux with different DEs, and all of them give me mostly the same feelings of eye strain, foggy head, and inability to focus on text. It sounds like we need to be able to dig into the Linux kernels and drivers in order to start turning things off so we can comfortably use these Linux OSes? I have really bad eye strain in Windows 11 and on any Mac of any kind (M1/M2/M3, Mac mini, etc.). I'm convinced this has to be at an OS level, and for Linux maybe even deeper, like at the kernel level. I'm trying to learn more about the technicals of Linux and hardware, so if anyone has some things to read, I'm all about it 🙂

            whystrainwhy I can use Windows 10 with the latest 22H2 update for many hours, but on this same hardware I have tried many different "popular" versions of Linux with different DEs, and all of them give me mostly the same feelings of eye strain, foggy head, and inability to focus on text. It sounds like we need to be able to dig into the Linux kernels and drivers in order to start turning things off so we can comfortably use these Linux OSes?

            Yep, this is because Linux, especially on Intel, forces dithering on in many cases, such as on laptops with 6 bpc panels

            Here's a good thread from the Intel graphics driver development mailing list to start researching; some code is linked at the bottom, which I'm pretty sure is the file in the Intel Linux driver source where the decision happens

            https://www.mail-archive.com/intel-gfx@lists.freedesktop.org/msg253558.html

            ^ By searching "dither" on this site's search field you can find more interesting threads about the intel drivers

            One quote that confirms what many of us have thought:

            The problem with dithering on Intel is that various tested Intel gpu's (Ironlake, IvyBridge, Haswell, Skylake iirc.) are dithering when they shouldn't. If one has a standard 8 bpc framebuffer feeding into a standard (legacy) 256 slots, 8 bit wide lut which was loaded with an identity mapping, feeding into a standard 8 bpc video output (DVI/HDMI/DP), the expected result is that pixels rendered into the framebuffer show up unmodified at the video output. What happens instead is that some dithering is needlessly applied. This is bad for various neuroscience/medical research equipment that requires pixels to pass unmodified in a pure 8 bpc configuration

            https://github.com/torvalds/linux/blob/f06ce441457d4abc4d76be7acba26868a2d02b1c/drivers/gpu/drm/i915/display/intel_display.c#L4749

            https://github.com/torvalds/linux/blob/f06ce441457d4abc4d76be7acba26868a2d02b1c/drivers/gpu/drm/i915/display/intel_lvds.c#L288
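
            If you'd rather search locally than click through GitHub, grepping a kernel checkout narrows the dithering decisions down quickly:

            git clone --depth 1 https://github.com/torvalds/linux.git
            grep -rn -i dither linux/drivers/gpu/drm/i915/display/ | head -n 20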

              DisplaysShouldNotBeTVs this is great information! I'm so glad that my theories have some kind of basis 😃 I mean, you can read and read about anecdotal evidence, but after trying all of these different OSes on 10+ different laptops (and my own PC with external monitors), it surely has to be either the GPU "just doing the dithering anyway" or the OS "just doing the dithering anyway" even after "disabling it". Early on, when I was only focused on laptops, I thought it was just a PWM thing, but after many laptops that were shown to have "no PWM", I thought that maybe the integrated GPU was doing weird stuff with the signal. After trying laptops with both an integrated GPU AND an "external" GPU like NVIDIA, and many failed attempts at using Windows 11 on those laptops, that's when I tried Windows 11 on my desktop PC and got the same exact eye strain problems.

              I'm still a bit new to this, so when it comes to the specifics of the Intel drivers: I don't believe my desktop PC has any Intel-specific components, though I do have an AMD Ryzen processor and an NVIDIA GeForce 3080 Ti. After looking at this particular link that you linked: https://www.mail-archive.com/intel-gfx@lists.freedesktop.org/msg253558.html

              ^ Is there a way to completely disable all dithering at the GPU or driver level via the command line? Also, is it possible that true 8 or 10 bit displays would make the dithering unnecessary, or do you think the dithering would just happen anyway? I'd love to find more information about where this kind of technology is heading, because if just "doing the simple things" like watching videos in a web browser, or playing a game, requires beefier and beefier hardware, then is it inevitable that this dithering will be everywhere as well?

              And fyi, while using the nouveau drivers, I ran this command:

              xrandr --output DP-3 --set "dithering mode" "off"

              ^ When I check xrandr --prop it appears that the dithering mode was switched from auto to off - Would this be the correct way to disable dithering at the GPU level even though it's targeting the monitor?
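
              For anyone else checking their outputs, filtering the property dump makes the dithering lines easier to spot:

              xrandr --prop | grep -i -A2 dither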

                whystrainwhy Also, is it possible that true 8 or 10 bit displays would make the dithering unnecessary or do you think the dithering would just happen anyway?

                Unfortunately there are still "reasons" to dither even if the desktop and monitor have the exact same bit depth - mainly color calibration.

                An 8 bit desktop has 256 shades of each RGB component, but once a calibration is applied, e.g. "night shift" settings, gamma correction, automatic contrast enhancement, displaying sRGB gamut content "mapped within" a P3 gamut environment, LCD panel uniformity compensation and such… suddenly you can't directly display all 256 shades of gray anymore, because the color range is being "crushed" by the calibration.

                The calibration will result in a ton of colors getting mapped to floating point values instead of whole numbers, meaning that "just rounding them up" would cause some form of subtle banding, no matter how high the bit depth.

                GPU companies want to eliminate banding whenever they can in order to make images look "smoother", so dithering is applied to "fake" these in-between shades of colors. Even if you have 8 bit content connected to a true 8 bit monitor, there are going to be enough layers of calibration between the desktop and the output that a 1-to-1 mapping isn't possible without either rounding up/down (banding) or dithering.
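
                A toy example with made-up numbers: suppose a night-shift/gamma curve maps input level 200 to 183.4 and input level 201 to 184.1. The output can only show whole steps, so the driver either rounds (nearby inputs can collapse onto the same step, which shows up as banding) or it dithers, flickering the pixel between 183 and 184 across space or time so it averages out to 183.4. That flicker is exactly the kind of pixel movement some of us seem to react to.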

                The push for HDR makes all of this even worse, because the goal of HDR is to represent even more colors, such as a shade of white that's "brighter than typical pure white". All of the math required to map pretty much the "totally arbitrary and relative floating-point shades of colors" that HDR requires into even a 10 bit output while still remaining able to "eliminate" banding = dithering everywhere

                This can still all be avoided by simply rounding up/down instead of dithering, but GPU companies don't want to do that as it would make their image "quality" and color "accuracy" look "worse" compared to competitors.

                Today, they just throw dithering on everything as a catch-all "solution" - even when the banding would be slight enough that no one aside from tech reviewers and the film/photography/publishing sector (AKA professions where color "accuracy" is important above all else and text clarity and readability do not matter as much) would care.

                This is also why expensive "Pro"/artist/studio-branded products are even worse in regards to text clarity, eye strain, and ability to clearly present information-dense UI and content, compared to cheaper and "less color accurate" products.

                  DisplaysShouldNotBeTVs This is really interesting and once again validates my own research and just what I've noticed when I go to buy and try things. Do you have a particular setup that you feel is eyesafe for you?

                  Also, do you have any predictions of how things might be in 4-5 years? My assumption is that GPU manufacturers will keep doing exactly what they've always done, basically "we sell the prettiest picture", and since the majority of people are not sensitive to this kind of thing (even though I've had people tell me their eyes have gotten worse in just the last year due to screens), they wouldn't be motivated to do anything that could help the few of us who suffer like this. There was another post I saw where someone was able to make Windows 11 comfortable by basically completely disabling a bunch of settings (https://ledstrain.org/d/2421-windows-11-eyestarin-and-headches) - I have been wanting to try this out, but naturally when I try to install Windows 11 on a secondary drive, I run into tons of issues preventing the install from being successful haha.

                  I'm not the biggest fan of a "wait and see and hope" approach to these manufacturers who run the show, but I've also seen that some people will have some luck with the same OS or configuration on one machine or GPU/monitor and then it has little effect on another set of hardware. So is it possible that some GPUs and monitor combinations are handling it better than others? I don't think there's much we can do for kernels and video drivers to tweak things to prevent all of this happening is there? At least not for Windows and Mac OSX heh.

                    whystrainwhy So is it possible that some GPUs and monitor combinations are handling it better than others?

                    I think, yes

                    My Z690 (I have 2 PCs) with the 2022 BIOS or the newest one gives eye strain with every graphics card I tried (1060, 2070S, 3080, 3080 Ti), with old or new NVIDIA drivers.

                    But without a graphics card, using the iGPU only, there are no issues at all. It seems PCIe 5.0 has some troubles, but I also read here about troubles on the same LGA1700 chipset motherboard with a 12400F, which has a PCIe 4.0 slot.

                    So the issue is in the motherboard. People also got eye strain after changing only the monitor on the same PC build (I think that issue is in the software brightness level model, something like T-PWM).

                    Also, some of them got eye strain after changing only the graphics card (1060 to 3070).

                    Maybe old motherboards rendered color in 8-bit only, while modern motherboards, graphics cards, and monitors use wide gamut and 10-bit. I also read how Microsoft presents CCCS as a wide colorspace where each program upscales its colors to this colorspace, which is then downscaled to whatever colorspace/color depth the monitor can accept. In theory, all these up/downscalers need a powerful GPU, or they can use fast math (16-bit half precision floats instead of 32-bit single precision) and dithering to smooth out the banding.

                    Old PCs only have the monitor's FRC issue, while new PCs have: 1) motherboard dithering, 2) Windows/Linux/macOS dithering, 3) the monitor's T-PWM dithering. The signal picks up too much noise after 3 layers of dithering, which results in headaches when the eye tries to focus on chaotic pixel movement.

                      simplex this is exactly what I thought! I even wondered if the BIOS has anything to do with it. It's interesting you brought up BIOSes from 2022 and newer. I seem to be fine with a computer I built in 2022 running Windows 10, but Windows 11 and every Linux version I've tried has given me issues. The last thing I'm going to try is different AMD cards with my Linux distro to see if that does anything. I saw that System76 uses an open source BIOS called coreboot, which comes pre-installed along with Pop!_OS if you buy any of their laptops, which I believe are just Clevo laptops. I haven't tried any System76 machines, but I've definitely tried Pop!_OS on different machines and had the same issues with headaches and inability to focus on text after just a couple of hours. Do you think there could be some value in exploring those specific combinations of System76 machines with coreboot and Pop!_OS?

                        simplex Old PCs only have the monitor's FRC issue

                        yup, plus on e.g. Windows XP/Vista/7 and PowerPC-era OS X, you actually had an easily accessible option for true 6-bit "thousands of colors" mode — meaning it used to be possible to generate a signal that was able to avoid activating FRC on some older monitors entirely.

                        On the Windows side the "thousands of colors" option was removed with Windows 8. Since then, any old CCFL monitor that used to activate FRC dithering only when receiving an 8-bit signal is now essentially forced to apply FRC at all times.

                        If I think all the way back to the Windows XP days, I'm almost certain that I used to have the "thousands of colors" mode selected in Display Properties. I actually remember having some sort of vague preference for it (instead of using millions of colors) even back then.

                        I still own the same Windows XP era monitor, and of course, nowadays… I can see it apply its own internal "checkerboard-style" FRC whenever it's fed a modern 8-bit DVI signal, even when connected to a GPU that doesn't dither.
