Linux - investigating eye strain problem

twomee wrote that 6.2 was good for him but 6.8 isn't anymore and that he suspects the change for the worse occurred in 6.6.10.

Some new findings:

I tried kernel version 6.3 and felt a little more eye pain than on kernel version 6.2.

After that I tried kernel version 6.4, and from this version on I felt a big jump in eye pain.

So the big problematic version for me is kernel 6.4.

Anything up to version 6.2 is safe for me.

So from what I've been reading in other forum posts, more and more of this GPU acceleration in web browsers and just "in the OS" is happening? I can use Windows 10 with the latest 22H2 update for many hours, but on this same hardware I have tried many different "popular" versions of Linux with different DEs, and all of them give me mostly the same feelings of eyestrain, foggy head, and inability to focus on text. It sounds like we need to be able to dig into the Linux kernel and drivers in order to start turning things off so we can comfortably use these Linux OSes? I have really bad eyestrain in Windows 11 and on any Mac of any kind (M1/M2/M3, Mac mini, etc.). I'm convinced this has to be at an OS level, and for Linux maybe even deeper, at the kernel level. I'm trying to learn more about the technicals of Linux and hardware, so if anyone has some things to read, I'm all about it πŸ™‚

    whystrainwhy I can use Windows 10 with the latest 22H2 update for many hours, but on this same hardware I have tried many different "popular" versions of Linux with different DEs, and all of them give me mostly the same feelings of eyestrain, foggy head, and inability to focus on text. It sounds like we need to be able to dig into the Linux kernel and drivers in order to start turning things off so we can comfortably use these Linux OSes?

    Yep, this is because Linux, especially on Intel, forces dithering on in many cases, such as on laptops with 6 bpc panels

    Here's a good thread from the Intel graphics driver development mailing list to start researching; some code is linked at the bottom, which I'm pretty sure is the file in the Intel Linux driver source where the decision happens:

    https://www.mail-archive.com/intel-gfx@lists.freedesktop.org/msg253558.html

    ^ By searching "dither" in this site's search field you can find more interesting threads about the Intel drivers

    One quote that confirms what many of us have thought:

    The problem with dithering on Intel is that various tested Intel gpu's (Ironlake, IvyBridge, Haswell, Skylake iirc.) are dithering when they shouldn't. If one has a standard 8 bpc framebuffer feeding into a standard (legacy) 256 slots, 8 bit wide lut which was loaded with an identity mapping, feeding into a standard 8 bpc video output (DVI/HDMI/DP), the expected result is that pixels rendered into the framebuffer show up unmodified at the video output. What happens instead is that some dithering is needlessly applied. This is bad for various neuroscience/medical research equipment that requires pixels to pass unmodified in a pure 8 bpc configuration

    https://github.com/torvalds/linux/blob/f06ce441457d4abc4d76be7acba26868a2d02b1c/drivers/gpu/drm/i915/display/intel_display.c#L4749

    https://github.com/torvalds/linux/blob/f06ce441457d4abc4d76be7acba26868a2d02b1c/drivers/gpu/drm/i915/display/intel_lvds.c#L288
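    If you want to poke at the driver code yourself, one low-effort way to follow these dithering decisions across kernel versions (assuming you have git installed and don't mind a large download; the line numbers in the links above shift between releases) is to grep the i915 display code:

    # shallow-clone the kernel and search the Intel display driver for dither logic
    git clone --depth 1 https://github.com/torvalds/linux.git
    grep -rn -i "dither" linux/drivers/gpu/drm/i915/display/ | less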

      DisplaysShouldNotBeTVs this is great information! I'm so glad that my theories have some kind of basis πŸ˜ƒ I mean you can read and read about anecdotal evidence, but after trying all of these different OSes on 10+ different laptops (and my own PC with external monitors), it surely has to be either the GPU "just doing the dithering anyway" or the OS "just doing the dithering anyway" even after "disabling it" - I remember early on, when I was only focused on laptops, I thought it was just a PWM thing, but after many laptops that were shown to have "no PWM", I thought that maybe it was the integrated GPU doing weird stuff with the signal. After trying laptops with both an integrated GPU AND an "external" GPU like Nvidia, and many failed attempts at using Windows 11 on those laptops, that's when I tried Windows 11 on my desktop PC with the same exact problems with eyestrain.

      As I'm still a bit new to this when it comes to the specifics of the Intel drivers: I don't believe my desktop PC has any Intel-specific components, though I do have my AMD Ryzen processor and my Nvidia GeForce 3080 Ti. After looking at this particular link that you linked: https://www.mail-archive.com/intel-gfx@lists.freedesktop.org/msg253558.html

      ^ Is there a way to completely disable all dithering at the GPU or driver level via the command line? Also, is it possible that true 8- or 10-bit displays would make the dithering unnecessary, or do you think the dithering would just happen anyway? I'd love to find more information about the implications of the future of this kind of technology, because if just "doing the simple things" like watching videos in a web browser or playing a game requires beefier and beefier hardware, then is it inevitable that this dithering will be everywhere as well?

      And fyi, while using the nouveau drivers, I ran this command:

      xrandr --output DP-3 --set "dithering mode" "off"

      ^ When I check xrandr --prop it appears that the dithering mode was switched from auto to off - Would this be the correct way to disable dithering at the GPU level even though it's targeting the monitor?
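      In case it helps anyone else, here is the full set of commands I used to check (on my setup nouveau also exposes a "dithering depth" property, so I set that too):

      xrandr --output DP-3 --set "dithering mode" "off"
      xrandr --output DP-3 --set "dithering depth" "auto"
      xrandr --prop | grep -i -A 1 dither

      And if I ever switch to the proprietary NVIDIA driver, my understanding (not yet verified on my machine) is that the equivalent knob is the Dithering attribute in nvidia-settings:

      nvidia-settings -q all | grep -i dither    # list the driver's dithering attributes
      nvidia-settings -a "Dithering=2"           # 0 = auto, 1 = enabled, 2 = disabled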

        whystrainwhy Also, is it possible that true 8- or 10-bit displays would make the dithering unnecessary, or do you think the dithering would just happen anyway?

        Unfortunately there are still "reasons" to dither even if the desktop and monitor have the exact same bit depth - mainly color calibration.

        An 8 bit desktop has 256 shades of each RGB component, but if a calibration is applied e.g. to apply "night shift" settings, gamma correction, automatic contrast enhancement, displaying sRGB gamut content "mapped within" a P3 gamut environment, LCD panel uniformity compensation and such… suddenly, you can't directly display all 256 shades of gray anymore because the color range is being "crushed" by the calibration.

        The calibration will result in a ton of colors getting mapped to floating point values instead of whole numbers, meaning that "just rounding them up" would cause some form of subtle banding, no matter how high the bit depth.
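        You can actually see this "crushing" with a toy one-liner - apply a mild gamma-style curve to all 256 levels of an 8-bit ramp and count how many distinct output codes survive plain rounding (the exact curve here is made up, it just stands in for any calibration step):

        awk 'BEGIN { for (i = 0; i < 256; i++) seen[int(255 * (i/255)^1.1 + 0.5)] = 1; n = 0; for (v in seen) n++; print n " of 256 levels survive rounding" }'

        Every level that gets merged away is a visible band - dithering is how the GPU "fakes" those lost in-between shades instead.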

        GPU companies want to eliminate banding whenever they can in order to make images look "smoother", so dithering is applied to "fake" these in-between shades of colors. Even if you have 8-bit content connected to a true 8-bit monitor, there are going to be enough layers of calibration between the desktop and the output that a 1-to-1 mapping isn't possible without either rounding up/down (banding) or dithering.

        The push for HDR makes all of this even worse, because the goal of HDR is to represent even more colors, such as a shade of white that's "brighter than typical pure white". All of the math required to map pretty much the "totally arbitrary and relative floating-point shades of colors" that HDR requires into even a 10 bit output while still remaining able to "eliminate" banding = dithering everywhere

        This can still all be avoided by simply rounding up/down instead of dithering, but GPU companies don't want to do that as it would make their image "quality" and color "accuracy" look "worse" compared to competitors.

        Today, they just throw dithering on everything as a catch-all "solution" - even when the banding would be slight enough to the point where no one aside from tech reviewers and the film/photography/publishing sector (AKA professions where color "accuracy" is important above all else and text clarity and readability do not matter as much) would care.

        This is also why expensive "Pro"/artist/studio-branded products are even worse in regards to text clarity, eye strain, and ability to clearly present information-dense UI and content, compared to cheaper and "less color accurate" products.

          DisplaysShouldNotBeTVs This is really interesting and once again validates my own research and just what I've noticed when I go to buy and try things. Do you have a particular setup that you feel is eyesafe for you?

          Also, do you have any predictions of how things might be in 4-5 years? My assumption is that GPU manufacturers will keep doing exactly what they've always done - basically "we sell the prettiest picture" - and since the majority of people are not sensitive to this kind of thing (even though I've had people tell me their eyes have gotten worse in just the last year due to screens), they wouldn't be motivated to do anything that could help the few of us who suffer like this. There was another post I saw where someone was able to make Windows 11 comfortable by basically completely disabling a bunch of settings (https://ledstrain.org/d/2421-windows-11-eyestarin-and-headches) - I have been wanting to try this out, but naturally when I try to install Windows 11 on a secondary drive, I run into tons of issues preventing the install from being successful haha.

          I'm not the biggest fan of a "wait and see and hope" approach to these manufacturers who run the show, but I've also seen that some people have luck with a given OS or configuration on one machine or GPU/monitor combo, and then it does little on another set of hardware. So is it possible that some GPUs and monitor combinations are handling it better than others? I don't think there's much we can do in the kernels and video drivers to tweak things and prevent all of this from happening, is there? At least not for Windows and macOS heh.

            whystrainwhy So is it possible that some GPUs and monitor combinations are handling it better than others?

            I think so, yes.

            My Z690 (I have two of them), whether on a 2022 BIOS or the newest one, gives eye strain with every graphics card I've tried (1060, 2070 Super, 3080, 3080 Ti), on old or new NVIDIA drivers.

            But without a discrete card, using the iGPU only: no issues at all. It seems PCIe 5.0 has some troubles, though I've also read here about troubles on a motherboard with the same LGA1700 chipset but a 12400F, which has a PCIe 4.0 slot.

            So the issue is in the motherboard. People have also gotten eye strain after only a monitor change with the same PC build (I think that issue is in the software brightness model, something like "temporal PWM").

            Some also got eye strain after only a graphics card change (1060 to 3070).

            Maybe old motherboards rendered color in 8-bit only, while modern motherboards, graphics cards, and monitors use wide gamut and 10-bit. I've also read how Microsoft presents CCCS as a wide color space: each program upscales its colors into that color space, which is then downscaled to whatever color space / color depth the monitor can accept. In theory, all of those up/downscalers either need a powerful GPU or have to use fast math (16-bit half-precision floats instead of 32-bit single precision) plus dithering to smooth the banding.
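            A tiny example of why those up/downscale steps can't be exact (just arithmetic, nothing driver-specific) - re-encoding a plain 8-bit code for a 10-bit output already lands between two output codes, so the driver has to round or dither:

            awk 'BEGIN { printf "8-bit code 200 = %.6f of full scale -> x 1023 = %.2f (between 10-bit codes 802 and 803)\n", 200/255, 200/255*1023 }'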

            Old PCs only had the monitor's FRC issue, while new PCs have: 1) motherboard dithering, 2) Windows/Linux/macOS dithering, 3) the monitor's temporal-PWM dithering... The signal picks up too much noise after three layers of dithering, and the result is a headache when the eye tries to focus on chaotic pixel movement.

              simplex this is exactly what I thought! I even wondered if the BIOS had anything to do with it. It's interesting you brought up BIOSes from 2022 and newer. I seem to be fine with a computer I built in 2022 on Windows 10, but Windows 11 and every Linux version I've tried has given me issues. The last thing I'm going to try is different AMD cards with my Linux distro to see if that does anything. I saw that System76 uses an open-source BIOS called coreboot, and their laptops (which I believe are just rebranded Clevo laptops) come preinstalled with Pop!_OS. I haven't tried any System76 machines, but I've definitely tried Pop!_OS on different machines and had the same issues with headaches and inability to focus on text after just a couple of hours. Do you think there could be some value in exploring those specific combinations of System76 machines with coreboot and Pop!_OS?

                simplex Old PCs only had the monitor's FRC issue

                yup, plus on e.g. Windows XP/Vista/7 and PowerPC-era OS X, you actually had an easily accessible option for a true 16-bit "thousands of colors" mode (5-6 bits per channel) - meaning it used to be possible to generate a signal that avoided activating FRC on some older monitors entirely.

                On the Windows side the "thousands of colors" option was removed with Windows 8. Since then, any old CCFL monitor that used to activate FRC dithering only when receiving an 8-bit signal is essentially forced to apply FRC at all times.

                If I think all the way back to Windows XP, I'm almost certain that I used to have the thousands-of-colors mode selected in Display Properties. I actually remember having some sort of vague preference for it (instead of millions of colors) even back then.

                I still own the same Windows XP-era monitor and of course, nowadays… I can see it apply its own internal "checkerboard-style" FRC whenever it's fed a modern 8-bit DVI signal, even when connected to a GPU that doesn't dither.

                  5 months later

                  I'm late to the party, but I suffered from eye strain/pain for several months. I tried different distros and different laptops with Intel and NVIDIA GPUs. No matter what settings I tried, I could never fix my issue, so I kept going back to Windows/WSL2.
                  Now I'd like to report that my eye strain is finally gone! My solution was to buy a QD-OLED monitor (specifically, the HP Omen Transcend), but I assume any other QD-OLED monitor should work.
                  On this new monitor everything looks crystal clear, no matter which laptop or graphics card I use. Even my Raspberry Pi 400 looks clear.
                  It did take a few days for my eyes to adjust to the new monitor, but it was worth it!
                  While I was struggling with this, besides trying software/hardware configurations, I also had my eyes checked.
                  Now I'm running Ubuntu 24.04 with the default setup, except I installed fonts-ubuntu-classic and nvidia-driver-550 (although the nouveau driver was working fine as well). My laptop does not have a DisplayPort, so I'm using USB-C (but HDMI also works fine).
                  In any case, I thought I'd share this hoping that it helps someone else!

                    jaymdev Hi. Can you explain what symptoms you had? Burning eyes? Red eyes? Or something else? And what did you mean by "it did take a few days for my eyes to adjust…"?

                    I had burning eyes and eye pain (mostly in my left eye). I also had headaches concentrated around the left eye.

                    These symptoms always happened while using Linux, but not Windows 10/11 on the same exact hardware.

                    I was using LED IPS monitors (BenQ GW2485TC) for a couple of years. I kept dual-booting Linux (Ubuntu, Mint, Pop!_OS, Fedora, Arch, Manjaro). Arch/XFCE worked best out of all the distros, but still not as well as Windows.

                    The first couple of days with the new qd-oled monitor I still had the same mild symptoms, to the point where I thought it wouldn’t work for me and I was going to have to return it. But the following day, the symptoms were gone and I can use it for extended sessions comfortably. I’ve only had it for about a week now though, and I’m hoping the symptoms won’t return.

                    I switched from LED IPS FHD to QD-OLED UHD. I’m now wondering if a good quality IPS (maybe UHD or 4K) may also solve this problem.

                      jaymdev

                      Your new monitor may have a true 10-bit panel. According to the Linux kernel code, some graphics drivers cap output at 10 bits per channel, and at that limit dithering is not applied. However, this may not apply to your situation, since it all depends on the driver and the model of the graphics card.
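                      If you want to check what your driver is negotiating: on setups where the kernel exposes it (recent i915 and amdgpu do), there is a per-output "max bpc" connector property you can read, and force down, with xrandr. The output name DP-1 here is just an example - substitute your own from xrandr's listing:

                      xrandr --prop | grep -i -A 1 "max bpc"
                      xrandr --output DP-1 --set "max bpc" 8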

                      a month later

                      jaymdev do you still use this screen without any eyestrain?

                      If so, could you please check the PWM frequency on this monitor?
