  • Linux - investigating eye strain problem

whystrainwhy Also, is it possible that true 8- or 10-bit displays would make the dithering unnecessary, or do you think the dithering would just happen anyway?

Unfortunately there are still "reasons" to dither even if the desktop and monitor have the exact same bit depth - mainly color calibration.

An 8-bit desktop has 256 shades of each RGB component, but once a calibration is applied ("night shift" settings, gamma correction, automatic contrast enhancement, displaying sRGB-gamut content "mapped within" a P3-gamut environment, LCD panel uniformity compensation, and such)… suddenly you can't directly display all 256 shades of gray anymore, because the color range is being "crushed" by the calibration.

The calibration will result in a ton of colors getting mapped to floating point values instead of whole numbers, meaning that "just rounding them up" would cause some form of subtle banding, no matter how high the bit depth.
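
To make the rounding argument concrete, here's a minimal sketch (the 1.1 gamma is an arbitrary stand-in for whatever calibration is active, not any real profile): apply a correction curve to all 256 gray levels, round back to 8-bit codes, and count how many distinct shades survive.

```python
# Toy illustration of "rounding the calibrated values causes banding".
# The 1.1 gamma is an arbitrary stand-in for any calibration step
# (night shift, uniformity compensation, gamut mapping, etc.).
gamma = 1.1

ideal   = [255 * (v / 255) ** gamma for v in range(256)]  # fractional targets
rounded = [round(x) for x in ideal]                       # plain rounding

print("distinct input shades :", 256)
print("distinct output shades:", len(set(rounded)))       # fewer than 256 -> banding
print("inputs 1 and 2 both become:", rounded[1], "and", rounded[2])
```

Rounding a monotone curve like this both merges neighboring input shades and skips some output codes entirely, which is exactly what shows up as visible steps in what should be a smooth gradient.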

GPU companies want to eliminate banding whenever they can in order to make images look "smoother", so dithering is applied to "fake" these in-between shades of color. Even if you have 8-bit content connected to a true 8-bit monitor, there are going to be enough layers of calibration between the desktop and the output that a 1-to-1 mapping isn't possible without either rounding up/down (banding) or dithering.
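
And that is all dithering really is in this context: instead of rounding the fractional target, the output alternates between the two nearest whole codes so the average lands on the target. A toy version (random temporal dithering with an arbitrary target of 100.3; real GPUs use fancier spatio-temporal patterns, but the averaging idea is the same):

```python
import random

random.seed(1)
ideal = 100.3                          # arbitrary fractional shade the calibration asks for
base, frac = int(ideal), ideal % 1

# Each frame still outputs a whole 8-bit code; the time-average hits the target.
frames = [base + (1 if random.random() < frac else 0) for _ in range(10000)]

print("codes actually sent:", sorted(set(frames)))                   # [100, 101]
print("average over frames:", round(sum(frames) / len(frames), 3))   # ~100.3
```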

The push for HDR makes all of this even worse, because the goal of HDR is to represent even more colors, such as a shade of white that's "brighter than typical pure white". All of the math required to map the "totally arbitrary and relative floating-point shades of colors" that HDR requires into even a 10-bit output, while still being able to "eliminate" banding = dithering everywhere.
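
As one concrete example, here's the SMPTE ST 2084 "PQ" curve that HDR10 uses to place absolute luminance onto a 10-bit scale (the sketch assumes full-range 0-1023 codes for simplicity; real video signals are usually limited-range). Almost any real-world luminance lands between two integer codes, so the last stage still has to round or dither:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> PQ signal 0..1.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (0.1, 1, 100, 1000, 4000):
    print(f"{nits:>6} nits -> ideal 10-bit code {pq_encode(nits) * 1023:.2f}")
```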

This can still all be avoided by simply rounding up/down instead of dithering, but GPU companies don't want to do that as it would make their image "quality" and color "accuracy" look "worse" compared to competitors.

Today, they just throw dithering on everything as a catch-all "solution" - even when the banding would be slight enough that no one aside from tech reviewers and the film/photography/publishing sector (AKA professions where color "accuracy" matters above all else, and text clarity and readability matter less) would care.

This is also why expensive "Pro"/artist/studio-branded products are even worse in regards to text clarity, eye strain, and ability to clearly present information-dense UI and content, compared to cheaper and "less color accurate" products.

    DisplaysShouldNotBeTVs This is really interesting and once again validates my own research and just what I've noticed when I go to buy and try things. Do you have a particular setup that you feel is eyesafe for you?

    Also, do you have any predictions of how things might be in 4-5 years? My assumption is that GPU manufacturers will keep doing exactly what they've always done, basically "we sell the prettiest picture", and since the majority of people are not sensitive to this kind of thing (even though I've had people tell me their eyes have gotten worse in just the last year due to screens), they wouldn't be motivated to do anything that could help the few of us who suffer like this. There was another post I saw where someone was able to make Windows 11 comfortable by basically completely disabling a bunch of settings (https://ledstrain.org/d/2421-windows-11-eyestarin-and-headches) - I have been wanting to try this out, but naturally when I try to install Windows 11 on a secondary drive, I run into tons of issues preventing the install from being successful haha.

    I'm not the biggest fan of a "wait and see and hope" approach with these manufacturers who run the show, but I've also seen that some people have luck with a certain OS or configuration on one machine or GPU/monitor, and then it has little effect on another set of hardware. So is it possible that some GPUs and monitor combinations are handling it better than others? I don't think there's much we can do with kernels and video drivers to tweak things to prevent all of this from happening, is there? At least not for Windows and macOS, heh.

      whystrainwhy So is it possible that some GPUs and monitor combinations are handling it better than others?

      I think so, yes.

      My Z690 boards (I have 2 of them), on either the 2022 BIOS or the newest one, give eye strain with every graphics card I've tried (1060, 2070S, 3080, 3080 Ti), on old or new NVIDIA drivers.

      But without a discrete card, using the iGPU only - no issues at all. It seems like PCIe 5.0 has some problems, but I've also read here about trouble on the same LGA1700-chipset motherboard paired with a 12400F, which gives a PCIe 4.0 slot.

      So the issue is in the motherboard. People have also gotten eye strain after changing only the monitor on the same PC build (I think that issue is in a software brightness-control scheme, something like "T-PWM").

      Some of them also got eye strain after changing only the graphics card (1060 to 3070).

      Maybe old motherboards rendered color in 8-bit only, while modern motherboards, graphics cards, and monitors use wide gamut and 10-bit. I've also read how Microsoft presents CCCS as a wide color space: each program upscales its colors into that space, and it then gets downscaled to whatever color space / color depth the monitor can accept. In theory, all of these up/downscalers either need a powerful GPU or they use fast math (16-bit half-precision floats instead of 32-bit single precision) plus dithering to smooth out the banding.

      Old PCs only had the monitor's FRC issue, while new PCs have: 1) motherboard dithering, 2) Windows/Linux/macOS dithering, 3) the monitor's T-PWM dithering... the signal picks up too much noise after 3 layers of dithering, and the result is headaches when the eye tries to focus on chaotic pixel movement.
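
      Whether stacked stages really add up like that depends on the actual hardware, but here's a toy Monte Carlo of the idea (everything here is an assumption, not a model of any specific motherboard/OS/monitor; the 0.98/1.01/0.97 gains just stand in for each stage applying its own small correction before re-quantizing): the per-pixel flicker grows as dithering stages are stacked.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      def dithered_requantize(x):
          """Quantize to whole codes using random temporal dithering (stochastic rounding)."""
          return np.floor(x + rng.random(x.shape))

      def pipeline(gray_code, gains, n_frames=5000):
          """Push a constant gray through several 'small correction + dithered re-quantize' stages."""
          x = np.full(n_frames, float(gray_code))
          for g in gains:
              x = dithered_requantize(x * g)
          return x

      print("temporal std, 1 dither stage :", round(pipeline(128, [0.98]).std(), 3))
      print("temporal std, 3 dither stages:", round(pipeline(128, [0.98, 1.01, 0.97]).std(), 3))
      ```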

        simplex This is exactly what I thought! I even wondered if the BIOS has anything to do with it. It's interesting that you brought up BIOSes from 2022 and newer. I seem to be fine with a computer I built in 2022 running Windows 10, but Windows 11 and every Linux version I've tried have given me issues. The last thing I'm going to try is different AMD cards with my Linux distro to see if that does anything. I saw that System76 uses an open-source BIOS called coreboot, which comes pre-installed along with Pop OS if you buy any of their laptops (which I believe are just Clevo laptops). I haven't tried any System76 machines, but I've definitely tried Pop OS on different machines and had the same issues with headaches and inability to focus on text after just a couple of hours. Do you think there could be some value in exploring those specific combinations of System76 machines with coreboot and Pop OS?

          simplex Old PCs only had the monitor's FRC issue

          yup, plus on e.g. Windows XP/Vista/7 and PowerPC-era OS X, you actually had an easily accessible option for true 6-bit "thousands of colors" mode — meaning it used to be possible to generate a signal that was able to avoid activating FRC on some older monitors entirely.

          On the Windows side, the "thousands of colors" option was removed with Windows 8. Since then, any old CCFL monitor that used to activate FRC dithering only when receiving an 8-bit signal is now essentially forced to apply FRC at all times.

          If I think all the way back to the Windows XP days, I'm almost certain that I used to have the thousands-of-colors mode selected in Display Properties. I actually remember having some sort of vague preference for it (instead of using millions of colors) even back then.

          I still own the same Windows XP-era monitor and of course, nowadays… I can see it apply its own internal "checkerboard-style" FRC whenever it's fed a modern 8-bit DVI signal, even if connected to a GPU that doesn't dither.
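
          For anyone unfamiliar with what that flicker actually does, here's a toy version of temporal FRC (not any vendor's real algorithm; real panels also mix in spatial checkerboard patterns, and the 4-frame cycle is just an assumption): the 6-bit panel alternates between the two nearest native levels so the time-average lands on the 8-bit code it was sent.

          ```python
          target_8bit = 129                 # 8-bit code arriving over DVI
          ideal = target_8bit / 4           # 32.25 -> sits between 6-bit levels 32 and 33
          low, frac = int(ideal), ideal - int(ideal)

          cycle = 4                         # assume a 4-frame FRC cycle
          frames = [low + 1 if i < round(frac * cycle) else low for i in range(cycle)]

          print("6-bit levels shown per cycle:", frames)                   # [33, 32, 32, 32]
          print("perceived value (x4)        :", sum(frames) / cycle * 4)  # back to 129.0
          ```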

            5 months later

            I'm late to the party, but I've suffered from eye strain/pain for several months. I tried different distros and different laptops with Intel and NVIDIA GPUs. No matter what settings I tried, I could never fix my issue, so I kept going back to Windows/WSL2.
            Now, I'd like to report that my eye strain is finally gone! My solution was to buy a QD-OLED monitor (specifically, the HP Omen Transcend), but I assume any other QD-OLED monitor should work.
            On this new monitor, everything looks crystal clear, no matter which laptop or graphics card I use. Even my Raspberry Pi 400 looks clear.
            It did take a few days for my eyes to adjust to the new monitor, but it was worth it!
            While I was struggling with this, besides trying software/hardware configurations, I also had my eyes checked.
            Now I'm running Ubuntu 24.04 with the default setup, except that I installed fonts-ubuntu-classic and nvidia-driver-550 (although the nouveau driver was working fine as well). My laptop does not have a DisplayPort output, so I'm using USB-C (but HDMI also works fine).
            In any case, I thought I'd share this hoping that it helps someone else!

              jaymdev Hi. Can you explain what symptoms you had? Burning eyes? Red eyes? Or something else? And what did you mean by "it did take a few days for my eyes to adjust…"?

              I had burning eyes and eye pain (mostly in my left eye). I also had headaches concentrated around the left eye.

              These symptoms always happened while using Linux, but not Windows 10/11 on the same exact hardware.

              I was using LED IPS monitors (BenQ GW2485TC) for a couple of years. I kept dual booting Linux (Ubuntu, Mint, Pop OS!, Fedora, Arch, Manjaro). Arch/XFCE worked best out of all distros, but still not as good as Windows.

              For the first couple of days with the new QD-OLED monitor I still had the same mild symptoms, to the point where I thought it wouldn't work for me and I was going to have to return it. But the following day the symptoms were gone, and I can now use it for extended sessions comfortably. I've only had it for about a week though, and I'm hoping the symptoms won't return.

              I switched from LED IPS FHD to QD-OLED UHD. I’m now wondering if a good quality IPS (maybe UHD or 4K) may also solve this problem.

                jaymdev

                Your new monitor may have a true 10-bit panel. According to the Linux kernel code, some graphics cards cap out at 10-bit output, and at that depth dithering is not applied. However, this may not apply to your situation, as it all depends on the driver and the model of the graphics card.

                a month later

                jaymdev Do you still use this screen without any eye strain?

                If so, could you please check the PWM frequency on this monitor?

                2 months later

                twomee Can you tell me which exact kernel version is still OK for you?

                Did you find any updates on the topic?

                I'm rolling through the kernel changelogs at the moment, but it's pretty hard to find anything related - so far, nothing.

                Thank you in advance!

                I am using Linux Mint Cinnamon 22.1 with the default 6.8 kernel and a custom 6.10 kernel. The issues only occur on some 6-bit+FRC displays; when using a true 8-bit display, the image does not strain the eyes even after 10+ hours of use per day.

                However, I only have access to an Arc A770, UHD 630, and UHD Xe. I do not have any Nvidia or AMD graphics cards.
