• OS
  • Surprisingly safe Linux build (plain GNOME 46)

autobot

how many bits of color without dithering

In this regard, kernel 6.6 is no different from newer versions: in the i915 kernel module for Intel iGPUs, dithering likewise kicks in only at 6 bits: https://github.com/torvalds/linux/blame/v6.6/drivers/gpu/drm/i915/display/intel_display.c#L4774.
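To illustrate what "dithering at 6 bits" means in practice (a minimal sketch, not the i915 implementation): a 6-bit panel only has 64 levels per channel, and an ordered-dither threshold map spreads the truncation error across neighboring pixels so their average lands close to the original 8-bit value.

```python
# Why 6-bit panels get dithered: plain truncation to 6 bits leaves only
# 64 levels, while a 2x2 ordered-dither pattern biases neighboring pixels
# differently before truncating, faking the missing intermediate levels.

BAYER_2X2 = [[0, 2], [3, 1]]  # threshold map, values 0..3

def quantize_6bit(value: int) -> int:
    """Truncate an 8-bit channel value to 6 bits (kept in 8-bit scale)."""
    return (value >> 2) << 2

def dither_6bit(value: int, x: int, y: int) -> int:
    """Ordered dithering: add a position-dependent bias, then truncate."""
    biased = min(255, value + BAYER_2X2[y % 2][x % 2])
    return (biased >> 2) << 2

# A mid-gray of 130 collapses to 128 everywhere without dithering...
print([quantize_6bit(130) for _ in range(4)])      # -> [128, 128, 128, 128]
# ...but alternates between 128 and 132 with dithering, averaging 130:
print([dither_6bit(130, x, 0) for x in range(4)])  # -> [128, 132, 128, 132]
```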

What's interesting is that Linux, like Windows, seems to have a post-processing stage for the frame. I don't have hard data on whether such post-processing can cause eye strain, but the image changes noticeably in color if you switch frame rendering from GPU to CPU in Mesa. The graphics card still runs correctly under its driver; only the frame rendering has been moved from GPU to CPU. The differences in color are marked in red (Ubuntu 24.04 GNOME, X11, 8-bit): https://ibb.co/71kq9Vt, https://ibb.co/5v9zqxk.

If I switch to CPU rendering, the text becomes slightly easier to read. In the image above, the text is surrounded by a cluster of red dots, which may imply that some kind of post-processing is applied to it.
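A difference image like the red-marked ones above can be produced by comparing the two captures pixel by pixel. A minimal sketch (the frame data here is synthetic; real captures would be full-size raw RGB dumps):

```python
# Minimal sketch: locate the pixels that differ between a GPU-rendered and a
# CPU-rendered capture of the same frame, given as raw 8-bit RGB buffers.

def diff_pixels(frame_a: bytes, frame_b: bytes, width: int, height: int):
    """Return (x, y) coordinates of every pixel whose RGB triple differs."""
    assert len(frame_a) == len(frame_b) == width * height * 3
    return [
        (i % width, i // width)
        for i in range(width * height)
        if frame_a[3 * i:3 * i + 3] != frame_b[3 * i:3 * i + 3]
    ]

# Two 2x2 frames that differ only in the bottom-right pixel's red channel:
gpu_frame = bytes([255, 0, 0,  0, 255, 0,  0, 0, 255,  10, 10, 10])
cpu_frame = bytes([255, 0, 0,  0, 255, 0,  0, 0, 255,  12, 10, 10])
print(diff_pixels(gpu_frame, cpu_frame, 2, 2))  # -> [(1, 1)]
```

Painting those coordinates red over one of the captures would reproduce the kind of overlay shown in the linked images.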

    jordan
    No, I haven't. I'm not a Linux enthusiast and am generally happy with Windows. NixOS is quite unique; that's what triggered my venture into desktop Linux. By the way, the current GNOME and, say, Hyprland UI experience is quite robust and intuitive enough when switching from Windows.

    What really bothered me was the absence of a simple dimmer app; it is a basic defence against PWM. But then I found the Iris app, which now has a CLI interface that works on any distro.

      autobot
      I've tried this setup with two rather dated displays: a Zen AIO 24 ZN242GD (all-in-one PC) and a Philips 234E released in 2014. No idea how many bits, possibly 6 on both.
      It is worth saying that these two monitors are safe for me only under the safe software setups. On Windows 10, Ubuntu, KDE Neon, and macOS (M1 tested) they still give me strain and headaches.
      Unfortunately, I can't use most (or any?) modern monitors.

      The post-processing you're talking about might just be the "IT Content" flag of the HDMI spec. I have it turned off in my Intel drivers in Windows (Display > General > Advanced). It's on by default.

        DisplaysShouldNotBeTVs

        I'm currently in therapy without my laptop; I think I can do it at the beginning of next week.

        Sunspark

        I haven't seen such a setting on Arc, but I'll take a closer look. Thanks for the advice.

        zlhr

        The LIBGL_ALWAYS_SOFTWARE environment variable controls whether Mesa renders on the CPU; set it to 1 to enable CPU rendering.
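        A usage sketch: the variable has to be present in the environment before Mesa initializes, so it is set at launch time (glxinfo is just an arbitrary example client here; any GL program behaves the same way).

```python
# Launch a GL client with Mesa's software rasterizer forced on.
import os
import subprocess

env = dict(os.environ, LIBGL_ALWAYS_SOFTWARE="1")
print(env["LIBGL_ALWAYS_SOFTWARE"])  # -> 1
# subprocess.run(["glxinfo"], env=env)  # the "OpenGL renderer" line should
#                                       # then name a software rasterizer
#                                       # such as llvmpipe
```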

        WhisperingWind

        Interesting, I do see what you mean by the slightly different pixels in the wallpaper -- but on both of them, including the CPU-rendered version, I see color fringing around everything including the app icons.

        Is this because you recorded in a YCbCr mode with chroma subsampling? Is it possible to capture losslessly without chroma subsampling, i.e. 4:4:4, for example by getting Linux to recognize the capture card's EDID as only supporting RGB?

        This is not related to full range vs. limited range; whether the output is RGB or YCbCr is determined from the EDID in most cases. Or it's possibly a limitation of your capture card.
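        A sketch of where that lives in the EDID, assuming the standard EDID 1.4 base-block layout (real sinks also advertise formats in CEA extension blocks, which this deliberately ignores): for digital displays, bits 4:3 of the feature-support byte at offset 24 declare which pixel encodings the sink accepts, so an EDID that leaves them at zero claims RGB only.

```python
# Decode the pixel encodings a sink's EDID base block claims to support
# (EDID 1.4 layout; CEA extension blocks are ignored in this sketch).
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def supported_encodings(edid: bytes) -> list:
    assert edid[:8] == EDID_HEADER, "not an EDID base block"
    if not edid[20] & 0x80:          # byte 20, bit 7: digital input flag
        return ["analog"]
    encodings = ["RGB 4:4:4"]        # always supported on digital sinks
    bits = (edid[24] >> 3) & 0b11    # feature-support byte, bits 4:3
    if bits & 0b01:
        encodings.append("YCbCr 4:4:4")
    if bits & 0b10:
        encodings.append("YCbCr 4:2:2")
    return encodings

# Synthetic example: a digital sink advertising every encoding.
edid = bytearray(128)
edid[:8] = EDID_HEADER
edid[20] = 0x80                      # digital input
edid[24] = 0b11 << 3                 # YCbCr 4:4:4 and 4:2:2 both supported
print(supported_encodings(bytes(edid)))
# -> ['RGB 4:4:4', 'YCbCr 4:4:4', 'YCbCr 4:2:2']
```

        Forcing RGB as suggested above would amount to feeding the driver an override EDID with those two bits cleared.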

        That would make the differences a lot easier to analyze.

        (Or, on the other hand, did you already try that, and the color fringing still appears, coming from something other than subsampling?)

          DisplaysShouldNotBeTVs

          I recorded this using QuickTime's Uncompressed 10-bit RGB format.

          Last time I captured a signal from the MBP M1, I noticed some color fringing. Interestingly, BetterDisplay indicated that the signal was in RGB, so there shouldn't have been any subsampling.

          The same color fringing issue is present here. The color spaces available to me on the Intel Arc A770 are

          Default, SMPTE_170M_YCC, BT709_YCC, XVYCC_601, XVYCC_709, SYCC_601, opYCC_601, opRGB (aka Adobe RGB), BT2020_CYCC, BT2020_RGB, BT2020_YCC, DCI-P3_RGB_D65, DCI-P3_RGB_Theater.

          I made this recording with the Colorspace setting on the Linux side set to Default. I'll need to check the i915 code to find out which Colorspace is considered Default in this situation.
          I'm still trying to figure out why the image from the recorder looks like this. I'll try dumping the recorder's EDID using BetterDisplay and do some experiments to get to the bottom of it.

          Nickonomic What are the benefits of NixOS for a non-developer? And how complex is it?

          12 days later

          Nickonomic

          Since nobody else tried this yet, I tried the same setup on my known good hardware.

          Didn't work for me at all LOL.

          (Disclaimer: I'm comparing to "Basic Display Adapter" Windows, not "accelerated" Windows.)

          I'm on a laptop with UHD Graphics 620 and a (custom-installed) AUO B140RTN03.0 panel, which is working pretty great for me on Windows 1809 with Basic Display Adapter.

          I know that the panel is good because it's the only one of the 8 different LCD panels I tried that's both pretty comfortable in Windows and feels "totally normal" in the BIOS.


          Tried same exact NixOS 24.05.5630.c505ebf77752 GNOME on Live USB.

          Immediately felt something wrong just looking at the wallpaper.

          To rule out UI/font issues, I temporarily installed NoMachine to connect to the same remote desktop I connect to on Windows, with identical quality settings and software decoding (which feels very nice on Windows 1809 + Basic Display).

          (FYI I used a screen dimming overlay on the remote desktop's side to emulate the same effects of the "TN white color fix" I usually do on Windows, so that's not the issue either. Made sure to try this "remote-side dimming method" on Windows too and it was totally fine there.)

          Also tried both 60 Hz and 40 Hz display modes.


          Used it for a few hours and ended up feeling super disoriented, with lots of brain fog. Felt like I couldn't relax. Drop shadows in apps looked way more intense/"thicker" than they're supposed to be. Everything generally looked oversharpened.

          All photos looked totally wrong and had a "false 3D depth effect" (even though this is one of the only LCDs I tried that managed to NOT have that issue on Windows!).

          It honestly felt very similar to "MacBook symptoms", which I didn't even know my TN panel was capable of creating, since it doesn't cause anything like that at all under Windows + Basic Display.

          Before running NoMachine I was also using GNOME desktop and the browser for a bit and had the same symptoms, so it's not a problem with the NoMachine Linux version.

          Returning to Windows 1809 + Basic Display Adapter INSTANTLY fixed the problems and my screen was back to normal.


          TLDR NixOS 24.05.5630.c505ebf77752 doesn't work at all on a UHD 620 laptop even with a known good panel.

          I'm actually surprised you're fine with it on a UHD 720 desktop.

          Possibly the way NixOS decides to output to internal laptop panels (vs. desktop video output) is entirely different.


          Note that I do not think the problem here is flicker-related; there's a huge chance it's color/image-processing related instead. I'm not ruling out LCD clock/timing issues, but photos looked so different that I don't think the difference is just "incorrect LCD timings" or whatever.

          NixOS was using spatial dithering (not temporal), but I've ruled out dithering since I've tried Basic Display Windows 1809 with spatial dithering and it's fine (although I slightly prefer dithering disabled). Dithering is not what makes photos "look 3D" — enabling dithering doesn't affect that on Windows.

          This TN panel also has some flicker anyway (mild PWM and inversion) but again fine in Windows and the BIOS.

          Running a Linux VM "within Windows" is also fine.

          Yet somehow, running NixOS natively manages to ruin an otherwise fine screen.
