OS
Who can use Linux but not Windows? Share your Linux setup


I use the latest Lubuntu (22.04) on an Intel nuc95, as-is (even the live USB worked well), with Chrome with no HW acceleration. The display is a VA249HE.

Lubuntu also worked without strain on another, more powerful desktop with Intel integrated graphics; that hardware isn't working at the moment, so I can't say more about it.

A couple of months ago, I was experimenting with the Blackmagic UltraStudio Recorder 3G, recording uncompressed 10-bit video from my MBP M1. At that time, I was already using Stillcolor (the application that disables dithering in macOS).

I also had Ubuntu 22.04 installed in a Parallels virtual machine on macOS. I decided to try recording and analyzing the desktop in Ubuntu. There was no dithering in either macOS or Ubuntu 22.04. When I enabled dithering in Stillcolor, dithering also appeared in Ubuntu. I don't remember exactly, but it seems Ubuntu was using Wayland rather than X11.

My hypothesis is this: dithering is determined only by the driver and the video adapter. In Parallels, a virtual video adapter driver is used, which does not dither, which is why there is no dithering in Ubuntu when it runs in the virtual machine. However, there's a question about whether hardware acceleration is present in this setup, which I can't yet comment on. And OpenGL is what provides hardware acceleration. According to the documentation, OpenGL has a GL_DITHER option.

“GL_DITHER - If enabled, dither color components or indexes before they are written to the color buffer.”

I think it is possible to recompile the Linux OpenGL library with GL_DITHER forcibly set to false. At first glance, modifications need to be made in only two places in the library code (see the example after this list):

  1. prohibit enabling dithering from outside: https://gitlab.freedesktop.org/mesa/mesa/-/blob/main/src/mesa/main/enable.c#L519;
  2. find where ctx->Color.DitherFlag is initialized and set it to GL_FALSE.
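
For what it's worth, GL_DITHER is ordinary per-context OpenGL state, so a single application can already turn it off for itself; rebuilding Mesa would only change the default for everything at once. A minimal check program (hypothetical, not tested on this setup; assumes GLFW and Mesa are installed, build with gcc gl_dither_check.c -o gl_dither_check -lglfw -lGL):

/* gl_dither_check.c: print the GL_DITHER state of a fresh context,
 * then disable it for this context only. Hypothetical example. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* A small hidden window, just to get a real GL context from Mesa. */
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow *win = glfwCreateWindow(64, 64, "dither-check", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    /* The GL spec has dithering enabled by default; this is the default a
     * rebuilt library with ctx->Color.DitherFlag forced to GL_FALSE would flip. */
    printf("GL_DITHER initially: %s\n",
           glIsEnabled(GL_DITHER) ? "enabled" : "disabled");

    glDisable(GL_DITHER);
    printf("GL_DITHER after glDisable: %s\n",
           glIsEnabled(GL_DITHER) ? "enabled" : "disabled");

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}

Disabling it per context like this obviously does nothing for other applications or the compositor, which is why changing the default inside the library itself looks attractive.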

In the kernel, dithering is enabled during the initialization of the video adapter (I'm only talking about Intel UHD here). As I understand it, this happens in this i915 module file: https://github.com/torvalds/linux/blob/master/drivers/gpu/drm/i915/display/intel_display.c. I went through the code quickly, and it seems there is a possibility to force 8-bit and disable dithering.
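
To make this concrete, the kind of edit meant here could look roughly like the fragment below (a hypothetical, untested sketch for the pipe-state computation in intel_display.c; the pipe_config->pipe_bpp and pipe_config->dither fields follow the i915 sources but may differ between kernel versions):

/* Hypothetical fragment for the CRTC/pipe state computation in
 * drivers/gpu/drm/i915/display/intel_display.c. Untested sketch;
 * field names may differ between kernel versions. */

/* Force an 8-bit-per-component pipe regardless of what was negotiated... */
pipe_config->pipe_bpp = 8 * 3;

/* ...and never let the display engine enable dithering for it. */
pipe_config->dither = false;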

The question of hardware acceleration directly in the browser remains. As far as I know, browsers use WebGL; the question here is whether that ends up calling the system's OpenGL library or not.

I found information that Intel Arc also uses the i915 kernel module, and I'm interested in whether it's possible to disable dithering not only for UHD but also for Arc cards.

I recently ordered a mini PC based on the i5-12450H processor. I want to set up an additional workspace using it. Since Windows is bad for my eyes, I want to try Linux and see if I can disable dithering for UHD / ARC.

    I installed the latest version of Manjaro XFCE on a mini PC (i5-12450H, UHD Graphics (48 EU)). Right out of the box it was very uncomfortable; after a couple of hours my eyes started to hurt and it became difficult to work. After making the edits to the i915 module it got better, and noticeable steps appeared in gradients: https://ibb.co/q1f6Pw4

    As for OpenGL, it's still unclear, because Manjaro XFCE has several different OpenGL builds, and some of them get loaded, including into the Firefox process. It's not yet clear where to get the code for them (so far I've identified and rebuilt only one of the several libraries). But if I make the OS not use the OpenGL libraries at all (hardware acceleration switched off), it seems to be easier on the eyes, though this is still unproven and based on feel.

    I am currently on a business trip and have limited access to equipment. In a couple of weeks, I will be home and will try to understand the difference between the adjustments to eliminate subjectivity. I will also continue to explore the possibility of rebuilding OpenGL.

    I really don't think you need to do that... static dithering has never been an issue. It's temporal FRC that people have an issue with. It's drivers that might use FRC, or hardware in the panel.

    This GL_DITHER thing has primarily been used when working with colour depths that don't match, e.g. 4-bit and 8-bit.

    And really, there's an easy way to test... just don't use OpenGL as the compositor and you have your answer. XRender is a software-only path for X.Org.

      Sunspark

      You are right about OpenGL; I can confirm that rebuilding the OpenGL libs with dithering disabled did not yield results. Visible changes only occurred after modifying the i915 code. Currently, MBP M1 + Stillcolor seems slightly more comfortable for my eyes than Manjaro with the i915 module tweaks (although I'm not 100% sure yet).

      And really, there's an easy way to test.. just don't use opengl

      Without using OpenGL, the text looks different and is easier for me to read. However, I suspect this is not because of specific OpenGL libraries, but rather due to the system rendering fonts differently without any hardware acceleration. I think I'll dig into font rendering and see what can be done here.

        Sunspark static dithering has never been an issue. It's temporal FRC that people have an issue with

        this is mostly true with older devices, but some more modern versions of "spatial dithering" are "semi" dynamic: even though the pattern stays still when nothing is happening, it will slightly move around when you move your mouse cursor across things (or make any similar kind of change to the screen)

        for example, the dithering in e-ink devices is not really temporal as it's "usually" still, but areas of the screen will "regenerate" an entirely different dithering pattern when any motion happens, which can still cause issues

          WhisperingWind experimenting with the Blackmagic UltraStudio Recorder 3G, recording uncompressed 10-bit video from my MBP M1

          can you send me some still PNG frames of some macOS apps taken directly from the lossless capture? i want to analyze the colors themselves to see if there are any other (non-temporal) anomalies like color fringing or shadows/halos where there shouldn't be

          i want to see if there's anything different about the colors that actually get sent to the video output, compared to what e.g. a normal screenshot of the same app looks like

            DisplaysShouldNotBeTVs

            Unfortunately, the HDMI recorder is at home and I am currently on a business trip. At the end of September I will return home and will be able to record and send the images.

            WhisperingWind that would be useful. I have an Intel Arc A770 LE and can experiment for you if you can help me turn the dithering off. I think it's spatial, but I would still like it off so there's no chance of it bothering me when scrolling or moving the cursor. I did order an Epiphan DVI2USB 3.0 frame grabber to test the output of the cards I have. I do have a DVI2PCIe as well, but I haven't set it up in a PC yet, so I'm going to try the USB one first since it was cheap.

              jordan I have an Intel Arc A770 LE and can experiment for you if you can help me turn the dithering off.

              That would be great!

              I took a closer look at the i915 module code. It appears that dithering is disabled by default and only enabled for 6-bit displays:

              https://github.com/torvalds/linux/blob/master/drivers/gpu/drm/i915/display/intel_display.c#L4777C2-L4782C37
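
              The check at those lines is roughly the following (paraphrased, not an exact quote; identifiers can differ between kernel versions):

              /* Paraphrase of the linked lines: dithering is requested only when
               * the pipe runs at 6 bpc and nothing has explicitly forced it off. */
              pipe_config->dither = (pipe_config->pipe_bpp == 6 * 3) &&
                                    !pipe_config->dither_force_disable;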

              In other words, my edits theoretically should not have had any effect.

              Hmm, that's strange. Why do my eyes seem to strain a bit less after the edits? It might be related to my hardware setup: Tecno MEGA MINI M1 PC -> Thunderbolt 4 -> Belkin USB-C Video Adapter -> DVI cable -> BenQ GL2450 monitor (6-bit+FRC). I might have accidentally changed some settings elsewhere, which had such an effect, or it could just be the new kernel version. There are many questions.

              I think I need to start by getting a monitor without dithering and see how Linux looks on it by default (maybe I'm not fighting Linux, but my current monitor). And I need to understand how all this works; otherwise there will be a lot of magic and unicorns here, which is not good.

              I think I should first learn how to detect dithering in the video card's output signal (I think I can create some kind of detector in Python) and then build further work based on that.
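
              The basic temporal check is simple: capture two consecutive frames of a static screen losslessly and count what differs between them; with no temporal dithering (and no cursor or animation in the frame) the count should be zero. A rough sketch of that comparison, hypothetical and written in C here (the same logic ports directly to Python); the file names are placeholders for two equally sized uncompressed frame dumps:

              /* frame_diff.c: count differing bytes between two raw, uncompressed
               * frame dumps of identical size (e.g. from a lossless capture).
               * A non-zero count on a static screen points to temporal dithering.
               * Hypothetical sketch. Build: gcc frame_diff.c -o frame_diff
               * Usage: ./frame_diff frame1.raw frame2.raw */
              #include <stdio.h>

              int main(int argc, char **argv)
              {
                  if (argc != 3) {
                      fprintf(stderr, "usage: %s <frame1> <frame2>\n", argv[0]);
                      return 1;
                  }

                  FILE *a = fopen(argv[1], "rb");
                  FILE *b = fopen(argv[2], "rb");
                  if (!a || !b) {
                      perror("fopen");
                      return 1;
                  }

                  unsigned long long total = 0, diff = 0;
                  int ca, cb;

                  /* Walk both files byte by byte and count mismatches. */
                  while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
                      total++;
                      if (ca != cb)
                          diff++;
                  }

                  printf("%llu of %llu bytes differ\n", diff, total);

                  fclose(a);
                  fclose(b);
                  return 0;
              }

              A plain frame-to-frame diff like this won't reveal spatial (static) dithering, which is the harder part of the problem.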

              I will hold off on fully presenting the results until I am completely sure that I have achieved some outcome and it is 100% not a placebo.

              DisplaysShouldNotBeTVs

              Can you please tell me if you have any developments in the field of image processing for detecting dithering? As I understand it, temporal dithering can be detected using the method proposed by aiaf. But what about other types of dithering? Do you possibly have any links to some materials?

                jordan

                Yes, I did.

                The Thunderbolt 4 / USB-C 4 output is recognized as a DisplayPort, so it is possible to set it to 6-bit through xrandr. In that case the image on the monitor is less straining on the eyes (in my edits I disabled dithering for 6-bit; at least, I hope it works as I expected).

                I removed these lines https://github.com/torvalds/linux/blob/master/drivers/gpu/drm/i915/display/intel_display.c#L4777C2-L4782C37 from the code. I want to put them back later and see if there are any changes.

                  WhisperingWind interesting. I own a Sun Vision 32" RLCD that's 6-bit + 2-bit FRC, which I never tried after learning it's not true 8-bit. I wonder if it's truly safe as 6-bit only in Linux with the Arc after the driver edit.

                    jordan

                    Does the monitor have a DisplayPort input? Otherwise it might not work, as HDMI (at least on Linux) does not support 6-bit.

                    You can try using a video adapter with a DP input, like the one I have, but I'm not sure which one will work in this case. Mine has a USB-C connector, and it won't be suitable for Intel Arc.

                    I came across an interesting thing: https://github.com/skawamoto0/ditherig/blob/master/ditherig/database.csv. As I understand it, this is for integrated graphics from Intel and AMD. We can play with these addresses to disable dithering. On Linux this can be done with the devmem2 program, which can be downloaded from https://github.com/VCTLabs/devmem2 and built by running make in the source directory.

                    The program is run like this (substitute the address for your own chip, presumably from the rows marked "Disabled"):

                    ./devmem2 0x70030 w 0x00000000

                    As I understand it, the address is taken from the 8th column and the size is 4 bytes. For simplicity you can try writing zeros (although, of course, one should properly apply the mask from the 10th column).

                    I wonder if this will work.

                    UPD. In reality, as usual, it turned out to be more complicated: the 8th column is not the actual address but an offset; the actual address has to be calculated from the Bar1/Bar2 base values.
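
                    A small helper for that calculation (a hypothetical sketch; the sysfs path and BAR index below are only examples and have to be adjusted for your GPU): it reads the BAR base from the PCI device's resource file, adds the offset from the database, and prints the matching devmem2 command.

                    /* bar_offset.c: compute the physical register address as
                     * PCI BAR base + offset from the ditherig database, and print
                     * the corresponding devmem2 command. Hypothetical helper.
                     * Build: gcc bar_offset.c -o bar_offset
                     * Usage: ./bar_offset /sys/bus/pci/devices/0000:00:02.0/resource 0 0x70030 */
                    #include <stdio.h>
                    #include <stdlib.h>
                    #include <inttypes.h>

                    int main(int argc, char **argv)
                    {
                        if (argc != 4) {
                            fprintf(stderr, "usage: %s <resource-file> <bar-index> <offset>\n",
                                    argv[0]);
                            return 1;
                        }

                        FILE *f = fopen(argv[1], "r");
                        if (!f) {
                            perror("fopen");
                            return 1;
                        }

                        int bar = atoi(argv[2]);
                        uint64_t offset = strtoull(argv[3], NULL, 0);
                        uint64_t start = 0, end = 0, flags = 0;

                        /* Each line of the sysfs "resource" file is "start end flags"
                         * in hex; line N corresponds to BAR N. */
                        for (int i = 0; i <= bar; i++) {
                            if (fscanf(f, "%" SCNx64 " %" SCNx64 " %" SCNx64,
                                       &start, &end, &flags) != 3) {
                                fprintf(stderr, "could not read BAR %d\n", bar);
                                fclose(f);
                                return 1;
                            }
                        }
                        fclose(f);

                        printf("BAR%d base: 0x%" PRIx64 "\n", bar, start);
                        printf("register:  0x%" PRIx64 "\n", start + offset);
                        printf("example:   ./devmem2 0x%" PRIx64 " w 0x00000000\n",
                               start + offset);
                        return 0;
                    }

                    As always with devmem2, writing to the wrong register can hang the GPU or the whole machine, so it's safer to experiment from a text console.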

                      WhisperingWind if I understand you correctly, you want to use a capture card, capture some video frames straight from the display connection and then analyze them in Python?

                        dev