• OS
  • Who can use Linux but not Windows? Share your Linux setup

jordan When using Gnome, you can install the gnome-tweaks package to access font settings easily.
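
For example, on Debian/Ubuntu:

sudo apt install gnome-tweaks

or on Arch:

sudo pacman -S gnome-tweaks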

    zlhr Use dconf-editor instead; Gnome 46 (Arch Linux) doesn't support gnome-tweaks.
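
    If you prefer the terminal to dconf-editor, the same font settings can be changed with gsettings on recent Gnome versions (assuming the keys still live under org.gnome.desktop.interface), e.g.:

    gsettings set org.gnome.desktop.interface font-antialiasing 'grayscale'

    gsettings set org.gnome.desktop.interface font-hinting 'slight'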

      eDenon-2 Do you use DisplayPort or HDMI? I forgot what you told me before. Also, any reason you went with the RX 6400 over the Intel Arc?

        To switch to 6 or 8 bits, I used the xrandr utility (it works with X11).

        Running

        xrandr --verbose

        will print a list of outputs and settings for each of them. You can also see the active output there (usually, the EDID field is filled in for it, indicating that a monitor is connected).

        You can set 8 bits with the command:

        xrandr --output HDMI-1 --set "max bpc" 8

        HDMI-1 in this command is just an example; you might have a different output.

        For HDMI, the minimum value is 8 bits.

        For DisplayPort, the minimum available value is 6 bits.
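
        To see just the relevant lines (a rough filter; the exact output formatting may differ between drivers):

        xrandr | grep " connected"

        shows which outputs are in use, and

        xrandr --verbose | grep -A1 "max bpc"

        shows the current "max bpc" value together with the allowed range for each output.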

          In Ubuntu Gnome, the default display server is Wayland. You can choose between Wayland and X11 on the login page. I don't remember exactly whether X11 is available out of the box or if it needs to be installed.
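
          You can check which one your current session is running with

          echo $XDG_SESSION_TYPE

          which prints "wayland" or "x11".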

          I personally had a challenging experience with the RX6600 Sapphire Pulse on macOS (hackintosh) and Linux due to strong dithering, which caused my eyes to hurt quickly. Because of this, I tend to avoid the RX series, but others may have had better experiences with it.

            I tried this guide, but it didn't work for me.

            I tried setting different bit depths for the output, but that didn't help either. Unfortunately, the RX6600 can output 12 bits to the monitor. This means that with temporal dithering enabled, it can dither even on a 10-bit display, since its internal buffer is 12-bit or higher.

            autobot

            I tried the other DP-to-DVI adapter (a regular cable with DP on one end and DVI on the other), and the minimum bit depth shown by the

            xrandr --verbose

            command was 8 for this DP output. I used an RX6600 card for the test. This means not all adapters are suitable for this purpose.

            Meanwhile, the chain Type-C DP (Intel Xe) -> Belkin USB-C Video Adapter -> DVI allowed switching to 6 bits.

            It turns out that the minimum 8-bit limitation for HDMI on my Intel UHD Xe (12th gen CPU) is not a hardware issue; it's a driver limitation in both Linux and Windows. In Windows, I was able to force a switch to 6-bit mode over HDMI by writing a specific value to the video controller pipeline state register. The following bit patterns determine the bit depth (a single set bit in this field selects the mode; for 8 bits, the field is left at zero):

            - 00000100000000000000000001000000 - 6-bit
            - 00000100000000000000000000000000 - 8-bit
            - 00000100000000000000000000100000 - 10-bit
            - 00000100000000000000000010000000 - 12-bit

            For my hardware, the value for 6 bits is 0x4000040, and it works for the HDMI connection.
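
            As a quick sanity check, the binary patterns above convert to these hex values (bash can do the base conversion):

            printf '0x%08X\n' "$((2#00000100000000000000000001000000))"   # 6-bit  -> 0x04000040
            printf '0x%08X\n' "$((2#00000100000000000000000000000000))"   # 8-bit  -> 0x04000000
            printf '0x%08X\n' "$((2#00000100000000000000000000100000))"   # 10-bit -> 0x04000020
            printf '0x%08X\n' "$((2#00000100000000000000000010000000))"   # 12-bit -> 0x04000080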

            P.S. To enable 6 bits over HDMI, I assigned the value 0x4000040 to the variable 'New', which is written to the register at this line: https://github.com/skawamoto0/ditherig/blob/master/ditherig/ditherig.cpp#L630C60-L630C63 (this is from the original ditherig app code). I'll post the modified source code later in case anyone else wants to experiment.

            Here are photos showing the screen in both 6-bit and 8-bit modes over HDMI. Banding is visible in the 6-bit mode:

            https://ibb.co/qsGgD4Q

            https://ibb.co/0Mjp5Ry

            P.P.S. If it works for me, it will most likely work on other generations of UHD as well, although I can't say exactly which ones, since I only have Intel UHD Xe (12th gen CPU) for testing. I'll try to ask friends to find other UHD models for testing.

              I prefer KDE Neon for my eyes. I'm convinced KDE Neon (dark) with Wayland is about as good as it gets with my hardware in terms of eye comfort.

                CrunchBang++. Lightweight Debian Openbox distro. Turning off compositing makes it pretty good.
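
                In case it helps: assuming the stock compositor (picom on newer releases, compton on older ones), you can stop it for the current session with

                killall picom || killall compton

                and comment out its line in ~/.config/openbox/autostart to keep it off permanently.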

                  jordan

                  Today, my ASRock Arc A770 arrived. It seems that HDMI on it is routed through DP. Check your A770 LE in Ubuntu Desktop (X11); if it's the same for you, you won't have any issues setting 6 bits with the xrandr utility when connected via HDMI.

                  They mention here that the LE also doesn't have a 'true' HDMI port (https://www.vegascreativesoftware.info/us/forum/intel-arc-general-discussion-of-newest-gpu-competitor--140323/), and most likely the HDMI connector inside the graphics card is wired to a DP output through a converter.

                  You can check with the command

                  xrandr --verbose

                  You should see something like "DP-X connected primary…". If so, you can set it to 6 bits with the command

                  xrandr --output DP-X --set "max bpc" 6

                  where X is the number of the DP output.

                  My HDMI connection is detected as DP-1, which indicates that the card's HDMI port is not a native one:

                  DP-1 connected primary 1920x1080+0+0 (0x4b) normal (normal left inverted right x axis y axis) 531mm x 298mm

                  Clones: HDMI-1

                  max bpc: 6

                  range: (6, 12)

                  subconnector: HDMI

                  P.S. Regarding dithering, I can't say for certain yet. It seems that after switching to 6 bits, everything is okay, but more time is needed for testing.

                    WhisperingWind my ASRock Arc A770 arrived

                    Are you planning to test the A770 in Win10 builds…? The graphics card looks promising without dithering, compared to the Nvidia RTX 20 and newer series cards.

                    dev