• OS
  • Who can use Linux but not Windows? Share your Linux setup

Stock Fedora (40, Wayland, Gnome) improved a previously unusable device for me (running HD 3000, now with i915 driver).

The only tweak of note is that I always set font antialiasing to grayscale and turn off hinting, as it helps with my astigmatism.
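
For anyone who wants to script that tweak, here is a hedged sketch assuming a recent Gnome where the font keys live under org.gnome.desktop.interface (older releases keep them under the settings-daemon xsettings schema instead):

# grayscale antialiasing instead of subpixel rendering
gsettings set org.gnome.desktop.interface font-antialiasing 'grayscale'
# turn hinting off entirely
gsettings set org.gnome.desktop.interface font-hinting 'none'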

Surprisingly, I fare much worse on the X-based distros that many people here report success with.

    I'm using Debian 12 Gnome. I previously used Archlinux, but the installation is a hassle.

    Display: ASUS VG258QM
    Graphics Card: AMD RX6400

      Donux did you have a negative experience with a newer iGPU on the same software setup? Also, is X11 the default on the latest Ubuntu? I haven't used Linux in soooo long, but it seems very promising. I was previously a Windows user. I just want something that works; I don't care for bells and whistles anymore tbh haha.

      WhisperingWind that happens to me too sometimes, and it catches up with me later.

      zlhr is font hinting a standard setting on Linux distros, or is it something that has to be added?

      eDenon-2 did you alter any settings? DP or HDMI?

        jordan When using Gnome, you can install the gnome-tweaks package to access font settings easily.

          zlhr Use dconf-editor instead; Gnome 46 (Archlinux) doesn't support gnome-tweaks.
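
          If you'd rather not click through dconf-editor, the same keys can be set with the dconf CLI; a sketch assuming the current Gnome schema paths:

          # read the current values
          dconf read /org/gnome/desktop/interface/font-antialiasing
          dconf read /org/gnome/desktop/interface/font-hinting
          # set grayscale antialiasing and disable hinting
          dconf write /org/gnome/desktop/interface/font-antialiasing "'grayscale'"
          dconf write /org/gnome/desktop/interface/font-hinting "'none'"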

            eDenon-2 do you use DisplayPort or HDMI? I forgot what you told me in the past. Also, any reason you went with the RX 6400 over the Intel Arc?

              To switch to 6 or 8 bits, I used the xrandr utility (it works under X11 only).

              Running

              xrandr --verbose

              will print a list of outputs and settings for each of them. You can also see the active output there (usually, the EDID field is filled in for it, indicating that a monitor is connected).

              You can set 8 bits with the command:

              xrandr --output HDMI-1 --set "max bpc" 8

              HDMI-1 in this command is just an example; you might have a different output.

              For HDMI, the minimum value is 8 bits.

              For DisplayPort, the minimum available value is 6 bits.
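
              Putting that together, a quick way to list the connected outputs with their "max bpc" value and allowed range, and to probe the DisplayPort floor (DP-1 here is just an example name, like HDMI-1 above, and the exact property name can vary by driver):

              # output names plus the "max bpc" property and the range line under it
              xrandr --verbose | grep -i -A1 -E '^[A-Za-z0-9-]+ connected|max bpc'
              # try the DP floor; the driver rejects values below the advertised range
              xrandr --output DP-1 --set "max bpc" 6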

                In Ubuntu Gnome, the default display server is Wayland. You can choose between Wayland and X11 on the login page. I don't remember exactly whether X11 is available out of the box or if it needs to be installed.
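
                A quick way to check which display server the current session is actually running:

                # prints "wayland" or "x11"
                echo $XDG_SESSION_TYPE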

                I personally had a challenging experience with the RX6600 Sapphire Pulse on macOS (hackintosh) and Linux due to strong dithering, which caused my eyes to hurt quickly. Because of this, I tend to avoid the RX series, but others may have had better experiences with it.

                  I tried this guide, but it didn't work for me.

                  I tried setting different bit depths for the output, but that didn't help either. Unfortunately, the RX6600 can output 12 bits to the monitor. This means that with temporal dithering enabled, it can dither even on a 10-bit display, since its internal buffer is 12-bit or higher.

                  autobot

                  I tried the other DP to DVI adapter (a regular cable with DP on one side and DVI on the other), and the minimum bit depth shown by the

                  xrandr --verbose

                  command was 8 for this DP output. I used an RX6600 card for the test. This means not all adapters are suitable for this purpose.

                  Meanwhile, a Type-C DP (Intel Xe) -> Belkin USB-C Video Adapter -> DVI allowed switching to 6 bits.
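
                  For anyone repeating this adapter test, the procedure is just to force the low end and then re-read the property to see whether it stuck (output names are examples from my setup):

                  # try to force 6 bits through the adapter chain
                  xrandr --output DP-1 --set "max bpc" 6
                  # re-check: the current value and its range are printed under the output's properties
                  xrandr --verbose | grep -i -A1 'max bpc'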

                  It turns out that the minimum 8-bit limitation for HDMI on my Intel UHD Xe (12th gen CPU) is not a hardware issue; it's a driver limitation in both Linux and Windows. In Windows, I was able to force a switch to 6-bit mode over HDMI by writing a specific value to the video controller pipeline state register. The following bit patterns select the bit depth (the low bits encode the mode; for 8 bits the field is left at zero):

                  - 00000100000000000000000001000000 - 6-bit
                  - 00000100000000000000000000000000 - 8-bit
                  - 00000100000000000000000000100000 - 10-bit
                  - 00000100000000000000000010000000 - 12-bit

                  For my hardware, the value for 6 bits is 0x4000040, and it works for HDMI connection.
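
                  To sanity-check the mapping from those bit patterns to hex values, here is a small bash loop that converts them (pure arithmetic, nothing hardware-specific):

                  # the 6-, 8-, 10- and 12-bit patterns from the list above
                  for b in 00000100000000000000000001000000 \
                           00000100000000000000000000000000 \
                           00000100000000000000000000100000 \
                           00000100000000000000000010000000; do
                      # bash's base#value syntax parses the binary string
                      printf '%s -> 0x%08X\n' "$b" "$((2#$b))"
                  done
                  # prints 0x04000040, 0x04000000, 0x04000020, 0x04000080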

                  P.S. To enable 6 bits over HDMI, I assigned the variable 'New' the value 0x4000040, which was written to the register at this line: https://github.com/skawamoto0/ditherig/blob/master/ditherig/ditherig.cpp#L630C60-L630C63 (this is from the original ditherig app code). I will post the modified source code later in case anyone else wants to experiment.

                  Here are photos showing the screen in both 6-bit and 8-bit modes over HDMI. Banding is visible in the 6-bit mode:

                  https://ibb.co/qsGgD4Q

                  https://ibb.co/0Mjp5Ry

                  P.P.S. If it works for me, it will most likely work on other generations of UHD as well, although I can't say exactly which ones, since I only have Intel UHD Xe (12th gen CPU) for testing. I'll try to ask friends to find other UHD models for testing.

                    I prefer KDE Neon for my eyes. I'm convinced KDE Neon (dark) with Wayland is about as good as it gets with my hardware in terms of eye comfort.
