I'm using Debian 12 GNOME; I previously used Arch Linux, but installation is a hassle.
Display: ASUS VG258QM
Graphics Card: AMD RX6400
Donux did you have a negative experience with a newer iGPU on the same software setup? Also, is X11 the default on the latest Ubuntu? I haven't used Linux in soooo long, but it seems very promising. I was previously a Windows user. I just want something that works; I don't care for any bells and whistles anymore, tbh haha.
WhisperingWind that happens to me too sometimes and catches up to me later
zlhr is font hinting a standard setting on Linux distros, or is it something that has to be added?
eDenon-2 did you alter any settings? DP or HDMI?
You might also find intel_reg useful; there's a thread about Intel graphics dithering registers on Linux here: https://ledstrain.org/d/1017-registers-to-control-intel-graphics-dithering-on-linux-located/17.
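For reference, a minimal sketch of inspecting one of those registers with intel_reg (it ships with intel-gpu-tools and needs root; the 0x70030 offset is PIPE_MISC for pipe A in the i915 sources, so treat it as an assumption for your particular generation):
sudo intel_reg read 0x70030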
@WhisperingWind how do you set 8 bpc with the Linux i915 driver? (8+2 FRC monitor)
To switch to 6 or 8 bits, I used the xrandr utility (it works with X11).
Running
xrandr --verbose
will print a list of outputs and settings for each of them. You can also see the active output there (usually, the EDID field is filled in for it, indicating that a monitor is connected).
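To find the connected output and its current "max bpc" quickly, you can filter that listing (the "max bpc" property name is what the driver exposes, at least on i915 and amdgpu):
xrandr --verbose | grep -E "( connected|max bpc)"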
You can set 8 bits with the command:
xrandr --output HDMI-1 --set "max bpc" 8
HDMI-1 in this command is just an example; you might have a different output.
For HDMI, the minimum value is 8 bits.
For DisplayPort, the minimum available value is 6 bits.
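For example, a minimal sketch for a DisplayPort output (DP-1 is an assumption; substitute the output name from your own xrandr --verbose listing):
xrandr --output DP-1 --set "max bpc" 6
# then confirm the property took effect
xrandr --verbose | grep "max bpc"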
In Ubuntu GNOME, the default display server is Wayland. You can choose between Wayland and X11 on the login page. I don't remember exactly whether X11 is available out of the box or if it needs to be installed.
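You can check which one your current session is running with:
echo $XDG_SESSION_TYPE
It prints x11 or wayland; keep in mind the xrandr commands above only apply under X11.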
I personally had a challenging experience with the RX6600 Sapphire Pulse on macOS (hackintosh) and Linux due to strong dithering, which caused my eyes to hurt quickly. Because of this, I tend to avoid the RX series, but others may have had better experiences with it.
WhisperingWind thanks for that info! Super helpful. Oh wow, really? Did you ever try any of these to disable dithering on the RX6600? https://docs.vpixx.com/vocal/disabling-dithering
https://anyware.hp.com/knowledge/how-do-i-turn-off-temporal-dithering-in-an-amd-graphics-card
It seems the Intel Arc could be the safest option.
I tried this guide, but it didn't work for me.
I tried setting different bit depths for the output, but that didn't help either. Unfortunately, the RX6600 can output 12 bits to the monitor. This means that with temporal dithering enabled, it can dither even on a 10-bit display, since its internal buffer is 12-bit or higher.
WhisperingWind if I use a converter from HDMI to DisplayPort, can I use xrandr to set 6-bit color?
I don't know if this will work in your case. But I think it's worth a try.
eDenon-2 Are you using the Intel Arc A770 or A750?
I tried the other DP to DVI adapter (a regular cable with DP on one side and DVI on the other), and the minimum bit depth shown by the
xrandr --verbose
command was 8 for this DP output. I used an RX6600 card for the test. This means not all adapters are suitable for this purpose.
Meanwhile, a Type-C DP (Intel Xe) -> Belkin USB-C Video Adapter -> DVI chain allowed switching to 6 bits.
whystrainwhy It was an Arc A380, but now an RX6400.
It turns out that the minimum 8-bit limitation for HDMI on my Intel UHD Xe (12th-gen CPU) is not a hardware issue; it's a driver limitation in both Linux and Windows. In Windows, I was able to force a switch to 6-bit mode over HDMI by writing a specific value to the video controller pipeline state register. The following bits select the bit depth (the low bits indicate the mode; for 8 bits, that field is simply zero):
- 00000100000000000000000001000000 (0x04000040) - 6-bit
- 00000100000000000000000000000000 (0x04000000) - 8-bit
- 00000100000000000000000000100000 (0x04000020) - 10-bit
- 00000100000000000000000010000000 (0x04000080) - 12-bit
For my hardware, the value for 6 bits is 0x4000040, and it works over an HDMI connection.
P.S. To enable 6 bits over HDMI, I assigned the variable 'New' the value 0x4000040, which was written to the register at this line: https://github.com/skawamoto0/ditherig/blob/master/ditherig/ditherig.cpp#L630C60-L630C63 (this is from the original ditherig app code). I will post the modified source code later in case anyone else wants to experiment.
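On Linux, the equivalent experiment should be possible with intel_reg from intel-gpu-tools instead of a modified ditherig. A sketch, assuming the pipe A PIPE_MISC offset 0x70030 from the i915 sources (verify the offset for your generation before writing anything):
# read and note the current value first, so you can restore it
sudo intel_reg read 0x70030
# force the 6-bit value from the table above (assumes the HDMI output is on pipe A)
sudo intel_reg write 0x70030 0x04000040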
Here are photos showing the screen in both 6-bit and 8-bit modes over HDMI. Banding is visible in the 6-bit mode:
P.P.S. Since it works for me, it will most likely work on other generations of UHD as well, although I can't say exactly which ones, since I only have an Intel UHD Xe (12th-gen CPU) for testing. I'll try to ask friends to find other UHD models for testing.
I prefer KDE Neon for my eyes. I'm convinced KDE Neon (dark) with Wayland is about as good as it gets with my hardware in terms of eye comfort.
CrunchBang++, a lightweight Debian Openbox distro. Turning off compositing makes it pretty good.