I recommend trying out ARC. I've been using mine for about a week to watch YouTube on a TV with Windows 11 and the latest version of Ubuntu (standard kernel, no system modifications). The “picture” doesn't cause any strain.
WhisperingWind So Win11 and Ubuntu both feel okay? I thought you had strain issues with the ARC?
Eye strain is present when using my monitor, I assume due to the FRC module. I think it's related to post-processing of the final image, because the picture is very pleasant when using software rendering; apparently software rendering bypasses that post-processing. But it's not an issue with the card itself, as the same thing happens when running Ubuntu in a virtual machine on my Mac. I think the issue is with my monitor.
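(For anyone who wants to reproduce the comparison: on a Mesa-based stack, software rendering can be forced per application with an environment variable. A minimal check, assuming Mesa is the GL provider:)

    # force Mesa's software rasterizer (llvmpipe) instead of the GPU driver
    LIBGL_ALWAYS_SOFTWARE=1 glxgears

    # confirm which renderer is actually in use
    LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep "OpenGL renderer"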
The picture on the TV is very easy on the eyes (I use GPU rendering there).
P.S. I was in therapy for 5 days and didn't use the computer, only occasionally my phone. When I got home, I started using the ARC with the TV on a regular basis. At first I thought the comfort was just because my eyes had rested, but the effect is still there (today is the 6th day).
WhisperingWind
Could it be these files in https://github.com/torvalds/linux/tree/master/drivers/gpu/drm/amd/amdgpu?
They contain these particular lines: https://github.com/torvalds/linux/blob/c2ee9f594da826bea183ed14f2cc029c719bf4da/drivers/gpu/drm/amd/amdgpu/dce_v6_0.c#L446-L472
I think if we simply deleted those lines, the outcome would be the same as for 10-bit. Since the 10-bit case is marked as "not needed", I imagine it means that no kind of dithering is needed there. In that case the value 0 is written to something (perhaps a GPU register) called mmFMT_BIT_DEPTH_CONTROL. A lot of guessing, but perhaps it helps.
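Roughly what I mean, as an untested sketch against dce_v6_0_program_fmt() in the linked file (identifiers taken from that source; the exact switch structure may differ between kernel versions):

    /* sketch: make every bit depth take the 10-bit path, i.e. leave tmp
     * at 0 so no truncation or dither bits are ever set */
    u32 tmp = 0;

    switch (bpc) {
    case 6:
    case 8:
    case 10:
    default:
        /* not needed -- tmp stays 0 for every depth */
        break;
    }

    /* the cleared value then lands in the FMT register, as in the 10-bit case */
    WREG32(mmFMT_BIT_DEPTH_CONTROL + amdgpu_crtc->crtc_offset, tmp);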
I previously tried debugging the module's code, and the main decision regarding dithering was applied at this location: https://github.com/torvalds/linux/blob/c2ee9f594da826bea183ed14f2cc029c719bf4da/drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm_crc.c#L237.
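If someone wants to experiment there: the call at (or near) that line is dc_stream_set_dither_option(), so an untested one-line sketch would be to force the "disable" option regardless of what was requested (enum and variable names as in the DC sources; I haven't verified this on current kernels):

    /* force dithering off no matter what option the caller passed;
     * DITHER_OPTION_DISABLE comes from dc_types.h */
    dc_stream_set_dither_option(stream_state, DITHER_OPTION_DISABLE);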
I also made changes in a few other places, including the files you mentioned above, but it seems I overlooked something. I originally got the RX6600 mainly for experimenting with Hackintosh, but those attempts were unsuccessful due to dithering issues. At that time I didn't focus much on Linux, unlike now with the Intel cards. Unfortunately, I've since sold the RX6600, so I can't give it another try at the moment.
P.S. I think I'll build a new PC next year based on an AMD CPU. If it has an iGPU, I'll give it another try.
I can make changes to the amdgpu module. However, it won't be clean code, and I won't be able to test the results.
aticonfig is old and isn't used anymore; the driver in use for pretty much everyone is amdgpu. Sometimes you can set an option in GRUB, but generally it's still a work in progress: it doesn't even easily allow changing the colour space away from RGB, and colour depth changes seem mostly limited to between 30 and 24 bits. So dithering is probably present.
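(The per-output depth cap can at least be poked from userspace through the standard "max bpc" DRM connector property, e.g. under X11; the output name here is just a placeholder:)

    # cap the output at 8 bpc (24-bit); adjust HDMI-A-0 to match xrandr's listing
    xrandr --output HDMI-A-0 --set "max bpc" 8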
It'll probably get there someday, but not soon unless you're using Windows.
Don't spend a lot of time on this, it's not ready for what you want.
Sunspark swaywm has a per-output setting for the render bit depth, though:
https://man.archlinux.org/man/sway-output.5
Even for subpixel layout…
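For example (syntax from the linked man page; the output name is a placeholder):

    # in ~/.config/sway/config: render this output at 10 bits per channel
    output DP-1 render_bit_depth 10

    # the subpixel layout can be overridden per output as well
    output DP-1 subpixel rgb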
.............
moonpie I want more context.
I think dithering is enabled per display based on EDID traits https://en.wikipedia.org/wiki/Extended_Display_Identification_Data. It would be great if we could query the status.
Might be wrong, but that would imply there are a couple of assumptions made…
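The EDID itself is easy to read back, at least; the panel's declared bit depth shows up in the decoded block (the connector path varies per machine):

    # dump a connected output's EDID and decode it; look for the
    # bits-per-primary-color field in the output
    edid-decode < /sys/class/drm/card0-eDP-1/edid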
I have a BenQ XL2546K, a BenQ XL2566K, and an old LG 24GM79G. On my now-sold M1 Mac with the LG, I noticed a fine screen pattern; on my PC with a 6950 XT it's similar, but only visible up close. I suspect it's dithering. The BenQ displays don't show this on the 6950 XT but have poor colors on the M1 Mac. I think the LG might be 6-bit with 2-bit FRC; specs suggest the BenQs use this too, but it's less noticeable on them. I also see banding on the BenQs, likely due to a limited color gamut.
.............
WhisperingWind Thank you for that guide. Reading it, I was able to compile a Linux kernel on my Debian notebook with Intel's dithering successfully disabled. The dithering was only spatial (not temporal) and didn't seem to cause obvious issues, at least not on static images, but who knows whether it contributed to eye strain over time when watching scrolling/moving content. Time will tell.
I wonder if there is a faster way to apply the source code changes than compiling a new kernel, though. My poor notebook compiled for hours with loud fan noise. Does anyone know how we could shorten the process? If we want to modify the distro's own kernel, I think we would have to repeat the change every time a new kernel shows up in the distro's package manager, which could become annoying fast if compiling takes that long each time.
In this case, a package with a pre-compiled kernel can be put together and shared for easy installation, so users won't have to go through the hassle of compiling it themselves. However, different Linux distributions may have different package managers, library sets, and system components, which could affect the compatibility of the package. This task might require quite a bit of time and effort, as I would need to build, test, and maintain multiple packages for various distributions and possibly different kernel versions.
Currently, there don't seem to be other options besides manually building the kernel, since changes are needed in the source code of the i915 module. Distributing this module separately isn't feasible either, as it's closely linked to a specific kernel version.
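The build itself can be shortened quite a bit, though: reuse the distro's config but compile only the modules the machine actually loads, and cache compiler output between rebuilds. Something like this (standard kernel make targets; ccache is optional):

    # start from the running kernel's configuration
    make olddefconfig
    # drop every module that isn't currently loaded -- a big time saver
    make localmodconfig
    # build on all cores, caching object files for the next rebuild
    make CC="ccache gcc" -j"$(nproc)"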
.............
Have you looked at Intel VTune? Note: it may not work with ARC.
No, I haven't tried it yet. I'll take a look, thanks for the link.
FBC and DRRS don't look good.
It seems like the folks at Intel decided to complicate things for kernel developers with all this functionality.
.................
Compression (FBC) is present, but as I understand it, it should be lossless (?) and not strain the eyes.
My hardware doesn't support DRRS, so I couldn't read the DRRS status report.
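For reference, both features can be inspected (and FBC turned off) without recompiling; i915 exposes status files in debugfs and a module parameter for FBC (the DRRS counterpart, i915_drrs_status, only exists on kernels/hardware that support it):

    # check whether framebuffer compression is currently active (as root)
    cat /sys/kernel/debug/dri/0/i915_fbc_status

    # disable FBC entirely by adding this to the kernel command line (e.g. in GRUB)
    i915.enable_fbc=0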
............
moonpie I don't think this is 100% the case, since I have an 8th-gen laptop (ThinkPad T480) where "Display Refresh Rate Switching" is an option in the 2017 driver control panel. It has pretty much the same effect when enabled: dynamically switching between 48/60 Hz with no obvious transition.
Also, multiple eDP panels I've connected to the laptop (including ones as old as 2014) report both 48 and 60 Hz modes in the EDID. So features pretty similar to this seem to have been around for a while.