Who can use Linux but not Windows? Share your Linux setup

9 days later

WhisperingWind can you see what the AMD driver shows for default dithering behavior? I have a 7950X PC that has the AMD Raphael iGPU. Thinking of trying that first on my incandescent monitor, since it was safe on my TV.

    jordan

    The logic for handling dithering in the amdgpu kernel module is more complex than in i915. Based on the code, only 10-bit and 12-bit output can be considered safe; dithering may be present at 6-bit and 8-bit, but I don't know how to determine whether it is actually active.
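
    For a flavour of what that logic looks like, here is a paraphrased sketch of the legacy (pre-DC) path in dce_v6_0.c. This is not verbatim kernel code, and the DC path used by newer cards is more involved, but the shape of the decision is the same:

    /* Paraphrased sketch of the FMT programming in dce_v6_0.c:
     * truncation or spatial dithering is chosen from the panel's bits
     * per colour channel (bpc); only 10 bpc and above leave the
     * pipeline untouched. */
    u32 tmp = 0;

    switch (bpc) {
    case 6:
        if (dither == AMDGPU_FMT_DITHER_ENABLE)
            tmp |= FMT_BIT_DEPTH_CONTROL__FMT_SPATIAL_DITHER_EN_MASK;
        else
            tmp |= FMT_BIT_DEPTH_CONTROL__FMT_TRUNCATE_EN_MASK;
        break;
    case 8:
        if (dither == AMDGPU_FMT_DITHER_ENABLE)
            tmp |= (FMT_BIT_DEPTH_CONTROL__FMT_SPATIAL_DITHER_EN_MASK |
                    FMT_BIT_DEPTH_CONTROL__FMT_SPATIAL_DITHER_DEPTH_MASK);
        else
            tmp |= (FMT_BIT_DEPTH_CONTROL__FMT_TRUNCATE_EN_MASK |
                    FMT_BIT_DEPTH_CONTROL__FMT_TRUNCATE_DEPTH_MASK);
        break;
    case 10:
    default:
        /* "not needed": no dithering, no truncation */
        break;
    }

    /* tmp ends up in a per-CRTC hardware register */
    WREG32(mmFMT_BIT_DEPTH_CONTROL + amdgpu_crtc->crtc_offset, tmp);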

      WhisperingWind that's good to know, thanks.

      I did see this a while back and just remembered it, but I'm not sure whether it's still good enough for AMD.

      AMD graphics cards:
      1. Switch to runlevel 3 (telinit 3)
      2. Run the following as root:
      aticonfig --set-pcs-val=MCIL,DP_DisableDither,1
      Note that if this command does not turn off dithering, it might be necessary to use this command instead:
      aticonfig --set-pcs-val=MCIL,TMDS_DisableDither,1
      3. Save the changes.
      4. Switch back to runlevel 5 (telinit 5)

      Edit: I also found DP_DisableDither and HDMI_DisableDither, if anyone was curious.

      Seems more complicated; I'll stick to Intel for now, since I just have different hardware lying around.

        jordan

        I tried these commands before and concluded that they are for older AMD cards. I'm not sure exactly how old, but ChatGPT mentioned something about the RX 400 series or older.

        P.S. “For Ubuntu 16.04 LTS and above, the AMD Catalyst or fglrx driver is no longer supported by AMD, or in Ubuntu. If you have an AMD GPU and wish to run any Ubuntu version 16.04 LTS or newer, there are two open source driver options: Radeon or AMDGPU. The AMDGPU-PRO driver provides the open source AMDGPU driver and a proprietary overlay. Newer AMD GPUs designed with GCN technology (Graphics Core Next) should use AMDGPU or AMDGPU-PRO, while older AMD GPUs should use Radeon.”

        https://help.ubuntu.com/community/BinaryDriverHowto/AMD?action=show&redirect=BinaryDriverHowto%2FATI

          jordan

          I recommend trying out an Intel Arc card. I've been using mine for about a week to watch YouTube on a TV with Windows 11 and the latest version of Ubuntu (standard kernel, no system modifications). The “picture” doesn't cause any strain.

            jordan

            Eye strain is present when using my monitor, I assume due to its FRC module. I think it's related to post-processing of the final image, as the picture is very pleasant when using software rendering; apparently, software rendering disables this post-processing. It's not an issue with the card itself, though, as the same thing happens when running Ubuntu in a virtual machine on my Mac, so I think the issue is with my monitor.

            The picture on the TV is very easy on the eyes (I use GPU rendering there).

            P.S. I was in therapy for 5 days and didn't use the computer, only occasionally my phone. When I got home, I started using the Arc with the TV on a regular basis. At first, I thought it was because my eyes had rested, but the effect is still there (today is the 6th day).

            WhisperingWind
            Could it be these files, in https://github.com/torvalds/linux/tree/master/drivers/gpu/drm/amd/amdgpu:

            • dce_v11_0.c
            • dce_v10_0.c
            • dce_v8_0.c
            • dce_v6_0.c

            They contain these particular lines: https://github.com/torvalds/linux/blob/c2ee9f594da826bea183ed14f2cc029c719bf4da/drivers/gpu/drm/amd/amdgpu/dce_v6_0.c#L446-L472

            I think if we simply delete those lines, the outcome would be the same as for 10-bit. As the 10-bit case is marked "not needed", I imagine that refers to any kind of dithering not being needed; in that case the value 0 is written to something (perhaps a GPU register) called "mmFMT_BIT_DEPTH_CONTROL". A lot of guessing, but perhaps it helps; see the sketch below.
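
            If that's right, a hypothetical (untested) patch could force every depth down the 10-bit path, something like:

            /* Hypothetical, untested sketch for dce_v6_0.c: bypass the
             * per-bpc switch so every depth takes the ">= 10 bpc" path
             * and FMT_BIT_DEPTH_CONTROL is written as 0 (no truncation,
             * no spatial dithering). Assumes nothing else re-enables
             * dithering later in the pipeline. */
            u32 tmp = 0;

            /* switch (bpc) { ... }   <- removed: 6/8 bpc no longer dither */

            WREG32(mmFMT_BIT_DEPTH_CONTROL + amdgpu_crtc->crtc_offset, tmp);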

              KM

              I previously tried debugging the module's code, and the main decision regarding dithering was applied at this location: https://github.com/torvalds/linux/blob/c2ee9f594da826bea183ed14f2cc029c719bf4da/drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm_crc.c#L237.
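
              If I remember the surrounding code correctly, it looks roughly like this (paraphrased; the details vary between kernel versions):

              /* Paraphrased from amdgpu_dm_crc.c: when a CRC capture source
               * that needs deterministic output is configured, dithering is
               * forced off via plain truncation to 8 bpc; otherwise the
               * driver default is restored. */
              if (!dm_need_crc_dither(source))
                  dc_stream_set_dither_option(stream_state, DC_DITHER_OPTION_TRUN8);
              else
                  dc_stream_set_dither_option(stream_state, DC_DITHER_OPTION_DEFAULT);

              So dc_stream_set_dither_option() looks like the natural hook for anyone trying to hard-wire dithering off in a patched module.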

              I also made some changes in a few other places, including the files you mentioned above. However, it seems I might have overlooked something. I initially got the RX6600 mainly for experimenting with Hackintosh, but those attempts were unsuccessful due to dithering issues. At that time, I didn't focus much on Linux, unlike now with Intel cards. Unfortunately, I've sold the RX6600, so I can't give it another try at the moment.

              P.S. I think I'll build a new PC next year based on an AMD CPU. If it has an iGPU, I'll give it another try.

              I can make changes in the amdgpu module. However, it won't be 'clean' code, and I won't be able to test the results.

              aticonfig is old and isn't used anymore; the driver in use for pretty much everyone is amdgpu. Sometimes you can set an option in GRUB, but support is generally still a work in progress: the driver doesn't even easily let you change the colour space away from RGB, and colour depth changes seem mostly limited to switching between 30-bit and 24-bit. So dithering is probably present.
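
              That 30-vs-24-bit switch presumably corresponds to the connector's "max bpc" DRM property, which at least can be queried from userspace. A minimal libdrm sketch (the /dev/dri/card0 path is an assumption; adjust for your system):

              /* List each connector's "max bpc" DRM property via libdrm.
               * Build: gcc maxbpc.c -o maxbpc $(pkg-config --cflags --libs libdrm) */
              #include <fcntl.h>
              #include <stdio.h>
              #include <string.h>
              #include <unistd.h>
              #include <xf86drm.h>
              #include <xf86drmMode.h>

              int main(void)
              {
                  int fd = open("/dev/dri/card0", O_RDWR);
                  if (fd < 0) {
                      perror("open");
                      return 1;
                  }

                  drmModeRes *res = drmModeGetResources(fd);
                  if (!res) {
                      fprintf(stderr, "drmModeGetResources failed\n");
                      return 1;
                  }

                  for (int i = 0; i < res->count_connectors; i++) {
                      drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
                      if (!conn)
                          continue;
                      for (int j = 0; j < conn->count_props; j++) {
                          drmModePropertyPtr prop = drmModeGetProperty(fd, conn->props[j]);
                          if (!prop)
                              continue;
                          if (!strcmp(prop->name, "max bpc"))
                              printf("connector %u: max bpc = %llu\n",
                                     conn->connector_id,
                                     (unsigned long long)conn->prop_values[j]);
                          drmModeFreeProperty(prop);
                      }
                      drmModeFreeConnector(conn);
                  }
                  drmModeFreeResources(res);
                  close(fd);
                  return 0;
              }

              Under X, I believe the same property can also be set per output with xrandr --set "max bpc" (the exact output name depends on your setup).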

              It'll probably get there someday, but not soon unless you're using Windows.

              Don't spend a lot of time on this, it's not ready for what you want.

                bluetail I think the bit depth as set in the driver still has relevance

                moonpie I want more context.

                I think dithering is enabled per display based on EDID traits (https://en.wikipedia.org/wiki/Extended_Display_Identification_Data). It would be great if we could query the status.
                I might be wrong, but that would imply there are a couple of assumptions being made…
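
                One thing we can query is what the panel itself declares: the kernel exposes the raw EDID in sysfs, and in EDID 1.4 byte 20 carries the panel's bit depth. A minimal sketch (the card0-DP-1 connector name is an assumption; note this reports what the display advertises, not whether the GPU is actually dithering):

                /* Read the declared bit depth from the EDID in sysfs. */
                #include <stdio.h>

                int main(void)
                {
                    unsigned char edid[256];
                    FILE *f = fopen("/sys/class/drm/card0-DP-1/edid", "rb");

                    if (!f || fread(edid, 1, sizeof(edid), f) < 128) {
                        fprintf(stderr, "could not read EDID\n");
                        return 1;
                    }
                    fclose(f);

                    if (!(edid[20] & 0x80)) {          /* bit 7: digital input */
                        printf("analog input, no bit depth field\n");
                    } else if (edid[18] == 1 && edid[19] >= 4) {
                        /* EDID 1.4: bits 6-4 of byte 20 encode bits per colour:
                         * 1 -> 6 bpc, 2 -> 8 bpc, ..., 6 -> 16 bpc */
                        int code = (edid[20] >> 4) & 0x07;
                        if (code >= 1 && code <= 6)
                            printf("panel declares %d bpc\n", 4 + 2 * code);
                        else
                            printf("bit depth undefined\n");
                    } else {
                        printf("EDID older than 1.4: no bit depth field\n");
                    }
                    return 0;
                }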

                I have a BenQ XL2546K, an XL2566K, and an old LG 24GM79G. On my now-sold M1 Mac with the LG, I noticed a fine screen pattern. On my PC with a 6950XT it's similar, but only visible up close; I suspect it's dithering. The BenQ displays don't show this on the 6950XT but have poor colors on the M1 Mac. I think the LG might be a 6-bit panel with 2-bit FRC; specs show the BenQ panels have this too, but it's less noticeable there. I also see banding on the BenQ, likely due to a limited color gamut.

                dev