Sapphire AMD Radeon RX 6600 PULSE - dithering

The situation with the RX6600 is slightly more interesting in Windows 11 24H2 compared to the UHD630. I used the 24Q4 version of the Pro driver. Color depth: 8-bit, RGB.

The difference between the driver default and dithering disabled.

There is a difference between the "driver default" and "dithering disable" modes in ColorControl, but it is minimal. If dithering is present in the "driver default" mode, it is very subtle. My hypothesis is that these small changes come from some spatial dithering algorithm, although I have no evidence for this yet. If that hypothesis is correct, spatial dithering on the RX 6600 in Windows 11 is enabled by default.

Temporal dithering is not enabled by default.

Below is the result of comparing the images captured in "driver default" and "dithering disable" modes, showing the absolute per-pixel difference.

Total Pixels: 921,600

Pixel Differences:

  • 0: 885,439 pixels (96.08%)
  • 1: 25,993 pixels (2.82%)
  • 2: 10,168 pixels (1.10%)

As can be seen, the image is almost unchanged.
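
For anyone who wants to reproduce this kind of check, here is a minimal sketch of the comparison, assuming two same-resolution RGB captures saved as PNG (the file names are placeholders); it counts pixels by their maximum per-channel absolute difference:

    # Count pixels by their maximum per-channel absolute difference between two
    # captures. File names are placeholders; both images must be the same size.
    from collections import Counter

    import numpy as np
    from PIL import Image

    a = np.asarray(Image.open("driver_default.png").convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open("dithering_disabled.png").convert("RGB"), dtype=np.int16)

    diff = np.abs(a - b).max(axis=2)   # per-pixel maximum difference over R, G, B
    total = diff.size

    for value, count in sorted(Counter(diff.ravel().tolist()).items()):
        print(f"{value}: {count:,} pixels ({count / total:.2%})")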

P.S. I had the same card a long time ago, and it slightly strained my eyes even with dithering turned off, so I ended up selling it. A week ago, I took an RX 6600 from my work office (the one this test was conducted with) to examine it more closely, and I found it to be very comfortable for my eyes.

P.P.S. Another interesting observation: if I run any of these GPUs (ARC A770 / UHD 48EUs / RX 6600) at 60 Hz, my 60 Hz TV slightly strains my eyes. However, if I select 59.94 Hz, it feels much more comfortable, and the difference is noticeable almost immediately. I'm not sure what causes this; perhaps the display's actual refresh rate is 59.94 Hz rather than 60 Hz.

    WhisperingWind There is a difference between the "driver default" and "dithering disable" modes in ColorControl

    Here is the difference:

    AMD 780M, Win10, Dithering = Driver default: 0x0000C000 = 00000000000000001100000000000000. In detail, it sets:

    FMT_BIT_DEPTH_CONTROL.FMT_RGB_RANDOM_ENABLE[14:14]

    FMT_BIT_DEPTH_CONTROL.FMT_HIGHPASS_RANDOM_ENABLE[15:15]

    AMD 780M, Win10, Dithering = Disabled: 0x00000000

      WhisperingWind But if I select 59.94 Hz, it feels much more comfortable

      If you are viewing the GPU output on a TV, 59.94 Hz is the NTSC-based TV frame rate, while 60 Hz is the digital (PC) one. Perhaps your TV is forced to follow TV standards more than PC ones.

      simplex

      The value for my graphics card is different.

      Windows 10 22H2: 0x0000c900.

      Linux Cinnamon 22.1 (v.6.8.0): 0x00008900.

      In my case, the FMT_BIT_DEPTH_CONTROL.FMT_SPATIAL_DITHER_EN bit is additionally set in both Windows and Linux, and in Linux the FMT_BIT_DEPTH_CONTROL.FMT_RGB_RANDOM_ENABLE bit is cleared.
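
      For anyone who wants to decode such values themselves, here is a rough Python sketch. The names for bits 14 and 15 are taken from the post above; placing FMT_SPATIAL_DITHER_EN at bit 8 is my assumption based on the FMT_BIT_DEPTH_CONTROL layout in the public amdgpu driver headers, so treat it and any unlabeled bits with caution:

          # Rough decoder for the FMT_BIT_DEPTH_CONTROL values quoted in this thread.
          # Bits 14 and 15 are named in the post above; bit 8 (FMT_SPATIAL_DITHER_EN)
          # is an assumption taken from the public amdgpu register headers and may
          # not match every ASIC, so other set bits are printed without a name.
          KNOWN_BITS = {
              8: "FMT_SPATIAL_DITHER_EN (assumed)",
              14: "FMT_RGB_RANDOM_ENABLE",
              15: "FMT_HIGHPASS_RANDOM_ENABLE",
          }

          def decode(value: int) -> None:
              print(f"0x{value:08X} = {value:032b}")
              for bit in range(32):
                  if value & (1 << bit):
                      print(f"  bit {bit}: {KNOWN_BITS.get(bit, 'not identified here')}")

          for value in (0x0000C000, 0x00000000, 0x0000C900, 0x00008900):
              decode(value)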

      When I tested the AMD 780M, I felt less strain after setting dithering=disabled. Also, on the 4600H Vega 6 graphics, which is safe for me, the same register values were active (dithering=default) without eye strain.

      Imagine if we could find internal registers to disable some of the in-chip pipeline conversions!

        simplex

        I can use the RX 6600 on Windows 10 22H2 without any eye strain. I'm curious whether the RX 7800 XT will be just as comfortable as the RX 6600, although that's not a given, since they have different architectures: RDNA2 vs. RDNA3. Your iGPU is based on RDNA3, and maybe some kind of processing has been added to it that causes eye strain.

        I wonder if there are any success stories on LEDStrain with RDNA3 graphics cards?

          WhisperingWind Your iGPU is based on RDNA3

          I know a man who got strain after switching from a 5600G to a 5700G. That is Vega 7 vs. Vega 8; only the CPU changed.

          Also, reading this forum, I found that complaints begin with 5700H+ CPUs, which also have Vega 8+ generation iGPUs.

          WhisperingWind I found it to be very comfortable for my eyes

          Is there any difference between the old RX 6600 and the new one?

          Monitor (now 6-bit, previously 8-bit), cables, TV, card vendor, PC hardware…?

          Btw, does your RAM really run at 3200 XMP at 1.2 V? Wow

            simplex

            Is there any difference between the old RX 6600 and the new one?

            The old graphics card was worse. But I don't remember exactly whether I had set it to 60 Hz or 59.94 Hz, and that could have affected the result.

            The new one is perfect for my eyes.

            Monitor (now 6-bit, previously 8-bit), cables, TV, card vendor, PC hardware…?

            Both cards are Sapphire AMD Radeon RX 6600 PULSE.

            I updated the BIOS from the 2019 version to the 2023 version. The hardware has not changed (described in the first post), but the cable was replaced with a UGREEN (HDMI 2.1). The old one was an unnamed Chinese cable, but it was also fine; I tested it with the new graphics card.

            I use a Sony KD-49XG8096 TV (manufactured in 2019) as the monitor for my PC; the panel is true 8-bit.

            Btw, does your RAM really run at 3200 XMP at 1.2 V? Wow

            The BIOS shows 1.2 V, but there might be an error, and it could actually be 1.35 V.

            12 days later

            @WhisperingWind may I ask you to test RTX 20 series cards for dithering, like you did here?

            Today I replaced a B580 with an RTX 2080S and instantly got strain, on the desktop, on static content…

              simplex

              I currently do not own any RTX 20 series cards, but if I ever acquire one, I will analyze it.

              Previously, I had a PNY Quadro T400 in my work PC, and it strained my eyes even in the BIOS.

              9 days later

              I compared images captured with an HDMI recorder on Linux Cinnamon 22.1. Test subjects: ARC A770 and UHD 630 (i9-9900K). The settings used were: 8-bit, dithering off, and Broadcast RGB set to Limited.

              In the archive, you will find simple comparison reports for three pairs of images, including the compared images themselves; all images in the reports are clickable.

              For the first pair of images, 8 differing pixels were found. The differences equal ±1 for each component (R, G, or B).

              For the second pair of images, 1901 differences were found: 1900 pixels differ by ±1 for each component (R, G, or B), and one pixel differs by ±2 for each component (R, G, or B). All 1900 differing pixels form a small horizontal line drawn in the settings application.

              For the third pair of images, only one differing pixel was found, with differences equal to ±2 for each component (R, G, or B).

              Despite the different architectures of the GPUs, they produce nearly identical visuals in Linux Cinnamon 22.1 overall.
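
              As a rough illustration of how such a comparison can be visualized (this is not the exact script behind the reports above, and the file names are placeholders), an overlay marking every differing pixel in red can be generated like this:

                  # Mark every differing pixel in red on top of the first capture.
                  # Not the exact script behind the reports above; file names are placeholders.
                  import numpy as np
                  from PIL import Image

                  a = np.asarray(Image.open("a770.png").convert("RGB"), dtype=np.int16)
                  b = np.asarray(Image.open("uhd630.png").convert("RGB"), dtype=np.int16)

                  mask = np.abs(a - b).max(axis=2) > 0   # True where any channel differs
                  overlay = a.astype(np.uint8)
                  overlay[mask] = (255, 0, 0)            # paint differing pixels red

                  print(f"{int(mask.sum())} differing pixels out of {mask.size}")
                  Image.fromarray(overlay).save("diff_overlay.png")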

                WhisperingWind

                How did you disable dithering on Linux on Intel GPUs? It's off by default, right?

                Is there a difference in output between Windows and Linux? Which would you say is more comfortable?

                Have you tested the same GPU with different distributions or desktop environments? Would be interesting to see if there are meaningful differences.

                  karut

                  How did you disable dithering on Linux on Intel GPUs?

                  When using an Intel GPU/iGPU in 8-bit mode or higher, dithering is disabled by default on Linux. In 6-bit mode, dithering is enabled, but there is a fix to disable it.

                  Is there a difference in output between Windows and Linux?

                  I haven't tested it in detail yet, but there should be some difference. It doesn't necessarily have to lead to eye strain; it may just be slightly different shades due to the different color schemes in the drivers, etc.

                  Which would you say is more comfortable?

                  My experience might not be entirely relevant since I use an Apple Silicon Mac 90% of the time, and the PC is connected to a TV for gaming and YouTube.

                  Previously, I used a mini PC based on the i5-12450H (UHD 48EUs) with Linux Cinnamon 22.1 and Windows 10 22H2, and it was equally comfortable in both operating systems. However, it recently stopped turning on after a power surge. Now I use an RX6600 and Windows 10 22H2. The second setup seems subjectively a bit more comfortable to me.

                  Have you tested the same GPU with different distributions or desktop environments?

                  I haven't tested that yet. It's more difficult, as different compositors and desktop environments would introduce significant inaccuracies into the test. In theory, though, it could be attempted using calibration images opened in full screen. For example, comparing contrast and brightness may not be valid due to the predominance of lighter shades in the interface of one of the desktop environments. In general, this requires some thought about how to do it properly.
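
                  One possible starting point, assuming the same full-screen calibration image is captured under each desktop environment and only a fixed crop away from panels and widgets is compared (the crop coordinates and file names below are placeholders):

                      # Compare only a fixed crop of the same full-screen calibration image
                      # captured under two desktop environments; the crop coordinates and
                      # file names are placeholders.
                      import numpy as np
                      from PIL import Image

                      CROP = (200, 200, 1720, 880)   # left, top, right, bottom

                      def stats(path: str):
                          arr = np.asarray(Image.open(path).convert("RGB").crop(CROP), dtype=np.float64)
                          return arr.mean(axis=(0, 1)), arr.std(axis=(0, 1))

                      for name in ("capture_de_a.png", "capture_de_b.png"):
                          mean, std = stats(name)
                          print(name, "mean RGB:", np.round(mean, 2), "std RGB:", np.round(std, 2))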

                  What do you think regarding RTX 20 series cards?

                  https://forums.developer.nvidia.com/t/nvidia-2080ti-gfx-output-modifies-pixels/160030

                  https://forums.developer.nvidia.com/t/can-not-completely-cancel-dithering/260938

                  The topics above show non-identical images. Could this be the result of rounding? I.e., NVIDIA using float16 instead of 32-bit calculations, which would give different, less accurate results each frame.
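
                  One rough way to sanity-check the rounding idea in software is to run the same arithmetic over all 8-bit codes in float16 and in float32 and count how many results land on a different code after re-quantization; the blend below is just a stand-in for whatever the GPU actually computes:

                      # Run the same blend of every pair of 8-bit codes in float16 and in
                      # float32, re-quantize to 8 bits, and count the mismatches. The blend
                      # is only a stand-in for whatever the GPU actually computes.
                      import numpy as np

                      ALPHA = 0.6   # arbitrary blend factor

                      def blend(dtype):
                          a = (np.arange(256, dtype=dtype) / dtype(255))[:, None]
                          b = (np.arange(256, dtype=dtype) / dtype(255))[None, :]
                          out = a * dtype(ALPHA) + b * dtype(1 - ALPHA)
                          return np.clip(np.rint(out * dtype(255)), 0, 255).astype(np.uint8)

                      q16, q32 = blend(np.float16), blend(np.float32)
                      print(f"{int((q16 != q32).sum())} of {q16.size} blends re-quantize differently")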

                    simplex

                    Getting a 1:1 image when recording to an HDMI recorder can be a non-trivial task, as the driver-GPU combination might apply its own color correction. For instance, on Linux, Intel UHD 48EUs and Intel ARC A770 render almost identically. However, in Windows 10 22H2, even when both GPUs use the same Intel UHD driver, the difference between the two GPUs is much more significant. The driver might recognize the GPU model and load a custom LUT for color correction.

                    Dithering, for example, is supposed to have a specific structure, such as random noise to smooth gradient transitions or a checkerboard pattern to improve color reproduction. I think that individual altered pixels cannot be attributed to either of these. This suggests that it is either a calculation error or additional color correction.

                    Unfortunately, I no longer have the screenshots taken from the Intel UHD 48EUs, but I do have a screenshot taken from the Intel UHD 630. Previously, I compared the Intel UHD 630 and Intel ARC A770 on Linux, and they rendered almost identically. Here's a comparison on Windows 11 (UHD 630 vs. ARC A770): the difference is noticeable to the naked eye (different drivers, different final color processing). In other words, the driver itself can make adjustments to how the GPU renders.

                    P.S. I’ve encountered strange rendering behavior when comparing screenshots from Intel UHD 48EUs and ARC A770 on Linux, captured using an HDMI recorder. This is not exactly the same issue you mentioned, but it might be related. The screenshots are nearly identical except for the bottom panel (including different icons and time) and the mouse cursor. In the top-left corner, there are small lines of red dots - these represent differences in pixel colors between the two screenshots. These differences are consistent; you can minimize the browser, close it, open any other program, and these lines will remain in their original state. After a system reboot, they may disappear or reappear. This is also not dithering and seems to be some kind of rendering glitch (software or hardware).
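
                    One way to tell such a static difference (color correction or a glitch that stays put) apart from temporal dithering would be to capture a series of consecutive frames of a static screen and look at how each pixel varies over time; here is a sketch, assuming the recorder saves numbered PNG frames (the names are placeholders):

                        # Stack several consecutive captures of a static screen and measure how
                        # each pixel varies over time: a static color-correction difference or a
                        # frozen glitch gives zero variation, while temporal dithering shows up
                        # as pixels that keep changing frame to frame. Frame names are placeholders.
                        import glob

                        import numpy as np
                        from PIL import Image

                        frames = np.stack([
                            np.asarray(Image.open(p).convert("RGB"), dtype=np.int16)
                            for p in sorted(glob.glob("frame_*.png"))
                        ])

                        span = frames.max(axis=0) - frames.min(axis=0)   # per-pixel range over time
                        changing = int((span.max(axis=2) > 0).sum())     # pixels changing in any channel
                        print(f"{changing} of {span.shape[0] * span.shape[1]} pixels change over time")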
