In this thread, I will be sharing the results of image analysis from various graphics cards.

A description of my hardware can be found below.

First PC:

- Graphics Card 1: ASRock Arc A770 Phantom Gaming 16GB OC.

- Graphics Card 2: Sapphire AMD Radeon RX 6600 PULSE.

- Motherboard: Z390 AORUS MASTER G2 Edition (rev. 1.0); BIOS F20; Resizable BAR enabled; Compatibility Support Module disabled.

- RAM: Kingston Fury 3200MHz 32GB DDR4 RAM (4x8GB). XMP profile applied: 3200MHz, 1.35V.

- CPU: i9-9900K without overclocking, Intel UHD 630 iGPU.

Second PC (Mini-PC):

- CPU: i5-12450H.

- iGPU: Intel UHD 48EUs.

- RAM: Lexar DDR4-3200, 1.2V.

The screenshots were captured using the Blackmagic UltraStudio Recorder 3G.

Here are the scripts I used to analyze images.
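
The essence of the analysis can be illustrated with a minimal sketch (assuming Pillow and NumPy; the file names are placeholders, not the actual script):

# A minimal sketch of the per-pixel absolute-difference analysis.
# Assumes Pillow and NumPy; the file names are placeholders.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("capture_a.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("capture_b.png").convert("RGB"), dtype=np.int16)

# Per-pixel difference: the largest absolute deviation across the R, G, B channels.
diff = np.abs(a - b).max(axis=2)

total = diff.size
print(f"Total Pixels: {total:,}")
print("Pixel Differences:")
for value in np.unique(diff):
    count = int((diff == value).sum())
    print(f"{value}: {count:,} pixels ({100 * count / total:.2f}%)")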

Previously, I posted two entries on the topic of dithering, and I’ll include links to them here along with a brief summary of the testing results.

Testing for dithering in Windows 10 22H2 using the ASRock Intel Arc A770. Result: No dithering by default.

Testing for dithering in Windows 11 24H2 using Intel UHD 48EUs. Result: No dithering by default.

Intel UHD 630 (i9-9900K) - dithering

Regarding GPUs based on the Xe architecture, we know that in the latest versions of Windows, dithering is disabled in the OS by default. However, the Intel UHD 630 (i9-9900K) is not based on the Xe architecture. Let's test it in conjunction with the Ditherig app.

Dithering is disabled by default on the Intel UHD 630 (i9-9900K) in Windows 11 24H2. Driver version used: 26.20.100.7642 (1/22/2020). Color depth: 8-bit, RGB.

The difference between dithering enabled and disabled.

As usual, red dots mark the altered pixels. The checkerboard pattern typical of Intel is visible.
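
An overlay like this can be generated along the following lines (same assumptions and placeholder file names as the sketch above):

# Sketch: mark every changed pixel in red on top of the first capture.
import numpy as np
from PIL import Image

base = Image.open("capture_a.png").convert("RGB")
a = np.asarray(base, dtype=np.int16)
b = np.asarray(Image.open("capture_b.png").convert("RGB"), dtype=np.int16)

overlay = np.asarray(base).copy()
overlay[np.abs(a - b).max(axis=2) > 0] = (255, 0, 0)  # red marks altered pixels
Image.fromarray(overlay).save("difference_overlay.png")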

This is the result of analyzing the images with and without dithering by computing the absolute per-pixel difference. "0" is the number of pixels with no difference between the two images; "1" is the number of pixels that differ by one unit in either direction in any of the R, G, B channels; and so on.

Total Pixels: 921,600

Pixel Differences:

  • 0: 421,289 pixels (45.71%)
  • 1: 428,176 pixels (46.46%)
  • 2: 72,135 pixels (7.83%)

Spatial dithering does not significantly alter the image on the UHD 630 (i9-9900K).

Sapphire AMD Radeon RX 6600 PULSE - dithering

The situation with the RX 6600 in Windows 11 24H2 is slightly more interesting than with the UHD 630. I used the 24.Q4 version of the Pro driver. Color depth: 8-bit, RGB.

The difference between the driver default and dithering disabled.

There is a difference between the "driver default" and "dithering disabled" modes in ColorControl, but it is minimal. If dithering is present in "driver default" mode, it is very subtle. My hypothesis is that these changes are produced by some spatial dithering algorithm, though I have no evidence for this yet. If that hypothesis is accepted, it can be argued that spatial dithering on the RX 6600 is enabled by default in Windows 11.

Temporal dithering is not enabled by default.

This is the result of analyzing the images in "driver default" and "dithering disabled" modes to show the absolute difference between pixels.

Total Pixels: 921,600

Pixel Differences:

  • 0: 885,439 pixels (96.08%)
  • 1: 25,993 pixels (2.82%)
  • 2: 10,168 pixels (1.10%)

As can be seen, the image is almost unchanged.

P.S. I owned a card like this a long time ago, but it slightly strained my eyes even with dithering turned off, so I ended up selling it. A week ago, I took an RX 6600 from my work office (the one used for this test) to examine it more closely, and I found it to be very comfortable for my eyes.

P.P.S. Another interesting observation: if I set any of the GPUs (Arc A770 / UHD 48EUs / RX 6600) to 60 Hz, my 60 Hz TV slightly strains my eyes. However, if I select 59.94 Hz, it feels much more comfortable. The difference is noticeable almost immediately. I'm not sure what causes this; perhaps the display's actual refresh rate is 59.94 Hz rather than 60 Hz.

    WhisperingWind There is a difference between the "driver default" and "dithering disabled" modes in ColorControl

    Here is the difference:

    AMD 780M, Windows 10, Dithering = Driver default: 0x0000C000 = 0b00000000000000001100000000000000. In detail, it sets:

    FMT_BIT_DEPTH_CONTROL.FMT_RGB_RANDOM_ENABLE[14:14]

    FMT_BIT_DEPTH_CONTROL.FMT_HIGHPASS_RANDOM_ENABLE[15:15]

    AMD 780M, Windows 10, Dithering = Disabled: 0x00000000
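
    A small sketch for decoding such values (bits 14 and 15 as listed above; the position assumed for FMT_SPATIAL_DITHER_EN comes from the Linux amdgpu register headers, so treat it as an assumption):

    # Sketch: decode which FMT_BIT_DEPTH_CONTROL fields are set in a raw value.
    # Bits 14/15 as quoted above; bit 8 assumed from the Linux amdgpu headers.
    FIELDS = {
        8: "FMT_SPATIAL_DITHER_EN",
        14: "FMT_RGB_RANDOM_ENABLE",
        15: "FMT_HIGHPASS_RANDOM_ENABLE",
    }

    def decode(value):
        return [name for bit, name in FIELDS.items() if value & (1 << bit)]

    print(decode(0x0000C000))  # ['FMT_RGB_RANDOM_ENABLE', 'FMT_HIGHPASS_RANDOM_ENABLE']
    print(decode(0x00000000))  # []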

      WhisperingWind But if I select 59.94 Hz, it feels much more comfortable

      If you're viewing the GPU output on a TV, 59.94 Hz is the NTSC-based TV frame rate, while 60 Hz is the digital (PC) rate. Perhaps your TV prefers TV standards over PC ones.

      simplex

      The value for my graphics card is different.

      Windows 10 22H2: 0x0000C900.

      Linux Mint 22.1 Cinnamon (kernel 6.8.0): 0x00008900.

      In my case, the FMT_BIT_DEPTH_CONTROL.FMT_SPATIAL_DITHER_EN bit is additionally set in both Windows and Linux, and in Linux the FMT_BIT_DEPTH_CONTROL.FMT_RGB_RANDOM_ENABLE bit is cleared.
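
      XORing the two readings confirms which bit differs (quick sketch):

      # Sketch: XOR the Windows and Linux readings to find the differing bits.
      win, lin = 0x0000C900, 0x00008900
      print(hex(win ^ lin))  # 0x4000 -> bit 14, i.e. FMT_RGB_RANDOM_ENABLE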

      When I tested the AMD 780M, I felt less strain after setting dithering = disabled. On the other hand, on a 4600H with Vega 6 graphics the same register bits were active (dithering = default), yet without eye strain.

      Imagine if we could find the internal registers to disable some of the in-chip pipeline conversions!

        simplex

        I can use the RX 6600 on Windows 10 22H2 without any eye strain. I'm curious whether the RX 7800 XT will be just as comfortable as the RX 6600, though that's not a given, since they have different architectures: RDNA 2 vs. RDNA 3. Your iGPU is based on RDNA 3, and perhaps some processing that causes eye strain has been added to it.

        I wonder if there are any success stories on LEDStrain with RDNA 3 graphics cards?

          WhisperingWind Your iGPU is based on RDNA 3

          I know someone who got strain after switching from a 5600G to a 5700G. That's Vega 7 vs. Vega 8; only the CPU changed.

          Also, reading this forum, I found that complaints begin with 5700H+ CPUs, which also have Vega 8+ generation iGPUs.

          WhisperingWind I found it to be very comfortable for my eyes

          Is there any difference between the old RX 6600 and the new one?

          Monitor (now 6-bit, previously 8-bit), cables, TV, card vendor, PC hardware…?

          Btw, does your RAM really run at 3200 XMP at 1.2 V? Wow

            simplex

            Is there any difference between the old RX 6600 and the new one?

            The old graphics card was worse. But I don't remember exactly whether I had set it to 60 Hz or 59.94 Hz, and that could have affected the result.

            The new one is perfect for my eyes.

            Monitor (now 6-bit, previously 8-bit), cables, TV, card vendor, PC hardware…?

            Both cards are Sapphire AMD Radeon RX 6600 PULSE.

            I updated the BIOS from the 2019 version to the 2023 version. The hardware has not changed (described in the first post), but the cable was replaced with a UGREEN one (HDMI 2.1). The old one was a no-name Chinese cable, but it was also fine; I tested it with the new graphics card.

            I use a Sony KD-49XG8096 TV (manufactured in 2019) as the monitor for my PC; the panel is true 8-bit.

            Btw, does your RAM really run at 3200 XMP at 1.2 V? Wow

            The BIOS shows 1.2 V, but there might be an error, and it could actually be 1.35 V.
