I want to try disabling dithering on my GPU (a 6900 XT). I found the ColorControl app, but it doesn't seem to work on my card; I see no difference when I change the resolution.

How else can I disable dithering?

    erfto1

    With ColorControl, via the web UI (browser interface), you can set dithering=disabled. This applies to external output dithering.

    For in-chip GPU (internal) blending/dithering, I'm not sure it is programmable.

      simplex

      I'm using an external monitor connected over DP. Is there a way I can check if there's temporal dithering without any equipment?

      I tried using different dithering options but couldn't see any difference here.

        erfto1 temporal dithering

        If you mean GPU internal dithering, it's hard to check.

        GPU dithering is the 1st layer. Then it is covered by a 2nd layer: monitor pixel inversion + monitor FRC. It's hard to say which layer you are seeing now.

          simplex

          I want to check GPU dithering. My monitor is running at 8-bit native, so there's no FRC, I guess.

            erfto1

            For an accurate check for temporal dithering on a GPU, you will need an external HDMI recorder to capture the video output and analyze the recording.

            I have a Sapphire AMD Radeon RX 6600 PULSE, which I use on Windows 10 22H2. In DRIVER_DEFAULT mode, temporal dithering is absent (tested using an HDMI recorder), but your situation may differ since different GPU models/vendors/vBIOS versions can produce varying results.
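
            Once you have a capture, the analysis itself is simple: record the output while a completely static image is on screen, then compare consecutive frames. Without temporal dithering the frames should be byte-identical; with it you will see small per-pixel flicker. Here is a rough sketch of that idea in Python (OpenCV + NumPy; the file name and the lossless-capture assumption are placeholders, not part of any specific tool):

            ```python
            # Rough sketch: look for temporal dithering in a lossless capture of a STATIC image.
            # Assumes an uncompressed/lossless recording; lossy codecs add noise of their own.
            import cv2
            import numpy as np

            cap = cv2.VideoCapture("capture.mkv")      # placeholder file name
            ok, prev = cap.read()
            changed_frames, total_frames = 0, 0

            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                diff = cv2.absdiff(frame, prev)        # per-pixel difference vs. previous frame
                changed = int(np.count_nonzero(diff))  # subpixels that flipped between frames
                if changed:
                    changed_frames += 1
                    print(f"frame {total_frames}: {changed} changed subpixels, max delta {diff.max()}")
                prev = frame
                total_frames += 1

            cap.release()
            print(f"{changed_frames}/{total_frames} frames differ from the previous one")
            # A static image with 0 differing frames means no temporal dithering in the signal;
            # many +/-1 deltas scattered across the frame is the typical dithering signature.
            ```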

            Try the following:

            1. Use a true 8-bit display without FRC.
            2. Test for pixel inversion using this tool: 12 tests (for some reason, lagom.nl does not always work for me).
            3. Avoid wide gamut displays if you are sensitive to features like KSF phosphor.
            4. Ensure the GPU is set to 8-bit and RGB 4:4:4 standard, as other modes might introduce dithering.
            5. Make sure your dithering settings look like mine.
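
            As a rough no-equipment sanity check on top of this list, you can display a shallow near-black gradient full screen: on a clean 8-bit chain each step should look perfectly flat and static, while visible "crawling" noise inside the steps hints at dithering somewhere between the GPU and the panel. A small sketch to generate such a test image (Pillow; the resolution and step count are arbitrary):

            ```python
            # Rough sketch: generate a shallow near-black gradient test image.
            # On a clean 8-bit chain with no dithering, each vertical band should be a flat,
            # perfectly static step; shimmering noise inside the bands hints at temporal
            # dithering somewhere between the GPU and the panel.
            from PIL import Image

            WIDTH, HEIGHT, LEVELS = 1920, 1080, 32   # placeholder resolution, gray levels 0..31

            img = Image.new("RGB", (WIDTH, HEIGHT))
            px = img.load()
            for x in range(WIDTH):
                level = x * LEVELS // WIDTH          # 32 discrete gray steps across the screen
                for y in range(HEIGHT):
                    px[x, y] = (level, level, level)

            img.save("gradient_test.png")            # view full screen at native resolution
            ```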

            I sometimes had situations where the display caused eye strain, even though the specifications were fine. Manufacturers may sometimes withhold certain characteristics, or components might change during production. Try using a different display or a different PC with your display to make sure the display is safe.

              WhisperingWind Use a true 8-bit display without FRC.

              I second this. If you buy a monitor that doesn't support 10-bit color/HDR and is sRGB only, the GPU likely won't force the display to emulate 10-bit color.

              In Windows, a pure 8-bit-only panel won't allow HDR settings.

                Clokwork

                If you buy a monitor that doesn't support 10-bit color/HDR and is sRGB only, the GPU likely won't force the display to emulate 10-bit color.

                Modern GPUs work with oversized buffers and can apply dithering on both 8-bit and 10-bit displays. So, if the driver is designed to enable dithering, it will do so regardless of whether you're using an 8-bit or 10-bit monitor. Additionally, the choice of color space usually doesn’t make much of a difference in this regard.
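
                To make that concrete, here is a toy sketch (not any vendor's actual algorithm; the numbers are made up) of how a driver can hold a higher-precision value internally and temporally dither it down to valid 8-bit codes, which is why the technique works on an 8-bit panel just as well as on a 10-bit one:

                ```python
                # Toy illustration of temporal dithering: the driver holds a higher-precision
                # value internally (here 10-bit) and alternates between the two nearest 8-bit
                # codes over time so that the average matches the original. The panel only ever
                # receives valid 8-bit values, so panel bit depth doesn't prevent it.
                import random

                value_10bit = 601                    # internal value, 0..1023
                target = value_10bit / 1023 * 255    # ideal 8-bit level ~149.85 (not representable)

                frames = []
                for _ in range(240):                 # simulate 240 refreshes
                    lo, frac = int(target), target - int(target)
                    frames.append(lo + (1 if random.random() < frac else 0))

                print(f"ideal level: {target:.3f}")
                print(f"levels sent to the panel: {sorted(set(frames))}")
                print(f"average over {len(frames)} frames: {sum(frames)/len(frames):.3f}")
                ```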

                  WhisperingWind the choice of color space usually doesn’t make much of a difference in this regard.

                  I don't see how it wouldn't make a difference. If the output is 16.7 million colors, an 8-bit monitor won't have any need at all to use temporal dithering.

                  WhisperingWind if the driver is designed to enable dithering, it will do so regardless of whether you're using an 8-bit or 10-bit monitor.

                  Have you found any evidence of a modern GPU forcing temporal dithering on a monitor that only supports 8-bit (16.7M colors)?

                    Clokwork

                    I don't see how it wouldn't make a difference. If the output is 16.7 million colors, an 8-bit monitor won't have any need at all to use temporal dithering.

                    If the graphics card outputs a true 8-bit signal without any dithering, a true 8-bit monitor will display exactly those 8 bits per channel. In that case, you are correct: there will be no temporal dithering.

                    But this is not always the case; in some situations we cannot directly control the graphics card's output or its temporal dithering settings. The most striking example is the Apple Silicon iGPU driver.

                    Have you found any evidence of a modern GPU forcing temporal dithering on a monitor that only supports 8-bit (16.7M colors)?

                    The iGPU driver on my M1 Mac enables temporal dithering by default, even on a true 8-bit panel.
