I want to try disabling dithering on my GPU (a 6900 XT). I found the ColorControl app, but it doesn't seem to work on my card; I see no difference when I change resolution.
How else can I disable dithering?
For an accurate check for temporal dithering on a GPU, you will need an external HDMI recorder to capture the video output and analyze the recording.
I have a Sapphire AMD Radeon RX 6600 PULSE, which I use on Windows 10 22H2. In DRIVER_DEFAULT mode, temporal dithering is absent (tested using an HDMI recorder), but your situation may differ since different GPU models/vendors/vBIOS versions can produce varying results.
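To analyze the recording, a rough approach is to capture a static test image and compare consecutive frames: with temporal dithering active, individual pixels keep flickering by a small amount even though nothing on screen changes. Here is a minimal sketch of that check (assuming a lossless capture saved as capture.avi, which is just a placeholder name, and Python with OpenCV/NumPy installed):

```python
# Minimal sketch: look for per-pixel flicker between consecutive frames
# of a capture of a STATIC test image. Assumes a lossless capture file
# ("capture.avi" is a placeholder) and OpenCV + NumPy installed.
import cv2
import numpy as np

cap = cv2.VideoCapture("capture.avi")
ok, prev = cap.read()
if not ok:
    raise SystemExit("Could not read the capture file")

changed_ratios = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Absolute per-channel difference between consecutive frames.
    diff = cv2.absdiff(frame, prev)
    # Fraction of pixels that changed at all; on a static image with no
    # temporal dithering (and a lossless capture) this should stay ~0.
    changed = np.count_nonzero(diff.max(axis=2)) / (diff.shape[0] * diff.shape[1])
    changed_ratios.append(changed)
    prev = frame

cap.release()
if changed_ratios:
    print(f"mean fraction of changed pixels per frame: {np.mean(changed_ratios):.4f}")
    print(f"max  fraction of changed pixels per frame: {np.max(changed_ratios):.4f}")
```

With no dithering and a lossless capture of a static image, the changed-pixel fraction should stay near zero; with temporal dithering it typically stays noticeably above zero on every frame.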
Try the following:
Use a true 8-bit display without FRC.
I have sometimes had situations where a display caused eye strain even though its specifications looked fine. Manufacturers sometimes withhold certain characteristics, or components can change during production. Try a different display, or try your display with a different PC, to make sure the display itself is safe.
WhisperingWind Use a true 8-bit display without FRC.
I second this. If you buy a monitor that doesn't support 10-bit color/HDR and is sRGB only, the GPU likely won't force the display to emulate 10-bit color.
In Windows, a pure 8-bit-only panel won't allow HDR settings.
If you buy a monitor that doesn't support 10-bit color/HDR and is sRGB only, the GPU likely won't force the display to emulate 10-bit color.
Modern GPUs render into internal buffers with higher precision than the output format and can apply dithering on both 8-bit and 10-bit displays. So, if the driver is designed to enable dithering, it will do so regardless of whether you're using an 8-bit or 10-bit monitor. Additionally, the choice of color space usually doesn’t make much of a difference in this regard.
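As a rough illustration of the mechanism (a conceptual sketch only, not actual driver code): the driver holds a higher-precision value per pixel and, when reducing it to the 8-bit signal, can add a small offset that changes every frame, so the temporal average approaches the original value even though each individual frame is plain 8-bit. Nothing in that process depends on the panel's native bit depth.

```python
# Conceptual sketch of temporal dithering (not actual driver code):
# a 10-bit-per-channel value is reduced to 8 bits for output, and a
# per-frame varying offset makes the time average approach the
# original value even though each individual frame is only 8-bit.
import numpy as np

value_10bit = 517              # internal value, 0..1023
target_8bit = value_10bit / 4  # ideal 8-bit value = 129.25 (not representable)

rng = np.random.default_rng(0)
frames = []
for frame_index in range(240):  # e.g. 4 seconds at 60 Hz
    # Add sub-LSB noise before truncating to 8 bits; the noise changes
    # every frame, which is what makes the dithering "temporal".
    dithered = int(value_10bit + rng.uniform(0, 4)) // 4
    frames.append(min(dithered, 255))

print("ideal 8-bit value :", target_8bit)
print("per-frame outputs :", sorted(set(frames)))  # flickers between 129 and 130
print("temporal average  :", np.mean(frames))      # close to 129.25
```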
WhisperingWind the choice of color space usually doesn’t make much of a difference in this regard.
I don't see how it wouldn't make a difference. If the output is 16.7 million colors, an 8-bit monitor won't have any need at all to use temporal dithering.
WhisperingWind if the driver is designed to enable dithering, it will do so regardless of whether you're using an 8-bit or 10-bit monitor.
Have you found any evidence of a modern GPU forcing temporal dithering on a monitor that only supports 8-bit (16.7M colors)?
I don't see how it wouldn't make a difference. If the output is 16.7 million colors, an 8-bit monitor won't have any need at all to use temporal dithering.
If the graphics card outputs a true 8-bit signal without any dithering, a true 8-bit monitor will display exactly those 8 bits per channel. In that case you are correct: there will be no temporal dithering.
But this is not always the case; in some situations we cannot directly control the graphics card's output or its temporal dithering settings. The most striking example is the Apple Silicon iGPU driver.
Have you found any evidence of a modern GPU forcing temporal dithering on a monitor that only supports 8-bit (16.7M colors)?
The iGPU driver on my M1 Mac enables temporal dithering by default, even on a true 8-bit panel.