I am using Linux with a Sony TV as the display. The APU is a Ryzen 5600G, so the driver in use is amdgpu. Some Linux desktop applications that use GPU acceleration cause instant eye strain. Let's assume that temporal dithering is the cause, and that the driver enables temporal dithering for certain GPU-accelerated applications. I am wondering whether modifying the display's EDID in the right way could make the driver stop doing that.
This is the EDID as reported by xrandr --props, in hexadecimal bytes:
00 ff ff ff ff ff ff 00 4d d9 04 7e 01 01 01 01
01 1c 01 03 80 5f 36 78 0a 0d c9 a0 57 47 98 27
12 48 4c 21 08 00 81 80 a9 c0 71 4f b3 00 01 01
01 01 01 01 01 01 08 e8 00 30 f2 70 5a 80 b0 58
8a 00 b8 17 32 00 00 1e 02 3a 80 18 71 38 2d 40
58 2c 45 00 b8 17 32 00 00 1e 00 00 00 fc 00 53
4f 4e 59 20 54 56 20 20 2a 30 30 0a 00 00 00 fd
00 17 3e 0e 88 3c 00 0a 20 20 20 20 20 20 01 6a
02 03 59 f0 5b 61 60 5d 5e 5f 62 1f 10 14 05 13
04 20 22 3c 3e 12 16 03 07 11 15 02 06 01 65 66
2c 0d 7f 07 15 07 50 3d 07 bc 57 06 00 83 0f 00
00 6e 03 0c 00 20 00 b8 3c 2f 00 80 01 02 03 04
67 d8 5d c4 01 78 80 01 e2 00 cb e3 05 ff 01 e5
0f 03 00 00 06 e3 06 0d 01 01 1d 00 72 51 d0 1e
20 6e 28 55 00 b8 17 32 00 00 1e 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 93
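Whichever bytes end up being changed, any edit invalidates the per-block checksum: each 128-byte EDID block must sum to 0 modulo 256, with the last byte of the block acting as the checksum. Below is a minimal sketch that parses the dump above, verifies both block checksums, and provides a helper to recompute a checksum after an edit. The offsets follow the VESA E-EDID 1.4 layout; the choice to inspect byte 20 (the video input definition byte, whose bits 6-4 declare the colour bit depth on digital inputs) is my assumption about one place the approach might look, not a claim that it controls dithering:

```python
# Hex dump copied verbatim from the xrandr output above (2 x 128-byte blocks).
EDID_HEX = (
    "00 ff ff ff ff ff ff 00 4d d9 04 7e 01 01 01 01 "
    "01 1c 01 03 80 5f 36 78 0a 0d c9 a0 57 47 98 27 "
    "12 48 4c 21 08 00 81 80 a9 c0 71 4f b3 00 01 01 "
    "01 01 01 01 01 01 08 e8 00 30 f2 70 5a 80 b0 58 "
    "8a 00 b8 17 32 00 00 1e 02 3a 80 18 71 38 2d 40 "
    "58 2c 45 00 b8 17 32 00 00 1e 00 00 00 fc 00 53 "
    "4f 4e 59 20 54 56 20 20 2a 30 30 0a 00 00 00 fd "
    "00 17 3e 0e 88 3c 00 0a 20 20 20 20 20 20 01 6a "
    "02 03 59 f0 5b 61 60 5d 5e 5f 62 1f 10 14 05 13 "
    "04 20 22 3c 3e 12 16 03 07 11 15 02 06 01 65 66 "
    "2c 0d 7f 07 15 07 50 3d 07 bc 57 06 00 83 0f 00 "
    "00 6e 03 0c 00 20 00 b8 3c 2f 00 80 01 02 03 04 "
    "67 d8 5d c4 01 78 80 01 e2 00 cb e3 05 ff 01 e5 "
    "0f 03 00 00 06 e3 06 0d 01 01 1d 00 72 51 d0 1e "
    "20 6e 28 55 00 b8 17 32 00 00 1e 00 00 00 00 00 "
    "00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 93"
)

edid = bytearray(int(b, 16) for b in EDID_HEX.split())

def checksum_ok(block: bytes) -> bool:
    # Each 128-byte EDID block must sum to 0 modulo 256.
    return sum(block) % 256 == 0

def fix_checksum(block: bytearray) -> None:
    # After editing any byte, recompute the final byte so the block
    # sums to 0 mod 256 again; otherwise the driver may reject the EDID.
    block[127] = (256 - sum(block[:127]) % 256) % 256

for i in range(len(edid) // 128):
    block = edid[i * 128:(i + 1) * 128]
    print(f"block {i}: checksum {'ok' if checksum_ok(block) else 'BAD'}")

# Byte 20 (video input definition): bit 7 set = digital input; in EDID 1.4,
# bits 6-4 declare the colour bit depth (000 = undefined, 001 = 6-bit,
# 010 = 8-bit, 011 = 10-bit, ...). Here it is 0x80: digital, depth undefined.
print(f"video input definition byte: 0x{edid[20]:02x}")
```

Both blocks of this dump check out as-is, so any single-byte experiment only needs `fix_checksum` on the block it touches before the EDID is written out.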
Can you help me figure out which bytes to modify? I could then try loading the modified EDID via a kernel parameter and test whether the change fixes the problem. This all assumes that xrandr reports the EDID correctly, and that an EDID passed as a kernel parameter is actually respected by the driver.
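For reference, the loading path I have in mind uses the kernel's EDID firmware override: the modified 256-byte blob goes under the firmware search path (e.g. /usr/lib/firmware/edid/, location varies by distro) and is selected per connector with drm.edid_firmware. The connector name below (HDMI-A-1) and the file name are my placeholders; the real connector name shows up in xrandr and under /sys/class/drm/:

```
drm.edid_firmware=HDMI-A-1:edid/sony_tv_modified.bin
```

Since amdgpu typically loads from the initramfs, the EDID file would presumably also need to be included in the initramfs for the override to take effect at boot.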