I took a closer look at the i915 module code. It appears that dithering is disabled by default and only enabled for 6-bit displays:
https://github.com/torvalds/linux/blob/master/drivers/gpu/drm/i915/display/intel_display.c#L4777C2-L4782C37
In other words, my edits theoretically should not have had any effect.
Hmm, that's strange. Why do my eyes seem to strain a bit less after the edits? It might be related to my hardware setup: Tecno MEGA MINI M1 PC -> Thunderbolt 4 -> Belkin USB-C Video Adapter -> DVI cable -> BenQ GL2450 monitor (6-bit+FRC). I may have accidentally changed a setting elsewhere that produced this effect, or it could simply be the new kernel version. Many open questions remain.
I think I need to start by getting a monitor that does not dither and seeing how Linux looks on it by default (maybe I'm not fighting Linux, but my current monitor). And I need to understand how all this works; otherwise there will be a lot of magic and unicorns here, which is not good.
I think I should first learn how to detect dithering in the video card's output signal (I think I can build some kind of detector in Python) and then base further work on that.
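As a starting point for such a detector, here is a minimal sketch in pure NumPy. It assumes you already have two consecutive frames of a *static* test image as arrays (e.g. grabbed via a capture card or lossless screen capture; the frames below are synthetic stand-ins). The idea: temporal dithering/FRC toggles pixel values by exactly one LSB between frames, so on static content a high fraction of ±1 differences is a strong hint that dithering is active.

```python
import numpy as np

def temporal_dither_score(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Fraction of pixels differing by exactly 1 LSB between two
    consecutive frames of a static image. Temporal dithering toggles
    values by +-1, so a high score on static content suggests the GPU
    (or the monitor's FRC) is dithering."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return float(np.mean(diff == 1))

# Synthetic demo: a flat gray image with simulated dithering
# flickering roughly half of the pixels by one LSB.
rng = np.random.default_rng(0)
base = np.full((64, 64), 128, dtype=np.uint8)
frame_a = base.copy()
frame_b = base.copy()
mask = rng.random((64, 64)) < 0.5   # pixels the simulated dither toggles
frame_b[mask] += 1                  # +1 LSB flicker between frames

print(f"static pair:   {temporal_dither_score(base, base):.2f}")
print(f"dithered pair: {temporal_dither_score(frame_a, frame_b):.2f}")
```

This only catches the temporal case; spatial dithering would need a different test (e.g. comparing the captured frame against the known flat test pattern within a single frame). The names and the 1-LSB threshold here are my assumptions, not an established tool.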
I will hold off on fully presenting the results until I am completely sure that I have achieved some outcome and it is 100% not a placebo.
DisplaysShouldNotBeTVs
Can you please tell me if you have any developments in the field of image processing for detecting dithering? As I understand it, temporal dithering can be detected using the method proposed by aiaf, but what about other types of dithering? Do you have links to any relevant materials?