Not that it necessarily helps for Live distros - I'm just adding the current "xorg.conf" way of disabling dithering for NVIDIA cards:
Section "Device"
Identifier "Nvidia Graphics"
Option "FlatPanelProperties" "Dithering=Disabled"
EndSection
Source: ftp://download.nvidia.com/XFree86/Linux-x86_64/375.26/README/xconfigoptions.html
I have no xorg.conf and add my options to a custom file instead: /etc/X11/xorg.conf.d/20-nvidia.conf. I'm not sure whether all distributions read those drop-in files, but it works on Arch Linux.
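If you want to check that the drop-in file was actually picked up, grepping the X server log should show the option; options read from a config file are tagged (**) in the log. This assumes the default log location /var/log/Xorg.0.log (on some setups it is under ~/.local/share/xorg/ instead):

grep -i "FlatPanelProperties" /var/log/Xorg.0.log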
But for testing, the NVIDIA settings tool is probably better, since it can switch dithering on and off on the fly.
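For what it's worth, nvidia-settings can also be scripted for this; a rough sketch, assuming the driver exposes the Dithering attribute for your display (the DPY name below is just an example - query first to see what your setup reports):

nvidia-settings -q Dithering
nvidia-settings -a "[DPY:DVI-I-1]/Dithering=2"

(0 = Auto, 1 = Enabled, 2 = Disabled, according to the attribute help.)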
By the way, slightly off-topic, but I'm currently testing another option, "ForceFullCompositionPipeline" "true". The documentation is thin, so I'm not sure how exactly it works or whether it acts as a full compositor, but it similarly seems to prevent tearing everywhere - which is also a kind of flicker.
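In case it helps anyone: as I understand it, the option is applied per metamode, so it can be set on the fly or in the Screen section. A sketch of both variants (the "nvidia-auto-select +0+0" metamode is just the usual single-monitor example; adjust outputs and offsets to your setup):

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

Section "Screen"
    Identifier "Screen0"
    Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
EndSection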
And on Reddit someone said his eye strain was fixed just by using "compton" (a compositor):
https://www.reddit.com/r/i3wm/comments/4x3seb/eye_strain_with_i3_seems_solved_with_a_compositor/
Edit: I just combined the ForceFullCompositionPipeline option with compton, which works. Compton additionally reduces some visual clutter when moving windows over WebKit browsers. What exactly happens in this chain remains a mystery: program's drawing and acceleration -> X server -> driver -> driver options -> GL/XRender -> some screen buffers -> even more buffers (compton, "CompositionPipeline") -> the graphics card's final output, including unknown forced video mode optimizations -> monitor -> the monitor's dithering/whatever ...
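For completeness, the compton invocation I'd start from - just a sketch, the GLX backend plus vsync is the usual suggestion for NVIDIA, and the exact flags may differ between compton versions (and its successor picom):

compton --backend glx --vsync opengl-swc -b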