kammerer I've tested many distros over the years and I've never been able to use Linux comfortably. Even going back to the mid-00s, when I was testing Red Hat, Knoppix Live and early Ubuntu releases, I noticed the same issues we have now. I remember thinking at the time (2006 or so) that Linux made me feel a little weird and gave me eye strain, so I never switched and stuck with XP. Linux was nowhere near as user-friendly then as it is now, so I didn't feel motivated to dig deeper and stayed with Windows. There is definitely something going on with how the desktop is rendered; perhaps dithering of some type has always been in use on Linux and was only recently baked into Windows as well. OTOH there are several people on this forum who use Linux daily without issues; I'd be interested to know whether dithering is happening on those setups.

I'm no expert on Linux and don't know which part of the chain is causing the discomfort, whether it's the DE, the kernel, the driver or something else. The good thing about Linux is that we can chop and change and hopefully pin down the exact part of the system causing it, e.g. if I don't use a DE, just the command line, is dithering still present? If I roll back the kernel to version X, regardless of the distro, does that help? Again, I'm not an expert, but since we have that flexibility in Linux it should make getting to the 'point of failure' much easier. (A sketch of the no-DE test follows below.)
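
For the no-DE test, one minimal approach on systemd-based distros is to boot straight into a text console; this is only a sketch of the idea, not something I've verified helps:

    sudo systemctl set-default multi-user.target   # boot to a plain text console, no X/Wayland or DE
    sudo reboot
    # ...observe whether the discomfort persists at the console, then restore:
    sudo systemctl set-default graphical.target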

Could try posting on one of the techie freelance websites like PeoplePerHour; it would obviously cost a bit of money, but might get further than just asking nicely.


    I've been thinking about this for a while: if anyone can use certain versions of Linux comfortably but not others on the same hardware, here's the information that should be collected along with the "distribution version" (a one-shot helper is sketched after the list):

    1) Kernel version: Do uname -a in a console/terminal

    example:

    Linux linux 5.0.0-27-generic #28~18.04.1-Ubuntu SMP Thu Aug 22 03:00:32 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

    2) Driver "binded" the GPU: Do lspci -v and copy+paste the block of text that starts with VGA controller

    01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Venus XT [Radeon HD 8870M / R9 M270X/M370X] (rev 83) (prog-if 00 [VGA controller])
    	Subsystem: Apple Inc. Radeon R9 M370X Mac Edition
    	Flags: bus master, fast devsel, latency 0, IRQ 69
    	Memory at 80000000 (64-bit, prefetchable) [size=256M]
    	Memory at b0c00000 (64-bit, non-prefetchable) [size=256K]
    	I/O ports at 3000 [size=256]
    	Expansion ROM at b0c40000 [disabled] [size=128K]
    	Capabilities: <access denied>
    	Kernel driver in use: radeon
    	Kernel modules: radeon, amdgpu

    3) Driver "binded" to X.Org. This one's a bit harder, you need to know the location of the X.Org log file that can change from system to system: Do grep "AIGLX: Loaded and initialized" /var/log/Xorg.0.log (At least on my system)

    If all goes well you should get something like this

    [    19.230] (II) AIGLX: Loaded and initialized radeonsi
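
    To make this easier to paste into a thread, here's a small helper built from the three commands above; the grep context length and the fallback log path (used by rootless X sessions) are assumptions that may need adjusting:

    #!/bin/sh
    # Collect kernel, GPU driver and X.Org driver info in one go.
    uname -a
    lspci -v | grep -A 12 "VGA compatible controller"   # 12 lines of context is a guess
    grep "AIGLX: Loaded and initialized" /var/log/Xorg.0.log 2>/dev/null \
      || grep "AIGLX: Loaded and initialized" "$HOME/.local/share/xorg/Xorg.0.log"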

      Seagull They'd need to have the infrastructure in place to test for "our issue" and, more importantly, an understanding of it, its causes and its technical solutions, which isn't trivial to explain to most people.

      I've heard mixed things about such places myself.

      JTL Thanks. I would share my output, but my setup isn't comfortable to begin with.

      Has anybody had luck posting on popular distro forums for advice, e.g. Ubuntu/Mint? Although hardly anybody outside this site knows about dithering.

      The Arch community isn't known for being helpful (rtfm 🙂), but they are probably tech-savvy enough to look into it. It's not an Arch-specific issue, though; I'd be willing to bet Arch has the same dithering/rendering problems regardless of the DE used.

      2 months later

      Bumping this, as I posted on AskUbuntu for advice, to see whether a bug can be reported or any support is available. (I assume the xrandr off setting still doesn't work.)
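
      For anyone who wants to re-check that xrandr route: some drivers expose a per-output dithering property, though the property name and whether it actually does anything vary by driver; the output name DP-1 below is just a placeholder:

      xrandr --prop                                # list the properties your driver exposes
      xrandr --output DP-1 --set "dither" "off"    # e.g. radeon calls the property "dither"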

      It has already been reported as a bug before

      Intel - https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1776642
      Nvidia - https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1819023

      One comment I noticed on the Nvidia thread above

      "Also, I don't think "off" is a default that we should ever aim for. Upstream would probably reject that because for many types of monitor "off" will look worse.

      Did "static 2x2" work for you? I am assuming the problem was "auto" defaulting to "dynamic 2x2"."

      The thing I don't quite understand is that I'm running a 2019 monitor, yet connecting a 2009/2010 PC to it. Colours seem fine, I don't notice heavy banding. Why would 2019 Linux without dithering look "worse" than a 2009 OS?



        diop The thing I don't quite understand is that I'm running a 2019 monitor, yet connecting a 2009/2010 PC to it. Colours seem fine, I don't notice heavy banding. Why would 2019 Linux without dithering look "worse" than a 2009 OS?

        I suspect they could be aiming for the "lowest common denominator" of hardware and compensating accordingly.

        It's probably better to contain all the Linux discussions to this thread.

        I've created a bug with Ubuntu Launchpad here > https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1853694

        Latest reply:

        the flicker is a driver bug, test a newer mainline kernel build

        https://kernel.ubuntu.com/kernel-ppa/mainline/

        test with drm-tip, and if it still flickers, a bug should be filed upstream

        I have installed the latest mainline kernel successfully and selected it during startup.
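
        In case it helps anyone retrace this, installing a mainline build usually amounts to something like the sketch below; the version directory is only an example, so pick whatever is current on the mainline page linked above (the directory layout there changes occasionally):

        mkdir mainline && cd mainline
        # Download all amd64 .deb packages for one build (v5.4 is just an example):
        wget -r -l1 -nd -A '*amd64.deb' https://kernel.ubuntu.com/kernel-ppa/mainline/v5.4/
        sudo dpkg -i ./*.deb   # installs image, modules and headers
        sudo reboot            # then pick the new kernel in the GRUB menu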

        I have no experience with drm-tip or its usage, and I'm not the most tech-savvy. After googling how to install it, I get as far as cloning the git repository and installing dependencies, but on compiling I receive errors about 'missing ssh' or words to that effect. Is anybody here better versed in compiling from git and able to give any advice? I will give it another try from scratch.
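
        For reference, the rough sequence being attempted looks like the sketch below; the repository URL and the config/packaging steps are my assumptions about the usual Ubuntu kernel-build route, not verified instructions:

        git clone https://gitlab.freedesktop.org/drm/drm-tip.git
        cd drm-tip
        cp "/boot/config-$(uname -r)" .config          # start from the running kernel's config
        scripts/config --disable SYSTEM_TRUSTED_KEYS   # Ubuntu configs often need this to build
        make olddefconfig                              # accept defaults for new options
        make -j"$(nproc)" bindeb-pkg                   # produce installable .deb packages
        sudo dpkg -i ../linux-image-*.deb ../linux-headers-*.deb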

        FYI: for registered users, Launchpad allows marking an issue as "This bug affects you". The counter is shown at the top of the issue; right now it says "This bug affects 3 people".

        I want to stay positive, but it sounds like the person who replied "it's a driver bug" didn't understand the problem at all and is just reasoning along the lines of "the screen is flickering heavily, so let's update and see if it helps". Worth a try, sure.

        Maybe, to avoid another source of confusion, it would help to rename "dithering" to "temporal dithering", since dithering can also be static.


          KM Updated the original description to state temporal dithering.

          From a support/developer perspective I understand why they would require testing on the latest versions, as that's exactly what they are running and where any fix would land.


          While browsing the intel-gfx archives I found this post > https://lists.freedesktop.org/archives/dri-devel/2017-September/152651.html

          i915.enable_dithering allows to force dithering on all outputs on (=1) or off (=0). The default is -1 for current automatic per-pipe selection.

          This is useful for debugging and for special case scenarios, e.g., providing simulated 10 bpc output on 8 bpc digital sinks if a 10 bpc framebuffer + rendering is in use.

          A more flexible solution would be connector properties, like other drivers (radeon, amdgpu, nouveau) already provide. A global override via module parameter is useful even with such connector properties, e.g., for scientific applications which require strict control over dithering, to have an override for DE's which may not expose such properties via some standard protocol in a user-controllable way, e.g., afaik all currently existing Wayland compositors.

          This is from 2017 so I don't know if this was eventually implemented, or if the author is an Intel dev (I assume so).
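
          A quick way to find out whether that parameter ever landed in your kernel is to ask the module itself; the option name comes from the 2017 post above, and the modprobe.d snippet is the standard mechanism for setting such options (a sketch, not a confirmed fix):

          modinfo -p i915 | grep -i dither   # does the parameter exist on this kernel?
          # If it does, force dithering off and rebuild the initramfs (Ubuntu), then reboot:
          echo "options i915 enable_dithering=0" | sudo tee /etc/modprobe.d/i915-dithering.conf
          sudo update-initramfs -u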

          Another interesting read re: dithering/noisy output on Linux >4.2.

          https://patchwork.kernel.org/patch/6995211/

          It will need a bit of work to find this out when i'm back in the lab. So far
          i just know something bad is happening to the signal and i assume it's the
          dithering, because the visual error pattern of messiness looks like that
          caused by dithering. E.g., on a static framebuffer i see some repeating
          pattern over the screen, but the pattern changes with every OpenGL
          bufferswap, even if i swap to the same fb content, as if the swap triggers
          some change of the spatial dither pattern (assuming PIPECONF_DITHER_TYPE_SP
          = spatial dithering?)

          If that's the case we simply limit to only ever dither when the sink
          is 6bpc, and not in any other case.

          So my understanding is that dithering is used to simulate a higher colour range, e.g. make a 6-bit display look like 8-bit, make an 8-bit display simulate 10-bit, etc. Is this why Windows 10 looks ultra-saturated compared to W7? I have checked that the colour range is set to limited (my personal preference), but if dithering of any sort is enabled, is the driver trying to force a 10-bit colour 'effect' on my monitor?

          Temporal dithering is the rapid adjustment of a pixel's colour value. If we zoomed in on a single pixel and its value was jumping from darkest blue to lightest blue in quick succession, it would essentially be a strobe effect. Obviously jumps between closer values won't be as obvious, but it seems inevitable that they have some effect on the output (flicker, movement). Is the desktop without temporal dithering (spatial only) really that bad?

          If I bought a very expensive (10-bit) monitor tomorrow, I would be very unhappy that plugging it into any Mac would not show me as native a picture as possible, as it would be dithered regardless. Even beyond our symptoms, I think there is a use case for photographers, video editors etc. to be able to disable dithering.
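
          To make the strobe intuition concrete, here's a toy sketch (values invented for the example): an 8-bit panel that wants to show the in-between level 127.5 can alternate two adjacent levels on successive frames, so the time-average looks right but the pixel changes every frame:

          # Toy model of temporal dithering: alternate two adjacent 8-bit levels
          # frame by frame so they average to an in-between value (~127.5).
          for frame in 1 2 3 4 5 6; do
              if [ $((frame % 2)) -eq 0 ]; then
                  echo "frame $frame: pixel level 128"
              else
                  echo "frame $frame: pixel level 127"
              fi
          done
          # The eye averages this to ~127.5, but the pixel flickers on every frame.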

          13 days later

          Actually both my CCFL monitors (Dell U3011 and Dell U2711) are 8-bit panels, but they can simulate 10-bit via FRC.
          Disabling dithering doesn't help me, so I think FRC is being triggered somehow on modern OSes regardless of the 24-bit colour specified in the system. Is there any way to switch off FRC?
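
          One thing worth checking is what bit depth the monitor advertises in its EDID, since the driver may pick a deep-colour mode and leave the panel's FRC to make up the difference; this is only a sketch, and the connector name below is a placeholder (list /sys/class/drm to find yours):

          ls /sys/class/drm/                     # find your connector, e.g. card0-DP-1
          edid-decode < /sys/class/drm/card0-DP-1/edid | grep -i "bits per"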

          a month later

          If you have an ATI/AMD card you can join the investigation at https://gitlab.freedesktop.org/drm/amd/issues/977
          The current attempt there is to find the video card registers that could trigger the symptoms. In my case there are nearly 70 differences in register values between the good and the bad OS; most of them are not well documented, so it's a bit difficult to understand their meaning, and it's still not clear how to update the registers directly.
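
          For anyone joining in: whichever register-dump tool you end up using (see the issue above for what has been tried), comparing a good and a bad boot reduces to a text diff once you have two dumps; the file names here are placeholders:

          # Compare register dumps taken under the "good" and "bad" OS:
          diff <(sort regs-good.txt) <(sort regs-bad.txt) | less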

          6 days later

          I remember back in 2014 trying to use Ubuntu but feeling really dizzy looking at the screen (the font rendering drove me mad!!). I eventually discovered that you can change the font anti-aliasing settings, and I disabled them. That seemed to work fine for me.

          After all these years, I kept reinstalling/using Ubuntu on other machines and somehow came to realise that, at least lately, I've been forgetting to change those font anti-aliasing settings...

          So either they changed something (that for me turned out to be for the better) or I got used to them.. I don't know!

          MacOSX makes me tilt instantaneously, though! I can't stand their font rendering algorithm at all!

          22 days later

          I've written before that I tested several old ATI cards and a fairly recent Nvidia one; all of them cause similar symptoms.
          I also bought an AMD RX 570 yesterday to test with Linux, since it's supported by the new amdgpu driver.
          And after starting Linux I immediately got pressure in the temple area and tinnitus (ringing in the ears).
          Does anybody have similar symptoms on any card?
