Linux Graphics Stack
- Edited
JTL It might be worth reverse engineering steamlink to see how it captures the image.
I think VMWare also uses PCoIP as an option but I doubt there's any free software using it.
If there's a way to reverse-engineer Steam Link to at least show a difference in the pixel data sent in a good vs. bad stream (without even looking at the output), then dithering still very much needs to be considered as the issue; at least there would be hard data pointing towards that theory.
diop Just to bump this.
As Nouveau is open source and so is Linux, is there anybody who can be contacted to remove the rendering/dithering issues in Linux? Do the issues vary from DE to DE and distro to distro?
We're not getting anywhere right now waiting for a kext to appear for OS X or a driver/OS fix for Windows, but as Linux is largely community-driven and open source, can't a techie solve our problems in 10 minutes?
It's not clear whom we should ping about these problems. Probably the X.Org developers. In any case, we should collect our histories in one place to add weight to our cause, because sending isolated messages to the freedesktop mailing list doesn't work...
I also recently tried a FreeBSD system and got the same symptoms with it.
FreeBSD has an active forum and I posted a message there two weeks ago (https://forums.freebsd.org/threads/hidden-dithering-or-broken-synchronization-in-graphics-stack-that-causes-eye-strain.72165/) but there are no answers yet. You can also join that thread to draw more attention to the problem.
The FreeBSD forum also had an interesting thread from 2014 on a related problem, with the following message:
I just realized the "working" modes that I marked green in my first post have something in common. They don't use hardware acceleration. Do you know of any differences in the video signals when hardware acceleration is used? Even the smallest difference might matter, so please reply if you know something.
I've tried to get in touch with its author, but without any success.
- Edited
kammerer I've tested many distros over the years and I've never been able to use Linux comfortably. Even going back to the mid 00's when I was testing Redhat, Knoppix Live and early Ubuntu releases I still noticed the issues we have now. I remember at the time (2006 or so) thinking Linux makes me feel a little weird/eye strain, so I never switched and stuck with XP. At the time Linux was nowhere near as user-friendly as it is now so I didn't feel motivated to dig deep and stuck with Windows. There is definitely something going on with how the desktop is rendered, or dithering of some type has always been in use with Linux and perhaps only recently baked into Windows. OTOH there are several on this forum that use Linux daily without issues, I'd be interested to know if dithering is happening on those setups.
I'm no expert on Linux and don't know what part of the chain is causing the discomfort, whether it's the DE, kernel, driver or anything else. The good thing about Linux is we can chop and change and hopefully see what exact part of the system is causing it, e.g. if I don't use a DE, just the command line, will dithering still be present? If I roll back the kernel to version X, regardless of the distro, will that work? Again I'm not an expert, but as we have that flexibility in Linux it should help to get to the 'point of failure' much more easily.
Could try posting on one of the techie freelance websites like PeoplePerHour; obviously going to cost a bit of money but might get further than just asking nicely.
- Edited
I've been thinking about this for a while, but if anyone can use certain versions of Linux comfortably and not others on the same hardware, here's some information that should be collected along with the "distribution version":
1) Kernel version: Do uname -a
in a console/terminal
example:
Linux linux 5.0.0-27-generic #28~18.04.1-Ubuntu SMP Thu Aug 22 03:00:32 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
2) Driver bound to the GPU: Run lspci -v
and copy+paste the block of text that starts with "VGA compatible controller"
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Venus XT [Radeon HD 8870M / R9 M270X/M370X] (rev 83) (prog-if 00 [VGA controller])
Subsystem: Apple Inc. Radeon R9 M370X Mac Edition
Flags: bus master, fast devsel, latency 0, IRQ 69
Memory at 80000000 (64-bit, prefetchable) [size=256M]
Memory at b0c00000 (64-bit, non-prefetchable) [size=256K]
I/O ports at 3000 [size=256]
Expansion ROM at b0c40000 [disabled] [size=128K]
Capabilities: <access denied>
Kernel driver in use: radeon
Kernel modules: radeon, amdgpu
3) Driver bound to X.Org. This one's a bit harder; you need to know the location of the X.Org log file, which can change from system to system: Run grep "AIGLX: Loaded and initialized" /var/log/Xorg.0.log
(At least on my system)
If all goes well you should get something like this
[ 19.230] (II) AIGLX: Loaded and initialized radeonsi
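One more thing that could be worth collecting: some KMS drivers (radeon, amdgpu, nouveau) expose a dithering property per connector via RandR, so it may be useful to record whether such a property exists and what it is set to. A rough sketch of how to check; the property name "dither" and the connector name "DP-1" are just examples, use whatever xrandr --prop actually lists on your system:
# List all output properties; on radeon/amdgpu look for "dither",
# on nouveau for "dithering mode" / "dithering depth"
xrandr --prop | grep -i dither
# If a property is listed, try forcing it off for one output
# (substitute your own connector and property names)
xrandr --output DP-1 --set "dither" "off"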
- Edited
JTL Thanks. I would share my output but mine isn't comfortable to begin with.
Has anybody had luck posting on popular distro forums for advice, e.g. Ubuntu/Mint? Although hardly anybody knows about the dithering issue outside of this site.
The Arch community isn't known for being helpful (RTFM) but they are probably tech-savvy enough to look into it. It's not an Arch-specific issue, though I'd be willing to bet Arch has the same dithering/rendering issues regardless of the DE used.
- Edited
Bumping this as I posted on AskUbuntu for advice on whether a bug can be reported or any support obtained. (I assume the xrandr off setting still doesn't work.)
It has already been reported as a bug before
Intel - https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1776642
Nvidia - https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1819023
One comment I noticed on the Nvidia thread above
"Also, I don't think "off" is a default that we should ever aim for. Upstream would probably reject that because for many types of monitor "off" will look worse.
Did "static 2x2" work for you? I am assuming the problem was "auto" defaulting to "dynamic 2x2"."
The thing I don't quite understand is that I'm running a 2019 monitor, yet connecting a 2009/2010 PC to it. Colours seem fine, I don't notice heavy banding. Why would 2019 Linux without dithering look "worse" than a 2009 OS?
diop The thing I don't quite understand is that I'm running a 2019 monitor, yet connecting a 2009/2010 PC to it. Colours seem fine, I don't notice heavy banding. Why would 2019 Linux without dithering look "worse" than a 2009 OS?
I suspect they could be aiming for the "lowest common denominator" of hardware and compensating accordingly.
- Edited
It's probably better to contain all the Linux discussions to this thread.
I've created a bug with Ubuntu Launchpad here > https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1853694
Latest reply:
the flicker is a driver bug, test a newer mainline kernel build
https://kernel.ubuntu.com/kernel-ppa/mainline/
test with drm-tip, and if it still flickers, a bug should be filed upstream
I have installed the latest mainline kernel successfully and selected it during startup.
I have no experience with drm-tip or its usage, and I'm not the most tech-savvy. After googling how to install it, I get as far as cloning the git repository and installing dependencies; however, upon compiling I receive errors about 'missing ssh' or words to that effect. Anybody here better versed in compiling from git who can give any advice? I will give it another try from scratch.
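In case it helps, an error about 'missing ssh' when cloning usually means an ssh:// or git@ clone URL was used; the anonymous HTTPS URL needs no keys. Below is a rough sketch of a drm-tip build on Ubuntu. The clone URL and package names are my assumptions from memory, so please check the current drm-tip and Ubuntu kernel-build documentation before relying on them:
# Build dependencies for producing Debian-packaged kernels
sudo apt-get install build-essential bc bison flex libssl-dev libelf-dev libncurses-dev fakeroot dpkg-dev
# Clone over anonymous HTTPS (an ssh:// URL is what triggers "missing ssh" style errors)
git clone https://anongit.freedesktop.org/git/drm-tip.git
cd drm-tip
cp /boot/config-"$(uname -r)" .config            # start from the running kernel's configuration
scripts/config --disable SYSTEM_TRUSTED_KEYS     # Ubuntu configs reference Canonical certs not present in this tree
make olddefconfig                                # accept defaults for any new options
make -j"$(nproc)" bindeb-pkg                     # produces installable linux-image/linux-headers .debs in ..
sudo dpkg -i ../linux-image-*.deb ../linux-headers-*.deb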
FYI: for registered users, Launchpad allows marking an issue as "This bug affects you". This option can be found at the top of the issue. Right now it says "This bug affects 3 people".
I want to stay positive, but it sounds like the one who replied "it's a driver bug" didn't understand the problem at all and is just thinking along the lines of "the screen is flickering heavily, so let's update and see if it helps". Worth a try, sure.
Maybe to avoid another source of confusion it would help to rename "dithering" to "temporal dithering" since dithering can also be static.
I've also reported the issue to the freedesktop drm/amd tracker: https://gitlab.freedesktop.org/drm/amd/issues/977
The bug reports will get more attention if more users are affected.
If you (member or lurker) find the latest Ubuntu distribution gives you strain/discomfort please sign up to Launchpad and add yourself to the number of users affected.
Bug Link > https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1853694
Also see @kammerer 's link above. https://gitlab.freedesktop.org/drm/amd/issues/977
- Edited
While browsing the intel-gfx archives I found this post > https://lists.freedesktop.org/archives/dri-devel/2017-September/152651.html
i915.enable_dithering allows to force dithering on all outputs on (=1) or off (=0). The default is -1 for current automatic per-pipe selection.
This is useful for debugging and for special case scenarios, e.g., providing simulated 10 bpc output on 8 bpc digital sinks if a 10 bpc framebuffer + rendering is in use.
A more flexible solution would be connector properties, like other drivers (radeon, amdgpu, nouveau) already provide. A global override via module parameter is useful even with such connector properties, e.g., for scientific applications which require strict control over dithering, to have an override for DE's which may not expose such properties via some standard protocol in a user-controllable way, e.g., afaik all currently existing Wayland compositors.
This is from 2017 so I don't know if this was eventually implemented, or if the author is an Intel dev (I assume so).
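For what it's worth, you can check whether the kernel you are running actually has that parameter before assuming anything; since it looks like it was only a proposed patch, it may well not exist in mainline builds. A sketch, assuming the parameter kept the name from that post:
# List i915 module parameters and look for a dithering knob
modinfo -p i915 | grep -i dither
ls /sys/module/i915/parameters/ | grep -i dither
# If it exists, it could be forced off via a modprobe option (Debian/Ubuntu shown)
echo "options i915 enable_dithering=0" | sudo tee /etc/modprobe.d/i915-dithering.conf
sudo update-initramfs -u    # rebuild the initramfs so the option is applied at boot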
- Edited
Another interesting read re: dithering/noisy output on Linux >4.2.
https://patchwork.kernel.org/patch/6995211/
It will need a bit of work to find this out when i'm back in the lab. So far
i just know something bad is happening to the signal and i assume it's the
dithering, because the visual error pattern of messiness looks like that
caused by dithering. E.g., on a static framebuffer i see some repeating
pattern over the screen, but the pattern changes with every OpenGL
bufferswap, even if i swap to the same fb content, as if the swap triggers
some change of the spatial dither pattern (assuming PIPECONF_DITHER_TYPE_SP
= spatial dithering?)

If that's the case we simply limit to only ever dither when the sink is 6bpc, and not in any other case.
So my understanding is that dithering is used to simulate a higher colour range, e.g. make a 6-bit display look like 8-bit, make an 8-bit display simulate 10-bit, etc. Is this why Windows 10 looks ultra-saturated compared to W7? I have checked that the colour range is set to limited (my personal preference), but if dithering of any sort is enabled, is the driver trying to force a 10-bit colour 'effect' on my monitor?

Dithering is the rapid adjustment of a pixel's colour value. If we zoomed in on a single pixel and its value was going from darkest blue to lightest blue in quick succession, it would essentially be a strobe effect. Obviously a jump between closer values won't be as noticeable, but it is inevitable that it has some effect on the output (flicker, movement). Is the desktop without temporal dithering (i.e. spatial only) really that bad?

If I bought a very expensive 10-bit monitor tomorrow, I would be very unhappy that plugging it into any Mac would not show me as native a picture as possible, as it would be dithered regardless. Even beyond our symptoms, I think there is a use case for photographers, video editors etc. to disable dithering.
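To make the temporal part concrete, here is a toy sketch (nothing to do with actual driver code, and the target value is just an example) of how flipping between the two nearest representable levels on successive frames averages out to an in-between value:
# Toy example: show a "10-bit" level of 102.25 on an 8-bit panel by flipping
# the pixel between 102 and 103 over time so the frames average to 102.25.
awk 'BEGIN {
    target = 102.25; err = 0
    for (frame = 1; frame <= 8; frame++) {
        out = (err + target >= 102.5) ? 103 : 102   # pick the nearer 8-bit level
        err += target - out                          # carry the rounding error to the next frame
        printf "frame %d: pixel = %d\n", frame, out
    }
}'
# Every fourth frame the pixel jumps to 103; that per-frame flip is exactly the
# kind of change a purely spatial (static) dither pattern would not have.
On a real panel that flip happens at the refresh rate, which is why people here describe it as a per-pixel strobe.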