Linux Graphics Stack
Are you sure you need an actual capture card?
I have a TV in my living room which both of my housemates stream to via a Steam Link to play video games and watch TV. One of them has a computer with a GTX 1060, which hurts my eyes both on his own monitor and when he streams to the living room. I can use the living room TV fine when streaming to it from my PC with its GTX 660. The Steam Link uses Ethernet.
Now, if the Steam Link can get hold of the video output after it's been dithered, capturing it must be possible, at least for Nvidia cards.
Seagull Interesting, so you're saying that when using a known good setup via Steam Link, the output is fine regardless of the display?
I guess the Shadow PC I rented was using a similar technology, which would explain why I could see the dithering and feel the effects. Clearly the effect is happening at the software level if it also happens when streamed. When you think about it, an HDMI output is just 1s and 0s, so technically a VM/stream could display output identical to that of an actual card.
Can you record Steam Link sessions, or is there an Nvidia direct capture?
Seagull So one thing I'm interested to know next is whether your good and bad streams can be captured losslessly with those dithering effects still present. I guess this is exactly what @JTL is working on right now. A quick Google search turns up some people mentioning using NDI with OBS? I don't know what either of those terms are, but it's probably a big thing in the streaming community.
The fact that one stream is good and one is bad is promising, though. If clips demonstrating a good and a bad setup can be saved into some video format that retains the dithering etc., that is hard evidence that can help us with future purchases and also to harass Nvidia with.
I suspect it's some sort of PCoIP being used, just like my Shadow PC. Some blurb I found online is below.
"PC-over-IP (PCoIP) technology delivers a secure, high-definition and highly responsive computing experience. It uses advanced display compression to provide end users with on-premises or cloud-based virtual machines as a convenient alternative to local computers. This virtual workspace architecture compresses, encrypts and transmits only pixels to a broad range of software clients, mobile clients, thin clients and stateless PCoIP Zero Clients, providing a highly secure enterprise environment.
From a user's perspective, there is no difference between working with a local computer loaded with software and an endpoint receiving a streamed pixel representation from a centralized virtual computer.
Because the PCoIP protocol transfers only display information in the form of pixels, no business information ever leaves your cloud or data center."
I'm actually trying this out today, using OBS to capture my screen and some software I threw together to see if the pixels are changing at all from frame to frame. It probably won't work though; I have no idea if OBS is getting the video before or after it's been dithered. It at least shows no dither using my GTX 660. I'll make a post if I get any decent results when I try a bad card.
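For anyone who wants to try something similar, here's a minimal sketch (not the exact tool mentioned above) for checking whether captured frames of a static screen actually differ pixel-by-pixel. It assumes the frames have been exported from the recording as lossless PNGs named frame_0000.png, frame_0001.png and so on; that naming and export step is my assumption, not anything OBS does by itself.

import glob
import numpy as np
from PIL import Image

# Compare each captured frame against the previous one and report how many
# pixels changed; on a static desktop, counts that keep flickering from frame
# to frame are what temporal dithering would look like at the capture point.
frames = sorted(glob.glob("frame_*.png"))
prev = np.asarray(Image.open(frames[0]).convert("RGB"), dtype=np.int16)
for path in frames[1:]:
    cur = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    diff = np.abs(cur - prev)
    changed = int(np.count_nonzero(diff.max(axis=2)))
    print(f"{path}: {changed} changed pixels, max delta {int(diff.max())}")
    prev = cur

Same caveat as above, of course: if the capture happens before the driver applies dithering, a clean result proves nothing.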
Seagull Good luck. I think the dots are starting to join here.
PCoIP = streaming technology which transfers the pixel information only
Dithering = the movement/rapid value adjustment of nearby pixels to present the 'illusion' of a different color
If a 'bad' system is using temporal dithering and I am streamed the exact same output using PCoIP, it will undoubtedly produce the same symptoms as if I were in front of the real machine.
So it's all about the pixels?
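To make the 'illusion' part concrete, here's a toy illustration (my own sketch, not any vendor's actual algorithm) of temporal dithering: the driver alternates two nearby representable values from frame to frame so your eye averages them into a value the link or panel can't actually carry.

# Toy example only: alternate two representable 8-bit values so the
# time-average approximates an in-between value the hardware can't show.
target = 129.5          # value the GPU wants you to perceive
low, high = 129, 130    # nearest values that can actually be sent
frames = [high if i % 2 == 0 else low for i in range(8)]
print("per-frame values:", frames)
print("perceived (time average):", sum(frames) / len(frames))

The key point for this thread: the pixel is never static. If a protocol like PCoIP or the Steam Link really transfers those per-frame pixel values verbatim, the flicker arrives at the remote display exactly as it would locally.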
Seagull Are you sure you need an actual capture card?
It's the most accurate way to test, because it's entirely possible for other image capture methods to interfere with the image in other ways (compression, chroma subsampling, etc.).
It might be worth reverse engineering steamlink to see how it captures the image.
JTL It might be worth reverse engineering steamlink to see how it captures the image.
I think VMware also uses PCoIP as an option, but I doubt there's any free software using it.
If there's a way to reverse-engineer the Steam Link to at least show a difference in the pixel data sent in a good vs. bad stream (without even looking at the output), then dithering very much remains a candidate for the issue; at least there would be hard data leaning towards that theory.
diop Just to bump this.
As Nouveau is open source and so is Linux, is there anybody who can be contacted about removing the rendering/dithering issues in Linux? Do the issues vary from DE to DE and distro to distro?
We're not getting anywhere right now waiting for a kext to appear for OS X or a driver/OS fix for Windows, but as Linux is largely community-driven and open source, can't a techie solve our problems in 10 minutes?
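One concrete thing anyone with a Linux box could try: the open-source X drivers reportedly expose dithering as RandR output properties. This is a rough sketch under that assumption; the property names ("dithering mode", "dithering depth") and the output name are driver-dependent examples, so check what xrandr --prop actually lists on your system first.

import subprocess

# List any RandR output properties that mention dithering (names vary by driver).
props = subprocess.run(["xrandr", "--prop"], capture_output=True, text=True).stdout
for line in props.splitlines():
    if "dither" in line.lower():
        print(line.strip())

# If a "dithering mode" property shows up for, say, output DP-1, you could try
# turning it off with: xrandr --output DP-1 --set "dithering mode" "off"
# (DP-1 and the property name here are examples, not guaranteed to exist.)

If toggling such a property changes the symptoms, that would be strong evidence for where in the stack the problem lives.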
It's not clear whom we should ping about these problems. It seems like the X.Org developers. In any case we should collect our stories in one place to add weight to our cause, because sending isolated messages to the freedesktop mailing list doesn't work...
I also recently tried a FreeBSD system and got the same symptoms with it.
FreeBSD has an active forum and I posted a message there two weeks ago (https://forums.freebsd.org/threads/hidden-dithering-or-broken-synchronization-in-graphics-stack-that-causes-eye-strain.72165/) but there are no answers yet. You can also join that thread to draw attention to the problem.
The FreeBSD forum also had an interesting thread on a related problem from 2014, with the following message:
I just realized the "working" modes that I marked green in my first post have something in common. They don't use hardware acceleration. Do you know of any differences in the video signals when hardware acceleration is used? Even the smallest difference might matter, so please reply if you know something.
I've tried to get in touch with its author, but without any success.
kammerer I've tested many distros over the years and I've never been able to use Linux comfortably. Even going back to the mid 00s, when I was testing Red Hat, Knoppix Live and early Ubuntu releases, I still noticed the issues we have now. I remember at the time (2006 or so) thinking that Linux made me feel a little weird / gave me eye strain, so I never switched and stuck with XP. At the time Linux was nowhere near as user-friendly as it is now, so I didn't feel motivated to dig deep and stuck with Windows. There is definitely something going on with how the desktop is rendered, or dithering of some type has always been in use with Linux and was perhaps only recently baked into Windows. OTOH there are several people on this forum who use Linux daily without issues; I'd be interested to know if dithering is happening on those setups.
I'm no expert on Linux and don't know what part of the chain is causing the discomfort, whether it's the DE, kernel, driver or anything else. The good thing about Linux is we can chop and change and hopefully see which exact part of the system is causing it, e.g. if I don't use a DE, just the command line, will dithering still be present? If I roll back the kernel to version X, regardless of the distro, will that work? Again, I'm not an expert, but as we have that flexibility in Linux it should help us get to the 'point of failure' much more easily.
We could try posting on one of the techie freelance websites like PeoplePerHour; it's obviously going to cost a bit of money but might get us further than just asking nicely.
I've been thinking about this for a while, but if anyone can use certain versions of Linux comfortably but not others on the same hardware, here's some information that should be collected along with the distribution version:
1) Kernel version: run uname -a in a console/terminal. Example:
Linux linux 5.0.0-27-generic #28~18.04.1-Ubuntu SMP Thu Aug 22 03:00:32 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
2) Driver "binded" the GPU: Do lspci -v
and copy+paste the block of text that starts with VGA controller
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Venus XT [Radeon HD 8870M / R9 M270X/M370X] (rev 83) (prog-if 00 [VGA controller])
Subsystem: Apple Inc. Radeon R9 M370X Mac Edition
Flags: bus master, fast devsel, latency 0, IRQ 69
Memory at 80000000 (64-bit, prefetchable) [size=256M]
Memory at b0c00000 (64-bit, non-prefetchable) [size=256K]
I/O ports at 3000 [size=256]
Expansion ROM at b0c40000 [disabled] [size=128K]
Capabilities: <access denied>
Kernel driver in use: radeon
Kernel modules: radeon, amdgpu
3) Driver "binded" to X.Org. This one's a bit harder, you need to know the location of the X.Org log file that can change from system to system: Do grep "AIGLX: Loaded and initialized" /var/log/Xorg.0.log
(At least on my system)
If all goes well you should get something like this
[ 19.230] (II) AIGLX: Loaded and initialized radeonsi
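If it helps, here's a rough helper script (my own sketch, not a standard tool) that gathers all three pieces of information above in one go, so reports land in a consistent format. It assumes the X.Org log lives at /var/log/Xorg.0.log, which, as noted, varies between systems.

import subprocess

def run(cmd):
    # Run a shell command and return its trimmed output.
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout.strip()

print("## Kernel")
print(run("uname -a"))
print("\n## GPU and kernel driver")
print(run("lspci -v | grep -A 12 'VGA compatible controller'"))
print("\n## X.Org DRI driver")
print(run("grep 'AIGLX: Loaded and initialized' /var/log/Xorg.0.log"))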
JTL Thanks. I would share my output but mine isn't comfortable to begin with.
Has anybody had luck posting on popular distro forums (e.g. Ubuntu/Mint) for advice? Although hardly anybody outside this site knows about dithering.
The Arch community isn't known for being helpful (RTFM), but they are probably tech-savvy enough to look into it. It's not an Arch-specific issue, though; I'd be willing to bet Arch has the same dithering/rendering issues regardless of the DE used.