Currently on Xubuntu 16.04 on a struggling 2006 Toshiba Satellite with a CCFL display, without eyestrain, because I have not found anything to upgrade to that doesn't cause symptoms. I used 14.04 and several previous Mint versions in the past without problems.

I have found no "modern"/LED-backlit device that any of those, or any other distro/DE/OS, is comfortable on since this problem began for me 2+ years ago. I have tried every tip posted, including editing config files for driver settings etc. that others swore solved their problem. Nothing ever helped. I first noticed it when I used a Lenovo IdeaPad for a few weeks, live-booting Solus/Budgie. Prior to that I was using the Toshiba primarily. In 2017/18 I did extensive testing with a ThinkPad X1 Carbon 3 (Intel HD 5500) off eBay, and the default LED/TN display was painful on any distro, even in the BIOS/TTY. I tried 5 different panels in it (no PWM, color profiles, etc.) to no effect. Same story with an X1 Carbon 2 (Intel HD 4400) I tried, and I returned a few new devices of all price ranges from Best Buy with Intel and AMD in various combos, including Ryzen-only laptops. I have not had a desktop since this began, so I cannot speak to various GPUs/displays/connections.

The only LED-backlit devices I used comfortably before this began were 2009 and 2011 MacBook Pros with OS X Lion/Mountain Lion, which had LED/TN displays (one glossy, one matte) and which I sold before this problem started (though trying a used 2011 model, ostensibly with the same display and specs, on the current macOS in a store was immediately miserable); an iPad 2 that is maxed out at iOS 9 and still doesn't hurt; and a Nokia Lumia 635 Windows Phone. I have a Moto G4 Play with LineageOS 14.1 on it but don't use it enough to say whether it's a problem.

Basically I have no idea where the problem lies, as the only common denominator is "new hardware always sucks". Some old hardware sucks. Some LED is OK and some sucks. Some IPS is OK and some sucks. No characteristic, including supposedly "safe" software, is always good or always bad without exception...including CCFL.

kammerer It would be useful if we had my "capture card testing" setup with some "results" from various GPUs and operating system/driver versions. That would be much more concrete evidence than "My eyes hurt with Ubuntu 18.04 and I'm not sure why".

    8 days later

    kammerer After I obtain the equipment needed (unknown timeframe), run the tests, and compile the results.

      JTL

      Are you sure you need an actual capture card?

      I have a TV in my living room which both my housemates stream to via a Steam Link to play video games and watch TV. One housemate's computer uses a GTX 1060, which hurts my eyes both on his monitor and when he streams to the living room. I can use the living room TV fine when streaming to it from my PC with its GTX 660. The Steam Link uses Ethernet.

      Now, if the Steam Link gets hold of the video output after it's been dithered, then capturing dithered output must be possible, at least for Nvidia cards.

        Seagull Interesting, so you're saying that when using a known-good setup via Steam Link, the output is fine regardless of the display?

        I guess the Shadow PC I rented was using a similar technology, which would explain why I could see the dithering and feel the effects. Clearly the effect is happening at the software level if it also happens when streamed. When you think about it, an HDMI output is 1s and 0s, so technically a VM/stream could display output identical to that of an actual card.

        Can you record Steam Link sessions, or is there an Nvidia direct-capture option?

          diop

          My precise setup:

          GTX 660, BenQ BL2405 upstairs.
          Steam Link, BenQ BL2405 in the living room.

          No strain on either, unless my housemate is using the Steam Link with his 1060.


            Seagull So one thing I'm interested to know next is whether your good and bad streams can be captured losslessly with those dithering effects still present. I guess this is exactly what @JTL is working on right now. A quick Google turns up some people mentioning using NDI with OBS? I don't know what either of those terms means 😉 but it's probably a big thing in the streaming community.

            The fact that one stream is good and one is bad is promising, though. If clips demonstrating a good and a bad setup can be saved in some video format that retains the dithering etc., that is hard evidence that can help us with future purchases and also to harass Nvidia with 🙂.

            I suspect it's some sort of PCoIP being used, just like my Shadow PC. Some blurb I found online is below.

            "PC-over-IP (PCoIP) technology delivers a secure, high-definition and highly responsive computing experience. It uses advanced display compression to provide end users with on-premises or cloud-based virtual machines as a convenient alternative to local computers. This virtual workspace architecture compresses, encrypts and transmits only pixels to a broad range of software clients, mobile clients, thin clients and stateless PCoIP Zero Clients, providing a highly secure enterprise environment.

            From a user's perspective, there is no difference between working with a local computer loaded with software and an endpoint receiving a streamed pixel representation from a centralized virtual computer.

            Because the PCoIP protocol transfers only display information in the form of pixels, no business information ever leaves your cloud or data center."

              diop

              I'm actually trying this out today, using OBS software to capture my screen and some software I threw together to see if the pixels are changing at all from frame to frame. It probably won't work, though; I have no idea whether OBS is getting the video before or after it's been dithered. It at least shows no dither using my GTX 660. I'll make a post if I get any decent results when I try a bad card.
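
              For anyone wanting to try the same thing, here's a minimal sketch of that kind of frame-to-frame comparison, assuming the OBS capture is saved with a lossless encoder (a lossy codec would add its own pixel noise and mask the result). The file name and the OpenCV/NumPy dependencies are my assumptions, not what was actually used:

              # Hypothetical frame-diff check: counts pixels that change between
              # consecutive frames of a losslessly captured screen recording.
              # On a static desktop, any nonzero count hints at temporal dithering.
              import cv2
              import numpy as np

              cap = cv2.VideoCapture("capture.avi")  # hypothetical file name
              ok, prev = cap.read()
              frame_idx = 0
              while ok:
                  ok, frame = cap.read()
                  if not ok:
                      break
                  frame_idx += 1
                  # Per-pixel absolute difference against the previous frame.
                  diff = cv2.absdiff(frame, prev)
                  # A pixel counts as "changed" if any B/G/R channel differs.
                  changed = np.count_nonzero(diff.any(axis=2))
                  if changed:
                      print(f"frame {frame_idx}: {changed} pixels changed")
                  prev = frame
              cap.release()

              Run against a recording of a completely static image, it should print nothing at all if there is no dithering and the capture really is lossless.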

                Seagull Good luck. I think the dots are starting to connect here.

                PCoIP = streaming technology that transfers only the pixel information
                Dithering = rapid value adjustment of nearby pixels over time to present the 'illusion' of a different color

                If a 'bad' system is using temporal dithering and I am streamed the exact same output using PCoIP, it will undoubtedly produce the same symptoms as if I were in front of the real machine.

                So it's all about the pixels?
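
                As a toy illustration of what temporal dithering means at the pixel level (my own sketch, not taken from any driver): a panel that can only show whole 8-bit values fakes an in-between shade by alternating neighbouring values every frame, so the time-average lands on the target.

                # Toy model of temporal dithering: alternate 100 and 101 on
                # successive frames to fake the undisplayable shade 100.5.
                frames = [100, 101] * 4            # pixel value over 8 frames
                print(frames)                      # [100, 101, 100, 101, ...]
                print(sum(frames) / len(frames))   # 100.5 -- the faked shade

                That per-frame flicker is exactly the kind of thing a lossless capture or a pixel-accurate stream would carry along verbatim.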


                  diop So it's all about the pixels?

                  It does SEEM to add evidence to the dithering theory, but as we know, nothing is ever as simple as it seems. As soon as @JTL gets set up, this should be a very quick answer, as he would be able to see if dithering was there and was translating through.

                  Seagull Are you sure you need an actual capture card?

                  It's the most accurate way to test, because it's entirely possible for other image-capture methods to interfere with the image in other ways (compression, chroma subsampling, etc.).

                  It might be worth reverse-engineering the Steam Link to see how it captures the image.

                    JTL

                    Have you found a capture card that won't suffer from the same problems? I was thinking of the Blackmagic Intensity Pro 4K.

                    I was unable to detect any dithering via OBS capture and couldn't get NDI to work, but I may try again at some point.

                    JTL It might be worth reverse-engineering the Steam Link to see how it captures the image.

                    I think VMware also uses PCoIP as an option, but I doubt there's any free software using it.

                    If there's a way to reverse-engineer the Steam Link to at least show a difference in the pixel data sent in a good stream versus a bad one (without even looking at the output), then dithering still very much needs to be considered as the issue; at the least, there would be hard data pointing towards that theory.

                      2 months later

                      diop Just to bump this.

                      As Nouveau is open source and so is Linux, is there anybody who can be contacted about removing the rendering/dithering issues in Linux? Do the issues vary from DE to DE and distro to distro?

                      We're not getting anywhere right now waiting for a kext to appear for OS X or a driver/OS fix for Windows, but as Linux is largely community-driven and open source, can't a techie solve our problems in 10 minutes?

                      It's not clear whom we should ping about these problems; it seems like the X.Org developers. In any case, we should collect our histories in one place to add weight to the problem, because sending individual messages to the freedesktop mailing list doesn't work...

                      Last time I also tried FreeBSD, and I got the same symptoms with it.
                      FreeBSD has a live forum, and I posted a message there two weeks ago (https://forums.freebsd.org/threads/hidden-dithering-or-broken-synchronization-in-graphics-stack-that-causes-eye-strain.72165/), but there are no answers yet. You can also join that thread to draw attention to the problem.

                      On the FreeBSD forum there was also an interesting thread on a related problem from 2014, with the following message:
                      "I just realized the 'working' modes that I marked green in my first post have something in common. They don't use hardware acceleration. Do you know of any differences in the video signals when hardware acceleration is used? Even the smallest difference might matter, so please reply if you know something."

                      I've tried to get in touch with its author, but without any success.


                        kammerer I've tested many distros over the years and I've never been able to use Linux comfortably. Even going back to the mid-2000s, when I was testing Red Hat, Knoppix live CDs and early Ubuntu releases, I still noticed the issues we have now. I remember at the time (2006 or so) thinking that Linux made me feel a little weird/eye-strained, so I never switched and stuck with XP. At the time Linux was nowhere near as user-friendly as it is now, so I didn't feel motivated to dig deeper and stayed with Windows. There is definitely something going on with how the desktop is rendered, or dithering of some type has always been in use in Linux and was perhaps only recently baked into Windows. OTOH there are several people on this forum who use Linux daily without issues; I'd be interested to know if dithering is happening on those setups.

                        I'm no expert on Linux and don't know what part of the chain is causing the discomfort, whether it's the DE, the kernel, the driver or anything else. The good thing about Linux is that we can chop and change and hopefully see what exact part of the system is causing it, e.g. if I don't use a DE, just the command line, will dithering still be present? If I roll back the kernel to version X, regardless of the distro, will that work? Again, I'm not an expert, but as we have that flexibility in Linux, it should help us get to the 'point of failure' much more easily.

                        We could try posting on one of the techie freelance websites like PeoplePerHour; it's obviously going to cost a bit of money, but it might get further than just asking nicely.


                          I've been thinking about this for a while, but if anyone can use certain versions of Linux comfortably but not others on the same hardware, here's some information that should be collected along with the distribution version (a small script that gathers all three items is sketched after the examples):

                          1) Kernel version: run uname -a in a console/terminal.

                          example:

                          Linux linux 5.0.0-27-generic #28~18.04.1-Ubuntu SMP Thu Aug 22 03:00:32 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

                          2) Driver "binded" the GPU: Do lspci -v and copy+paste the block of text that starts with VGA controller

                          01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Venus XT [Radeon HD 8870M / R9 M270X/M370X] (rev 83) (prog-if 00 [VGA controller])
                          	Subsystem: Apple Inc. Radeon R9 M370X Mac Edition
                          	Flags: bus master, fast devsel, latency 0, IRQ 69
                          	Memory at 80000000 (64-bit, prefetchable) [size=256M]
                          	Memory at b0c00000 (64-bit, non-prefetchable) [size=256K]
                          	I/O ports at 3000 [size=256]
                          	Expansion ROM at b0c40000 [disabled] [size=128K]
                          	Capabilities: <access denied>
                          	Kernel driver in use: radeon
                          	Kernel modules: radeon, amdgpu

                          3) Driver "binded" to X.Org. This one's a bit harder, you need to know the location of the X.Org log file that can change from system to system: Do grep "AIGLX: Loaded and initialized" /var/log/Xorg.0.log (At least on my system)

                          If all goes well you should get something like this

                          [    19.230] (II) AIGLX: Loaded and initialized radeonsi
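
                          If collecting those by hand is a chore, here's a rough helper (my own sketch, not a tested tool) that runs the three commands above and prints the results in one go; the Xorg log path is the same assumption as in step 3:

                          # Sketch: gather kernel, GPU driver, and X.Org driver info.
                          import subprocess

                          def run(cmd):
                              # Run a shell command and return its output as text.
                              return subprocess.run(cmd, shell=True,
                                                    capture_output=True, text=True).stdout

                          print(run("uname -a"))
                          # Print the lspci block for the VGA controller, through the
                          # blank line that ends it.
                          print(run("lspci -v | awk '/VGA compatible controller/,/^$/'"))
                          # Log path may differ from system to system, as noted above.
                          print(run('grep "AIGLX: Loaded and initialized" /var/log/Xorg.0.log'))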

                            Seagull They'd need to have the infrastructure in place to test with and, more importantly, an understanding of "our issue", its causes and technical solutions, which isn't trivial to explain to most people.

                            I've heard mixed things about such places myself.
