JTL Yeah, RDP is much more intricate and more integrated into the OS, so I can see how that could create issues. Also, apps like NoMachine are set to use hardware encoding/decoding by default, so I disable that on both ends (in addition to checking the specifically-named "Disable post processing" option NoMachine has)

However, in my post I was only intending to refer to VNC clients with lossless transmission.

I installed NoMachine and it's working, but the image quality is bad. I'll work on the settings. Do you know of a similar package that allows connection via cable to eliminate any latency? Thx!

I do think software plays a critical role.

I use a Galaxy S22. It's AMOLED, so it does flicker. But it's (almost) completely fine for me. Basically a non-issue.

Yet both the MacBook Air and pro are big problems. And many recent windows laptops too. For me, I think it's the dithering or any other black magic they're using to display a billion colors. Wish it could be turned off.

I think the hypothesis is a good start, but needs to be filled out with more specifics. What are the plausible factors or differences between macOS and other operating systems? In the StillColor discussion, there was mention of the macOS graphics pipeline internally using high-color-depth (12 IIRC, or even higher) buffers, which inherently requires either clamping or dithering to display on a physical LCD. There were also discussions regarding font rendering, smoothing/blurriness, antialiasing (especially subpixel antialiasing, or lack thereof), etc., which distinguish the appearance of macOS from other systems. Even as far as general flickering or temporal dithering, some have argued that these "features" are simply harsher or more aggressive on macOS, especially on more recent Mac hardware.
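For readers unfamiliar with what temporal dithering (FRC) actually does, here is a minimal Python sketch of the idea (an illustrative model only, not anything taken from macOS or actual panel firmware): a panel limited to discrete levels alternates between the two nearest representable levels so that the time-average approximates the requested color, and that frame-to-frame alternation is the flicker being discussed.

```python
def frc_frames(target, bits=6, n_frames=60):
    """Show `target` (in [0, 1]) on a panel limited to 2**bits levels
    by alternating between the two nearest representable levels."""
    levels = (1 << bits) - 1
    scaled = target * levels
    base = int(scaled)                 # nearest level at or below target
    frac = scaled - base               # fraction of frames shown one level up
    lo = base / levels
    hi = min(base + 1, levels) / levels
    frames, acc = [], 0.0
    for _ in range(n_frames):
        acc += frac                    # simple error-accumulator scheduling
        if acc >= 1.0:
            acc -= 1.0
            frames.append(hi)
        else:
            frames.append(lo)
    return frames

frames = frc_frames(0.5, bits=6)       # 0.5 is not representable in 6 bits
avg = sum(frames) / len(frames)
# the time-average is ~0.5, even though no single frame shows 0.5
```

Real FRC hardware uses more elaborate spatial/temporal patterns, but the trade-off is the same: the average is right, at the cost of the pixel changing every frame.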

As far as Asahi Linux, I tried it when I owned an M2 TB MacBook Pro briefly, and I found it to not be an improvement over macOS. Possibly the issue was the reintroduction of temporal dithering (versus disabling it on macOS via StillColor), but I suspect the main trigger was the LCD backlight itself (KSF phosphor with a harsh red undertone).

    macsforme

    but needs to be filled out with more specifics

    Here we come up against the fact that macOS is a closed operating system, and we don't know many things about how it is structured internally.

    there was mention of the macOS graphics pipeline internally using high-color-depth (12 IIRC, or even higher) buffers

    This isn't something specific to macOS. In signal processing generally, it's often more effective to work with higher-precision data and reduce it to a lower bit depth only at the end of the process, which minimizes rounding errors during the intermediate calculations. Since most operating systems perform a lot of image processing, and GPUs handle much of this work, it makes sense for GPUs to process higher-precision data as well. The conversion to a lower bit depth typically happens in the final stages of the GPU's pipeline, and the algorithms used for it depend more on the GPU hardware than on the operating system itself (on the driver side it is possible to select among the algorithms the hardware offers, but in any case the processing itself is performed in hardware).

    I want to point out that the presence of such a conversion in certain hardware does not necessarily have a negative impact on the eyes. The impact on the eyes depends on the types of algorithms used in the process. Unfortunately, we lack specific knowledge about these algorithms to fully understand the whole picture.
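    To make the point about precision concrete, here is a small Python sketch (purely illustrative; the processing steps and bit depths are my own assumptions, not anything taken from an actual driver): quantizing to 8 bits after every processing step accumulates more rounding error than doing the math in floating point and quantizing once at the end of the pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits):
    """Round values in [0, 1] to the nearest representable level."""
    levels = (1 << bits) - 1
    return np.round(x * levels) / levels

pixels = rng.random(100_000)  # random source image

# Path A: an 8-bit buffer between every step (gamma, then gain).
a = quantize(pixels, 8)
a = quantize(a ** 2.2, 8)
a = quantize(a * 0.8, 8)

# Path B: keep full float precision, quantize once at the end.
b = quantize((pixels ** 2.2) * 0.8, 8)

reference = (pixels ** 2.2) * 0.8  # exact result
err_a = np.abs(a - reference).mean()
err_b = np.abs(b - reference).mean()
# err_b comes out smaller: late quantization loses less information
```

    The choice at the very end, clamping/rounding versus dithering down to the panel's bit depth, is exactly the step under discussion in this thread.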

    There were also discussions regarding font rendering, smoothing/blurriness, antialiasing (especially subpixel antialiasing, or lack thereof),

    Apple disabled subpixel font smoothing (also known as subpixel anti-aliasing) starting with macOS Mojave, which was released in 2018. The company shifted to grayscale font smoothing instead. This change was made in part because subpixel rendering is less effective on modern high-resolution Retina displays, which have a much higher pixel density compared to older screens.
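    A toy model of the difference between the two smoothing modes (my own simplification, not Apple's actual rasterizer): subpixel antialiasing gives each R, G, B channel its own coverage sample, exploiting the RGB-stripe layout for 3x horizontal edge resolution, while grayscale smoothing averages those samples and sets all three channels equal.

```python
# Edge coverage of a glyph sampled at 3x horizontal resolution
# (one sample per R, G, B subpixel; two pixels shown).
samples = [0.0, 0.5, 1.0, 1.0, 1.0, 0.5]

# Subpixel AA: each channel keeps its own sample (3x edge detail,
# at the cost of color fringing on non-RGB-stripe panels).
subpixel = [tuple(samples[i:i + 3]) for i in range(0, len(samples), 3)]

# Grayscale AA: average per pixel; all channels equal, less detail.
grayscale = [(sum(samples[i:i + 3]) / 3,) * 3
             for i in range(0, len(samples), 3)]

print(subpixel)   # [(0.0, 0.5, 1.0), (1.0, 1.0, 0.5)]
```

    On a Retina display the pixels are small enough that the lost horizontal detail is not visible, which fits the reasoning above for dropping the subpixel mode.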

    When choosing a HiDPI monitor, this problem is resolved. I’m currently away from home, using a 2017 iPad Pro (iPadOS 14) as a monitor (via Sidecar) for my MacBook. On the Retina display the font looks perfect, while on my low-DPI display there’s a slight blur. Subjectively, though, it has no effect on my eyes.

    I have a hypothesis that this might be a plus for 6-bit+FRC monitors. Because this might slightly reduce the activity of the display's FRC module, as complex smoothing algorithms can increase the load on pixel brightness management. At least in Linux, using Grayscale font smoothing (or completely disabling it) strains the eyes less when reading text on such monitors.

    I can’t quite recall any other major issues I’ve had with macOS on my M1, aside from the display challenges - which, for those with sensitive eyes, can be pretty frustrating in most cases. That said, I might be forgetting something important.

      WhisperingWind Apple disabled subpixel font smoothing (also known as subpixel anti-aliasing) starting with macOS Mojave, which was released in 2018. The company shifted to grayscale font smoothing instead. This change was made in part because subpixel rendering is less effective on modern high-resolution Retina displays, which have a much higher pixel density compared to older screens.

      I read about this when Mojave was released. I also noticed firsthand that text was much harder to read on standard-DPI monitors starting with Mojave, with one of the worst scenarios being trying to read a PDF document in Safari (which could have been a Safari issue, in theory). Going beyond my previous point (and I'm not sure where this started or if it was always the case), macOS text on standard-DPI monitors seems categorically fuzzy, and I've seen examples within Apple's own app UIs that make me suspect macOS text rendering has no concept of physical pixel geometry. I frequently saw single-pixel-wide text elements (for example, in the letters 'j,' 'i,' etc.) seemingly sitting between two pixels, such that two pixel columns were at 50% instead of a single pixel column at 100%. The default Terminal font and size were awfully blurry for this reason, and I had to increase the size (they also removed the ability to disable antialiasing in the Terminal in Monterey or maybe earlier, before restoring it in Ventura). The Time Machine preference pane in Monterey also had horribly blurry text for the "Back Up Automatically" (IIRC) label, to cite another specific example.

      As you pointed out, this is inconsequential on high-DPI screens. However, if I am correct about macOS pixel geometry unawareness for text rendering, then the macOS blurry text issue on standard-DPI screens goes beyond subpixel versus grayscale antialiasing methods.
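      The half-pixel stem effect described above can be sketched with simple box-filter coverage (a hypothetical model, since macOS's actual rasterizer isn't public): a one-pixel-wide stem aligned to the grid lights one column at 100%, while the same stem offset by half a pixel spreads across two columns at 50% each, halving peak contrast, which reads as blur.

```python
def render_stem(offset, width=1.0, n=6):
    """Rasterize a vertical stem of `width` pixels starting at
    `offset` onto n pixel columns using box-filter coverage."""
    left, right = offset, offset + width
    cols = []
    for i in range(n):
        # Overlap of the stem [left, right) with column [i, i+1).
        cols.append(max(0.0, min(right, i + 1) - max(left, i)))
    return cols

print(render_stem(2.0))  # [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]  (crisp)
print(render_stem(2.5))  # [0.0, 0.0, 0.5, 0.5, 0.0, 0.0]  (blurry)
```

      A renderer aware of pixel geometry would snap the stem to an integer position (hinting); a purely resolution-independent renderer would leave it at the fractional position, matching what I observed.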

      When choosing a HiDPI monitor, this problem is resolved.

      For this reason, when I upgraded to an Apple silicon MacBook Pro I tried practically every available high-DPI 1080p external monitor (LG 24UD58-B, LG 24MD4KL-B, and Dell P2415Q), and all were uncomfortable (this was prior to StillColor, so quite possibly the aggressive Apple silicon temporal dithering was to blame). The built-in MacBook display was equally or more horrible on the mini-LED 14-inch MacBook Pro, and was where all my vision problems started. I ultimately reverted to an Intel Mac with a standard-DPI screen, but my vision was never the same after that.

        macsforme

        Which monitor did you end up choosing?

        Over the past year, I’ve tested 8 monitors, but none of them worked for me: BenQ GL2480, BenQ GL2450HM, Samsung Odyssey G5 G55C S27CG550EI, ASUS ROG Strix XG259CMS, Philips 241V8L/01, Titan Army P27A2R, BenQ PD2705Q, Dell P2422H. The testing was done on Linux and macOS (Apple Silicon Mac, Intel Mac). For now, I’m sticking with my old BenQ GL2450 (TN, 6Bit+FRC), which for some unknown reason doesn’t cause discomfort. Interestingly, although the BenQ GL2480 and BenQ GL2450HM use exactly the same panel as my BenQ GL2450, they come with different scaler models.

          WhisperingWind I use a variety of older screens. My main laptop is a 2012 13-inch MacBook Pro, and for occasional gaming I use an Acer XB241h (AUO 120 Hz panel from 2012). Both of these I can tolerate for a few hours before feeling mild symptoms (nothing as harsh as with modern screens and/or Apple Silicon). At work, my main screen is, I believe, a Dell P2213f. So in summary, mostly screens with TN panels from the early 2010s.

          WhisperingWind BenQ GL2480 and BenQ GL2450HM use exactly the same panel as my BenQ GL2450, they come with different scaler models

          Have you measured flickering on the GL2480 / GL2450?

          Old capacitors can let voltage ripple through.

          Also, if a monitor is old, the blue peak of its LED backlight is extremely large (the main cause of this phenomenon is that the LED coating develops cracks due to heat, and the blue spectrum starts to emit directly into your eye/retina). I stopped using my "safe" old laptop, which has no dithering but an elevated blue peak, and my eyes became less sensitive. I also replaced all the LED room lamps (all PWM-free, sunlike or big-yellow 2800 K peak types) with halogens / incandescents

            simplex

            Have you measured flickering on the GL2480 / GL2450?

            I have a simple device for measuring flicker, and according to it, there were no issues.

            Also, if a monitor is old, the blue peak of its LED backlight is extremely large (the main cause of this phenomenon is that the LED coating develops cracks due to heat, and the blue spectrum starts to emit directly into your eye/retina).

            Most likely this was the problem, since those two monitors are from 2010, meaning they are very old.

            I want to try buying a 27" iMac from 2015 and use it as a thin client for working with other Macs/PCs. A long time ago, I had one like that at work, and I don't remember having any particular issues with it. It also has a non-wide gamut display.

            Does anyone have the kext from Amulet that disables dithering for AMD graphics cards on Mac?

              So after reading all the responses, I think we can agree there are no solid experiences that point to macOS as the primary cause of eye strain, although it plays a part by interfacing with the hardware. In more practical terms, attempting to solve software problems could lead to work without tangible results on the wrong hardware. Apple devices may not necessarily fall into this category due to their close software and hardware integration, but then the closed nature of the software is probably an issue. The question is: are there aspects of software that control hardware parameters directly linked to eye strain, regardless of whether those are accessible now or will be made accessible in the future? (I am excluding obvious UI-based elements like font smoothing, etc., on the assumption that there is something harsher at work, as some flicker/modulation tests in other posts have shown.)

                Donux

                Direct interaction with the equipment is hidden in modules whose source code is not openly available. This creates a problem and leads us to the need for reverse engineering (decompiling and studying the assembly code of these modules). As I understand it, Apple explicitly prohibits this in the EULA: https://developer.apple.com/forums/thread/687343.

                But even if it is possible to resolve the legality issues, without the source code, this is an incredibly labor-intensive task.

                  WhisperingWind Direct interaction with the equipment is hidden in modules whose source code is not openly available. This creates a problem and leads us to the need for reverse engineering (decompiling and studying the assembly code of these modules). As I understand it, Apple explicitly prohibits this in the EULA: https://developer.apple.com/forums/thread/687343.

                  In some jurisdictions, reverse engineering legitimately obtained products to investigate and correct issues is not illegal.

                  I think a more pressing concern is the possibility of complexity with isolating and investigating which component is potentially responsible for the "cause and effect" of a particular issue.

                  10 days later

                  WhisperingWind Does anyone have the kext from Amulet that disables dithering for AMD graphics cards on Mac?

                  It was linked on this forum a few times, but the links I found appear to be expired. The filename was AHKinject_SCN078.dmg. This post was a summary of all the info I could find about disabling dithering on Intel macOS, and it had a few links that might point you in the right direction. If you can figure out my email address, you can message me privately for more info.

                  Note that I could not get ahkinject.kext to work for me consistently, as it would only work for about 30 seconds before reverting to dithering (established via capture card). Someone also apparently had to patch the kernel extension for newer macOS versions, but I'm not sure whether the patched version is publicly available.

                  simplex Old capacitors can let voltage ripple through.

                  Also, if a monitor is old, the blue peak of its LED backlight is extremely large (the main cause of this phenomenon is that the LED coating develops cracks due to heat, and the blue spectrum starts to emit directly into your eye/retina). I stopped using my "safe" old laptop, which has no dithering but an elevated blue peak, and my eyes became less sensitive. I also replaced all the LED room lamps (all PWM-free, sunlike or big-yellow 2800 K peak types) with halogens / incandescents

                  simplex Using old/used phones is not a good idea:
                  ...

                  1. The screen is worn out, the blue peak becomes higher than it was originally, and the screen takes on a "cold" tint. There is no solution, because it is impossible to find a new screen; it is unknown what LEDs / spectrum the replacement screens are made with, and it is unknown which replacements will have FRC/dithering and which will not

                  I believe you established this via empirical spectrometer readings, so I tentatively agree that this happens but I have a few thoughts/discussion points:

                  1. Is there any more data available on how widespread this kind of LED backlight degradation is? Were these specifically computer monitors that were tested, or smartphone screens as well?
                  2. I presume you would draw a distinction between old (as in age) versus high usage hours? So someone who can find an older panel in new condition or with low hours could still reap the benefits.
                  3. What about blue light filtering glasses to mitigate the effect of higher blue light emission on older panels?
                  4. I have found that some heavily-worn screens do hurt my eyes; however, my impression was not that they turned blue, but rather yellow/brown. I attributed this discoloration to degradation of the film layers intended to transmit/diffuse the light, rather than degradation of the LEDs themselves, but I am now reconsidering this. I do find older displays to be bluer in general versus modern screens, but I attributed that to modern screens getting redder (especially wide-gamut and KSF phosphor panels), rather than the other way around.
                  5. For those who can tolerate CCFL backlights, I wonder if refurbishing CCFL screens is a viable option, since replacement CCFL tubes still seem readily available. As far as refurbishing LED-backlit screens, I've seen at least one video of a repair shop replacing individual LEDs for a screen backlight, but the technology is generally more miniature so I would be skeptical of whether this is generally practical for LED-backlit screens.

                  A program can certainly introduce (additional) dithering.

                  The new Chrome browsers on macOS

                  The new Transmission client.

                  And so on

                    sdkjbfakgljbafkjb

                    Which specific app versions and operating systems are causing issues? I will include them in the list for analysis.

                    Did you use StillColor or BetterDisplay to disable dithering?

                    What monitor are you using (8-bit, 6-bit+FRC, etc)?

                    What Mac do you have?

                      WhisperingWind

                      On macOS Catalina, there is no significant dithering in the OS itself (that I know of).

                      However, the newer Chrome versions (>= 120) and Transmission 4.0 really annoy me on it; they're very hard to look at. Chrome specifically has this weird "sleek" look where everything is very sharp and blacks are very black.

                      But it's fine; I just use older versions of them

                        sdkjbfakgljbafkjb

                        On macOS Catalina

                        Sorry, I can't perform the analysis on macOS Catalina because I have only an Apple Silicon Mac.

                        On an Apple Silicon Mac running Sonoma 14.1.2 and Chrome 133, there is no flickering (I use StillColor). Perhaps the browser's rendering engine has color settings that make the rendered result more contrasty. I have noticed that a high-contrast, sharp image strains my eyes, and perhaps the same is true in this case.

                          WhisperingWind

                          Yeah, it's fine; I just use older versions.

                          But basically, it seems even single windows are able to introduce temporal dithering on macOS (even when otherwise there is none) 😛

                          Adding this here because I'm positive a macOS version difference causes it on identical hardware. I'm still leaning toward "temporal dithering" being the cause, and I'm locked at Mojave on a 2013 MBP. Updating to Catalina caused it instantly, so I reverted. When that laptop died because the keyboard could no longer be repaired, I bought the same hardware model, which had been updated to a newer OS and caused strain issues. I used a factory reset and a Time Machine backup to boot into Mojave, and it was fine again. On this machine, for my eyes, Catalina and newer were the culprit.

                          Recently I accidentally updated the Brave browser, and somehow even on this OS/hardware the browser itself seems to have introduced dithering (newer Chromium?). Others have reported it being related to the hardware acceleration setting, but there might be a bug preventing it from stopping when I turn that off. Firefox is fine, so I've switched over.

                          It's a software/driver issue, or a complex web of factors, not simple hardware. Friends have suggested simply using an older monitor, but you can plug a 1920x1080 monitor from 2010 into a new Mac and it will be nauseating. It's a software/driver thing; I'm positive of that.

                          Similarly, Windows/Linux seem to be copying the Mac (or is it the GPU companies?). I'm unsure, but I get the same effect on an increasing number of non-Mac computers.

                          I hope to try higher-end monitors like BenQ and one of the nano-texture screens from Apple, as I've heard some have had success with them.

                            dev