This is my first post after following this forum for a while. I had severe fatigue issues after long hours of laptop use, and eventually I came up with a solution. Even though it may not apply to many people on this forum, I hope it is useful for some. I have been working on this issue for about 2 years, and there has been a noticeable difference in the last year. I will give a compressed version below.

It all started with noticeable fatigue by the evening of every workday. This began several years ago and progressively became worse, to the point where I made a firm decision to look for a solution. I went through all the information I could gather from various internet sources and also came across this forum.

I first started with PWM. The 2019 MacBook Pro (Intel) I still use has PWM, but in the >100 kHz range. I wasn't seeing any flickering, but I still installed Lunar Pro and reduced the brightness in software while keeping the actual display brightness at 100%. This effectively turned off PWM dimming, and over a period of a month I could feel a noticeable improvement in the fatigue situation. I also switched from Intel's built-in graphics to the AMD discrete card in the MacBook Pro, but I think the 100% brightness trick is what mainly did it. However, the fatigue issues were still far from completely gone. Then I started experimenting with WiFi and switched to an ethernet connection. This is where I saw the most dramatic improvement. After a couple of months, I can say things improved a lot. I had WiFi and Bluetooth turned off the whole time during this period.
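
For anyone curious how this kind of software dimming can work under the hood, here is a minimal sketch in Swift using the public CoreGraphics gamma API. It only illustrates the gamma-table technique I assume such tools use; I don't actually know what Lunar Pro does internally.

```swift
import CoreGraphics

// Dim the main display in software by scaling down the gamma ramp's maximum,
// leaving the hardware backlight (and therefore any PWM duty cycle) untouched.
// Illustrative sketch only, not Lunar Pro's actual implementation.
func softwareDim(to factor: Float) {
    let display = CGMainDisplayID()
    _ = CGSetDisplayTransferByFormula(display,
                                      0, factor, 1,   // red:   min, max, gamma
                                      0, factor, 1,   // green: min, max, gamma
                                      0, factor, 1)   // blue:  min, max, gamma
}

softwareDim(to: 0.6)   // roughly 60% output while the backlight stays at 100%
// The table resets when the process exits or the display reconfigures,
// so a real tool would re-apply it from a long-running helper.
```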

At this point, I was fine with leaving it like that, but I still wanted to see if things could be improved further. So I started using an old HP 22er monitor with the Mac in clamshell mode. This definitely gave me some relief, as the screen is big and overall less straining than working on the laptop screen. But the improvement from turning off WiFi still remained the most significant one so far. Based on the positive experience of using an external monitor, I decided to buy a good one and switch to it permanently.

After much research, I bought a 27" Dell UltraSharp UP2720Q, which has DC dimming, excellent colors, and gamma that can even rival an OLED display when viewed head-on. The monitor was excellent out of the box, and I eventually made some adjustments to the display settings: scaling set to pixel-to-pixel, response time turned off (normal to off), and the Adobe RGB color space used instead of sRGB. I set Adobe RGB on both the monitor and the Mac, and the colors are perfect; the Mac converts colors to Adobe RGB with no over- or under-saturation issues. This also increased the realism of images and video, because Adobe RGB uses a pure 2.2 gamma, as opposed to sRGB's piecewise curve that only approximates a mathematical 2.2. This is a 10-bit monitor and can easily handle the subtle gray variations, so I can use pure 2.2 gamma without issues. Turning off response time acceleration may have made a difference too, though it is nearly impossible to perceive. Also, since it's a 10-bit monitor and the AMD discrete card is sending 30-bit data, there is no more temporal or spatial dithering to worry about. The monitor supports Thunderbolt 3, so there are no DSC or compression-related issues to deal with either. The final result was really good: very calm whites, with none of the busyness I had been seeing before. Perceptual sharpness is very good too, as it's a professional-grade monitor.
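
To make the gamma point concrete, here is a small Swift sketch comparing the piecewise sRGB transfer function with a pure 2.2 power curve, using the standard published formulas. The values are only illustrative; the two curves diverge most near black, which is where the difference in perceived depth tends to show up.

```swift
import Foundation

// Pure power-law gamma, as used by Adobe RGB (1998): linear = V^2.2
func pureGamma22(_ v: Double) -> Double {
    pow(v, 2.2)
}

// Piecewise sRGB EOTF: a short linear segment near black, then a 2.4 power curve.
// Overall it only approximates a 2.2 gamma.
func sRGBToLinear(_ v: Double) -> Double {
    v <= 0.04045 ? v / 12.92 : pow((v + 0.055) / 1.055, 2.4)
}

// Compare the two curves at a few normalized signal levels.
for v in [0.02, 0.05, 0.10, 0.25, 0.50, 0.75, 1.00] {
    print(String(format: "V=%.2f  sRGB=%.5f  pure 2.2=%.5f", v, sRGBToLinear(v), pureGamma22(v)))
}
```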

Overall, an excellent experience, with the fatigue essentially gone. There is still some tiredness after sitting in front of the computer for a long stretch, but my evenings are back to the same quality as a weekend evening, when I don't have to sit in front of the computer for work.

I also keep the laptop physically away from me. I don't know if that makes any difference. The WiFi experience made me worry about a lot of other possibilities, so I keep it several feet away from me.

That's great to hear about the UP2720Q being good. I just wonder if it is truly 10-bit, since B&H Photo told me it's 8-bit + 2 FRC, and replacement panels online seem to be 8+2 as well. Hmm.

    Thanks for your feedback! Very helpful. I have a theory that the latest macOS determines when it is going to send 8-bit (with temporal/spatial/inverse dithering) and when to send 10-bit. They do this even with their Pro Display XDR, but it's still my theory, and I'm trying to prove it 🙂

      jordan

      So I did some research and this is what I could find.

      If you load up the User's Guide PDF of UP2720Q from Dell's website, under Monitor Specifications, they say this: "Color depth 1.07 billion colors (real 10 Bit)"

      Now if you go to Apple's website and load up the Pro Display XDR product page, there you can see: "With true 10-bit color, Pro Display XDR can produce more than a billion colors with extreme accuracy..."

      So I don't know which is the correct term to use here, "real" or "true", but neither of them says just "1.07 billion colors" or "billion colors", which can be achieved using dithering.

      Dell is a reputable manufacturer, and this is a monitor in their PremierColor series, which I think has only two models at the moment. So I don't see a reason not to believe them when they say "real 10 Bit" in brackets.

      Also, suspiciously, the UP2720Q is still the last one in the series, even after 5 years. My theory is that they probably used Panasonic's professional panel, and Panasonic stopped manufacturing Pro LCD panels around 2021 or so. I couldn't confirm this anywhere else, though, so it is just a theory at this point. Another thing is the superior uniformity that different reviewers praised it for; even the Pro Display XDR has much worse gray uniformity. So Dell must have sourced a really good panel for the UP2720Q. Another notable aspect is the color accuracy I am seeing when I calibrate it using its built-in colorimeter. When I changed the response time from the default "normal" to "off", the color accuracy improved significantly, and the Delta E 94 came in below 0.1 or 0.2 for most colors, except gray, which stayed a bit higher but under 1. If we can trust the built-in colorimeter, that is excellent accuracy, in the range of high-end Eizo monitors. After seeing all this, my only explanation is that it is indeed a 10-bit panel with superior accuracy. Also, the gamma is excellent, and all the levels are controlled via LUTs. So my theory is that Panasonic, precision maniacs that they are, most likely manufactured this panel.

        madmozg

        So this is what I can see in Mac:

        System Information -> Hardware -> Graphics/Displays

        Framebuffer Depth: 30-Bit Color (ARGB2101010)

        So the Mac detected the external display as 10-bit capable and is sending out 10-bit data. Whether that 10-bit is true 10-bit, or 10-bit generated by the graphics card from dithered 8-bit source data, is something I'm not sure about. But again, I don't see a reason not to trust Apple in this case. I'm not using the built-in Intel graphics; mine is a 16" MacBook Pro with a discrete card, which they included for people using the laptop professionally. And macOS is 10-bit capable.

        Another interesting experiment with this monitor was connecting it to an ASUS Zenbook S 13 OLED before updating its BIOS to enable USB4 support: I was getting 8-bit banding all over the place. I then updated the BIOS, enabled USB4, and the banding went away, leaving a display as smooth as on the Mac. So another factor here could be bandwidth. With TB3 capable of 40 Gb/s, I guess the Mac checks for sufficient bandwidth and for the graphics card driving the display, and decides what bit depth to use accordingly.
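
        Here is a rough back-of-the-envelope calculation to go with the bandwidth idea. It ignores blanking intervals and DisplayPort link overhead, so the real requirements are somewhat higher, but it shows why 4K60 at 10 bits per channel still fits over Thunderbolt 3 without DSC, while a link stuck in an older or limited mode might force a drop to 8-bit.

        ```swift
        // Rough uncompressed bandwidth for the UP2720Q's native 4K60 mode.
        // Blanking and link overhead are ignored, so treat these as lower bounds.
        let width = 3840.0, height = 2160.0, refresh = 60.0

        func videoGbps(bitsPerChannel: Double) -> Double {
            let bitsPerPixel = bitsPerChannel * 3.0   // R, G, B
            return width * height * refresh * bitsPerPixel / 1e9
        }

        print(videoGbps(bitsPerChannel: 8))    // ~11.9 Gb/s for 8-bit
        print(videoGbps(bitsPerChannel: 10))   // ~14.9 Gb/s for 10-bit
        // Thunderbolt 3 tunnels DisplayPort HBR3 (~25.9 Gb/s of video payload),
        // so 4K60 10-bit fits without compression; an older or limited link may not.
        ```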

        mmn I did see that, but they don't mention whether it's a true 10-bit panel or not. I am hoping it is, and with what you all said about accuracy, that's amazing! I just have trust issues myself, especially after ViewSonic advertised true 10-bit too, but when asked over email for clarification they said it's an 8+2 FRC panel. Would you ever consider checking it for dithering? I'd even buy you a cheap microflip scope to test for it, if you're in the USA lol.

          jordan

          After you mentioned the panel, I checked the user guide again. Interestingly, the "real 10 Bit" remark is under Monitor Specifications -> Flat Panel Specifications. I hadn't paid attention to that before.

          As for dithering testing, let me see what I can do. I do have a 100x jewelry magnifier through which I can see the pixels, but recording through it on my phone is not easy. I will post here if I manage to get a video clip somehow. By the way, polarity inversion is always happening on anything using liquid crystal; otherwise the panel will not last. So all the pixels are flipping in a symmetric fashion all the time. I haven't checked which inversion scheme it uses (there are websites to check that, I think), but it is definitely using one, or the LCD won't last. So at a micro level, LCD pixels are always dancing, if you can put it that way.
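
          To picture that constant "dancing", here is a toy model of the common dot-inversion scheme, where each subpixel's drive polarity flips every frame and neighbouring subpixels are driven with opposite polarity. This is a generic illustration, not a claim about which inversion scheme this particular panel uses.

          ```swift
          // Toy model of LCD dot inversion: polarity alternates in a checkerboard pattern
          // and the whole pattern flips every frame, so pixels "dance" even with no dithering.
          func polarity(row: Int, col: Int, frame: Int) -> String {
              (row + col + frame) % 2 == 0 ? "+" : "-"
          }

          for frame in 0..<2 {
              print("frame \(frame):")
              for row in 0..<4 {
                  print((0..<8).map { polarity(row: row, col: $0, frame: frame) }.joined(separator: " "))
              }
          }
          ```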

          As for the dithering concerns, here is my thought on it. The graphics device is as important as the monitor itself, and this is unfortunately overlooked. Integrated graphics may be able to operate in a true internal 10-bit mode, but there is no way to guarantee that is the case all the time. In the case of macOS, it is Apple who works with Intel, configures the driver, and prioritizes it for different scenarios. For integrated graphics, power efficiency can be much more important than accuracy and performance. As I mentioned in the original post, along with the PWM experiment I also switched back and forth between graphics devices and compared the results. Fortunately, in my case that didn't make much of a difference compared to turning PWM on or off (via the 100% brightness trick). But I did see some weird things going on. For example, I could see a slight overall shift of the edges of small bright shapes (like tiny icons/buttons) on gray or dark gray backgrounds. It is very subtle, but noticeable if you pay enough attention. In reality, that shouldn't happen at all: if the graphics device were sending pixel-for-pixel data, as we assume it is, why should anything shift when switching from integrated to discrete graphics? The truth is, the Intel UHD is doing more than just sending pixel-for-pixel data. I don't know what it is doing, but I don't want it when sending data to a 10-bit capable monitor.

          jordan

          There you go: https://youtu.be/SN1Z0wTs-5U

          Format: 240 FPS HD (recorded on iPhone 11)

          Subject: Head of the letter "r" (upside down, as the microscope does that); medium dark gray text on a very light gray background.

          Graphics device: AMD Radeon Pro 5300M 4 GB

          Computer: MacBook Pro 16-inch 2019

          OS: macOS Sonoma 14.7.1

          Also, I am not able to switch to the built-in Intel graphics while connected to this monitor; it sticks to the AMD card all the time.

            Yesterday I applied nvram boot-args="dither=0 enableDither=0" from the Terminal utility in Recovery mode, and there are some interesting artifacts noticeable now.

            When watching videos on YouTube, or when viewing images in general, there is a slightly smoothed-over look, very similar to a very light noise reduction being applied. Everything feels wax-coated or extra smooth. But the difference is subtle, only noticeable if you pay attention to it.

            Text and other non-media content look basically the same. The busyness of white backgrounds may or may not have improved; it is very difficult to tell. The difference is only subtly noticeable in media content at this point.

            I checked the pixel shimmering at 240fps again and it is there, nothing has changed. But there is indeed something different after I made that nvram setting change, that's for sure.

            There is no color banding before or after the change, on both UP2720Q and the built-in screen. Everything was driven by the discrete AMD card all the time though.

            This led me to a few conclusions, and it also calls into question some of the concepts being discussed in this forum in general.

            The main conclusion is that using a microscope and high-speed recording may not be the right way to detect dithering on LCD-based displays after all. An LCD has to switch polarity every other frame, and that will overshadow any additional pixel switching introduced by dithering. We won't be able to visually distinguish these two switching patterns.

            Another assumption: if a panel is listed as 8+2 FRC, that is the panel spec, meaning the panel circuitry is supposed to take care of the 2-bit FRC part, and to whatever is sending it data, it looks like a 10-bit panel. I didn't see any 8-bit banding on the built-in screen after I applied the nvram settings. So I suspect the panel still received 10-bit data and did all the FRC internally, using its own circuitry.

            So that raises the big question of whether dithering was actually disabled. I don't think it was. No Mac laptop panel is natively 10-bit capable, and I don't see any 8-bit banding whatsoever. I checked a lot of different media, images and video, and found no banding. So dithering is still happening somewhere.

            But I did see some new artifacts after the nvram setting change, mainly that subtle de-noised look, and this is where things get really intriguing. We know dithering is also used in audio, not just video. Dithering in audio is the mixing in of noise to avoid quantization errors, making low-bit-depth audio free of audibly detectable artifacts. Now translate this to video: image compression like JPEG works through quantization, taking advantage of how the human visual system works. So this is my theory: there could be something hidden going on that is similar to adding noise in audio, to make the quantization artifacts in video or JPEG-compressed images less noticeable to the eye.
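
            To show what I mean by borrowing the audio idea, here is a tiny sketch that quantizes a smooth ramp to a handful of levels, once as-is and once with a little random noise added before quantization. Purely illustrative, of course; I'm not claiming this is what any GPU driver actually does.

            ```swift
            import Foundation

            // Quantize a smooth 0...1 ramp to very few levels, with and without dither.
            // Without dither you get hard steps (banding); with dither the steps become fine noise.
            let levels = 8.0
            let ramp = (0..<32).map { Double($0) / 31.0 }

            func quantize(_ v: Double) -> Double {
                (v * (levels - 1)).rounded() / (levels - 1)
            }

            let banded = ramp.map { quantize($0) }
            let dithered = ramp.map { v -> Double in
                let noisy = v + Double.random(in: -0.5...0.5) / (levels - 1)
                return quantize(min(1, max(0, noisy)))
            }

            print(banded.map { String(format: "%.2f", $0) }.joined(separator: " "))
            print(dithered.map { String(format: "%.2f", $0) }.joined(separator: " "))
            // Averaged over space or time, the dithered version tracks the original ramp
            // much more closely, at the cost of visible (or sub-visible) noise.
            ```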

            So there may be some entirely different objective behind adding dithering at the source via the graphics card. We are mostly stuck on the idea of color-depth management when we hear about dithering, but that part could be handled by the panel circuitry itself, while the graphics card does much more, working hard to apply different methods of reducing perceivable quantization. This can also be a big deal for companies like Apple: making sure their laptops are not displaying wax-coated content while others display clear and crisp content.

            Now how does all this play out when it comes to health? I don't have many answers, but I do have some theories. If the basic method here is to add random noise to make compression's quantization errors less perceivable, then we are being presented with a lot more junk visual data than there is in the original source. And here is the important part of the theory: we may be presented with far more junk than our brains need to process. That could be a bit overwhelming for some.

              mmn thanks for checking. Did you disable font smoothing? I think that will cause dithering around text

                mmn there is a slightly smoothed-over look, very similar to a very light noise reduction being applied. Everything feels wax-coated or extra smooth

                Same feeling I got when I compared the Z390D chipset to the Z690P, and Intel Iris Xe to the AMD 780M.

                mmn We won't be able to visually distinguish these two switching patterns.

                Yes, I called it a "multi-layer pie", where the final layer, driven by the monitor, masks all the previous ones.

                mmn So I suspect the panel still received 10-bit data

                I suppose 8-bit+FRC panels accept either 10-bit or 8-bit input. With 10-bit input they downscale it to 8-bit and then dither using FRC to produce a smooth look; if the panel gets pure 8-bit input, it smooths via FRC without the 10-to-8-bit downscaling. And when users report that 8-bit input to a "10-bit" panel looks less straining than 10-bit input, it means the monitor's T-con downscaling method causes extra dithering before the FRC is applied.

                  jordan

                  No, I haven't made any other adjustments to fonts. The main difference is that I don't have problems with dithering myself, so I have no way to speak from experience about its effect.

                  simplex

                  when users report that 8-bit input to a "10-bit" panel looks less straining than 10-bit input

                  This is also very interesting. My speculation is that when Apple says 1 billion colors, they are not completely discarding the extra 2 bits of the source 10-bit data when doing the inevitable downscaling to 8 bits for an 8-bit native panel; I guess they capture that lost 2-bit information in the 2-bit FRC part. This could be a smart algorithm in the panel circuitry of a panel advertised as 10-bit: basically splitting the 10-bit data into 8-bit spatial + 2-bit temporal. Maybe that adds its own complexity, and some users may be perceiving it.
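
                  As a toy illustration of that 8-bit spatial + 2-bit temporal split (a generic model of 2-bit FRC, not anything I know about Apple's or Dell's actual circuitry), the sketch below uses the 2 low bits of a 10-bit value to decide on how many frames out of four the panel shows the next 8-bit step up, so the time average lands on the 10-bit level.

                  ```swift
                  // Toy 2-bit temporal FRC: approximate a 10-bit level on an 8-bit panel by
                  // alternating between two adjacent 8-bit levels over a 4-frame cycle.
                  func frcFrames(for value10: Int) -> [Int] {
                      let base = value10 >> 2        // top 8 bits: lower of the two 8-bit levels
                      let frac = value10 & 0b11      // bottom 2 bits: frames (out of 4) shown at base + 1
                      return (0..<4).map { $0 < frac ? min(base + 1, 255) : base }
                  }

                  // 10-bit level 513 sits a quarter step above 8-bit level 128:
                  let frames = frcFrames(for: 513)                 // [129, 128, 128, 128]
                  let average = Double(frames.reduce(0, +)) / 4.0  // 128.25 == 513 / 4
                  print(frames, average)
                  // A real panel would also spread this pattern spatially to avoid visible flicker.
                  ```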

                  But another interesting thing: many times in this forum I have seen people confirm the effect of Stillcolor, and no one has reported 8-bit banding as a side effect. So I think we can assume they are not getting rid of dithering entirely, only whatever dithering is introduced by the graphics unit. And surprisingly, that alone gives a very positive outcome. That is the intriguing part, I guess.

                  Simply put, we pay a premium for a high-quality, feature-rich product that we expect to do that extra fine processing. But in reality, clever methods are employed in these products that use our own brains to do a major part of that visual processing. That is the irony!

                  dev