aiaf This is remarkable work! So much easier to decipher than Lagom.

👍 glad it's working!

aiaf What's interesting is that on my MBP M3 Max (P3-1600 nits preset at max HW and SW brightness), with Stillcolor on (enableDither = No), I can see that the 9-bit bands are clearly twice as wide as the 10-bit bands. So this confirms at least a "native" 10-bit image, likely dithered by the TCON. However, when I turn dithering back on, the banding in the 11-bit section becomes a lot smoother, and zoomed in, the bands are half the width of those in the 10-bit section.

I think this gives some credence to the double dithering approach, or DCP dithering -> TCON reverse temporal dithering as I will clarify later. This would be the case if these MBPs do not use true 10-bit panels.

From what I understand, this would also happen if the panel is native 10-bit. With enableDither=false, the display would receive a non-dithered, truncated or rounded 10-bit signal, and not dither. On the test ladder image, you would then see new color divisions up to a maximum of the 10-bit gradient level, natively displayed. Then, with enableDither=true, the display still receives a 10-bit signal, but it has now been color dithered by the GPU or CPU from the original 16-bit value. Because the GPU and CPU have access to the original 16 bits, the dithering now pushes us higher up the gradient ladder. You now see the 11-bit level, simulated.

Theoretically, how much higher a bit depth you can simulate will increase with a larger screen area of the same color, and with a higher refresh rate of the display. And also, counter-intuitively, the slower the response time of the liquid crystal, the higher the dithered color depth. In that case, the inertia of the liquid crystal can average between two input levels. And Apple has anecdotally been said to use that strategy.
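
To put rough numbers on that (just a sketch with illustrative values, not anything we actually know about Apple's pipeline), here is the arithmetic for a level that sits between two 10-bit codes, and for what averaging over more frames buys you:

```swift
// An 11-bit level that falls exactly between two adjacent 10-bit codes.
let lower = 519.0, upper = 520.0
let source = 519.5 / 1023.0                  // stand-in for the original high-precision value

// enableDither = No: the link carries the same nearest 10-bit code every frame,
// so this level collapses into its neighbour and the extra band never appears.
let truncated = (source * 1023.0).rounded()  // a single code, identical every frame

// enableDither = Yes: alternate the two codes so the time-average lands on 519.5,
// i.e. the 11-bit level, even though no single frame can carry it.
let twoFrameAverage = (lower + upper) / 2.0  // 519.5

// Averaging over 4 frames allows quarter steps between codes (~2 extra bits).
let fourFrameLevels = (0...4).map { k in
    (Double(k) * upper + Double(4 - k) * lower) / 4.0
}
print(truncated, twoFrameAverage, fourFrameLevels)   // [519.0, 519.25, 519.5, 519.75, 520.0]
```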

It gets complicated trying to determine whether two systems are color dithering, given that if one system has already color dithered, the second system in the pipeline should have no effect on the signal at all. And if one system color dithers the original 16-bit value, it is not possible to further increase the bit depth of the result, no matter how many layers of dithering are applied. As I see it, the algorithm behind color dithering is all or nothing.

What's interesting to me is that you are seeing a change in bit depth with Stillcolor. I don't see that on my M1 MacBook Pro 16". Or with my external OLED 10-bit (maybe 8-bit+FRC) display. The highest gradient ladder level with new divisions remains at the 10-bit level for both displays, with or without Stillcolor.

    Blooey a native 10-bit panel makes a lot of sense given our observations. To be clear, I had to crank my screen's brightness up to the absolute maximum and zoom in really close to see regular divisions in the 11-bit section (maybe a 4x wider gradient would help), but we're still not 100% sure of what's happening under the hood of these opaque systems.

    Double dithering produces a shoddy image full of artifacts, as I observed in one 6-bit+FRC display (also noted by @DisplaysShouldNotBeTVs), which is not the case on these built-in displays.

    I want you to read this damning Apple patent filed in 2011. Not to present this patent as evidence that these systems are in any shipping products in 2024, but it clearly states that they may employ GPU dithering as a method of reducing storage and transmission bandwidth requirements (e.g. 10-bit to 8-bit). The TCON, having received this dithered, lower bit depth image from the GPU, may then employ reverse temporal dithering techniques to restore bit depth (to a degree even greater than the source image, e.g. 4 bits to 12 bits), and may additionally employ other techniques to reduce artifacts.

    Certainly a lot to think about.

      Your idea about double dithering could be right. At least, when I measure "bad" Win11 devices with the Opple 4, it shows a very "dirty" signal (I mean, at 100% of max brightness it shows a lot of chaotic movement in the oscillogram, but still no PWM). I only get the same "dirty" signal on good devices at around 70-80% of max brightness. A dirty signal is full of dips and peak pulsations, whereas good devices with Win10 1809 show smooth graphs at 60+ brightness.

      And a second thing: the brightness level on bad laptops changes non-linearly (not sure if that helps).

      Blooey Thanks for that test image. On my MBA M2 15-inch, toggling dither on and off produces the most noticeable effect at the 9th bit level, which suggests to me (if I'm understanding it right) that the device has an 8-bit panel.

        aiaf The scheme described in that patent is interesting, but it also seems really strange to me. It's been two decades since I took signal processing in grad school, but isn't it true that you can't inverse dither in the general case? It would violate the Shannon/Nyquist theorem. If the TCON is doing something like 4 bit to 12 bit "inverse dither", it's generating essentially temporal garbage.

        aiaf Double dithering produces a shoddy image full of artifacts, as I observed in one 6-bit+FRC display.

        Good point. I should have been more clear that once a color is dithered down to a particular destination bit depth, you will not see any change to the image when you repeat the dithering algorithm on the result using that same destination bit depth, or a higher destination bit depth. The algorithm is one and done once color values are centered on whole integers.

        However, it would not be a good idea to first dither down to an intermediate destination bit depth, only to then re-dither to an even lower destination bit depth. I could absolutely see that producing much worse results than one single pass to the final bit depth.
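
        To put the "one and done" point in code, here is a minimal sketch using a plain 2x2 Bayer threshold (illustrative only; not whatever kernel the DCP or TCON actually uses):

        ```swift
        /// Quantize a normalized value to `bits`, using a position-dependent threshold.
        func orderedDither(_ v: Double, bits: Int, x: Int, y: Int) -> Double {
            let thresholds: [[Double]] = [[0.125, 0.625],
                                          [0.875, 0.375]]   // centered 2x2 Bayer pattern
            let maxCode = Double((1 << bits) - 1)
            let code = (v * maxCode + thresholds[y % 2][x % 2]).rounded(.down)
            return min(code, maxCode) / maxCode
        }

        let source = 0.123456789                 // stand-in for a high-precision source level

        // First pass: dither the source down to 10 bits over a 2x2 block of pixels.
        let block = [(0, 0), (1, 0), (0, 1), (1, 1)]
        let firstPass = block.map { p in orderedDither(source, bits: 10, x: p.0, y: p.1) }

        // Second pass at the same destination depth: nothing changes, because every value
        // already sits exactly on a whole 10-bit code, and a threshold smaller than one
        // code can never push it across a boundary.
        let secondPass = block.indices.map { i in
            orderedDither(firstPass[i], bits: 10, x: block[i].0, y: block[i].1)
        }

        print(firstPass == secondPass)           // true

        // A second pass to a *lower* depth, by contrast, only sees the 10-bit codes, not
        // the original value, so at best it matches a single pass from the source.
        ```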

          DannyD2 Cool, sounds like it’s working!

          I’m wondering how you would describe the effect that you are seeing at the 9-bit level? It should just be that the 9-bit level does not split every color band of the previous 8-bit level in two, like every earlier level did.

          If the level is not supported, you won’t see the previous level’s bands split exactly in half down the middle. You may see the bands shift left or right depending on how the system truncates the remaining bits, or the bands may be identical to the previous level. In those cases, that bit depth is not supported. The only thing that should appear at a supported level is an exact split of the previous level’s bands, right down the middle. At least, that is how I intended the image to behave.
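
          In code form, the idea behind the ladder is roughly this (a simplified sketch, not the exact script behind the posted image):

          ```swift
          // Each row sweeps the same narrow gradient, but snapped to the codes available at
          // that bit depth. A genuinely supported level therefore shows every band of the
          // previous row split in half; the width and span values are illustrative.
          let width = 1024                        // pixels across the gradient
          let gradientSpan = 16.0 / 255.0         // a narrow sweep so individual bands stay visible

          for bits in 8...11 {
              let maxCode = Double((1 << bits) - 1)
              let codes = (0..<width).map { x -> Int in
                  let v = Double(x) / Double(width - 1) * gradientSpan
                  return Int((v * maxCode).rounded())    // snap to this depth's nearest code
              }
              print("\(bits)-bit row:", Set(codes).count, "bands")   // roughly doubles per level
          }
          ```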

            Dadab12 Do you guys believe that the software will make it possible for me to enjoy an Apple laptop?

            I believe temporal dithering is not a benign approach to color dithering. It doesn’t take much extrapolation of known issues with other tried and failed technologies to realize it falls into the same category (single chip DLP color rendering, PWM dimming).

            Engineering often tends to lean on persistence of vision to solve a problem, without looking at what system in human biology is taking over and allowing that signal to persist. If a sub-pixel can be seen in isolation, it follows that it can and will generate a response in the viewer’s biology, regardless of what neighboring pixels do to hide the effect. We may see one thing, but be processing something else. It’s important that what we see and what we process are one and the same. If that’s not possible, the details should be very carefully considered. We don’t see this consideration in all of the current engineering.

            All that is to say, if any software is proven to disable temporal dithering in a given setup, that is one key layer of poorly considered technology removed from the overall problem. And many Apple products do implement temporal dithering.

              This might sound dumb, and actually it most likely does. @aiaf, is there a way to "snip" a cable or block a connector to the screen to further disable dithering at the hardware level, as in knock it down a bit depth? I just replaced the screen in this M2 13" MBA with a $200 eBay knockoff one, and looking at the display ribbons while installing it got me thinking.

              I honestly wouldn't care if it were only black and white, so long as it didn't dither. I tried setting the triple-click accessibility shortcut to greyscale, but it still dithers. Full red is what I use now, like the most intense versions of Flux, but it still gives little relief.


                It looks to me like this patent is only describing a well-understood characteristic of color dithering: that if you dither using high-frequency noise (blue noise), you can reverse out to a higher bit depth by applying a center-weighted average, which is just a Gaussian blur.

                Yes, it does seem like you have recovered the original bit depth with a blur, but what's not mentioned in the patent is that this comes at the cost of lost resolution.
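
                As a toy illustration of that trade-off (plain pseudo-random noise standing in for blue noise, a box average standing in for the Gaussian, and a spatial rather than temporal dither; none of this is the patent's actual kernel):

                ```swift
                // Dither a flat region down to 4 bits, then "reverse out" by averaging a window.
                let original = 0.3217                        // a level 4 bits cannot represent
                let maxCode = Double((1 << 4) - 1)           // 4-bit output codes

                // Dither: add noise in [0, 1) code units before truncating, so the mix of the
                // two neighbouring codes encodes the lost precision.
                let dithered = (0..<64).map { _ -> Double in
                    (original * maxCode + Double.random(in: 0..<1)).rounded(.down) / maxCode
                }

                // "Inverse dither": average the neighbourhood to estimate the original value.
                let recovered = dithered.reduce(0, +) / Double(dithered.count)

                // recovered comes back close to 0.32, but only as a 64-pixel average: the value
                // is regained at the cost of spatial resolution, which the patent doesn't dwell on.
                print(original, recovered)
                ```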

                I don’t think Apple is implementing this, or I believe we would see extreme artifacts as they attempt to recover the detail lost to the blur. And that is a lot of image processing just to save bandwidth. Also, with the bandwidth of Thunderbolt, and the direct short ribbon connections within the laptops, I don’t see a reason to implement something like this on top of basic temporal dithering.

                aiaf We're starting to suspect that all these Apple display panels have another dithering layer that's driven by the TCON on the display.

                Do you think that the M1 MacBook Air, M1 MBP 13" with Touch Bar and M2 MBP 13" with Touch Bar (the lower-end ones that don't have MagSafe) could potentially be a way around this?

                These 3 specific laptops are the only Apple Silicon laptops that are only marketed to display at "millions of colors", unlike the M2/M3 Air and all of the 14"/16" Pros with no Touch Bar, which all include "billions of colors" in their spec sheet.

                Of course, this doesn't change anything at the OS level, as even these 3 "only millions of colors" laptops still show 10bpc in BetterDisplay and certainly also use DCP dithering.

                However, maybe the "millions of colors" spec could indicate that the additional TCON layer isn't present on these laptops?

                  Blooey

                  So the EE or EECS people don't talk with physiologists and biologists and vice versa. A recurring theme on this forum.

                  A patent I found interesting, though not Mac-specific (edit: oops, this was already posted by @aiaf; I'm just highlighting the inversion and artifact connections to dithering from this older Apple patent):

                  https://patents.google.com/patent/US20100207959?oq=temporal+dithering

                  A method and system for temporal dithering of pixels in a display. The dithering of the pixels may allow for simulation of 8-bit color from a 6-bit display. Moreover, the dithering of the pixels may be selected to follow a specific pattern to minimize display artifacts, which might otherwise result from interference generated by pixel inversion techniques performed during the pixel dithering. Through application of selective dithering techniques, including utilization of specific dithering patterns, the generation of display artifacts via interference from pixel inversion techniques during the display of an image may be minimized.

                  Dadab12 get a laptop from Apple and try out the various recommendations here, including Stillcolor, changing the color profile, reducing brightness, and selecting a monitor that does not use FRC. If it doesn't work out for you, Apple generally has a 2-week return period.

                  photon78s in a perfect world, yes. But verify with your monitor first.

                  Blooey let's assume that these MBPs' panels are 8-bit+FRC. Is it reasonable to say that the DCP temporally dithers a 12-bit (or 16-bit; AllRez sometimes shows 16 for bpc) signal to 10 bits, and then the TCON temporally dithers that (instead of a merely downsampled 10-bit signal) into 8 bits? You say it's not a good idea, but maybe it works? At least with double dithering you don't visually lose all 6 bits going from 16 to 10. I'm finding it difficult to visually re-verify the half-width bands in the 11-bit section, but whatever is going on, Stillcolor is changing something in the quality of the banding, it's smoother, and knowing what we know, it's the temporal dither.

                  MTNEYE I wouldn't recommend this honestly, but then again I don't know nearly enough about hardware.

                  DisplaysShouldNotBeTVs the BetterDisplay bpc is not really indicative of connection bit depth, nor is it even accurately reporting the same numbers that are reported by AllRez. For example, when AllRez reports 64bpp and 16bpc, BD is reporting that as 32bpp and 8bpc. Did you try @Blooey 's new gradient test on your MBA? Would really like to read your analysis of how Stillcolor affects the image.

                    aiaf let's assume that these MBPs' panels are 8-bit+FRC. Is it reasonable to say that the DCP temporally dithers a 12-bit (or 16-bit; AllRez sometimes shows 16 for bpc) signal to 10 bits, and then the TCON temporally dithers that (instead of a merely downsampled 10-bit signal) into 8 bits?

                    Yes for sure, I think you would always be better off double dithering if the only other option is rounding or truncating bits somewhere else in the pipeline. But if you can perform a single dither from 16 to 8, I believe your results should be significantly better than multiple dither passes. You are right though, if the intermediate step is 10-bit, there is a lot of detail still in that depth, and it will likely be hard to see the artifacts after a second dither-down.
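
                    A quick back-of-the-envelope on that last point (worst-case rounding error per depth only; nothing here models the actual dither kernels):

                    ```swift
                    // Worst-case rounding error at each depth, as a fraction of full scale.
                    let err16 = 0.5 / Double((1 << 16) - 1)    // ≈ 0.0008% of full scale
                    let err10 = 0.5 / Double((1 << 10) - 1)    // ≈ 0.05%  of full scale
                    let step8 = 1.0 / Double((1 <<  8) - 1)    // ≈ 0.39%  (one whole 8-bit step)

                    // A 16 -> 10 pass can move a level by at most ~1/8 of an 8-bit step, so a
                    // second dither from the 10-bit stream can still place its average very close
                    // to where a single 16 -> 8 dither would have put it.
                    print(err16, err10, step8, err10 / step8)  // err10 / step8 ≈ 0.125
                    ```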

                    But I have some doubt about Apple double dithering to a built-in display. They wrote the software, and they built the hardware. So I wonder why we think they wouldn’t just dither down to the native bit depth in the first pass? Or, for that matter, why the built-in display wouldn’t behave exactly the same to the system as an external display.

                    Is there evidence that the panel is native 8-bit, or are we just trying to work out the possibilities? I missed the clues from earlier in the thread if 8-bit native was already established.


                      Blooey it's all conjecture. I asked on the MacRumors forums; one person said it's all 8-bit+FRC except on the Pro Display XDR.

                      I'm trying to make both scenarios logically consistent in my head. A native 10-bit panel makes a lot of sense, but then the DCP dithering would be an oversight by Apple's engineers. Unless of course it means better color reproduction, i.e. higher apparent bit-depth?

                      Having an intermediate 10-bit dithered step is possibly due to physical limitations imposed by bandwidth, computational, or power requirements. A 64-bit (or even 40-bit) 4/8/6K signal at 60 Hz is a ton of data without compression. There are also concerns about driving multiple displays.
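
                      Rough uncompressed numbers, taking the Pro Display XDR's 6K resolution as an example and ignoring blanking intervals and DSC:

                      ```swift
                      // Uncompressed bandwidth for a 6K (Pro Display XDR-class) signal at 60 Hz.
                      // For comparison, Thunderbolt 3/4 tops out around 40 Gbit/s and DisplayPort
                      // HBR3 at roughly 25.9 Gbit/s effective.
                      let pixels = 6016.0 * 3384.0
                      let refresh = 60.0

                      for bitsPerPixel in [64.0, 40.0, 30.0, 24.0] {
                          let gbps = pixels * refresh * bitsPerPixel / 1_000_000_000
                          print(Int(bitsPerPixel), "bpp ≈", (gbps * 10).rounded() / 10, "Gbit/s")
                      }
                      // 64 bpp ≈ 78.2, 40 bpp ≈ 48.9, 30 bpp ≈ 36.6, 24 bpp ≈ 29.3 Gbit/s
                      ```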

                      In the case of an 8-bit+FRC panel, dithering on a pre-dithered 10-bit input could mean more faithful color reproduction without running into physical limitations of the hardware.

                      The limitations are outlined here https://support.apple.com/en-my/101571

                      I believe these machines use eDP for the internal display. https://arstechnica.com/civis/threads/macos-update-will-let-the-m3-macbook-pro-work-with-two-external-displays.1499263/page-3

                        aiaf I asked on the MacRumors forums; one person said it's all 8-bit+FRC except on the Pro Display XDR.

                        It may not be a fair suggestion, but this makes it seem like the Apple marketing department is involved. Discovering the bit depth of a monitor for ourselves should not be this convoluted. For one, we know they make no mention of FRC in their specifications. And if Apple is using 6-bit+FRC panels in a product, I could see the motivation to hide that spec. Or they may hide that property if they want the Pro Display XDR to appear like the sole gateway to wide color. This may not be true, and I hope it isn't.

                        I spent some time trying to simply print out the supported sample bit depths of my displays. I found that every method, function, and property that at one time did provide a display bit depth has now been deprecated. Maybe someone has had more luck. But if not, that missing public API is, unfortunately, somewhat telling.
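
                        For what it's worth, this is the sort of thing I was trying. AppKit will still hand back a depth, but it describes the window server's framebuffer format rather than the panel, and the CoreGraphics calls that once reported display bits per sample (CGDisplayBitsPerSample and friends) are deprecated:

                        ```swift
                        import AppKit

                        // Prints the framebuffer depth AppKit reports for each attached screen.
                        // This is not the panel's native depth.
                        for screen in NSScreen.screens {
                            let depth = screen.depth
                            print(screen.localizedName,
                                  "- bits per sample:", NSBitsPerSampleFromDepth(depth),
                                  "bits per pixel:", NSBitsPerPixelFromDepth(depth))
                        }
                        ```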

                        aiaf Having an intermediate 10-bit dithered step is possibly due to physical limitations imposed by bandwidth, computational, or power requirements. A 64-bit (or even 40-bit) 4/8/6K signal at 60 Hz is a ton of data without compression. There are also concerns about driving multiple displays.

                        To perform alpha blending, Core Animation is capable of operating on and retaining a 32-bit-per-component floating-point texture, with alpha. Which is to say, Metal can handle a 128-bit floating-point buffer. Every CALayer is associated with one NSWindow, and every NSWindow is associated with one NSScreen. So, as I see it, it would be trivial to simply dither that buffer down to an arbitrary display buffer at whatever bit depth the display or cabling supports natively. I still don't see the need to dither once to 10-bit, and again to 8-bit. That would only require more bandwidth and computation. It can all presumably be handled once by Core Animation, with better results. The total resulting buffers are smaller, with higher-quality dithering right from the original 32-bit buffer.

                        I just wonder if we can come up with an experiment that proves FRC vs native bit depth. It would put the concern about missed layers of temporal dithering to rest. My ladder test at least gets us to what the system supports. But I haven't yet come up with something to isolate for FRC.

                        aiaf A native 10-bit panel makes a lot of sense, but then the DCP dithering would be an oversight by Apple's engineers.

                        I'm not sure what you mean by this? That GPU dithering shouldn't exist for a 10-bit panel?

                        DisplaysShouldNotBeTVs I think the M1 Air uses PWM dimming, IIRC. The M1 and M2 13-inch Touch Bar MacBook Pros should in theory be the most benign screens, as they are basically just the 13-inch MBP from the Intel days with an Apple Silicon chip. The Touch Bar and keyboard backlight do flicker, though. The keyboard backlight is easy to turn off; idk about the Touch Bar. In any case, I'm curious about this also.
