I talked to a guy who works as a hardware engineer at a very big mobile phone company. I told him about my problem with the displays, and he was shocked, hearing about this for the first time. He mentioned that they just get the displays from other manufacturers like Samsung/LG/BOE and don't give a sh1t what that display is, how it works, etc. They just assemble it and sell it, that's it. So maybe this info is somehow helpful 😃

aiaf I have an Electron app that overlays live static noise. The kind of thing people here would panic at seeing. I've used it plenty with the built-in display. It's super comfortable at times, almost like white noise for the eyes, especially when I start to get annoyed by glowing text or focus problems. It locks the refresh rate, obviously, but my main impression has been that it gives enough input to keep something in the vision system activated enough to avoid some issues. I also get the feeling that it's easier to focus at the right distance, where the screen is. It makes all pixels change continuously, though, so I would guess it might also prevent some other stuff by accident. And it keeps the GPU at super high load and eats the battery.
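
The skeleton of such an overlay is tiny, by the way. Here's a simplified sketch of the idea in Electron, not the actual code; the window options and the low-alpha noise value are just illustrative:

```ts
// Minimal sketch: a click-through, always-on-top window that repaints
// faint grayscale noise every frame. Illustration only, not the real app.
import { app, BrowserWindow, screen } from "electron";

app.whenReady().then(() => {
  const { width, height } = screen.getPrimaryDisplay().size;
  const win = new BrowserWindow({
    width, height,
    transparent: true,  // see-through background
    frame: false,
    alwaysOnTop: true,
    hasShadow: false,
  });
  win.setIgnoreMouseEvents(true); // clicks pass through to apps below
  win.loadURL("data:text/html," + encodeURIComponent(`
    <canvas id=c></canvas><script>
      const c = document.getElementById('c');
      c.width = innerWidth; c.height = innerHeight;
      const g = c.getContext('2d');
      (function frame() {
        // regenerating every pixel each frame is what pins the GPU/battery
        const img = g.createImageData(c.width, c.height);
        for (let i = 0; i < img.data.length; i += 4) {
          const v = (Math.random() * 255) | 0;
          img.data[i] = img.data[i + 1] = img.data[i + 2] = v;
          img.data[i + 3] = 12; // low alpha: faint static
        }
        g.putImageData(img, 0, 0);
        requestAnimationFrame(frame);
      })();
    </script>`));
});
```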

    photon78s Absolutely. Text can suddenly start to glow again. I've been somewhat uncertain about whether this comes from the display, from something closer to a migraine switching on, or from contact lens dryness / light scattering. Tbh I don't know. I do get similar effects on other text at times, but I've also had many cases where I reopen the laptop lid or change something and it seems to come on. There are multiple flags related to temperature and similar, so I guess in theory it could be that. I've also usually felt that the screen is better in Low Power Mode, even when comparing with a fixed refresh rate. I did not check, however, which framebuffer flags change in that mode.

    One slightly more far-out theory of mine is that something familiar but slightly changed triggers some other processing in the brain. True Tone is absolute hell, obviously, but I often end up with way more issues after testing some color adjustments or resolutions. Or it could just be the reticular activating system and hypervigilance after focusing for a while on whether the changes are good.

    Sure would be nice to have a straightforward objective measure for when problems are present.

      async yep i've done this before on my M1 XDR Pro a few times too! it's because people like you and me are only sensitive to invisible flicker (flicker trying to hide itself, like temporal dithering) but visible/obvious static noise is fine.

      for example for me i'm actually totally fine to be at a concert with tons of obviously flashing intense strobe lights at once, but can't stand fluorescent or cheap LED lights that are trying to "convince" me that they're on but are actually flashing on and off at 120hz.

      something interesting i've noticed the few times i've used a visible static pattern like this is that my visual persistence seems to get really good for like 15 minutes after looking away from the computer. like it suddenly feels like real life is running at a "way higher framerate" too, and turning my head has less of a "jagged" feeling than i'm used to.

      and for a bit after turning off the noise pattern, the display also looks WAY smoother than usual, like it feels like i'm suddenly looking at a screen from the future with double the resolution for a little bit of time.

      it's a really good feeling that makes me think "is this the way people with better vision than me truly see the world?"

      can you send me the app you made / upload it to GitHub?

        aiaf

        The TCON also detects certain patterns that may be difficult for the LCD panel to display, and optimizes the display at the pixel level to minimize artifacts.

        Sounds suspicious… deeper than the DCP level, "minimize artifacts", what is this optimization?

        async

        That is why I was wondering why some say screens don't manage to stay color "accurate" over time, and how dithering and other algorithms can be used to "hide" defects in hardware. On my T480s, I recorded something, but it is hard to be sure. I really need to find a better camera, not just a better phone app with a better camera sensor, and ideally a bigger/brighter scope as well.

        Over at flickersense.org, it is reported that certain LED lights start out with no or very little flicker and start flickering over time as they "degrade". Similar things probably happen with screens.

          photon78s

          photon78s Over at flickersense.org, it is reported that certain LED lights start out with no or very little flicker and start flickering over time as they "degrade". Similar things probably happen with screens.

          Yes, it's my understanding that bulb manufacturers can use off-brand smoothing capacitors, where the liquid electrolyte tends to dry out. This removes the smoothing capacitance from the original circuit.
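
          To put rough numbers on that: for a full-wave rectified supply, ripple is roughly V_ripple ≈ I_load / (2 · f_mains · C), so as the capacitance falls, ripple grows until the LED visibly modulates at twice the mains frequency (100/120 Hz). A quick sketch, with illustrative values (not from any teardown):

          ```ts
          // Ripple on a full-wave rectified supply: V_ripple ≈ I / (2 * f * C).
          // As the electrolyte dries and C drops, ripple (and flicker) grows.
          function rippleVolts(loadAmps: number, mainsHz: number, capFarads: number) {
            return loadAmps / (2 * mainsHz * capFarads);
          }

          const iLoad = 0.02; // 20 mA LED string -- illustrative value
          for (const cap of [10e-6, 4.7e-6, 1e-6]) {
            const uF = (cap * 1e6).toFixed(1);
            console.log(`${uF} uF -> ~${rippleVolts(iLoad, 60, cap).toFixed(1)} V ripple`);
          }
          ```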

          DisplaysShouldNotBeTVs Here's the tool. It's a bit rough. If you have node installed it should be runnable with npm i; npm run start

          https://we.tl/t-PNMjtpQ6Qd

          Not really relevant to this topic, but out of curiosity: when you get glowing text at times, try turning on the light on your cellphone and holding it to the side of your head so you see the reflection in the screen. Try moving the reflected light on top of the text you read, and try moving your head together with the light. At least for me that instantly gets the text into focus again. The effect is way more prominent with my MateView 28, which has a matte screen. I've also previously had some luck aiming light at the screen from the side. Not exactly sure how it cleans up the "glow", but it sure changes how easy the text is to read at times. Might even be related to proprioception.

            aiaf I made a quick (dis)proof of concept of a dither-busting overlay

            https://gist.github.com/aiaf/8006286d0b38fe02518bf7f8a8890918

            It plays a transparent checkerboard pattern every other frame over most of the screen at 60fps. I checked with my capture card: it does not alter the pixels frame-to-frame in a significant enough way for the DCP dithering algorithm to change its behavior. In fact, this overlay becomes another source of eyestrain and screen instability in its current incarnation. But maybe someone can experiment with different patterns, frequencies, or blending modes and see if that makes a difference.

            Easier to just disable DCP dithering entirely with Stillcolor.
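
            If someone wants to experiment without digging into the gist, the same idea fits in a browser canvas. A rough sketch (not the gist code; cell size, alpha, and the gray value are the knobs to play with):

            ```ts
            // Flip a low-alpha checkerboard's phase every frame, so each
            // covered pixel's value alternates at the display refresh rate.
            const canvas = document.createElement("canvas");
            canvas.width = window.innerWidth;
            canvas.height = window.innerHeight;
            document.body.appendChild(canvas);
            const ctx = canvas.getContext("2d")!;
            const CELL = 2;     // checker cell size in px -- try other sizes
            const ALPHA = 0.02; // overlay strength -- try other values
            let phase = 0;

            function drawFrame() {
              ctx.clearRect(0, 0, canvas.width, canvas.height);
              ctx.fillStyle = `rgba(127,127,127,${ALPHA})`;
              for (let y = 0; y < canvas.height; y += CELL) {
                for (let x = 0; x < canvas.width; x += CELL) {
                  // the phase offset shifts the checkerboard by one cell per frame
                  if (((x / CELL + y / CELL + phase) & 1) === 0) {
                    ctx.fillRect(x, y, CELL, CELL);
                  }
                }
              }
              phase ^= 1; // alternate the pattern every frame (~60 fps via rAF)
              requestAnimationFrame(drawFrame);
            }
            requestAnimationFrame(drawFrame);
            ```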

              async I'll give this a try with a capture card

              DisplaysShouldNotBeTVs Any idea if similar APIs will be accessible on iOS when pushing apps directly from Xcode, or would it require jailbreaking? Unfortunately I'm on iOS 17 now, so that most likely won't be cracked for some time. I did a lot of jailbreaking previously, but forgot about it at some point.

              @aiaf thank you so much for working on this, super excited there's a chance I can have a usable M2.

              as a note: I am using an M2 MBA 13" with a cheap eBay replacement screen I put in, the kind that auto-disables True Tone. Running Stillcolor, I still notice the same symptoms. I also see no banding, and no change whatsoever when toggling Stillcolor on/off. Tried running an RGB profile instead of the usual sRGB, but still no luck. Today I tried to see the banding on an M3 (I believe) 15" MBA in the Apple Store and I could not notice it either.

              Do you think you will be releasing a beta to test the added features you were trying on your personal machine? Would love to try.

              Also, to anyone who wants any info on this cheap replacement screen for research purposes: send me the commands to enter in the terminal and I will pull up what I can. It seems to be identical to the broken OEM one I had.
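
              For example, I can run things like this from Node (a sketch; system_profiler and ioreg are the stock macOS tools, but the AppleCLCD2 class name is an assumption about how Apple Silicon panels register in the IORegistry):

              ```ts
              // Sketch: dump panel identity info on macOS from Node.
              import { execSync } from "node:child_process";

              const commands = [
                "system_profiler SPDisplaysDataType", // vendor, model, resolution
                "ioreg -lw0 -r -c AppleCLCD2",        // low-level panel properties
              ];
              for (const cmd of commands) {
                console.log(`$ ${cmd}`);
                try {
                  console.log(execSync(cmd, { encoding: "utf8" }));
                } catch {
                  console.log("(command failed on this machine)");
                }
              }
              ```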

              Thanks again.

                aiaf

                The Apple ProMotion mechanism automatically switches between 48/60/120hz according to the content you are running.

                Therefore I think the advantage of running something on the screen has to be researched from a different perspective.

                If you look at monitor reviews on https://www.rtings.com/ you can see that monitors behave differently at different refresh rates. For example, there are monitors that flicker when run at a certain refresh rate (60hz), while the very same monitor doesn't flicker at the panel's maximum refresh rate (120/144hz).

                The ProMotion mechanism drops the refresh rate to 48hz if nothing is moving on the screen, but when you move your mouse it immediately goes up to 120hz. Apple's argument is battery optimization. There is no setting to force a fixed 120hz.

                The point is, if you have a video running on the screen (even an invisible one), the ProMotion mechanism switches into a different state. At least the author of this program found a notable difference: https://github.com/abinabdc/flickeringMacFix

                I cannot test it myself as I have the MBA without ProMotion, but I think it's definitely something worth figuring out. For example, maybe there are different PWM rates at different refresh rates.
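
                The core of that trick (keep something changing every frame so the variable-refresh logic can't drop the rate) is tiny. Here's a browser sketch of the idea, not the linked program; the opacity values are arbitrary, just distinct enough to force a redraw:

                ```ts
                // Mutate one near-invisible element every frame so the
                // compositor keeps redrawing, which should hold a
                // variable-refresh display at its top rate.
                const px = document.createElement("div");
                px.style.cssText =
                  "position:fixed;left:0;top:0;width:1px;height:1px;background:#888;";
                document.body.appendChild(px);

                let on = false;
                function tick(): void {
                  px.style.opacity = on ? "0.01" : "0.02"; // two faint values
                  on = !on;
                  requestAnimationFrame(tick); // runs once per display refresh
                }
                requestAnimationFrame(tick);
                ```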

                  Hunter20 It goes way below that. See screenshot. It doesn't matter if it is on ProMotion or set to 60 hz; both are variable. Low/High Power mode also doesn't seem to matter. It can be checked with Quartz Debug (Additional Tools for Xcode). Variable refresh rate is important for smooth scrolling, but tbh I don't really see much difference. In Quartz Debug you can also force-disable the variable refresh rate, but that creates screen tearing when scrolling.

                  There are flags for the number of windows open in the framebuffer, and ofc the refresh rate changes if anything is changing, like a video in some other window. So there are tons of things that might change how it behaves.

                  Is there a way to programmatically set the frame rate to a fixed number on macOS? IIRC the "limit frame rate" setting only caps it at 60hz. It still drops lower if it "sees fit", so it's still a variable refresh rate, albeit less so.

                  Findings on external monitors & bit depth

                  This might be common knowledge on this forum, but I’ll share it anyway.

                  I have a Samsung Odyssey G7 which is reported as 10 bits (8-bit+FRC).

                  I’m using an old HDMI cable I have laying around whose provenance I’m uncertain of but the jacket reads “High Speed HDMI Cable with Ethernet.” It’s capable of a max 10.2 Gb/s transmission rate (1), (2). Let’s call it HSwE.

                  I can now confirm the following behavior on an M3 Max MBP (equipped with an HDMI 2.1 port capable of 48.0 Gb/s).

                  I set the monitor to 1920x1080 (HiDPI) at 60Hz.

                  • When connected via HDMI-2.1 -> (HSwE cable) -> HDMI-2.0:
                    • (enableDither = No): Lagom gradient shows 128 distinct bands, screen is much easier to look at. My custom-made gray test shows a very obvious and abrupt change in tone when you toggle dithering.
                    • (enableDither = Yes): Lagom gradient is a lot smoother and shows 256 bands, corresponding with the change in grayscale value at 3px intervals. My gray test shows a colder gray. The gradient here looks as smooth as the gradient on my built-in display. Same gradations.
                  • When connected via Thunderbolt/USB4 -> DP 1.4 (TB/USB4 maxes out at 40 Gb/s, DP 1.4 at 32.4 Gb/s):
                    • (enableDither = No): Lagom gradient looks just as smooth as my built-in display, showing 256 bands. There’s a shimmer around the top white border of the gradient, especially at higher brightness. My gray test shows a change in tone when toggling dithering, but not as obvious as when connected via HSwE-HDMI.
                    • (enableDither = Yes): gradient still has 256 bands. There’s a change in the quality and smoothness of the gradient that’s difficult to put into words, a difference in luminosity. My gray test shows a slightly darker shade of gray.
                  Conclusions:

                  Assuming RGB pixel encoding and no DSC (Display Stream Compression).

                  • Using an HSwE HDMI cable or lower forces 8bpc. The bandwidth required to send 10bpc (a 32-bit signal) at 60Hz for 3840x2160 pixels (1920x1080 HiDPI) is around 14-16 Gb/s + HDMI overhead (see the sketch after this list). This requires at least a Premium High Speed HDMI® cable (aka Category 3 or 4K).
                  • Increase refresh rate and resolution to increase bandwidth requirements (bandwidth calculator).
                  • Using the USB4 -> DP cable allows the Mac to output a 32-bit signal, which makes the external display apply its own FRC/dithering algorithms, as evidenced by the above observations re. gradient banding.
                  • M-series Macs apply DCP temporal dithering to the pixel buffer regardless of bit depth.
                  • These Macs are capable of producing an 8-bit signal if forced to, at least based on bandwidth/speed negotiations. The machinery is there, we just have to find the right buttons.
                  • So if you have a true 8-bit panel or 10-bit (8-bit+FRC), use an HSwE HDMI cable or lower + Stillcolor to eliminate temporal dithering entirely. Avoid 6-bit+FRC panels.
                  • Based on the above observations re. banding and gray change, I conclude that the built-in display receives a 10bpc signal by default. The unknown right now is whether the built-in panel (at least on an M3 Max MBP) is true 10-bit or has additional temporal dithering applied by the TCON (timing controller), on top of the DCP dithering that Stillcolor successfully disables.
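
                  To make the bandwidth math concrete, here is a back-of-the-envelope calculator (active pixels only; blanking and encoding overhead come on top, so treat the results as lower bounds):

                  ```ts
                  // Lower-bound link bandwidth for uncompressed RGB:
                  // pixels/s * 3 subpixels * bits per channel.
                  function minGbps(w: number, h: number, hz: number, bpc: number) {
                    return (w * h * hz * 3 * bpc) / 1e9;
                  }

                  const need = minGbps(3840, 2160, 60, 10); // 10bpc 4K60
                  console.log(`4K60 @ 10bpc needs >= ${need.toFixed(1)} Gb/s raw`);
                  // ~14.9 Gb/s before overhead, above the 10.2 Gb/s HSwE ceiling,
                  // which is why the link negotiates down to 8bpc on that cable.
                  ```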

                    aiaf Thanks for those tests. I also mentioned this in my comment before. With 4K monitors it's easy to hit those limitations with basic cables, but it's hard if you have a 1080p or 2K display 🙂

                    Also please take a look into chroma subsampling: https://en.wikipedia.org/wiki/Chroma_subsampling It's also a very important thing.

                      aiaf that's an awesome finding. I've been wondering about using a lower-bandwidth cable to purposely bottleneck the output. I'm sure this would also apply to other computers. I'm curious whether, if you got a Dell UP2720Q true 10-bit monitor, it would still use dithering. I did look, and Dell acknowledges both "8+A FRC" and "true 10bit". So I think the UP2720Q may be the cheapest true 10-bit option if anyone wants to try it. I think you can get them as cheap as $800 from B&H Photo in open-box condition through their eBay store, and I think on their site as well.

                      Pics show the true 10-bit listing and then an 8-bit+FRC one, to show that Dell advertises both, which makes me trust the UP2720Q being true 10-bit.


                        MTNEYE I don't think that changing the display on your laptop is going to remove dithering, because the connection cable from your motherboard to the new display will still be the same, capable of sending the same amount of data, and the display will show the same signal. But I might be wrong. I think the same thing happens when you connect external displays.

                          madmozg The Samsung Odyssey G7 is a 1440p panel, so there's that. You want to find a combination of resolution + refresh rate that makes 10bpc transmission physically impossible on your cable.

                          Re. YUV, I'm certain that the output's pixel encoding is set to RGB, unless I'm misunderstanding something here.
