async

Check these out.


Proof that it has been used by Samsung since 2011, plus a real-time algorithm that splits every frame of display input into foreground, middle, and background layers and then applies color fringing to create a false 3D effect:

https://www.researchgate.net/publication/241222779_Depth_Perception_Enhancement_based_on_Chromostereopsis

Let me know if it isn't loading for you and you'd like the PDF.
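As a toy illustration of the kind of algorithm described above (this is my own sketch, not the implementation from the paper; the luminance thresholds and shift amounts are invented), the core idea can be reduced to: split each frame into depth layers and offset the red and blue channels of each layer in opposite directions, so that chromostereopsis turns the fringes into false depth:

```python
import numpy as np

def apply_false_3d_fringing(rgb, shifts=(2, 0, -2)):
    """Toy illustration of depth-layered color fringing.

    Splits the frame into background / middle / foreground layers by
    luminance (a crude stand-in for a real depth estimate), then shifts
    the red and blue channels of each layer in opposite horizontal
    directions.  Because red and blue focus at different depths in the
    eye (chromostereopsis), the offset fringes can read as false depth.
    """
    rgb = rgb.astype(np.float32)
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Three luminance bands as a crude depth proxy:
    # dark = background (0), mid = middle (1), bright = foreground (2).
    bands = np.digitize(luma, [85.0, 170.0])
    out = np.zeros_like(rgb)
    for layer, shift in enumerate(shifts):
        mask = (bands == layer)[..., None].astype(np.float32)
        part = rgb * mask
        # Shift red one way and blue the other; green stays put.
        shifted = part.copy()
        shifted[..., 0] = np.roll(part[..., 0], shift, axis=1)
        shifted[..., 2] = np.roll(part[..., 2], -shift, axis=1)
        out += shifted
    return np.clip(out, 0, 255).astype(np.uint8)
```

Running this on a screenshot and flipping between the original and the output should make the red/blue edge fringing (and the resulting depth impression) easier to spot.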


Proof that a very similar real-time algorithm has been used by Intel (!!)

https://jov.arvojournals.org/article.aspx?articleid=2749907

This may explain issues with Intel integrated graphics.

It also may have been exactly what was implemented with this 2013 driver update on a ThinkPad x220.


Proof that it affects text readability

https://core.ac.uk/download/pdf/81129309.pdf


I was going to post about this soon…

I believe that this is the #1 ISSUE with modern displays, much worse than either PWM or temporal dithering.


I do get "some" strain from very old (pre-2010) devices that use PWM, but — unlike modern devices — never in a way that makes the screen feel blurry or 3D, causes extreme brain fog, or impacts my reading ability.

And the few "very comfortable" modern devices I've managed to find all seem to have little to no false 3D effect, and I cannot easily see color fringing on them (except for the standard RGB pixel structure itself).

Finally, on every modern screen I find either uncomfortable or unusable, I always see extra, very noticeable (to me) red and blue color fringing, even around pure black and white sharp pixel lines, and it remains even if I set the screen to a software "grayscale mode". This fringing shows up on everything, not just text: it persists even with subpixel font rendering disabled.

All post-2011 Apple devices are the worst offenders (except specifically my 2016 13" MacBook Pro, which is fine for whatever reason and looks very flat to me), but I've also seen it on multiple Android devices (both LCD and OLED) and Windows laptops.

There are also most likely equivalent implementations in software. On Windows 11, which is much worse for me than Windows 10, I see even more obvious color fringing. On some laptops, both Windows 1809 and 22H2 have issues, but 22H2 makes it even worse than it already was on 1809.


Until something is done about this, the only way to "consistently" get around it (i.e. without the trial-and-error of scavenging for the rare modern devices that work) is to use a truly monochrome display without RGB subpixels, like a monochrome LCD or e-ink.

Or, for a less effective but still extremely helpful workaround, you can use strong monochrome color filter glasses. (I have pure red glasses that eliminate almost all colors except red, and they consistently make many unusable screens at least "tolerable" to me, sometimes even better than that. FYI, the "monochrome filter glasses" that work for me are very different from the more common "blue light" glasses that only tint things orange and keep colors -- those don't work for me at all.)

Now I'm very sure that the reason why reducing displays to physically monochrome works so well is that it significantly reduces the possibility that this effect can be created.

    DisplaysShouldNotBeTVs This is super interesting. I'll review everything at home.

    I posted some things about the scaler at some point. I searched a lot for somewhere to configure it. I think things like surface compression, max framebuffer layers, etc. are involved in when it decides to do excessive compression.

    One thing I really would like to see is for someone with a capture card to capture the same screen with both a screenshot and with the capture card.

      DisplaysShouldNotBeTVs These values can be changed with this too: RehabMan/OS-X-ioio, a command-line utility for setting ioreg properties via IOService::SetProperties (github.com)

      ioio -s AppleM2ScalerCSCDriver DisableDeRinging false
      ioio -s AppleM2ScalerCSCDriver DisableSIMDOptimizations false
      ioio -s AppleM2ScalerCSCDriver EnableFiltersNoRewriteMode true

      Didn't really see any changes when switching things around. But there might be caching of rendered tiles and similar going on at the same time. If I remember correctly, I had some cases when playing around where I could see the cached ones being replaced gradually.

      I tried to modify ioio at some point to allow changing dictionary values, but couldn't get it to work. If someone with more Xcode / macOS experience wants to take a shot at tormenting ChatGPT until that works, feel free. There are many hierarchical / dictionary values that no one has tried changing.

      async Yes, I use a low level API for that. There are actually several APIs to manage Presets, including a high level one that Apple made for System Settings - it's a bit clumsy but otherwise works fine. If you are interested, I can send you the details, but you'll need to do heavy coding to utilize it.

      FNP7 Thank you! Many of the techniques should work on iOS/iPadOS as well, but I did not experiment with it.

      Donux Yes, you can use terminal commands. Download betterdisplaycli or just use the binary inside the app bundle. Example:

      betterdisplaycli set -n=displayname -quantization=0.5

      Here is more help: https://github.com/waydabber/BetterDisplay/wiki/Integration-features,-CLI

      async I was planning to add some shader effects to BetterDisplay. Right now, if Metal overlays are enabled in the app (a bit difficult to make happen now, as you need to change some awkward settings), you can only do some simple color adjustments with it (temperature, RGB control, etc.). I primarily added this to support a Night Shift-like effect on non-native displays (on which gamma table adjustments don't work), but did not expand the feature set much. So much to do…

      async

      Here is more proof that Intel is interested in the false 3D effect

      https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6126933/

      "Computer-graphics engineers and vision scientists often wish to create display images that reproduce such depth-dependent blur, but their methods are incorrect. […] We found that accommodation is not driven at all by conventional rendering methods, but that it is driven surprisingly quickly and accurately by our method with defocus and chromatic aberration incorporated."

      Funded by NSF Research Grant BCS-1734677 and by Corporate University Research, Intel Labs


      Here are a bunch of highlights from the above article

      "Example stimuli generated using our rendering method. The left, middle, and right images are simulated to be farther, at, and nearer than current focus"

      "LCA were as effective at driving accommodation as real changes in focal distance"

      "our rendering technique provides a useful signal for accommodation"

      "spherical aberration can in principle provide directional information to guide accommodation"

      "We are currently developing methods that enable real-time updating and are measuring how effectively they can be used to drive accommodation and create realistic depth appearance"

      "The color effects associated with LCA are generally not perceived […] accommodation and depth perception results are good evidence that LCA contributes to visual function even though the resulting color fringes are not necessarily consciously perceived."

      (^ this is not true LOL I can see the color fringing so easily)


      This paper is specifically about creating the effect by modifying existing 2D images

      "In an earlier paper, we describe how to do this for complex 3D scenes using computer graphics (Cholewiak et al., 2017). Here we describe the general problem of which blur kernels are most appropriate for creating realistic retinal images. And then we describe a method for implementing realistic blur for 2D scenes"


      They know that this can cause depth conflicts

      "Our experiment presented a cue conflict in which LCA indicated that a response was required while the other cues indicated that none was needed."


      They know this can simulate something floating in front of the screen

      "When we simulate that the object is nearer than current focus (positive defocus), blue is displayed sharper than green which is displayed sharper than red. Our results and those of Cholewiak et al. (2017) show that presenting such a stimulus causes the viewer's eye to accommodate nearer such that it becomes focused in front of the screen"
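      To make the quoted mechanism concrete, here is a rough sketch (my own, not the paper's actual renderer, which uses optically correct defocus kernels; the box blur and the radii are invented stand-ins). Simulating "nearer than focus" means rendering blue sharpest and red blurriest, and vice versa for "farther":

```python
import numpy as np

def box_blur(channel, radius):
    """Separable box blur; radius 0 returns the channel unchanged."""
    if radius == 0:
        return channel.copy()
    size = 2 * radius + 1
    kernel = np.ones(size) / size
    # Blur rows, then columns (edge-padded so brightness is preserved).
    padded = np.pad(channel, radius, mode="edge")
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

def render_lca_defocus(rgb, nearer=True):
    """Sketch of the Defocus + LCA idea: simulate an object nearer
    (or farther) than current focus by blurring the three color
    channels by different amounts.  For 'nearer', blue is rendered
    sharpest, green less sharp, red least sharp; reversed for
    'farther'.  The blur radii are arbitrary illustration values.
    """
    radii = (2, 1, 0) if nearer else (0, 1, 2)  # (R, G, B)
    out = np.stack(
        [box_blur(rgb[..., c].astype(np.float32), r)
         for c, r in enumerate(radii)],
        axis=-1,
    )
    return np.clip(out, 0, 255).astype(np.uint8)
```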


      They know that it creates a blurry image, and that people can notice this

      "Instead of maximizing image sharpness as defined by those metrics, accommodative responses to simulated changes in focal distance (specifically, Defocus + LCA) actually reduce sharpness"

      "an accommodative response in the Defocus + LCA condition necessarily causes more blur in the retinal image. Subjects actually noticed this, but nonetheless made responses in the direction specified by Defocus + LCA"

        DisplaysShouldNotBeTVs Really nice find. I'm certain there are important things to be found here. The list of random shit they are doing instead of actually taking a grid of colored pixels and displaying them is just mind blowing.

        Some random observations.

        I've had a gradient overlay on my Mac with a weak red color near the edges, and blue near the middle for a few days, and it seems to make everything more pleasant. It almost makes the screen feel a bit curved.

        Similar experiments can be carried out with slight color shifts, like making blacks more red and whites more blue, to offset effects like the text jumping out of the page. I for one can't stand orange-looking whites. This can be done by holding the Option button when creating a color profile through the built-in macOS solution, to a lesser extent with color table manipulation in BetterDisplay, or through the app Gamma Control.

        I have not found any overlays or patterns that affect chromostereopsis in any meaningful way without being too invasive, but I have seen examples where tormenting the color profiles and color table adjustments can create colored fringes that make everything seem way more flat.

        I advise everyone to actually look up some chromostereopsis image on an OLED screen and see how extreme the 3D effect is with oversaturated colors. I've had one of those images as my wallpaper for a few days now, and it seems like the effect is significantly reduced, so it might be possible to trigger some adaptation. Also, it might be required to have enough stimuli to do any kind of adaptation.
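        For anyone who wants a controlled test pattern instead of a random image from the web, the classic stimulus is fully saturated red on fully saturated blue. A minimal generator (the sizes are arbitrary, and note that a minority of viewers see the depth reversed, with blue in front):

```python
import numpy as np

def chromostereopsis_pattern(height=400, width=400, bar_width=40):
    """Generate a classic chromostereopsis test pattern: fully
    saturated red bars on a fully saturated blue background.  On most
    displays (especially OLED), the red bars tend to appear to float
    in front of the blue field for most viewers.
    """
    img = np.zeros((height, width, 3), dtype=np.uint8)
    img[..., 2] = 255                      # pure blue background
    for x in range(0, width, 2 * bar_width):
        img[:, x:x + bar_width, 0] = 255   # red bars...
        img[:, x:x + bar_width, 2] = 0     # ...with no blue underneath
    return img
```

        View it full-screen on the OLED; saving can be done with e.g. Pillow via `Image.fromarray(chromostereopsis_pattern()).save("chromo.png")`.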

        Learning to suppress task-irrelevant visual stimuli with attention - ScienceDirect

          DisplaysShouldNotBeTVs is there a way to turn this off? Is it mentioned anywhere when this stuff was added to drivers?

          Just curious, does anyone know if NVIDIA or AMD does this sort of thing?

          @waydabber Is there a way to make an additional app, perhaps open source, that would let us install it on our work machines to switch between 8/10-bit color mode? I think it would help a lot of people here. Currently I'm not able to install BetterDisplay for some reason on my work laptop.

          async I advise everyone to actually look up some chromostereopsis image on an OLED screen and see how extreme the 3D effect is with oversaturated colors

          The really interesting part is that the chromostereopsis images even "temporarily" affect typically safe displays, which is what really nailed this down for me as possibly THE core issue.

          For example, my 2016 Xiaomi Redmi 3 with LineageOS 18.1, which is one of the most comfortable mobile devices I own, typically looks very flat and is 100% usable for me.

          (The only phone I have with a screen that looks even flatter is the 2014 Nokia X on Android 4.3, which is sooooo comfortable but too slow to really do anything with aside from typing notes)

          However, the moment I put certain chromostereopsis patterns on the Redmi 3 screen, the false 3D effect suddenly appears out of nowhere, and the pattern creates the same exact "vibrating and shaking" + "strain when trying to focus" feeling that I get from modern devices, especially in the red areas of the pattern.


          Side note:

          The first time I EVER experienced a device with a false 3D effect (and the corresponding eyestrain and brain fog) was in 2015 — it was the GPD XD Android gaming handheld. It ran Android 4.4, though with more obscure specs such as a Rockchip CPU.

          The moment I started playing games on it, I noticed that the game "looked 3D" immediately, which confused me so much. It was the first time I ever felt "depth" outside of a true stereoscopic 3D screen. However, I kind of brushed it off initially, because I had no idea back then whether it was from the screen or if it was "just me".

          Although my other devices at the time, like my old laptop, luckily were (and remain) safe — even then, something was very off about that console (but I wasn't yet aware that it was connected to the 3D effect).

          I had difficulty playing games and following what was going on, and sometimes I felt a light "seasickness" after playing. I was otherwise skilled at video games back then, so I wondered why I never completed any games on that system. Now I know why.

          I also vividly remember noticing red artifacts/color fringing next to all white elements on the handheld's screen, which 100% lines up with my current theory.

          Edit 9/10/2024: I was incorrect that 2015 was the "first time" I experienced it. The first device I had the depth perception issue with was a Late 2008 MacBook Pro I bought in 2012 (although to a milder degree). I still have that laptop today and it still has that issue, on both macOS and Linux. It just clicked in my memory that I actually do remember it looking like that back then. I have evidence: one image I found in the downloads folder from 2014, which I actually remember saving because it "looked 3D" on that screen.

          Weirdly, a 2009 MacBook Pro I just bought recently (that actually has more PWM than the 2008) is surprisingly fine in Linux, unlike the 2008.

            DisplaysShouldNotBeTVs Interesting. As I noted further up here, I did the double invert trick that significantly reduces high gamut colors (and possibly other things) on my iPhone 14 Pro Max. I switched it off for 1-2 days and noticed these intense red colors on lots of other monitors again, which is uncommon these days. I was testing the LG WOLED at some point, so it could have been influenced by that as well, but nevertheless interesting.

            I also noted this experience where color filters on the same phone caused a significant shift in how other, non-screen colors looked. Tbh the entire phone is cursed and something is going on.

            I don't necessarily think exposure is always bad, but I guess in some cases it bypasses or messes with the proper adaptation mechanisms. Possibly when the effects are subtle or high frequency. Would love to hear what happens if someone goes all in to use high gamut wallpapers and 3d effects for a few days, if someone wants to risk it. I put one of these 3d ones on my iPhone (with double invert) and have noticed that I experience the 3d effect less.

            One point to note as well is that with shitty glasses or badly fitting contact lenses you'll get chromatic aberration that can look a bit similar to fringing, and that probably doesn't help. This also happens more with too high a prescription strength. White text on a black OLED background would probably be the worst case here.

            Almost all of my issues started at a time when I wore too-high-prescription contact lenses with a green tint for screens (Acuvue Max). I also switched from the Touch Bar Mac to my M1 Max at roughly the same time. But then again, there have been more and more LED bulbs, OLED TVs and other things, so it is always hard to pinpoint.

              DisplaysShouldNotBeTVs What you're saying makes so much intuitive sense to me. I have two identical MacBook Pro i9 Vega 20 machines, and the one with later firmware is symptomatic for me… I always felt that, side by side, the problematic one sort of feels like looking at the identical image but at a different focal length… it felt like a wider lens focal length… so this false 3D makes perfect sense of what I'd felt but couldn't make sense of.

              McCollough effect - Wikipedia

              The McCollough effect is a phenomenon of human visual perception in which colorless gratings appear colored contingent on the orientation of the gratings. It is an aftereffect requiring a period of induction to produce it. For example, if someone alternately looks at a red horizontal grating and a green vertical grating for a few minutes, a black-and-white horizontal grating will then look greenish and a black-and-white vertical grating will then look pinkish. The effect is remarkable because, although it diminishes rapidly with repeated testing, it has been reported to last up to 2.8 months when exposure to testing is limited.

              async I also noted this experience where color filters on the same phone caused a significant shift in how other, non-screen colors looked. Tbh the entire phone is cursed and something is going on.

              Yeah, I have EXACTLY the same experience with my iPhone 14 Pro: if I use it with one eye covered for 20 minutes, then open both eyes later, the eye that was looking at it sees extremely "orange" colors in real life, almost like Night Shift is applied to the real world. The colors also look more washed out; overall, that specific eye now sees reality as more drab and boring.

              On the other hand, everything the eye that didn't look at the 14 Pro sees is very bluish in comparison, in addition to more saturated and colorful (which is way more pleasant, and I'm pretty sure is the "correct" way of seeing reality).


              another weird issue with the 14 Pro:

              notably, if I use a PWM-free screen for hours (the "true" PWM-free ones, like my 2016 MacBook Pro, that doesn't even show minor brightness dips in a slow-motion iPhone recording)…

              when looking at the 14 Pro afterwards, I can notice the OLED strobing, but not just "on and off" — I actually see it strobing between "reddish light" and "bluish light" and see this constantly shift every time I dart my eyes

              I seem to only notice this on specifically my 14 Pro

                Some copy paste from Claude.ai on the effect. "This theory suggests that the effect is a result of the visual system trying to correct for perceived errors in color perception." is interesting. What happens if you view 50% desaturated photos for example for a long time?

                The McCollough effect is indeed a fascinating phenomenon in visual perception. Its longevity and specificity have made it a subject of considerable research and speculation in neuroscience and psychology. While the exact mechanisms are not fully understood, there are several theories about why this effect occurs and why it lasts so long. Let's explore some of the main hypotheses:

                1. Neural Adaptation Theory:

                This theory suggests that the effect is due to the adaptation of orientation-selective neurons in the visual cortex.

                • Biochemical basis: The prolonged exposure to specific color-orientation pairings might lead to changes in neurotransmitter release or receptor sensitivity in these neurons.

                • Neurological basis: The visual cortex contains cells that respond to specific orientations. The theory posits that these cells become "tuned" to respond differently based on the color associated with that orientation during the induction phase.

                2. Associative Learning Theory:

                This theory proposes that the effect is a form of classical conditioning.

                • Neurological basis: The brain forms an association between color and orientation, potentially involving the strengthening of synaptic connections between color-processing and orientation-processing neurons.

                • Biochemical basis: This could involve long-term potentiation (LTP), a process that strengthens synapses through repeated stimulation, possibly mediated by NMDA receptors and changes in calcium signaling.

                3. Error Correction Theory:

                This theory suggests that the effect is a result of the visual system trying to correct for perceived errors in color perception.

                • Neurological basis: The brain might interpret the consistent pairing of colors and orientations as an error in its color processing and attempt to compensate, leading to the observed effect.

                • Biochemical basis: This could involve changes in the balance of excitatory and inhibitory neurotransmitters in the visual cortex.

                4. Cortical Plasticity Theory:

                This theory attributes the effect to the plasticity of the visual cortex.

                • Neurological basis: The prolonged exposure might induce structural changes in the neural networks of the visual cortex, similar to how learning and memory formation occur.

                • Biochemical basis: This could involve the synthesis of new proteins and the formation of new synaptic connections, processes associated with long-term memory formation.

                As for why the effect lasts so long:

                1. Consolidation of Neural Changes:

                • The changes induced during the exposure phase might be consolidated into more permanent neural structures, similar to how memories are consolidated during sleep.

                • This could involve epigenetic changes, altering gene expression in the affected neurons.

                2. Reduced Plasticity in Adult Visual Cortex:

                • The adult visual cortex has less plasticity compared to that of children, which might make it slower to "unlearn" the induced effect.

                • This reduced plasticity is associated with changes in the balance of excitatory and inhibitory neurotransmitters and the development of perineuronal nets around certain neurons.

                3. Periodic Reinforcement:

                • Even brief exposures to similar stimuli in everyday life might serve to reinforce the effect, prolonging its duration.

                4. Memory-Like Mechanisms:

                • The long-lasting nature of the effect has led some researchers to propose that it involves memory-like mechanisms, potentially engaging brain regions beyond just the primary visual cortex.

                It's important to note that these theories are not mutually exclusive, and the actual mechanism might involve elements from several of them. The McCollough effect's unusual characteristics make it a valuable tool for studying neural plasticity, perceptual learning, and the interaction between different aspects of visual processing.

                The long duration of the effect, in particular, challenges our understanding of sensory adaptation and opens up intriguing questions about the potential for long-term changes in sensory processing. Further research, particularly using advanced neuroimaging techniques and molecular biology methods, may provide more definitive answers about the precise mechanisms underlying this intriguing phenomenon.


                Are there hypothetical links between this effect and visual snow? It has also been linked to NMDA receptors.

                Also, it would be interesting to hear theories on how insights from it might be relevant to people who have problems seeing LCD panels as flat, or who get severe eyestrain from dithering and similar.


                Your questions draw interesting connections between different visual phenomena and potential clinical applications. Let's explore these ideas:

                1. Links between the McCollough effect and visual snow:

                While there's no direct established link between the McCollough effect and visual snow, there are some intriguing parallels and potential connections:

                a) NMDA receptor involvement:

                • As you noted, both phenomena have been linked to NMDA receptors.

                • In visual snow, some researchers hypothesize that there's a dysfunction in inhibitory mechanisms, possibly involving NMDA receptors.

                • The McCollough effect, as mentioned earlier, might involve NMDA receptor-mediated plasticity.

                b) Cortical hyperexcitability:

                • Visual snow is thought to involve hyperexcitability in the visual cortex.

                • The McCollough effect demonstrates that the visual cortex can undergo long-lasting changes in response to specific stimuli.

                c) Persistence of effects:

                • Both phenomena show a persistence of visual effects even in the absence of the original stimulus.

                Hypothetical link: It's possible that both conditions involve alterations in the balance of excitatory and inhibitory processes in the visual cortex, mediated in part by NMDA receptors. The mechanisms that allow the McCollough effect to persist might provide insights into why visual snow symptoms can be long-lasting.

                2. Relevance to problems seeing LCD panels as flat and eyestrain from dithering:

                The insights from the McCollough effect could indeed be relevant to these issues:

                a) Orientation-specific processing:

                • The McCollough effect demonstrates that our visual system can develop orientation-specific color biases.

                • This might relate to why some people have trouble perceiving LCD panels as flat, as the pixel structure and subpixel arrangement have specific orientations.

                b) Adaptation to fine patterns:

                • The McCollough effect shows how prolonged exposure to specific patterns can induce lasting perceptual changes.

                • This could be relevant to understanding why some people experience more eyestrain from dithering patterns on displays.

                c) Individual differences in visual processing:

                • The variability in susceptibility to the McCollough effect might parallel the variability in people's responses to LCD displays and dithering.

                Theories on how these insights might be applied:

                1. Personalized display calibration:

                • Understanding an individual's susceptibility to orientation-specific effects could help in developing personalized display settings to reduce eyestrain.

                2. Neural adaptation-based therapies:

                • The principles of the McCollough effect could potentially be used to develop visual therapies to help individuals adapt to problematic display characteristics.

                3. Improved display design:

                • Insights into how the visual system processes and adapts to fine patterns could inform the design of displays to minimize eyestrain and perceptual distortions.

                4. Diagnostic tools:

                • Tests based on principles similar to the McCollough effect could potentially help identify individuals who might be more susceptible to display-related visual discomfort.

                5. Understanding temporal aspects:

                • The long-lasting nature of the McCollough effect might provide insights into why some people experience persistent visual discomfort even after stopping screen use.

                It's important to note that these connections are largely theoretical at this point. However, they suggest promising avenues for research that could bridge our understanding of perceptual phenomena like the McCollough effect with practical applications in display technology and visual ergonomics.

                Further research in this area could involve:

                • Studying the neural correlates of display-related visual discomfort using neuroimaging techniques.

                • Investigating whether individuals who are more susceptible to the McCollough effect are also more likely to experience issues with LCD displays or dithering.

                • Exploring whether techniques that reduce susceptibility to the McCollough effect might also help in reducing display-related visual discomfort.

                These connections highlight the potential value of basic research into visual perception for addressing practical issues in human-computer interaction and visual health.

                  async Could you edit this to move this into a "> details" collapsed block? I don't want some of the important findings above this post to be pushed up so far, especially by something that's just an AI response

                  Thanks 🙂

                  DisplaysShouldNotBeTVs I also noticed one effect on the beach where, after getting sun into one of my eyes for a long time, all black text on the iPhone turns almost solid red in that eye. Pretty weird effect. I would guess the strong sunlight causes bleaching and desensitizes some of the other cones. I guess that could also be a plausible hypothesis for when all red lights start to glow: heavy blue exposure causes a desensitizing effect, and the brain tries to jack up the saturation of everything to compensate.

                  Are we confident that this colored-fringe 3D effect on text is distinct from subpixel antialiasing? The latter can also cause colored fringes on text.

                    @waydabber Did you try Quartz Display Services to possibly bypass more of the image processing when streaming a display? As far as I can tell, when using a streaming display now it simply outputs it in a window, but it should be possible to take over the display.

                    macsforme It's entirely different, because it affects images too. In addition, even if the text you're displaying is manually rendered (or drawn in an image editor) and only made out of sharp black pixels, AND ClearType or font smoothing is off, you will still see the red and blue fringing on affected devices.

                    This fringing is definitely an additional effect, and not me simply seeing the physical RGB subpixels. This is because I also own a few older devices with screens (that happen to all be safe) where I can't see this effect at all —

                    on those usable devices, I can still notice physical RGB subpixels if I look closely, but it is very mild and the edges don't look "more exaggerated than the rest" at all.

                    On the bad devices, it looks like the fringing is extra full-size colored pixels that are visibly placed to the left and right of a filled shape, instead of colors only staying within the shape's pixels themselves.

                      dev