DisplaysShouldNotBeTVs Really nice find. I'm certain there are important things to be found here. The list of random shit they are doing instead of actually taking a grid of colored pixels and displaying them is just mind-blowing.

Some random observations.

I've had a gradient overlay on my Mac for a few days, with a weak red tint near the edges and blue near the middle, and it seems to make everything more pleasant. It almost makes the screen feel a bit curved.
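For anyone who wants to try the same thing without a third-party tool, here is a minimal AppKit sketch of the kind of overlay I mean: a borderless, click-through window with a radial gradient that is bluish in the middle and reddish toward the edges. The colors, alphas, and window level are placeholder values to tune by eye, not anything canonical.

```swift
import AppKit
import QuartzCore

// Minimal sketch: a borderless, click-through window covering the screen,
// with a radial gradient that is bluish in the middle and reddish at the
// edges. Alphas and colors are placeholders to tune by eye.
let app = NSApplication.shared

guard let screen = NSScreen.main else { fatalError("no screen found") }

let window = NSWindow(contentRect: screen.frame,
                      styleMask: .borderless,
                      backing: .buffered,
                      defer: false)
window.isOpaque = false
window.backgroundColor = .clear
window.ignoresMouseEvents = true           // clicks pass through the overlay
window.level = .screenSaver                // stays above normal windows
window.collectionBehavior = [.canJoinAllSpaces, .stationary]

let gradient = CAGradientLayer()
gradient.type = .radial
gradient.frame = CGRect(origin: .zero, size: screen.frame.size)
gradient.colors = [
    NSColor.blue.withAlphaComponent(0.04).cgColor,  // middle of the screen
    NSColor.red.withAlphaComponent(0.06).cgColor    // toward the edges
]
gradient.startPoint = CGPoint(x: 0.5, y: 0.5)
gradient.endPoint = CGPoint(x: 1.0, y: 1.0)

let view = NSView(frame: CGRect(origin: .zero, size: screen.frame.size))
view.layer = gradient
view.wantsLayer = true
window.contentView = view
window.orderFrontRegardless()

app.run()
```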

Similar experiments can be carried out with slight color shifts, like making blacks more red and whites more blue, to offset effects like text jumping out of the page. I for one can't stand orange-looking whites. This can be done by holding the Option key when creating a color profile through the built-in macOS calibration tool, to a lesser extent with color table manipulation in BetterDisplay, or through the app Gamma Control.
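For the gamma route specifically, this is roughly what the Quartz Display Services call looks like. A sketch only: the exact min/max values are illustrative and should be tuned by eye.

```swift
import CoreGraphics
import Foundation

// Sketch: nudge the main display's gamma formula so blacks pick up a touch
// of red and whites lean slightly blue. The numbers are illustrative only.
// Quartz restores the original gamma when this process exits, so keep it
// running while you evaluate the result.
let display = CGMainDisplayID()

let err = CGSetDisplayTransferByFormula(
    display,
    0.03, 0.98, 1.0,   // red:   min, max, gamma (min > 0 tints blacks red)
    0.00, 0.99, 1.0,   // green: min, max, gamma (max < 1 cools the whites)
    0.00, 1.00, 1.0    // blue:  min, max, gamma (left untouched)
)
precondition(err == .success, "CGSetDisplayTransferByFormula failed: \(err)")

print("Adjusted gamma; press Ctrl-C to quit and restore the original.")
RunLoop.main.run()
```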

I have not found any overlays or patterns that affect chromostereopsis in any meaningful way without being too invasive, but I have seen examples where tormenting the color profiles and color table adjustments can create colored fringes that make everything seem much flatter.

I advise everyone to actually look up a chromostereopsis image on an OLED screen and see how extreme the 3D effect is with oversaturated colors. I've had one of those images as my wallpaper for a few days now, and the effect seems significantly reduced, so it might be possible to trigger some adaptation. It might also be that enough stimulus is required for any kind of adaptation to happen.
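If you'd rather generate a test image than hunt one down, a quick sketch like this works. The bar layout is arbitrary; any saturated red figure on a saturated blue field will do, and the output path is just an example.

```swift
import AppKit

// Sketch: render a basic chromostereopsis test image (saturated red bars
// on a saturated blue field) and save it as a PNG. On wide-gamut OLED
// screens the red typically appears to float in front of the blue.
let size = NSSize(width: 800, height: 600)
let image = NSImage(size: size)
image.lockFocus()

NSColor(red: 0, green: 0, blue: 1, alpha: 1).setFill()
NSRect(origin: .zero, size: size).fill()

NSColor(red: 1, green: 0, blue: 0, alpha: 1).setFill()
for i in 0..<5 {
    // A few red bars; any saturated red figure on blue shows the effect.
    NSRect(x: 100 + i * 130, y: 150, width: 60, height: 300).fill()
}

image.unlockFocus()

// Output path is just an example.
if let tiff = image.tiffRepresentation,
   let rep = NSBitmapImageRep(data: tiff),
   let png = rep.representation(using: .png, properties: [:]) {
    try? png.write(to: URL(fileURLWithPath: "chromostereopsis.png"))
}
```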

Learning to suppress task-irrelevant visual stimuli with attention - ScienceDirect

    DisplaysShouldNotBeTVs Is there a way to turn this off? Is it mentioned anywhere when this stuff was added to drivers?

    Just curious: does anyone know if NVIDIA or AMD does this sort of thing?

    @waydabber is there a way to make an additional app, ideally open source, that would let us switch between 8-bit and 10-bit color modes? I think it would help a lot of people here. Currently I'm not able to install BetterDisplay on my work laptop for some reason.

    async I advise everyone to actually look up a chromostereopsis image on an OLED screen and see how extreme the 3D effect is with oversaturated colors

    The really interesting part is that the chromostereopsis images even "temporarily" affect typically safe displays, which is what really nailed this down for me as possibly THE core issue.

    For example, my 2016 Xiaomi Redmi 3 with LineageOS 18.1, which is one of the most comfortable mobile devices I own, typically looks very flat and is 100% usable for me.

    (The only phone I have with a screen that looks even flatter is the 2014 Nokia X on Android 4.3, which is sooooo comfortable but too slow to really do anything with aside from typing notes)

    However, the moment I put certain chromostereopsis patterns on the Redmi 3 screen, the false 3D effect suddenly appears out of nowhere, and the pattern creates the exact same "vibrating and shaking" + "strain when trying to focus" feeling that I get from modern devices, especially on the red areas of the pattern.


    Side note:

    The first time I EVER experienced a device with a false 3D effect (and the corresponding eyestrain and brain fog) was in 2015 — it was the GPD XD Android gaming handheld, which ran Android 4.4, though with more obscure specs such as a Rockchip CPU.

    The moment I started playing games on it, I noticed that the game "looked 3D" immediately, which confused me so much. It was the first time I ever felt "depth" outside of a true stereoscopic 3D screen. However, I kind of brushed it off initially, because I had no idea back then whether it came from the screen or if it was "just me".

    My other devices at the time, like my old laptop, were luckily (and remain) safe — even then, something was very off about that console (but I wasn't yet aware that it was connected to the 3D effect).

    I had difficulty playing games and following what was going on, and sometimes I felt a light "seasickness" after playing. I was otherwise skilled at video games back then, so I wondered why I never completed any games on that system. Now I know why.

    I also vividly remember noticing red artifacts/color fringing next to all white elements on the handheld's screen, which 100% lines up with my current theory.

    Edit 9/10/2024: I was incorrect that 2015 was the "first time" I experienced it. The first device I had the depth perception issue with was a Late 2008 MacBook Pro I bought in 2012 (although to a milder degree)… I still have that laptop today, and it still has that issue on both macOS and Linux. It just clicked in my memory that I actually do remember it looking like that back then. I have evidence: one image I found in the Downloads folder from 2014, which I actually remember saving because it "looked 3D" on that screen.

    Weirdly, a 2009 MacBook Pro I just bought recently (that actually has more PWM than the 2008) is surprisingly fine in Linux, unlike the 2008.

      DisplaysShouldNotBeTVs Interesting. As I noted further up, I did the double-invert trick that significantly reduces high-gamut colors (and possibly other things) on my iPhone 14 Pro Max. I switched it off for 1-2 days and noticed those intense red colors on lots of other monitors again, which is uncommon these days. I was testing the LG WOLED at some point, so it could have been influenced by that as well, but it's interesting nevertheless.

      I also noted the experience where color filters on the same phone caused a significant shift in how other, non-screen colors looked. Tbh the entire phone is cursed and something is going on.

      I don't necessarily think exposure is always bad, but I guess in some cases it bypasses or messes with the proper adaptation mechanisms, possibly when the effects are subtle or high frequency. I would love to hear what happens if someone goes all in and uses high-gamut wallpapers and 3D effects for a few days, if someone wants to risk it. I put one of these 3D ones on my iPhone (with double invert) and have noticed that I experience the 3D effect less.

      One point to note as well is that with shoddy glasses or badly fitting contact lenses you'll get chromatic aberration that can look a bit similar to fringing, and that probably doesn't help. This also happens more with too high a prescription strength. White text on a black OLED background would probably be the worst case here.

      Almost all my issues started at a time when I wore too-strong prescription contact lenses with a green tint for screens (Acuvue Max). I also switched from the Touch Bar Mac to my M1 Max at roughly the same time. But then again, there have been more and more LED bulbs, OLED TVs and other things, so it is always hard to pinpoint.

        DisplaysShouldNotBeTVs What you're saying makes so much intuitive sense to me. I have two identical MacBook Pro i9 Vega 20 machines, and the one with later firmware is symptomatic for me… side by side, the problematic one sort of feels like looking at the identical image but at a different focal length… like a wider lens focal length… so this false 3D makes perfect sense of what I'd felt but couldn't explain.

        McCollough effect - Wikipedia

        The McCollough effect is a phenomenon of human visual perception in which colorless gratings appear colored contingent on the orientation of the gratings. It is an aftereffect requiring a period of induction to produce it. For example, if someone alternately looks at a red horizontal grating and a green vertical grating for a few minutes, a black-and-white horizontal grating will then look greenish and a black-and-white vertical grating will then look pinkish. The effect is remarkable because, although it diminishes rapidly with repeated testing, it has been reported to last up to 2.8 months when exposure to testing is limited.
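        For anyone who wants to try the induction themselves, here's a small sketch that renders the classic stimuli. Stripe width, image size, and file names are arbitrary choices.

        ```swift
        import AppKit

        // Sketch: render the classic McCollough induction gratings (red/black
        // horizontal, green/black vertical) plus black-and-white test gratings.
        // Stripe width and image size are arbitrary choices.
        func grating(size: NSSize, stripe: CGFloat, color: NSColor, horizontal: Bool) -> NSImage {
            let img = NSImage(size: size)
            img.lockFocus()
            NSColor.black.setFill()
            NSRect(origin: .zero, size: size).fill()
            color.setFill()
            var pos: CGFloat = 0
            let limit = horizontal ? size.height : size.width
            while pos < limit {
                let bar = horizontal
                    ? NSRect(x: 0, y: pos, width: size.width, height: stripe)
                    : NSRect(x: pos, y: 0, width: stripe, height: size.height)
                bar.fill()
                pos += stripe * 2
            }
            img.unlockFocus()
            return img
        }

        func savePNG(_ image: NSImage, to path: String) {
            guard let tiff = image.tiffRepresentation,
                  let rep = NSBitmapImageRep(data: tiff),
                  let png = rep.representation(using: .png, properties: [:]) else { return }
            try? png.write(to: URL(fileURLWithPath: path))
        }

        let size = NSSize(width: 512, height: 512)
        savePNG(grating(size: size, stripe: 16, color: .red,   horizontal: true),  to: "induce_red_horizontal.png")
        savePNG(grating(size: size, stripe: 16, color: .green, horizontal: false), to: "induce_green_vertical.png")
        savePNG(grating(size: size, stripe: 16, color: .white, horizontal: true),  to: "test_horizontal.png")
        savePNG(grating(size: size, stripe: 16, color: .white, horizontal: false), to: "test_vertical.png")
        ```

        Alternate between the two colored gratings for a few minutes, then look at the white test gratings; per the description above, they should appear faintly tinted with the complementary colors.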

        async I also noted the experience where color filters on the same phone caused a significant shift in how other, non-screen colors looked. Tbh the entire phone is cursed and something is going on.

        yeah i have EXACTLY the same experience with my iPhone 14 Pro: if i use it with one eye covered for 20 minutes, then open both eyes later, the eye that was looking at it sees extremely "orange" colors in real life, almost like Night Shift is applied to the real world. the colors also look more washed out, overall that specific eye now sees reality as more drab and boring.

        on the other hand, the eye that didn't look at the 14 Pro at all is very bluish in comparison, in addition to more saturated and colorful (which is way more pleasant, and I'm pretty sure is the "correct" way of seeing reality).


        another weird issue with the 14 Pro:

        notably, if I use a PWM-free screen for hours (the "true" PWM-free ones, like my 2016 MacBook Pro, that doesn't even show minor brightness dips in a slow-motion iPhone recording)…

        when looking at the 14 Pro afterwards, I can notice the OLED strobing, but not just "on and off" — I actually see it strobing between "reddish light" and "bluish light" and see this constantly shift every time I dart my eyes

        I seem to only notice this on specifically my 14 Pro

          Some copy-paste from Claude.ai on the effect. "This theory suggests that the effect is a result of the visual system trying to correct for perceived errors in color perception." is interesting. What happens if you view, say, 50% desaturated photos for a long time?

          The McCollough effect is indeed a fascinating phenomenon in visual perception. Its longevity and specificity have made it a subject of considerable research and speculation in neuroscience and psychology. While the exact mechanisms are not fully understood, there are several theories about why this effect occurs and why it lasts so long. Let's explore some of the main hypotheses:

          1. Neural Adaptation Theory:

          This theory suggests that the effect is due to the adaptation of orientation-selective neurons in the visual cortex.

          • Biochemical basis: The prolonged exposure to specific color-orientation pairings might lead to changes in neurotransmitter release or receptor sensitivity in these neurons.

          • Neurological basis: The visual cortex contains cells that respond to specific orientations. The theory posits that these cells become "tuned" to respond differently based on the color associated with that orientation during the induction phase.

          2. Associative Learning Theory:

          This theory proposes that the effect is a form of classical conditioning.

          • Neurological basis: The brain forms an association between color and orientation, potentially involving the strengthening of synaptic connections between color-processing and orientation-processing neurons.

          • Biochemical basis: This could involve long-term potentiation (LTP), a process that strengthens synapses through repeated stimulation, possibly mediated by NMDA receptors and changes in calcium signaling.

          3. Error Correction Theory:

          This theory suggests that the effect is a result of the visual system trying to correct for perceived errors in color perception.

          • Neurological basis: The brain might interpret the consistent pairing of colors and orientations as an error in its color processing and attempt to compensate, leading to the observed effect.

          • Biochemical basis: This could involve changes in the balance of excitatory and inhibitory neurotransmitters in the visual cortex.

          4. Cortical Plasticity Theory:

          This theory attributes the effect to the plasticity of the visual cortex.

          • Neurological basis: The prolonged exposure might induce structural changes in the neural networks of the visual cortex, similar to how learning and memory formation occur.

          • Biochemical basis: This could involve the synthesis of new proteins and the formation of new synaptic connections, processes associated with long-term memory formation.

          As for why the effect lasts so long:

          1. Consolidation of Neural Changes:

          • The changes induced during the exposure phase might be consolidated into more permanent neural structures, similar to how memories are consolidated during sleep.

          • This could involve epigenetic changes, altering gene expression in the affected neurons.

          2. Reduced Plasticity in Adult Visual Cortex:

          • The adult visual cortex has less plasticity compared to that of children, which might make it slower to "unlearn" the induced effect.

          • This reduced plasticity is associated with changes in the balance of excitatory and inhibitory neurotransmitters and the development of perineuronal nets around certain neurons.

          3. Periodic Reinforcement:

          • Even brief exposures to similar stimuli in everyday life might serve to reinforce the effect, prolonging its duration.

          4. Memory-Like Mechanisms:

          • The long-lasting nature of the effect has led some researchers to propose that it involves memory-like mechanisms, potentially engaging brain regions beyond just the primary visual cortex.

          It's important to note that these theories are not mutually exclusive, and the actual mechanism might involve elements from several of them. The McCollough effect's unusual characteristics make it a valuable tool for studying neural plasticity, perceptual learning, and the interaction between different aspects of visual processing.

          The long duration of the effect, in particular, challenges our understanding of sensory adaptation and opens up intriguing questions about the potential for long-term changes in sensory processing. Further research, particularly using advanced neuroimaging techniques and molecular biology methods, may provide more definitive answers about the precise mechanisms underlying this intriguing phenomenon.


          Are there hypothetical links between this effect and visual snow? It has also been linked to NMDA receptors.

          Also it would be interesting to hear theories on how insights from it might be relevant to people who have problems seeing LCD panels as flat, and who get severe eyestrain from dithering and similar


          Your questions draw interesting connections between different visual phenomena and potential clinical applications. Let's explore these ideas:

          1. Links between the McCollough effect and visual snow:

          While there's no direct established link between the McCollough effect and visual snow, there are some intriguing parallels and potential connections:

          a) NMDA receptor involvement:

          • As you noted, both phenomena have been linked to NMDA receptors.

          • In visual snow, some researchers hypothesize that there's a dysfunction in inhibitory mechanisms, possibly involving NMDA receptors.

          • The McCollough effect, as mentioned earlier, might involve NMDA receptor-mediated plasticity.

          b) Cortical hyperexcitability:

          • Visual snow is thought to involve hyperexcitability in the visual cortex.

          • The McCollough effect demonstrates that the visual cortex can undergo long-lasting changes in response to specific stimuli.

          c) Persistence of effects:

          • Both phenomena show a persistence of visual effects even in the absence of the original stimulus.

          Hypothetical link: It's possible that both conditions involve alterations in the balance of excitatory and inhibitory processes in the visual cortex, mediated in part by NMDA receptors. The mechanisms that allow the McCollough effect to persist might provide insights into why visual snow symptoms can be long-lasting.

          2. Relevance to problems seeing LCD panels as flat and eyestrain from dithering:

          The insights from the McCollough effect could indeed be relevant to these issues:

          a) Orientation-specific processing:

          • The McCollough effect demonstrates that our visual system can develop orientation-specific color biases.

          • This might relate to why some people have trouble perceiving LCD panels as flat, as the pixel structure and subpixel arrangement have specific orientations.

          b) Adaptation to fine patterns:

          • The McCollough effect shows how prolonged exposure to specific patterns can induce lasting perceptual changes.

          • This could be relevant to understanding why some people experience more eyestrain from dithering patterns on displays.

          c) Individual differences in visual processing:

          • The variability in susceptibility to the McCollough effect might parallel the variability in people's responses to LCD displays and dithering.

          Theories on how these insights might be applied:

          1. Personalized display calibration:

          • Understanding an individual's susceptibility to orientation-specific effects could help in developing personalized display settings to reduce eyestrain.

          2. Neural adaptation-based therapies:

          • The principles of the McCollough effect could potentially be used to develop visual therapies to help individuals adapt to problematic display characteristics.

          3. Improved display design:

          • Insights into how the visual system processes and adapts to fine patterns could inform the design of displays to minimize eyestrain and perceptual distortions.

          4. Diagnostic tools:

          • Tests based on principles similar to the McCollough effect could potentially help identify individuals who might be more susceptible to display-related visual discomfort.

          5. Understanding temporal aspects:

          • The long-lasting nature of the McCollough effect might provide insights into why some people experience persistent visual discomfort even after stopping screen use.

          It's important to note that these connections are largely theoretical at this point. However, they suggest promising avenues for research that could bridge our understanding of perceptual phenomena like the McCollough effect with practical applications in display technology and visual ergonomics.

          Further research in this area could involve:

          • Studying the neural correlates of display-related visual discomfort using neuroimaging techniques.

          • Investigating whether individuals who are more susceptible to the McCollough effect are also more likely to experience issues with LCD displays or dithering.

          • Exploring whether techniques that reduce susceptibility to the McCollough effect might also help in reducing display-related visual discomfort.

          These connections highlight the potential value of basic research into visual perception for addressing practical issues in human-computer interaction and visual health.

            async Could you edit this to move this into a "> details" collapsed block? I don't want some of the important findings above this post to be pushed up so far, especially by something that's just an AI response

            Thanks 🙂

            DisplaysShouldNotBeTVs I also noticed one effect on the beach where, after getting sun into one of my eyes for a long time, all black text on the iPhone turns almost solid red in that eye. Pretty weird effect. I would guess the strong sunlight causes bleaching and desensitizes some of the cones. That could also be a plausible hypothesis for when all red lights start to glow: heavy blue exposure causes a desensitizing effect, and the brain tries to jack up the saturation of everything to compensate.

            Are we confident that this colored-fringe 3D effect on text is distinct from subpixel antialiasing? The latter can also cause colored fringes on text.

              @waydabber Did you try Quartz Display Services to possibly bypass more of the image processing when streaming a display? As far as I can tell, when using a streaming display now it simply outputs it in a window, but it should be possible to take over the display.
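              For reference, this is roughly what "taking over the display" with Quartz Display Services looks like. Sketch only: whether the captured drawing context actually bypasses the later processing stages is exactly the open question.

              ```swift
              import CoreGraphics
              import Foundation

              // Sketch of "taking over the display" via Quartz Display Services:
              // capture the display, draw straight into its drawing context, then
              // release it. Needs screen-control permissions to run.
              let display = CGMainDisplayID()
              guard CGDisplayCapture(display) == .success else { fatalError("capture failed") }

              if let ctx = CGDisplayGetDrawingContext(display) {
                  // Fill the captured display with a flat mid-gray test field.
                  ctx.setFillColor(CGColor(gray: 0.5, alpha: 1.0))
                  ctx.fill(CGRect(x: 0, y: 0,
                                  width: CGDisplayPixelsWide(display),
                                  height: CGDisplayPixelsHigh(display)))
              }

              Thread.sleep(forTimeInterval: 5)   // look at the test field for a moment
              _ = CGDisplayRelease(display)
              ```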

              macsforme It's entirely different, because it affects images too. In addition, even if the text you're displaying is manually rendered (or drawn in an image editor) and only made out of sharp black pixels, AND ClearType or font smoothing is off, you will still see the red and blue fringing on affected devices.

              This fringing is definitely an additional effect, and not me simply seeing the physical RGB subpixels. This is because I also own a few older devices with screens (that happen to all be safe) where I can't see this effect at all —

              on those usable devices, I can still notice physical RGB subpixels if I look closely, but it is very mild and the edges don't look "more exaggerated than the rest" at all.

              On the bad devices, it looks like the fringing is extra full-size colored pixels that are visibly placed to the left and right of a filled shape, instead of colors only staying within the shape's pixels themselves.

                DisplaysShouldNotBeTVs Thought quite a lot about this.

                • There have been examples of removing the McCollough effect by doing the same induction with alternative colors, and I saw one note about untraining it by viewing the same types of patterns at an angle. If these types of adaptations are involved, then in theory it would be possible to create overlays, color shifts or other visual training that offsets it.
                • Imo more people should find the most reliable way to see if their symptoms turn on or off, and then do stupid experiments like putting the computer monitor at a 45 degree angle for a day. Then tilt it 45 degrees to the side. Or even do something properly hard like using it upside down for a few hours to force the brain to do some new adaptation. Or simply use a video wallpaper, or put a light that hits the screen, or add a big bezel. People can laugh about it, but there are tons and tons of ways to modulate things that no one has even thought about testing, and everything that provides some relief is a clue as to what needs to happen to fix it. For example, I get the same type of blinking effect on text when things are really bad that you get from viewing black-and-white striped patterns, but if I hold the iPhone light so it shines onto the screen near the text, it instantly goes away. As do all the glowing effects on text. So obviously I force some other pathway to become active.
                • I'm pondering the effect of dark mode and OLED / mini-LED screens. Usually you get a proper reaction and pupil constriction when things are too bright, but with this you can blast a pretty intense, focused amount of "white" light into your eyes. On top of this you get super sharp edges, and you get the unnatural combo of white text on a black background that doesn't really exist anywhere. So thinking about this McCollough effect, it isn't entirely unfathomable that this could force some adverse adaptation near other edges. My guess is that our visual system mostly attends to the least bright areas in our visual field, the same way we ignore the sky.
                • Also, dark mode is really problematic with astigmatism, or if there is any problem with prescription strength or chromatic aberrations.
                • Smoothing or Mac antialiasing might offset the chance of getting adverse adaptations from the edge detectors (McCollough effect), but it might also make the eyes think they need to refocus on high-DPI displays.
                • Viewing desaturated photos most likely causes adverse adjustments.
                • A gamma curve with more blue in blacks seems to work better for me. Adjustable in BetterDisplay (rough sketch below).
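                For reference, lifting the blue floor without BetterDisplay can be sketched with a Quartz gamma table; the 0.04 lift is an arbitrary example value.

                ```swift
                import CoreGraphics
                import Foundation

                // Sketch of the "more blue in blacks" idea without BetterDisplay:
                // upload a gamma table whose blue channel is lifted near zero. The
                // 0.04 lift is arbitrary, to tune by eye; the table reverts when
                // this process exits.
                let display = CGMainDisplayID()
                let n = 256
                var red   = [CGGammaValue](repeating: 0, count: n)
                var green = [CGGammaValue](repeating: 0, count: n)
                var blue  = [CGGammaValue](repeating: 0, count: n)

                for i in 0..<n {
                    let x = CGGammaValue(i) / CGGammaValue(n - 1)
                    red[i]   = x
                    green[i] = x
                    blue[i]  = 0.04 + 0.96 * x   // raise the blue floor, keep white unchanged
                }

                let err = CGSetDisplayTransferByTable(display, UInt32(n), &red, &green, &blue)
                precondition(err == .success, "CGSetDisplayTransferByTable failed: \(err)")
                print("Blue-lifted blacks active; press Ctrl-C to quit and restore.")
                RunLoop.main.run()
                ```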

                async Really nice find. I'm certain there are important things to be found here. The list of random shit they are doing instead of actually taking a grid of colored pixels and displaying them is just mind-blowing.

                It looks like these 3 people are involved with many papers relating to the false 3D effect (including the one I previously mentioned that was funded by Intel Labs — BTW, I just noticed that one of the authors of that paper is associated with NVIDIA)

                Profiles:

                https://neurotree.org/neurotree/publications.php?pid=153838

                https://neurotree.org/neurotree/publications.php?pid=1027

                https://neurotree.org/neurotree/publications.php?pid=15601

                Some of them are specifically related to stereoscopic displays, but there is a surprising number of papers on these pages that apply to traditional 2D displays as well (AKA relevant to us)

                It looks like the "modern" version of this effect may have started with this paper from November 2017:
                https://dl.acm.org/doi/pdf/10.1145/3130800.3130815 👀


                EDIT: Just found this paper with a ton of information about it

                https://theses.ncl.ac.uk/jspui/bitstream/10443/5772/1/Maydel%20F%20A.pdf

                Page 78:

                Other studies have proven that accommodative responses can be elicited by simulating the effects of LCA with the three primary colours of a screen

                a method to render simulated blur that incorporates the LCA of the eye and generates retinal images similar to those found for natural defocus. They showed that this method could be used to drive the accommodative response of observers at distances of up to 1.4 dioptres away from the screen, both when viewed through a pinhole and through a natural pupil

                Including more confirmation that it worsens image quality and can cause depth conflicts:

                These results indicate that the visual system uses LCA as an important cue to accommodation, even when it is in conflict with other cues such as defocus or microfluctuations, and when it is detrimental for overall retinal image quality (as accommodating away from the screen would worsen the defocus of the image).

                Page 90:

                presented images to participants that simulated positive or negative refractive errors of up to 1.4 dioptres by differentially blurring the primaries of the screen at luminance edges, as LCA would on a real scene. Their responses [to the simulated LCA] were as robust as those triggered by an actual change in the focal distance of the target.

                […However,] all other cues such as micro-fluctuations and higher order aberrations would be indicating to the visual system that no change in accommodation was required

                Page 92:

                observers would accommodate close to the peak of their luminous sensitivity. However, our results suggest that the visual system maintains this strategy when accommodating to mixtures of narrowband illuminants, even when it might lead to suboptimal image sharpness. This means that visual displays that use narrowband primaries, particularly those that are used at near distances from the eye, might not be ideal

                Page 153 is really interesting:

                Modern digital displays are increasingly using narrowband primaries such as lasers and Light Emitting Diodes (LEDs). This allows for a wider colour gamut to be shown as well as higher energy efficiency; however, it is not clear how this might affect our perception, and in particular, our ability to accommodate and keep the image in focus.

                considering wavelength for accommodative demand would be more relevant for visual displays that are used at nearer distances from the eye. It is important to note however, that we found large individual differences in this effect

                We hypothesised that observers could be either maximising contrast at lower spatial frequencies, even when this is detrimental to contrast at higher spatial frequencies and these higher frequencies are task relevant

                For practical applications, this means that mixtures of two narrowband illuminants [[i.e. red and blue]] are not optimal for maximising retinal image quality, particularly at high spatial frequencies.

                However, the author didn't seem to realize the importance of checking whether these techniques are already being used in the devices themselves that the studies were done on (I'm very sure at this point that they are)
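                Out of curiosity, here is a toy sketch of what the "differentially blurring the primaries" trick from those quotes might look like in CoreImage. The per-channel radii are made-up illustrative numbers, not values from the papers.

                ```swift
                import CoreImage
                import CoreImage.CIFilterBuiltins

                // Toy version of the "differentially blur the primaries" idea:
                // blur each RGB channel of an image by a different radius,
                // mimicking how longitudinal chromatic aberration defocuses
                // wavelengths unequally. Radii are illustrative only.
                func channel(_ image: CIImage, r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat) -> CIImage {
                    let m = CIFilter.colorMatrix()
                    m.inputImage = image
                    m.rVector = CIVector(x: r, y: 0, z: 0, w: 0)
                    m.gVector = CIVector(x: 0, y: g, z: 0, w: 0)
                    m.bVector = CIVector(x: 0, y: 0, z: b, w: 0)
                    m.aVector = CIVector(x: 0, y: 0, z: 0, w: a)
                    return m.outputImage!
                }

                func blurred(_ image: CIImage, radius: Float) -> CIImage {
                    let f = CIFilter.gaussianBlur()
                    f.inputImage = image
                    f.radius = radius
                    return f.outputImage!
                }

                func simulateLCA(_ input: CIImage) -> CIImage {
                    // Example: red defocused most, green least, as if accommodated
                    // to green. Only the green layer keeps alpha so the composite
                    // stays at alpha 1.
                    let red   = blurred(channel(input, r: 1, g: 0, b: 0, a: 0), radius: 2.0)
                    let green = blurred(channel(input, r: 0, g: 1, b: 0, a: 1), radius: 0.5)
                    let blue  = blurred(channel(input, r: 0, g: 0, b: 1, a: 0), radius: 1.2)

                    let addRG = CIFilter.additionCompositing()
                    addRG.inputImage = red
                    addRG.backgroundImage = green

                    let addRGB = CIFilter.additionCompositing()
                    addRGB.inputImage = blue
                    addRGB.backgroundImage = addRG.outputImage!
                    return addRGB.outputImage!
                }
                ```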

                  async

                  Just found two more interesting ones

                  1:

                  https://ijrat.org/downloads/Vol-2/may-2014/paper%20ID-25201456.pdf

                  Lots of technical details about the technique here + more example images

                  2:

                  https://cse3000-research-project.github.io/static/0a605a3e4f4f6388cec3388286bd0f9d/poster.pdf

                  https://repository.tudelft.nl/record/uuid:178a950e-32c3-4397-a014-5a53d740ae74

                  This is based on the 2011 Samsung one, although it's more basic since it's just a small implementation done by a student (which is why the color shifting is more noticeable). However, there are some more examples here.

                  Frustratingly, the section about "ethics" literally only talks about the ethics of someone artificially editing a photo, and NOT about the repercussions of these types of images on eyesight… 🤦‍♂️ 🤦‍♂️

                  DisplaysShouldNotBeTVs Woah. Nice insights. I went down a few rabbit holes with accommodation today. There is a wealth of information about how the visual system works. I tried to figure out why some spatial frequencies trigger flickering in migraine / VSS, and whether it can be trained away. I'm certain there are techniques to reverse some of the issues caused by the Apple screens. The best possible solution would be something that actually untrains whatever is causing the screen issues, so people don't have to fight every single screen, OS and bulb for all of eternity.

                  There are tons of places where rivalry can take place and cause issues: blue-yellow opponency, koniocellular vs. parvocellular pathways, different spatial frequencies. It is possible to shift things to other pathways with imagery, overlays, etc. I'm also wondering whether things like using equal amounts of red and green to make yellow tones and pure white cause issues.

                  There are also things that can shift how we view colors.

                  Effects of acute high intraocular pressure on red-green and blue-yellow cortical color responses in non-human primates - ScienceDirect

                  Also, Apple added support for capturing HDR screenshots/streams. It probably doesn't include all processing, but at least it might be usable for some types of diffing tools or overlays. They also deprecated something like 20 other methods and ways to capture the screen. It almost feels like it should be absolutely impossible to get the output right before it reaches the screen. I don't think there exists a single public tool that can capture with HDR. I might create a tool that measures potential rivalry, or overlays a diff of changes. Not sure if it will be useful without a capture card though.

                  Capture HDR content with ScreenCaptureKit - WWDC24 - Videos - Apple Developer
                  Capturing screen content in macOS | Apple Developer Documentation (sample project)
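                  From the session above, an HDR stream setup looks roughly like this (macOS 15+). I haven't verified the preset name against shipping headers, and actually receiving frames also requires attaching an SCStreamOutput, which is omitted here.

                  ```swift
                  import ScreenCaptureKit

                  // Sketch of an HDR display capture per the WWDC24 session above.
                  // Preset name taken from the session, not verified; frame delivery
                  // via SCStream.addStreamOutput is left out.
                  func startHDRCapture() async throws -> SCStream {
                      let content = try await SCShareableContent.current
                      guard let display = content.displays.first else {
                          throw NSError(domain: "capture", code: 1)
                      }
                      let filter = SCContentFilter(display: display,
                                                   excludingApplications: [],
                                                   exceptingWindows: [])

                      // HDR preset matched to the local display's current EDR headroom.
                      let config = SCStreamConfiguration(preset: .captureHDRStreamLocalDisplay)

                      let stream = SCStream(filter: filter, configuration: config, delegate: nil)
                      try await stream.startCapture()
                      return stream
                  }
                  ```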
