Sunspark Thanks a lot!
Sunspark
- 12 hours ago
- Joined Oct 18, 2015
- Edited
Sunspark I had one and ended up selling it. Something felt off with it even with the backlight off. When I did try it with the amber backlight on, after some days my vision felt gray/hazy and colors were washed out. Must be that I'm sensitive to the orange phosphor? My vision did improve after a week or so of not using it. The 120hz refresh rate on it was nice; I wish it would've worked for me. Another little gripe, which isn't a big deal, is that I wish they would use a nicer shell/body, since it doesn't really feel as premium as I would expect at the $729 price.
Dizzy it's 8+2 FRC. Use 8-bit color mode. The monitor is fake 10-bit.
https://www.displayspecifications.com/en/model/2ebc34c1
I’ve been making a huge pest of myself at local dealerships. I bring coffee shop gift cards. I’ve driven a bunch of cars multiple times, because like many others I do eventually get used to cars.
Here’s what I’ve found:
Alfa - their Giulia and Stelvio screens make my eyes a little tired but not lingering. I think I could get used to them. I’ve driven them repeatedly and may very well order a new one - they’re in my top three.
Acura - I find their displays (gauges, not infotainment) hard to look at. If I can’t focus on a screen it’s a non-starter.
BMW iDrive 9 - that’s the big curved display. It is hard to focus on so like the Acura, right out.
Volvo - infotainment gave me eyestrain for a week, gauges for a day. Then it passed.
BMW iDrive 7 - gauges are ok. Infotainment is hit or miss. Some better than others.
BMW iDrive 8 - eyestrain that builds. Hard pass.
Volkswagen/Audi - seems ok at first but messes my eyes up for hours afterwards.
Mercedes - actually ok. Surprising. I’m going back for a longer drive of a c300 which might be my next car!
Subaru - 100% ok, all models. The infotainment took a little getting used to but now it’s fine. Gauges fine. All fine.
Toyota - 100% ok.
Honda - same as Acura.
Lexus - same as Toyota.
- Edited
It's an image, and you will never find those fixtures online, because it's a relic of old-world technology only found in hard-to-reach places that hold on to the true light. But I opened one for you; it's a pretty simple circuit. I'm sure somebody can 3D print that, or not? idk. But seriously, you will never have to replace the bulbs you put in that socket again.
- Edited
When Intel drivers are set to spatial dithering on a 6-bit panel (NOT temporal!), and you scroll down on a page, even a "still" spatial dithering pattern can cause photos to slightly flicker as you scroll.
This is because the image "interweaves" with different parts of the pattern as the image moves downwards, causing the pattern to appear like it's alternating back and forth, even though the pattern itself "technically" doesn't change.
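The interweave effect can be sketched with a toy model. The checkerboard below is a hypothetical stand-in for the driver's real pattern (which is more complex); the point is only that the pattern is fixed in screen space, so a pixel of the image picks up different parts of the pattern as it scrolls through successive rows:

```python
# Toy model of a fixed screen-space spatial-dither pattern: +1 LSB where
# (row + col) is even. (Illustrative only; real driver patterns differ.)
def pattern(row, col):
    return 1 if (row + col) % 2 == 0 else 0

# Follow one image pixel (gray value 100) as the page scrolls down one
# screen row per frame. The pattern never changes, but the pixel's
# displayed value alternates as it "interweaves" with the pattern.
col = 0
values = [100 + pattern(row, col) for row in range(6)]
print(values)  # [101, 100, 101, 100, 101, 100] -> perceived flicker while scrolling
```

So even a perfectly static pattern produces per-pixel alternation from the viewer's perspective whenever the content moves.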
On most e-ink screens like BOOX, it's even "worse" as the pattern itself regenerates in a very obvious way whenever there is movement, even though it's still when there's no motion. (This might be a solvable problem in the future, as I've heard the Modos open source e-ink monitor project implemented a different dithering method that tries to keep the pattern still…)
However, there's an interesting situation (especially if the display is a TN) where I've observed the "checkerboard pattern" created by spatial dithering at times having the side effect of "accidentally increasing LCD polarity pixel inversion flicker" in that area -- depending on the pattern -- as the checkerboard sometimes lines up with the same kinds of conditions that cause LCD inversion to be visible.
This kind of "accidental pixel inversion" means a spatial dithering pattern can occasionally still increase flicker(!) even if the pattern is not supposed to "intentionally" be changing!
Finally, Intel's version of spatial dithering sometimes decides to slightly alter the hue of a pixel to increase precision further even on grayscale content (the reason why they do this is that adding a slight color can "perceptually affect brightness" in a more fine-grained way). This is distracting to me because sometimes different gray shades in a UI can subtly look reddish or bluish when they're all supposed to "just be gray".
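A rough illustration of why a slight tint buys extra precision: perceived brightness weights the channels unequally, so nudging just one channel by one step moves the perceived gray level by a fraction of a step. (The weights Intel actually uses aren't documented here; this sketch just uses the standard Rec. 709 luma formula.)

```python
# Rec. 709 luma: blue contributes far less to perceived brightness than
# green, so a 1-step nudge on blue is a sub-step brightness change.
def luma(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

gray = luma(100, 100, 100)     # a neutral gray
tinted = luma(100, 100, 101)   # blue channel nudged by 1 step
print(tinted - gray)           # ~0.0722 of a gray step, at the cost of
                               # a faint blue tint on the "gray" pixel
```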
If the panel is good, I agree that spatial dithering is generally fine in most cases and it doesn't cause that much of an issue (especially compared to temporal) but there are still reasons to disable it.
In my case, I prefer disabling it because I have my screen set to grayscale often and I get distracted by the subtle color tints on certain shades of gray.
The flickering while scrolling as a photo interweaves with the pattern also definitely becomes noticeable enough at times to become annoying (especially when I'm scrolling very slowly).
Abstract
A short description of where I was, and where I am now. 10 years ago I was unable to use smartphones, modern computer monitors, new TVs, graphics cards, or LED light bulbs without experiencing headaches or migraines that would persist for days. Now I am able to use all of these devices with a short adjustment period of several hours, or weeks at the very worst. The symptoms I experience during the adjustment period are manageable with basic painkillers, and often by simply taking regular screen breaks. After adjustment I am able to use the devices without issues unless there is some major change to the device or my environment, where re-adjustment time might be needed. There is still room for improvement, which I expect to continue; however, my issues with screens are now minor compared with where I was originally.
Background
A more general background to my medical history and experience with screens. Some types of display would consistently trigger migraines, typically IPS screens. Graphics cards and software would cause more conventional headaches, and OLED screens caused a painful pinching sensation in my eyes as well as conventional headaches and mild migraines.
I also experienced severe gastrointestinal issues, fevers occurring predominantly at night, muscle pain/inflammation serious enough to affect my kidney function after exercise, and an anxiety disorder.
All of the above has improved from being a serious issue to either disappearing or becoming manageable with minor interventions.
What I did and experienced
Around 6 years ago I noticed the muscle pains I experienced had become much less frequent and less severe. I noted that over the past two years I had shifted my diet from being very high in red meat (800 grams daily) to largely vegetarian, but still eating dairy daily with occasional red meat consumption. I did some searching and found that red meat / dairy contains an inflammatory molecule (Neu5Gc) linked to various diseases. With this knowledge, I cut out all dairy and red meat from my diet.
Within a few days my night fevers had entirely stopped, and my gastrointestinal symptoms had improved, confirming what I had read. What was entirely unexpected was the effect this dietary change appeared to have on my screen issues.
Below is a brief timeline of how my symptoms changed year by year:
· Year 1
o Much worse migraine pain
o Gastro, muscle pain, mental health improvement
· Year 2
o Lighter migraines compared to prior year
o Able to adapt to devices which caused minor issues
o Gastro, muscle pain, mental health improvement
· Years 3 - 4
o Migraines continue to become less painful
o Recovery time reduces
o Able to adapt to devices which caused minor issues much faster
o Able to adapt to devices which caused moderate symptoms
o Gastro, muscle pain, mental health improvement
· Years 5 – 6
o Migraines continue to become less painful, generally ‘silent’, i.e. without actual pain but with other migraine symptoms
o Recovery time reduces, in many cases trivial ~30mins to an hour
o Able to adapt to devices which caused major symptoms
o Gastro, muscle pain, mental health improvement
Science
Here I will outline the scientific framework which explains my experience above. There is a quick summary below showing each key link in the framework, with more detailed explanations with links to scientific papers underneath.
eat red meat / dairy ->
digested into small cell fragments ->
those red meat cell fragments used as spare parts in our human cells, creating hybrid cells (Neu5Gc) ->
hybrid cells are attacked by immune system ->
immune system lowers level of serotonin as part of immune response (among other things) ->
low serotonin results in a weak serotonergic neuron network in the brain ->
impairs the brain's ability to regulate activity ->
causes migraines/pain in response to flickering / unstable / blurry images
Neu5Gc
Neu5Gc is a cell component found in most mammals, but only in small amounts in humans. Human cells use a similar cell component, Neu5Ac; however, Neu5Gc from food is similar enough that it can be integrated into human cells. Unfortunately, because Neu5Gc is not human in origin, any cells which contain it may be attacked by the immune system, creating inflammation and other effects.
Sources for general overview of Neu5Gc
https://en.wikipedia.org/wiki/N-Glycolylneuraminic_acid
https://www.sciencedaily.com/releases/2016/10/161019160201.htm
Sources detailing human cell uptake of Neu5Gc
https://pmc.ncbi.nlm.nih.gov/articles/PMC218710/
Sources showing Neu5Gc causes inflammation / increases cancer risk
https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-020-01721-8
Autoimmunity
https://pmc.ncbi.nlm.nih.gov/articles/PMC4070528/
Response to chronic infection
The above shows that Neu5Gc can trigger an immune response. This means the immune system has mistaken the Neu5Gc-bearing human cells for an infection. When an infection occurs, one of the secondary responses after the immune system attacks is to reduce the availability of vitamins/minerals/nutrients which could fuel the infection. Part of this response is to reduce the amount of amino acids in our body; one such amino acid is L-tryptophan. This is important because this amino acid is used to create serotonin in our bodies and brains.
Source showing chronic infection lowers serotonin
https://pmc.ncbi.nlm.nih.gov/articles/PMC4527356/
Serotonin
This section will focus on the role serotonin plays in migraine, as it is well researched and documented. I assume many of these conclusions also apply to conventional headaches, but the link is less clear. Study of the role of serotonin is complex, with many contradictory findings. Some studies show that people who experience migraines have low serotonin:
https://pmc.ncbi.nlm.nih.gov/articles/PMC4117050/
https://pmc.ncbi.nlm.nih.gov/articles/PMC3452194/
Other studies show higher serotonin levels in migraineurs.
https://www.sciencedirect.com/science/article/pii/S2213158218300160
However, a common theme in the research is that the brain's response to serotonin generally does not work correctly in migraineurs.
https://thejournalofheadacheandpain.biomedcentral.com/articles/10.1186/s10194-016-0711-0
Specific aspects of serotonin function have been found to cause migraine pain, where higher levels of specific receptor activity cause greater pain. I believe this is why my migraine pain initially increased. I had higher levels of serotonin activating receptors, but as is explained below, had not yet developed the beneficial aspects of higher serotonin levels which prevent migraines.
https://pubmed.ncbi.nlm.nih.gov/2045831/
These papers describe how serotonin neuron/receptor networks control/regulate brain activity
https://www.nature.com/articles/1395346
These papers go further into how serotonin receptors control brain-wave frequency. Think of this as similar to the clock speed of a CPU: higher clock speeds speed up thinking and processing, but come at the cost of higher energy consumption. Like a CPU, we don’t want our brains to be running at a high frequency unless it is necessary, because of the harmful effects.
https://pmc.ncbi.nlm.nih.gov/articles/PMC3282112/
https://www.jneurosci.org/content/30/6/2211
These papers theorise that migraines are a protective response against the brain working too hard and damaging itself due to the build-up of toxic metabolic byproducts. This is analogous to how we experience muscle pain due to the build-up of lactic acid after exercise.
https://www.sciencedirect.com/science/article/abs/pii/S0306987716305035
https://orbi.uliege.be/bitstream/2268/247255/1/Gross%20et%20al%20Metabolic%20f
How does this link to screens? Flickering light induces a change in brain frequency / metabolic rate in response to the frequency itself. This can cause harmful effects.
https://pubmed.ncbi.nlm.nih.gov/11355381/
https://en.wikipedia.org/wiki/Flicker_vertigo
https://pmc.ncbi.nlm.nih.gov/articles/PMC8710722/
In migraineurs this effect is stronger. This means that people who experience migraines are more sensitive to the effects of screen flicker.
https://jov.arvojournals.org/article.aspx?articleid=2770320
To summarise so far:
Insufficient serotonin receptors prevent control of brain activity in response to flickering light sources,
This triggers a migraine to protect the brain,
If a migraine is not triggered, brain dysfunction could lead to a regular headache or other neurological symptoms.
This paper below is quite important. It shows in mice how low levels of serotonin (the neurotransmitter) results in lower levels of serotonin receptors and neurons (the bits of the brain that respond to the neurotransmitter). It also shows that these mice can recover when normal levels of serotonin return.
https://www.eneuro.org/content/4/2/ENEURO.0376-16.2017
Summary
A chronic immune system response to red meat/dairy lowered my serotonin levels, increasing sensitivity to flickering light, resulting in migraines and other symptoms. I experienced a very slow improvement; this slow response to dietary changes seems to be caused by two factors:
It takes a long time for some cells affected by neu5gc to be replaced, some types of cells can survive for decades.
The brain takes time to grow new cells
This results in slow improvement as cells turn over and my brain develops new neural pathways which respond to serotonin. The initial worsening of my migraines was caused by higher serotonin in the absence of stronger serotonin neuron networks, which take longer to develop.
How does this relate to other forum members' experiences? Many forum members, including myself, found some relief from eye patching. I hypothesize that eye patching reduces the brain's visual processing workload by correcting a visual defect; this lowers metabolic rate / activity, making screens less likely to trigger migraine or produce other symptoms of the brain working too hard.
Please do not read this as a fundamental explanation of our symptoms. We are complex machines and there may be other pathways which lead to our symptoms. I expect there are some long-term vegan forum members for whom this explanation cannot be correct. It may not even be the reason my symptoms improved, but it is the best explanation I have found.
Kind Regards,
Rob
(A picture of me receiving my doctorate in statistics & engineering)
- Edited
I'm late to the party, but I've suffered from eye strain/pain for several months. Tried different distros and different laptops with Intel and NVIDIA GPUs. No matter what settings I tried I could never fix my issue. So, I kept going back to Windows/WSL2.
Now, I'd like to report that my eye strain is finally gone! My solution was to buy a QD-OLED monitor (specifically, the HP Omen Transcend). But I assume any other QD-OLED monitor should work.
On this new monitor, everything looks crystal clear! No matter which laptop or Graphics card I use. Even my Raspberry Pi 400 looks clear.
It did take a few days for my eyes to adjust to the new monitor, but it was worth it!
While I was struggling with this, besides trying software/hardware configurations, I also had my eyes checked.
Now, I'm running Ubuntu 24.04 with the default setup, except I installed fonts-ubuntu-classic and nvidia-driver-550 (although the nouveau driver was working fine as well). My laptop does not have a DisplayPort, so I'm using USB-C (but HDMI also works fine).
In any case, I thought I'd share this hoping that it helps someone else!
- Edited
Been a long time since I tried desktop Linux due to previous bad experiences (both eye strain and app scarcity). Lately I moved my remote Linux server to NixOS. It's really something else. To the point that I decided to try a GUI version of it. The standard setup includes GNOME46. All of a sudden, it felt comfortable. Both Wayland and X11. Here are the system parameters:
Next, I installed Ubuntu on a different partition to test things out. Instantly felt the strain, just as before. It does use GNOME, but with its own modifications. I have no idea what that involves, but the difference between plain GNOME and the Ubuntu version is really felt.
Then I installed Debian on yet another partition. Just like NixOS, it comes with plain GNOME46. It is safe too.
This is good news for me because I feel insecure using Windows. I locked system updates through some app on the very first build of Windows 11 that I installed. I did it because I knew what happened to Windows 10, which after some point became absolutely unusable.
TL;DR
Plain GNOME46 @NixOS or Debian. Intel UHD 730. Linux kernel 6.6.56.
- Edited
Well yes, but actually no. The evaluation ISO is not the full version. You can use it for up to 360 days (90 days + 3 reactivations). You can't activate it and continue using it forever. You also can't upgrade it to the full version; you need to reinstall Windows.
Microsoft provides full versions of Windows LTSC on MVS, the M365 Admin Center, and the OEM Portal, but for those you need to pay a high subscription fee. There is no way Microsoft will give out hashes of the actual full versions of LTSC builds to the public unless you're a business/enterprise entity and bought at least five copies.
What this means is that you have to trust reputable forums like MyDigitalLife for hashes. The method is illustrated here.
- Edited
This also includes the alpha channel, which accounts for transparency, resulting in the RGBA format where each component takes up 8 bits, totaling 32 bits. Although this isn't always specified, in your case, the alpha channel is included in the total. No need to worry; you already have a 24-bit RGB signal.
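The arithmetic can be sketched in a few lines (the ARGB byte order below is an arbitrary choice for illustration; real pixel formats vary):

```python
# Pack one RGBA pixel into 32 bits: 8 bits per channel. The alpha byte
# sits alongside a full 24-bit RGB value, so "32-bit" here still means
# a true 24-bit RGB signal plus transparency.
r, g, b, a = 0x12, 0x34, 0x56, 0xFF
pixel = (a << 24) | (r << 16) | (g << 8) | b
print(hex(pixel))             # 0xff123456
print(hex(pixel & 0xFFFFFF))  # 0x123456 -> the 24-bit RGB part, untouched
```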
- Edited
Hello, so I decided to give it a go and ordered one of those hand-made monitors from spectrumview.com. The whole process was super easy, even using Bitcoin to purchase. I bought Bitcoin on the Phantom wallet app and then transferred it to the address mentioned on their site.
So my first thought is.. WOW.. the crate looks so professional! I'm used to getting monitors in cardboard boxes haha.
Now onto getting this thing setup!!
I am completely blown away by the craftsmanship and quality of this monitor. It far exceeds my expectations. This feels like furniture and not so much a monitor! The lightbox has intake and exhaust fans as well for keeping it well ventilated.
The laser engraving is suuuuuper cool. I love these small details. There's a lot of love put in this thing for sure.
The monitor itself can be placed in front of a window to illuminate the display with natural light. You can also use the lightbox with the included diffuser, which you insert behind the monitor, if you prefer to use the incandescent light. Here is a picture of the monitor with the incandescent lightbox dialed to max brightness.
Now for the spectrometer test of the light being emitted from the display with its multi-bulb incandescent lightbox. As you can see, these are VERY good results, as there's no blue-light peak as you would find on a normal LED-backlit display. I will attach a comparison of an LED-backlit display below this image.
LED-backlit laptop display:
Incandescent lightbox flicker test (perfect sine wave. GOOD)
Just wanted to show everyone that this monitor does exist! The quality is amazing and it's such a beautiful piece. The monitor came with additional spare light bulbs, but luckily this size of bulb is still commonly found, as they are the size of appliance bulbs, which are not banned. There are no cheap materials involved here. The power adapter is external and is a very nice Samsung adapter, not some cheap off-brand adapter. I am pleased to find it's external, as I heard internal power supplies could cause pixel inversion. The monitor is 4K VA, and I was told it is true 10-bit. The seller has stated he tested it for dithering and none was found under a microscope.
There is also a joystick menu control behind the chin of the monitor. The side switch on the lightbox turns it on/off, and the knob is an analog voltage dimmer for the AC light bulbs.
I have only unboxed and set this thing up. I have not tested it on my eyes yet, since I'm waiting for my eyes to unwind from a flare-up, plus I'm sick with covid, which has messed my vision up. I will post an update when I get around to testing this. I just really wanted to get some content out there about it. Feel free to ask any questions!
I also get eye strain from Android 14 phones. I tend to think that temporal dithering was implemented in either Android 13 or 14. Currently I use an old phone from 2018, which has Android 9 and never gets updated.
- Edited
I think that is a moiré pattern, common to screens.
- Edited
tl;dr disable temporal dithering on your M1/M2/M3 with Stillcolor.
I got a 16” M3 Max MBP about a month ago and it’s been absolute hell. By far the worst screen I’ve ever used. It’s like staring into lasers. 1600 nits! Previously I’d used a mid-2012 retina MBP for almost a decade, and briefly a 2019 16” Intel MBP (at reduced brightness). Those gave me no eyestrain, no dry eyes, no light sensitivity, no inability to focus.
So within the first 24 hours of getting the M3 Max I found LEDStrain and learned about PWM and temporal dithering, and so did my adventure begin of trying to make my new laptop usable for more than 1 hour a day.
At first I thought I could just connect it to an external monitor and adjust the color profile and brightness and everything will be fine. By the time I realized how futile those hacks were my 2-week return window was through.
I tried everything mentioned on this forum, including @NewDwarf boot-args, BetterDisplay with mirrored virtual displays, Iris, SwitchResX, etc. Disabled motion, transparency and True Tone, switched to sRGB, dimmed blue light, turned on Dark Mode, turned off Dark Mode. These measures helped a tiny bit, but my eyes still became severely fatigued after 1 hour of use (even on external monitors).
If you can’t measure it, you can’t fix it
I wanted to see what PWM and temporal dithering look like. I needed to quantify these things to determine 1) if they were the cause and 2) if any other display adjustments I made had an effect.
Detecting PWM
Used my phone camera in manual video mode with a really fast shutter speed (1/12000). Detected PWM on
- 16” M3 Max MBP (edit: thin bars, vertical/horizontal depending on the phone's shutter direction)
- 2020 iPad Pro (thin bars)
- iPhone 15 Pro (diagonal waves)
PWM-free:
- mid-2012 15” MBP with Retina display
- M2 MacBook Air (need to re-test)
- LG 32” 4K UltraFine (IPS)
- Samsung 32” G7 (IPS)
- BenQ 24” GL2450-b (TN)
Interestingly, some of these PWM-free displays which I’ve used for years were suddenly giving me severe eyestrain when connected to the M3 Max.
Detecting Temporal Dithering (aka FRC)
To visualize temporal dithering you can run a true video capture of your screen through ffmpeg, which can create a diff of each successive frame pair and output a new video of those diff frames.
- Install ffmpeg using MacPorts
sudo port install ffmpeg
- Transform your video input.mov in Terminal with this command:
ffmpeg -i input.mov -sws_flags full_chroma_int+bitexact+accurate_rnd -vf "format=gbrp,tblend=all_mode=grainextract,eq=contrast=-60" -c:v v210 -pix_fmt yuv422p10le diff.mov
This uses a filter called time blend (watch this crazy demo) which layers every frame on the frame preceding it using the grainextract blend mode (previously called difference128). It takes the difference of each RGB value in a pixel and adds 128, so if there’s no change from frame to frame, the output pixel should be RGB(128, 128, 128). We then adjust the contrast to make the difference more visible, and finally output a lossless video. I’m not an ffmpeg expert, but the above command does the job. To output a compressed mp4 use -c:v libx264 diff.mp4. This does the job of demonstrating dithering at a much lower file size. Not pixel perfect, but passable.
Using QuickTime screen recording
This doesn’t work. I analyzed a lot of screen recordings from the 2012 MBP and the M3 Max and came to the conclusion that the recordings are at least a step before the application of temporal dithering. Whatever looks like dithering here is likely an artifact of compression.
Using a video capture card
Encouraged by @Seagull's capture card thread, I got a Blackmagic Design UltraStudio Recorder 3G. It accepts HDMI input and connects to your Mac using Thunderbolt 3. It can capture QuickTime Uncompressed 10-bit RGB at 1080p60. This is more than enough for our needs. But you gotta be careful with uncompressed videos: 3-4 seconds clock in at ~2GB.
Here are the results from the devices I had access to:
- 15” 2012 MBP: no dithering
- M2 Mac mini: dithers
- M2 MacBook Air 13”: dithers
- 16” M3 Max: dithers
Sample video demonstrating dithering (lossy compression)
Dithering ON, M3 Max, Sonoma desktop, 1s capture @ 1080p60
- INPUT: input-M3MaxSonomaDesktop-1080p60-1s-dithering.mp4
- TIME BLEND: diff-M3MaxSonomaDesktop-1080p60-1s-dithering.mp4
Just viewing the time blend video is pure torture! Makes you acutely aware of the muscles behind your eyes. If anyone wants the uncompressed recordings I can upload those, but they’re heavy and look similar to the compressed ones.
Time blend videos with dithering disabled simply show a plain gray screen.
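For anyone who wants to check the arithmetic, the grainextract blend the earlier ffmpeg command relies on reduces (per channel, as I understand the filter) to out = clip(128 + previous - current). A pure-Python sketch:

```python
# grainextract (a.k.a. difference128) per channel:
# out = clip(128 + prev - cur, 0, 255).
# Identical frames yield flat RGB(128, 128, 128); any pixel that flickers
# between frames deviates from 128 and shows up in the diff video.
def grain_extract(prev, cur):
    return [max(0, min(255, 128 + p - c)) for p, c in zip(prev, cur)]

frame_a = [100, 200, 50]  # one scanline's gray values in frame N
frame_b = [101, 200, 50]  # same scanline in frame N+1; first pixel flickered
print(grain_extract(frame_a, frame_b))  # [127, 128, 128]
```

This is why a dithering-free capture diffs to a plain gray screen, while a dithered one is full of pixels bouncing around 128.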
Some findings about dithering:
- Dithering happens at the refresh rate; I tested up to 60Hz. If you set your display to 24Hz, pixels will flicker 24 times a second instead of 60 (obviously).
- If you play a 60fps video on a 60Hz display while dithering is enabled, you actually get no dithering in the video for the most part. There’s no time to dither.
Now that I knew my laptop applies temporal dithering, I knew it was the cause of my eyestrain, because the eyestrain was there even while using PWM-free displays, and it’s the only perceivable difference in output between my 2012 MBP and my M3 Max.
The hunt for a technical solution
I read all over this forum that it was impossible to disable dithering because the new Apple silicon GPUs are only capable of handling 10-bit color, and are always dithering no matter what. Seemed unbelievable and I was determined to find a solution with code.
I read this progress report on the Asahi Linux blog by marcan (they’re doing some incredibly hard and important work over there) about reverse engineering the M1 DCP (Display Coprocessor). They proved the CPU and DCP communicate back and forth. Messages like
IOMobileFramebufferAP::setDisplayRefreshProperties()
at the DCP interface were encouraging and hinted at an ability to configure the display. And over at their DCP tracer a
IOMobileFramebufferAP::enable_disable_dithering(unsigned int)
message was traced. All evidence that there are mechanisms in place to control dithering. The question now became how to send those messages.
In my rabbit-hole dive I also came across 2 important tools:
- ioreg, which shows your I/O Registry
- AllRez, which dumps all display info in macOS
So in the logs I saw a property called enableDither = Yes under a service called IOMobileFramebufferShim. All of this now seemed inter-related, and the puzzle pieces were falling into place.
Based on all of that, I figured a good starting point would be IOMobileFramebuffer, which iPhone Development Wiki describes as “a kernel extension for managing the screen framebuffer. It is controlled by the user-land framework IOMobileFramework.” On macOS it’s called IOMobileFramebuffer. It’s a private framework with not much literature on it. One way to examine it is to run it through a disassembler.
So I loaded
/System/Volumes/Preboot/Cryptexes/OS/System/Library/dyld/dyld_shared_cache_arm64e
into Hopper, selected IOMobileFramebuffer, started looking at the symbols, and found 2 interesting routines:
_IOMobileFramebufferEnableDisableDithering
_kern_EnableDisableDithering
My best guess was that _IOMobileFramebufferEnableDisableDithering is a wrapper around _kern_EnableDisableDithering, which does the real work. Or is it the other way around? _kern_EnableDisableDithering is a very simple function; the crux of it is this:
IOConnectCallScalarMethod(r0, 0x1e, &var_10, 0x1, 0x0, 0x0)
This calls selector 30 on the IOMobileFramebufferShim object with a boolean value. So I quickly made a command-line project in Xcode that does just that, and this is what I got:
(iokit/common) unsupported function
Uh oh! Disappointing. But I was not about to give up-- IOKit can return kIOReturnUnsupported for a variety of reasons: one, of course, being a complete lack of implementation, another being invalid or out-of-bounds arguments, or possibly a lack of privilege.
So I then loaded the IOMobileFramebuffer framework dynamically with the help of this gist and invoked IOMobileFramebufferEnableDisableDithering directly, and as expected got the same result.
Maybe calling it wasn’t allowed from user space? But I wanted to dig deeper before messing around in kernel space.
So then I stumbled upon IOConnectSetCFProperty and remembered enableDither = Yes from the registry. I thought I’d just set it directly on the IOMFB object and maybe it would interpret it and affect the display downstream. So I did that and got:
(iokit/common) unsupported function
Again. I was starting to lose hope at this point but in a last ditch attempt I thought I will just modify the I/O Registry directly, even though my understanding was that the registry was a lens into device state and modifying it from the top won’t affect the devices per se.
So I did just that.
kern_return_t ret = IORegistryEntrySetCFProperty(service, CFSTR("enableDither"), kCFBooleanFalse);
And got:
(os/kern) successful
Awesome! I ran
ioreg -lw0 | grep -i enableDither
to see if the registry was touched:
| | | | "enableDither" = No
| | | | "enableDither" = No
| | | | "enableDither" = No
| | | | "enableDither" = No
| | | | "enableDither" = No
I thought I was dreaming! I wanted to verify so I plugged my capture card, recorded 3 seconds and ran it through ffmpeg and I couldn’t believe it! Dithering was gone! It was that simple.
Seeing is believing
The below video is a capture of YouTube playing a 1080p60 video.
At first dithering is enabled, then at 00:03:59 dithering is disabled and watch for yourself.
Dithering ON then OFF, M3 Max, Sonoma, YouTube @ 60fps, 1080p60
- INPUT: input-M3MaxSonomaYouTube60fps-1080p60-DitheringOnThenOff.mp4
- TIME BLEND: diff-M3MaxSonomaYouTube60fps-1080p60-DitheringOnThenOff.mp4
- TIME BLEND: YouTube link with bad compression
It turns out modifying the I/O Registry is a common way to tweak driver and device settings and there’s even a command line tool for that. However, the device driver/kernel class must allow and implement the inherited IOService::setProperties method for it to work.
Over the next couple of days I verified it with a few more recordings, including ones where I disable dithering mid-recording and the effect was immediate. Best of all I didn’t need all this proof-- I could simply use my computer again and suffer minimal eyestrain.
The downside of this method is that enableDither is reset back to Yes on computer restart. There’s possibly a way to avoid this by modifying the driver’s plist, which might contain those properties, but that’s an exercise for another time. A simpler solution is Stillcolor, which I developed to disable dithering on login and whenever a new display is connected.
Introducing Stillcolor for macOS
Stillcolor is a lightweight menu bar app which simply disables dithering on login and whenever a new display connects. It's pretty much in beta at the moment and needs an M1/M2/M3 Mac and macOS >= 13 to run. I've only tested it on macOS 14, so I would appreciate feedback from everyone here.
Please bear in mind that there could be unintended consequences from disabling dithering, so use the app at your own risk. The app is released under the MIT license.
(For some reason Chrome shows a suspicious download blocked warning. I don't know why it does that; you can safely ignore it.)
Make sure to enable “Launch at login”
To check whether it did the job, run the following in Terminal:
ioreg -lw0 | grep -i enableDither
You should see one or more "enableDither" = No entries. To re-enable dithering, simply uncheck "Disable Dithering."
A visual test that works for me is the Lagom LCD Gradient (banding) test
Set your built-in display's color profile to sRGB at full brightness and look carefully at the gray areas: you should be able to see subtle banding appear in real time when you disable dithering.
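If you'd rather generate a test pattern yourself than rely on the Lagom page, a plain horizontal gray ramp works on the same principle: with dithering disabled, an 8-bit ramp stretched across the screen shows discrete steps that dithering would otherwise smooth over. A minimal sketch (writing the array out to an image file is left to whatever library you prefer):

```python
import numpy as np

def gray_ramp(width=1024, height=128):
    """Horizontal 8-bit grayscale ramp from black (0) to white (255)."""
    row = np.linspace(0, 255, width).astype(np.uint8)
    return np.tile(row, (height, 1))

ramp = gray_ramp()
print(ramp.shape)               # (128, 1024)
print(ramp[0, 0], ramp[0, -1])  # 0 255
```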
Disabling dithering, alongside other measures such as reducing brightness and blue light, will make using Macs enjoyable again. I think the built-in displays are still awful, and I recommend using an external monitor you've previously been comfortable with.
Credits
Special thanks to my brother Ibraheem for letting me test on his Mac and display!
- Edited
I tried the Vision Pro!
Here is a comment I posted a few days ago on 9to5mac:
I tried a friend's Vision Pro and I can definitely say this is much less of an issue than I worried. I have issues with binocular vision and get eye strain with some types of screens in laptops/phones like OLED iPhones, so I was extremely surprised how comfortable VP is, maybe more than some "typical" screens. Not disoriented, felt great after 1.5hrs+ use!
I can 100% assure you that passthrough, although low-res, is depth-correct. Objects across the room look far away, as they normally would. (Something to note is that I did not see warping at all. The weird distortion people mention about Quest's passthrough is not there on VP.)
Your other comment about if it can create "differing ranges of focus" is actually true. Can be close, medium, far objects from both passthrough and VP's UI all within one scene. They appeared at the correct unique distances for me, able to shift focus between distances pretty easily.
The thing about eye strain with close-up objects is that it's less about the actual distance of the object and more about the way the eyes move to look at it. VP's lenses allow the "close" screens to stay clear while the eyes are diverged, as they usually would be when looking far away, so the eyes remain relaxed and are more likely to focus farther than when using a phone. Having two unique screens (one per eye) actually helped me "not focus wrong"!
The only eye strain came from the Mindfulness app: it blacks out passthrough, so there's no "naturally moving" feed to compare the depth of app graphics against. That was the only time I felt "too close," like an "OLED taped to my face."
Based on my experience constantly browsing the web and trying a few apps for 1.5 hours…
I am actually very optimistic about Vision Pro!!
Disclaimer:
I'm typically sensitive to PWM, temporal dithering, and flickering LED/fluorescent lightbulbs. I do not have seizures or migraines at all, which means lights that are visibly flashing, like at a concert, are actually fine to me. However, my issue is when light "invisibly" flickers, such as PWM.
That's when I get extreme fatigue, eye strain, brain fog, reduced field of vision, trouble concentrating, and trouble focusing and moving my eyes correctly.
Yet all of those symptoms basically stop impacting me when I use "true zero flicker" setups, like Waveform lights, old dither-free graphics drivers, and FRC-free + DC dimmable monitors.
In the case of dithering, my sensitivity depends on the content. I seem to be OK with temporal dithering in media that has lots of motion anyway, like movies or fast paced action games. But when temporal dithering is applied to static content like on most MacBooks and newer GPUs, I cannot stand it, it completely prevents me from understanding information-dense text or feeling relaxed.
If you have more severe symptoms than I do, or also have issues with visibly flashing lights at concerts in addition to invisible flicker like PWM, my experience might not be as helpful or applicable.
A few more Vision Pro notes:
Screen resolution is comparable to looking at a good projector: it's okay, but not as sharp as reviews might lead you to think. It's not low-res, but not quite "retina" either. (Yes, my friend who owns the Vision Pro made sure to reset eye and head calibration and redo setup for me.)
Surprisingly, I did not notice any flicker in my peripheral vision from foveated rendering!
Despite using PWM OLED it did not feel like looking at iPhone OLED! This is probably because there is a dedicated screen per eye, as @Maxx was theorizing, so my eyes seemed to more intuitively know how to focus and I didn't get "the typical PWM strain".
BTW, Vision Pro's subpixel layout is decently closer to RGB, which is great because I vastly prefer RGB to the typical, uneven Diamond PenTile layout in OLED smartphones.
Not sure if temporal dithering is used in the visionOS UI, but I highly suspect it is (because Apple ColorSync…) What's great though is that the constantly moving background video feed actually did help reduce potential dithering strain. Since I'm always seeing natural motion in the background in sync with any tiny head movements, it's easier for my brain to relatively understand that still elements on the same screen should be processed as actually still.
Of course, a true 3D display like this prevents the issue that some bad 2D screens have where they "appear to have depth when they actually shouldn't". This screen actually does have depth, so flat objects appeared properly flat.
Temporal dithering is present in the screen stream when using Mac Virtual Display, almost hilariously so. The background of dark windows is obviously changing every split second or so, and text is literally twitching around at a magnitude that even someone with no sensitivity would notice LOL.
The funny thing is that I actually preferred this "so obvious it made me laugh" dithering to the usual macOS dithering, because it's NOT trying to convince my brain that it's static at all, and thus what my brain processes stays in sync with what my eyes see. Native visionOS apps don't seem to have this obvious flicker.
I hope that if Apple "fixes" this, they do it by removing dithering from Mac Virtual Display entirely, not by trying to make it "invisible"
Here is the absolute best feature about the Vision Pro:
Eye control. With default settings, the primary way to select what to click on Vision Pro is entirely through moving your eyes. I thought this would be a dealbreaker for me, because "bad screens" typically have a huge impact on my eye movement ability, but I am so happy to report that it is the opposite and almost therapeutic.
I tried desktop versions of websites with lots of tiny links, and even tried navigating around a complex web app (the web version of Visual Studio Code). Shockingly, I was totally able to click the right thing 90% of the time just by moving my eyes.
Because you have to move and focus your eyes correctly to even be able to click the thing you want, if you are using it and successfully navigating around, that was possible because your eyes did the right thing.
This means every time you interact with a UI element on the Vision Pro, it is literally training your eyes how to move better, and more in sync with your intentions.
I think this control method is THE reason why I had such a surprisingly comfortable Vision Pro experience, despite the use of PWM and (possibly) dithering.
You can also select letters on the virtual keyboard with just your eyes, which actually felt like a cool way to "exercise" darting my eyes around to precise locations. (I thought typing with my eyes like this would make me dizzy, but it surprisingly didn't!)
I actually came out feeling like I could control my eyes in the real world better, which is a really good sign that my theory about the Vision Pro "training my eyes" might actually be accurate. This is in contrast to typical "bad screens," which usually make even reality look worse for a few hours after using them.
I came out of the Vision Pro feeling pretty great with only some slight strain.
Of course, a custom "0 PWM, 0 FRC, 0 Dither" laptop or desktop setup definitely can still be more comfortable in the end and can offer wayyy more sharpness and clarity…
But the Vision Pro actually outclassed many "typical, unmodified modern displays" for me in comfort (including the rest of Apple's current lineup, LOL), which is awesome and something I didn't expect at all.
Just don't use the Mindfulness app! (or any other app that similarly blacks out the background)
Back when YouTube changed their video renderer, making it trigger my symptoms, I wrote a desktop app to play YouTube videos through PotPlayer so I could mess with the video controls and try to find a pain-free display setup.
All it took was dropping the saturation by 50% (from 50 to 25) and all YouTube videos became pain-free. I've been watching daily for months with zero problems.
So whatever setup you have try dropping the saturation by 1/2 and see if it helps. I strongly believe that flicker/PWM/dithering has nothing to do with many of our issues and the problem is actually color temperature and rendering.
- Edited
NPR's Short Wave podcast has reported on the awful 60 Hz flicker used by many holiday lights this year. This may be one of the first mainstream-media reports of flicker from LED lights harming people's health, other than isolated anecdotes from individual people.
https://www.npr.org/2023/12/22/1198908957/led-lights-flicker-headache
Yes, that is currently a big problem. The companies have finally started to care about PWM, but they don't care much about remaining flicker. It is as if they think that no one would be affected by smaller flicker. There is little effort to produce truly constant light output. They label their products as "flicker-free" when there is measurable ripple that still causes eye strain and headaches. The current flicker standards don't help, as their thresholds of both flicker frequency and flicker percentages are set way too low. Progress in this area is unbelievably slow.
For example, I see it on the current TV I'm using as a monitor: 22 kHz, < 1%* ripple, usable. But when I switch inputs, the backlight readjusts itself to 22 kHz, 3-5%* ripple. That's too much already - symptoms within seconds. The current flicker standards consider these values "safe" by a large margin. The research data the standards are based on must be completely wrong. It seems they didn't ask persons that are sensitive enough.
*(Flicker percentages only roughly measured, as "(a - b) / a", a formula which produces higher values than those found in the standards papers.)

Another problem is that some LED backlights (or even LED room lighting, while we're at it) may take several minutes to "warm up", during which they flicker much more than later on. If you come across a flicker review, it is never clear whether they took this into account. The problem here is that symptoms can be caused instantly and persist for hours. A safe LED (back)light needs instant, stable light output.
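For comparison, the standards papers typically define percent flicker (modulation depth) as (max − min)/(max + min), which is always smaller than the rough (max − min)/max estimate from the footnote above. A quick numeric check of the two formulas:

```python
def flicker_rough(peak, trough):
    """Rough estimate from the footnote: (a - b) / a."""
    return (peak - trough) / peak

def percent_flicker(peak, trough):
    """Modulation depth as typically defined in flicker standards:
    (max - min) / (max + min)."""
    return (peak - trough) / (peak + trough)

a, b = 1.0, 0.9  # peak and trough light output
print(round(flicker_rough(a, b), 4))    # 0.1    -> 10% by the rough formula
print(round(percent_flicker(a, b), 4))  # 0.0526 -> ~5.3% by the standards formula
```

So a reading of "10%" with the rough formula corresponds to roughly 5% percent flicker in standards terms, which is consistent with the footnote's note that the rough formula produces higher values.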
If anyone wants to measure with equipment, have a look at our oscilloscope thread: https://ledstrain.org/d/312-homemade-oscilloscope-to-detect-pwm-diy-guide
Try 20H2, it's been the best for me.