ShivaWind E-ink definitely requires dithering because it is a 1-bit display technology, something I remember well from the days of the black-and-white 9" Mac screens. I looked up your monitor; it says it has three dithering modes: A2, which is black and white only; A16, which attempts to simulate 16 shades of grey; and Floyd, which is their implementation of a Floyd-Steinberg algorithm intended for watching video. Nice thing to have, enjoy it.
Temporal Dithering Sensitivity - My Solution
This is a better video:
https://www.youtube.com/watch?v=0y-I3hqQgCQ
There is more to it than just the panel. There are colors that are fixed and colors that dither. The dithering pattern can look like Celtic knots or like moving dots. There is dithering that always moves, and dithering that only moves in proximity to a moving mouse cursor. I can say with certainty that the source of the dithering is not the Dasung panel, because the dithering pattern follows the mouse pointer in videos other people have uploaded of their computers to YouTube; the dithering was following their mouse, not mine.
I can also see the difference between makes of video card, and between when I have dithering enabled for the video card using xrandr and when I have it disabled. The primary source is upstream of the panel. I am sure that the panel does dither, but the patterns here are too big and sparse to be approximating greyscale; they are definitely upstream.
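For reference, the xrandr route only works when the driver exposes dithering as an output property, and the property name depends on the driver; as far as I know radeon exposes it as "dither" and nouveau as "dithering mode", so treat the names below as examples and check what --prop actually lists for your output:

xrandr --prop                                      # list outputs and any dithering-related properties
xrandr --output DVI-0 --set dither off             # radeon-style property name
xrandr --output DVI-0 --set "dithering mode" off   # nouveau-style property name

(DVI-0 is just a placeholder; use the output name from the --prop listing.)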
I rounded up some old CCFLs; they did not work for me.
JTL The Dell U3417W, which I have, is 8-bit + FRC = 10-bit: https://www.displayspecifications.com/en/model/4c9077e
When I use ditherig on a laptop connected to this monitor, banding is visible on the laptop screen but not on the external display.
https://www.dell.com/community/Monitors/10-bit-vs-8-bit/td-p/5158579
ryans In theory an external display should always be presented as 8-bit to the GPU and drivers, and the display would do any needed dithering on its own, but sadly that isn't the case, and many GPUs and GPU drivers do dithering even when presented with an 8-bit color depth from the display's EDID.
You're using 8-bit; you'd have to enable 10-bit manually, and many applications don't support it.
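On Linux/X11, for example, enabling 10-bit output usually means setting a 30-bit default depth in xorg.conf (assuming the GPU and driver actually support it); a minimal sketch:

Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 30
EndSection

Then restart X. Even then, applications still have to request a 10-bit visual themselves, which is why so few of them benefit.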
Came across this for possibly disabling dithering in Linux: http://philtechnicalblog.blogspot.co.uk/2012/05/temporal-dithering-good-for-colour.html
ryans It's been covered here before, but the settings for dithering in nvidia-settings
Looks like this:
(And no, it does nothing for the bad 9xx or 10xx cards.)
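For anyone on Linux who wants the command-line equivalent of that panel, something along these lines should work; the attribute names are from the NVIDIA driver documentation as I understand it, and DP-0 is just a placeholder for your connector:

nvidia-settings -q all | grep -i dither           # find the dithering attributes and the display targets they apply to
nvidia-settings -a "[DPY:DP-0]/Dithering=2"       # 0 = auto, 1 = enabled, 2 = disabled
nvidia-settings -a "[DPY:DP-0]/DitheringMode=0"   # 0 = auto, 1 = dynamic 2x2, 2 = static 2x2, 3 = temporal

As far as I know, settings assigned this way don't persist across restarts, so people usually put them in a startup script.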
ryans The hack that Amulet Hotkey made, is that to disable OS X dithering? My understanding is that OS X does dithering constantly.
Yes and yes.
They only give the hack out with their $600 video capture boxes though.
I'm not going to say much, but I have been doing research on how to do something similar for the past 6 months.
Email me sometime and I'll give you a TL;DR, since I'm not done yet.
JTL I see, this gets confusing because there can be multiple things that dither, including OS X, Windows 10 Anniversary Update, Intel Graphics, and other GPUs.
I don't see why dithering is enabled on OS X with no user option to turn it off. There's no reason to dither if the display is 8-bit.
I looked at the Amulet Hotkey PDF. If someone was able to obtain the .dmg, I think that'd disable dithering and we wouldn't have to use their video capture boxes.
A few years ago I filed a bug report with Apple Accessibility about an unrelated issue with an API call and it eventually got fixed. Maybe we can petition Apple to make dithering user customizable on OS X and iOS.
Does anyone know if Ubuntu does temporal dithering?
ryans I looked at the Amulet Hotkey PDF. If someone was able to obtain the .dmg, I think that'd disable dithering and we wouldn't have to use their video capture boxes.
I presume it checks for the box first. That being said, there are no public downloads of the .dmg.
ryans A few years ago I filed a bug report with Apple Accessibility about an unrelated issue with an API call and it eventually got fixed. Maybe we can petition Apple to make dithering user customizable on OS X and iOS.
How complex was the bug?
ryans Does anyone know if Ubuntu does temporal dithering?
It depends if the graphics driver itself dithers.
JTL How complex was the bug?
Not super complex. I'm not an OS kernel developer, but I think it was at most a month's worth of engineering time.
JTL It depends if the graphics driver itself dithers.
Sure, but our understanding is that OS X will dither regardless of whether it's an Intel, Nvidia, etc. graphics driver, because the code that does the dithering is in the OS X kernel and not in the graphics driver code?
Kray Here's a picture taken with a Canon 80D of a BenQ GW2760HS connected to a desktop with a Quadro K4000 with dithering disabled.
https://i.imgur.com/DjlwGdg.jpg
Even though it's a compressed image (so lower quality), try zooming in and out; you can see a similar pattern even on a static image.
JTL I emailed someone from a mailing list (he's a CS professor, so I assume he's pretty technical) about Intel graphics. This is his response:
What happened is that over time (say, maybe three months), my eyes stopped noticing the temporal dithering. I can sort of notice it if I pay close attention, but it doesn't cause me any more eyestrain. I was really surprised that this happened, and very grateful! This is in sharp contrast to (say) PWM flickering, which you never get used to. The other thing that can be done is to switch to Linux, because the Linux Intel drivers turn off temporal dithering (actually, you can configure it whichever way you want). Right now I'm using Mac computers (a laptop and a Mac Pro), both of which use temporal dithering. I also use F.Lux to cut down on the blue light, but that's a separate issue.
So he claims:
1. Linux drivers turn off temporal dithering? Really? Or is it a configuration option, which might explain why some sufferers are okay and others are not?
2. Maybe I should ask intel-gfx (mailing list about intel drivers) about it?
3. I wonder what dithering is like on a virtual machine? Let's say I disable dithering on Linux and run Ubuntu in VMware or VirtualBox. If that Windows 10 Anniversary Build nonsense is fixed, perhaps Windows would be usable. IIRC the VM would not use Intel graphics drivers.
ryans 1. Linux drivers turn off temporal dithering? Really? Or is it a configuration option, which might explain why some sufferers are okay and others are not?
It depends on the driver.
ryans 2. Maybe I should ask intel-gfx (mailing list about intel drivers) about it?
Good luck
ryans 3. I wonder what dithering is like on a virtual machine? Let's say I disable dithering on Linux and run Ubuntu in VMware or VirtualBox. If that Windows 10 Anniversary Build nonsense is fixed, perhaps Windows would be usable. IIRC the VM would not use Intel graphics drivers.
Unless you're using PCIe passthrough to pass a "real" GPU into the virtual machine, it will just use the emulated GPU hardware you configure in the virtual machine settings. Here's OS X running under a KVM hypervisor with a generic VESA graphics adapter; the image looks steady:
https://cdn.jtl.pw/osxvm/ (feel free to download and zoom into the individual pixels if you'd like)
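If anyone wants to double-check what the guest is actually using, a quick look from inside a Linux guest (assuming a standard Ubuntu install) is enough to confirm you're on an emulated adapter rather than the host's Intel/NVIDIA GPU:

lspci | grep -i vga
# the emulated adapter shows up here, e.g. a VMware SVGA II or QXL device, depending on the hypervisor and settings

The exact device string will differ, but the point is that the host's real GPU and its dithering behaviour never appear in the guest unless you deliberately pass the card through.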