Sorry guys if I'm asking too many questions lol. But say an external device introduces dithering to a display: would it cause that temporal dithering flicker even on a true 8-bit panel that doesn't use FRC?
A little confused
Link Jeez a lot of questions. Let me try to answer some:
Plasma "pwm" is not an issue because the phosphor persistence time on plasmas is so long that any modulation of the signal is incredibly mitigated if not impossible to see/measure. In fact, it's such a long persistence that there's an afterimage when you turn the panel off that lasts for a few seconds... On an LED it's starkly visible because the persistence on the light is very short.
Plasma "dithering" is a subfield dithering, it's VERY large and stark. It's disconcerting if you sit close, if you are further away you don't notice it. But it's literally an ordered pixel dance that cycles different colors through areas of constant color to provide shading/depth, sort of similar to old tube TV's, and definitely can be distracting if you don't like it. I don't mind it, and it helps color depth substantially. The "temporal dithering" on other units is done at the actual pixel level. A solid purple screen on a Plasma is ... purple. Maybe it's mostly purple with a bunch of green sort of floating through it because you asked for a deeper purple. A solid purple screen with temporal dithering is... every single pixel flickering back and forth from red to blue in unison (or a pattern) to simulate purple, and literally hurts to look at.
We know the Xbox and PS4 dither because... well, you can SEE it if you get close to the screen. It's actually visible. Temporal dithering looks like a shimmer where the color swims and won't sit still. You can easily see it on large screens; on small screens it can be tough to spot because of the pixel density.
The Switch might dither, to be honest. nVidia is nowhere near as safe as they used to be. Last-gen Tegra chipsets had ZERO temporal dithering; new ones might. I haven't tried it. The built-in screen has good and bad versions; I've played with a few, but I've never plugged one into my plasma to see how it looks when piped over HDMI.
And yes, of COURSE if a source device dithers, the monitor will show it, because the device doesn't say "hey, dither that image!"; it just presents a shifting series of pixels. The TV can't tell whether that's a moving image or a still image with dithering!
Listen, a true 8-bit, PWM-free LED display... and that's super rare and super expensive, because DC (direct current) dimming requires a LOT more electronics, so the claim is usually a LIE... would be nice, assuming the color output wasn't also painful. A lot of people here are sensitive to blue light in addition to the PWM. But the reality is that if the source is dithered (and all ATI chipsets dither regardless of whether the output device has 8-bit color or not), it won't help.
Temporal dithering was designed to solve a problem. The problem was (and still is) that most displays have a CRAP color gamut. They're mass-market produced, and designed to make colors "pop" so that the average consumer thinks they have a great display and buys it. Most output devices ASSUME that the display they are attached to can't render the colors they output properly... and dither just to make sure.
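The bit-depth side of that is easy to put numbers on. A hypothetical worked example (the divide-by-4 mapping is a simplification of the real conversion): to show a 10-bit code an 8-bit panel can't hit exactly, the source flips between the two nearest 8-bit codes in the right ratio:

```
# Suppose the source wants 10-bit code 514 (out of 0..1023). On an 8-bit
# panel the nearest displayable codes are 128 and 129 (514 / 4 = 128.5).
target = 514 / 4                   # 10-bit code expressed in 8-bit units
lo, hi = int(target), int(target) + 1
duty_hi = target - lo              # fraction of frames that must show `hi`

print(f"show {hi} on {duty_hi:.0%} of frames and {lo} on the rest")
# -> show 129 on 50% of frames and 128 on the rest. The eye averages the
#    alternation to ~128.5, i.e. the 10-bit value; that alternation is
#    exactly the frame-to-frame flicker people here notice.
```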
One last thought, since I saw another question: specifying 30-bit color only helps if the ST50 can actually produce 30-bit color. Can it? I don't know offhand; it's a nice unit, but...
Gurm So with the Xbox One plugged in to the ST50, the X1 gives me the option to display at 24-bit ("8-bit"), 30-bit ("10-bit"), and 36-bit ("12-bit") color depth. What would be the best setting to use for the Xbox One and ST50? Would using the 36-bit setting mitigate the Xbox One's dithering? I am assuming the ST50 can reproduce 30-bit and 36-bit color, since with the X1 plugged into a monitor it doesn't give me those options; it only offers 8-bit with the monitor.
About the Nintendo Switch: it uses a GM20B GPU on the Tegra X1 chip, which they say is a Maxwell-family nVidia part. Do you suspect dithering?
Hypothetically, if you have a true 8-bit panel hooked up to a source outputting at 8-bit, why would there still be dithering? ATI can have dithering enabled, but in the Xbox One's case, if you lock it to output 8-bit to a true 8-bit panel, doesn't that mean there's no need for dithering?
In the case of plasma, isn't dithering also displayed by quickly alternating red and blue pixels to simulate purple? As for PWM, I know exactly what you mean by image persistence leaving an afterglow even after you turn off the TV. Such was the case with an older Panasonic G20 plasma I had, but that never happens on the ST50. As panel tech got newer, I believe they improved phosphor speeds to reduce other artifacts. Also, as you probably know, different manufacturers drive their subfields differently. For example, 2012 Panasonic plasmas employ a Focused Field Drive of 2500Hz and up, as opposed to a subfield drive of 600Hz. Could this be why some people perceive flicker on some plasmas? Do OLEDs handle dithering the same way LCDs do, as in flickering between red and blue for purple?
I know I know a lot of questions.....
Thanks for the help Gurm!
JTL So, does that mean an AMD Quadro using 10-bit hooked up to a true 10-bit panel would eliminate dithering? At least on the GPU side? And ditherig would take care of Intel?
so, does that mean an AMD Quadro using 10-bit hooked up to a true 10-bit panel would eliminate dithering?
First off, AMD video cards are their own brand; Quadro is an nVidia brand.
Second of all, in theory a native 8-bit display (most VA panels, some high-end IPS) SHOULD eliminate the need for dithering. Unfortunately, in practice it's hit or miss, such as with the Intel chipsets, as I'm sure you've read.
The ditherig.exe software should work for Intel. I don't know whether it's been tested on more than just laptops with integrated displays, though.
And getting a 10-bit "workflow" working is very complicated, and usually reserved for photographers and such.
http://nativedigital.co.uk/site/2014/02/achieving-10-bit-display/
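On the application side, the first hurdle is even getting a 30-bit framebuffer. A minimal Python/GLFW sketch of the request (it is only a request: the GPU, driver, cable, and monitor all have to cooperate, and consumer drivers often silently fall back to 8 bpc):

```
# Sketch of requesting a 30-bit (10 bits per channel) framebuffer with
# GLFW. Whether you actually get it depends on the GPU, driver, cable,
# and display; professional drivers are typically required for 10 bpc
# in windowed OpenGL.
import glfw

if not glfw.init():
    raise RuntimeError("GLFW init failed")

glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10 bpc test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Could not create a window with the requested format")

glfw.make_context_current(window)
# At this point you would query the actual framebuffer bit depths via
# OpenGL (e.g. glGetFramebufferAttachmentParameteriv) to confirm whether
# the driver honored the 10-bit request rather than quietly giving 8 bpc.
glfw.terminate()
```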
So the big problem is that even when a display reports that it can accept 8-bit (or 10-bit) input, many cards/chipsets are "locked" into dithering mode. AMD/ATI is notorious for this. There used to be a registry hack to disable dithering, but it looks to have been removed in later driver versions, maybe? I don't know, as I haven't had an ATI card in a long time.
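For the record, that hack boiled down to writing DWORD values under the display adapter's subkeys of the Windows display class key. A rough Python sketch of that kind of tweak; the value name here is only the one commonly reported in old AMD/ATI threads and is an assumption, and newer drivers may ignore it entirely:

```
# Sketch of the kind of registry hack being described: writing a DWORD
# under each display-adapter subkey of the display class GUID. Windows
# only, run elevated. The value name is an assumption taken from old
# AMD/ATI driver hacks and may not exist on current drivers.
import winreg

CLASS_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"
VALUE_NAME = "DP_DisableDither"   # assumption: name reported in old hacks

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as class_key:
    index = 0
    while True:
        try:
            sub = winreg.EnumKey(class_key, index)   # "0000", "0001", ...
        except OSError:
            break
        index += 1
        try:
            with winreg.OpenKey(class_key, sub, 0, winreg.KEY_SET_VALUE) as adapter:
                winreg.SetValueEx(adapter, VALUE_NAME, 0, winreg.REG_DWORD, 1)
                print(f"set {VALUE_NAME}=1 under {sub}")
        except PermissionError:
            print(f"no permission for {sub}; run elevated")
```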
MacBooks are also notorious for this. You can plug a MacBook into the nicest display known to man and it will still dither, no matter what, in all modes. Nothing to be done about it unless JTL is successful in figuring out his kext injector.
And no, plasma dithering is not achieved by rapid pixel flicker, but rather by SLOW pixel progression. This is annoying to some people, but you can literally chase the moving pixels around with your finger on a plasma, which indicates they're cycling at rather less than 60Hz. It's... sort of... a combination of temporal and spatial dithering. Like a moving spatial dither. To me it looks more "natural" and my eyes tolerate it very well, but to others it's crazy annoying. You're supposed to sit far enough away that you don't see it, though.
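A rough way to picture a "moving spatial dither" (purely illustrative; the real plasma subfield driving is proprietary and far more involved) is an ordered dither whose threshold matrix slowly slides across the image from frame to frame:

```
import numpy as np

# 4x4 Bayer threshold matrix, normalized to 0..1.
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def moving_ordered_dither(gray, frame, shift_every=10):
    """Threshold a grayscale image (values 0..1) against a Bayer matrix
    that is shifted by one pixel every `shift_every` frames, so the
    dither pattern slowly crawls across areas of flat color instead of
    flickering in place."""
    offset = frame // shift_every
    h, w = gray.shape
    ys = (np.arange(h)[:, None] + offset) % 4
    xs = (np.arange(w)[None, :] + offset) % 4
    return (gray > BAYER4[ys, xs]).astype(float)

# A flat mid-gray field: the on/off pattern drifts slowly over frames,
# which is why you can "chase the pixels with your finger".
flat = np.full((8, 8), 0.4)
print(moving_ordered_dither(flat, frame=0))
print(moving_ordered_dither(flat, frame=10))
```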
OLED dither is achieved the same way as LCD, yes.
Last response (I hope I've hit all your points) is that I have no idea how the new nVidia chipset will perform on the Switch. I have used several Switches, but only in handheld mode. One was gorgeous and easy on my eyes, one was... fine... and one was slightly painful (like using an iPhone screen). Since all of them were running identical software, that leads me to suspect the panels are not the same. Perhaps they use several manufacturers as they try to meet demand? Or perhaps the ones I liked less were newer, and they've cut corners to meet demand? Either way, since at least one of the panels was OK for me, I would expect the output to my TV to also be OK. But I don't know for sure; Nintendo is odd. The Wii U has had its ups and downs for me, but the handheld unit is ALWAYS fine, albeit only 720p with what I suspect are lousy display characteristics (slow refresh, bad viewing angle compared to the Switch).
For me, the Nintendo Switch works perfectly for my eyes, and the Xbox One S as well. No strain.
Plsnostrain Okay, cool. BTW, which display do you use to play Xbox, and do you know what your color depth under display settings is set to? Also, as for the Nintendo Switch, do you play it in handheld mode and on a display with no strain?
Gurm Nothing to be done about it unless JTL is successful in figuring out his kext injector.
I think a kext patcher would work better. It would need to be re-coded and reinstalled for each version of the kext, though, and it would require SIP to be disabled.
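In rough terms, a patcher like that comes down to finding a known byte sequence in the kext's binary and replacing it, then rebuilding the kext cache and rebooting. A minimal Python sketch with entirely hypothetical paths and byte patterns (the real offsets would come from disassembling the specific kext version, which is exactly why it breaks on every update):

```
# Minimal sketch of a byte-patch over a kext binary. The path and byte
# patterns below are placeholders, NOT real offsets; you would identify
# the real pattern in a disassembler first. Applying the patch also
# requires SIP to be disabled and the kext cache rebuilt (e.g. with
# kextcache) before reboot.
from pathlib import Path

KEXT_BINARY = Path("/path/to/SomeGraphicsKext.kext/Contents/MacOS/SomeGraphicsKext")
OLD_BYTES = bytes.fromhex("deadbeef")   # hypothetical pattern enabling dithering
NEW_BYTES = bytes.fromhex("dead0000")   # hypothetical patched pattern (same length)

data = KEXT_BINARY.read_bytes()
if OLD_BYTES not in data:
    raise SystemExit("pattern not found; wrong kext version, redo the analysis")

patched = data.replace(OLD_BYTES, NEW_BYTES, 1)
KEXT_BINARY.with_suffix(".patched").write_bytes(patched)
print("wrote patched binary; reinstall it, rebuild the kext cache, and reboot")
```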
Amulet Hotkey never figured out Intel GPUs either, and I don't see that they use the AAPL IORegistry keys.
The main problem right now is that this is my only computer, and I need a second machine I can use for kext debugging, so I can set breakpoints in the existing kext code and debug it in real time.
Looking into a new graphics card for the desktop (probably a Quadro, as I rarely game) and a new monitor, but that's another discussion entirely.