A little confused
Link I always thought that because plasmas have no backlight, they didn't use PWM, but maybe I am wrong. I thought the flicker on plasmas was actually the heavy, "coarse" PWM, since lowering the brightness on a plasma (on most plasmas this means lowering the contrast) usually decreases the flickering effect.
If you are really worried, cut the plasma out of your routine for a couple of weeks and see if your eyes improve.
Here is a great post I found from @Gurm about plasma dithering:
Gurm Plasmas use a much "coarser" temporal dithering, and they always have. A "grey box" will actually be a marching series of pixels, similar to old analogue TV snow, and this is actually perfectly OK for my eyes. But it would be foolish to assume it's OK for everyone else's eyes. There is no "individual pixel temporal dithering" on a plasma, but rather "field dithering", which may well trigger other people's eyestrain.
As for how I know that the Xbox One is still dithering on the plasma... well, the plasma is the BEST way to view the Xbox One, but I can still only do so for short times. Since it's a Radeon chip, my assumption is FRC/temporal dithering. It's very hard to directly test for!
This entire thread is a goldmine for plasma technology!!! Very interesting.
http://www.avsforum.com/forum/40-oled-technology-flat-panels-general/1047145-480hz-sub-field-drive.html
Plasmas don't use "PWM", inasmuch as PWM means "Pulse Width Modulation", and that's not a relevant term unless you're talking about an LED or a fluorescent tube. Plasmas use millions of individual encapsulated glass vacuum tubes filled with a gas plasma and cause them to fluoresce by passing current across them. The amount of current determines the strength of fluorescence.
But to answer the OP's questions - Xbox One and PS4 use AMD/ATI chipsets, which DO use dithering by default. It is quite noticeable on some TVs and painful on others. It's pretty OK on my plasma with the color output and input set properly, but really bad on some other TVs. The Switch uses an nVidia chip and is OK for me on the big screen but not OK on the small screen, which is the whole reason you'd buy a Switch in the first place. The Wii U had a similar ATI chip; it wasn't quite as bad as the PS4 (the worst of the lot), but still not as good as the Wii (no pain at all), the Xbox 360 (no pain at all), or the PS3 (best display EVER for me).
Actually I get confused too.
So far I have collected these three possible causes:
1. Color Dithering
2. PWM
3. White point (color temperature)
No. 1 is controlled by the graphics driver (software).
Nos. 2 and 3 are controlled by the display manufacturer (hardware).
I don't know which one is the main cause and which is secondary. Maybe we should open a new discussion specifically to collect this information.
No, you're still confused.
Color dithering can be caused by the display driver, the output hardware, the display hardware, or the display software. There are many causes, but in every case it is intended to compensate for either reduced bandwidth or an insufficiency in the display. Ironically, some of the nicest displays use the most aggressive dithering, totally missing the point thereof.
PWM is most often a function of the panel, yes.
White point is controlled by hardware and software, both source and destination.
Gurm Okay, interesting information. So I have a Panasonic ST50; would setting the output to 36-bit mitigate the Xbox One dithering? Is there any way to turn off dithering on the PS4? You said the PS4 is the worst; do you think the PS4 Pro would differ? Is there any setup, be it PC/laptop or console, that would ensure dithering is removed, to guarantee a dither-free gaming experience?
Also, it would seem that just as PWM differs between plasma and LED, dithering differs too? I've been kind of concerned about plasmas because of people demonizing them, calling plasma a heavy-dithering, heavy-PWM display and saying it's the worst, and that a PWM-free, true 8-bit LED is the way to go. So it's just confusing.
Also, does dithering become a factor when it comes to things like a cable box or a Roku 3? How would one watch movies, media, etc. without dither?
Also wondering: an Nvidia Shield Tablet should be dithering-free, correct? The Nvidia Shield TV should also be dithering-free?
Sorry guys if I'm asking too many questions, lol. But say an external device introduces dithering to a display; would it cause that temporal, flicker-like dithering even on a true 8-bit panel that doesn't use FRC?
Link Jeez, a lot of questions. Let me try to answer some:
Plasma "pwm" is not an issue because the phosphor persistence time on plasmas is so long that any modulation of the signal is incredibly mitigated if not impossible to see/measure. In fact, it's such a long persistence that there's an afterimage when you turn the panel off that lasts for a few seconds... On an LED it's starkly visible because the persistence on the light is very short.
Plasma "dithering" is a subfield dithering, it's VERY large and stark. It's disconcerting if you sit close, if you are further away you don't notice it. But it's literally an ordered pixel dance that cycles different colors through areas of constant color to provide shading/depth, sort of similar to old tube TV's, and definitely can be distracting if you don't like it. I don't mind it, and it helps color depth substantially. The "temporal dithering" on other units is done at the actual pixel level. A solid purple screen on a Plasma is ... purple. Maybe it's mostly purple with a bunch of green sort of floating through it because you asked for a deeper purple. A solid purple screen with temporal dithering is... every single pixel flickering back and forth from red to blue in unison (or a pattern) to simulate purple, and literally hurts to look at.
We know the Xbox and PS4 dither because... well, you can SEE it if you get close to the screen. It's actually visible. Temporal dithering looks like a shimmer where the color swims and won't sit still. You can easily see it on large screens. On small screens it can be tough because of pixel density.
The Switch might dither, to be honest. nVidia is nowhere near as safe as they used to be. Last-gen Tegra chipsets had ZERO temporal dithering. New ones might. I haven't tried it. The built-in screen has good and bad versions; I've played with a few, but I've never plugged one into my plasma to see how it looks when piped over HDMI.
And yes, of COURSE if a source device applies dithering the monitor will show it, because the device doesn't say "hey, dither that image!" - it just sends a shifting series of pixels. The TV can't tell whether that's a moving image or a still image with dithering!
Listen, a true 8-bit, PWM-free LED... and that's super rare and super expensive, because DC (direct current) dimming requires a LOT more electronics, so it's usually a LIE... would be nice, assuming the color output wasn't also painful. A lot of people here are sensitive to blue light in addition to the PWM. But the reality is that if the source is dithered - and all ATI chipsets are dithered regardless of whether the output device has 8-bit color or not - it won't help.
Temporal dithering was designed to solve a problem. The problem was (and still is) that most displays have a CRAP color gamut. They're mass-market produced, and designed to make colors "pop" so that the average consumer thinks they have a great display and buys it. Most output devices ASSUME that the display they are attached to can't render the colors they output properly... and dither just to make sure.
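To picture the problem it's solving, here's a hedged Python sketch of FRC-style temporal dithering: a source with more precision than the panel alternates a pixel between the two nearest panel codes so the average over a short frame cycle lands on an in-between level the panel can't show directly. The 10-bit-to-8-bit mapping and the 4-frame cycle are assumptions for illustration, not how any particular console or TV actually does it:

# Hedged sketch of FRC-style temporal dithering: approximate a 10-bit level on an
# 8-bit panel by alternating between the two nearest 8-bit codes over a short
# frame cycle. The 10-bit source, 8-bit panel, and 4-frame cycle are assumptions.

def frc_frames(value_10bit, cycle=4):
    """Return the 8-bit code shown on each of `cycle` frames for one pixel."""
    ideal = value_10bit / 4.0               # ideal 8-bit value, usually fractional
    low = int(ideal)
    high = min(low + 1, 255)                # clamp at the top of the 8-bit range
    frac = ideal - low                      # fraction of frames that show `high`
    n_high = round(frac * cycle)
    return [high] * n_high + [low] * (cycle - n_high)

target = 514                                # a 10-bit level with no exact 8-bit code
frames = frc_frames(target)
print("frames shown:", frames)                         # e.g. [129, 129, 128, 128]
print("time-average:", sum(frames) / len(frames))      # 128.5, i.e. 514 / 4
print("best static 8-bit code:", round(target / 4))    # stuck a half-step away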
One last thought, since I saw another question - specifying 30-bit color only helps if the ST50 can produce 30-bit color. Can it? I don't know offhand; it's a nice unit, but...
Gurm So with the Xbox One plugged into the ST50, it gives me the option to display at 24-bit ("8-bit"), 30-bit ("10-bit"), and 36-bit ("12-bit") color depth. What would be the best setting to use for the Xbox One and the ST50? Would using the 36-bit setting mitigate the Xbox One's dithering? I am assuming the TV can reproduce 30-bit and 36-bit color, since with the Xbox One plugged into a monitor it doesn't give me those options - it only offers 8-bit with the monitor.
About the Nintendo Switch: it uses a GM20B GPU in the Tegra X1 chip, which they say is a Maxwell-family Nvidia chip. Do you suspect dithering?
Hypothetically, if you have a true 8-bit panel hooked up to a source outputting 8-bit, why would there still be dithering? ATI can have dithering enabled, but in the Xbox One's case, say, if you lock it to output at 8 bits to a true 8-bit panel, doesn't that mean there's no need for dithering?
In the case of plasma, isn't dithering also displayed by quickly alternating red and blue pixels to simulate purple? As far as PWM goes, I know exactly what you mean by image persistence and it leaving an afterglow even after you turn off the TV. Such was the case with an older Panasonic G20 plasma I had, but that never happens on the ST50. As panel tech got newer, I believe they improved phosphor speeds to reduce other artifacts. Also, as you probably know, different manufacturers drive their subfields in different ways. For example, 2012 Panasonic plasmas employ a focused field drive of 2500 Hz and up, as opposed to a sub-field drive of 600 Hz. Could this be why some people perceive flicker on some plasmas? Do OLEDs handle dithering the same way LCDs do, as in flickering between red and blue for purple?
I know, I know, a lot of questions...
Thanks for the help Gurm!
JTL So, does that mean an AMD Quadro using 10-bit hooked up to a true 10-bit panel would eliminate dithering? At least on the GPU side? And ditherig would take care of Intel?
so, does that mean an AMD Quadro using 10-bit hooked up to a true 10-bit panel would eliminate dithering?
First, AMD video cards are their own brand; Quadro is an nVidia brand.
Second, in theory a native 8-bit display (most VA panels, some high-end IPS) SHOULD eliminate dithering. Unfortunately it can be hit or miss, such as with the Intel cards, as I'm sure you've read.
The ditherig.exe software should work for Intel. I don't know whether it's been tested on more than just laptops with integrated displays, though.