JTL wait, you're saying you caused yourself pain avoiding PWM? Also, let's say you hook up a PS4 or Xbox One to a PWM-free 8-bit panel, could it still have dithering? What about cable boxes and stuff? Is there any way to avoid anything that may be problematic, aside from not using anything at all?
A little confused
JTL wait, you're saying you caused yourself pain avoiding PWM?
No, as in not avoiding PWM and trying to ignore it.
JTL Also, let's say you hook up a PS4 or Xbox One to a PWM-free 8-bit panel, could it still have dithering?
Yes, in the video card.
JTL What about cable boxes and stuff?
Not really.
JTL Is there any way to avoid anything that may be problematic, aside from not using anything?
Maybe someone could build a DVI signal analyzer that goes between a device and the screen.
I made this crude diagram.
+---------------------------+
| Signal output from device |
+-------------+-------------+
              |
              v
+-------------+-------------+
|    DVI signal analyzer    |
+-------------+-------------+
              |
              v
+-------------+-------------+
|           Screen          |
+---------------------------+
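A software version of that idea, for what it's worth: if you have an HDMI/DVI capture card that shows up as an ordinary video device, you can display a static test image on the source and diff consecutive captured frames. A truly still source should produce identical frames, so repeated pixel-level changes point at temporal dithering. A rough sketch, assuming OpenCV and a capture device at index 0 (a hypothetical setup, not a tested tool):

# Crude software "signal analyzer": feed the device's output through an
# HDMI capture card, display a STATIC image on the source, then diff
# consecutive captured frames. A perfectly still source should produce
# identical frames; repeated pixel-level changes suggest temporal
# dithering. Device index 0 and the capture setup are assumptions.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # hypothetical HDMI capture device

ok, prev = cap.read()
if not ok:
    raise RuntimeError("could not read from capture device")

for i in range(120):  # watch roughly two seconds at 60 fps
    ok, frame = cap.read()
    if not ok:
        break
    # Count pixels that changed in any channel since the last frame.
    diff = cv2.absdiff(frame, prev)
    changed = np.count_nonzero(diff.max(axis=2))
    print(f"frame {i}: {changed} pixels changed")
    prev = frame

cap.release()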
JTL interesting... Is there any way to confirm whether or not gaming systems use temporal dithering? Is there a definitive way to game without any PWM/dithering, etc.? I'm guessing it would have to be a PC/laptop to ensure dithering isn't active?
Also, are there PWM/dithering-free cell phones? How about the Samsung Galaxy S2? I know it's PWM-free, but is it also dithering-free? Any tablets? I have a Windows 10 tablet (NuVision TM800W610L) that I can confirm passes the phone-camera PWM test, but I'm not sure about dithering.
What about cable boxes, or something like a Roku 3? Can they use dithering?
Any way to test all your devices for dithering?
I always thought that because plasmas had no backlight, they didn't use PWM, but maybe I am wrong. I thought the flicker on plasmas was actually the heavy, "coarse" PWM, since lowering the brightness on a plasma (on most plasmas this means lowering the contrast) usually decreases the flickering effect.
If you are really worried, cut the plasma out of your routine for a couple of weeks and see if your eyes improve.
Here is a great post I found from @Gurm about plasma dithering:
Gurm Plasmas use a much "coarser" temporal dithering, and they always have. A "grey box" will actually be a marching series of pixels, similar to old analogue TV snow, and this is actually perfectly OK for my eyes. But it would be foolish to assume it's ok for everyone else's eyes. There is no "individual pixel temporal dithering" that occurs on a Plasma but rather "field dithering" which may certainly trigger other people's eyestrain.
As for how I know that the Xbox One is still dithering on the plasma... well, the plasma is the BEST way to view the xbox one, but I can still only do so for short times. Since it's a Radeon chip, my assumption is FRC/Temporal. It's very hard to directly test for!
This entire thread is a goldmine for plasma technology!!! Very interesting.
http://www.avsforum.com/forum/40-oled-technology-flat-panels-general/1047145-480hz-sub-field-drive.html
Plasmas don't use "PWM", inasmuch as PWM means "Pulse Width Modulation", and that's not a relevant term unless you're talking about an LED or a fluorescent tube. Plasmas use millions of individual encapsulated glass vacuum tubes filled with a gas plasma and cause them to fluoresce by passing current across them. The amount of current determines the strength of fluorescence.
But to answer the OP's questions - Xbox One and PS4 use AMD/ATI chipsets which DO use dithering by default. It is quite noticeable on some TVs and painful on others. It's pretty ok on my plasma with the color output and input set properly, but really bad on some other TVs. The Switch uses an nVidia chip and is ok for me on the big screen but not ok on the small screen, which is the whole reason you would buy a Switch. The Wii U had a similar ATI chip; it wasn't quite as bad as the PS4 (the worst of the lot), but still not as good as the Wii (no pain at all), the Xbox 360 (no pain at all), or the PS3 (best display EVER for me).
Actually, I get confused too.
So far I have collected these three possible causes:
1. Color dithering
2. PWM
3. White point (color temperature)
No. 1 is controlled by the graphics driver (software).
No. 2 and No. 3 are controlled by the display manufacturer (hardware).
I don't know which one is the main cause and which is secondary. Maybe we should open a new discussion specifically to collect this information.
No, you're still confused.
Color dithering can be caused by display driver, output hardware, display hardware, or display software. There are many causes, but in every case it is intended to compensate for either reduced bandwidth or insufficiency in the display. Ironically, some of the nicest displays use the most aggressive dithering, totally missing the point thereof.
PWM is most often a function of the panel, yes.
White point is controlled by hardware and software, both source and destination.
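To make the "compensating for insufficiency" point concrete, here's a toy numpy sketch (mine, not anyone's actual driver code, with made-up numbers): a color that falls between two 6-bit panel levels can only be approximated by plain quantization, but dithering across many pixels (or frames) averages out to the requested value:

# Why dithering exists, in miniature: a requested value that falls
# between two panel levels is stuck at the nearest level with plain
# quantization, but dithering (noise before rounding) makes the
# AVERAGE land on the target. Pure numpy, illustrative values only.
import numpy as np

levels = 2 ** 6 - 1                     # a 6-bit panel: 64 levels
target = 0.503                          # requested shade between two levels

plain = np.round(target * levels) / levels
print("plain 6-bit value:", plain, "error:", abs(plain - target))

rng = np.random.default_rng(0)
noise = (rng.random(100_000) - 0.5) / levels          # +/- half a level
dithered = np.round((target + noise) * levels) / levels
print("dithered average :", dithered.mean(), "error:", abs(dithered.mean() - target))

The dithered version's average sits almost exactly on the requested shade, which is the whole trick: the per-pixel (or per-frame) values are wrong, but the aggregate is right.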
Gurm okay, interesting information. So I have a Panasonic ST50; would setting the bits to 36-bit mitigate the Xbox One dithering? Is there any way to turn off dithering on the PS4? You said the PS4 is the worst; do you think the PS4 Pro would differ? Is there any setup, be it PC/laptop or console, that would ensure dithering is removed, for a dither-free gaming experience?
Also, it would seem that just as PWM differs between plasma and LED, dithering also differs? I've been kinda concerned about plasmas because of people demonizing them, calling them heavy-dithering, heavy-PWM displays and saying they're the worst and that PWM-free, true 8-bit LED is the way to go. So it's just confusing.
Also, does dither become a factor when it comes to things like a cable box or a Roku 3? How would one watch movies, media, etc. without dither?
Also wondering: an Nvidia Shield tablet should be dithering-free, correct? The Nvidia Shield TV should also be dithering-free?
Sorry guys if I'm asking too many questions lol. But say an external device introduces dithering to a display; would it cause that temporal, flickering-style dithering even on a true 8-bit panel that doesn't use FRC?
Jeez, a lot of questions. Let me try to answer some:
Plasma "pwm" is not an issue because the phosphor persistence time on plasmas is so long that any modulation of the signal is incredibly mitigated if not impossible to see/measure. In fact, it's such a long persistence that there's an afterimage when you turn the panel off that lasts for a few seconds... On an LED it's starkly visible because the persistence on the light is very short.
Plasma "dithering" is a subfield dithering, it's VERY large and stark. It's disconcerting if you sit close, if you are further away you don't notice it. But it's literally an ordered pixel dance that cycles different colors through areas of constant color to provide shading/depth, sort of similar to old tube TV's, and definitely can be distracting if you don't like it. I don't mind it, and it helps color depth substantially. The "temporal dithering" on other units is done at the actual pixel level. A solid purple screen on a Plasma is ... purple. Maybe it's mostly purple with a bunch of green sort of floating through it because you asked for a deeper purple. A solid purple screen with temporal dithering is... every single pixel flickering back and forth from red to blue in unison (or a pattern) to simulate purple, and literally hurts to look at.
We know the XBox and PS4 dither because ... well, you can SEE it if you get close to the screen. It's actually visible. Temporal dithering looks like a shimmer where the color swims and won't sit still. You can easily see it on large screens. On small screens it can be tough because of pixel density.
The Switch might dither, to be honest. nVidia is nowhere near as safe as they used to be. Last-gen Tegra chipsets had ZERO temporal dithering. New ones might. I haven't tried it. The built-in screen has good and bad versions, I've played with a few, but I've never plugged one into my plasma to see how it looks when piped over HDMI.
And yes, of COURSE if a device applies dithering the monitor will show it, because the device doesn't say "hey, dither that image!"; it just presents a shifting series of pixels. The TV can't tell whether that's a moving image or a still image with dithering!
Listen, a true 8-bit, PWM-free LED... and that's super rare and super expensive, because DC (direct-current) dimming requires a LOT more electronics, so it's usually a LIE... would be nice, assuming the color output wasn't also painful. A lot of people here are sensitive to blue light in addition to the PWM. But the reality is that if the source is dithered - and all ATI chipsets dither regardless of whether the output device has 8-bit color or not - it won't help.
Temporal dithering was designed to solve a problem. The problem was (and still is) that most displays have a CRAP color gamut. They're mass-market produced, and designed to make colors "pop" so that the average consumer thinks they have a great display and buys it. Most output devices ASSUME that the display they are attached to can't render the colors they output properly... and dither just to make sure.
One last thought since I saw another question - specifying 30-bit color only helps if the ST50 can produce 30-bit color. Can it? I don't know offhand, it's a nice unit but...