One last thought, since I saw another question: specifying 30-bit color only helps if the ST50 can actually produce 30-bit color. Can it? I don't know offhand; it's a nice unit, but...
A little confused
Gurm So when the Xbox One is plugged into the ST50, it gives me the option to display at 24-bit ("8-bit"), 30-bit ("10-bit"), or 36-bit ("12-bit") color depth. What would be the best setting to use for the Xbox One and the ST50? Would the 36-bit setting mitigate the Xbox One's dithering? I'm assuming the ST50 can reproduce 30-bit and 36-bit color, because with the Xbox plugged into a monitor I don't get those options; it only offers 8-bit with the monitor.
About the Nintendo Switch: it uses a Tegra X1 chip with a GM20B GPU, which they say is a Maxwell-family Nvidia part. Do you suspect dithering?
Hypothetically, if you have a true 8-bit panel hooked up to a source outputting 8-bit, why would there still be dithering? ATI can have dithering enabled, but in the Xbox One's case, if you lock it to output 8-bit to a true 8-bit panel, doesn't that mean there's no need for dithering?
In the case of plasma, isn't dithering also produced by quickly alternating red and blue pixels to simulate purple? As for PWM, I know exactly what you mean by image persistence leaving an afterglow even after you turn the TV off. That was the case with an older Panasonic G20 plasma I had, but it never happens on the ST50. As panel tech got newer I believe they improved phosphor speeds to reduce those artifacts. Also, as you probably know, different manufacturers drive their subfields differently. For example, 2012 Panasonic plasmas employ a focused field drive of 2500Hz and up, as opposed to a 600Hz subfield drive. Could this be why some people perceive flicker on some plasmas? Do OLEDs handle dithering the same way LCDs do, i.e. flickering between red and blue for purple?
I know, I know, a lot of questions...
Thanks for the help Gurm!
JTL So, does that mean an AMD Quadro using 10-bit hooked up to a true 10-bit display would eliminate dithering? At least on the GPU side? And ditherig would take care of Intel?
So, does that mean an AMD Quadro using 10-bit hooked up to a true 10-bit display would eliminate dithering?
First of all, AMD video cards are their own brand; Quadro is an nVidia brand.
Second of all, in theory a native 8-bit display (most VA panels, some high-end IPS) SHOULD eliminate dithering. Unfortunately, in practice it can be hit or miss, such as with the Intel cards, as I'm sure you've read.
The ditherig.exe software should work for Intel. I don't know whether it's been tested on anything more than laptops with integrated displays, though.
And getting a 10-bit "workflow" working is very complicated; it's usually reserved for photographers and such.
http://nativedigital.co.uk/site/2014/02/achieving-10-bit-display/
So the big problem is that even when a display reports that it can produce 8-bit (or 10-bit) output, many cards/chipsets are "locked" in dithering mode. AMD/ATI is notorious for this. There used to be a registry hack to disable dithering but it looks to have been removed in later versions of the drivers, maybe? I don't know, as I haven't had an ATI card in a long time.
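For what it's worth, here's roughly the shape that old tweak took, as a minimal Python sketch and nothing more: the value names (TMDS_DisableDither and friends) and whether current AMD drivers still honor them are assumptions on my part, based on old forum reports, so treat it as illustrative rather than a recipe.

```python
# Illustrative sketch only. The value names below are assumed from old reports
# about ATI/AMD drivers; newer drivers may ignore them or not expose them at all.
# Needs to run as Administrator; back up the registry before touching anything.
import winreg

# {4d36e968-...} is the standard Windows "Display adapters" device class key.
DISPLAY_CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"
CANDIDATE_VALUES = ["TMDS_DisableDither", "DP_DisableDither", "LVDS_TMDS_DisableDither"]  # assumed names

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
    i = 0
    while True:
        try:
            sub = winreg.EnumKey(cls, i)  # "0000", "0001", ... one per adapter
        except OSError:
            break
        i += 1
        if not sub.isdigit():
            continue
        path = DISPLAY_CLASS + "\\" + sub
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                            winreg.KEY_READ | winreg.KEY_SET_VALUE) as adapter:
            for name in CANDIDATE_VALUES:
                # 1 = "disable dithering" is itself an assumption about these values.
                winreg.SetValueEx(adapter, name, 0, winreg.REG_DWORD, 1)
                print(f"Set {name}=1 under HKLM\\{path}")
```

A reboot (or at least a driver restart) would presumably be needed for anything to take effect, assuming the driver reads those values at all.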
Macbooks are also notorious for this. You can plug a Macbook into the nicest display known to man and it will still dither, no matter what, in all modes. Nothing to be done about it unless JTL is successful in figuring out his kext injector.
And no, plasma dithering is not achieved by rapid pixel flicker, but rather by SLOW pixel progression. This is annoying to some people, but you can literally chase the moving pixels around with your finger on a plasma, which indicates that they're cycling at rather less than 60hz. It's ... sort of ... a combination of temporal and spatial dithering. Like a moving spatial dither. To me it looks more "natural" and my eyes tolerate it very well. But to others it's crazy annoying. You're supposed to sit far enough away that you don't see it, though.
OLED dither is achieved the same way as LCD, yes.
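Since a few of these questions boil down to how dithering "fakes" a color the panel can't actually show, here's a toy Python sketch of temporal dithering, nothing GPU- or panel-specific: an 8-bit panel approximates a 10-bit level by flickering between the two nearest 8-bit codes so that the time average lands in between. Spatial dithering plays the same trick across neighboring pixels instead of across frames.

```python
# Toy illustration: fake a 10-bit level on an 8-bit panel by alternating between
# the two nearest 8-bit codes over successive frames (temporal dithering).
def temporal_dither(value_10bit: int, frames: int = 60) -> list[int]:
    target = value_10bit / 4.0            # ideal 8-bit level, may be fractional
    low, high = int(target), min(int(target) + 1, 255)
    frac = target - int(target)           # how often the higher code is needed
    out, err = [], 0.0
    for _ in range(frames):
        err += frac
        if err >= 1.0:                    # emit the higher code just often enough
            out.append(high)
            err -= 1.0
        else:
            out.append(low)
    return out

seq = temporal_dither(513)                # 10-bit 513 sits between 8-bit 128 and 129
print(seq[:12])                           # mostly 128 with an occasional 129
print(sum(seq) / len(seq))                # time average comes out to 128.25
```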
Last response (I hope I've hit all your points): I have no idea how the new nVidia chipset will perform on the Switch. I have used several Switches, but only in handheld mode. One was gorgeous and easy on my eyes, one was ... fine ... and one was slightly painful (like using an iPhone screen). Since all of them were running identical software, that leads me to suspect the panels are not the same. Perhaps they use several manufacturers as they attempt to meet demand? Or perhaps the ones I liked less were newer, and they've cut corners to meet demand? Either way, since at least one of the panels was OK for me, I would expect the output to my TV to also be OK. But I don't know for sure; Nintendo is odd. The Wii U has had its ups and downs for me, but the handheld unit is ALWAYS fine, albeit only 720p and, I suspect, with lousy display characteristics (slow refresh, bad viewing angle compared to the Switch).
For me, the Nintendo Switch works perfectly for my eyes, and the Xbox One S as well. No strain.
Plsnostrain Okay, cool. BTW, which display do you use to play Xbox, and do you know what your color depth is set to under display settings? Also, as for the Nintendo Switch, do you play it in handheld mode and on a display with no strain?
Gurm Nothing to be done about it unless JTL is successful in figuring out his kext injector.
I think a kext patcher would work better. It would need to be coded and reinstalled for each version of the kext though, and it would require SIP to be disabled.
Amulet Hotkey never figured out Intel GPUs either, and I don't see that they use the AAPL IORegistry keys.
The main problem right now is that this is my only computer; I need a second machine for kext debugging, so I can set breakpoints in the existing kext code and debug it in real time.
Looking into a new graphics card for the desktop (probably a Quadro, as I rarely game) and a new monitor, but that's another discussion entirely.
JTL I agree that it might be different for some Thunderbolt Display models. I'll have to look into that.
I play XBox One (original) on my LG Plasma - color depth doesn't seem to matter, and "regular" versus "PC" color gamut (the option on the XBox) changes the dither pattern on the TV but doesn't seem to do much else.
... actually now that I say that I've never really PLAYED with it set to "PC". I'll do that today at lunchtime and report back.
Sorry for the late answer. I use a Samsung monitor and did nothing with color depth. I've also used a Sony TV, playing Mario Kart and Zelda for hours on the Switch with no eyestrain. Same with the Xbox One S. The only time I noticed maybe a little strain was when I used the Xbox One S web browser, but that was nothing compared to the laptops with Intel HD graphics I have owned.
Gurm Well, the RGB range is only going to affect color space, i.e. 0-255 being full or 16-235 being limited. Most TVs only accept limited, and full is generally used on monitors, though some TVs do accept a full-range signal. For example, on my ST50 I can select full, but I have to go into the HDMI settings and change the TV's color space to full as well. If I leave it on the default, which for TVs is limited, I end up with black and white crush. The setting within the Xbox that I was referring to is color depth rather than color space, which is going to be either 8, 10, or 12 bit.
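To put numbers on the full vs. limited thing: limited range just squeezes 0-255 into 16-235, so if one end of the chain assumes the wrong range you get exactly the crush I described. A quick sketch of the standard video-levels math, nothing Xbox-specific:

```python
# Full-range (0-255) vs limited-range (16-235) RGB, and why a mismatch crushes.
def full_to_limited(v: int) -> int:
    return round(16 + v * 219 / 255)      # 0 -> 16, 255 -> 235

def limited_to_full(v: int) -> int:
    return round((v - 16) * 255 / 219)    # 16 -> 0, 235 -> 255

print(full_to_limited(0), full_to_limited(255))    # 16 235

# Mismatch: the source sends full-range black (0) but the TV treats 16 as black,
# so everything from 0 to 15 collapses into the same black -> black crush.
# Likewise full-range values above 235 all clip to peak white -> white crush.
```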