A little confused
And getting a 10-bit "workflow" working is very complicated, and usually reserved for photographers and the like.
http://nativedigital.co.uk/site/2014/02/achieving-10-bit-display/
So the big problem is that even when a display reports that it can produce 8-bit (or 10-bit) output, many cards/chipsets are "locked" in dithering mode. AMD/ATI is notorious for this. There used to be a registry hack to disable dithering, but it appears to have been removed in later driver versions, maybe? I don't know, as I haven't had an ATI card in a long time.
MacBooks are also notorious for this. You can plug a MacBook into the nicest display known to man and it will still dither, no matter what, in all modes. Nothing can be done about it unless JTL succeeds in figuring out his kext injector.
And no, plasma dithering is not achieved by rapid pixel flicker, but rather by SLOW pixel progression. This is annoying to some people; you can literally chase the moving pixels around with your finger on a plasma, which indicates that they're cycling at rather less than 60 Hz. It's sort of a combination of temporal and spatial dithering: like a moving spatial dither. To me it looks more "natural" and my eyes tolerate it very well, but to others it's crazy annoying. You're supposed to sit far enough away that you don't see it, though.
OLED dither is achieved the same way as LCD, yes.
Last response (I hope I've hit all your points) is that I have no idea how the new Nvidia chipset will perform on the Switch. I have used several Switches, but only in handheld mode. One was gorgeous and easy on my eyes, one was ... fine ... and one was slightly painful (like using an iPhone screen). Since all of them were running identical software, that leads me to suspect that the panels are not the same. Perhaps they use several manufacturers as they are attempting to meet demand? Or perhaps the ones that I liked less were newer, and they've cut corners to meet demand? Either way, since at least one of the panels was OK for me, I would expect that the output to my TV would also be OK. But I don't know for sure; Nintendo is odd. The Wii U has had its ups and downs for me, but the handheld unit is ALWAYS fine, albeit only 720p with what I suspect are lousy display characteristics (slow refresh, bad viewing angle compared to the Switch).
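To make the "moving spatial dither" idea above concrete, here is a toy sketch in Python. It is purely illustrative, not actual plasma controller logic: it uses an ordered (Bayer) dither whose threshold matrix drifts by one pixel every few frames, so the pattern crawls slowly across the screen rather than flickering each pixel at the full refresh rate. The matrix, drift speed, and level count are all my own assumptions.

```python
# Toy model of a slow "moving spatial dither": an ordered (Bayer) dither
# whose threshold matrix is shifted a little each frame, so the pattern
# crawls across the panel well below the 60 Hz refresh rate.
# All constants here are illustrative assumptions, not real hardware values.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_pixel(value, x, y, frame, levels=64):
    """Quantize an 8-bit value down to fewer levels, using a Bayer
    threshold that drifts by one pixel every 8 frames (the slow part)."""
    shift = frame // 8  # slow drift: far below the refresh rate
    threshold = BAYER_4X4[(y + shift) % 4][(x + shift) % 4] / 16.0
    step = 255 / (levels - 1)
    q = int(value / step + threshold)  # threshold decides round up/down
    return round(min(q, levels - 1) * step)

# One fixed pixel showing a mid-grey (129): as the pattern drifts past it,
# the pixel alternates between two nearby output levels, and the average
# over time approximates the true shade.
samples = [dither_pixel(129, 0, 0, frame) for frame in range(0, 64, 8)]
```

The slow drift is why you can "chase" the pattern with a finger: each pixel only changes state every several frames, instead of toggling every refresh like a fast temporal dither would.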
For me, the Nintendo Switch works perfectly for my eyes. And the Xbox One S as well. No strain.
Plsnostrain okay cool. BTW, which display do you use to play Xbox, and do you know what color depth you have set under display settings? Also, as for the Nintendo Switch, do you play it in handheld mode or on a display, and with no strain either way?
Gurm: "Nothing to be done about it unless JTL is successful in figuring out his kext injector."
I think a kext patcher would work better. It would need to be coded and reinstalled for each version of the kext though, and it would require SIP to be disabled.
Amulet Hotkey never figured out Intel GPUs either, and I don't see that they use the AAPL IORegistry keys.
The main problem right now is that this is my only computer; I need a second machine for kext debugging, so I can set breakpoints in the existing kext code and debug it in real time.
I'm looking into a new graphics card for my desktop (probably a Quadro, as I rarely game) and a new monitor, but that's another discussion entirely.
JTL I agree that it might be different for some Thunderbolt Display models. I'll have to look into that.
I play XBox One (original) on my LG Plasma - color depth doesn't seem to matter, and "regular" versus "PC" color gamut (the option on the XBox) changes the dither pattern on the TV but doesn't seem to do much else.
... actually now that I say that I've never really PLAYED with it set to "PC". I'll do that today at lunchtime and report back.
Sorry for the late answer. I use a Samsung monitor. I did nothing with color depth. I've also used a Sony TV, playing Mario Kart and Zelda for hours on the Switch with no eyestrain. Same with the Xbox One S. The only time I noticed maybe a little strain was when I used the Xbox One S web browser, but that was nothing compared to the laptops with Intel HD graphics I have owned.
Gurm well, the RGB range setting is only going to affect color range, i.e. 0-255 being full or 16-235 being limited. Most TVs only accept limited, and full is generally used on monitors, though some TVs do accept a full-range signal. For example, on my ST50 I can select full, but I also have to go into the HDMI settings and change my TV's color range to full. If I leave it on the default (limited, for TVs), I end up with black and white crush. The setting within the Xbox that I was referring to is color depth rather than color range, which is going to be either 8, 10, or 12 bit.
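For anyone wondering what the full vs limited mismatch actually does to the picture, here's a minimal sketch of the standard 8-bit range mapping (black at 16, white at 235). The function names are mine; the 16/235/219 numbers are the usual video-range levels.

```python
# Sketch of the full (0-255) vs limited (16-235) RGB range mapping.
# If the source outputs one range and the display expects the other,
# the mismatch is exactly what produces black crush / white crush.

def full_to_limited(v):
    """Map a full-range 8-bit value into the 16-235 limited range."""
    return 16 + round(v * 219 / 255)

def limited_to_full(v):
    """Inverse mapping; values outside 16-235 clip, i.e. 'crush'."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

full_to_limited(0), full_to_limited(255)  # black -> 16, white -> 235
```

So if the console sends limited and the TV treats it as full, black only reaches level 16 (washed out); the other way around, everything below 16 and above 235 clips, which is the black/white crush described above.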
Plsnostrain no problem, thanks for that info! If you don't mind, please share which TV and monitor models you use for gaming on those systems.
I have something very interesting for you guys. I ran some tests on a Samsung CFG70 monitor. It uses a VA panel and is, according to all reputable sources, supposedly a true 8-bit panel. The tests I ran for dithering are http://www.lagom.nl/lcd-test/gradient.php and https://nouveau.freedesktop.org/wiki/Dithering/
Now here's the interesting part. I ran those patterns on the PlayStation 4, PlayStation 3, Nintendo Switch, and Xbox One. All consoles except the Xbox One showed a smooth image; only the Xbox One showed banding. I opened these tests in each console's respective browser. My monitor was set to sRGB mode with default settings. The consoles were also configured with proper settings such as color space and color depth. The interesting thing is that the Xbox One specifically lets me pick 8-bit color depth and it still shows banding. I don't know if the CFG70 is not a true 8-bit panel, or the Xbox One is at fault, or the other consoles are using dithering. Thoughts?
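To illustrate what the lagom gradient test exposes, here's a rough model in Python, under my own simplifying assumptions: render an ideal 0-255 ramp, then quantize it the way a panel (or GPU path) that truncates to 6 bits without dithering would. Banding shows up as jumps larger than one level between adjacent columns.

```python
# Toy model of the gradient banding test: an ideal ramp vs the same ramp
# truncated to 6 bits with no dithering (an assumed worst-case panel).

def ramp(width=256):
    """Ideal horizontal gradient: one grey level per column."""
    return [round(x * 255 / (width - 1)) for x in range(width)]

def truncate_to_6bit(values):
    """Crude 6-bit panel: drop the two low bits, no dithering."""
    return [(v >> 2) << 2 for v in values]

def max_step(values):
    """Largest jump between neighbouring columns; >1 means visible bands."""
    return max(abs(a - b) for a, b in zip(values, values[1:]))

smooth = ramp()
banded = truncate_to_6bit(smooth)
# max_step(smooth) is 1, while max_step(banded) is 4: the visible bands.
```

A dithering console would hide those 4-level jumps by mixing neighbouring levels, which could explain why the other three consoles looked smooth on the same panel while the Xbox One didn't.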
I think you're at the mercy of the AMD/ATI chipset there. Also, MS rolls new Windows 10 updates out to the Xbox hardware, and we know they've screwed around with compositing. I too have noticed more banding lately in newer games, along with increased eyestrain, even though my Xbox and my TV are both set for wide color gamut and my panel accepts true 8-bit (it's an LG 60" Plasma, the last one they made...)