A little confused
@Gurm I literally just walked up to my plasma and spent a few minutes staring at the pixels walking across the screen like you described. Thought it was kind of neat, no discomfort at all. Lots of interesting "effects" up close actually lol, not a clean display at all, yet it is completely comfortable.
So I'm still not understanding, guys lol... An Xbox One or PS4 set to output 8-bit, with games more than likely encoded in 8-bit, hooked up to a true 8-bit monitor: even with the Radeon chipsets forcing dithering, what is there to dither?
JTL, from what I understand dithering is used to approximate missing colors, correct? If the source is 8-bit, why would the chipsets in consoles force dithering? Even if it is on by default, there's nothing to dither, right? What is it going to do, dither an 8-bit source to 8-bit colors? Doesn't add up to me. If the source were 6-bit, then it would make sense that some sort of dithering is used to simulate higher bit depth. I don't think any modern games are encoded outside of 24-bit true color, aside from HDR-capable games, as that has been the standard for a long time.
The other source of temporal dithering can be a monitor that isn't true 8-bit being fed an 8-bit source.
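For what it's worth, here's a toy sketch in Python (purely my own illustration, not how any console or GPU driver actually implements it) of how temporal dithering fakes extra bit depth: a value that falls between two panel steps is shown as the lower and upper step on alternating frames, so the average over time lands on the in-between value. It also shows why an exact 8-bit value going to an 8-bit panel gives the dither nothing to do.

```python
# Toy illustration of temporal dithering (not any real GPU's algorithm).
# A target brightness with more precision than the panel is approximated by
# alternating between the two nearest panel levels over successive frames.

def temporally_dither(target, frames=4):
    """Return the 8-bit level shown on each frame for a fractional target.

    target: desired brightness on a 0-255 scale, e.g. 100.25 coming out of a
            10-bit or floating-point render pipeline.
    """
    low = int(target)           # nearest 8-bit step below the target
    high = min(low + 1, 255)    # nearest 8-bit step above the target
    fraction = target - low     # how far between the two steps we are
    shown, error = [], 0.0
    for _ in range(frames):
        error += fraction
        if error >= 0.5:        # show the brighter step just often enough
            shown.append(high)  # that the time-average lands on the target
            error -= 1.0
        else:
            shown.append(low)
    return shown

print(temporally_dither(100.25))  # [100, 101, 100, 100] -> averages to 100.25
print(temporally_dither(100.0))   # [100, 100, 100, 100] -> nothing to dither
```

So if the console really renders and outputs plain 8-bit values to an 8-bit panel, the dither has nothing to work with; it only matters when some stage of the pipeline carries more precision than the link or the panel.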
Found something interesting... If this is true and I'm understanding it correctly, the PS4 is truly 10-bit, so would a true 10-bit display disable dithering? http://www.edepot.com/playstation4.html
Edit: probably not accurate info on that site... The PS4 must be native 8-bit.
We are as confused as you. Why does the ATI chipset force dithering on 8-bit displays? Unknown. Why does the MacBook output force dithering on 8-bit displays? Unknown.
We are pretty confused by it, but it's easy to see. Get REALLY close to a solid color image (like the grey desktop background) on an OSX system, on ANY display, and you'll see the pixels shimmering. It's 100% on all the time and it's inexplicable.
Could it be possible to just use something like HDMI 1.2, an older HDMI version that doesn't even support sending higher color depths? Would that help?
Xbox 360 and PS3 were flawless for me in any display mode on any display. I am not sure what to expect from the Xbox One X; it might be great or it might be 10x worse. But I have a hunch it might be better, since it can do real 4K HDR and it can downsample higher-resolution, higher-framerate games better than the original Xbox One.
If temporal dithering is a problem... Couldn't we test it out by just setting the color on a display to zero, essentially rendering the screen black and white?
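On the "test it out" idea, another rough check (just a sketch of my own, with placeholder file names) is to capture two frames of a completely static picture, e.g. from a capture card or a camera locked on a tripod, and diff them; if pixels change while the picture doesn't, something is dithering temporally. Note that an ordinary software screenshot is grabbed before the GPU's output stage, so it can miss dithering applied there; a negative result from screenshots wouldn't prove much.

```python
# Rough sketch: compare two captures of a static image and count changed pixels.
# "frame1.png" / "frame2.png" are placeholders for two consecutive frames taken
# from a capture card or a fixed camera while the on-screen picture is static.
from PIL import Image
import numpy as np

frame_a = np.asarray(Image.open("frame1.png").convert("RGB")).astype(int)
frame_b = np.asarray(Image.open("frame2.png").convert("RGB")).astype(int)

diff = np.abs(frame_a - frame_b)                 # per-channel difference
changed = np.count_nonzero(diff.max(axis=2))     # pixels that differ in any channel

print(f"{changed} of {diff.shape[0] * diff.shape[1]} pixels changed")
print("largest per-channel change:", diff.max())
```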
Guys, what about this? https://arstechnica.com/gadgets/2017/09/theres-a-directional-120-hdmi-cable-that-actually-improves-your-picture-quality/
Maybe this cable can somehow help in smoothing out or masking any dithering? It would be cool if someone could test this out.
It'd be interesting to hear if anyone has experience with something like the Nvidia Shield Pro, which has the same Tegra X1 chip as the Nintendo Switch. It also allows streaming certain games from the cloud. Or something like the OUYA, which has a Tegra 3.
degen I haven't read this whole thread yet, but I wanted to share that everything posted here is very interesting. I own 4 plasma TVs and have had a PS4, PS2, Wii, and original Xbox, and for years I had a computer hooked up to one with multiple operating systems. I was so frustrated with LED just being too complicated. When I use a plasma I never have eye pain. No matter what operating system I had installed (Ubuntu/Lubuntu/Win XP/Win 7/Win 10/Mint), I never had issues. I have a BenQ GW2270 that I bought a month ago, and it seems that if I wear SCT orange glasses I can use it. I was playing a game on the computer a couple of weeks ago for six hours, and as long as I had the glasses on I was fine (Windows 10 was installed). The moment I take the orange glasses off it starts to get to me. LED is just insane. I want to buy more plasma TVs.