Link It could be any number of things: color balance changes, compositing changes, subpixel color rendering changes, brightness changes, contrast changes, dithering changes, etc. There's no way to know.

@Gurm I literally just walked up to my plasma and spent a few minutes staring at the pixels walking across the screen like you described. Thought it was kind of neat, no discomfort at all. Lots of interesting "effects" up close actually lol — not a clean display at all, and yet it is completely comfortable.

So I'm still not understanding, guys lol... The Xbox One or PS4 is set to output 8-bit, the games are more than likely encoded at 8-bit, and it's hooked up to a true 8-bit monitor. Even with Radeon chipsets forcing dithering, what is there to dither?

  • JTL replied to this.

    Link what is there to dither

    The forced dithering comes from the Radeon chipset itself, and it can happen even in a fully 8-bit color signal chain.

    • Link replied to this.

      JTL From what I understand, dithering is used to simulate missing colors, correct? If the source is 8-bit, why would the chipsets in consoles force dithering? Even if it's on by default, there's nothing to dither, right? What would it do — dither an 8-bit source to 8-bit colors? That doesn't add up to me. If the source were 6-bit, then it would make sense to use some sort of dithering to simulate a higher bit depth. I don't think any modern games are encoded outside of 24-bit true color, aside from HDR-capable games, as that has been the standard for a long time.

      The other source of temporal dither can be a monitor that isn't true 8-bit being fed an 8-bit source.
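To make the 6-bit case above concrete, here is a minimal sketch of the idea (illustrative logic only, assumed for the example — not any vendor's actual algorithm): temporal dithering approximates an 8-bit value on a 6-bit panel by alternating between the two nearest 6-bit levels across frames, so the time-averaged output lands near the original value.

```python
def temporal_dither_6bit(value_8bit, frame):
    """Return the 6-bit level shown on a given frame for an 8-bit input."""
    level = value_8bit / 4           # exact position on the 6-bit scale (0..63.75)
    lo = int(level)                  # lower of the two nearest 6-bit levels
    frac = level - lo                # how far toward the upper level we are
    # Show the upper level on a fraction of frames proportional to `frac`.
    # (A simple 4-frame pattern; real hardware uses more elaborate patterns.)
    return min(lo + (1 if (frame % 4) < round(frac * 4) else 0), 63)

value = 130                          # an 8-bit input that has no exact 6-bit match
frames = [temporal_dither_6bit(value, f) for f in range(4)]
avg_8bit = sum(frames) / len(frames) * 4   # time-averaged value, back on the 8-bit scale
```

Averaged over the 4-frame cycle, the panel flickers between levels 32 and 33 so the eye perceives roughly 130/255 — which is exactly the per-pixel flicker people in this thread report seeing up close.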

      • Gurm replied to this.

        Found something interesting... If this is true and I'm understanding it correctly, the PS4 is truly 10-bit — so would a true 10-bit display disable dithering? http://www.edepot.com/playstation4.html

        Edit: probably not accurate info on that site. The PS4 must be native 8-bit.

        Link We are as confused as you. Why does the ATI chipset force dithering on 8-bit displays? Unknown. Why does the MacBook output force dithering on 8-bit displays? Unknown.

        We are pretty confused by it, but it's easy to see. Get REALLY close to a solid-color image (like the grey desktop background) on an OS X system, on ANY display, and you'll see the pixels shimmering. It's on 100% of the time, and it's inexplicable.

        • Link replied to this.

          Could it be possible to just use an older HDMI version, like HDMI 1.2, that doesn't even support sending higher color depths? Would that help?

          Gurm BTW, you were fine with the Xbox 360 and PS3, right? No dithering?

          Did you ever try the PS4 or Xbox One with a true 8-bit monitor to see if it makes a difference?

          And do you expect the newer Xbox One X to not have these issues?

          • Gurm replied to this.

            Link The Xbox 360 and PS3 were flawless for me in any display mode on any display. I'm not sure what to expect from the Xbox One X — it might be great or it might be ten times worse. But I have a hunch it might be better, since it can do real 4K HDR and it can downsample higher-resolution, higher-framerate games better than the original Xbox One.

            • Link replied to this.

              JTL This is going to be our winning angle: tagging along with photography (or maybe VR), where there are very specific signal-chain / display requirements.

                Gurm Do you intend to test the Xbox One X?

                reaganry Isn't everything right now specific to 8-bit? As long as you have a true 8-bit display?

                If temporal dithering is the problem... couldn't we test for it by just setting the color saturation on a display to zero, essentially rendering the screen in black and white?
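Another way to test for it, sketched below with synthetic frame data (a hedged illustration — real testing would need actual lossless captures): grab two consecutive frames of a completely static scene and diff them pixel by pixel. A dither-free pipeline should produce identical frames, while temporal dithering shows up as individual pixels flickering by a least-significant bit or so.

```python
def frames_differ(frame_a, frame_b):
    """Return the number of pixels that changed between two captured frames."""
    return sum(1 for a, b in zip(frame_a, frame_b) if a != b)

# Synthetic stand-ins for two captures of a static grey screen:
static_frame   = [128] * 16                           # no dithering: identical frames
dithered_frame = [128 + (i % 2) for i in range(16)]   # simulated +/-1 LSB flicker

no_dither = frames_differ(static_frame, static_frame)
dithering = frames_differ(static_frame, dithered_frame)
```

A nonzero diff on a static scene would indicate the output is changing frame to frame — i.e., temporal dithering somewhere in the chain.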

                • JTL replied to this.

                  Link I only see such a thing causing problems. There isn't a way to mask dithering, short of possibly a higher-PPI display, and higher-PPI displays have other implications, such as making on-screen items too small. And no one knows whether the chip in the cable does dithering of its own, either.

                  It'd be interesting to hear if anyone has experience with something like the NVIDIA Shield Pro, which has the same Tegra X1 chip as the Nintendo Switch and also allows streaming certain games from the cloud. Or something like the OUYA, which has a Tegra 3.

                  5 months later

                  degen I haven't read this whole thread yet, but I wanted to share that everything posted here is very interesting. I own four plasma TVs and have had a PS4, PS2, Wii, and original Xbox, and for years I had a computer hooked up to one with multiple operating systems. No matter what OS I had installed — Ubuntu/Lubuntu/Win XP/Win 7/Win 10/Mint — I never had issues; when I use a plasma I never have eye pain. LED, on the other hand, has just been too complicated and frustrating. I have a BenQ GW2270 that I bought a month ago, and it seems that if I wear SCT orange glasses I can use it. I was playing a game on the computer a couple of weeks ago for six hours, with Windows 10 installed, and as long as I had the glasses on I was fine. The moment I take the orange glasses off, it starts to get to me. LED is just insane. I want to buy more plasma TVs 🙂

                    dev