Products to try or avoid? PWM Flicker and Temporal Dithering Testing
You are right about linux mint. I connected the T480s via hdmi to the LG and I can't see dithering, at least with my scope and camera. The monitor is only running at 30hz, so if temporal dithering is occurring I should be able to see it easily with the high speed camera, or even with a normal speed camera (half the refresh rate dither). However, using a usb-c to displayport cable still has dithering. I am very happy with being able to use the LG even if not at a high refresh rate. The hdmi cable I am using is the one that came with the LG monitor (it says 8k 60hz on the cable). My T480s only has Intel UHD Graphics 620; it does not have the Nvidia MX150 discrete gpu as some other versions do.
Also, I discovered my Nikon camera has hdmi out that also does not seem to cause dithering on the LG monitor, which is 8 bit + FRC. The Nikon's output sets the monitor at 30hz. Again, I could not find anything with the scope, unlike Win 11 at 60hz on the 7i. I was viewing some photos, and it was the first time in a while that the photos looked "stable".
Update: I still see some very high frequency pixel flicker in slow motion video, but it is clearly different than when I test this monitor with the usb-c to displayport cable. The HDMI output on the T480s is also bandwidth limited, which ties in with discoveries in the mac stillcolor discussion.
photon78s oh wow that's great! I'm almost thinking Linux might end up being the best long term solution. There's actually a way to get displays to support 24hz when they don't by default; maybe that's the best way to test for dither, if it happens at half the refresh rate. I think low frequencies like 10-15hz could trigger photosensitive epilepsy in people who have it, btw.
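For reference, one common way to do this on Linux with X11 (not Wayland) is registering a custom CVT modeline. A minimal sketch in Python; the output name "HDMI-1" and the 1920x1080 geometry are assumptions to replace with whatever `xrandr` reports:

```python
# Minimal sketch (assumes Linux/X11 with cvt and xrandr installed):
# generate a CVT modeline for 24 Hz and register it with xrandr.
import subprocess

OUTPUT = "HDMI-1"          # hypothetical output name; check `xrandr` output
W, H, HZ = 1920, 1080, 24  # assumed geometry

# `cvt` prints a comment line plus: Modeline "1920x1080_24.00" 65.75 1920 ...
out = subprocess.run(["cvt", str(W), str(H), str(HZ)],
                     capture_output=True, text=True, check=True).stdout
modeline = next(l for l in out.splitlines() if l.lstrip().startswith("Modeline"))
name, *timings = modeline.split()[1:]
name = name.strip('"')

subprocess.run(["xrandr", "--newmode", name, *timings], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)
```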
I think I've disabled or modified the dithering at one level, as @aiaf did, by using a bandwidth-limited hdmi port or cable with a modern and demanding 4k monitor. What I'm still seeing on the LG is some kind of pixel inversion or similar (TCON-generated dithering flicker?). The remaining flicker is still fast when played back in slow motion, so it is not at a low 15hz (half the 30hz refresh rate). Using the computer, the mouse cursor lags as expected for a 30hz refresh rate.
The only way to know, health wise, is to test for yourself which type of pixel flicker is more comfortable. Even with 100% no dithering from either the computer or the monitor, you still have to deal with pixel inversion, unless you go OLED or another panel type, each with its own set of limitations and issues.
https://www.reddit.com/r/thinkpad/comments/q13awt/t480s_4k_over_hdmi_or_usbc_or_thunderbolt/?rdt=50463
Yes, the T480s hdmi output is limited, so one could use that limitation to positive effect.
T480s HDMI 1.4b cannot do 10bit?
https://www.reddit.com/r/Monitors/comments/d20vcn/hdmi_14_10bit_color_and_1440p_resolution/
https://uniaccessories.com/blogs/blog/hdmi-1-4-vs-2-0-what-is-the-major-difference
https://www.hdmi.org/spec/hdmi1_4b
https://www.electronics-notes.com/articles/audio-video/hdmi/hdmi-versions.php
T480s -> LG monitor banding (linux mint, hdmi 1.4b connection)
https://ibb.co/CtxGm65 (not a screenshot; taken with a phone camera, so ignore the moiré)
Now the same, except using the high-bandwidth usb-c to displayport cable (32.4Gbps, 8k/60hz).
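As an aside, for anyone who wants to reproduce this banding comparison without photographing the screen: a grayscale ramp is the usual test pattern. A minimal sketch, assuming numpy and Pillow are available (image size and filename are arbitrary):

```python
# Minimal sketch: a horizontal grayscale ramp for spotting banding/dither.
import numpy as np
from PIL import Image

W, H = 1920, 256                      # arbitrary test-image size
ramp = np.linspace(0, 255, W)         # smooth float ramp across the width
img = np.tile(ramp, (H, 1)).astype(np.uint8)
Image.fromarray(img, mode="L").save("ramp8.png")
```

Viewed fullscreen at 100%, a truncating pipeline (e.g. down to 6-bit) shows distinct vertical steps, while a dithering one replaces the steps with a fine shimmer.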
Current list of pixel flicker observations (encompassing dithering and/or inversion or other artifacts):
- Pixel inversion is dependent on display refresh rate (1/2 refresh rate?).
- A display's pixel flicker may change over time possibly due to temperature or other factors.
- Some pixels flicker in sync with computer coil whine when a power-hungry usb device is plugged in. The flickering stops when the usb device is removed.
- A single pixel may flicker at a different rate compared with its neighbors (what is this artifact?)
- Camera sensor noise and noise reduction algorithm artifacts may be confused with pixel flicker at low display brightness.
To be updated with further findings…
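For anyone repeating the slow-motion camera tests behind these observations, here is a minimal analysis sketch, assuming OpenCV and numpy; "slowmo.mp4" and the 240 fps capture rate are placeholders for your own clip of a static patch of screen. It computes a per-pixel temporal spectrum rather than a patch average, since neighboring pixels can dither out of phase and cancel each other:

```python
# Minimal sketch: per-pixel flicker spectrum from a slow-motion clip of a
# static patch of screen. "slowmo.mp4" and CAPTURE_FPS are placeholders.
import cv2
import numpy as np

CAPTURE_FPS = 240.0  # assumed slow-motion recording rate of the camera

cap = cv2.VideoCapture("slowmo.mp4")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    h, w = gray.shape
    frames.append(gray[h//2-16:h//2+16, w//2-16:w//2+16])  # central 32x32 patch
cap.release()

stack = np.stack(frames)                   # shape: (time, y, x)
stack -= stack.mean(axis=0)                # remove each pixel's DC level
spec = np.abs(np.fft.rfft(stack, axis=0))  # temporal spectrum per pixel
freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / CAPTURE_FPS)
t, y, x = np.unravel_index(np.argmax(spec[1:]), spec[1:].shape)  # skip DC bin
print(f"strongest per-pixel flicker: {freqs[t + 1]:.1f} Hz")
# e.g. ~15 Hz at a 30 Hz refresh would point at half-rate dither or inversion.
```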
About pixel inversion:
https://display-corner.epfl.ch/index.php/LCD_dynamics
The open/close state of a (sub-)pixel cell is controlled by a voltage, where the amount of light being blocked by the cell only depends on the absolute voltage but is independent of the voltage polarity. However, the liquid crystal fluid in the cell actually degrades if the mean voltage is different from zero, which is why the voltage polarity has to be inverted at a high enough frequency. In a monitor, the polarity is inverted at the monitor's refresh frequency. It appears to be technically difficult though to meet exactly the same absolute voltage levels at both polarities, even for static image content. Any residual difference in absolute voltages causes an according difference in the cell states and, thus, in pixel luminance. These luminance fluctuations might be perceived as an according pixel flickering at half the refresh frequency. In order to make such flickering less apparent, both polarities are used at the same time but for different sub-pixels, so that potential differences can average out across space (i.e., across adjacent sub-pixels) and over time (i.e., over refresh cycles). Because the pattern of how polarities are distributed across sub-pixels is very regular, pixel-inversion artifacts can still become quite obvious, especially if the temporal averaging is compromised by eye movements of certain velocities, which makes the spatial polarity distribution pattern become more apparent for short periods of time. Pixel-inversion artifacts, or more generally, voltage stability artifacts, can also surface in other forms, like color shifts or cross-talk within pixel rows or columns. These artifacts possibly show up under only very specific circumstances, which makes testing and quantification difficult. Although high pixel densities and high refresh rates both can help in hiding pixel-inversion artifacts, those features also make it technically more challenging to avoid such artifacts in the first place.
So if your eye movements are not at those "certain velocities", will that help with avoiding eyestrain?
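To make the spatial-averaging argument above concrete, here is a toy numpy model (my own illustration, not from the linked page; the luminance values are made up): each pixel alternates between two slightly different luminances at half the refresh rate, but the checkerboard polarity layout keeps the spatial mean constant, so a steady eye sees little net flicker.

```python
# Toy model: '+' and '-' polarities give slightly different luminance and the
# polarity of every pixel flips each refresh. All numbers are hypothetical.
import numpy as np

L_PLUS, L_MINUS = 100.0, 99.5                  # luminance per polarity (made up)
polarity = np.indices((4, 4)).sum(axis=0) % 2  # checkerboard: 0='+', 1='-'

for refresh in range(4):
    flip = refresh % 2                         # global polarity inversion
    lum = np.where(polarity ^ flip, L_MINUS, L_PLUS)
    # A single pixel alternates 100.0/99.5 -> flicker at half the refresh
    # rate, but the spatial mean over the checkerboard stays constant.
    print(f"refresh {refresh}: pixel(0,0)={lum[0,0]:.1f}  mean={lum.mean():.2f}")
```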
https://www.ncbi.nlm.nih.gov/books/NBK10991/
Types of eye movements: saccades, smooth pursuit movements, vergence movements, and vestibulo-ocular movements.
Eizo CG2700X Monitor Opple4 Results at 27% brightness:
What really flickers are the front panel button indicator LEDs which thankfully can be turned off.
It is beginning to look like this is a true 10-bit monitor, not 8-bit + FRC. More later…
photon78s Someone recently posted a video claiming there are issues with the backlight/illumination of the CG2700X. As I haven't got my hands on a CG2700X (yet), do you have any comment?
I see some slight uniformity differences, but not as bad as the video. This is very concerning considering the reputation and the obvious high cost. I think I can live with it, but I remember the last Eizo I tested, the CS2740, was slightly more uniform. Actually, that was the first thing that impressed me about that "cheaper" Eizo.
On the positive side, I am testing with my 7i and T480s, first using a bandwidth-limited High Speed with Ethernet hdmi cable like the one @aiaf was using, and the pixel flicker under the scope is a night and day difference compared with the LG. I'm guessing that the slight flicker I see on the Eizo is just due to pixel inversion at 30hz (1/2 refresh rate) or some OS/GPU-generated dithering, but it is much less than the LG. I assume this difference is exactly due to the lack of FRC on the Eizo.
To be consistent, both monitors were tested at 60hz (the LG can do 144hz, but that would distort a fair comparison). The Eizo's menu tells me my signal is 10 bit when I use a 30Gbps usb-c to displayport cable, but it does not display anything when using this old HDMI cable. For this test, the 7i was running Win 11 on the Nvidia GPU, which I think dithers, and the T480s was running linux mint.
Brightness is indicated in the menu in candela per square meter instead of percentage brightness, so the 27% is roughly 150 cd/m2 out of 550 cd/m2 (max brightness).
Make sure you have a good return policy if you do decide. Like the CS2740, I don't like the matte screen texture (it's ok, but I prefer slightly more "smoothness" without going fully glossy). Wish there were more affordable true 10-bit non-FRC displays. The problem I have with the Apple XDR, other than the even higher cost, is the miniLED blooming and possible fatigue from that backlight tech alone.
Uniformity picture. Not too too bad (ignore the bright spot in the center which is a reflection)
Eizo specifies to wait 3 to 20 minutes for the monitor to fully warm up for color accuracy. The tentative observation is that I notice less flicker (most likely inversion) after the 3-5 minute mark, and then the flicker level stabilizes but does not go away completely.
The second observation is that my monitor has two settings to control uniformity. The unit I'm testing was set out of the box to trade uniformity for higher brightness, which at first I didn't know how to change. Setting it to optimize uniformity makes a subtle improvement. However, unit-to-unit variance still applies.
At the very least, this exercise shows the differences and potential benefits of non-FRC 10-bit rendering.
Update: This Eizo is not a keeper. The matte coating itself is strain producing at least for me.
A list of realistic (it has all been done before) specs for an eyestrain-minimal "basic" display:
- True 8 bit (no need for true 10 bit for most usage) with no FRC or similar tech of any kind that adds to pixel flicker
- DC dimming or other backlight tech with no or minimal fluctuations, verified under rigorous oscilloscope or similar ground-truth measurement (and no PWM miniLED!)
- Minimal pixel inversion (good panel manufacturing and quality control)
- Panel refresh rate of 120Hz or greater (also to push the pixel inversion frequency higher), with a 60Hz option always available due to preferences and sensitivities to refresh rate effects
- Light matte or semi-glossy texture with low haze percentage and no matte-haze rainbow shimmer (better for reading text on light backgrounds)
- 24 inch or above size category
- High dpi of at least 150 pixels per inch (see the quick check after this list)
- VESA mount compatible
- Static display contrast around 1000:1 or better
- Emitted light spectrum with no weird unnatural spikes (Update: for example, a backlight other than WLED that does not produce strain or a harsh light effect)
- Move the ac-to-dc conversion away from the device (e.g. a wall wart rather than a power supply built into the display itself, inches from your face)
- No fancy gimmicks like RGB lighting; control panel buttons with no LEDs, or LEDs that don't flicker
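For the dpi line item, the quick check is just diagonal pixels over diagonal inches:

```python
# ppi = sqrt(w_px^2 + h_px^2) / diagonal_inches
import math

def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    return math.hypot(w_px, h_px) / diag_in

print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} ppi')  # ~163, passes
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} ppi')  # ~109, falls short
print(f'24" 4K:    {ppi(3840, 2160, 24):.0f} ppi')  # ~184, passes
```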
Anything else?
Matte versus Glossy info: https://pcmonitors.info/articles/matte-vs-glossy-monitors/
Rainbow effect (I don't agree with the author's opinion on Eizo's matte coatings): https://imagescience.com.au/blog/eizo-cg2700x-evaluation
photon78s Anything else?
Preferably no fluctuations at all. The engineers should do the absolute best they can, with the best tech available. No marketing guy saying "no" if this means the device would cost $10 extra. If they do it the lazy way yet again, chances are we have yet another device that's causing sensitive people eye strain and headaches due to whatever tiny flicker remains. There's no room for laziness here.
Higher refresh rates do not necessarily mean fewer issues. They could even introduce them. The 60 Hz option should always remain (it probably will - just saying).
The spectrum should really be based on anything but White (O)LED technology, because I believe that's what people who are sensitive to LED brightness have issues with (for whatever reasons). Though I think the spectrum issue can't be resolved without getting feedback from volunteers (test subjects). We have Quantum Dots and other technologies available with vastly different spectra.
Ideally a manufacturer sends us their prototypes to check them out. I think having the most sensitive users here test prototypes would quickly provide new insight.
No more excuses like "but this is how its always been done" and "that's what I learned in the engineering textbook" kind of thing. Some companies do a lot just to save a fraction of a cent.
I see different comments on "gaming" versus 60Hz refresh rates with regard to inversion in particular. Yes, ideally no inversion at all, but none of the drawbacks of OLED, etc. More generally, a modular system adaptable to different situations (environmental lighting, use case, general state of health affecting eyestrain, a user's pattern of eye movements) could be another approach (but very expensive).
I heard a story years ago from a VR pioneer who commented on the hard or harsh lighting of the modern generation of computer displays and VR goggles.
They absolutely should get feedback from us. I hope the prototypes are not cherry-picked best-case units that manufacturing tolerances can produce.
Hi guys, good stuff. I'm planning to purchase the pro 7i.
It comes with win 11 though. I'm no computer expert at this point in my life; is it possible to roll back to win 10 using the same product key? I'd like to use the build suggested here.
I was originally planning to purchase the mac air m3 because of stillcolor, but giving up gaming is a bit much, though I don't game a lot at all.
Is the pro 7i with the omissions better than the mac in terms of eye strain?