Vividblu99
Oh, don't get me started on OLED allergies! I hate it when people sitting close to me wear an Apple Watch! In my peripheral vision I can even see it flickering rapidly. Ugh. Exposure to it always gives me a headache.
I also came to the same conclusion about new LCD screens that support higher refresh rates such as 90 or 120 Hz. Yes, I think you are on the right track in saying that there is some form of anti-motion-blur and anti-aliasing software involved. The first high-refresh (120 Hz) LCD phone I used, back in 2017, the Razer Phone 1, felt comfortable to me. However, the next wave of high-refresh LCDs has been giving me problems ever since.
The Galaxy A32 5G, for instance, with a 720p, 90 Hz panel, gave me eyestrain, dizziness and mild headaches even with short usage.
The really strange thing about this PLS LCD panel is that 720p on a 6-inch screen looked incredibly sharp. Now, I have owned multiple devices with 720p on a 6-inch screen, and I am fully aware of the sharpness limitation due to their lower PPI (pixels per inch) count. This is simply math: the bigger the screen at the same resolution, the farther apart the pixels are spaced, and thus the less sharp it becomes. So back to this Galaxy A32 5G: it looked unnaturally sharp! There is no reason it should be!
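To put numbers on that intuition, PPI follows directly from the resolution and the diagonal. A quick sketch of the math (generic formula, nothing device-specific assumed):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = pixel count along the diagonal / diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 720p (1280x720) stretched over a 6-inch diagonal
print(round(ppi(1280, 720, 6.0)))   # ~245 PPI
# the same resolution packed into a 5-inch panel is noticeably denser
print(round(ppi(1280, 720, 5.0)))   # ~294 PPI
```

Around 245 PPI is well below the ~367 PPI a 1080p panel would give at the same 6-inch size, which is exactly why 720p on 6 inches should look soft, not razor sharp.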
One cannot simply "create another layer of pixels" through software. How on earth did these manufacturers create a faux layer of pixels over the gaps between each hardware pixel?
It may be completely doable for some emulation, say the Citra 3DS emulator, which renders the same 3D scene multiple times (at least 8x) and merges the results into one single layer. (Think of it as pixel binning from our high-resolution cameras, e.g. four different 12 MP samples from a 50 MP sensor stitched into a single 12 MP picture through software.) This is how they managed to increase the resolution from 240p up to 4K. Now, this is possible because the source was already software. However, the pixels in our phone screens are hardware! Something abnormal is indeed going on!
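For anyone unfamiliar with what I mean by binning, here is a minimal sketch of the idea (plain 2x2 averaging, not any camera vendor's actual pipeline):

```python
import numpy as np

def bin_2x2(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block into one output pixel: resolution halves,
    noise drops, like the pixel binning in high-resolution phone cameras."""
    h, w = img.shape[0] - img.shape[0] % 2, img.shape[1] - img.shape[1] % 2
    blocks = img[:h, :w].astype(np.float32)
    return (blocks.reshape(h // 2, 2, w // 2, 2, -1)
                  .mean(axis=(1, 3))
                  .astype(np.uint8))

# a pretend high-resolution RGB frame binned down to a cleaner quarter-size frame
frame = np.random.randint(0, 256, (8, 8, 3), dtype=np.uint8)
print(bin_2x2(frame).shape)   # (4, 4, 3)
```

Supersampled emulation does the same averaging, just starting from far more rendered pixels than the screen has, which is why the source being software matters.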
I gave this some thought and I think I have it figured out. This form of artificially creating a faux layer of software pixels, in an attempt to fill the gaps between the hardware's individual pixels and increase sharpness, did not begin with smartphones. It began way back in the days of the Nintendo 3DS. I think that was where I first experienced this sort of dizziness and discomfort.
Now, the 3DS XL uses an extremely low resolution of 240p. Naturally, everything looks incredibly blurry on a 4.88-inch screen. However, when the 3D effect is enabled, suddenly everything looks 720p sharp. This again defies the hardware pixel limitation logic.
So I have been reading up on this, and it seems they use some sort of software and hardware tricks to give the illusion of increased perceived sharpness. The 3D technology responsible for this increased perceived sharpness is called a parallax barrier, btw. It works by rendering two copies of the same on-screen image and merging them together, along with blue and red light flashing (sounds like temporal dithering, but at the extreme, so it is not just temporal dithering anymore), and all of it behind a curtain of evenly spaced black bars (sounds like pixel inversion here).
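My rough mental model of the barrier half of that trick, in code (this is the classic column-interleaving idea, a sketch of my understanding, not Nintendo's actual implementation):

```python
import numpy as np

def interleave_views(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Weave two equal-size renders into alternating pixel columns.
    The physical barrier (those evenly spaced black bars) then blocks
    the 'wrong' columns from each eye, producing the stereo 3D effect."""
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[:, 0::2] = left[:, 0::2]     # even columns -> left eye
    out[:, 1::2] = right[:, 1::2]    # odd columns  -> right eye
    return out

left = np.zeros((4, 8, 3), dtype=np.uint8)        # pretend left-eye render
right = np.full((4, 8, 3), 255, dtype=np.uint8)   # pretend right-eye render
print(interleave_views(left, right)[0, :, 0])     # [  0 255   0 255 ...]
```

Each eye effectively only gets half the columns, which may be part of why the software has to work so hard to keep everything looking "sharp".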

When I first saw the Moto G73, I immediately got the same symptoms I get from my 3DS XL (with 3D enabled at the lowest setting). Like the 3DS, everything looked unnaturally sharp, with colors "popping out" really unnaturally. The overall screen UI and content had a subtle "3D" effect to it, and I am not referring to 3D graphic renders here. I felt as though the screen was not even stable and appeared to be wobbling at an incredibly high speed.
Thus, if I were to describe how I felt looking at the G73's IPS LCD screen, it would be like the illustration I did below:

This is freaking how my brain perceives screens like the Honor Magic 5 Pro or the Moto G73.
As you can see, a static image like my illustration is not capable of producing PWM or temporal dithering. Yet just staring at and focusing on the icons in my own illustration for a prolonged period already creates tension headaches, dizziness and serious discomfort.
Therefore I really wish people would stop beating the dead horse of "But… but… but it's temporal dithering! Please, for god's sake, disable it!"
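For contrast, here is what temporal dithering actually is, in a minimal sketch (generic error-accumulation dithering, my own assumption, not any specific GPU's implementation). Note that it only exists across frames changing over time, which a static illustration plainly does not have:

```python
def temporal_dither(target: float, frames: int = 8) -> list[int]:
    """Fake a fractional brightness level on a panel that can only show
    integer levels by flipping between the two nearest levels over time;
    the eye averages the flips into the in-between shade."""
    lo, frac = int(target), target - int(target)
    err, seq = 0.0, []
    for _ in range(frames):
        err += frac
        if err >= 1.0:            # owe enough brightness: show the high level
            seq.append(lo + 1)
            err -= 1.0
        else:
            seq.append(lo)
    return seq

print(temporal_dither(100.25))   # [100, 100, 100, 101, 100, 100, 100, 101]
```

If a frozen screenshot of the UI still triggers symptoms, then whatever is hurting us is baked into the spatial rendering, not the frame-to-frame flicker.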
We are literally looking at something much more damaging here for sensitive people. There are so many other screen "software optimisations" today that are responsible for the discomfort and the reactions from our bodies.
I think I might have a way to detect the above effect. If it relies on a software-based "parallax barrier" resembling pixel inversion to achieve the increased perceived sharpness, all I have to do is get a camera that can detect this pseudo pixel inversion in the below-20 kHz flicker range, since I am generally able to tolerate subtle flicker above that threshold. (I struggle with flicker below it, such as the Mate 20's 15 kHz LCD PWM, btw. It immediately gave me eye discomfort.)
I think I have one in mind already: a Leica camera, capable of shutter speeds up to 1/40000 of a second.
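A quick sanity check that this shutter is fast enough, using simple period arithmetic (a rule-of-thumb sketch, not a proper photometric measurement):

```python
def max_resolvable_flicker_hz(shutter_s: float) -> float:
    """Rule of thumb: a single exposure can separate the on/off phases
    of a flicker only while it is at most half the flicker period."""
    return 1.0 / (2.0 * shutter_s)

shutter = 1 / 40000                               # 25 microseconds per exposure
print(round(max_resolvable_flicker_hz(shutter)))  # 20000 -> exactly my 20 kHz line

# the Mate 20's 15 kHz PWM cycle lasts ~66.7 us, so a 25 us exposure
# lands inside a single on/off phase instead of averaging the whole cycle
print(round(1 / 15_000 * 1e6, 1))                 # 66.7 (microseconds per cycle)
```

So 1/40000 s sits right at the 20 kHz boundary: anything I consciously struggle with (below 20 kHz) should show up as visible light/dark variation across a burst of shots.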