Vividblu99
Thanks for sharing those articles; this is really new information to me.
Gosh! I cannot believe this is real. Vsync baked into Android. Disabling Vsync was always the first thing I would do whenever I started a new game on PC, and I thought my days of dealing with those PC gaming terms were over. What is the team at Google intending to do with Android? Turn it into a gaming platform like Windows?
I did not even know vsync was slipped into the OS back in 2015!
https://www.youtube.com/watch?v=1iaHxmfZGGc
On PC, Vsync typically aims to reduce screen tearing by "forcefully" syncing the framerate to the display's refresh rate. However, this adds lag, and I always found it gave me nothing but problems: more discomfort with it on than off. From my perspective, this was how I saw Vsync:

But let's give Android the benefit of the doubt and see how Google defines their Vsync.
"The VSYNC signal synchronizes the display pipeline. The display pipeline consists of app rendering, SurfaceFlinger composition, and the Hardware Composer (HWC) presenting images on the display. VSYNC synchronizes the time apps wake up to start rendering, the time SurfaceFlinger wakes up to composite the screen, and the display refresh cycle. This synchronization eliminates stutter and improves the visual performance of graphics."
Right, so "eliminating stutter" (I suppose they mean screen tearing) is nothing new and is the standard Vsync claim. However, I don't really know what "improves the visual performance of graphics" means. To my knowledge, Vsync only targets screen tearing, so I am not sure what the development team at Google is cooking here. Something about "improving visual performance of graphics" smells fishy, and I am not appreciating it.
If my memory still serves me right (from my early college days), everything you see on your screen counts as graphics: your UI, web browser, apps, icons, etc. So it wouldn't be too far-fetched to think that Google's version of Vsync, along with their "extra enhancements", might have something to do with the discomfort we experience. Let's move on to what else Google has to say about their support for higher refresh rates.
"A 60Hz display refreshes the display content every 16.6ms. This means that an image will be shown for the duration of a multiple of 16.6ms (16.6ms, 33.3ms, 50ms, etc.). A display that supports multiple refresh rates, provides more options to render at different speeds without jitter. For example, a game that cannot sustain 60fps rendering must drop all the way to 30fps on a 60Hz display to remain smooth and stutter free (since the display is limited to present images at a multiple of 16.6ms, the next framerate available is a frame every 33.3ms or 30fps). On a 90Hz device, the same game can drop to 45fps (22.2ms for each frame), providing a much smoother user experience. A device that supports 90Hz and 120Hz can smoothly present content at 120, 90, 60 (120/2), 45(90/2), 40(120/3), 30(90/3), 24(120/5), etc. frames per second."
OK, they are correct that a 60 Hz display refreshes every 16.6ms, since we can obtain this number by dividing 1000ms by 60 Hz, which gives 16.667ms.
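The same division gives the per-refresh time budget for the other rates Google mentions. A quick sanity check (plain arithmetic of mine, not actual Android code):

```python
# Per-refresh time budget in milliseconds: 1000 ms divided by the refresh rate.
def frame_period_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 90, 120):
    print(f"{hz} Hz -> {frame_period_ms(hz):.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 90 Hz -> 11.1 ms, 120 Hz -> 8.3 ms
```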
But I totally cannot understand what Google is attempting to say here about 90 Hz and above, and how that translates to a smoother user experience. They claim a jitter-free experience, yet we are reporting jitter on screens that ship with support for higher refresh rates. Also, they jump from display content, to game FPS, to user experience; it's like they are throwing things out at random, as in "if you can't convince the audience, confuse them".
I'll be delighted if someone can elaborate on what Google is attempting to say here.
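For what it's worth, at least the arithmetic part of the quote checks out: a fixed-refresh panel can only present a new image on a vsync boundary, so the only frame rates with perfectly even pacing are exact divisors of the refresh rate. A rough sketch of that idea (my own, not from Google's docs):

```python
# Frame rates that divide evenly into a refresh rate can be presented
# every Nth vsync with constant frame duration; anything in between must
# mix frame durations, which is what we perceive as jitter.
def even_paced_rates(refresh_hz: int) -> list[float]:
    return [refresh_hz / n for n in range(1, refresh_hz + 1)
            if (refresh_hz / n) >= 24]  # cut off below cinematic 24 fps

print(even_paced_rates(60))   # [60.0, 30.0]
print(even_paced_rates(90))   # [90.0, 45.0, 30.0]
print(even_paced_rates(120))  # [120.0, 60.0, 40.0, 30.0, 24.0]
```

Combining the 90 Hz and 120 Hz lists reproduces Google's 120/90/60/45/40/30/24 sequence, so this is presumably what they mean by "more options to render at different speeds without jitter".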
Moving on with higher refresh rates…
"The higher the rendering rate, the harder it is to sustain that frame rate, simply because there is less time available for the same amount of work. To render at 90Hz, applications only have 11.1ms to produce a frame as opposed to 16.6ms at 60Hz."
Fair enough. Which then brings us to the quote you highlighted.
"
…a Pixel 4 device running at 60Hz, where the application is woken up 2ms after the vsync event and SurfaceFlinger is woken up 6ms after the vsync event. This gives 20ms for an app to produce a frame, and 10ms for SurfaceFlinger to compose the screen.
When running at 90Hz, the application is still woken up 2ms after the vsync event. However, SurfaceFlinger is woken up 1ms after the vsync event to have the same 10ms for composing the screen. The app, on the other hand, has just 10ms to render a frame, which is very short.
To mitigate that, the UI subsystem in Android is using “render ahead” (which delays a frame presentation while starting it at the same time) to deepen the pipeline and postpone frame presentation by one vsync. This gives the app 21ms to produce a frame, while keeping the throughput at 90Hz."
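As I read it, the app's time budget follows from those wake-up offsets: the app wakes a couple of milliseconds after one vsync and must finish before SurfaceFlinger wakes after a later vsync. A back-of-envelope sketch of that arithmetic (the offsets are the ones quoted for the Pixel 4; the function itself is my own assumption about how they combine):

```python
# App render budget: from the app waking up (app_offset ms after vsync N)
# until SurfaceFlinger wakes up (sf_offset ms after vsync N + frames_ahead).
def app_budget_ms(refresh_hz, app_offset_ms, sf_offset_ms, frames_ahead=1):
    period = 1000.0 / refresh_hz
    return frames_ahead * period - app_offset_ms + sf_offset_ms

print(app_budget_ms(60, 2, 6))                  # ~20 ms at 60 Hz
print(app_budget_ms(90, 2, 1))                  # ~10 ms at 90 Hz: very short
print(app_budget_ms(90, 2, 1, frames_ahead=2))  # ~21 ms with render ahead
```

The numbers line up with the quote: "render ahead" buys back the budget by targeting the vsync after next, at the cost of one extra vsync of latency.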
OMG! Now I finally get what you were trying to point at. If Google had to implement this "render ahead" workaround even for 90Hz, 120Hz would actually be worse!
If an app needs around 20ms to produce a frame, and the OS had to resort to this software "render ahead" to postpone frame presentation, then at 120Hz the time per refresh is even shorter: just 8.3ms (1000ms / 120Hz)!
Gosh, even at the supposed 1920Hz refresh rate, where I predict OLED will finally be free from the symptoms of PWM, "render ahead" would have to work far harder still. At 1920Hz, the time per refresh is a mere 0.5ms; that's roughly 40 times shorter than the ~20ms available at a standard 60Hz. 🤦♂️🤦♂️🤦♂️🤦♂️ Well, that would mean even more aggressive software optimisation from Google's end!
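That 0.5ms figure is just the same division again; checking my own arithmetic (1920Hz is purely hypothetical, of course):

```python
period_1920 = 1000.0 / 1920   # ~0.52 ms per refresh at a hypothetical 1920 Hz
ratio = 20.0 / period_1920    # compared with the ~20 ms app budget at 60 Hz
print(f"{period_1920:.2f} ms per refresh, {ratio:.0f}x less time than 20 ms")
# 0.52 ms per refresh, 38x less time than 20 ms
```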
IMO that's really backwards, not forwards. Why would increasing the refresh rate shrink its own time budget for producing a frame? That's like shooting oneself in the foot. The increase in refresh rate is not in proportion to the other factors.
I wonder if we can go into Android's Developer Options and disable Vsync and higher refresh rates for good.