I recently used a flicker meter to compare an iPhone 4s, which is of the "usable" kind, and an iPhone 4, which gives me eye-strain symptoms within just a few seconds.
The good 4s stays below 1% ripple flicker. The bad 4 measures somewhere between 1% and 3%.
It might be coincidence at this point, as I need to test more devices, but so far this hints that the safe flicker percentage could be lower than most of us expected, at least lower than 3%. That means when you are sensitive and you look at a supposedly flicker-free display whose light intensity in reality still alternates between 100% and 97% (or even between 100% and 99%), the flicker might already be too high and could cause symptoms.
It is difficult to see such small changes on an oscilloscope output.
All my "flicker-free" monitors measure between 1% and 7% ripple flicker (so in fact they do flicker). Only one of them is usable, and it falls somewhere between 1% and 3%, alongside others in that range which are not usable. This further strengthens the theory that the safe flicker percentage is somewhere below 3%. Sadly I could not track this down further yet, because the flicker meter's scale only shows coarse values at the low end: nothing at all (below 1%), 2% (between 1% and 3%), or 4% (between 3% and 5%). My cheap oscilloscope cannot resolve such low-amplitude flicker, let alone the high frequencies involved. I had not imagined that I would need finer resolution than steps of 2%. All I cared about was being able to detect flicker at high frequencies, hundreds of kilohertz.
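To make the resolution problem concrete, the meter's coarse low-end scale can be modeled like this (a sketch; the bucket boundaries are the ones described above, and the behavior above 5% is my assumption):

```python
def meter_display(true_flicker_percent):
    """Map a true flicker percentage to what the meter shows at its low end."""
    if true_flicker_percent < 1:
        return None  # meter shows nothing at all
    if true_flicker_percent < 3:
        return 2     # displayed as "2%"
    if true_flicker_percent < 5:
        return 4     # displayed as "4%"
    # Assumption: behavior above 5% is not described in the text
    return round(true_flicker_percent)

# A usable display at 1.2% and an unusable one at 2.8% both read "2%":
print(meter_display(1.2), meter_display(2.8))
```

So two displays that feel very different to a sensitive user can land in the same bucket, which is exactly why the 2% steps are not enough here.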
I tested my OnePlus 3 with the brightness app "Brightness Manager", which lets you quickly change brightness in steps of 10. Switching back and forth between brightness levels 255 and 245 is still consciously visible. That is a flicker of just 4%. Imagine such a flicker at very high frequencies, say 200 kHz, and you have your average "flicker-free" display that might still hurt your eyes.
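For reference, here is how the 4% figure works out, as a sketch. It assumes the simple depth-relative-to-peak definition, (max − min)/max, which matches the figure above; the classic "percent flicker" definition, (max − min)/(max + min), would give about 2% for the same levels:

```python
def modulation_depth(i_max, i_min):
    # Depth relative to peak intensity: (max - min) / max
    return 100 * (i_max - i_min) / i_max

def percent_flicker(i_max, i_min):
    # Classic "percent flicker" (Michelson contrast): (max - min) / (max + min)
    return 100 * (i_max - i_min) / (i_max + i_min)

# Brightness levels 255 and 245 from the OnePlus 3 test:
print(round(modulation_depth(255, 245), 1))  # -> 3.9
print(round(percent_flicker(255, 245), 1))   # -> 2.0
```

Which definition a given flicker meter uses matters when comparing readings this close to the threshold.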