diop How were apps working on the laptop? e.g. Could 'bad' apps such as Chrome etc. work without symptoms?
Everything I used worked for me. I spent a lot of the 14 days gaming as I've got about a decade of games to catch up on!
diop Presumably a PC build w/IPS G-Sync monitor is going to be quite expensive?
It depends on what you go for, but it certainly doesn't have to be that expensive - I'm going for a top-spec machine (Ryzen 7 // RTX 3080) and it's coming in a little over the cost of the laptop (which was £1600), but for a huge leap in power. The G-Sync monitors aren't ridiculous from what I've found so far, but I need to do a bit more digging.
diop In some respects it makes sense why G-Sync doesn't cause symptoms. AFAIK dithering algorithms are designed to work at a fixed frequency. Or baked-in dithering could be a failsafe by GPU manufacturers to ensure a consistent output across all external displays. I assume G-Sync (when handshaking with a G-Sync monitor) 'trusts' the capabilities of the panel and doesn't resort to trickery such as temporal dithering. There is also Freesync which is another avenue to explore in the future.
It's difficult to know, but I have read that G-Sync involves a specific hardware module in the monitor which enables the refresh rate trickery, whereas Freesync doesn't require this. When checking out Freesync laptops, it appears that they don't necessarily bypass the built-in iGPU in the same way that Nvidia G-Sync does, so that could be an issue.
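For anyone curious what the temporal dithering mentioned above actually does: it (often called FRC) fakes extra colour depth by rapidly alternating a pixel between the two nearest levels the panel can genuinely show, relying on a steady frame cadence so the levels average out over time - which is presumably why a variable refresh rate could interact with it. Here's a rough sketch of the idea (my own illustration of the general technique, not any vendor's actual algorithm):

```python
def frc_frames(value8, num_frames=4):
    """Sketch of frame-rate-control (temporal) dithering:
    approximate an 8-bit value (0-255) on a 6-bit panel (0-63)
    by alternating between the two nearest 6-bit levels."""
    lo = value8 >> 2           # nearest 6-bit level at or below the target
    hi = min(lo + 1, 63)       # next 6-bit level up (clamped to panel max)
    frac = value8 & 0b11       # how many of every 4 frames should show 'hi'
    # Show 'hi' on `frac` frames out of each group of 4, 'lo' on the rest.
    return [hi if f % 4 < frac else lo for f in range(num_frames)]

# e.g. 8-bit value 130 sits between 6-bit levels 32 and 33,
# so the panel flickers between them; the time-average lands on 130/255.
frames = frc_frames(130)
avg_as_8bit = sum(level * 4 for level in frames) / len(frames)
```

The averaging only works if frames arrive at a predictable rate, so you can see why an algorithm like this might assume a fixed refresh frequency.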
I'm tempted to try and return another G-Sync laptop, because that would begin to rule out just 'striking it lucky' with the first one I tried. Having read about Nvidia Advanced Optimus, I'm wondering if this could potentially be a game-changer, given my good experience with G-Sync.
Advanced Optimus has a hardware display switcher between the iGPU and dGPU:
This means that the frame buffer of the discrete GPU isn't being sent to the integrated GPU (i.e. Intel / Radeon) to be drawn on the screen. Therefore, as long as you can use one GPU successfully, it'll be directly connected to the display. Knowing that I am able to use the Nvidia dGPU in the HP Omen (but not the Intel iGPU), the ability to dynamically switch to dGPU-only seems like it could be a great thing.
It feels like it would also rule out the physical hardware of the iGPU output stage as an issue. The HP Omen has a MUX (display switcher) that requires a restart when you switch between GPUs, but when connected directly to the dGPU the display just feels completely different to me, in a very good way.
Here's a video on it to check out:
On another note, between testing out the Macbook M1 (horrible), a MS Surface (horrible) and the HP Omen (beautiful) I hadn't used my XPS 15 in a long time. Going back to it, somehow it had updated drivers and OS functionality (still on 2004 though) without me realising. I used it for half a day, blissfully unaware that the Intel drivers (amongst many others) had updated before I started to get an almighty migraine which is still lingering 2 days on.
Sometimes little 'blind tests' like this are handy to satisfy me that I'm not completely nuts.