Hard to know the right "dithering" thread, but I thought this was helpful, from the DisplayCAL forum. (I haven't seen much talk here about using monitor calibration software.)
How do I know if a graphics card is capable of dithering?
Looking at a smooth grayscale gradient in an image viewer with color management turned off(!) is usually a reasonable test. That is, first look at it with the video LUT set to linear (no calibration) to make sure the banding is not in the image itself or introduced by the display (some panels are less than 8 bit and dither internally to generate intermediate steps). Then load your calibration. If the gradient still looks smooth, the graphics card either dithers or outputs a higher bit depth (either should suffice to reduce or eliminate calibration-related banding artifacts).
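If you want to rule out banding baked into the test image itself, you can generate your own ramp instead of downloading one. Here's a minimal sketch (the file name `ramp.pgm` and the 1024x256 size are my choices, not anything from the forum post) that writes a horizontal grayscale gradient as a binary PGM file, which most image viewers can open:

```python
# Generate a smooth horizontal grayscale ramp as a binary PGM (P5) file.
# Remember: view it with the viewer's color management disabled, or the
# viewer's own transform can introduce or hide banding.
WIDTH, HEIGHT = 1024, 256

# Each column steps through gray levels 0..255. At 1024 px wide, every
# 8-bit level spans exactly 4 pixels, so each step is a clean 4-px band;
# visibly wider or uneven bands after loading calibration point at the
# video LUT, not the image.
row = bytes(x * 256 // WIDTH for x in range(WIDTH))

with open("ramp.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))  # PGM header
    for _ in range(HEIGHT):
        f.write(row)  # repeat the same ramp row for every scanline
```

Open the resulting file once with the video LUT reset to linear and once with your calibration loaded, as described above.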