DisplayPort 1.4 & Chroma Subsampling
DisplayPort 1.4 was finalized in early 2016 and has only recently begun appearing in shipping products in 2018, yet ASUS is already crashing into its limits with the ROG Swift PG27U. The standard has the bandwidth to transmit 4K SDR video at up to 120 Hz with 8-bit color, or 4K 120 Hz with 10-bit color as long as Display Stream Compression (DSC) is used. Switching to High Dynamic Range (HDR) eats up some bandwidth, reducing the maximum refresh rate at 4K HDR to 98 Hz.
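To see where those limits come from, here is a rough back-of-the-envelope calculation. This is a sketch that ignores blanking-interval overhead (real links carry CVT-R2 timing on top of the pixel data), using DP 1.4's ~25.92 Gbps effective payload after 8b/10b encoding:

```python
# Approximate DisplayPort 1.4 payload after 8b/10b encoding overhead
# (HBR3 is 32.4 Gbps raw). Blanking-interval overhead is ignored here.
DP14_PAYLOAD_GBPS = 25.92

def required_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed video bandwidth in Gbps, ignoring blanking intervals."""
    bits_per_pixel = bits_per_channel * channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K 120 Hz at 8-bit: ~23.9 Gbps -- squeezes into the link
sdr = required_gbps(3840, 2160, 120, 8)
# 4K 120 Hz at 10-bit: ~29.9 Gbps -- over budget, hence DSC or other tricks
hdr = required_gbps(3840, 2160, 120, 10)
print(f"8-bit:  {sdr:.1f} Gbps, fits: {sdr <= DP14_PAYLOAD_GBPS}")
print(f"10-bit: {hdr:.1f} Gbps, fits: {hdr <= DP14_PAYLOAD_GBPS}")
```

The 10-bit figure landing well above the link's payload is exactly why something has to give once HDR enters the picture.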
So how exactly can ASUS get the ROG Swift PG27U up to 144 Hz at all, let alone with HDR running? Chroma subsampling. What exactly is this? Any image on a display is made up of two parts: chroma (color information such as red, green, or blue) and luma (brightness information such as black, grey, or white). ASUS takes advantage of the fact that this display can render more colors than the typical eye can perceive, reducing the color information sent in order to free up some extra bandwidth. This is done by letting adjacent pixels share color information while each still retains its own luminance information. The human eye is far more sensitive to light levels, so brightness information can't be reduced without the change being quite noticeable, but you might not notice a slight reduction in color detail.
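The pixel-sharing trick above can be sketched in a few lines. This is a toy model, not any real scanout format: pixels are hypothetical (Y, Cb, Cr) tuples, and 4:2:2 subsampling keeps every pixel's luma while averaging the chroma of each horizontal pair:

```python
# Minimal sketch of 4:2:2 chroma subsampling on one row of YCbCr pixels.
# Every pixel keeps its own luma (Y); each horizontal pair of pixels
# shares averaged chroma (Cb, Cr).

def subsample_422(row):
    """Return the row with chroma averaged across adjacent pixel pairs."""
    out = []
    for i in range(0, len(row), 2):
        pair = row[i:i + 2]  # handles an odd trailing pixel too
        cb = sum(p[1] for p in pair) / len(pair)
        cr = sum(p[2] for p in pair) / len(pair)
        # Luma survives untouched; only the color info is shared.
        out.extend((p[0], cb, cr) for p in pair)
    return out

# Two pixels: saturated red next to saturated blue (rough BT.601 values).
row = [(76, 85, 255), (29, 255, 107)]
print(subsample_422(row))  # both pixels end up with chroma (170.0, 181.0)
```

Per two pixels, 4:4:4 carries six samples (2 Y + 2 Cb + 2 Cr) while 4:2:2 carries four (2 Y + 1 Cb + 1 Cr), which is where the bandwidth saving comes from.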
The one place you can sometimes see this is small text with high contrast, such as black letters over a white background. With two adjacent pixels sharing an average color between them, it almost has the effect of antialiasing. What would normally be a sharp contrast between a letter and its background might turn out to be a slightly blurry grey instead.
A couple of interesting things to note here. First is that SDR always runs at 8-bit color, even in wide color gamut (WCG) mode.
Next is that at 120 Hz, HDR mode starts having to make compromises, dropping from 10-bit color to 8-bit with dithering. At 144 Hz, both SDR and HDR content have to drop to 4:2:2 chroma subsampling to meet the bandwidth requirements, though HDR is able to jump back up to 10-bit color.
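The numbers explain why 10-bit comes back at 144 Hz. As a rough approximation (again ignoring blanking overhead), 4:4:4 carries three samples per pixel while 4:2:2 averages out to two, cutting the payload by a third:

```python
DP14_PAYLOAD_GBPS = 25.92  # approximate DP 1.4 payload after 8b/10b encoding

def gbps(width, height, hz, bits_per_sample, samples_per_pixel):
    """Uncompressed bandwidth in Gbps, ignoring blanking intervals."""
    return width * height * hz * bits_per_sample * samples_per_pixel / 1e9

# 4K 144 Hz, 10-bit 4:4:4: ~35.8 Gbps -- far over budget
full = gbps(3840, 2160, 144, 10, 3)
# 4K 144 Hz, 10-bit 4:2:2: ~23.9 Gbps -- fits, 10-bit color and all
sub = gbps(3840, 2160, 144, 10, 2)
print(f"4:4:4: {full:.1f} Gbps, 4:2:2: {sub:.1f} Gbps")
```

Dropping a third of the samples buys back enough headroom that the remaining ones can each be 10 bits deep.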
What happens if we drop the resolution, can we run full color with no compromises at say, 1440p?
Here we hit something interesting. Even though we set the monitor to 2560x1440, which the Nvidia control panel and Windows itself are quite happy to do, the monitor will only accept signals at its native 3840x2160 resolution. This forces the display data to be upscaled to 2160p before leaving the video card, and the signal still saturates the DisplayPort connection.
Even dropping down to a pedestrian 1080p, we see the same behavior. So much for that idea, huh?
Just to confirm: even with the display running at a supposed 1080p according to Windows and Nvidia, the monitor does indeed still report its input as 2160p @ 144 Hz HDR.