Although HDR technology has been available on most TVs for the last three years, it has only recently come to computer monitors. Perhaps the reason is that Windows did not support HDR displays until 2018. Now users can experience this technology on Windows and see color with greater depth and clarity.
Make sure your device supports HDR first
Not all Windows machines can display HDR content; HDR requires a DisplayPort 1.4 or HDMI 2.0a connection. To display HDR content on Windows 10, your setup must meet the following requirements:
- The monitor or TV must support HDR10 and DisplayPort 1.4 or HDMI 2.0 or higher. A DisplayHDR-certified screen is recommended.
- The Windows 10 PC must have a graphics card that supports PlayReady 3.0 DRM (for protected HDR content), such as an NVIDIA GeForce 1000 series, AMD Radeon RX 400 series, or Intel UHD Graphics 600 series card or higher. Graphics cards with 10-bit hardware-accelerated decoding for HDR video codecs are recommended.
- The Windows 10 PC must have a codec installed for decoding 10-bit video (e.g., HEVC or VP9).
- The latest WDDM 2.4 drivers are recommended on Windows 10. Check Windows Update in Settings or visit the manufacturer's website.
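To see why a 10-bit codec matters, consider how many brightness steps each bit depth can represent. This is a rough, hypothetical sketch (not part of any Windows API): an 8-bit signal offers only 256 code values per channel, while 10 bits offers 1024, which matters when those values are stretched across HDR's much wider brightness range.

```python
# Why HDR pipelines want 10-bit: quantizing a signal to 8 bits leaves
# only 256 code values to cover a much wider brightness range, which
# can produce visible banding; 10 bits gives 1024 steps.
def quantize(value, bits):
    """Map a normalized signal value in [0, 1] to the nearest code value."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# Distinct code values available across a smooth 0..1 gradient:
gradient = [i / 9999 for i in range(10000)]
steps_8 = len({quantize(v, 8) for v in gradient})    # 8-bit: 256 steps
steps_10 = len({quantize(v, 10) for v in gradient})  # 10-bit: 1024 steps
```

Four times as many steps means each adjacent pair of code values is four times closer in brightness, keeping smooth gradients smooth even on a bright HDR panel.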
Enable HDR on Windows 10
If your computer, monitor, and cable all meet these requirements, and your PC is updated to the Fall Creators Update (released in late 2017) or newer, you're ready to go. To enable HDR on Windows, open Start and go to Settings.
From there, click Display, and you'll see a toggle below the Night Light option labeled "HDR and WCG".
Simply turn it on if your monitor can display HDR content, but note that non-HDR content will look pale. This is because Windows adjusts the entire color palette of the system to display HDR content, so anything not configured for HDR (email, web browsing) will look washed out and darker than usual.