4K & UHD

In recent years, there has been a move away from identifying televisions by the number of lines of pixels they display. Before, we had 720p and 1080p; now, we have 4K and UHD. But what’s the difference between 4K and UHD? Why are some TVs labeled as both at the same time? And where did those names come from in the first place?

The short answer is that 4K is a cinema standard and UHD is a consumer standard. But there’s more to the difference than what you’re watching and where you’re watching it.

In 2004, Sony introduced a new style of cinema projector with twice as many pixels in each direction as existing projectors. The old technology was called 2K in the movie world (roughly 2,000 pixels across), so it only made sense to call the new, twice-as-wide technology 4K. Soon after, the Digital Cinema Initiatives (DCI) consortium adopted the resolution as an industry standard, and DCI 4K was born. It is a staggering 4096 x 2160 (nearly 9 million pixels).
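
If you want to sanity-check those numbers yourself, a quick bit of Python does the trick (the DCI 2K container is 2048 x 1080):

```python
dci_2k = (2048, 1080)
dci_4k = (4096, 2160)

pixels_2k = dci_2k[0] * dci_2k[1]   # 2,211,840
pixels_4k = dci_4k[0] * dci_4k[1]   # 8,847,360

print(f"2K: {pixels_2k:,} pixels")                  # 2K: 2,211,840 pixels
print(f"4K: {pixels_4k:,} pixels")                  # 4K: 8,847,360 pixels
print(f"{pixels_4k / pixels_2k:.0f}x the pixels")   # 4x the pixels
```

Doubling both dimensions quadruples the pixel count, but the name tracks the width: about 4,000 pixels across instead of about 2,000.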

Unfortunately, your TV isn’t the same shape as a cinema screen. When you go to the movies, the full projected image has a 1.9:1 aspect ratio. On your TV at home, the standard aspect ratio is 16:9 (or 1.78:1). So, when the industry wanted to start making similarly high-res screens for consumers, the picture had to be narrowed slightly to fit.

Strictly speaking, since 4K was already an established DCI standard, TVs would need the same resolution to keep using the moniker; the 16:9 shape of a TV screen ruled that out. To stay as close to the cinema resolution as possible, manufacturers kept the same 2160 horizontal lines as DCI 4K and trimmed the width to fit. The resulting resolution was 3840 x 2160, and thus Ultra High Definition (UHD) was born.
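
The arithmetic behind that trim is easy to verify: hold the 2160 lines fixed, and the 16:9 shape dictates a width of exactly 3840. A quick sketch in Python:

```python
from fractions import Fraction

print(Fraction(4096, 2160))   # 256/135, i.e. ~1.90:1 (the cinema shape)
print(Fraction(3840, 2160))   # 16/9, i.e. ~1.78:1 (the TV shape)
print(2160 * 16 / 9)          # 3840.0 -- the widest 16:9 frame with 2160 lines
```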

To be labelled UHD, a product must meet a few industry standards. First, it needs a resolution of at least 3840 x 2160. Second, it must be capable of 10-bit color depth or higher. Third, it must be capable of a much greater range of pixel brightness and darkness than previous TVs (specifically, 0.05 – 1000 nits for LED and 0.0005 – 540 nits for OLED, in case you are curious).
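
To put those three criteria in one place, here is a hypothetical checker; the function name and parameters are purely illustrative, not from any real certification tool:

```python
def meets_uhd_spec(width, height, bit_depth, black_nits, peak_nits, panel="LED"):
    """Illustrative check of the three UHD criteria described above."""
    if width < 3840 or height < 2160:   # criterion 1: resolution
        return False
    if bit_depth < 10:                  # criterion 2: color depth
        return False
    if panel == "LED":                  # criterion 3: brightness/darkness range
        return black_nits <= 0.05 and peak_nits >= 1000
    return black_nits <= 0.0005 and peak_nits >= 540    # OLED

print(meets_uhd_spec(3840, 2160, 10, 0.05, 1000))   # True
print(meets_uhd_spec(1920, 1080, 8, 0.1, 400))      # False: plain 1080p
print(meets_uhd_spec(7680, 4320, 10, 0.05, 1000))   # True: 8K qualifies too
```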

The important thing to note there is that screens can exceed those parameters and still be labelled UHD. The recently announced 8K screens will automatically count as UHD, since they meet all of the specifications required. So, to keep confusion to a minimum about what a consumer is getting, manufacturers label their products as 4K UHD.

Of course, just because you have a 4K UHD screen does not mean you are getting 4K content. To get the most out of your TV, you need input that matches your screen’s native resolution. Otherwise, you just have an upscaled 1080p picture on your hands.
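
To make “upscaled” concrete: 2160p is exactly twice 1080p in each direction, so the crudest possible upscaler just stretches every pixel into a 2x2 block. A minimal sketch using NumPy and nearest-neighbor scaling (real TVs use much smarter interpolation):

```python
import numpy as np

# A blank 1080p frame: height x width x RGB channels.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)

# Repeat every pixel 2x along both axes: 1920x1080 -> 3840x2160.
# The panel is filled, but no new detail is created.
frame_2160p = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)

print(frame_1080p.shape)   # (1080, 1920, 3)
print(frame_2160p.shape)   # (2160, 3840, 3)
```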

After you get your new TV, make sure you get the most out of it with our new Blu-ray series, the Best of UHD.