
What TV Did I Buy?

What TV did I buy? Because I am confused

If you have been to the big-box store or surfed the web for the latest TV for you and your family, you have almost certainly looked at 4K TVs and UHD TVs, as well as TVs that have HDR.

I am going to assume that, since you are reading this, you bought one of these; if not, this might still help clear some of the confusion up for you.

That great-looking new TV is in your home, but what exactly did you buy, and how can you see the amazing content that will showcase why you spent a little bit more to get this bad boy in your house? The box says it is UHD, but everyone keeps telling you about 4K, and the person who sold you the TV said they are the exact same thing!

Well, let’s start with the easy stuff. Whether it says UHD or 4K on the box, it is essentially the same thing for you, but they are technically different. The difference between UHD and 4K is, in essence, something you do not need to worry about: 4K comes from cinema standards, while UHD comes from broadcast/TV display standards. Around the world, 4K is shaping up to be the word we will use for Ultra High Definition, so from here on out let’s just say CONGRATULATIONS, you have a 4K TV. I will cover HDR at a later time.

Now, there is a lot of technical stuff I could go into, but I am guessing that if you really wanted a history lesson on this subject you would have already found one, and I want to get to the stuff that is more interesting ;-).


“My 4K TV is up and running.” Is it?

The 4K TV is up on the wall and I want to start watching 4K content so I can show all my friends how cool this thing is. I plugged in my cable box, connected it via HDMI, and look, my cable service looks great. I can’t believe the difference in picture quality. WOW! I feel like I am actually there with the actors; this is so cool. Sadly, unless you have a cable company that offers 4K content and they have supplied you a 4K-compatible set-top box, you are not getting 4K. You are getting a better picture, but you have not even started to see what real 4K looks like.

Let’s start with the easy stuff that people just forget to tell you.

Did you get a 4K-compatible HDMI cable? If you are like the vast majority of people I have spoken to, the answer is no. Head back to the store and make sure the HDMI cable says it supports 4K. Quick tip: I buy mine at Sam’s or Costco; they are cheaper, they work fine, and having a spare is never a bad idea.

I have the 4K HDMI cable, and now I am ready… right?

You are getting there, but make sure you read the TV manufacturer’s booklet: many TVs have a specific HDMI port for 4K, and that should be shown in the getting-started section of the pamphlet. If you threw it away because you already knew how to plug in a TV (that’s the kind of thing I do), look at where the HDMI ports are; if the HDMI cable needs to be in a specific port, there is usually a label next to that port. Don’t panic if there is no specific port, either; more and more TVs are being made where you can plug that cable into any port. Now you are all plugged in and the TV is ready. All you need is to find the content 😉

Next time we will discuss where to get your 4K content and touch on HDR.

Resolution 101 – What it is and when it matters

When you’re looking at buying a television in today’s market, there are going to be a lot of numbers thrown at you.  They all mean something different and are important for varying reasons, but the most important among them is the resolution of the television.  All other factors aside, most of the sharpness and fidelity of an image on the screen will be dictated by its resolution.

Simply put, the resolution of the television is the number of pixels that are packed into the screen.  A pixel is a dot of color which, when combined with a bunch of other pixels, makes up an image.  The more pixels there are, the sharper the image.  When you’re looking at the resolution of your television, phone, or computer monitor, it will be represented in a number of different ways.

The most common resolutions you’ll encounter these days are 1080p and 4K, though there are many others. Each resolution also has a specific aspect ratio, usually based upon the most common broadcast ratio (1.78:1, or 16:9, commonly referred to as “widescreen”).

Let’s take the lowest resolution you can generally find in a television these days: 720p. That means there are 720 lines, each 1280 pixels wide, all stacked on top of each other. The little “p” at the end means that each line is progressively scanned every time the screen refreshes (as opposed to an interlaced scan, which refreshes every other line and now survives mostly in older broadcast formats like 1080i). Manufacturers all use progressive scans on their televisions these days, so you don’t have to worry about whether that top-of-the-line 8K television comes so equipped.

Generally, when you buy a television, the exact resolution will be listed, but just in case you were curious, here is a handy table that tells you how many pixels you should expect.


Resolution          Dimensions     Total Pixels
720p                1280 x 720        921,600
1080p               1920 x 1080     2,073,600
Ultra HD (4K)       3840 x 2160     8,294,400
DCI “Cinema” 4K     4096 x 2160     8,847,360
8K                  7680 x 4320    33,177,600
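
If you want to double-check those numbers yourself, the math is just width times height. Here is a quick Python sketch that reproduces the table above:

```python
# Total pixels = horizontal pixels x vertical pixels.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "Ultra HD (4K)": (3840, 2160),
    'DCI "Cinema" 4K': (4096, 2160),
    "8K": (7680, 4320),
}

for name, (width, height) in resolutions.items():
    print(f"{name:16} {width} x {height} = {width * height:,} pixels")
```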


One quick look at that chart and you can see the biggest difference: the total pixel count. A 4K screen has nine times the pixels of a 720p screen, and an 8K screen has four times that again. And the more pixels, the sharper the image is going to be. Total count isn’t the only measure of pixels that matters, though. The number of pixels per inch (PPI) is also important to consider, and it’s the biggest difference between phones, computer monitors, and televisions.

Some of the best-selling smartphones on the market are 1080p, and a good portion of the computer monitor market offers 4K. A 1080p iPhone, a 1080p computer monitor, and a 1080p television all have the same number of pixels, but compared to the TV, the phone’s pixels are crammed tighter into every inch of the screen; the pixels are physically smaller than they are in a TV. Computer monitors and smartphones are meant to be viewed from closer distances than TVs, and their images become harder to see the further you get from the screen. A TV, on the other hand, is meant to be viewed from a distance, so it needs bigger pixels to get that job done.

The same rule applies across TVs of different sizes, too: the bigger your TV, the bigger its pixels will be relative to a smaller screen with the same resolution. So while that 20” 1080p screen might look super nice, if you go much larger, consider a 4K or even 8K screen to maintain image fidelity. Just remember, more pixels produce a sharper image!
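
To put some numbers behind that, pixels per inch is just the number of pixels along the screen’s diagonal divided by the diagonal size in inches. Here is a short Python sketch comparing a few 1080p screens; the screen sizes are illustrative examples, not specific products:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: pixel count along the diagonal / diagonal size in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# All four screens are 1080p (1920 x 1080); only the physical size changes.
# The sizes below are illustrative, not specific products.
for device, size in [('5.5" phone', 5.5),
                     ('24" monitor', 24),
                     ('20" TV', 20),
                     ('65" TV', 65)]:
    print(f'{device:12} -> {ppi(1920, 1080, size):6.1f} PPI')
```

The same roughly 2 million pixels work out to about 400 PPI on the 5.5-inch phone but only about 34 PPI on the 65-inch TV, which is exactly why a big 1080p screen starts to look soft up close.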

Get the most out of your 4K TV with our new Blu-ray series, the Best of UHD.

4K & UHD

In recent years, there has been a move away from identifying televisions by the number of lines of pixels on the screen: before, we had 720p and 1080p, and now we have 4K and UHD. But what’s the difference between 4K and UHD? Why are some TVs labeled as both at the same time? And where did those names come from in the first place?

The short answer is that 4K is a cinema standard and UHD is a consumer standard. But there’s more to the difference than what type of thing you’re watching and where you’re watching it.

In 2004, Sony introduced a new style of projector for cinemas that had twice as many pixels in each direction as existing projector resolutions. The old technology was called 2K in the movie world, so it only made sense to call the new technology 4K, since it was twice as large in each dimension. Shortly thereafter, Digital Cinema Initiatives (DCI) adopted the resolution as an industry standard, and DCI 4K was born. It is a staggering 4096 x 2160 (nearly 9 million pixels).

Unfortunately, your TV isn’t the same shape as a cinema screen. When you go to the movies, you see films projected in a 1.9:1 ratio, but on your TV at home, the standard aspect ratio is 16:9 (or 1.78:1). So, when the industry wanted to start making similarly high-resolution screens for consumers, it had to trim the picture slightly to match.

Strictly speaking, since 4K was already an established DCI standard, a TV would have to keep the full 4096 x 2160 resolution to use the moniker. Unfortunately, the standard aspect ratio in TV is 16:9. To stay as close to the cinema resolution as possible, manufacturers kept the same number of horizontal lines as DCI 4K (all 2160 of them) and narrowed the width, and the resulting resolution was 3840 x 2160. Thus, Ultra High Definition (UHD) was born.
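
A little division shows where both of those aspect-ratio figures come from:

```python
# DCI "Cinema" 4K: 4096 / 2160 is about 1.90, the 1.9:1 cinema ratio.
print(round(4096 / 2160, 2))  # 1.9

# UHD: 3840 / 2160 is about 1.78, which is exactly 16:9.
print(round(3840 / 2160, 2))  # 1.78
print(round(16 / 9, 2))       # 1.78
```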

To be labelled UHD, a product must meet a few industry standards. First, it needs to have a resolution of at least 3840 x 2160. Second, it must be capable of 10-bit color depth or higher. Third, it must be capable of a much greater range of pixel brightness and darkness than previous TVs (specifically, 0.05 – 1000 nits for LED and 0.0005 – 540 nits for OLED, in case you are curious).
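
As a rough illustration of how those three checks stack up, here is a hypothetical Python sketch; the function and its inputs are invented for this example, and it reads each range above as black level through peak brightness:

```python
def qualifies_as_uhd(width, height, bit_depth, panel_type,
                     black_level_nits, peak_brightness_nits):
    """Hypothetical check of the three UHD criteria described above."""
    # 1. At least a 3840 x 2160 resolution.
    if width < 3840 or height < 2160:
        return False
    # 2. 10-bit color depth or higher.
    if bit_depth < 10:
        return False
    # 3. A wide brightness range, which differs by panel technology.
    if panel_type == "LED":
        return black_level_nits <= 0.05 and peak_brightness_nits >= 1000
    if panel_type == "OLED":
        return black_level_nits <= 0.0005 and peak_brightness_nits >= 540
    return False

# An imaginary 8K LED panel clears every bar, so it counts as UHD too:
print(qualifies_as_uhd(7680, 4320, 10, "LED", 0.03, 1100))  # True
```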

Of course, the important thing to note is that screens can exceed those parameters and still be labelled UHD. The recently announced 8K screens will automatically be considered UHD, since they meet or exceed all of the specifications required for the mark. So, to keep confusion to a minimum about what a consumer is getting, manufacturers label their products as 4K UHD.

Of course, just because you have a 4K UHD screen does not mean that you are getting 4K content. To get the most out of your TV, you need input that matches your screen’s native resolution. Otherwise, you’re just going to have an upscaled 1080p picture on your hands.

After you get your new TV, make sure you get the most out of it with our new Blu-ray series, the Best of UHD.