Those old enough to remember analog television sets and video cassette recorders (VCRs) no doubt appreciate the evolution of digital media technology over the last 25 years. Starting with the advent of the DVD (digital versatile disc) in the 1990s and the high-definition TVs of the early 2000s, the image quality of popular films, shows, sports broadcasts, video games, home movies, and even security footage has been improving at a steady pace thanks to advancements in video technology. This, in a nutshell, is the evolution of video resolution.
To better understand how resolution affects image quality, imagine being handed a photo of a flower along with a box of 100 red tiles, 100 green tiles, and 100 blue tiles, and being asked to recreate the photo with the tiles. Now imagine doing the same task, only this time you're given 1,000 of each colored tile instead of 100. Besides being exhausted from doing 10x the labor, you'd notice that the larger recreation, when viewed in its entirety, is much richer and more detailed than the smaller image viewed at the same relative scale.
In short, the more individual pieces that can be used to express an artificial version of reality, whether colored tiles or colored pixels, the more realistic the representation will look.
With DVDs remaining a staple in many households, let's define today's lowest acceptable resolution as 480p. The upper limit as of 2018 is 8K Ultra HD. In between these two are 720p, 1080p, and 4K.
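To make the jump between these labels concrete, here is a quick sketch of the pixel counts behind each resolution tier. The widths and heights used are the common consumer conventions for each label (480p is taken as the NTSC DVD frame of 720x480):

```python
# Pixel counts for common consumer resolution tiers.
# Dimensions are the typical consumer conventions for each label;
# "480p" here assumes the NTSC DVD frame of 720x480.
resolutions = {
    "480p (DVD)":    (720, 480),
    "720p HD":       (1280, 720),
    "1080p Full HD": (1920, 1080),
    "4K Ultra HD":   (3840, 2160),
    "8K Ultra HD":   (7680, 4320),
}

baseline = 720 * 480  # 480p pixel count

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels "
          f"(~{pixels / baseline:.1f}x the pixels of 480p)")
```

Note how steep the curve is: each step roughly quadruples the pixel count, so 4K carries four times the pixels of 1080p, and 8K carries sixteen times as many.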
Despite advances in resolution, 1080p remains the gold standard of video quality as of 2018. Whether it's the family TV or the recordings of a wireless IP security camera, the most common resolution encountered today is 1080p. This mostly has to do with the relationship between image quality and pricing. As superior as 4K and 8K resolution are compared to 1080p, the lower-grade high definition still looks pretty good to most people. And since manufacturing techniques improve every year, costs go down and a technology like 1080p resolution becomes more and more affordable.
Meanwhile, 4K Ultra HD and 8K Ultra HD represent state-of-the-art video image quality, and this is reflected in the pricing. Most consumers are still unable to justify spending that much money for the sake of a significantly better picture.
Give it time. Ten years ago it was the same debate, only back then the argument was whether or not to buy a 1080p TV, which at the time was state-of-the-art from a consumer's point of view. As the prices for 4K and 8K video technology come down, the gold standard may need to be redefined. It happened before and it will happen again.
This brings us to one logical conclusion: the upper limits of consumer-grade video technology are going to climb past 8K in the years ahead. How is that even possible? Only time will tell, but it's reasonable to expect the video technology of the 2020s and 2030s to be so crisp and clear as to be mistaken for reality. In fact, many argue we've already reached the lower end of that achievement with 8K Ultra HD.
Maybe the video technology of tomorrow will arrive at a point where something like the Holodeck from Star Trek becomes a reality. In a few decades, perhaps a person could stand inside a room encased in video screens and projection imagery capable of recreating reality down to the finest detail.
Again, only time will tell. But if the past and present are any indication, video resolution is destined to keep getting better and better.