YouTube has long offered the ability to adjust video quality, from as low as 144p all the way up to 4K (2160p), to provide the best streaming experience. The ideal setting depends on several factors, including the user's internet speed, the device used to stream, and the resolution of the display. But that's all there is to it: adjusting the video quality.
Display or screen resolution, on the other hand, is a little more complicated, since it involves both hardware and software working together to render a specific resolution properly on your phone, tablet, laptop, or TV.
It certainly affects your viewing experience: the higher the resolution, the sharper the image and the better the overall quality.
Hence, display resolution is an important factor to consider when purchasing a new gadget, especially a smartphone, where entry-level phones usually have lower resolutions than flagship models with over-the-top display specs.
In this guide, we are going to take a closer look at the most common display resolutions and discuss how they impact your viewing experience.
Disclaimer: Computer monitors have their own naming conventions for screen resolutions, such as Video Graphics Array (VGA), Extended Graphics Array (XGA), and Hexadecatuple XGA (HXGA). This guide won't cover those; instead, it focuses on the more widely known screen resolutions found on TVs and mobile devices.
What is a display resolution?
Display or screen resolution simply refers to the number of distinct pixels, horizontally and vertically, that make up the image on a screen. The more pixels a screen can display, the sharper and more detailed the image will be. A higher resolution also means that more content can fit on the screen at once.
Display resolution is often written as a format such as 1920 x 1080, accompanied by a base name like 1080p, where the number typically refers to the vertical pixel count, i.e. the height.
Sometimes, higher resolutions like Ultra HD are given a name such as 4K, where the 'K' stands for kilo (1,000); the name stuck because the horizontal pixel count of UHD (3840 x 2160) almost reaches 4,000.
There's also the aspect ratio, which refers to the proportion between the width and height of an image or screen. It is expressed as two numbers separated by a colon, such as 16:9 or 4:3, where the first number represents the width and the second the height.
Different aspect ratios are suited to different types of content and can affect how the content is displayed on a screen.
While aspect ratio and resolution are closely related, they are NOT interchangeable terms. We'll discuss this more later.
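To make the relationship concrete, here's a minimal Python sketch (the resolutions listed are just common examples) that computes a format's total pixel count and reduces it to its aspect ratio:

```python
from math import gcd

def describe(width: int, height: int) -> str:
    """Summarize a resolution: total pixels plus its reduced aspect ratio."""
    divisor = gcd(width, height)
    ratio = f"{width // divisor}:{height // divisor}"
    return f"{width} x {height}: {width * height:,} pixels, {ratio} aspect ratio"

for w, h in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(describe(w, h))
```

Every format in that list reduces to 16:9, which is exactly the point: the same aspect ratio can describe wildly different pixel counts, so the two terms can't be swapped for one another.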
To avoid overcomplicating things, I have grouped the commonly known resolutions according to their quality and their base names.
Low-quality screen resolutions
3G-enabled feature phones were able to play videos at up to 240p, while Standard Definition (SD) was used in most older televisions. SD has a resolution of 640 x 480 pixels, much lower than the resolutions used in modern displays.
The term “standard” refers to the fact that this was the most common display resolution when televisions were at their peak of popularity.
High Definition (HD) resolutions
HD (1280 x 720)
HD, or 720p, is the lowest resolution considered to be high definition.
This resolution is commonly found on smaller devices such as budget-friendly televisions, tablets, and smartphones. While it may not offer the same level of detail as higher resolutions, it still provides a significant improvement over standard definition.
Full HD (1920 x 1080)
Full HD, or 1080p, a widescreen 16:9 format, is the most common resolution for modern devices.
It offers a crisp, clear picture and is suitable for most uses, including gaming, streaming, and general use. Full HD is found on most mid-range to high-end devices, including televisions, monitors, and laptops.
Quad HD (2560 x 1440)
Quad HD, or 1440p, offers a significant increase in resolution over Full HD.
It is commonly found on high-end smartphones, gaming monitors, and high-end laptops. Quad HD, as its name suggests, packs four times the pixels of HD.
The increased pixel density provides sharper images and text, making it ideal for tasks such as photo editing or video production.
Ultra HD (UHD) resolutions and beyond
Ultra HD (3840 x 2160)
Ultra HD, or 4K, is the resolution of choice for high-end TVs, monitors, and projectors.
UHD offers four times the pixels of Full HD, or nine times those of HD, and provides an incredibly sharp and detailed image. As expected, it requires powerful hardware to run smoothly.
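Those multiples are easy to sanity-check by comparing total pixel counts:

```python
# Total pixel counts for each resolution tier.
hd = 1280 * 720        # 921,600 pixels
full_hd = 1920 * 1080  # 2,073,600 pixels
quad_hd = 2560 * 1440  # 3,686,400 pixels
uhd = 3840 * 2160      # 8,294,400 pixels

print(quad_hd / hd)    # 4.0 -> Quad HD is four times HD
print(uhd / full_hd)   # 4.0 -> UHD is four times Full HD
print(uhd / hd)        # 9.0 -> UHD is nine times HD
```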
Modern gaming has pushed this resolution toward becoming a standard, with the latest home consoles, such as the PlayStation 5, capable of running games at 4K.
8K (7680 x 4320)
8K is the latest and greatest in display resolution to date.
This resolution offers an incredibly sharp and detailed image, with an even higher pixel density than Ultra HD. However, it is not yet widely available and requires an even more powerful computer or TV to run smoothly.
Additionally, there is currently limited content available in 8K. But the possibility of it going mainstream is not totally blurred.
What does the ‘p’ stand for?
You may have thought that the letter ‘p’ next to a resolution’s base name stood for ‘pixel’. I myself thought the same, but I was wrong. It actually stands for ‘progressive scan’.
Progressive scanning is one of the two primary methods used to display visual information on a TV screen; the other is interlaced scanning. Unlike interlaced scanning, which transmits the odd-numbered and even-numbered lines of a frame in alternating passes, progressive scanning transmits all the lines that make up a single frame at once.
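Here is a toy Python sketch of the difference in line ordering (a six-line frame, purely illustrative):

```python
# A toy frame made of six scan lines.
frame = [f"line {i}" for i in range(6)]

# Progressive scan: every line of the frame, in order, in a single pass.
progressive = frame

# Interlaced scan: one field of alternating lines, then the other.
interlaced = frame[0::2] + frame[1::2]

print(progressive)  # ['line 0', 'line 1', 'line 2', 'line 3', 'line 4', 'line 5']
print(interlaced)   # ['line 0', 'line 2', 'line 4', 'line 1', 'line 3', 'line 5']
```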
In the past, broadcasting networks used analog signals to transfer video. This was improved over time, but US analog television only allowed for a 480i resolution. In the 1980s, digital video for commercial use was developed, leading to new possibilities.
In the 1990s, the Federal Communications Commission (FCC) created new video resolution standards that included “high-definition” 1080- and 720-line video made possible by digital video.
This marked the beginning of the digital era of television, and with the further rise of digital video, progressive scanning has become the norm, as it should be.
Why do some resolutions have a plus sign?
This is where aspect ratio becomes more relevant. Smartphones, and even computer monitors today, have gone beyond the conventional 16:9 aspect ratio.
Phones are getting taller and taller, and monitors wider and wider; however, that doesn't mean the screen resolution gets any better in quality. It's just a more spacious display for us to enjoy and utilize, while the resolution class stays the same.
For example, compare a Google Pixel with a 5-inch 1920 x 1080 screen to a 24-inch computer monitor with the same resolution. Despite the big difference in screen size, both project images and content at the same resolution, 1080p.
But let's get back to the plus suffix. In more technical terms, a resolution with a plus sign simply has extra pixels along one dimension (usually the longer side, the first number in the format) while the other dimension stays the same.
Take the Redmi 10A, for example: it has an HD+ resolution of 1600 x 720 pixels, while the default HD format is 1280 x 720 pixels. See where this is headed?
Now, the aspect ratio of the Redmi 10A is 20:9. Since that exceeds the usual 16:9, the panel is expected to have more pixels along one side too, hence the HD+ naming convention.
The same logic applies to Full HD+ (FHD+). Take the Nothing Phone (1), for instance: its 2400 x 1080 pixels exceed the usual 1920 x 1080 format, therefore it bears the FHD+ label.
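The arithmetic behind those labels is straightforward: given the short side and the aspect ratio, the long side follows directly. A minimal Python sketch using the figures from the examples above:

```python
def long_side(short_px: int, ratio_long: int, ratio_short: int) -> int:
    """Length of the longer side for a given short side and aspect ratio."""
    return short_px * ratio_long // ratio_short

# HD+ on a 20:9 panel: a 720 px short side stretches to 1600 px
print(long_side(720, 20, 9))   # 1600 (vs. 1280 on a standard 16:9 HD panel)

# FHD+ on a 20:9 panel: a 1080 px short side stretches to 2400 px
print(long_side(1080, 20, 9))  # 2400 (vs. 1920 on a standard 16:9 FHD panel)
```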
One last thing: Pixels-per-inch (PPI)
You may have heard of PPI, which stands for Pixels Per Inch. It is a measure of pixel density: the number of pixels that fit along a single inch of the screen, and it is used to compare displays such as computer monitors, TVs, and smartphones.
Manufacturers disclose this figure on their products' spec sheets, and PPI determines the sharpness of images, fonts, and the smoothness of lines.
In the past, CRT screens only offered around 96 PPI, whereas laptop and desktop LCD screens now sit around 100 to 140 PPI. Small smartphone displays, however, need a higher PPI to present the same amount of information clearly. For instance, the first iPhone, released in 2007, had 160 PPI, but three years later the iPhone 4 shipped with a Retina Display at 326 PPI.
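PPI is typically derived from the resolution and the diagonal screen size: take the diagonal pixel count via the Pythagorean theorem, then divide by the diagonal in inches. A minimal Python sketch, reusing the earlier phone-versus-monitor comparison (the 5-inch and 24-inch sizes are illustrative):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 5.0)))   # ~441 PPI on a 5-inch 1080p phone
print(round(ppi(1920, 1080, 24.0)))  # ~92 PPI on a 24-inch 1080p monitor
print(round(ppi(960, 640, 3.5)))     # ~330 PPI, close to the iPhone 4's quoted 326
```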
Speaking of Retina display, the term was first introduced by Apple in 2010 as a marketing term to describe displays with a pixel density ‘high enough’ that the human eye is unable to discern individual pixels at a typical viewing distance.
It's worth noting that the human eye can only resolve so much detail, with the sweet spot sitting at around 400 PPI at typical viewing distances. If a display goes beyond that (like the Samsung Galaxy S23 Ultra at ~500 PPI), you'll likely notice barely any difference.
The Retina branding was used to differentiate Apple's high-resolution displays from others on the market at the time, and it has been used on a variety of Apple products ever since, including iPhones, iPads, and MacBooks.
Conclusion
Display resolutions have come a long way since the days of analog signals and standard definition. From low-quality resolutions on feature phones to ultra-high definition on high-end TVs and smartphones, there is a wide range of resolutions available to suit different needs and preferences.