Screen resolution, or display resolution, was not a big deal in earlier technological eras. Windows offered only a few basic options, and to get a higher resolution, more colors, or both, you had to install a driver for your video card. As time went on, better video cards and better displays became available. Today we have many choices when it comes to displays, their quality, and the resolutions they support. In this post, we walk you through some background information and define the key terms, including screen resolution abbreviations like 1080p, 2K, QHD, and 4K. Let’s get started.
IBM and CGA served as the catalyst
IBM was the company that first created color graphics technology: the color graphics adapter (CGA) came first, followed by the enhanced graphics adapter (EGA) and then the video graphics array (VGA). Regardless of your monitor’s capabilities, you still had to pick from one of the limited options offered by your graphics card’s drivers. For nostalgia’s sake, here is a glimpse of how things looked on a once-famous CGA display.

Choosing a screen resolution is more complicated than it used to be, thanks to the introduction of high-definition video and the growing popularity of the 16:9 aspect ratio (we explain aspect ratios in a moment). On the other hand, there are far more options available now, with something to suit almost everyone’s preferences. Let’s look at the vocabulary used today and what it means.
What does “screen resolution” actually mean?
Strictly speaking, “resolution” is the wrong term for the number of pixels on a screen: that number says nothing about how densely the pixels are packed. Pixel density is measured by a different figure, PPI (pixels per inch).
Technically, “resolution” refers to the number of pixels per square inch rather than the total pixel count. In this article, however, we use the term as it is popularly understood, not in its 100% technically correct sense. Whether precisely or not, resolution has always been described as the number of pixels arranged horizontally and vertically on a monitor. For example, 640 x 480 works out to 307,200 pixels. The available options depended on the capabilities of the video card and varied from manufacturer to manufacturer.
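To make the arithmetic concrete, here is a minimal Python sketch (the resolutions are just the examples used in this article) that multiplies width by height to get the total pixel count:

```python
# Total pixel count for a few example resolutions mentioned in this article.
examples = {
    "VGA (640 x 480)": (640, 480),
    "Full HD (1920 x 1080)": (1920, 1080),
    "4K UHD (3840 x 2160)": (3840, 2160),
}

for name, (width, height) in examples.items():
    total_pixels = width * height  # e.g. 640 * 480 = 307,200
    print(f"{name}: {total_pixels:,} pixels")
```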

Because Windows supported only a limited number of resolutions, if you lacked the driver for your video card, you were stuck with a lower-resolution display. If you have ever watched an older Windows Setup run or installed a new video driver, you may have briefly seen the low-resolution 640 x 480 screen. It was ugly, but it was the Windows default.
As monitor quality improved, Windows started to offer a few more built-in options, but if you wanted a really high-resolution display, you were still largely dependent on the graphics card manufacturers. The most recent versions of Windows can detect the monitor and graphics card and set a suitable default screen resolution. What Windows selects may not always be the best choice, but it works, and you can change it after you see how it looks.

A note of caution about screen resolution labels
You may have seen terms like 720p, 1080i, or 1080p used to describe screen resolution. Why does that matter? First of all, the letter tells you how the image is “painted” on the monitor: “p” stands for progressive scan and “i” for interlaced scan.
Interlaced scan is a holdover from early CRT monitors and televisions. Pixels are arranged in horizontal lines across the monitor or TV screen. On older monitors and TVs, the lines were fairly easy to see if you stood close to the screen; today’s pixels are so small that they are hard to make out even with magnification. The monitor’s circuitry paints the screen line by line, too quickly for the eye to notice. On an interlaced display, all the odd lines are painted first, followed by all the even lines.
Flicker has always been an issue with interlaced scan, because the screen is painted in alternating sets of lines. Manufacturers have addressed this problem in several ways. The most common is to raise the refresh rate, the number of times a full screen is painted per second. The typical refresh rate was 60 Hz, which is acceptable for most people, but it could be raised a little to eliminate the flicker that some users still noticed.
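If it helps to see what a refresh rate means in practice, here is a tiny Python sketch, purely an arithmetic aid and not anything Windows-specific, that converts a refresh rate in hertz into the time available to paint each frame:

```python
# Convert a refresh rate in hertz to the time available to paint one full frame.
def frame_time_ms(refresh_rate_hz: float) -> float:
    return 1000.0 / refresh_rate_hz

for hz in (60, 75):  # 60 Hz was typical; slightly higher rates help reduce flicker
    print(f"{hz} Hz -> about {frame_time_ms(hz):.1f} ms per frame")
```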

The terms refresh rate and frame rate diverged when consumers moved away from older CRT displays, because an LED monitor works differently. The frame rate is the speed at which the monitor displays each individual frame of data. LED screens do not flicker, and the most recent versions of Windows set the frame rate to 60 Hz, or 60 cycles per second. In addition, because the new digital screens are so much faster, the industry switched from interlaced scan to progressive scan: instead of painting all the odd lines and then all the even lines, a progressive scan paints the lines in order, from top to bottom.
To translate, 1080p, for example, is used for displays with progressive scan and 1,080 horizontal lines of resolution. Wikipedia’s Progressive scan article has a mind-boggling example of the differences between progressive and interlaced scans; its Interlaced video article is another fascinating history lesson.
What the 720p, 1080p, 1440p, 2K, 4K, and 8K resolutions mean
When high-definition TVs became the standard, manufacturers developed a shorthand to describe their display resolution. The most common labels are 720p, 1080p, 1440p, and 4K. As we have seen, the “p” and the “i” tell you whether the display uses progressive or interlaced scan. These abbreviations are also occasionally used for computer monitors, even though a monitor typically offers a higher-resolution display than a TV. The number always refers to the number of horizontal lines on the display.
The shorthand is translated as follows:
- The standard for HD or “HD Ready” resolution is 720p (1280 x 720).
- The resolutions 1080p (1920 x 1080), often known as “Full HD,” and 1440p (2560 x 1440), also known as “Quad HD,” are typically found on gaming monitors and high-end smartphones.
- The 1440p resolution has four times the pixels of 720p HD or “HD Ready.” Adding to the confusion, many high-end smartphones use a so-called Quad HD+ resolution of 2960 x 1440, which still falls under 1440p.
- Ultra HD resolution, also referred to as 4K or 2160p, is 3840 x 2160 pixels. It is a large display resolution found on high-end TVs and computer monitors. 2160p is called 4K because its width is close to 4,000 pixels. In other words, it offers four times as many pixels as 1080p FHD, or “Full HD.”
- 8K, also known as 4320p (7680 x 4320), offers 16 times as many pixels as the standard 1080p FHD or “Full HD” resolution. For now, 8K is available only on expensive TVs from Samsung and LG. However, you can use an 8K video sample to see whether your machine can render that much data. The quick calculation after this list verifies these pixel-count ratios.
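To double-check the “four times” and “16 times” claims above, here is a small Python sketch that compares the pixel counts of the resolutions from the list:

```python
# Compare the total pixel counts of the shorthand resolutions listed above.
resolutions = {
    "720p (HD)":       (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "1440p (Quad HD)": (2560, 1440),
    "2160p (4K UHD)":  (3840, 2160),
    "4320p (8K)":      (7680, 4320),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.2f}x Full HD)")
# 4K prints as 4.00x Full HD, and 8K as 16.00x Full HD.
```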
The problem with this naming scheme: there is no true 2K for consumer devices
The 2K resolution, which stands for 2048 x 1080, is used in filmmaking. In the consumer market, however, it would simply be regarded as 1080p. Worse, some display manufacturers use “2K” for any resolution with a horizontal pixel count of roughly 2,000 or more, such as 2560 x 1440. That label is unfortunately inaccurate: 2560 x 1440 is 1440p, or Quad HD, not 2K.
This is why it is misleading to say that a TV, computer display, smartphone, or tablet has a 2K resolution. The actual resolution is more likely 1440p, or Quad HD.

Can low-resolution screens display high-resolution videos?
You might wonder whether you can watch a high-resolution video on a screen with a lower resolution. For instance, is it possible to watch a 1080p film on a 720p TV? The answer is yes. You can watch any video on your screen, regardless of whether the video’s resolution is higher or lower than that of the display. However, if the video you want to watch has a higher resolution than your display, your device downscales it to match the display’s resolution. This is called downsampling.
For instance, a video with a 4K resolution will be shown at 720p on a 720p screen, because that is the maximum resolution the screen can display.
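As a rough sketch of the scaling arithmetic involved, and assuming nothing about how a particular player implements its filtering, the following Python snippet computes the size a higher-resolution frame is reduced to so that it fits a smaller display while keeping its aspect ratio:

```python
# Illustrative only: find the size a frame is scaled down to so that it
# fits a smaller display while keeping its aspect ratio.
def fit_to_display(video_w: int, video_h: int, screen_w: int, screen_h: int):
    scale = min(screen_w / video_w, screen_h / video_h)
    return round(video_w * scale), round(video_h * scale)

print(fit_to_display(3840, 2160, 1280, 720))  # 4K video on a 720p screen -> (1280, 720)
print(fit_to_display(1920, 1080, 1280, 720))  # 1080p video on a 720p screen -> (1280, 720)
```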
What does aspect ratio mean?
Aspect ratio, which describes how wide a picture is in relation to its height, was first used to describe motion pictures. Early movies used a 4:3 aspect ratio, which was later adopted by television and early computer displays. Motion pictures moved to wider aspect ratios much sooner, which meant that movies had to be cropped or otherwise altered when they were shown on television.

As display technology advanced, TV and monitor makers also adopted widescreen displays. “Widescreen” originally referred to anything wider than a standard 4:3 monitor, but it soon came to mean a 16:10 ratio and, later, 16:9. Nowadays, almost all computer monitors and TVs are widescreen, and television broadcasts and websites have adjusted accordingly.
Until 2010, 16:10 was the most common aspect ratio for widescreen computer displays. However, with the boom in popularity of high-definition televisions, which used resolutions such as 720p and 1080p and made those names synonymous with high definition, 16:9 has replaced 16:10 as the standard high-definition aspect ratio.
Depending on the aspect ratio of your display, you can only use resolutions that match its particular width-to-height proportions (the short calculation after the list shows how a resolution maps to an aspect ratio). Some of the most common resolutions for each aspect ratio are:
- Resolutions with a 4:3 aspect ratio include 640 x 480, 800 x 600, 960 x 720, 1280 x 960, 1400 x 1050, 1440 x 1080, 1600 x 1200, 1856 x 1392, 1920 x 1440, and 2048 x 1536.
- Resolutions with a 16:10 aspect ratio are 1280 x 800, 1440 x 900, 1680 x 1050, 1920 x 1200, and 2560 x 1600.
- Resolutions with a 16:9 aspect ratio include 1024 x 576, 1152 x 648, 1280 x 720 (HD), 1366 x 768, 1920 x 1080 (FHD), 2560 x 1440 (QHD), 3840 x 2160 (4K UHD), and 7680 x 4320 (8K UHD).
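If you are curious how a given resolution maps to an aspect ratio, the short Python sketch below reduces the width and height by their greatest common divisor; note that a few marketing ratios are rounded, so a resolution such as 1366 x 768 does not come out as exactly 16:9:

```python
from math import gcd

# Reduce a resolution to its simplest width:height ratio.
def aspect_ratio(width: int, height: int) -> str:
    divisor = gcd(width, height)
    return f"{width // divisor}:{height // divisor}"

print(aspect_ratio(1920, 1440))  # 4:3
print(aspect_ratio(1920, 1200))  # 8:5, which is the same as 16:10
print(aspect_ratio(1920, 1080))  # 16:9
print(aspect_ratio(1366, 768))   # 683:384, only approximately 16:9
```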
Is there a connection between display orientation and aspect ratio?
The two most popular screen orientations are landscape and portrait, and the display orientation describes how you view a screen. With a landscape configuration, the screen’s width is greater than its height, and in a portrait orientation, the converse is true. The majority of large screens, including those on our laptops, TVs, and PCs, are in landscape orientation. Smaller screens, like those on our smartphones, are typically used in portrait mode but can also be used in landscape mode due to their size and ease of rotation.
The aspect ratio is the ratio between the screen’s longer and shorter sides. Consequently, when you look at a screen in landscape mode, the aspect ratio tells you the ratio of its width to its height. The reversed, portrait-mode ratio is not normally used to describe screens or other rectangular shapes.

In other words, you could argue that a 16:9 aspect ratio is equivalent to a 9:16 aspect ratio, but the latter is not an accepted way of talking about aspect ratios. Screen resolution, however, can be stated either way: a resolution of 1920 x 1080 pixels and one of 1080 x 1920 pixels are equivalent; the only difference is the orientation.
What impact does screen size have on resolution?
Although a 4:3 TV can be set to display black bars at the top and bottom of the screen while a widescreen movie or show is playing, it does not make sense to do this with a monitor, so Windows does not even give you that option. You can still watch movies with black bars, as if on a TV screen, but it is your media player that adds them.

More important than the monitor’s size is its ability to display higher-resolution images. The higher you set the resolution, the smaller the items on the screen become, until eventually the text is so small it is unreadable. On a larger monitor the resolution can be pushed quite high, but if the pixel density is not adequate, the image becomes illegible before you reach the highest resolution available.
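To illustrate why pixel density matters more than raw size, here is a small Python sketch of the standard PPI calculation (diagonal pixel count divided by the diagonal size in inches); the screen sizes are hypothetical examples, not measurements of any particular monitor:

```python
from math import hypot

# Pixels per inch: the diagonal resolution in pixels divided by the
# diagonal screen size in inches.
def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    return hypot(width_px, height_px) / diagonal_inches

# Hypothetical sizes: the same Full HD resolution is much denser on a
# 24-inch monitor than when stretched across a 40-inch screen.
print(round(ppi(1920, 1080, 24)))  # about 92 PPI
print(round(ppi(1920, 1080, 40)))  # about 55 PPI
```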
In many cases, if you tell Windows to use a resolution that the monitor cannot handle, the monitor displays nothing at all. In other words, do not expect miracles from a cheap monitor. When it comes to high-definition displays, you definitely get what you pay for.
Do you have any other questions about screen resolutions?
If you are not technically inclined, all the terms relating to displays and resolutions can be confusing. Hopefully, this post has helped you get a better grasp of aspect ratios, resolutions, and the types of displays that matter most. If you have any questions about this topic, feel free to leave them in the comments below.