With so many high- and ultra-high-definition resolution formats on the market, it can be hard to tell the difference between them. Take 1080i and 1080p, for example: from the outside, little to nothing is revealed about their attributes or differences.
High-definition (HD) refers to a screen resolution of 1920 pixels wide and 1080 pixels high (hence the use of “1080”). This means that both 1080i and 1080p have the same resolution. So, what’s the difference between them? Read on to find out.
The Difference Between 1080i and 1080p
The first thing to note is that the letters in 1080i and 1080p refer to which raster scan technique is used. A raster scan is simply how an image is reconstructed onto a display monitor.
The “i” in 1080i stands for interlaced scan, and the “p” in 1080p stands for progressive scan. These refer to two distinct methods of producing an image on a screen at 1920 x 1080 resolution. So, if both resolutions have 2,073,600 total pixels, what’s the difference?
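To make that pixel count concrete, here is a minimal Python snippet (purely illustrative) that multiplies the HD width and height:

```python
# Both 1080i and 1080p describe the same 1920 x 1080 pixel grid.
width, height = 1920, 1080
total_pixels = width * height

print(f"{width} x {height} = {total_pixels:,} pixels")  # 1920 x 1080 = 2,073,600 pixels
```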
Imagine your TV screen as rows of pixels. It’s 1080 pixels high, so there are 1080 rows of pixels from the top to the bottom of the TV. How fast the pixels are refreshed is referred to as the refresh rate. Most TVs and display monitors work at a refresh rate of 60 Hz (60 refreshes a second).
For video to work, a digital screen must refresh fast enough that the viewer perceives the sequence of still images as smooth motion (even though the screen is technically just flashing individual images).
The difference between 1080i and 1080p is how these pixels are refreshed to generate a consistent, easily watched “moving” image.
What Is 1080i and How Does It Work?
An interlaced scan produces an image by displaying the odd and even rows of pixels in alternation: all the odd rows are refreshed 30 times a second, and all the even rows are refreshed 30 times a second, in sequence.
Because the odd and even rows each refresh 30 times a second, an interlaced scan delivers 60 fields per second (effectively a 60 Hz refresh) while using no more bandwidth than 30 full frames would.
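As a rough sketch of how interlacing splits a frame, the Python example below (using NumPy on a made-up frame, purely for illustration) separates the odd and even rows into the two fields an interlaced signal alternates between:

```python
import numpy as np

# A stand-in frame: 1080 rows x 1920 columns of grayscale values (illustration only).
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

# Interlacing transmits the frame as two fields of alternating rows.
even_field = frame[0::2, :]  # rows 0, 2, 4, ... (540 rows)
odd_field = frame[1::2, :]   # rows 1, 3, 5, ... (540 rows)

# Each field carries half the rows, and fields alternate 60 times a second
# (30 odd + 30 even), which is where the perceived 60 Hz cadence comes from.
print(even_field.shape, odd_field.shape)  # (540, 1920) (540, 1920)
```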
The 1080i method was developed to counteract a problem with older cathode-ray screens: if the whole screen was refreshed from top to bottom too slowly, the top of the screen could end up displaying half of a different image than the bottom. On those older screens, the top of the picture also became duller and less illuminated than the bottom by the end of each scan.
The interlaced scan format was especially important when technology was limited, and it was essential to use as little bandwidth as possible. For broadcast television, it was an absolute necessity. But with the rise of better technology, 1080p came along.
1080i vs. 1080p
1080p is the format generally used on all modern screens and TVs. Instead of refreshing half of the pixels at a time—like 1080i—1080p refreshes the entire screen at once. For this reason, 1080p is sometimes referred to as “true HD.”
With the entire frame being refreshed at once, 1080p effectively processes twice the amount of information that 1080i does at the same frame rate. In practice, a progressive scan still draws the screen in a top-to-bottom “wave”, one row after another, but every row is redrawn in every pass. On a 60 Hz monitor, that means the full frame of 1080 rows is refreshed once every 1/60th of a second.
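As a rough illustration of that timing, this sketch (assuming a 60 Hz display) works out how long one full progressive frame, and each of its 1080 rows, takes to draw:

```python
# Timing of a progressive scan on an assumed 60 Hz display.
refresh_rate_hz = 60
rows_per_frame = 1080

frame_time_s = 1 / refresh_rate_hz           # every row is redrawn within this window
row_time_s = frame_time_s / rows_per_frame   # time spent on each individual row

print(f"Full frame every {frame_time_s * 1000:.2f} ms")  # ~16.67 ms
print(f"Each row drawn in ~{row_time_s * 1e6:.1f} us")   # ~15.4 us
```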
This is why 1080p requires more bandwidth than 1080i, and why 1080i was historically the more common choice. Now that bandwidth is no longer such a limitation, 1080p has become the primary format for newer digital screens.
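To see why the bandwidth differs, here is a back-of-the-envelope comparison in Python (illustrative assumptions: 8 bits per pixel, uncompressed, ignoring chroma subsampling and blanking intervals, which real broadcasts don’t ignore):

```python
# Rough, uncompressed data-rate comparison (assumptions noted above).
width, height, bits_per_pixel = 1920, 1080, 8

# 1080p60: 60 full frames per second.
p_rate = width * height * bits_per_pixel * 60

# 1080i60: 60 fields per second, each field carrying only half the rows.
i_rate = width * (height // 2) * bits_per_pixel * 60

print(f"1080p60 ~ {p_rate / 1e6:.0f} Mbit/s raw")  # ~995 Mbit/s
print(f"1080i60 ~ {i_rate / 1e6:.0f} Mbit/s raw")  # ~498 Mbit/s, half of 1080p
```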
Interestingly, a lot of TV programs are still broadcast in an interlaced format, typically 1080i. This means that 1080p-capable screens need a deinterlacing step to display the image correctly and avoid visual artifacts.
Deinterlacing is the process of reconstructing a full frame from the two fields of alternating pixel rows that 1080i transmits. When this happens, the image quality is somewhat reduced compared to true 1080p.
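One of the simplest deinterlacing strategies, often called a “weave”, just stitches the two fields back into one frame. The sketch below (NumPy, with placeholder field data) shows the idea; real TVs use far more sophisticated, motion-adaptive methods:

```python
import numpy as np

def weave_deinterlace(even_field: np.ndarray, odd_field: np.ndarray) -> np.ndarray:
    """Rebuild a full frame by interleaving the even and odd fields ("weave")."""
    rows = even_field.shape[0] + odd_field.shape[0]
    frame = np.empty((rows, even_field.shape[1]), dtype=even_field.dtype)
    frame[0::2, :] = even_field  # even field fills rows 0, 2, 4, ...
    frame[1::2, :] = odd_field   # odd field fills rows 1, 3, 5, ...
    return frame

# Placeholder 1080i fields: 540 rows each.
even = np.zeros((540, 1920), dtype=np.uint8)
odd = np.ones((540, 1920), dtype=np.uint8)
print(weave_deinterlace(even, odd).shape)  # (1080, 1920)
```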
What About 4K?
Most brand-new TVs and many computer monitors boast 4K capabilities. 4K is called “ultra-high definition” and has a resolution of 3840 x 2160 pixels, four times the pixel count of 1080p or 1080i (and don’t get me started on 8K). This resolution brings a massive change in image quality, clarity, and sharpness.
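That “four times” figure falls straight out of the pixel counts:

```python
# Comparing UHD ("4K") and full-HD pixel counts.
uhd_pixels = 3840 * 2160  # 8,294,400
hd_pixels = 1920 * 1080   # 2,073,600

print(uhd_pixels / hd_pixels)  # 4.0, i.e. exactly four times as many pixels
```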
But just as 1080p is still constrained by broadcast technology, 4K over cable or satellite is even more limited. Having said that, major sporting events are now being broadcast in 4K, so it will likely become more mainstream with time.
One setback is that a lot of 4K content is heavily compressed for more efficient transmission, which means that much of the time you aren’t experiencing true 4K.
Which Is Better: 1080i or 1080p?
The main drawback of 1080i shows up when fast motion is being displayed. Because only half the image is refreshed at a time, fast motion can cause what are known as “motion artifacts”: odd visual effects that occur because the two fields capture a moving object at slightly different positions, which are then shown together.
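To picture how such an artifact arises, the toy sketch below (NumPy, synthetic data) captures a bright vertical bar at two different positions, one per field, and then weaves the fields together; the bar ends up jagged, the classic “combing” effect:

```python
import numpy as np

height, width = 8, 16  # tiny toy frame so the effect is easy to print

def frame_with_bar(x):
    """A black frame with a 3-pixel-wide bright bar starting at column x."""
    f = np.zeros((height, width), dtype=np.uint8)
    f[:, x:x + 3] = 1
    return f

# The two fields are captured a moment apart, so the bar moves between them.
even_field = frame_with_bar(4)[0::2, :]  # bar at columns 4-6 on the even rows
odd_field = frame_with_bar(7)[1::2, :]   # bar at columns 7-9 on the odd rows

# Weaving the fields together shows the bar in two places at once: "combing".
combined = np.empty((height, width), dtype=np.uint8)
combined[0::2, :] = even_field
combined[1::2, :] = odd_field
print(combined)
```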
1080p avoids this issue and delivers much better image quality in fast-motion scenes. Further, 1080p generally looks more vivid and realistic, which most people prefer. The higher image quality (around 60% better) comes from the fact that in 1080i the even and odd rows of pixels aren’t displayed simultaneously. In practical terms, 1080i ends up similar in quality to 720p.
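The 720p comparison can be made concrete with a per-refresh pixel count (a very rough heuristic that ignores deinterlacing quality and motion handling):

```python
# Pixels delivered per refresh (rough heuristic only).
per_field_1080i = 1920 * 540   # one interlaced field: 1,036,800 pixels
per_frame_720p = 1280 * 720    # one progressive 720p frame: 921,600 pixels
per_frame_1080p = 1920 * 1080  # one progressive 1080p frame: 2,073,600 pixels

print(per_field_1080i, per_frame_720p, per_frame_1080p)
```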
One problem, though, is that a lot of satellite and TV broadcasts are still in an interlaced format, meaning the full quality of 1080p isn’t being delivered.
With consistent technological improvements in this space, progressive scanning is already becoming the primary format for digital displays. Eventually, most broadcasts will likely use the progressive scan format.