
What is Full HD and why is this format good.

1080p is currently the highest-definition format widely available on consumer TVs and video projectors. The name refers to 1080 lines (rows of pixels) of resolution that are displayed sequentially: every line is scanned or displayed progressively, providing the most detailed high-definition video currently available to consumers.

As a result, an increasing number of televisions and projectors are marketed as "1080p TVs / projectors". But what does that actually mean?

What is 1080p

A 1080p HDTV has a native resolution of 1920 × 1080 and displays those pixels using progressive scan. Plasma TVs, LCD TVs, and DLP projectors are all inherently progressive displays.

In addition, any TV or projector with a native (real) 1080p resolution can upscale 480p, 720p, 1080i, and other input sources to 1080p for display. In other words, a 1080p screen either upscales or deinterlaces the incoming signal internally, or accepts a direct 1080p input.

1080p/60 vs 1080p/24

Almost all TVs that accept a 1080p input can accept 1080p/60: a 1080p signal that is transmitted and displayed at 60 frames per second (in practice, each of 30 source frames may simply be displayed twice per second).

However, with the advent of Blu-ray Discs, a new variation has been promoted and marketed: 1080p/24. 1080p/24 matches the frame rate of standard 35mm film, which transfers directly at its native 24 frames per second from the source (for example, from film to Blu-ray Disc). To show this material without conversion, a TV or projector must be able to accept a 1080p input at 24 frames per second. For TVs that lack this capability, all Blu-ray Disc players can also be set to output 720p, 1080i, or 1080p/60.
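For a display that cannot accept 1080p/24, the 24 film frames have to be fitted into 60 display refreshes per second. One common approach is 2:3 pulldown, sketched below in plain Python (purely illustrative; real players and TVs do this in hardware):

```python
# Sketch: mapping 24 fps film frames onto a 60 Hz display using
# 2:3 pulldown: each film frame is shown for 2 or 3 display refreshes.
def pulldown_24_to_60(film_frames):
    """Expand a list of 24 fps frames into a 60 Hz display sequence."""
    display = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3  # alternating 2-3 cadence
        display.extend([frame] * repeats)
    return display

sequence = pulldown_24_to_60(["A", "B", "C", "D"])
# 4 film frames -> 2 + 3 + 2 + 3 = 10 display refreshes,
# matching the 24:60 = 2:5 ratio.
print(sequence)  # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

The uneven 2-3 cadence is what causes the slight motion "judder" of film shown at 60 Hz, and why accepting 1080p/24 directly gives smoother film playback.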

720p

If you buy a TV or projector with a native resolution of 1024 × 768 or 1366 × 768 (commonly sold as 720p), that figure is the number of pixels the device can physically display on screen.

In addition, some 720p sets do not accept 1080p input signals at all, but can receive 1080i. The number of pixels is the same, but the signal arrives in interlaced format (the lines of each frame are sent in two alternating even/odd passes) rather than progressive format (every line of the frame is sent in sequence). In this case, the 720p TV must not only scale the input signal down, but also convert the interlaced image to a progressive one before displaying it.

So if you buy a TV or projector with a native resolution of 1024 × 768 or 1366 × 768 pixels, you can still feed it a 1080 signal, but the 1920 × 1080 image will be downscaled to the 720p panel (just as a 480i image is upscaled to it). The final picture quality depends on how well the video processing circuitry in your TV or projector works.

If you are thinking of buying a high-definition TV with a screen smaller than 40 inches, the actual visual difference between the three main HD resolutions, 1080p, 1080i, and 720p, will be minimal.

The larger the screen, the more noticeable the difference between 1080p and the other resolutions becomes. If you are considering a high-definition TV or projector with a screen of 40 inches or larger, it is worth insisting on a real native 1080p resolution, and preferably on equipment with 1080p/24 capability.

1080i vs 1080p

1080i vs 1080p - similarities and differences

What you need to know about 1080i, 1080p / 60, 1080p / 30, and 1080p / 24

1080i and 1080p are both high-definition display formats for HDTVs, and both carry the same amount of picture information: each represents a 1920 × 1080 pixel resolution. The difference between them lies in how the signal is sent from the source component and displayed on the HDTV screen.

In 1080i, each frame is sent and displayed as two alternating fields. Each field consists of 540 lines of pixels running from the top to the bottom of the screen; the odd-numbered lines are displayed first, followed by the even-numbered lines, at 60 fields per second. Together, the two fields create a full 1080-line frame 30 times per second.

In 1080p, each frame is sent and displayed progressively: all 1080 lines that make up the full frame are displayed together. This results in a smoother image with fewer motion artifacts and jagged edges.
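The relationship between fields and frames can be sketched in a few lines of Python (a toy illustration, not a real deinterlacer): interleaving, or "weaving", the two fields recreates the full progressive frame.

```python
def weave_fields(odd_lines, even_lines):
    """Interleave the odd-line field and the even-line field of an
    interlaced frame back into a single progressive frame (so-called
    'weave' deinterlacing). This only works cleanly when both fields
    were captured at the same instant; fields captured at different
    moments produce combing artifacts on motion."""
    frame = []
    for odd, even in zip(odd_lines, even_lines):
        frame.extend([odd, even])
    return frame

# Toy 4-line example: two 2-line fields weave into a 4-line frame.
print(weave_fields(["line1", "line3"], ["line2", "line4"]))
# ['line1', 'line2', 'line3', 'line4']
```

Scaled up, two 540-line fields weave into one 1080-line frame, which is exactly the deinterlacing step a TV performs on a 1080i input.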

Variations of 1080p

Depending on the video processing involved, 1080p can be displayed in several variants: 1080p/60 (the most common), 1080p/30, and 1080p/24.

1080p/60: 60 frames per second; in practice each of 30 source frames is often displayed twice. (Enhanced video frame rate.)

1080p/30: 30 distinct frames per second. (Standard broadcast or recorded video frame rate.)

1080p/24: 24 distinct frames per second. (Standard film frame rate.)
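To keep the cadences straight, here is a tiny sketch (plain Python, purely illustrative) converting each rate into its per-frame display time:

```python
# Frame duration for each common 1080p variant, in milliseconds.
variants = {"1080p/60": 60, "1080p/30": 30, "1080p/24": 24}

durations = {name: 1000 / fps for name, fps in variants.items()}
for name, ms in durations.items():
    print(f"{name}: {variants[name]} frames/s, {ms:.1f} ms per frame")
```

So a 1080p/24 frame stays on screen roughly 41.7 ms, versus about 16.7 ms at 1080p/60, which is why the rate matters when converting between them.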

It all comes down to processing

1080p processing can be done at the source (for example, the upscaling in a DVD, Blu-ray Disc, or HD-DVD player) or by the HDTV itself.

Depending on the video processors involved, it may or may not make a visible difference whether the source device or the TV/projector performs the final processing stage (deinterlacing and conversion between 1080i and 1080p).

Any differences will be more noticeable on larger screen sizes of a projector or TV.

1080p, Blu-ray discs and HD-DVD

Also keep in mind that with Blu-ray and HD-DVD (although the HD-DVD format has been discontinued), the actual information on the disc itself is stored as 1080p/24. (Note: there are some cases of 720p/30 or 1080i/30 content on Blu-ray Discs, but these are exceptions.)

Most Blu-ray disc players have the ability to output 1080p / 24 to a compatible TV.

1080p / 60 and PC source

It is also important to note that when a PC is connected to an HDTV or projector via DVI or HDMI, the PC actually feeds 60 discrete frames to the display every second (depending on the source material), instead of repeating the same frame twice as with film-based Blu-ray Disc material. In this case, no extra conversion is required to "create" the 1080p/60 frame rate.

Ultimately, the proof is in the viewing: what the image actually looks like in the real world on your particular hardware, combined with your source devices.

Even if you take measurements or compare results across different TVs and sources yourself, you may still not see the full benefit of 1080p. Processing is what matters, and not all HDTVs and video processors are created equal, so let your eyes be your guide.

I hope the article "What is 1080p" helped a little.
Please leave comments below so I can get back to you.

Screen resolution is the number of dots (pixels) that make up the image.

Typically, the resolution depends on the type and size of the TV screen. CRT TVs have a real resolution of about 768x576; mid-size plasma TVs have resolutions of 1024x720 or 1024x768; small and mid-size LCD TVs usually had a resolution of 1366x768 until a few years ago. Currently, the LCD segment is being taken over entirely by models with a resolution of 1920 × 1080, the same as the largest LCD and plasma TVs.

TVs with a resolution of 1280x720, 1366x768, 1400x900, or 1680x1050 may be called HD Ready; those with 1920x1080 are Full HD.

It is important to distinguish the TV's own resolution from the resolution of the signal it reproduces. There are several signal resolutions with well-established abbreviations. DVD resolution (720x576 in the PAL system or 720x480 in NTSC) is called standard definition (SD). A 1280x720 picture is referred to as 720p or 720i (progressive and interlaced, respectively). The highest widely available resolution, 1920x1080, is called 1080p or 1080i depending on the scan.

The next-generation television infrastructure, conventionally called 4K or Ultra HD, is currently being rolled out. The screen resolution of these TVs, and of the corresponding video signal, is 3840x2160 (four times Full HD). However, such systems are far from widespread, since beyond the televisions themselves, cameras and the means of processing, transferring, and storing the video are also needed. Nevertheless, it is encouraging that the cost of entry-level Ultra HD TVs has dropped below 30 thousand rubles.

It should be noted that the "useful" image resolution will always equal the lower of the TV screen resolution and the video signal resolution. That is, if you watch a DVD on a Full HD screen, the "useful" resolution equals the DVD resolution. Similarly, if you send a 1080p (Full HD) signal to a 1280x720 (HD Ready) TV, the "useful" resolution equals the resolution of the TV, since it is the smaller of the two.
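This "lower of the two" rule can be written down directly; a minimal sketch in Python (the (width, height) tuple format is just a convention chosen for this example):

```python
def useful_resolution(source, display):
    """Effective resolution is the per-axis minimum of the source
    signal resolution and the screen's native resolution."""
    return (min(source[0], display[0]), min(source[1], display[1]))

# DVD (720x576 PAL) on a Full HD screen: limited by the source.
print(useful_resolution((720, 576), (1920, 1080)))   # (720, 576)
# Full HD signal on an HD Ready (1280x720) screen: limited by the display.
print(useful_resolution((1920, 1080), (1280, 720)))  # (1280, 720)
```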

In general, the higher the resolution, the better.

However, there are exceptions to every rule. To display a low-resolution signal on a high-resolution panel, it must be processed: the so-called upscaling. If the signal is of poor quality (for example, terrestrial TV), the job of the processing algorithms becomes much harder. So if terrestrial TV reception is the main intended use of the new TV, be sure to visually assess the reception quality in the store; in that case, the picture on a TV with a lower resolution may actually look better.
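Upscaling itself can be as crude as nearest-neighbour duplication. The sketch below (a deliberately naive illustration; real TVs use far more sophisticated filtering) shows why a low-quality source gains no detail from upscaling: every output pixel is just a copy of some input pixel.

```python
def upscale_nearest(rows, target_w, target_h):
    """Nearest-neighbour upscale of a 2-D pixel grid (the simplest
    possible scaler): each output pixel copies the closest input pixel."""
    src_h, src_w = len(rows), len(rows[0])
    return [
        [rows[y * src_h // target_h][x * src_w // target_w]
         for x in range(target_w)]
        for y in range(target_h)
    ]

small = [[1, 2],
         [3, 4]]
# A 2x2 grid blown up to 4x4: every input pixel becomes a 2x2 block.
print(upscale_nearest(small, 4, 4))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Real upscalers interpolate and sharpen instead of copying, which is exactly the processing whose quality varies so much between TVs.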

When choosing a TV or computer monitor, many people look first at the brand, the cost, the feature set, the diagonal and, of course, the screen resolution. Today, screens with 1080-line (Full HD) resolution are quite popular among buyers. Note that this resolution comes in two types: 1080 with a "p" suffix and with an "i" suffix. But what is the difference between them?

In fact, the difference lies in how the image is drawn on the screen: 1080p is progressive output, while 1080i is interlaced. Each has its own advantages and disadvantages, which affect real-world performance and determine which one is better for a given use.

1080i

This standard is designed for digital signals and screens with a 16:9 aspect ratio, 1080 pixels high and 1920 pixels wide. It can carry video of various formats, including Full HD.

Monitors with HD resolution most often have a refresh rate of 60 or 50 Hz. You can usually adjust it yourself in the settings of your TV or monitor.

The picture is built up in two passes. First the even lines (the second, fourth, and so on) are drawn on the screen, and then the odd ones. This is invisible to the ordinary viewer, since each field is refreshed about 30 times per second. Because of this way of displaying the image, the picture can look less sharp, which is the main disadvantage of 1080i TVs.

On the positive side, 1080i devices are inexpensive, the technology halves the video data rate, and the circuitry in such devices is much simpler.
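The "halves the data rate" claim is simple arithmetic: 1080i sends 540-line fields where 1080p/60 sends full 1080-line frames. A back-of-the-envelope sketch for uncompressed rates (assuming 24 bits per pixel; real broadcast signals are compressed, so the absolute numbers differ, but the 2:1 ratio holds):

```python
def raw_rate_mbps(width, lines_per_refresh, refreshes_per_sec, bits_per_pixel=24):
    """Uncompressed video data rate in megabits per second."""
    return width * lines_per_refresh * refreshes_per_sec * bits_per_pixel / 1e6

# 1080p/60: full 1080-line frames, 60 refreshes per second.
progressive = raw_rate_mbps(1920, 1080, 60)
# 1080i/60: 540-line fields, 60 fields per second -> half the pixel rate.
interlaced = raw_rate_mbps(1920, 540, 60)

print(f"1080p/60: {progressive:.0f} Mbit/s, 1080i: {interlaced:.0f} Mbit/s")
print(progressive / interlaced)  # 2.0
```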

1080p

This standard is no different in terms of matrix resolution. However, in addition to the standard refresh rates of 1080i, 24 Hz is also supported, and the image is drawn in a single pass rather than in two interlaced fields, so the picture is sharper. This technology is somewhat more complex, which makes such TVs more expensive.

On TVs with this display standard the picture is smoother, which makes it more pleasant to watch. In addition, small-print text is much easier to read on such a screen, and fast-moving or high-frame-rate video looks sharper and free of tearing.

What is the difference

After reviewing the specifications of the 1080i and 1080p standards, the main differences can be distinguished:

  • 1080i splits each frame into two fields and displays their lines in turn, while 1080p displays the whole frame sequentially without splitting;
  • 1080p is sharper and free of motion artifacts;
  • Modern TVs must additionally process (deinterlace and smooth) a 1080i signal, while a "p" signal needs no such processing;
  • 1080i requires less bandwidth than 1080p;
  • A "p" picture loses less quality when scaled, and text in any font and size is much easier to read.

What's better

Summing up the comparison, 1080p is clearly better than 1080i. The picture is refreshed more often and is sharper at the same time, so the strain on the viewer's eyes is reduced to a minimum. However, it is worth noting that a device with this type of display will cost noticeably more than a 1080i TV.

Devices with progressive display also support some non-standard video files. Note, too, that a 1080p device can work in 1080i mode if necessary; you just need to select that resolution in the settings. Working with both standards is possible because one frame can always be decomposed into two fields, whereas the reverse, merging two fields that were captured in turn rather than simultaneously, is not straightforward.
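That asymmetry, easy to split but hard to merge, is visible in code. Splitting a progressive frame into two fields is one slice each (a toy sketch in Python):

```python
def split_into_fields(frame):
    """Split a progressive frame (a list of rows) into two interlaced
    fields. Going 1080p -> 1080i is this simple; merging two fields
    captured at different moments back into one clean frame is not,
    which is why 1080p sets can output 1080i but not the reverse."""
    top_field = frame[0::2]     # rows 0, 2, 4, ...
    bottom_field = frame[1::2]  # rows 1, 3, 5, ...
    return top_field, bottom_field

top, bottom = split_into_fields(["r0", "r1", "r2", "r3"])
print(top, bottom)  # ['r0', 'r2'] ['r1', 'r3']
```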

Bear in mind, however, that few terrestrial television channels currently broadcast in this standard. But do not worry: according to experts, many channels will soon switch to it, and the picture quality of TV channels will then be far higher than it is now. Still, there are some nuances concerning the television signal.

In DVR specifications, 1080P may be listed as AHD-H and 1080N as AHD-NH. So what is the difference? It might seem that both mean a 2 MP resolution (as some unscrupulous sellers claim), but that is not true. 1080P is 1920 pixels horizontally by 1080 pixels vertically; 1080N is 960 horizontal by 1080 vertical. That is, 1080N has half the horizontal resolution of 1080P.

When displayed, however, a 1080N picture has the same 16:9 aspect ratio as 1080P. This is due to how the 1080N signal is encoded: 1080N recording is interlaced, and during playback the picture is stretched and smoothed horizontally.
[Comparison screenshots not reproduced here.] At 1080P the picture fills the full width; at 1080N without hardware processing the image is noticeably narrower. After hardware processing, 1080N and 1080P are hard to tell apart at a glance; the differences only become apparent in fine detail when the image is zoomed in.
So the detail at 1080N is worse than at 1080P, but not as catastrophically as you might think. Those are the cons.
On the plus side, 1080N DVRs are cheaper than 1080P ones, and 1080N recordings take up less hard disk space, which matters in some situations. So where fine detail is not critical, you can safely connect AHD DVRs in 1080N mode. Conversely, where picture quality matters, for example above a cash register, look at more expensive video surveillance systems.
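The horizontal stretch that restores 1080N's 16:9 shape can be sketched as simple pixel duplication (illustrative only; real DVRs interpolate and smooth rather than copying pixels outright):

```python
def stretch_row_2x(row):
    """Double each stored pixel horizontally: a 960-pixel 1080N row
    becomes a 1920-pixel display row, restoring the 16:9 shape at the
    cost of real horizontal detail."""
    out = []
    for px in row:
        out.extend([px, px])
    return out

print(stretch_row_2x([10, 20, 30]))  # [10, 10, 20, 20, 30, 30]

# Storage follows the pixel count: 1080N stores half as many pixels
# per frame as 1080P (960*1080 vs 1920*1080), hence smaller recordings.
print((1920 * 1080) / (960 * 1080))  # 2.0
```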

