Resolution coexistence: The way forward for graphics
By John Watkinson
Some years ago, when life was simpler, there were only two scanning standards to cater for: 525 line and 625 line. Computers were too slow to work on images in real time, and special hardware had to be built. But things change. Today there is a veritable morass of scanning standards, not only in television, but also in other areas where digital technology has been applied to images — digital cinema and computers, for example.
Figure 1. Digital image technology has resulted in a number of different scanning standards. This diversity encompasses two opposing trends: the first is an increasing pixel count used for increased quality; the second is the use of compression and low bit rates to lower costs.
Figure 1 shows this diversity, in which there are two opposing trends. The first of these is an increasing pixel count intended for increased quality; the other is the use of compression and low bit rates to allow low-cost or hitherto impractical services.
Figure 2 shows how standards can vary in a large number of technical parameters. One irritating incompatibility is that computer display formats use square pixels whereas Rec. 601 SDTV doesn’t. There are a few different aspect ratios in video and computers, whereas in film one loses count.
Figure 2. Digital imaging standards can vary in a number of technical parameters, including aspect ratio and gamma.
Gamma is universal and standardized in TV, whereas computers may use linear light coding or even a nonstandard gamma. Given that the eye sees less detail in color, some different approaches to color coding are to be expected. The RGB output of the traditional computer needs too much bandwidth for production and broadcast, where color difference working is the norm. The color difference data may be downsampled in a variety of ways. Production equipment uses 4:2:2, whereas DVB and DVD use 4:2:0, which has to be interpolated vertically to allow use with interlaced systems.
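As a minimal sketch (not from the original article), the difference between the two downsampling schemes can be shown in a few lines of Python/NumPy: 4:2:2 halves the color-difference resolution horizontally only, while 4:2:0 halves it vertically as well. Real 4:2:0 systems use proper vertical filtering for interlace; the simple block average here is purely illustrative.

```python
import numpy as np

def subsample_422(chroma):
    """4:2:2 — keep every other color-difference sample horizontally."""
    return chroma[:, ::2]

def subsample_420(chroma):
    """4:2:0 — halve the color-difference resolution both ways by averaging 2x2 blocks."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

cb = np.random.rand(576, 720)      # a full-resolution Cb plane (hypothetical size)
print(subsample_422(cb).shape)     # (576, 360)
print(subsample_420(cb).shape)     # (288, 360)
```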
Clearly the single-format or 525/625 switchable graphics unit is dead. The question is, what should replace it? Fortunately, given the massive flexibility and speed of modern processors, the answer is straightforward. Stated simply, equipment should be built that is so flexible that it really doesn’t care what the format is. Whatever the format coming in, the equipment should work at that standard.
Figure 3. Digital processing consists mostly of multiplication and addition, as when a digital transversal filter is used.
While digital recording and transmission can be completely lossless, this is not true when processing is carried out. When processing takes place, pixel values will be multiplied and added in various ways. Figure 3 shows a common processing tool, the transversal filter used in most DVEs and re-sizers. Another common process is conversion from RGB to color-difference signals. A matrix like the one shown in Figure 4 is used for this process.
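To make the multiply-accumulate structure concrete, here is a minimal sketch of a transversal (FIR) filter in Python/NumPy. The tap values are purely illustrative, not those of any real DVE or re-sizer.

```python
import numpy as np

def transversal_filter(samples, taps):
    """Each output sample is the sum of input samples multiplied by the tap
    weights — the multiply-and-add structure shown in Figure 3."""
    return np.convolve(samples, taps, mode="same")

# Illustrative tap weights (not a broadcast-grade filter design).
taps = np.array([-0.05, 0.0, 0.3, 0.5, 0.3, 0.0, -0.05])
line = np.random.rand(720)          # one line of pixel values
filtered = transversal_filter(line, taps)
```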
The finite precision of digital systems makes some quality loss inevitable. Multiplying pixel values produces results that need extra bits to carry the full resolution. These extra bits are easily carried inside processors, but are lost when a standard word-length output is needed. For the best results, the rounding off to the standard word length should be done only once; anything else will cause generation loss.
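A small numerical sketch (illustrative gain values only) shows why rounding should happen only once: quantizing to 8 bits after every stage accumulates error, whereas carrying the extra precision internally and rounding once at the output does not.

```python
import numpy as np

def quantize8(x):
    """Round to an 8-bit integer code value."""
    return np.clip(np.round(x), 0, 255)

gains = [0.7, 1.3, 1.1]                     # three illustrative processing stages
pixels = np.random.uniform(16, 235, 1000)   # illustrative input code values

# Round after every stage (cascaded generation loss).
cascaded = pixels.copy()
for g in gains:
    cascaded = quantize8(cascaded * g)

# Keep full precision internally, round once at the output.
once = quantize8(pixels * np.prod(gains))

ideal = pixels * np.prod(gains)
print(np.abs(cascaded - ideal).mean())   # typically the larger error
print(np.abs(once - ideal).mean())       # typically the smaller error
```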
Figure 4. Color coding is different in different imaging technologies. In some cases, a matrix converts RGB output to color-difference signals for use in broadcast and production.
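For reference, the Rec. 601 weighting can be written as a small matrix multiplication. This sketch uses the standard 601 coefficients on normalized (0–1) RGB, with no offsets or quantization.

```python
import numpy as np

# Rec. 601 RGB -> Y, Cb, Cr weighting (normalized signals).
M = np.array([
    [ 0.299,  0.587,  0.114],   # Y
    [-0.169, -0.331,  0.500],   # Cb = 0.564 (B - Y)
    [ 0.500, -0.419, -0.081],   # Cr = 0.713 (R - Y)
])

def rgb_to_ycbcr(rgb):
    """Apply the color matrix of Figure 4 to an (..., 3) array of RGB pixels."""
    return rgb @ M.T

pixel = np.array([1.0, 0.0, 0.0])   # pure red
print(rgb_to_ycbcr(pixel))          # [0.299, -0.169, 0.5]
```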
These issues apply equally to the task of format conversion. Consider a traditional single-standard graphics processor. If a signal of another standard is to be used, there has to be a format conversion on the input. The graphics process then takes place, followed by conversion back to the original format. This causes three stages of quality loss instead of one. Clearly the correct solution is to avoid unnecessary format conversions. Instead of converting the input format to suit the graphics processor, the graphics should be generated in the original format so that the only processing needed is the keying step.
Consider the example of generating and keying text over video. The outline of the text is generated as a wire frame in a high-resolution format. In order to key over incoming video, the wire frame has to be converted from the internal high-resolution format to a key signal having a format identical to that of the input standard. The only modification to the input pixels is where keying takes place. In other areas of the picture, there is no change at all. In this way the highest quality is maintained.
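A keying step of this kind is essentially a per-pixel cross-fade controlled by the key signal; where the key is zero, the incoming video passes through untouched. A minimal sketch (hypothetical array names) follows.

```python
import numpy as np

def key_over(video, graphic, key):
    """Mix graphic over video under control of the key
    (0 = video only, 1 = graphic only). Pixels where the key is
    zero pass through completely unchanged."""
    return key * graphic + (1.0 - key) * video

video   = np.random.rand(576, 720)   # incoming picture, in its own format
graphic = np.zeros((576, 720))       # rendered text at the same raster
key     = np.zeros((576, 720))       # key derived from the high-resolution wire frame
keyed   = key_over(video, graphic, key)
```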
To prevent quality loss during graphics creation, broadcasters should try to avoid unnecessary format conversions. Quantel’s gQ graphic system holds mixed-resolution images in their original format.
With the above approach, it is easy to create material with multiple resolutions. The same graphic art can be re-sized to suit any format, so it could be keyed into SDTV or HDTV with equal ease. Therefore, in a multi-resolution production environment, the highest quality is always available in each resolution used.
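One way to picture re-sizing the same art for any format is a simple separable resize: the master graphic is interpolated to whatever raster a given format needs. The raster sizes below are just examples, and a production system would use better filtering than linear interpolation.

```python
import numpy as np

def resize_axis(img, new_len, axis):
    """Linearly interpolate an image along one axis to a new length."""
    old_len = img.shape[axis]
    x_new = np.linspace(0, old_len - 1, new_len)
    return np.apply_along_axis(
        lambda line: np.interp(x_new, np.arange(old_len), line), axis, img)

def resize(img, new_h, new_w):
    """Separable linear resize: height first, then width."""
    return resize_axis(resize_axis(img, new_h, 0), new_w, 1)

master = np.random.rand(1080, 1920)      # the graphic rendered once at high resolution
sd_625 = resize(master, 576, 720)        # sized for a 625-line SDTV raster
hd_720 = resize(master, 720, 1280)       # sized for a 720-line HD raster
```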
John Watkinson is a high-technology consultant and author of several books on video technology.
Acknowledgement: This article was adapted from “Resolution co-existence-format-agnostic post production,” which is available on the Quantel Web site, www.quantel.com (click on resources). Additional articles from Mr. Watkinson are available on the Quantel site.