Interfacing with common carriers
Common carriers play an important role in the delivery of television programs to private homes. The original analog monochrome television transmission concept was remarkably simple. It consisted essentially of:
- A camera and microphone.
- A studio-to-transmitter link (STL).
- A visual and aural transmitter.
The processing, distribution and transmission of analog video signals are characterized by less-than-ideal performance in terms of linear distortions, nonlinear distortions and noise.
The analog world
The use of less-than-ideal distribution equipment affects the shape of video signals. In an analog world, there is a direct relationship between the wave-shape and the picture quality. The resulting picture impairments can be judged subjectively by observing the picture quality on a picture monitor.
In the early 1950s, various national and international bodies developed subjective picture-quality grading criteria, resulting in minimum acceptable performance figures. Maintaining local and intercity network equipment to these figures yielded satisfactory performance for carrying standard monochrome television signals, but unsatisfactory performance for carrying compatible color signals.
Therefore, the introduction of color in the 1950s resulted in tighter tolerances on the existing performance figures as well as the introduction of new performance parameters, which quantify the chrominance-to-luminance and luminance-to-chrominance interactions.
The overall performance of a distribution system can be predicted with reasonable accuracy by applying a formula the CCIR developed for international video signal distribution networks. Objective equipment performance measurements use standardized test waveforms, tailored to contain the frequency-domain components best suited to measuring specific types of impairments.
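The CCIR formula itself is beyond the scope of this article, but the principle underlying such predictions is that the noise contributions of independent cascaded links add on a power basis. A minimal Python sketch of that principle (the link SNR figures are illustrative, not CCIR values):

```python
import math

def combined_snr_db(link_snrs_db):
    """Predict the overall SNR of a chain of video links.

    The noise powers of independent links add, so the combined SNR
    is found by summing the individual noise powers (values in dB).
    """
    total_noise_power = sum(10 ** (-snr / 10) for snr in link_snrs_db)
    return -10 * math.log10(total_noise_power)

# Example: a 56 dB studio chain feeding two 60 dB intercity links.
print(round(combined_snr_db([56, 60, 60]), 1))  # ~53.5 dB overall
```

The result is always worse than the weakest link, which is why intercity network tolerances had to be tightened as chains grew longer.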
Early approaches involved a television test signal generator at the origination point (MCR) and video test equipment at the destination. This required taking the station off the air and lengthy test sessions.
Later approaches involved transmitting vertical interval test signals (VITS) inserted on several blanked horizontal lines in the vertical blanking interval. This allows performance tests to be carried out at any convenient time without requiring a transmission shut-off. Recent test equipment, such as the Tektronix VM700, carries out the testing automatically and generates a printout of performance test results referenced to specifications.
The DS3 concept
In the transition period from analog to digital systems, hybrid systems of various types were bound to appear. Early approaches to digitizing signal distribution resulted in the DS3 digital distribution network operating at 45Mb/s. Essentially, the NTSC composite video signal is sampled at 4Fsc, and a DPCM compression process reduces the overall bit rate to nominally 45Mb/s.
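The arithmetic behind this reduction is worth a quick look. A minimal sketch, assuming straightforward 8-bit PCM as the uncompressed reference (actual DS3 codec quantization details varied by manufacturer):

```python
F_SC = 315e6 / 88           # NTSC color subcarrier: ~3.579545 MHz
SAMPLE_RATE = 4 * F_SC      # 4Fsc sampling: ~14.318 MHz

BITS_PER_SAMPLE = 8         # assumed uniform PCM quantization
raw_rate = SAMPLE_RATE * BITS_PER_SAMPLE   # ~114.5 Mb/s uncompressed
ds3_rate = 45e6                            # nominal DS3 payload

print(f"raw PCM rate : {raw_rate / 1e6:.1f} Mb/s")
print(f"DS3 rate     : {ds3_rate / 1e6:.1f} Mb/s")
print(f"DPCM must deliver ~{ds3_rate / SAMPLE_RATE:.2f} bits/sample "
      f"(a ~{raw_rate / ds3_rate:.1f}:1 reduction)")
```

A roughly 2.5:1 reduction, to about three bits per sample, is modest enough that DPCM can achieve it with essentially transparent results.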
This concept made its appearance on the market before MPEG-2. In those days, it was generally agreed that the normal approach to digitization and compression would be a subcarrier-related sampling strategy. The system consists of:
- An encoder that digitizes and compresses one composite NTSC video channel and, typically, four audio channels. The inputs are NTSC analog video and analog audio, each with typical analog performance specifications.
- A decoder that decompresses and converts to analog one NTSC composite video signal and four analog audio signals.
The performance specifications are typical of analog video and audio systems, and one could easily forget that a digital compression/decompression system sits in between. The DS3 concept has enjoyed enormous success; even today it is generally less costly to install and use than some MPEG-2 systems.
The MPEG-2 world
MPEG-2 has revolutionized the broadcasting world. Using MPEG-2 video compression, a 270Mb/s signal can be reduced to a 4Mb/s bit rate without visible reduction in picture quality.
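A quick calculation puts that reduction in perspective. Note that the 270Mb/s SDI rate includes blanking and ancillary overhead, so the sketch below also shows the ratio relative to the active 4:2:2 picture payload:

```python
sdi_rate = 270e6        # SMPTE 259M serial digital interface rate, bits/s
mpeg_rate = 4e6         # typical MPEG-2 distribution bit rate

# Active 4:2:2 picture payload of 525-line video (an approximation:
# 720 Y + 720 C samples/line, 10 bits each, 486 active lines, 29.97 fps)
active_rate = 720 * 2 * 10 * 486 * 29.97   # ~210 Mb/s

print(f"gross ratio (SDI/MPEG): {sdi_rate / mpeg_rate:.1f}:1")      # 67.5:1
print(f"active-picture ratio  : {active_rate / mpeg_rate:.1f}:1")   # ~52:1
```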
A contemporary MPEG-2 encoder is typically 1RU high and accepts analog composite NTSC as well as 270Mb/s SDI. The manufacturer-provided NTSC decoder, internal or external, uses sophisticated comb filters for luminance and chrominance separation; earlier encoders were quite bulky and used relatively inexpensive comb filters. The user may choose between MPEG-2 4:2:2 and 4:2:0 sampling, a bit rate between 1.5Mb/s and 50Mb/s, and the I-B-P frame sequence.
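The I-B-P sequence is conventionally described by two parameters, N (the GOP length) and M (the anchor-frame spacing). A minimal sketch of how these two numbers generate the frame-type pattern (the parameter names are the conventional ones, not any particular encoder's menu items):

```python
def gop_pattern(n=15, m=3):
    """Display-order frame types for one MPEG-2 GOP.

    n: GOP length (frames from one I-frame to the next, exclusive)
    m: spacing between anchor frames (I or P); m-1 B-frames in between
    """
    frames = []
    for i in range(n):
        if i == 0:
            frames.append("I")       # intra-coded anchor frame
        elif i % m == 0:
            frames.append("P")       # forward-predicted anchor frame
        else:
            frames.append("B")       # bidirectionally predicted frame
    return "".join(frames)

print(gop_pattern())          # IBBPBBPBBPBBPBB (N=15, M=3, common for 525/60)
print(gop_pattern(n=1, m=1))  # I (I-frame only: lowest delay, highest rate)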
The contemporary MPEG-2 decoder is typically 1RU high and has SDI 270Mb/s and analog composite NTSC outputs. Some decoders have color-black reference inputs, which allow the NTSC output to be timed and phased to studio reference.
The published video performance specifications of MPEG-2 systems are those of analog NTSC; without exception, no reference is made to compression/decompression-related picture impairments. The advertised data are, therefore, of little use. Manufacturers ignore the fact that picture quality changes dynamically with the data rate, the picture complexity and the encoding algorithm used. Buyers thus have no data with which to compare the picture quality of competing systems.
A significant problem confronts the user of large systems made up of a concatenation of encoders/decoders (codecs). The overall system performance is unpredictable. For instance, codec A followed by codec B may not produce the same set of impairments as codec B followed by codec A.
Additionally, the use of statistical multiplexing adds a time-varying aspect to the data rate available to each compressed signal, creating a time-varying quality factor: complex pictures must be compressed harder to fit the allocated bit rate.
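A toy sketch of the mechanism, assuming a multiplexer that splits a fixed pool in proportion to instantaneous picture complexity (the channel count and complexity figures are invented for illustration):

```python
def statmux(pool_mbps, complexities):
    """Split a fixed transport pool proportionally to picture complexity."""
    total = sum(complexities)
    return [pool_mbps * c / total for c in complexities]

# Four channels sharing a 16 Mb/s pool at two successive moments:
print(statmux(16, [1.0, 1.0, 1.0, 1.0]))  # easy scenes: 4 Mb/s each
print(statmux(16, [3.0, 1.0, 0.5, 0.5]))  # a demanding scene takes 9.6 Mb/s,
                                          # squeezing the others to 1.6-3.2
```

The quality of any one channel therefore depends not only on its own content, but on what its neighbors in the multiplex are carrying at that instant.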
How do you handle it?
Picture quality is of concern to all people in the broadcast chain. To achieve it, the system design group must use a combination of subjective and objective approaches in specifying and selecting equipment.
Whenever possible, the performance of a given piece of equipment should be measured against an industry standard using a picture-quality analysis (PQA) system. System commissioning teams should specify the required picture-quality levels, and these should be maintained from production and post-production through to final transmission. Using the selected equipment, an experimental system should be set up in a test lab, and its overall performance, from the input of the encoder to the output of the decoder, measured with a PQA test system.
On several occasions, I have used a Tektronix PQA system, which presents the measurement results as a single numerical quantity called the picture quality rating (PQR). It also presents peak signal-to-noise ratio (PSNR) values.
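PSNR, at least, has a precise definition. A sketch of the computation over 8-bit frames using NumPy (the PQR figure is based on a human-vision model and is not reproduced here):

```python
import numpy as np

def psnr_db(reference, processed, peak=255.0):
    """Peak signal-to-noise ratio between two 8-bit frames, in dB."""
    ref = reference.astype(np.float64)
    proc = processed.astype(np.float64)
    mse = np.mean((ref - proc) ** 2)   # mean squared pixel error
    if mse == 0:
        return float("inf")            # identical frames
    return 10 * np.log10(peak ** 2 / mse)

# Example: a synthetic frame and a slightly noisy copy of it
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(486, 720), dtype=np.uint8)
noisy = np.clip(frame + rng.normal(0, 2, frame.shape), 0, 255).astype(np.uint8)
print(f"{psnr_db(frame, noisy):.1f} dB")
```

PSNR correlates only loosely with perceived quality, which is why vision-model metrics such as PQR exist, but it is objective, repeatable and easy to log.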
Several test sequences are available on CD-ROM. Select one sequence with a lot of detail and one with a lot of movement, and stick to them for all present and future tests to ensure uniformity. Pending an agreed-upon set of performance test results, generate your own and keep them for future reference. To keep the final impairments low, the highest standards have to be maintained throughout.
All the problems reviewed in this article deal with standard-definition systems. To my knowledge, there currently is no PQA test equipment designed specifically for HDTV.
Michael Robin, fellow of the SMPTE and former engineer with the Canadian Broadcasting Corp.'s engineering headquarters, is an independent broadcast consultant located in Montreal. He is co-author of “Digital Television Fundamentals,” published by McGraw-Hill.