Multiplexing
At its most basic, the multiplexing operation combines several programs into one overall stream. That's the simple view. In practice, multiplexing can be quite complex, and its purpose is to combine separate programs — usually video, audio and data — into one logical signal entity for contribution, distribution and transmission, and often to conserve bandwidth.
The most common methods of transmission multiplexing all involve the generation of transport streams. Within these streams, the various programs are compression-encoded. Figure 1 shows this in a very simple form. Multiplexing is often done hierarchically as well. In this figure, the programs could be completely unrelated entities, e.g., different entertainment programs, or they could be the interrelated video and audio components (and associated data) of a larger program. In a transmission that includes mobile service, the multiplexer (or mux) will also include an IP encapsulator, which formats the stream for Internet-like accessibility, and a signaling generator, which packages together the various information tables associated with the mobile broadcast.
The simplest approach to multiplexing is to encode each program at a constant bit rate (CBR); the overall transport stream bit rate is thus the sum of the individual program rates, plus a small amount for transport overhead. CBR encoding works by setting a tight target on the number of bits per frame of video, without regard to the video content itself. (See the top portion of Figure 2.) While each frame type (intra-coded or predicted) can get a different number of bits, the ratio of bits between frame types is kept relatively constant, as is the total number of bits in a reference interval, usually one Group of Pictures (GOP). This often means that a more complex frame (more detail and motion) will get a lower encoded quality than a less complex one, because more bits would be needed to encode it at the same quality.
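As a rough illustration, the sketch below budgets a fixed number of bits per GOP and splits it among frame types by constant ratios, regardless of picture content; the rates, GOP structure and weights are illustrative assumptions, not taken from any particular encoder.

# Illustrative CBR bit budgeting: a fixed bit budget per GOP, split among
# frame types by constant ratios, independent of picture content.

CBR_RATE_BPS = 4_000_000          # target channel rate (illustrative)
FRAME_RATE = 30.0                 # frames per second
GOP_PATTERN = "IBBPBBPBBPBBPBB"   # 15-frame GOP (illustrative)
TYPE_WEIGHT = {"I": 8.0, "P": 3.0, "B": 1.0}   # relative bits per frame type

def cbr_gop_allocation():
    """Return a per-frame bit target for one GOP at a constant bit rate."""
    gop_bits = CBR_RATE_BPS * len(GOP_PATTERN) / FRAME_RATE   # bits available per GOP
    total_weight = sum(TYPE_WEIGHT[t] for t in GOP_PATTERN)
    # Each frame's share depends only on its type weight; a complex scene
    # gets no more bits than a simple one.
    return [round(gop_bits * TYPE_WEIGHT[t] / total_weight) for t in GOP_PATTERN]

print(cbr_gop_allocation())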
In order to produce more consistent encoding quality, variable bit rate (VBR) encoding can be used. A VBR encoder sets a target number of bits based on an analysis of the video content itself, resulting in a bit stream whose rate varies between a minimum and a maximum level. While this bit allocation can be done on the fly, the highest efficiency is achieved by analyzing many frames of video and then performing the bit allocation according to a calculated “schedule,” as shown in the bottom portion of Figure 2. DVD encoding, because it can be done in non-real time, can perform this two-pass encoding by first analyzing the entire length of the program and then going back and applying the bit allocation frame by frame. Live VBR encoding, however, must constrain itself to an acceptable latency, usually on the order of a few frames to a few seconds.
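A minimal sketch of the two-pass idea follows: a first pass scores each frame's complexity, and a second pass spreads the total bit budget in proportion to those scores, clamped between a floor and a ceiling. The function name, scores and limits are all illustrative assumptions, not an actual encoder interface.

# Illustrative two-pass VBR allocation: pass 1 produces per-frame complexity
# scores; pass 2 hands out the total budget in proportion to them, clamped
# between a minimum and maximum per-frame size.

def two_pass_vbr(complexity, total_bits, min_bits, max_bits):
    """complexity: per-frame scores from the first (analysis) pass."""
    total_score = sum(complexity)
    schedule = [min(max(total_bits * c / total_score, min_bits), max_bits)
                for c in complexity]
    # Rescale so that any clamping does not break the overall budget (simplistic).
    scale = total_bits / sum(schedule)
    return [round(bits * scale) for bits in schedule]

# Example: a quiet scene followed by a detailed, high-motion one.
scores = [1.0, 1.1, 0.9, 4.0, 4.5, 3.8]
print(two_pass_vbr(scores, total_bits=3_000_000, min_bits=100_000, max_bits=900_000))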
While multiplexing with CBR encoding results in the lowest latency, it uses bandwidth inefficiently, largely because the individual programs do not always need all the bits allocated to them at any given moment. This provided the motivation to develop statistical multiplexing (stat-mux) using VBR encoding, where each program is given a frame-by-frame bit allocation that also depends on the relative need of all the programs taken as an ensemble. With a stat-mux, the efficiency gain can often allow one additional channel in the mux for every three that simple CBR multiplexing would carry. The cost, of course, is both latency and hardware complexity. In addition, in a mux that is heavily loaded with many programs, compression artifacts can become noticeable; “pumping” at the GOP rate can produce a periodic overall blockiness of the picture, and random blockiness can occur when the stat-mux robs bits from one channel to feed another with a higher priority.
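A stat-mux applies the same proportional idea across programs sharing one pipe rather than across the frames of one program. The sketch below, again with illustrative numbers and a made-up allocation policy, divides a fixed mux payload among programs according to their momentary complexity after each is guaranteed a minimum share.

# Illustrative stat-mux allocation for one frame interval: the fixed mux
# payload is split among programs in proportion to their momentary need,
# after each program receives a guaranteed floor.

def statmux_allocate(needs, mux_bits, floor_bits):
    """needs: per-program complexity estimates for this frame interval."""
    pool = mux_bits - len(needs) * floor_bits   # bits left after the floors
    total_need = sum(needs)
    return [round(floor_bits + pool * need / total_need) for need in needs]

# Four programs: a talking head, two sports feeds and a quiet slate,
# sharing roughly one frame interval of a 19 Mb/s payload.
print(statmux_allocate([1.0, 5.0, 4.0, 0.2], mux_bits=19_000_000 // 30, floor_bits=50_000))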
Remultiplexing and rate control
Another variant of multiplexing is remultiplexing. While a mux used for transmission will typically be the last such device before the transport stream reaches a consumer's receiver, this same mux will often be preceded by a contribution or distribution mux at a site further up the signal chain. Cascaded multiplexers therefore require a remultiplexing operation, and this will often involve a partial transcoding of the streams; not only can source and destination bit rates differ, but the program manifest within the source and destination transport streams may differ as well, requiring programs to be added or dropped (grooming). It should be appreciated that even a stream that is already encoded and stored may need a partial recoding when it is muxed into another stream. This can happen when there is insufficient bandwidth in the new stream, or when playback buffer management requires a different state of what is called the Video Buffering Verifier (VBV). Either way, the instantaneous bit rate may need modification, and this requires changing the bit allocation (up or down) in an encoded stream.
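One way to see why a stored stream may need recoding is to run a decoder-buffer model over it at the new channel rate. The sketch below is a simplified VBV-style check with illustrative numbers; it flags the point where the buffer would run dry, meaning the stream cannot be carried at that rate without re-allocating bits.

# Simplified decoder-buffer (VBV-style) check: fill the buffer at the new
# channel rate, drain one coded frame per frame interval, and flag any point
# where the buffer would run dry (overflow is avoided here by stalling input
# when the buffer is full).

def vbv_check(frame_sizes, channel_bps, frame_rate, vbv_size, initial_fill):
    fullness = initial_fill
    bits_per_interval = channel_bps / frame_rate      # bits arriving per frame time
    for i, size in enumerate(frame_sizes):
        fullness = min(fullness + bits_per_interval, vbv_size)
        if size > fullness:
            return f"underflow at frame {i}: stream needs recoding for this rate"
        fullness -= size
    return "stream fits this channel rate"

sizes = [600_000, 120_000, 120_000, 300_000] * 10     # illustrative coded-frame sizes
print(vbv_check(sizes, channel_bps=4_000_000, frame_rate=30,
                vbv_size=1_835_008, initial_fill=1_200_000))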
Grooming can also take place as programs become active and inactive during a broadcast day, such as when a group of SD programs is replaced by an HD program. To ensure that receivers accurately present the program lineup, the PSIP tables must be correctly managed and updated by the final transmission mux. For this purpose, the Virtual Channel Table (VCT) in the transport stream makes use of “hidden” flags so that receivers are correctly signaled about the active content.
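As a rough sketch of that signaling, the snippet below toggles the hidden and hide-guide flags on VCT entries when a service goes inactive; the flag names follow ATSC PSIP usage, but the data structures and function are invented here for illustration.

# Illustrative VCT grooming: when a service goes off air, its entry is
# flagged as hidden so receivers skip it (structures here are invented
# for illustration; only the flag names follow PSIP usage).

from dataclasses import dataclass

@dataclass
class VctEntry:
    short_name: str
    major: int
    minor: int
    hidden: bool = False      # True: receivers skip the channel while surfing
    hide_guide: bool = False  # True: the channel is also dropped from the guide

def set_service_active(entry: VctEntry, active: bool) -> None:
    """Mark a channel active or inactive; the mux then re-emits the updated VCT."""
    entry.hidden = not active
    entry.hide_guide = not active

lineup = [VctEntry("WXYZ-S1", 8, 2), VctEntry("WXYZ-HD", 8, 1)]
set_service_active(lineup[0], active=False)   # SD service replaced by the HD program
print(lineup)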
With a stat-mux, it is also more practical to dynamically change the channel allocation as the programming need varies. This will become more useful as non-real-time (NRT) broadcasting enables the download of data, files and programs. Because an NRT download is transferred in bursts over a long span of time, a transmission mux can opportunistically insert the data when there is space in the mux. In addition to the start time of such a transfer, a parameter file would indicate its priority and allowable duration, ensuring that the content is available to the consumer by a certain time.
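A hedged sketch of how a mux might schedule such a download: in each frame interval, whatever capacity the real-time programs leave unused goes to the pending NRT item with the highest priority whose deadline has not passed. The item fields and the policy are illustrative assumptions, not a defined NRT mechanism.

# Illustrative opportunistic NRT insertion: bits left over after the
# real-time programs are served go to the pending item with the highest
# priority, until its deadline or its remaining size is reached.

from dataclasses import dataclass

@dataclass
class NrtItem:
    name: str
    bits_remaining: int
    priority: int        # higher value is served first
    deadline_s: float    # content must be delivered by this time

def insert_nrt(items, spare_bits, now_s):
    """Spend this interval's spare mux capacity on pending NRT downloads."""
    for item in sorted(items, key=lambda it: -it.priority):
        if now_s > item.deadline_s or item.bits_remaining == 0:
            continue
        sent = min(spare_bits, item.bits_remaining)
        item.bits_remaining -= sent
        spare_bits -= sent
        if spare_bits == 0:
            break
    return items

queue = [NrtItem("promo_clip", 40_000_000, priority=2, deadline_s=3600.0),
         NrtItem("app_update", 10_000_000, priority=5, deadline_s=600.0)]
print(insert_nrt(queue, spare_bits=500_000, now_s=120.0))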
Thankfully, with all this complexity come better management tools. In the end, we are balancing bandwidth against quality, so we must ensure that our playout systems have adequate tools in place to predict, monitor and control this trade-off.
Aldo Cugnini is a consultant in the digital television industry.
Send questions and comments to: aldo.cugnini@penton.com