What's Wrong With DTV?


Out there, you might have noticed that the quality of some broadcast DTV, particularly some HDTV, ain't quite up to snuff. Lord knows Mario has.

This is particularly noticeable because, as we have known for some time, digital video and audio don't, as the expression goes, "degrade gracefully," as their analog counterparts do. Here's what Mario is talkin' about.

Assumin' NTSC video is of high quality when it is transmitted and no multipath situations are encountered, the primary quality-degradation mechanism experienced by the viewer is weak signal strength at the receiver. As the signal becomes weaker and weaker, the picture on the tube gets fuzzier and fuzzier, until eventually it just vanishes into the snow. For analog audio, just substitute the word "noise" for "snow."

Viewers have generally demonstrated a high tolerance for fuzzy NTSC pictures, largely because the fuzziness is just a random, incoherent degradation of the still-comprehensible images; no disturbing, visually unrelated artifacts like blocks are generated. Mario is here to tell you that the same cannot be said for digital pictures.

We all know about the digital "cliff effect," also known in the early days of DTV transmission experimentation as the "valley of death." In a nutshell, this may be expressed in this way: either ya get a perfect picture, or ya get no picture at all. Well, now that we are transmitting and viewing digital images daily, we know that this ain't necessarily the whole story.

We have experienced a passel of digital artifacts in received DTV pictures and we will experience a passel more.

DIGITAL DEGRADATION

One familiar digital artifact is caused by the old nemesis: weak signal strength. Valley of death notwithstanding, when the signal strength of an ATSC DTV transmission at the receiver becomes marginal, macroblock errors appear in the images. These are manifest as portions of the picture turning to garbage; garbage in rectilinear packages.

These errors can be large or small, ranging from a very short, thin horizontal rectangle to a substantial chunk of the picture. These artifacts come and go. At the extreme, the whole dadburn raster can turn into a pig's breakfast, resembling a screen full of shards of broken glass, before the signal disappears completely. This sort of degradation is not necessarily or typically the broadcaster's fault.

The other type of digital degradation Mario is seeing is caused by the broadcaster, particularly when the broadcaster attempts to pack 10 pounds into a 5-pound bag. The root cause of such degradation is the fact that a DTV signal is a stream of compressed digital data, and, as is the case with all digital data streams, the various components must share a finite amount of digital bandwidth. It's a mathematics game, and some play it better than others.

A broadcast television channel carries a data stream with a bandwidth of about 19.39 Mbps, into which all the compressed audio and video elements must fit. This works okay, for the most part, when a single HDTV signal is being transmitted on a channel. Usually.
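Here's a little back-of-the-envelope Python sketch of that budget. The rates below are ballpark assumptions for a typical single-HD MPEG-2 channel, not measurements from any particular station:

```python
# Rough ATSC bit budget for one HD program. All figures are
# assumed, typical ballpark values, not measured from any station.
CHANNEL_BUDGET_MBPS = 19.39   # total ATSC payload

hd_video_mbps = 15.0          # one MPEG-2 HD program at decent quality (assumed)
ac3_audio_mbps = 0.384        # one 5.1 AC-3 audio stream (typical rate)
tables_overhead_mbps = 0.25   # PSIP tables, captions, misc. (assumed)

used = hd_video_mbps + ac3_audio_mbps + tables_overhead_mbps
print(f"Used: {used:.2f} of {CHANNEL_BUDGET_MBPS} Mbps -> "
      f"{CHANNEL_BUDGET_MBPS - used:.2f} Mbps headroom")
```

With only one HD program in the pipe, there's headroom to spare. Start adding services, as we'll see below, and that headroom evaporates.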

We know that a broadcast HDTV signal is very highly compressed, and that the more motion the pictures contain, the more stress is placed on the encoding or compression device.

At the extreme, for example, in fast-moving sports and suchlike, a channel transmitting even a single HDTV signal can occasionally fall apart with compression artifacts. We can see portions of the image turn to square blocks momentarily as the encoder runs out of gas, and we can also see such blocks during a quick scene change or a fast fade-up from black or white. This is often called "tiling," and its appearance is quite different from that of macroblock errors. The smaller the pixels, the smaller the blocks: SD tiles are larger than HD tiles.
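Why does fast motion make the encoder "run out of gas"? A constant-bit-rate encoder's rate controller coarsens the quantizer whenever the output buffer threatens to overflow, and coarse quantization is exactly what shows up on screen as tiling. Here's a toy Python loop illustrating the idea; every number in it is invented for illustration, and real rate controllers are far more sophisticated:

```python
# Toy constant-bit-rate controller: when frame complexity spikes
# (fast motion, scene change), the quantizer gets forced up and
# the picture visibly tiles. All numbers are invented.
target_bits_per_frame = 500_000   # fixed CBR allotment per frame
quantizer = 4                     # low = fine detail, high = blocky
buffer_fullness = 0.0             # bits over/under budget so far

complexities = [1.0, 1.1, 0.9, 4.0, 5.0, 1.2]  # 4.0+ ~ fast sports

for c in complexities:
    # Crude model: bits needed scale with complexity / quantizer.
    bits_used = target_bits_per_frame * c / (quantizer / 4)
    buffer_fullness += bits_used - target_bits_per_frame
    if buffer_fullness > 0:        # over budget: coarsen quantization
        quantizer = min(31, quantizer + 4)
    else:                          # under budget: recover detail
        quantizer = max(2, quantizer - 1)
    flag = "<- visible tiling" if quantizer > 12 else ""
    print(f"complexity={c:.1f} quantizer={quantizer:2d} {flag}")
```

Run it and watch the quantizer climb the moment the high-complexity frames arrive; that climb is the on-screen tiling.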

We know fer sure, too, that 720p HDTV is considerably less stressful to the DTV encoder than is 1080i HDTV, for all those reasons we have heard before.

Mario's main point here has to do with what's goin' on out there with DTV broadcasting. For all the familiar financial reasons, broadcasters are taking advantage of the ability of their ATSC signals to carry multiple programs. Taking excessive advantage, in some cases.

Most, but not all, broadcast DTV signals carry an HDTV program, which may be called the "main channel." Many also carry one or more additional "subchannels," which can range from low-resolution, limited-motion weather channels, all the way up to a second network program. All these programs have to share the channel's 19.39 Mbps bandwidth, and its encoding resources.

PACKING ATSC DATA STREAMS

It won't come as a complete surprise to discover that there is a limit to how much stuff can be packed into a single ATSC data stream. One real-life example is a station in a major market that packed one HDTV signal and three SD signals into a channel, a not uncommon practice.

This particular station did use 720p for the HD signal, but the total package was just too dang much for its container. When viewers watched the HD signal, random compression artifacts flashed through the images, the frequency of their appearance depending on what was going on in the programming on the main channel and all the subchannels.

They often took the form of little blocks momentarily appearing in faces, outdoor scenes and the like; people's faces, for example, suddenly seemed to be behind screen doors. When this channel eventually dropped one of its subchannels, the situation improved, but the artifacts did not entirely disappear.
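The arithmetic behind that squeeze is easy to sketch. Using typical ballpark rates (assumed for illustration, not measured from the station in question):

```python
# Why one HD plus three SD programs strains a 19.39 Mbps channel.
# All rates are assumed ballpark figures for MPEG-2 encoding.
CHANNEL_MBPS = 19.39

services = {
    "720p HD main":    11.0,  # clean HD typically wants ~11-15 Mbps
    "SD subchannel 1":  3.5,  # full-motion SD wants ~3-4 Mbps
    "SD subchannel 2":  3.5,
    "SD subchannel 3":  3.5,
}
audio_and_overhead = 1.2      # AC-3 for all services + PSIP, lump sum (assumed)

total = sum(services.values()) + audio_and_overhead
print(f"Demand: {total:.2f} Mbps vs. budget {CHANNEL_MBPS} Mbps")
print("Over budget!" if total > CHANNEL_MBPS else "Fits (barely).")
```

With those assumptions the demand comes to roughly 22.7 Mbps against a 19.39 Mbps budget, so something has to give; what gives is picture quality.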

Another real-life station in a much smaller market broadcasts a 1080i network program on its main channel, a whole other broadcast network in SD on its first subchannel, and an SD version of the HD main program on its second subchannel. Both the 1080i main channel and the first subchannel typically have a whole heap of tiling and sundry compression artifacts, and the first subchannel experiences occasional audio dropouts, not to mention serious lip-sync errors.

Methinks all this is not the best way to attract viewers. What can be done about it? Glad you asked.

Recent-generation encoders have improved greatly. At least one station in a major market, which primarily carries the signal of a broadcast network that broadcasts in 720p, is able, with state-of-the-art encoders, to successfully broadcast two, count 'em, two, 720p programs simultaneously.

The primary 720p signal is allocated more bits than the secondary one, and it looks right good, with only the occasional small bit of tiling to be seen. The secondary 720p program doesn't look as good, but it is adequate for the purpose, and it doesn't usually contain much fast motion. On the other hand, stations attempting to broadcast a 1080i signal and two full-resolution, full-motion SD signals are probably kiddin' themselves.
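One way to read that success is statistical multiplexing: the encoders share one bit pool and reallocate it from moment to moment according to each program's complexity, with the primary program weighted more heavily. Here's a minimal sketch of the idea; the pool size and weights are illustrative assumptions, not that station's actual configuration:

```python
# Minimal statistical-multiplex sketch: two 720p programs share
# one video bit pool; bits follow scene complexity, weighted
# toward the primary. Pool size and weights are assumed.
POOL_MBPS = 18.0                        # video pool after audio/overhead
WEIGHTS = {"primary": 1.5, "secondary": 1.0}

def allocate(complexity: dict) -> dict:
    """Split the pool in proportion to weight * scene complexity."""
    demand = {p: WEIGHTS[p] * c for p, c in complexity.items()}
    total = sum(demand.values())
    return {p: round(POOL_MBPS * d / total, 2) for p, d in demand.items()}

# Quiet talking heads on both programs:
print(allocate({"primary": 1.0, "secondary": 1.0}))
# Fast sports on the primary, static graphics on the secondary:
print(allocate({"primary": 4.0, "secondary": 0.5}))
```

In the second case nearly the whole pool flows to the primary program, which is exactly why the secondary program can get away with less: it isn't usually asked to carry much fast motion.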

Approaches to easing the broadcast DTV multicasting problem include selecting the most advantageous scanning formats, judiciously allocating the available bit budget, and using state-of-the-art encoding devices. Like DTV receivers, succeeding generations of DTV encoders have shown markedly improved performance.

Mario thinks there is a lesson here for economically beleaguered broadcast TV stations. Bad pictures are not the best way to attract viewers to broadcast television.

Comment on this or any story. Write to tvtech@nbmedia.com with "Letter to the Editor" in the subject line.
