MPEG encoding systems

There are plenty of misconceptions about compressed video, and a look back at the recent history of our industry turns up much misinformation. Twenty years ago the SMPTE Journal carried numerous articles about “bit-rate reduction” (i.e. compression) systems, discussing the Nyquist limit and the bit rates that might be achieved with acceptable results. The EBU Technical Review published an article showing, by mathematical derivation, that PAL signals could not be adequately represented in less than 34Mb/s. While that information was useful in the context of the debate at the time, it clearly missed the march of technology.

Much has changed in the last decade to make compression so pervasive. Consumer camcorders with compressed recording now provide HDTV at less than 25Mb/s, and advanced coding methods promise high-quality 480i at scarcely more than 1Mb/s. New coding algorithms make Internet video possible on low-bandwidth circuits. What has changed? How is this possible? And even more interesting, this being the month of the annual pilgrimage to Lost Wages, what might we see going forward?

For the answers, one need only turn to the best technical journal in North America, the Wall Street Journal. The solution is simple: Follow the money. The increase in consumer spending on electronics that use compression technologies mirrors the increase in capability and in the funding available for research into perceptual coding (i.e. compression). The technology keeps getting better because money drives it; were it merely a mathematical curiosity, we would still be struggling to get below that 34Mb/s barrier.

Coding standards

The Moving Picture Experts Group (also known as Motion Predictive Educated Guesswork) has created a family of standards for coding pictures, audio and metadata. The two generalized standards we see most often in professional and consumer products are MPEG-2 and MPEG-4. MPEG coding systems are based on economics in an important and intentional way: put the cost at the sending end; make the encoders complex and expensive because there are few of them; and make the decoders cheap and simple because there are far more receivers than transmitters. What we perhaps didn't know when MPEG was first shown to the public was that sufficient research and development would create such huge demand that encoders, too, would become (relatively) cheap and ubiquitous in many professional applications.

MPEG-2 now delivers credible results, with sophisticated coders producing real-time, live, variable-rate bitstreams at less than 3Mb/s, and news backhaul at less than 8Mb/s that is suitable for use in production. HDTV is routinely delivered with pretty good quality in the United States and elsewhere at less than 19Mb/s, with 720p60 systems performing well at even 12Mb/s. Think about those numbers for a second. 720p60 has a video payload of 885Mb/s with eight-bit 4:2:2 coding. Compression to 12Mb/s means a reduction to only 1.36 percent of the original bit rate, yet the picture quality is remarkably good. 1080i30, just under 1Gb/s uncompressed, delivered at 19Mb/s is 1.91 percent of the original bit rate. Even professional backhaul and server recording of 1080i30 at 50Mb/s is just five percent of the starting data, yet the results are nearly transparent. Some researchers predict that MPEG-2 can still yield perhaps a 20 percent to 40 percent increase in coding efficiency, which of course means lower bit rates and/or better pictures. Lower bit rates also mean better economics, which is really the driver behind the research: more streams in a given occupied bandwidth, less cost, more profit.
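As a sanity check on those percentages, here is a minimal sketch of the arithmetic. It assumes active picture only (no blanking, no audio) and 16 bits per pixel for eight-bit 4:2:2 coding; the function name and figures are illustrative, not from any standard.

```python
# Rough compression-ratio arithmetic for eight-bit 4:2:2 video (active picture only).
# 16 bits per pixel: 8 bits of luma plus 8 bits of shared Cb/Cr per pixel.

def uncompressed_mbps(width, height, frames_per_sec, bits_per_pixel=16):
    """Uncompressed video payload in megabits per second."""
    return width * height * frames_per_sec * bits_per_pixel / 1e6

hd_720p60 = uncompressed_mbps(1280, 720, 60)    # ~885 Mb/s
hd_1080i30 = uncompressed_mbps(1920, 1080, 30)  # ~995 Mb/s (30 frames, 60 fields)

for label, source, compressed in [
    ("720p60 at 12Mb/s", hd_720p60, 12),
    ("1080i30 at 19Mb/s", hd_1080i30, 19),
    ("1080i30 at 50Mb/s", hd_1080i30, 50),
]:
    print(f"{label}: {100 * compressed / source:.2f}% of the original bit rate")
```

Running it reproduces the figures above: roughly 1.36 percent, 1.91 percent and 5 percent respectively.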

The original MPEG-4 standard did not take off in professional use for many reasons, including licensing costs and the improvements demonstrated in MPEG-2 systems while MPEG-4 was being developed. MPEG-4 is used in only one professional product that I know of, Sony's SRW series of HDTV recorders, which records the MPEG-4 Studio Profile at a whopping 600Mb/s data rate.

New codec on the block

There is a new kid on the block. The MPEG-4 AVC codec, recently standardized jointly by ISO/IEC MPEG and the ITU-T (which designates it H.264), includes professional extensions that have been demonstrated to produce a 40 percent to 60 percent increase in coding efficiency over MPEG-2 at comparable picture quality. At IBC2003, 720p60 playback from a server at just 6Mb/s (film-based material) was shown. Real-time H.264 encoding remains difficult, especially at HDTV rates: real-time HDTV encoders are not yet available, but two manufacturers of MPEG equipment are expected to show real-time H.264 hardware encoders for standard definition at NAB2004, with HDTV-capable hardware expected later this year. The application most exciting to many is the backhaul of HDTV at 20- to 30Mb/s using H.264, which would allow a single DS3 to backhaul high-quality 480i and HDTV at the same time at a reasonable cost. The impact on the business would be enormous. It is interesting to note that H.264 also works at astoundingly low bit rates for wireless and Internet delivery of content to small screens with quite reasonable perceived quality. Remember that compression is all about perception.
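To see why the DS3 claim is plausible, here is a minimal back-of-the-envelope sketch. The DS3 rate is the nominal 44.7Mb/s line rate, and the HD and SD contribution rates shown are assumptions chosen for illustration, not measured values.

```python
# Back-of-the-envelope DS3 bandwidth budget (illustrative figures only).
DS3_MBPS = 44.7  # nominal DS3 line rate; usable payload is slightly less

services = {
    "HDTV backhaul (H.264)": 25.0,  # middle of the 20- to 30Mb/s range cited above
    "480i backhaul (H.264)": 8.0,   # assumed SD contribution rate
}

total = sum(services.values())
verdict = "fits" if total <= DS3_MBPS else "does not fit"
print(f"Total: {total:.1f}Mb/s of {DS3_MBPS}Mb/s ({verdict} in one DS3)")
```

With those assumed rates, both services together use about 33Mb/s, leaving headroom within a single DS3.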

One DBS startup has stated that it will launch this year with MPEG-2 coding and switch next year to H.264 by downloading new microcode to consumer STBs. The increase in the number of channels in the same bandwidth will make additional delivery channels essentially free. So why not throw out MPEG-2 entirely and move to something really exciting? Because the cost of replacing an installed consumer base that is not capable of a simple upgrade is untenable. But rest assured, new decoders will likely get microcode lobotomies in the future to allow more efficient use of bandwidth and better pictures and sound. Locking the consumer into outdated technology will become too frustrating and could kill sales.

At NAB this year, spend a little time looking at the traditional coder manufacturers, and then visit Microsoft and see Windows Media 9 (also in hardware in other manufacturers' booths). Think through the impetus behind the development of professional coding by a computer software company. Science experiment? Control of the world? Ego? Nope.

John Luff is senior vice president of business development for AZCAR.

Send questions and comments to: john_luff@primediabusiness.com
