Getting the Image Right From the Start
Establishing and maintaining good image performance requires thorough planning at the system level. Given the many nonlinear editing platforms, the growing number of camcorder formats and the target of an all-tapeless, video server-based environment, it is no wonder we're starting to see the effects of inconsistency in the setup of servers, cameras and those proverbial transcoders we're all now having to live with.
Watching high-definition broadcasting is, unfortunately, becoming more painful on a daily basis. Perhaps we're headed toward a not-so-new name for HDTV: "high disappointment" television. As we came within hours of the analog sunset, it became apparent that what we can all start to look forward to, with a minor percentage of exceptions, is a mass conversion of program content that looks worse than it did on the last of the analog receivers.
HD CONTENT PATH
Not a scan through the cable, satellite or DTV channels goes by without seeing some poorly handled audio or video. It's not hard to find content that started out as 4:3 analog, was then D-to-A converted to SDI, and next upconverted to 1080i while being stretched to get rid of those nasty side panels. Somewhere along the line the content got recorded into a server platform at a low bit-rate with a poor selection of chroma sampling, i.e., 4:2:0 or 4:1:1 instead of 4:2:2. Often we find an input codec that subsamples the signal to produce less than the 1920x1080 "resolution" we'd targeted.
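To see why that chroma-sampling choice matters, it helps to put numbers on it. The sketch below is a rough calculation of uncompressed payload rates for the common sampling schemes; the figures assume 8-bit samples and are illustrative only.

```python
# Rough uncompressed data rates for 1920x1080 at 29.97 fps, 8-bit,
# showing how much picture information each chroma scheme carries.
# Samples per pixel: luma (always 1) plus the two chroma channels.
SAMPLES_PER_PIXEL = {
    "4:4:4": 3.0,   # full chroma resolution
    "4:2:2": 2.0,   # chroma halved horizontally
    "4:2:0": 1.5,   # chroma halved both horizontally and vertically
    "4:1:1": 1.5,   # chroma quartered horizontally
}

def uncompressed_mbps(width, height, fps, bits=8, sampling="4:2:2"):
    """Uncompressed video payload in megabits per second."""
    samples = width * height * SAMPLES_PER_PIXEL[sampling]
    return samples * bits * fps / 1e6

for scheme in SAMPLES_PER_PIXEL:
    rate = uncompressed_mbps(1920, 1080, 29.97, sampling=scheme)
    print(f"{scheme}: {rate:,.0f} Mbps uncompressed")
```

Note that 4:2:0 and 4:1:1 carry the same total payload; they differ in where the chroma detail is discarded, which is why they degrade differently through repeated conversions.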
Instead of employing a high enough bit-rate to retain image quality, the already mangled picture content gets encoded at 8 to 10 Mbps. After all this manipulation, the system decompresses the video to baseband HD, processes it through an HD-master control channel and then for emission, recompresses it to 720p (4:2:0) just so the broadcaster can run three or more streams of "revenue" on their DTV channel!
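The arithmetic behind that "three streams of revenue" scenario is sobering. A back-of-envelope sketch, assuming the familiar ~19.39 Mbps ATSC transport payload and an illustrative overhead figure (actual PSIP, audio and ancillary data allocations vary by plant):

```python
# Splitting one ATSC channel among multiple video services.
ATSC_PAYLOAD_MBPS = 19.39    # nominal MPEG-2 transport payload
overhead_mbps = 1.5          # assumed: PSIP tables, audio, ancillary data
video_streams = 3            # the "three streams of revenue" scenario

per_stream = (ATSC_PAYLOAD_MBPS - overhead_mbps) / video_streams
print(f"Each video service gets roughly {per_stream:.1f} Mbps")
```

Roughly 6 Mbps per HD service is a fraction of what the content consumed upstream, which is why every earlier quality decision shows up on air.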
OK, so maybe that was a little over the top. Conscientious broadcasters know better than this and strive for a level of quality that makes them look better than they did in the analog days of old. However, it goes without saying that mistakes are made. Improper codec setups can irreparably damage video content, and there is absolutely nothing you can do about it short of reshooting and re-editing. Flipping files from one format to another can do the same thing.
As a word to the wise: understand the impacts of your setups, do a sufficient amount of testing on various types of content, and know what to expect before committing to a house "standard" video, compression or archive format. The days of assuming that a higher bit-rate alone guarantees a better picture are behind us. Given no other effects on the image, if you encode a standard definition MPEG-2 I-frame-only image at 30 Mbps, it will hold up better than if it were encoded at sub-15 Mbps Long GOP. When you elect to store original images in 4:2:2 Profile instead of 4:2:0 or 4:1:1, your chances of preserving a better image are significantly higher at the same bit-rates.
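One way to see why GOP structure matters as much as the raw number: at the same frame rate, an I-frame-only stream spends its whole budget on standalone pictures, while a Long GOP stream leans on predicted frames that inherit any motion-estimation errors. A minimal sketch of the per-frame budgets:

```python
# Average bit budget per frame at a given stream rate.
def bits_per_frame(mbps, fps=29.97):
    """Average megabit budget available to each frame."""
    return mbps * 1e6 / fps

i_only_30 = bits_per_frame(30)     # every frame is a complete picture
long_gop_15 = bits_per_frame(15)   # averaged across I, P and B frames
print(f"I-frame only at 30 Mbps: {i_only_30/1e6:.2f} Mb per frame")
print(f"Long GOP at 15 Mbps:     {long_gop_15/1e6:.2f} Mb average per frame")
```

The Long GOP stream concentrates most of its budget in the periodic I-frames and leaves the rest to prediction, so concatenated encode/decode generations hurt it far more than the simple averages suggest.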
UNDERSTANDING THE PROCESS
Many editors deplore working in Long GOP, so they immediately flip it into something they like to work with, which is often that hot-off-the-press codec for which no transcoder is yet available to convert it back for on-air purposes. The NLE images look stellar on their 23-inch Apple Cinema monitor, but when the material hits the real broadcast world, that's where the comparison ends.
This article can't begin to be a tutorial on the plethora of setups, combinations and permutations of this format-plagued, over-encoded video world we've created. But it can point users and technicians in a direction that will spell long-term success. The process involves a bit of research and training on the part of the technical community, unfortunately something we've not had much opportunity to pursue. Nonetheless, you must understand what's going to happen to your images and files, and you need to control the processes from the start.
Begin by studying the details of those codecs in your facility's stable. Look at your field cameras, how they encode images and transfer files, and what systems they pass through. Camera images become the benchmark for what happens in the rest of the chain. If you've started out with low-cost HDV, you're limited to that quality throughout. If you're fortunate to be shooting I-frame only or JPEG2000, keep the bit-rate up as high as you can afford. There are lots of opportunities to retain a higher quality image from the start—if you know how.
Regardless of what level of image or encoding you start with, be sure you don't inadvertently select a path that worsens quality. Do this by looking at how you intend to move images and files around the facility. Make a drawing on a giant whiteboard so you can see and retain the data from the test plan you're about to develop. Map out the end point for each path. For the HD-master control path, if your server has a full-fledged high-definition decoder, then look at what the encode server port has to offer.
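The whiteboard exercise can be captured in software, too. The hypothetical sketch below models each signal path as an ordered list of stages, each with the bit-rate and chroma sampling it imposes; the stage names and numbers are illustrative, not a recommendation. The weakest stage sets the ceiling for the whole path, since quality lost upstream cannot be recovered later.

```python
# Model a facility signal path and find its quality bottleneck.
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    mbps: float
    chroma: str

def path_ceiling(stages):
    """Return the stage with the lowest bit-rate: the quality bottleneck."""
    return min(stages, key=lambda s: s.mbps)

# Illustrative HD-master control path (values are assumptions).
master_control = [
    Stage("field camera (HDV)", 25, "4:2:0"),
    Stage("ingest server port", 50, "4:2:2"),
    Stage("NLE intermediate", 100, "4:2:2"),
    Stage("playout encoder", 35, "4:2:0"),
]

bottleneck = path_ceiling(master_control)
print(f"Bottleneck: {bottleneck.name} at {bottleneck.mbps} Mbps {bottleneck.chroma}")
```

In this made-up example the HDV acquisition stage is the floor for everything downstream, which mirrors the point above: if you start with low-cost HDV, you're limited to that quality throughout.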
If storage isn't an issue in your plant, scroll through the available codec settings looking for the two or three highest profiles that you'd consider setting as the "house standard." Compare the baseband ingest properties for high definition. If you have studio cameras, use them as the reference point. Roll the camera outside, and shoot some trees blowing in the wind, or a highway with fast moving cars—as movement kills codecs. Alter the settings on the input codec, record and document each of the files you capture.
Save those files, and then do the same for each of the other image generating devices (field cameras, graphics systems, etc.). Once you've created your "golden file" set, archive them and then use those files and their baseband images around the entire facility. Record the noticeable variations in the image quality. Repeat the process at different bit-rates, encoding schemes, chroma samples, and the like. Don't forget transcoders used in moving from NLE to the server and back. Some formats will look fine, others will look seriously different. Test for generational passes through NLE and servers, then back to NLE.
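A simple way to keep those "golden file" trials organized is to enumerate every combination of the settings under test so nothing gets skipped, then grade each recording against the list. The parameter names and values below are illustrative assumptions, not a prescribed test set:

```python
# Generate an exhaustive test matrix for golden-file trials.
from itertools import product

bit_rates = [15, 25, 35, 50]                      # Mbps (illustrative)
gop_styles = ["I-frame only", "Long GOP"]
chroma = ["4:2:2", "4:2:0"]
sources = ["studio camera", "field camera", "graphics"]

test_plan = [
    {"source": s, "mbps": b, "gop": g, "chroma": c}
    for s, b, g, c in product(sources, bit_rates, gop_styles, chroma)
]
print(f"{len(test_plan)} recordings to capture and grade")
```

Even this modest set of variables yields dozens of recordings, which is exactly why documenting each capture as you go, rather than trusting memory, pays off.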
DON'T FORGET THE AUDIO
Don't forget the audio side either, especially if you do upmixing, downmixing or dabble in surround sound. Observe processing for multichannel audio. Limit the use of Dolby Digital (AC-3 or A/52) audio until emission: it not only changes the footprint of the signal, it may also impair the ability of transcoders to do their job.
Finally, if you use proxies for browsing or offline decisions, or create content for alternative delivery (Web, mobile, etc.), don't forget to include them in your analysis. Just because the images are smaller doesn't mean you won't see artifacts that will detract from your presentation.
Making the best use of the tools you have has never been more important. The intermixing of audio/video formats, serving and editing platforms, plus the distribution of those assets now encompasses more derivatives and variations than ever before. Keeping a good hold on the variables will do nothing but help maintain your image in the all-digital world.
Karl Paulsen is chief technology officer for AZCAR Technologies, a provider of digital media solutions and systems integration for the moving media industry. Karl is a SMPTE Fellow and an SBE Life Certified Professional Broadcast Engineer.
Contact him at karl.paulsen@azcar.com.