Workflow in a Hybrid SD/HD News Production Environment

Preparing for the ultimate, all-file-based newsroom can be a daunting task, even if you’re building a new, greenfield facility. The decisions made today with regard to workflow, functionality and media management are ones you’ll have to live with for years, maybe decades.

Even if your operation is focused on preserving its assets for the long term, it must also consider how to handle immediate, short-term and legacy material all at the same time. This becomes more important, and more complex, if you intend to handle both high-definition and standard-definition content in the same working environment.

WEIGHING THE OPTIONS

One of the approaches that broadcasters and news organizations are considering is a hybrid mix of widescreen standard-definition field material coupled with live studio high-definition production. If your facility is already invested in a nonlinear editing platform that won’t support the jump to hi-def, financially or technically, then you might wish to explore field news camera systems that shoot in a widescreen format captured to 4:3 standard-definition media. That media, whether file- or tape-based, can be transferred to and edited on a standard-definition NLE and then, during play-out, upconverted to 16:9 high definition—easily harmonizing with HD studio cameras, graphics and production switching.

With that background, this installment will focus on a workflow and a treatment of systems that will support hybrid operations. To do so, we’ll look at the principal forms of media that might “enter” the system through a variety of transports or methods, then describe how that material is utilized on the air and in the archive.

The field recording equipment shown in Fig. 1 employs widescreen optics with internal conversion and solid-state image capture (e.g., Panasonic P2). Standard-definition images are captured and stored in a horizontally (i.e., spatially) compressed structure that preserves the widescreen image, even though it is only a standard-definition format. If these images were viewed on a conventional 4:3 monitor, one that does not convert the sweep to display a widescreen mode, they would appear taller and thinner, much as CinemaScope looked when captured on 35mm film. We’ll call this a 4:3 anamorphic presentation.
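To put rough numbers on that anamorphic squeeze, here is a short Python sketch; the 720x480 active raster is an assumption made for illustration, and other SD rasters change the values but not the principle.

    from fractions import Fraction

    # Assumed for illustration: a 720x480 active SD raster.
    SD_WIDTH, SD_HEIGHT = 720, 480

    def pixel_aspect_ratio(display_aspect: Fraction) -> Fraction:
        """Pixel (sample) aspect ratio needed for the stored raster to
        display at the intended picture aspect ratio."""
        storage_aspect = Fraction(SD_WIDTH, SD_HEIGHT)       # a 3:2 raster
        return display_aspect / storage_aspect

    par_normal     = pixel_aspect_ratio(Fraction(4, 3))      # 8/9, ~0.889
    par_anamorphic = pixel_aspect_ratio(Fraction(16, 9))     # 32/27, ~1.185

    # Viewing anamorphic material on a 4:3 monitor that does not switch
    # its sweep shows the picture at only 3/4 of its intended width:
    squeeze = Fraction(16, 9) / Fraction(4, 3)                # 4/3
    print(par_normal, par_anamorphic, squeeze)

That 4/3 factor is why subjects appear roughly a quarter narrower than intended until the display or editor is told the material is anamorphic.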

FRAME THE SHOT

The videographer must be cognizant that material shot in the widescreen mode may also be used in a 4:3 mode—sometimes referred to as shooting in a “protected” mode. All content must therefore be framed appropriately, so that important areas of the frame will not be lost when shown in 4:3 formats.
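As a rough guide to where that protected region falls, the center three-quarters of the widescreen frame width is what survives a 4:3 presentation. A minimal sketch, assuming a 720-pixel-wide anamorphic raster (true broadcast safe-action margins would tighten it further):

    def protected_4x3_window(frame_width: int) -> tuple[int, int]:
        """Horizontal extent of the 4:3 'protected' area inside a 16:9 shot.

        The 4:3 region covers (4/3) / (16/9) = 3/4 of the widescreen
        picture width, centered; returns (left_edge, right_edge) in pixels.
        """
        protected = round(frame_width * 3 / 4)
        left = (frame_width - protected) // 2
        return left, left + protected

    # On a 720-pixel-wide anamorphic SD raster (assumed), the protected
    # area is the center 540 columns, pixels 90 through 630.
    print(protected_4x3_window(720))    # (90, 630)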

In this model, media storage, ingest and play-out serving, and nonlinear editing can all be handled on standard-definition equipment in an SD format. Since most monitors and editing software applications allow the user to select a different sweep mode that returns the 4:3 anamorphic image to a widescreen presentation with the proper image structure, saving the image in a 4:3 anamorphic mode makes no significant difference.


Fig. 1: Widescreen-originated content for SD and HD transmission

Fig. 1 depicts this widescreen, standard-definition workflow with the appearance of the image frame depicted at various stages in the chain. Field content is ingested—either as tape or as a file—to a server, editor, storage platform or play-out device. All the content captured in this mode is then preserved in its intended format, i.e., widescreen; and playback may occur according to the needed or intended transmission mode (SD or HD). This concept further allows an archive to remain in a full widescreen aperture, which can be utilized for SD or HD purposes in the future.

The right side of Fig. 1 shows two alternatives for play-out. In our model facility, all play-out for the main digital broadcast channel will be in high definition. Thus, the play-out from the standard-definition file must be stretched to 16:9 and upconverted to high definition. The full-raster HD image is then presented to the HD production switcher for integration into a live program.
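In raster terms, the stretch and the upconversion fold into a single scaling step. A minimal sketch, assuming a 720x480 anamorphic SD source and a square-pixel 1920x1080 HD output:

    # Geometry of the HD play-out path (rasters assumed). The SD frame
    # carries a 16:9 picture, so it is simply scaled to fill the HD raster.
    SD_W, SD_H = 720, 480
    HD_W, HD_H = 1920, 1080

    scale_x = HD_W / SD_W    # ~2.67: horizontal stretch plus upconversion
    scale_y = HD_H / SD_H    #  2.25: vertical upconversion only

    # The horizontal factor exceeds the vertical one by 32/27 (~1.185),
    # the non-square pixel aspect ratio of the anamorphic source being
    # undone as it lands on the square-pixel HD raster.
    print(scale_x, scale_y, scale_x / scale_y)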

However, this model facility also has a standard-definition subchannel that features a continuous news program, with its content derived from the same widescreen-edited stories. To accommodate this form of play-out, the material must first be restored to its proper aspect ratio, and then the areas outside the 4:3 space must be “cut off.” The device used in this application is an aspect ratio converter (ARC), which stretches the 4:3 anamorphic image back to its normal shape and then creates a 4:3 “center-cut” version for the subchannel program stream. The viewer sees the same widescreen-captured material, minus any noninformative content that falls outside the 4:3 protected space, provided the videographer protected that area during shooting.
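For illustration only, a toy version of that ARC center-cut pass might look like the following Python/NumPy sketch. It uses nearest-neighbor resampling and ignores interlace, so it is a model of the geometry rather than anything a production converter would do:

    import numpy as np

    def arc_center_cut(anamorphic: np.ndarray) -> np.ndarray:
        """Toy center-cut ARC for the SD subchannel path.

        `anamorphic` is an (H, W, 3) frame whose 4:3 raster carries a
        16:9 picture. The middle 3/4 of the width is kept and stretched
        back to the full raster width, yielding a true 4:3 picture.
        """
        h, w, _ = anamorphic.shape
        crop_w = round(w * 3 / 4)                # e.g., 540 of 720 columns
        x0 = (w - crop_w) // 2
        cut = anamorphic[:, x0:x0 + crop_w]
        # Nearest-neighbor horizontal resize back to the full width.
        cols = np.linspace(0, crop_w - 1, w).round().astype(int)
        return cut[:, cols]

    frame = np.zeros((480, 720, 3), dtype=np.uint8)    # dummy 720x480 frame
    print(arc_center_cut(frame).shape)                  # (480, 720, 3)

Anything the videographer kept inside the protected area survives this pass; everything outside it is what gets discarded.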

To meet the needs of this workflow, today’s broadcast servers often perform these conversions directly on their hardware platforms, thus eliminating the need for external upconverters or ARCs. However, some standard-definition-only newsroom editing systems, or legacy devices from two or more years ago, can achieve the same results but must rely on external “glue” devices for the stretch or hi-def conversions.

LEGACY VIDEOTAPE

The next condition that must be addressed is the handling of legacy videotape in a hybrid standard-definition and high-definition environment. This legacy material may be archived videotape, stringer video, or even feeds from microwave links or from affiliates that do not use a widescreen format.


Fig. 2: Handling legacy and non-widescreen content

Fig. 2 depicts the workflow that starts with legacy tape transferred to a server or storage platform through converters that format the material for play-out on high-definition channels (using side panels), or for play-out on standard-definition subchannels. Each path maintains the same picture-format continuity. Legacy material may be mixed with widescreen or native content during play-out. It may also be added to the archive, with side panels, keeping the images in a 4:3 anamorphic format that permits editing on standard-definition platforms.

Note that the side panels generated in the ARC device may be static colored or black panels, or may be keyed from an external animation supplied by a secondary server or other external graphics generator. The important point in both of these workflows is maintaining consistency in the content’s appearance, in both aspect ratio and full-aperture imaging, while allowing flexibility regardless of the source material.
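The side-panel geometry itself is straightforward arithmetic; a small sketch, assuming a 1920x1080 HD raster:

    # Pillarbox geometry for legacy 4:3 material on the HD channel
    # (1920x1080 raster assumed). The 4:3 picture fills the full height;
    # the remaining columns become the side panels the ARC fills with
    # black, color or a keyed graphic.
    HD_W, HD_H = 1920, 1080

    active_w = round(HD_H * 4 / 3)      # 1440 columns of actual picture
    panel_w = (HD_W - active_w) // 2    # a 240-column panel on each side
    print(active_w, panel_w)            # 1440 240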

For the near term, dealing with hybrid operations is likely to be the norm for many facilities, especially those that have only recently invested in an SD nonlinear newsroom editing system. As the facility grows—mixing native HD and SD formats—the degree of complexity increases manyfold. Moving to an exclusively file-based architecture still necessitates the ability to recall legacy material and preserve it at a manageable file size. Being prepared for the inevitable future means deploying systems and hardware that are extensible; and today’s server/software components are now making that future achievable on a respectable budget.

Karl Paulsen
Contributor

Karl Paulsen recently retired as a CTO and has regularly contributed to TV Tech on topics related to media, networking, workflow, cloud and systemization for the media and entertainment industry. He is a SMPTE Fellow with more than 50 years of engineering and managerial experience in commercial TV and radio broadcasting. For over 25 years he has written on featured topics in TV Tech magazine—penning the magazine’s Storage and Media Technologies and Cloudspotter’s Journal columns.