The efficient workflow
After half a century of refining itself as a medium-based presentation business, broadcast news now finds itself under pressure to accommodate new technologies for file-based sourcing and content management, as well as real-time, streaming and file-based delivery systems.
Despite changes in nomenclature and hardware, the front end of every broadcast news business continues to be the acquisition, editing, storage and exchange of content. File-based technologies bring substantial economies, flexibility and improved technical quality to these operations. The middle step for all news operations is creation of the actual on-air product. It is here that each news operation generates its unique value.
On-air production is universally done in real time, intermixing live video and audio of the anchors and journalists with playout of file-based content and supporting graphics. The finished feeds then move to the third operational step of the business — distribution — through any combination of real-time (e.g., broadcast), file-based (e.g., archiving, redistribution) or streaming (e.g., Web, handhelds) media channels.
Startup
The primary impetus for upgrading news technology is the resounding success of proprietary video servers in transmission applications. Not long after video servers established their value as replacements for videotape recorders, substantial improvements in the power and cost advantages of digital content storage technology prompted an IT-centric rethinking of operations for all of broadcasting. This paradigm shift, plus a tightening business climate, reframed the business question of how technology could be used both to operate individual newsrooms more efficiently and to couple multiple news operations together.
During that period, the term file-based began to gain common use to describe the transition of broadcasters away from tape-based operations. While the phrase references important distinctions that bring undisputed advantages, taking it as a literal description of all broadcast news operations risks obscuring two key aspects of newscasting that are not file-based.
First, files of necessity reside at a specific location, and each encloses contents that must be finite with a beginning and an end (though the end point may not necessarily be known at the time of creation). As noted above, the news program itself is an aggregated product composited in real time from live studio and remote video, along with video played from files and graphic devices. While the output feed of a live news show is usually recorded (which returns it to the file domain), its primary revenue value is as a live, nonfile feed to broadcasting and cable distribution.
Second, digital news operations diverge from file-based underpinnings when the news program (or portions of it) is supplied to secondary markets, such as the Web and handheld mobile devices. These operations use streaming rather than actual file transfers to provide the viewer with content.
While streaming technology requires the use of source files, the recipient's device does not receive a copy of the file itself, but rather a stream of time-stamped data that the client application displays on arrival. This allows the viewer to see the requested content without jeopardizing control of ownership rights.
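As a purely conceptual sketch, not a depiction of any particular streaming protocol, the distinction can be pictured in a few lines of Python; the file name, chunk size and pacing interval below are illustrative assumptions only.

```python
import time

def stream_clip(path, chunk_bytes=64 * 1024, interval_s=0.5):
    """Deliver a clip as paced, time-stamped chunks rather than as a file copy."""
    with open(path, "rb") as src:
        while True:
            data = src.read(chunk_bytes)
            if not data:
                break
            yield (time.time(), data)  # the client displays this payload on arrival
            time.sleep(interval_s)     # pacing keeps delivery near real time

# A file transfer, by contrast, hands the recipient the complete asset, e.g.:
#     shutil.copy("story.mxf", "/recipient/story.mxf")
```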
Aside from these two aspects, real-time production and streaming to client applications, all other parts of the news business are immersed in a comprehensive file-based makeover. While the legacy backbone of shoot-process-cut-stack-air remains recognizable, all the tools and workflow processes are in active flux.
Starting with images
From the outset, the most compelling aspect of television news has always been its unique moving images, with their power to engage the viewer. These moving images, more than anything else, give television news its competitive advantage over magazine or radio news coverage.
The images that are shot today in SD (still the dominant format) are captured with a range of field acquisition equipment, from the aging Beta SP analog, through the baseband digital tape formats of DigiBeta, SX and DVCPRO, and into file-based technologies such as Sony's XDCAM and Panasonic's P2. The impending arrival of HD newsgathering focuses additional attention on XDCAM and P2, along with the HD capabilities of Ikegami's Editcam HD, Grass Valley's Infinity/REV PRO, and a variety of standalone hard drive acquisition systems such as FireStore.
No matter which acquisition format is used, video servers are the central technology for file-based operation. The first servers with sufficient capacity, reliability and data rate for broadcast were the closed, proprietary devices introduced in 1994 for commercial insertion. Today, server technology includes the high-performance, high-reliability, highly integrated offerings of major industry vendors, such as Avid, Grass Valley, Harris and Omneon, as well as custom-built systems assembled from available IT components. The purchase decision of each customer is driven by the particular importance placed on cost, reliability, shared storage operations, expansion of channels, bandwidth, storage capacity and linked operations, such as editing and browsing, archiving and system management.
For news, the most valuable development in servers is the ability to offer true shared storage as an alternative to copying and moving files across a network. Content had been thought of as a resident that lived within its medium. Shared storage revealed it to be an independent entity, capable of serving simultaneously in multiple locations, residing on multiple types of media and possessing potential value in lines of business not traditionally associated with television news. This unrestricted, shared, simultaneous availability of file material has produced a dramatic shift in the industry's expectations about interoperability and universal access to content.
Going low
Some pathways to the universalization of content have proven faster and easier to implement than others. Because the relationship between network capacities and file size was so widely understood, an obvious next step was to generate low-resolution proxies for viewing and editing. Low-resolution operations increased costs by adding complexity to a station's infrastructure and workflow, but three key benefits paid off from the beginning and are still important today:
- browse viewing on any desktop rather than on expensive baseband video monitors;
- browse viewing cheaply across large geographical distances; and
- editing on more seats than would be possible using full-resolution video.
Initially, the only means available for creating files of differing resolution was to use real-time baseband video as a source. Although low-resolution files could theoretically be stored in the same servers as the high-resolution originals, most successful offerings used less expensive standalone servers and storage devices for this procedure.
This creation of dual inventories raised the need for content management tools. The most pressing requirement was for a scavenge tool that could create low-resolution copies of the high-resolution files that arrived in the server without passing through baseband ingest, such as finished edits from attached nonlinear editors. This task required databasing the inventory of both low- and high-resolution domains, along with rules-driven software for tracking discrepancies, managing high-resolution playout and coordinating low-resolution recording.
A related need was to manage deletion of low-resolution proxy files when their high-resolution masters were removed from the server. While operators have consistently expressed the desire to have low-resolution viewing access to any content that has ever resided on the server, the high cost of an archive system that automatically returned archived high-resolution content back into the server has restricted this feature primarily to large installations.
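A minimal sketch of such a reconciliation pass is shown below; the inventory structures and queue names are hypothetical stand-ins for the two databases and the rules-driven software described above, not any vendor's API.

```python
def reconcile_proxies(hires_inventory, proxy_inventory, encode_queue, delete_queue):
    """Keep the low-resolution proxy inventory in step with the high-resolution server.

    hires_inventory and proxy_inventory map clip IDs to file locations
    (illustrative stand-ins for the high- and low-resolution databases).
    """
    # Scavenge: any high-res clip without a proxy is queued for low-res encoding.
    for clip_id, hires_path in hires_inventory.items():
        if clip_id not in proxy_inventory:
            encode_queue.append((clip_id, hires_path))

    # Cleanup: any proxy whose high-res master has been removed is queued for deletion.
    for clip_id, proxy_path in proxy_inventory.items():
        if clip_id not in hires_inventory:
            delete_queue.append((clip_id, proxy_path))

    return encode_queue, delete_queue
```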
News operations would probably not add much more archive content into their shows even if it were readily available. But the ongoing desire for easy viewing and editing access to all their content continues to drive vendors toward better solutions.
Content sharing
In addition to fueling the development of low-resolution functionality, the need to gain access to content across proprietary hardware platforms drove development in two directions. One was toward standards-based wrapping and unwrapping of files across manufacturers. The second was toward math-based transcoding of files across codec families and bit rates.
The post-production community first articulated a vision for the free exchange of works in progress across platforms and channeled it into the development of AAF. The broadcast community recognized a parallel need for exchanging finished media products and supported similar developments with MXF.
Today MXF is codified as SMPTE 377M and provides a standards-based way of exchanging material across devices. MXF applications range from linking P2 and XDCAM with today's major news editors to a broadcast group that uses a single archive, spanning Grass Valley K2 and Harris NEXIO servers, as the translation engine for sharing files across its operations.
The MXF and AAF standards have also made intellectual contributions that have improved the way files and their content are understood. The standards identified that a file of content is in fact composed of three elements:
- its wrapper;
- the essence of the content (the audio and the video itself); and
- metadata that describes the content.
This distinction brought new clarity and inspiration to the thinking of operations management, technology developers and business opportunity planners.
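One way to picture the three-part model is as a simple data structure. The sketch below is purely illustrative; the field names and values are examples and are not drawn from either specification.

```python
from dataclasses import dataclass, field

@dataclass
class WrappedClip:
    """Illustrative model of the three elements identified by the standards."""
    wrapper: str                                   # the container, e.g. an MXF operational pattern
    essence: bytes                                 # the audio and video payload itself
    metadata: dict = field(default_factory=dict)   # descriptive data about the content

clip = WrappedClip(
    wrapper="MXF OP1a",
    essence=b"...",                                # placeholder for the encoded A/V essence
    metadata={"slug": "city-council", "tc_in": "01:00:00:00", "codec": "DV25"},
)
```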
As needs arose to convert files from one bit rate and/or codec family to another (e.g., creating low-resolution browse files, compacting DV-based edited files into space-saving long GOP MPEG), the fallback solution from the outset has been real-time playout through baseband and re-encoding. Although widely and successfully used, this approach is generally regarded as awkward.
When math-based transcoding arrived, it proved, not surprisingly, to be computationally intensive. Many conversions required substantial processing time, and the per-channel cost was not trivial. As a consequence, legacy baseband ingest remained a viable way of creating browse video. Competing computational transcoding products have since broadened the marketplace, and the new generation of HD/SD server architectures is beginning to promise internal co-generation of different formats and resolutions.
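As an illustration of what a software transcode looks like, the sketch below drives the open-source FFmpeg tool from Python to compact a DV-based master into long GOP MPEG-2 and to generate a low-resolution browse proxy. The file names, bit rates and GOP length are illustrative assumptions, not recommendations, and the commercial products discussed above use their own interfaces.

```python
import subprocess

MASTER = "story_dv25.mxf"  # hypothetical DV25 source file

# Compact the DV-based master into long GOP MPEG-2 (15-frame GOP, 8 Mb/s video).
subprocess.run([
    "ffmpeg", "-i", MASTER,
    "-c:v", "mpeg2video", "-b:v", "8M", "-g", "15",
    "-c:a", "mp2", "-b:a", "256k",
    "story_longgop.mpg",
], check=True)

# Generate a low-resolution browse proxy for desktop viewing.
subprocess.run([
    "ffmpeg", "-i", MASTER,
    "-vf", "scale=480:-2",
    "-c:v", "libx264", "-b:v", "800k",
    "-c:a", "aac", "-b:a", "96k",
    "browse_proxy.mp4",
], check=True)
```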
File-based delivery systems
The ability to transfer content as files rather than as baseband has eased and sped up news operations at almost every step in the workflow. The freedom from the cost, scheduling and technological complexity of baseband linkages has improved the quality of the news product substantially.
Archiving
Archiving is an example of how thinking about file-based workflow for news was initially defined by solutions created for large transmission facilities. Those solutions have three requirements:
- The archiving is nonlossy.
- Both archiving and retrieval are automatic.
- Retrieval is triggered by a customer's use of a reference to an archived piece (e.g., dropping its ID into a playlist) without requiring the customer to perform additional steps to implement that request.
A direct parallel for file-based news is obvious and attractive. When a browse edit includes shots that have been moved off to archive, the system would restore the necessary high-resolution content automatically and transparently.
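The trigger logic itself is simple; a sketch follows, assuming a hypothetical archive interface with a restore request (all names are illustrative).

```python
def conform_edit(edit_clip_ids, online_clip_ids, archive):
    """Queue restores for any shots in a browse edit that now live only in the archive.

    edit_clip_ids: clip IDs referenced by the edit decision list.
    online_clip_ids: clip IDs currently resident on the playout server.
    archive: hypothetical object exposing a restore(clip_id) request.
    """
    pending = []
    for clip_id in edit_clip_ids:
        if clip_id not in online_clip_ids:
            archive.restore(clip_id)   # transparent to the editor; no extra steps
            pending.append(clip_id)
    return pending                      # the conform proceeds once these clips arrive
```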
The technology for implementing archiving with this level of automation is readily available, but its current cost cannot be justified by many (if any) call-letter news operations. Creative thinking remains to be done about the assumptions and appropriate workflow to provide affordable archiving for file-based news operations.
High definition
By far, the largest set of changes for broadcasters as a result of implementing file-based workflows will surround the (usually) simultaneous arrival of 16:9 and high definition. From the moment a facility introduces its first frame of production in HD, all material — both SD and HD — needs identifying metadata to ensure proper handling by the file-managing devices. These devices include captioning engines, storage and playout servers, editors, browse encoders and transcoders, browse editors, and browse conform engines. But for this content identification to be meaningfully used, virtually every step of the production workflow requires human decision making, because every option will have branching consequences that require informed evaluation by engineering, production and management.
An example is editing 16:9 HD and 4:3 SD source content together in the same story for simultaneous transmission in SD and HD. As long as SD content must be integrated and substantial viewing takes place in 4:3, the HD content must be shot with 4:3 framing. The most expedient news workflow is then to cut and produce an HD output, using black side pillars on the 4:3 content.
Some NLEs accept mixed SD and HD source content and perform up- and downconversion and aspect ratio conversion (ARC) on output. Others require external conversions prior to editing. In either case, the resulting edit will be saved to the server for HD playout. (Browse editing adds some initial complexity, largely setting the encoder/transcoders to accommodate HD, and setting auto-conform engines to output in HD.) To play that same edited piece in SD, some servers will automatically downconvert and correct the aspect ratio with a 4:3 center crop. Other servers require downconversion and ARC as an external real-time process. In either case, the SD viewer sees a full screen of 4:3 content that is unstretched and unboxed.
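The arithmetic behind these conversions is straightforward. The sketch below works out the 4:3 center crop of a 1920x1080 HD frame and the black pillars needed to place 4:3 content in that frame; the raster size is simply the common one, used here as an example.

```python
HD_W, HD_H = 1920, 1080             # common 16:9 HD raster

# 4:3 center crop used for SD downconversion: keep full height, trim the sides.
crop_w = HD_H * 4 // 3              # 1440 pixels of picture width
side_trim = (HD_W - crop_w) // 2    # 240 pixels discarded on each side

# Pillar-boxing 4:3 content into the 16:9 frame: full height, black side pillars.
pillar_w = (HD_W - crop_w) // 2     # 240-pixel black bar on each side

print(f"Center crop: {crop_w}x{HD_H}, trimming {side_trim} px per side")
print(f"Pillar box: {crop_w}x{HD_H} picture with {pillar_w} px bars left and right")
```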
While this process gives traditional 4:3 viewers the best-looking product and is currently the most expeditious in terms of workflow, it locks out the creative potential from using the full width of the HD frame. Its value will therefore diminish as 16:9 penetration rises among consumers. Yet the vital importance of news in each station's profit picture and the rising importance of third- and fourth-screen markets, which are primarily 4:3, argue for favoring 4:3 as the preferred distribution aspect ratio for news.
Business impact
File-based operations have revised the economics of news production at virtually every step and in the process have affected the costs of hardware and processes. With commodity IT hardware replacing baseband hardware, new economies have improved both connectivity and storage.
Connectivity, which had relied on baseband piping, now flows across vastly greater distances through networking, which has grown from 10BASE-T to Gigabit Ethernet. Commercial and technical services have finally been worked out for interconnecting multiple locations. Now that exchange speeds are no longer locked to real time, operations and management have substantial flexibility for controlling the speed, resolution and cost of connectivity at every step.
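A quick back-of-the-envelope calculation shows the difference; the clip length, essence rate and usable link throughput below are illustrative assumptions.

```python
clip_seconds = 120            # a two-minute story
essence_rate_mbps = 25        # e.g., DV25-class material
file_size_mb = clip_seconds * essence_rate_mbps / 8       # about 375 MB of essence

link_rate_mbps = 400          # assumed usable throughput on a Gigabit Ethernet link
transfer_seconds = file_size_mb * 8 / link_rate_mbps      # about 7.5 seconds

print(f"{file_size_mb:.0f} MB clip moves in ~{transfer_seconds:.1f} s "
      f"versus {clip_seconds} s locked to real time")
```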
VTR technology, which had been hugely expensive to manufacture and support, has been replaced by piggybacking on IT storage technology. Not only is IT storage far more powerful and robust, but its costs of development and scale are distributed across a vastly larger customer base. The savings from using field cameras with IT storage instead of tape allow stations to put cameras into the hands of many more shooters in their communities.
With the content of file-based assets now clearly exposed, management is engaged in finding additional ways of extracting value. The Web, mobile phones, public facilities like elevators, airports and trains, and a burgeoning variety of video-enabled handhelds, such as the Video iPod, are platforms where news can be repurposed with value for both owners and customers. The fact that rights to much news content are in the hands of large companies with other types of media to distribute (like music, movies and sports) brings both deeper pockets and a longer view to the search for opportunities. Increasingly, the success of applications in which customers pull content of interest will depend on attaching and exposing metadata in an economical and meaningful way. And some markets, such as handhelds, are proving to be so new and unknown that innovative offerings will necessarily be high risk.
The real product of the news business remains the news program itself. Holders of broadcast licenses realize that news is their brand identity — the key differentiator they have against their competitors. With the market in flux from changes in demographics, lifestyles and technology options, news operations certainly expect emerging technologies that will reduce costs and improve the quality of their product. Even more crucially, they will also expect their vendors to help anticipate new market opportunities available from the upcoming technology and work cooperatively to develop them.
Fred Schultz is senior marketing manager, news solutions, for Harris. He has written for the SMPTE Journal, is the author of a series of white papers on server technology, has won a prime time Emmy Award, and holds a Ph.D. from Vanderbilt University.
Leverage your file-based workflow
By Mark Turner, director of IT at Media General Broadcasting
To get the most out of your workflow, your file-based system needs to be:
- Pervasive: Offers multiple people simultaneous access.
- Immediate: Allows rapid access to any clip or point in a clip.
- Abstract: Virtual clips can be created by referencing other clips.
- Independent: Clips in a sequence can be deleted without destroying others in the sequence.
- Self-describing: Rich metadata can be embedded.
- Extensible: New types of metadata can be added.
- Portable: A file doesn't change regardless of the type of physical media it's stored on.
- Agnostic: Multiple file types can exist on the same media.
- Secure: IT controls can manage who sees what.
The new and improved workflow
By implementing a file-based workflow, your team could improve the final product while saving time. Below are some of the specific benefits:
Improved product
- Video content can be simultaneously used without restriction.
- Allows fast re-editing, which eases repurposing of stories.
- Pointer lists can be played to air faster than waiting for rendered files.
- Producers can automatically re-sequence media elements while on-air, without additional staff.
- Searching can return viewable content instead of a reference to shelf/tape/time code.
Improved economies
- Commodity IT storage replaces VTR design/manufacturing/maintenance/expendables.
- Commodity IT connectivity replaces costs and restrictions of baseband transfers.
- Editing skills are increasingly prelearned by the new-hire pool.
New benefits
- Content is accessible without geographical restrictions.
- Editors receive preselected source material in a bin simply by clicking a story.
- Transfers can be faster without lock to baseband clock.
- Searches can examine multiple databases simultaneously.