File-based workflows

Change is good in life; it is what encourages new thoughts and approaches. The broadcast industry has seen three distinct eras of technology since the advent of broadcast television before the middle of the last century. In the early years, all television was live. The only recordings were kinescopes, film recordings shot off special-purpose monitors. When a kinescope was played to air, the film ran in real time; it allowed reruns, but editing was cumbersome at best. Film was used for a great deal of origination, which made film islands important for slides, commercials and library content. In the mid-1950s, commercially available electronic recording was first deployed, but it changed workflows little until electronic editing became practical about a decade later.

Once electronic editing was practical, it became possible to change the dominant workflow that had been built around the kinescope/videotape/film content system. Live television itself was no different, but the ability to process content in a separate electronic workflow was a huge change. It was not, however, simpler. Editing bays had multiple VTRs so that audio and video signals could be mixed freely. Interestingly, to cut the cost of post production, an offline technique was developed that used proxy copies of the content on lower-quality formats, often 3/4in. U-matic. This actually made the workflow even more complex, though it was akin to the rough-cut approach used in film.

The most important thing about offline editing was that computer-based approaches were quickly developed. The first editing done with computers was thought of as an offline technique because the computers of the day could not process full-bandwidth content. As a result, multiple compression approaches were used, principally motion JPEG and fractal decomposition. As the capabilities of computers and the quality of compression systems improved, we entered the third era of production: fully file-based workflow.

It is critical to recognize that file-based workflow changes everything. How we acquire and process content is now tailored to the workflow in which the finishing of the content is completed. We have largely abandoned the approaches used for 75 years to embrace what I like to think of as virtual content. I say virtual because in a fully file-based workflow, you cannot put your hands on the content the way you could with videotape (digital or analog) or film. Field-acquired content can obviously still be transported on physical media, but we are moving to electronic movement of files for many workflows. This is not a subtle change; it is revolutionary, and it demands new standards and structures that are not built on old technology.

We stand today with a direction established and technology deployed that allows us to adapt former workflows, which used streams instead of files. In truth, we are still using some crossover techniques that rely on recording bits to physical media, but I am convinced those will disappear in professional applications. We are headed into a much more flexible world where IT technologies replace what I believe is late-phase, television-specific technology, some of it adapted with hybrid IT interfaces. Consider for a moment the promise of service-oriented architecture (SOA).

Service-oriented architecture

SOA employs a middleware layer to build workflows between applications designed to interact with files. (See Figure 1.) Envision a template used to draw lines between processes that might be used in a broadcast plant. For instance, one process might add captions, and another might remix surround sound to Lt/Rt. In a stream-based facility, you would move the content between islands in serial fashion to accomplish both processes. If problems come up, a human intervenes, interrupting the process to look for a remedy.

In an SOA-based facility, you would build a template that moves the content from one application to the other, with analysis determining when a failure has happened and then either automatically repairing the content or notifying a human of the error so a decision can be made. Clearly, this implies that several planes must interact. In the simplest case, those are essence, metadata and management. The management plane can be viewed as covering both monitoring and control, along with messaging related to errors and commands.
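To make the idea concrete, here is a minimal sketch in Python of such a template: two stand-in services (captioning and an Lt/Rt downmix) chained together, with the management plane reduced to a simple notification when a step fails. The service functions, endpoints and field names are invented for illustration; real services would be reached over the network rather than called in-process.

# A minimal sketch of an SOA-style workflow template. The service names
# and message fields here are hypothetical illustrations, not part of any
# broadcast standard.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Job:
    """Carries the essence reference plus metadata between services."""
    essence_uri: str
    metadata: Dict[str, str] = field(default_factory=dict)
    errors: List[str] = field(default_factory=list)


def add_captions(job: Job) -> Job:
    # Stand-in for a captioning service; a real one would call out to a
    # captioning application over the network.
    job.metadata["captions"] = "attached"
    return job


def downmix_lt_rt(job: Job) -> Job:
    # Stand-in for a surround-to-Lt/Rt downmix service.
    job.metadata["audio"] = "Lt/Rt"
    return job


def notify_operator(job: Job, step: str, error: Exception) -> None:
    # Management plane: in practice this would raise an alarm or open a ticket.
    print(f"Step '{step}' failed for {job.essence_uri}: {error}")


def run_workflow(job: Job, steps: List[Callable[[Job], Job]]) -> Job:
    """Run each service in turn; on failure, tell the management plane."""
    for step in steps:
        try:
            job = step(job)
        except Exception as exc:  # real analysis/QC would be far more specific
            job.errors.append(step.__name__)
            notify_operator(job, step.__name__, exc)
            break
    return job


if __name__ == "__main__":
    template = [add_captions, downmix_lt_rt]  # the "lines drawn" between processes
    finished = run_workflow(Job("file:///media/program_001.mxf"), template)
    print(finished.metadata)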

One might accomplish this processing by linking watch folders in multiple applications without using a middleware application, as sketched below. But if you do so, the design of the workflow must be accomplished by setting up the options manually in each application. You would also likely have to build the monitoring and management planes yourself, programming them to send notifications as appropriate.
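Here is a minimal sketch of that watch-folder approach, assuming each application drops finished files into a folder that the next application picks up from. The folder paths and the notification hook are hypothetical; the point is that the hand-off logic and the management messaging are yours to build and maintain.

# A minimal sketch of chaining two applications via watch folders, without
# middleware. Folder paths are hypothetical; each application is assumed to
# write finished files into its output folder.

import shutil
import time
from pathlib import Path

CAPTION_OUT = Path("/watch/captioning/out")  # hypothetical output of app 1
DOWNMIX_IN = Path("/watch/downmix/in")       # hypothetical input of app 2


def notify(message: str) -> None:
    # You would have to build the management plane yourself,
    # e.g. e-mail, SNMP traps or a dashboard.
    print("NOTIFY:", message)


def poll_and_hand_off(interval_s: float = 5.0) -> None:
    """Move every finished file from the captioning output to the downmix input."""
    while True:
        for item in CAPTION_OUT.glob("*.mxf"):
            try:
                shutil.move(str(item), str(DOWNMIX_IN / item.name))
                notify(f"Handed off {item.name} to downmix")
            except OSError as exc:
                notify(f"Hand-off failed for {item.name}: {exc}")
        time.sleep(interval_s)


if __name__ == "__main__":
    poll_and_hand_off()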

But in this era, we have everything needed to design automated workflows with complex analysis and control, with one exception: SOA was not developed for broadcasters. As an industry, we are exploring how to rebuild a world of wires, patch panels and switchers into one that moves content over networks in a flexible workflow that can be different for every program.

We need a way to send media-specific messages between applications so that they are aware of the content and, more important, so that they act on those messages in ways any media application can understand.
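What such a message might look like is sketched below. The field names are invented for this illustration and are not drawn from FIMS or any other published schema; the point is simply that the requested operation, the essence reference, the metadata and the management-plane reporting endpoint travel together in a form any media application could parse.

# One illustration of the idea: a media-aware job request that any
# application could parse and act on. All field names are hypothetical.

import json

transform_request = {
    "job_id": "2010-0421-0001",
    "operation": "caption.attach",  # a named, commonly understood process
    "essence": {"uri": "file:///media/program_001.mxf", "format": "MXF OP1a"},
    "metadata": {"title": "Evening News", "language": "en-US"},
    "report_to": "http://mam.example.local/jobs/status",  # management plane
}

print(json.dumps(transform_request, indent=2))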

Luckily, the Advanced Media Workflow Association (AMWA) and the EBU have begun a process that hopefully will lead to delivering that “interface.” They have jointly established a Task Force on the Framework for Interoperable Media Services (FIMS).

In the press release announcing a Request for Technology issued by the FIMS Task Force in April, Brad Gilmer, executive director of AMWA, said, “The professional media industry really needs a standardized, open framework for media services, along with standardized definitions for common processes in the industry. It is our hope that the report and standards created as a result of this process help guide the industry as we work to create more flexible facilities and workflows.”

This game-changing technology has had a more disruptive effect than the change to color or the adoption of HD, but it promises tremendous benefits.

John Luff is a broadcast technology consultant.

Send questions and comments to: john.luff@penton.com
