File-based workflow: Producing content in a virtual workspace

I guess we had workflow before smartphones, but it seems to me that the frequency of reading or hearing the term in professional discussions has doubled every year or so for the last decade. We have workflow in the newsroom, in master control, in production, in motion picture development and distribution, probably in every facet of our profession.

Of course, we had workflow before we had files, but we didn’t call it that. A station had procedures related to traffic, sales and master control, which today we would call workflow in the strictest sense. In technology, we had “system flow” diagrams, which represented the movement of signals through a plant, which I suppose could have been called workflow in some regards.

[Image caption: Metadata has always been a critical part of production, but with the advent of file-based workflow, it has had to change to accommodate the nature of the captured content. Image courtesy Grass Valley.]

But file-based workflow is fundamentally different. For one thing, files are transported and used in quite different ways from baseband signals. They bundle together the elements — audio, video and metadata — needed to make use of them. Files are closer to videotape or film than they are to baseband SDI/HD-SDI streams, for, like tape, they live in a container (the wrapper) and carry metadata on the “surface” — or, in the case of a tape, on the box — that tells you what is in the container.
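To make the container idea concrete, here is a minimal sketch, in Python, of how a wrapped file might be modeled in software. The class and field names are assumptions for illustration, not the structure of any particular wrapper standard such as MXF:

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EssenceTrack:
    """One stream inside the wrapper: video, audio or data."""
    kind: str     # "video", "audio" or "data"
    codec: str    # e.g. "DNxHD", "PCM"

@dataclass
class WrappedFile:
    """A container (wrapper) holding essence plus the metadata on its surface."""
    content_id: str                                          # unique tag for this item
    metadata: Dict[str, object] = field(default_factory=dict)
    tracks: List[EssenceTrack] = field(default_factory=list)

# Because the metadata rides with the essence, any system that touches the file
# can learn what is inside without decoding a single frame.
clip = WrappedFile(
    content_id="clip-2012-0042",
    metadata={"title": "City council vote", "duration_s": 94, "codec": "DNxHD 145"},
    tracks=[EssenceTrack("video", "DNxHD"), EssenceTrack("audio", "PCM")],
)
print(clip.metadata["title"])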

But unlike physical assets, files are only representations, best thought of as virtual assets. They are tied to no physical medium, and only on playout, when the bits are reassembled as pictures and sound, do the essence and its associated metadata come together as a program. This creates a virtual workspace where files are processed and used to create content, and that virtual workspace is where file-based workflow actually happens.

Development

The development of file-based production began, perhaps, with the Ikegami/Avid “EditCAM” in 1995. This first on-camera file-based recording system allowed media to be moved directly as files to the edit room without re-recording the signal or processing of any kind. It was still a physical process, and at some level in many applications, it still is and will remain so. News production relies on moving the physical media containing the files to the station (or to a laptop in the truck, or the coffee shop), either as a sneakernet process or its virtual equivalent. From there on, it is likely that only the data representation will move through the rest of the workflow.

The edit process is, at its core, a concatenation of shots with graphics and effects layered as appropriate. In terms of workflow, the interface to editing is mostly shots in, completed package out, which is a pretty simple workflow. More importantly, the interface to the management systems where program planning happens is almost exclusively metadata, perhaps supplemented by proxy versions of the essence at lower bit rates.

Obviously, the content moves to storage, where it is accessible to playout devices, and perhaps to archive and processing, which delivers the versions and derivatives used for over-the-top and Web distribution. The most important question is: What drives workflow management? The answer, almost always, is that metadata manages the movement and storage of the content, for the metadata carries the technical and descriptive information that allows decisions about processes and movement to be made.
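As a minimal sketch of metadata-driven decisions, the Python fragment below chooses next workflow steps for an item purely from its metadata. The field names, thresholds and target profiles are assumptions for illustration, not taken from any particular asset management product:

def plan_next_steps(metadata):
    """Derive workflow actions from technical and descriptive metadata alone."""
    steps = []
    # Anything approved for air is copied to playout storage.
    if metadata.get("status") == "approved":
        steps.append("copy-to-playout-storage")
    # Long-form material also goes to the archive (threshold is illustrative).
    if metadata.get("duration_s", 0) > 600:
        steps.append("send-to-archive")
    # If a Web/OTT derivative is wanted and the house codec differs, transcode.
    if "web" in metadata.get("destinations", []) and metadata.get("codec") != "H.264":
        steps.append("transcode-for-web")
    return steps

print(plan_next_steps({
    "status": "approved",
    "duration_s": 94,
    "codec": "DNxHD 145",
    "destinations": ["air", "web"],
}))
# Prints: ['copy-to-playout-storage', 'transcode-for-web']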

When newsfilm ruled, metadata was on small file cards and on blackboards noting where crews had been sent and what they expected to bring back. Oh, for those simpler times!

It is worth noting that the processes I have been describing are not “video processes” in the sense that you route, switch, mix, layer or play out from baseband constructs. This is an entirely IT-centric process. And therein lies the rub for many who have built decades-long careers in television. The technology, while not opaque, is at best a bit murky to those who have grown up feeding on NTSC, or even SDI interconnection. Parsing metadata is an IT process, as is the management of a newsroom where scripts and slugs are created in a newsroom computer system.

One critical part of that process in a modern station is management of the metadata and creation of tags that tie systems and content together. A tag, or content ID, is itself an item of metadata, and it is what lets the workflow steps that transform and move content communicate with one another. Analogous to the slate on an analog piece, a tag uniquely identifies the content, tying it to a database of metadata about the essence and how it might be used.
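A minimal sketch of that tying-together appears below, assuming a simple in-memory registry. A real plant would use something like a UMID or a house numbering scheme backed by a proper asset database, so the ID format and fields here are illustrative only:

import uuid

# A toy registry standing in for the station's asset database.
registry = {}

def register_content(title, codec, duration_s):
    """Mint a content ID (the slate of the file era) and record its metadata."""
    content_id = "clip-" + uuid.uuid4().hex[:12]    # illustrative ID format
    registry[content_id] = {
        "title": title,
        "codec": codec,
        "duration_s": duration_s,
        "history": ["ingested"],                    # workflow steps append here
    }
    return content_id

def log_step(content_id, step):
    """Every process (edit, transcode, archive) refers back to the same ID."""
    registry[content_id]["history"].append(step)

cid = register_content("City council vote", "DNxHD 145", 94)
log_step(cid, "edited")
log_step(cid, "transcoded-for-web")
print(cid, registry[cid]["history"])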

So many options

When teaching the fundamentals of workflow, I am sometimes asked why we have to have so many different options that make it all so complicated. A real answer is that, in many key respects, this is no different from the processes we used before files. We had a multiplicity of tape formats (analogous to file formats perhaps), video standards (analogous to compression formats) and even interconnection methods. When signal timing, testing, monitoring and other topics are layered onto an analog system, the net effect is a technical system every bit as complex as a file-based system is today. In many real ways, it required more technical precision and experienced care. The systems never warned you of impending failure; often the first thing you noticed was the distinctive odor of “overheated ohmite.” Today, we get pinged by the SNMP management layer on our smartphones, asking us to look into impending doom in a disk array.

An executive at a large network once remarked that his employer had never met a tape format it didn’t like. Today he or she would no doubt say the same thing about distribution formats. Multiplying those options complicates workflow immensely. The modern workflow is, of course, becoming more complex as producers and stations develop new uses for content formatted in different ways.

This has given new impetus to a movement to use Service-oriented Architecture (SOA) to manage the workflow. Putting all of the transforms and processes on a bus as “services” and allowing a software layer to manage the process by calling for actions such as transcoding, ingest, archive, etc., can make workflow much more adaptable to change in the future. The Advanced Media Workflow Association (AMWA) and the EBU have tackled this headlong, creating the Framework for Interoperable Media Services (FIMS). As more companies adopt FIMS-compliant interfaces, changing workflow will be less a science project and more of a business decision. The future will only add to the technical complexity we have now.
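The fragment below sketches that idea: each workflow step is a service registered on a bus, and an orchestration layer calls them by name. It is not the FIMS interface itself; the service names, job payload and dispatcher are assumptions for illustration:

# Each workflow step is a "service" registered on a simple in-process bus.
services = {}

def service(name):
    """Decorator that registers a callable as a named service."""
    def wrap(fn):
        services[name] = fn
        return fn
    return wrap

@service("ingest")
def ingest(job):
    return {**job, "status": "ingested"}

@service("transcode")
def transcode(job):
    return {**job, "codec": job.get("target_codec", "H.264"), "status": "transcoded"}

@service("archive")
def archive(job):
    return {**job, "status": "archived"}

def run_workflow(job, steps):
    """The orchestration layer: a workflow is an ordered list of service calls,
    so changing the workflow means changing the list, not rewiring the plant."""
    for step in steps:
        job = services[step](job)
    return job

result = run_workflow(
    {"content_id": "clip-0042", "target_codec": "H.264"},
    ["ingest", "transcode", "archive"],
)
print(result)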

Farewell

After more than a decade, this is my last regular column for Broadcast Engineering. I have delivered columns and features totaling more than 150 articles. My hope is that I have been able to put some perspective on the technological change that has swept over our industry again and again during that time period. To be certain, 1000 words does not allow an in-depth exploration of any topic, and I have sometimes struggled to smash longer musings into limited space.

My hope is that readers have taken my work a step further, thinking critically about what drives change and how they can take advantage of it. One thing is certain, and that is that change cannot be stopped, and thus must be embraced headlong. I used to tell clients in my design and consulting practice that I had no problems, only their problems begging for our solutions. By applying experience gained in many facets of our business over decades, I was able to synthesize solutions that solved thorny technology and business problems for them. For me, that has always been one of the most rewarding things one can do. Presented with confusion and complication, I choose change and embrace temporary anarchy in the interests of seeing patterns and finding solutions.

Occasionally, when researching an article, I have searched the Internet and found links to articles I wrote years ago, and was pleased that my thinking has evolved as technology has swept out the old and in the new. I have worked on monochrome video recorders stuffed with tubes and HDTV playout centers. How much fun it has been!

My personal thanks to the editors who have challenged my grammar and occasionally my “facts,” and to Brad Dick for the opportunity to connect with many of you who wrote to me asking for opinions and help after reading BE. I’ll miss that interaction, but I welcome your e-mail and calls at any time. You can reach me at john.luff@HDConsulting.tv or at 724-318-9240.

Be well.

John Luff is a television technology consultant.