Cloud broadcasting

We have all heard about cloud computing, a method of sharing computer resources and software over the Internet on demand. But what does this have to do with broadcasting? To understand the connection, we need to consider two factors: how our signals flow and how the pool of engineering talent has changed over the past decades.

Traditionally, when we as broadcasters needed to get video between plants or remotes, we simply ordered up a TV1 line from the local telco, and presto! NTSC with audio was delivered, meeting the RS-250B long-haul spec. Then the digital revolution began, and suddenly there was no such thing as a system drawing that didn't feature a network cloud. (See Figure 1.)

At that time, only a select group of people knew what was really inside that networking cloud. As broadcasters, we gave the network cloud provider our signals on coax or twisted pair, and at the other end — as if by magic — we got a coax or twisted pair back again with our signals. We had no idea how it worked and didn't care much as long as we didn't see sparkles on the screen or hear audio hits. Perhaps we gave them digital video or perhaps analog; it really didn't matter. The cloud provider figured out how to give us the same signal back.

In an attempt to quantify those sparkles and audio hits, we created Error Detection and Handling (EDH), SMPTE RP 165-1994. Cloud providers went to great lengths attempting to reconcile our EDH world with the bit error rate (BER) measurements used in the world of data transmission. Because our EDH measurements are frame-based and their BER calculations are continuous, the two don't match up and can't be directly correlated. Somehow, we all progressed beyond these simple numbers and quickly learned that digital links were typically either horrible or acceptable, and acceptable ones were far superior to the tape drop-outs and satellite sparkles that we as an industry had deemed acceptable in the past.
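To see why the two metrics resist correlation, consider a minimal sketch. The frame size, error counts and window length below are assumed round numbers for illustration only, not values from RP 165 or any carrier spec; the point is that the same raw BER can flag nearly every frame or only one or two, depending on whether the errors are scattered or bursty.

```python
import random

# Assumed round numbers for illustration only -- not RP 165 or carrier values.
BITS_PER_FRAME = 9_000_000      # rough SD-SDI payload per frame
FRAMES = 300                    # ten seconds at 30 frames/s
TOTAL_BITS = BITS_PER_FRAME * FRAMES
ERRORS = 900                    # identical raw error count in both scenarios

def edh_errored_frames(error_positions):
    """EDH-style view: a frame counts as errored if it contains at least
    one bit error, no matter how many."""
    return len({pos // BITS_PER_FRAME for pos in error_positions})

# Scenario A: errors scattered uniformly across the whole measurement window
scattered = random.sample(range(TOTAL_BITS), ERRORS)

# Scenario B: the same number of errors packed into a single burst
start = random.randrange(TOTAL_BITS - ERRORS)
bursty = range(start, start + ERRORS)

ber = ERRORS / TOTAL_BITS       # BER-style view: identical in both cases
print(f"BER (both cases):          {ber:.1e}")
print(f"Errored frames, scattered: {edh_errored_frames(scattered)} of {FRAMES}")
print(f"Errored frames, burst:     {edh_errored_frames(bursty)} of {FRAMES}")
```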

This move to digital started to clearly define contribution and distribution links because we no longer had RS-250B short-haul and long-haul specs. More important, however, were the bandwidth charges for different data rates. It was one thing to pay for a TV1 line; getting dark fiber or a 270Mb/s line was very expensive. To combat these significant bandwidth charges, we began using different video and audio compression rates for different types of applications, yielding contribution services for backhaul-type applications and distribution services for network feeds.
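As a back-of-the-envelope illustration of the economics, the sketch below compares an uncompressed 270Mb/s circuit with typical contribution and distribution compression tiers. The bit rates are assumed round figures, not tariffs or spec values.

```python
# Illustrative, assumed round numbers -- not tariffs or spec values --
# showing why different compression tiers emerged for different links.
rates_mbps = {
    "uncompressed 270Mb/s SD-SDI circuit": 270.0,
    "contribution backhaul (mezzanine compression)": 50.0,
    "distribution network feed (long-GOP compression)": 8.0,
}

baseline = rates_mbps["uncompressed 270Mb/s SD-SDI circuit"]
for service, mbps in rates_mbps.items():
    gb_per_hour = mbps * 3600 / 8 / 1000      # Mb/s -> gigabytes per hour
    ratio = baseline / mbps                   # bandwidth saved vs. raw SDI
    print(f"{service:50s} {mbps:6.1f} Mb/s  "
          f"{gb_per_hour:6.1f} GB/h  ({ratio:5.1f}:1 vs. raw SDI)")
```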

To confuse the situation further, these were never standardized, so different applications construed different meanings for contribution and distribution links. At the same time, typical signal latency quickly went from frames to seconds. Broadcast engineers needed to be very careful in specifying the proper link technology to match the intent of the signal being transported.

On the telco side, there were “turf wars” about Asynchronous Transfer Mode (ATM) vs. Multiprotocol Label Switching (MPLS). It was good that a few broadcasters got involved in these activities, which ultimately ensured that both technologies could be used for television. Fortunately, in many cases we don't care which carrier technology is used as long as we get our television signal back from the cloud with acceptable quality.

All of this aside, as in Figure 1, all of our system drawings during that time featured these magical clouds.

Role reversal

Here we are 10 years later, and more engineers know what is inside the network cloud and how it works than understand what television is about. To these engineers, television is just another data type: specialized compared to other data types, but still “just data.” So, why did this happen?

Simple economics: consumer demand for bandwidth for phones (mobile and fixed), the Internet, private networks for homes and decentralized offices and, of course, the multichannel explosion. Educational institutions and companies alike scrambled to develop a huge workforce that understands networking. The number of networking engineers grew; the number of broadcast engineers shrank. Don't get me wrong: There is a small group of engineers who either understand both worlds already or strive to learn both networking and broadcasting.

As broadcasters, we think of this as a role reversal. Networking is no longer a cloud. Rather, the broadcast facility is now the cloud. (See Figure 2.) New media outlets are popping up everywhere; existing outlets are constantly growing channel capacity. Do most of these distribution outlets care about how the content was frame synched, lip synched, format converted, switched and processed? No! They just want content streams or files. To them, television magically pops out of some content cloud somewhere in the world.

This doesn't mean that the art of broadcast is dying or becoming extinct. It is more like Latin — spoken and understood only by a few scholars, but forming the root of so many modern languages. In the current evolving media environment, we have developed numerous modern forms that stem from original broadcast standards and practices created years ago. To most outsiders today, video simply comes from a cloud hanging somewhere — the same way we used to put transport clouds into our system drawings.

This means that we as broadcasters are now clouds in the media outlets' system drawings, as shown in Figure 3. Broadcaster clouds could be located anywhere in the world. They could be huge operations or a small niche channel. They could be a narrowcast operation for digital signage or a corporate private network that requires global distribution. The possibilities go on and on. Think of the new revenue opportunities for us as broadcasters. We can start to supply content almost anywhere in the world. Naturally, content rights need to be geo-managed, as we often do today between East and West Coast operations within North America. The content we supply as cloud providers may be live streaming, file-based or both.

Reaping the benefits

To reap maximum benefit from this role reversal, we need to move beyond our traditional thinking of sending signals to a transmitter and to cable and satellite headends. Some of this has already started with mobile TV, where another program path is added and tailored for mobile users. Similar techniques can be used to pick up additional revenue streams by creating a digital signage business within your facility. TV stations can use their skill sets to produce new and innovative content that sells, going well beyond local news and sports production.

It is the traditional thinking that is hurting our growth. We still think there are separate functions within a facility that are based on both workflow and signal flow, such as post area, traffic and billing, playout, transmission, news, and graphics. In some cases, they actually are separate departments; in other cases, they are simply a mindset.

Technology now allows us to think beyond this. Just as we see optical I/Os and processing functions going directly into routing switchers, we will begin to see single platforms that help us provide multiple distribution formats, such as various flavors of SD or HD MPEG, with some on ASI and others directly as IP streams. A single platform will break down the barrier between playout/master control and transmission within a facility.

Additionally, as shown in Figure 4, this means broadcasters can start to take control of their own quality by providing streams and files directly in the format the distribution outlet needs. That's right: If one media outlet needs an SD MPEG-4 feed and another needs an HD MPEG-2 feed for a transmitter, a single platform can provide both. Sure, it may be stat-muxed along the way, but we have skipped the extra step of rate conversion that so often degrades quality. In doing so, we now have the ability to distribute some of our own content directly, not to cut the media outlets out of the loop but to enable services that generate new revenue streams.
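As a hypothetical sketch of what such a single-platform output map might look like, consider one mezzanine source feeding several outlet-specific profiles. The outlets, codecs, bit rates and the encode() placeholder below are all illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class OutputProfile:
    outlet: str
    codec: str          # e.g. "MPEG-2" or "MPEG-4/AVC" -- assumed examples
    resolution: str
    bitrate_mbps: float
    transport: str      # "ASI" or "IP"

# Illustrative profiles only; real values come from each outlet's delivery spec.
PROFILES = [
    OutputProfile("OTA transmitter",   "MPEG-2",     "1920x1080i", 19.0, "ASI"),
    OutputProfile("Cable headend",     "MPEG-4/AVC", "720x480i",    3.5, "IP"),
    OutputProfile("Mobile TV service", "MPEG-4/AVC", "416x240p",    0.6, "IP"),
]

def encode(source: str, profile: OutputProfile) -> None:
    """Placeholder for the platform's encoder stage; a real system would
    hand each profile to its transcoding/packaging chain here."""
    print(f"{source} -> {profile.outlet}: {profile.codec} "
          f"{profile.resolution} @ {profile.bitrate_mbps} Mb/s over {profile.transport}")

for p in PROFILES:
    encode("mezzanine_feed", p)
```

The design point is simply that every output is derived once from the house-quality source rather than re-compressed from someone else's emission feed.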

Summary

As broadcasters, we know content is king. For that matter, even “the network cloud people” know this. But we as broadcasters need to find new and better ways to integrate with the network cloud people and seek out more ways to distribute our content. More specifically, we need to develop new models that take advantage of networking, along with new equipment and software, to better serve new distribution types directly, all while still delivering premium signals.

Whether we like it or not, this cloud role reversal is happening. Let's embrace it, folding “their” clouds into “our” clouds. Doing so will allow us to take control of quality, drive distribution outlets directly and, most important, open up new revenue streams.

Stan Moote is vice president business development, Harris Broadcast.
