Building New Infrastructures for M&E

Netflix is the company that helped popularize the story of “Breaking Bad.” Like Walter White, the anti-hero of this tragic morality tale, Netflix has changed the way its product is made and delivered.

In the last four years, Netflix and other digital pioneers like Apple, Spotify and HBO have radically redesigned the backbone networks that the largest movie studios and TV networks use to feed content to their subscribers. Netflix has found a new way to cook up media and deliver it to the punters in a pure and powerful form. To paraphrase Tuco Salamanca, one of the distributors of Walter White’s content, Netflix’s product and its delivery are loved by consumers because they are “tight tight tight.”

Netflix has achieved this revolution by replacing the Content Delivery Network (CDN) capacity it used to lease as a service from established providers with its own, more powerful, self-built system that sits closer to the customer. In IT terms it is a network computer “at the edge.” The key technology that powers this CDN edge device is a specialized ConnectX 100 Gigabit Ethernet adapter. The upshot is that Netflix operates at hyperscale, allowing whole neighborhoods and towns to watch High Definition video with stunning clarity.

Key Trends to Watch

In the entertainment industry, so-called Over-The-Top (OTT) streaming services that send video over the internet are creating new ways for consumers to access video content. They allow “cord cutters” to cancel their traditional cable and satellite TV subscriptions and instead have their video streamed over their high-bandwidth internet connections. Consumers still need to subscribe to content, which is where the new players in this sector, such as Netflix, Amazon and Hulu, come in: they stream movies and TV programs directly to their customers’ homes. Even traditional broadcasters, like the BBC, are adopting streaming. The format of the streaming content is evolving too, with 4K and other ultra-high-definition formats becoming the norm.

These new digital streaming services offer on-demand video to millions of homes, with each consumer needing slightly different formats and data rates, and of course the ability to watch whenever they want. Unlike broadcasters, who transmit their programs to millions of viewers all at once, OTT services must customize their content and unicast the video to each and every customer. The digital video therefore has to be tuned in quality and streaming rate for tens of thousands of different home network connections and playback devices, creating a massive scalability challenge for the content providers. Live and recorded movies, sports, news and TV content can be streamed out in undreamt-of quantities, but only with infrastructure that can operate at hyperscale.
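As a rough illustration of what that per-viewer tuning involves, here is a minimal sketch of bitrate selection, assuming a hypothetical rendition ladder and a throughput estimate reported by the player. It is not any provider’s actual algorithm, just the shape of the decision each unicast session has to make.

```python
# Minimal sketch of per-viewer bitrate selection (hypothetical values, not
# any provider's actual algorithm). Each unicast session picks the highest
# rendition its measured network throughput can sustain.

# A simple "bitrate ladder": (label, video bitrate in kilobits per second)
LADDER = [
    ("240p", 400),
    ("480p", 1_000),
    ("720p", 3_000),
    ("1080p", 6_000),
    ("4K", 16_000),
]

def choose_rendition(measured_kbps: float, safety_margin: float = 0.8) -> str:
    """Return the best rendition that fits within the viewer's bandwidth.

    measured_kbps  -- throughput the player has observed for this session
    safety_margin  -- keep headroom so the stream survives small dips
    """
    budget = measured_kbps * safety_margin
    best = LADDER[0][0]                      # always fall back to the lowest rung
    for label, bitrate in LADDER:
        if bitrate <= budget:
            best = label
    return best

# Each home connection gets its own answer, which is why OTT delivery
# is unicast rather than one broadcast signal for everyone.
print(choose_rendition(2_500))   # -> "480p"
print(choose_rendition(25_000))  # -> "4K"
```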

Streaming infrastructures are built out today like giant power grids. Their capacity to serve tens of millions of people at once with individualized content takes them well beyond traditional broadcast TV networks, which send the same content to all of their customers at the same time.

This change in how entertainment is consumed has been catalysed by the Internet Protocol (IP), which gives streaming networks an advantage over traditional broadcast, cable and satellite TV networks: it is far better at delivering individualized content, monitoring consumers’ viewing habits and targeting customized digital advertising at individual viewers.

Streaming platforms will eventually be the best way to deliver live and on-demand video content. This is a practical response to a change in the way we access entertainment, which now happens mainly via the smartphones, iPads and smart TVs owned by millennials (18-35 year-olds). These devices are their preferred window on the world, even when they want to watch TV and movies.

The subscriber numbers of live and on-demand streaming services tell the story succinctly enough. Netflix today has more than 60 million subscribers worldwide and represents more than 37 percent of all the internet traffic in North America.

MLB is the only sports league with a streaming app in the top 10 streaming services. The league has an estimated 5 million streaming subscribers today, and the app generates $2-$3 billion of revenue each year. That’s almost one-third of its $10 billion in total revenue.

Third-party fantasy platforms like FanDuel and Yahoo have more than 50 million subscribers and sit at the center of a casual betting market valued at more than $10 billion. All of these streaming platforms are built on over-the-top IP infrastructures that are fast enough to serve tens of millions of mobile and smart TV users instantly.

The Internet is the Next Mass Medium

In 1997, when commercial audio and video streaming platforms were first introduced, media servers could deliver about 100-500 concurrent streams. It was obvious that streaming could become the next mass medium once it overcame its compute, storage and network limitations and scaled to thousands of streams. It took 20 years for the technology to reach full maturity.

Today a single server can manage as many as 100,000 concurrent consumer video streams. How? By shifting the distribution model from centralized data center servers to powerful CDN machines placed in co-location facilities right next to consumers’ “last mile” internet connections.

But these CDN boxes need lots of bandwidth, and the key enabling technology is the 25 and 100 Gigabit Ethernet network adapter. These CDN appliances are in fact powerful network computers that use the most advanced network adapter features to stream content at exactly the right quality and exactly the right rate to thousands of consumers, each of whom has slightly different requirements. These unique scalability challenges required Netflix and others to build their own modern CDN networks and to make them more responsive by placing them exactly where demand is highest.
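To picture how one edge machine keeps up with that many sessions, the sketch below uses Python’s socket.sendfile(), which hands the byte copying to the kernel. It is only an illustration of the zero-copy serving style these appliances rely on; the port, file path and single-connection flow are placeholders, not how Open Connect actually works.

```python
import socket

# Minimal sketch of kernel-assisted (zero-copy) file serving, the style an
# edge appliance relies on. socket.sendfile() lets the kernel move bytes from
# the file cache straight toward the NIC, so one box can feed many streams
# without shuffling data through user space. Address, port and file path are
# placeholders for illustration.

VIDEO_PATH = "sample_segment.mp4"   # hypothetical pre-encoded video segment

def serve_once(host: str = "0.0.0.0", port: int = 8080) -> None:
    with socket.create_server((host, port)) as srv:
        conn, addr = srv.accept()            # one viewer's unicast session
        with conn, open(VIDEO_PATH, "rb") as segment:
            # No per-chunk read()/write() loop in Python: the kernel does the copy.
            sent = conn.sendfile(segment)
            print(f"streamed {sent} bytes to {addr}")

if __name__ == "__main__":
    serve_once()
```

The point of serving this way is that the CPU barely touches the payload; the work shifts to the file cache and the network adapter, which is why fast NICs matter so much in these boxes.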

CDNs: Then and Now

Today’s online businesses want to give each customer personalized and dynamic content, because it creates a meaningful relationship between them. That means serving a long tail of content that is infrequently accessed but has a long internet shelf life.

Traditional CDN infrastructure was built for a previous generation of static material. Facebook, Netflix and others, however, have created user experiences based on content discovery and recommendation engines. Netflix’s movie recommendations, for example, offer instant interaction to 60 million subscribers. Facebook Live lets millions of users watch a live stream without any interruptions.

This type of dynamic content requires an interactive infrastructure that traditional CDNs cannot offer. To cater for millions of users and billions of interactions in a short time and on a planetary scale, you need local network computers.

Why can’t traditional CDNs cope? Because their modus operandi, storing content in cache memory and feeding it out on request, is a stopgap that was never designed for two-way exchanges.

They cannot cope when end-users begin interacting with applications, posting non-stop user-generated feeds, comments, pictures and updates.
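The stopgap described above is essentially a pull-through cache: keep recently requested objects in memory and go back to the origin only on a miss. A minimal sketch, with a hypothetical origin fetch and a deliberately tiny capacity, shows why this works well for a static catalogue and collapses for interactive traffic.

```python
from collections import OrderedDict

# Minimal sketch of the classic pull-through CDN cache: serve from memory if
# we can, otherwise fetch from the origin and remember the result. The origin
# fetch and the capacity are placeholders for illustration.

def fetch_from_origin(key: str) -> bytes:
    """Stand-in for an expensive round trip to the origin server."""
    return f"content-for-{key}".encode()

class PullThroughCache:
    def __init__(self, capacity: int = 3):
        self.capacity = capacity
        self._store = OrderedDict()           # key -> bytes, ordered by recency

    def get(self, key: str) -> bytes:
        if key in self._store:
            self._store.move_to_end(key)      # cache hit: mark as recently used
            return self._store[key]
        body = fetch_from_origin(key)         # cache miss: go back to the origin
        self._store[key] = body
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict the least recently used item
        return body

# Works well for a static catalogue, where the same objects are asked for
# again and again. Every unique, user-generated or live interaction is a
# fresh key, so the hit rate collapses and the origin takes the load.
cache = PullThroughCache()
cache.get("/movies/trailer.mp4")   # miss, then cached
cache.get("/movies/trailer.mp4")   # hit, served from memory
```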

Yes, newer CDN architectures like Open Connect, pioneered by Netflix, are designed to get more and more bandwidth out of a single box. However, video streaming in this newer architecture is particularly sensitive to network delays. Throwing more processing power at the problem provides temporary relief, but a network redesign is the only practical solution.

Tinkering with the software on content library servers produces incremental boosts in performance. But these are far surpassed by the introduction of new custom-built edge servers such as Open Connect.

In the CDN designed for Facebook Live, for example, 98 percent of user requests are handled by the edge servers, drastically reducing the load on the origin servers.
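To make that 98 percent figure concrete, here is the back-of-the-envelope arithmetic; the request rate used is an assumed round number for illustration, not a published figure.

```python
# Back-of-the-envelope view of edge offload. The 98% edge hit rate is the
# Facebook Live figure cited above; the request rate is an assumed round
# number, used only to show the scale of the reduction.

requests_per_second = 1_000_000
edge_hit_rate = 0.98

origin_requests = requests_per_second * (1 - edge_hit_rate)
print(origin_requests)   # 20,000 requests/s reach the origin instead of 1,000,000
```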

Conclusion

Traditional CDNs can’t handle dynamic workloads. The demands of modern consumer applications are beyond them because their architectures are outdated and their software-caching platforms bloated.

With next-gen applications, bandwidth alone is not enough. Networks must be predictable in order to deliver movies and TV reliably to millions of concurrent users without so much as a hiccup. That calls for reinforcements at the edge in the form of mega-powerful network computers, placed locally, with network adapters specialized to support streaming connections.

As a result, Hollywood and other content providers can scale from hundreds of thousands to millions of concurrent subscribers with a total bandwidth of up to 100 Gbps out of a single box. The 100 Gigabit Ethernet network adapter is a key enabler for such a beast, and it keeps the network “tight, tight, tight.”
