Standards: The Next Generation

The development and adoption of the SMPTE ST 2110 IP standards suite has been so prominent in both broadcast and pro AV that coverage of other standards projects has sometimes suffered in comparison. So the aim here is to cast the net as wide as possible and provide an overview of the current technical standards and specification initiatives set to have a major impact on the industry in the years ahead.

Now that the ST 2110 standards are firmly embedded in broadcast and beyond, SMPTE is working on projects for virtual production (VP) and AI. The SMPTE On-Set Virtual Production (OSVP) initiative is part of SMPTE’s Rapid Industry Solutions program, one of the ambitions of which is “to be an on-ramp to the definition of new SMPTE standards by bringing together experts to determine where interoperability gaps exist that are getting in the way of using new techniques and technologies,” says Jim Helman, CTO of MovieLabs and co-lead for OSVP.

At present, the OSVP’s working groups are focused on interoperability in the areas of camera and lens metadata, camera tracking metadata and color, alongside a group dedicated to education. In the interoperability areas, the groups are documenting best practices and providing metadata specifications and code to support them.

Jim Helman (Image credit: SMPTE)

Already, notes Helman: “The interoperability working groups have published metadata specifications for camera and lens metadata and packaged supporting code as camdkit. Additions are underway that define a tracking protocol called OpenTrackIO, which was presented at the SMPTE MTS conference in October, in a version 0.9 form and should soon be in full release. The OpenTrackIO work could also lead to new SMPTE standards, such as carriage on SMPTE ST 2110.”
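
To give a flavor of what such tracking metadata looks like in practice, here is a minimal Python sketch that parses a hypothetical OpenTrackIO-style JSON sample. The field names below are illustrative assumptions only, not the published schema, which is defined by the camdkit/OpenTrackIO documentation.

```python
import json

# Hypothetical OpenTrackIO-style sample: the field names are
# illustrative only, not the published schema.
sample = """
{
  "timing": {"frameRate": {"num": 24, "denom": 1}},
  "transform": {
    "translation": {"x": 1.20, "y": 0.35, "z": 1.75},
    "rotation": {"pan": 12.5, "tilt": -3.0, "roll": 0.0}
  },
  "lens": {"focalLength": 35.0, "focusDistance": 2.4}
}
"""

def summarize_tracking_sample(raw: str) -> str:
    """Parse a tracking sample and return a one-line summary."""
    data = json.loads(raw)
    t = data["transform"]["translation"]
    lens = data["lens"]
    return (f"camera at ({t['x']:.2f}, {t['y']:.2f}, {t['z']:.2f}) m, "
            f"focal length {lens['focalLength']} mm")

if __name__ == "__main__":
    print(summarize_tracking_sample(sample))
```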

Addressing AI
Also underway are three new standards addressing various aspects of AI: ST 2141 Metadata Generated by LLMs (large language models), which will define metadata fields for LLM-generated context; ST 2142 Embeddings for Metadata: Contextual and Non-Human Readable Fields, which will define the metadata required for embeddings, including generation context and model parameters; and ST 2143 AI Model Metadata and Creation of Centralised Model Registry, which will define a standardized metadata scheme and develop guidelines for metadata creation and management. SMPTE has also been collaborating with the Entertainment Technology Center (ETC) in a joint Task Force on AI in Media, whose latest report was published as SMPTE ER 1010:2023.
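
By way of illustration, the sketch below shows the kind of record an embeddings-metadata scheme in the spirit of ST 2142 might capture: which model produced an embedding and under what parameters. The field names here are purely hypothetical; the actual fields will be defined by the standard itself.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class EmbeddingMetadata:
    """Illustrative (non-normative) record of how an embedding was generated.

    Field names are assumptions for the sake of example; ST 2142 will
    define the actual metadata fields.
    """
    model_name: str        # identifier of the embedding model
    model_version: str     # version or checkpoint of the model
    dimensions: int        # length of the embedding vector
    source_asset_id: str   # asset the embedding was derived from
    generated_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    parameters: dict = field(default_factory=dict)  # model parameters used

record = EmbeddingMetadata(
    model_name="example-embedding-model",
    model_version="1.0",
    dimensions=768,
    source_asset_id="urn:example:asset:0001",
    parameters={"normalization": "l2"},
)
print(asdict(record))
```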

Thomas Bause Mason, SMPTE director of standards development, explains: “Although we are still in the early stages of AI adoption, with many individuals seeking to understand its implications, it is evident that AI will integrate into all facets of media creation, processing and distribution. As with previous major technological advancements, standards from organizations such as SMPTE and other Standards Development Organizations (SDOs) will play a crucial role in facilitating the adoption of interoperable and secure solutions within the media industry.”

Thomas Bause Mason (Image credit: SMPTE)

Being mindful of the “ethical component” will continue to be a crucial element here: “It is imperative to ensure that AI applications are developed within ethical boundaries to protect consumers from biases and misinformation, and to guarantee equitable access to AI technologies,” Bause Mason added. “Industry-wide frameworks established by standards organizations can provide environments conducive to the development of ethical AI applications.”

AV1 and IPMX Updates
The high efficiency and versatility of the AV1 video encoding standard generated plenty of headlines before the COVID-19 pandemic, so it seems a timely moment to check in with the Alliance for Open Media about the codec’s current phase of adoption.

According to a spokesperson: “AV1 is becoming increasingly integral to daily life, reducing streaming and storage costs. [For example] approximately 95% of Netflix’s catalog is encoded with AV1; over 50% of YouTube’s catalog (weighted by watch time) is available in AV1; and more than 70% of Meta Reels (by watch time) on iOS utilize AV1. The widespread adoption of AV1, supported by tens of millions of AV1-enabled Intel CPUs and GPUs, underscores the growing importance of this codec in delivering superior video experiences across platforms.”
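
For readers who want to experiment, the following is a minimal sketch of encoding a file to AV1 from Python by shelling out to FFmpeg’s SVT-AV1 encoder. It assumes an FFmpeg build with libsvtav1 enabled, and the filenames, CRF and preset values are illustrative starting points rather than recommendations.

```python
import subprocess

def encode_av1(src: str, dst: str, crf: int = 35, preset: int = 8) -> None:
    """Encode `src` to AV1 using FFmpeg's SVT-AV1 encoder.

    Assumes an FFmpeg build with libsvtav1 enabled; the crf/preset
    values are illustrative starting points, not recommendations.
    """
    cmd = [
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", "libsvtav1",
        "-crf", str(crf),
        "-preset", str(preset),
        "-c:a", "copy",  # leave the audio track untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Hypothetical filenames for illustration.
    encode_av1("input.mp4", "output_av1.mkv")
```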

Meanwhile, the rollout of ST 2110 continues apace, not least through the IPMX standards, which are based on ST 2110 but add features and capabilities to address the specific needs of broadcast and pro AV, such as asynchronous audio and video support suited to AV and live production. At ISE in February, the Alliance for IP Media Solutions (AIMS) highlighted a host of new IPMX-compatible solutions, including an NMOS controller, AES67 speaker and Dante-to-IP adapter.
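
IPMX builds on the AMWA NMOS suite for discovery and connection management. As a rough illustration, the sketch below queries an NMOS IS-04 registry for its registered nodes; the registry address is a made-up assumption, and the supported API version will vary by deployment.

```python
import json
from urllib.request import urlopen

# Hypothetical registry address; real deployments advertise the registry
# via DNS-SD or static configuration.
REGISTRY = "http://registry.example.local:8080"

def list_nodes(registry: str = REGISTRY, version: str = "v1.3") -> list:
    """Return the nodes registered with an NMOS IS-04 Query API."""
    url = f"{registry}/x-nmos/query/{version}/nodes"
    with urlopen(url, timeout=5) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    for node in list_nodes():
        print(node.get("id"), node.get("label"))
```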

Andrew Starks, chair of the AIMS marketing workgroup and director of product management at Macnica, is also closely involved with the IPMX roadmap. He confirms that 2025 will be a big year for IPMX:

“In March, the VSF [Video Services Forum] hosted an IPMX testing event focused on key standards, including timing, video and audio. This was the third round of testing for IPMX, but the first using an official test plan designed to serve as the technical framework for validating IPMX equipment. The goal was to finalize the standards documents and refine the testing process so that by the next event in August, manufacturers will have everything they need to bring IPMX-compliant products to market.”

In particular, he highlights two current documents from the VSF: the IPMX Test Plan, VSF TP-10-1, which outlines the tests manufacturers need to pass for their products to qualify as IPMX-compliant; and the recently released VSF TR-10-9, which defines IPMX requirements for system environments and device behavior, ensuring that IPMX devices interact properly with each other as well as with ST 2110 and AES67 devices.

“From the outside, it can seem like open standards take forever to develop—especially compared to proprietary technologies,” says Starks. “So we get why some people are eager to see IPMX fully available. But when you consider the amount of testing, documentation and collaboration involved, things are actually moving fast with very few delays. We’re right where we expected to be, and the momentum is strong.”

That last sentiment, it seems, is one that can be applied to many areas of standards development at the moment.

David Davies