TV synchronization

In 1939, when General Sarnoff broadcast the first live television show at the World's Fair in New York, synchronization was critical to sending signals to the home. The tube-type sync generators were notoriously finicky. The pulses (horizontal, vertical and blanking, along with composite sync) ensured the cameras were making images in perfect synchronization. In fact, the entire system depended on a tight linkage between the scanning beam in the camera and the electron beam scanning the back of the display faceplate in perfect lock step. There were no electronic delays, and no videotape; tape wasn't practical until 1956, when Ampex made recorders that could mechanically lock rapidly spinning heads tightly enough to permit the electronics to correct the signal to roughly the same time base accuracy the sync generator provided for live sources. This absolute linkage between acquisition, transmission and display was a feature central to television for nearly 50 years.

Timing and sync progress

Think for a moment about the specifications required for the NTSC (color) system to be practical. Every signal had to be lined up vertically and horizontally, and even timed tightly enough to make the color subcarrier line up perfectly. The standard specifies 10 parts in 3.5 million as the tolerance for subcarrier frequency. However, at any one point in time, two signals needed to be within about two degrees of phase of the subcarrier running at 3,579,545.45Hz. That means the real instantaneous accuracy is routinely about 1.5ns of phase difference, often much less in well-controlled facilities.
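As a sanity check on that figure, the arithmetic is simple. Here is a short Python sketch, using the exact subcarrier frequency of 315/88 MHz:

```python
# NTSC color subcarrier: exactly 315/88 MHz (~3,579,545.45 Hz)
F_SC = 315e6 / 88

period_ns = 1e9 / F_SC                    # one subcarrier cycle (~279.4 ns)
two_degrees_ns = period_ns * 2.0 / 360.0  # two degrees of that cycle

print(f"Subcarrier period:    {period_ns:.1f} ns")
print(f"Two degrees of phase: {two_degrees_ns:.2f} ns")   # ~1.55 ns
```

Two degrees of a 279.4ns cycle works out to about 1.55ns, which is where the figure above comes from.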

The world changed with the advent of digital systems with significant amounts of buffering built in, allowing those difficult specs to become a thing of the past. Today, a production switcher doesn't need to be timed to a single nanosecond, but rather to plus or minus half a line or so. This amounts to about 85,000 times the timing error permissible just a few years ago. But in the end, that alone is little more than a curiosity. By using compression and pixel-mapped displays, we have effectively delinked the two ends of the production/consumption chain. There is no longer any direct correlation between the two ends except in the statistical sense. At any one instant you cannot say how many frames per second are being transmitted, for the length of a frame depends on how many bits it needs for adequate reconstruction of the intended image. A good transmission system may change the number of frames per second when film material is sent, instead of replicating the 3:2 pulldown required to fit 24-frame film material into a 30-frame video world. Just send the original 24 frames, and if the receiver needs 30 frames for the display, reconstruct them at the other end. This is much different from what General Sarnoff's engineers would have thought possible.
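To make the pulldown point concrete, here is a minimal Python sketch (illustrative only, not any particular product's algorithm, and ignoring field parity for clarity) of the 3:2 cadence and of how a receiver can discard it to recover the original 24 film frames:

```python
def pulldown_3_2(film_frames):
    """Map film frames onto fields, 3:2 cadence: 24 frames -> 60 fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # 3, 2, 3, 2, ... fields per frame
        fields.extend([frame] * repeats)
    return fields

def remove_pulldown(fields):
    """Recover the original film frames by walking the 3:2 cadence."""
    frames, i, hold = [], 0, 3
    while i < len(fields):
        frames.append(fields[i])
        i += hold
        hold = 2 if hold == 3 else 3
    return frames

film = list(range(4))                # four film frames = 1/6 second
fields = pulldown_3_2(film)          # -> 10 fields, i.e. 60 fields/s
assert remove_pulldown(fields) == film
```

Sending the 24 original frames and regenerating the cadence only at the display is exactly the delinking described above.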

But we live in a multiformat world today. First, the frame rate is really 30/1.001 to remain synchronous with legacy NTSC hardware. In a perfect world, HD would have been precisely 30fps, but after lengthy discussions in the 1980s, SMPTE acknowledged the difficulty of building a facility with frame rates so close, yet so far apart. The hardware needed to create 30 frames out of 29.97 is expensive and was deemed an unreasonable burden on future systems. But we also need to accommodate frame rates locked to PAL systems and to lock digital audio at a multitude of different clock speeds.
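Just how "close, yet so far apart" the two rates are is easy to show. A small Python sketch, using exact rational arithmetic:

```python
from fractions import Fraction

ntsc_rate = Fraction(30000, 1001)   # the exact "29.97" frame rate
exact_rate = Fraction(30, 1)

print(float(ntsc_rate))             # 29.97002997...

# Over one hour of real time, the frame counts diverge:
drift = (exact_rate - ntsc_rate) * 3600
print(float(drift))                 # ~107.89 frames per hour
```

That roughly 108-frame-per-hour discrepancy is precisely what drop frame time code papers over.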

We are still burdened with a time measurement system that was invented before the 30/1.001 issue. The result is drop frame and nondrop frame time code, which, by the way, is of barely any use in systems like 720p59.94 because it cannot uniquely number more than 30 frames per second. Now I am sure someone will point out the field flag in SMPTE 12M, but the point is that there has been a need for a time code that can correctly number (nominally) 60fps material for many years. Existing analog, audio-based time code doesn't adequately specify the date or a reference to a globally supported time standard either. Out of adversity comes creativity and progress.
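Drop frame, remember, drops no video; it skips frame numbers (00 and 01 at the start of each minute, except every tenth minute, for 108 labels per hour) so the displayed time tracks the wall clock. Here is a hedged Python sketch of the widely used count-to-time-code conversion; SMPTE 12M defines the cadence, while the code itself is only illustrative:

```python
def frames_to_dropframe_tc(frame_count):
    """Convert a frame count at 30000/1001fps to drop-frame time code.

    Drop frame skips labels 00 and 01 at the start of every minute,
    except minutes divisible by 10: 108 labels dropped per hour.
    """
    fps = 30                                     # nominal labeling rate
    frames_per_min = fps * 60 - 2                # 1798 labels per minute
    frames_per_10min = frames_per_min * 10 + 2   # 17982 per 10 minutes

    tens, rem = divmod(frame_count, frames_per_10min)
    frame_count += 18 * tens                     # 18 labels per 10 minutes
    if rem > 1:
        frame_count += 2 * ((rem - 2) // frames_per_min)

    ff = frame_count % fps
    ss = (frame_count // fps) % 60
    mm = (frame_count // (fps * 60)) % 60
    hh = frame_count // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"   # ';' marks drop frame

# One minute of real video (frame 1800) reads 00:01:00;02, not ...;00:
print(frames_to_dropframe_tc(1800))
```

At 59.94fps the same label scheme would have to cover twice as many frames, which is exactly why 12M's 0-29 count falls short.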

A new reference signal

A couple of years ago, SMPTE and the EBU started to create a new reference signal. Experts worldwide collaborated, and after much work, several critical elements of the next time and reference signal emerged.

First, the reference signal will be locked universally to the time signal used in GPS satellites, allowing precise information about time and date to be included.

Second, in a stroke of creative engineering, the committee determined that the references for all standards worldwide could be built from this single starting point in the following manner. If all signals had started at the same time, you could use the precise time of day to describe their current state anywhere, without needing multiple references for multiple frame rates. For example, if line one of field one of an NTSC signal began at midnight GMT on Jan. 1, and you knew the time precisely enough (enter GPS clocks), you could predict the phase of the color black signal at any time in the future with a little bit of math. Locking an oscillator locally to this time signal and calculating the startup conditions from the current local time creates a generator that you can trace back to the “big bang of sync signals.” That time, or epoch as it is called, has been selected as Jan. 1, 1958, per the SMPTE 404M draft:

“Time 00:00:00 of Wednesday, January 1, 1958 (00:00:00 being the midnight at the start of January 1). This corresponds to the origin for International Atomic Time (TAI) and Coordinated Universal Time (UTC). The corresponding Modified Julian Date (MJD) is 36204.”
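To see how little math is actually involved, here is a Python sketch, assuming you already have the elapsed TAI seconds since that epoch (which a GPS-disciplined clock can supply). For brevity it tracks only frame phase; a real generator would derive line, field and subcarrier phase the same way:

```python
from fractions import Fraction

NTSC_FRAME_RATE = Fraction(30000, 1001)   # exact frames per second

def ntsc_frame_phase(tai_seconds_since_epoch):
    """Position within the current NTSC frame (0 <= phase < 1),
    counting from the SMPTE epoch, 1958-01-01 00:00:00 TAI.

    TAI has no leap seconds, so elapsed time is a simple difference.
    """
    frames = Fraction(tai_seconds_since_epoch) * NTSC_FRAME_RATE
    return frames - (frames // 1)         # fractional part of a frame

# Example: exactly one day after the epoch, a generator should be
# this far into the current frame period:
print(float(ntsc_frame_phase(86400)))
```

Because the rates are exact rationals, the calculation never accumulates rounding error, no matter how many decades have elapsed.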

Thus, neatly, we can start clocks for AES, NTSC, PAL, MPEG or any other media-related signal using one convenient and well-known reference. The signal may be distributed in a number of ways, including, most significantly, via computer networks of known latency, and as an overlay on color black. In the future, if someone determines that an 81Hz frame rate with 2678 lines is appropriate, we can quickly relate it to the “SMPTE epoch” and deliver a new sync signal. Done, and without breaking any existing signal.
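One pleasant consequence: because every rate counts from the same instant, the re-alignment points between any two clocks fall out of simple arithmetic on their periods. A sketch using exact rationals, from which the familiar 8008 audio samples per five NTSC frames appears immediately:

```python
from fractions import Fraction
from math import gcd

def lcm_fraction(a, b):
    """Least common multiple of two rational periods."""
    num = a.numerator * b.numerator // gcd(a.numerator, b.numerator)
    return Fraction(num, gcd(a.denominator, b.denominator))

aes_period = Fraction(1, 48000)       # one 48kHz AES audio sample
ntsc_period = Fraction(1001, 30000)   # one NTSC frame

align = lcm_fraction(aes_period, ntsc_period)
print(align)                          # 1001/6000 s (~166.8 ms)
print(align / ntsc_period)            # 5 frames
print(align / aes_period)             # 8008 audio samples
```

Swap in an 81Hz period and the same two lines of arithmetic serve the hypothetical new format, too.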

John Luff is a broadcast technology consultant.

Send questions and comments to: john.luff@penton.com
