Next-Gen Monitoring: You Just Late And See
You might not have noticed it, but the first part of Sherlock Holmes's most famous explanation to Dr. Watson usually gets ignored. Yes, this month's rant is on next-generation control room monitoring.
The great detective told his sidekick that when the impossible was eliminated, whatever remained, no matter how improbable, was it. Fair enough. But what happens when you can't eliminate the impossible?
Yes, I know, the history of TV technology is littered with supposedly impossible stuff that just turned out to be possible. Let's see. There was narrowband FM, which was impossible until Ampex used it in the first videotape recorder to get sold.
And methinks there was an NAB convention not too long ago (in a geological timescale sense) at which digital TV broadcasting was declared impossible on account of it requiring a channel at least 50 MHz wide.
So I know I'm sticking my neck way out here in declaring something impossible (and I can almost figure out a way around it if money is no object), but I've got the second law of thermodynamics burned into my last remaining neuron. I'll state it this way: Time doesn't go backwards. And I'd say that's a big problem for control room monitoring.
"But, Mario, there are thousands of control rooms today."
CRT BE GONE
Indeed there are. And, for the most part, they're using CRT-based monitors. That's the problem. No one wants to make them anymore.
You can go back before Philo Farnsworth to the first plans for CRT-based TV. There was a camera tube with an electron beam in it, a picture tube with an electron beam in it, and synchronized deflection circuitry to make sure the camera beam and the picture beam did everything at the exact same time, not counting the insignificant delay it took the signals to get from the camera to the display.
From Farnsworth in the late 1920s to the RCA CCD-1 sometime in the 1980s, that was TV. What happened at the camera happened on screen at about the same time.
Imaging chips screwed things up just a least-significant bit.
They took in the whole picture at once and then squirted it out a scanning line at a time. That led to the rubber-table-leg effect (pan a chip camera rapidly back and forth past a table leg, if you don't know what I mean). So pictures sometimes looked a wee mite strange, but at least they looked the same amount of strange to everyone.
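If you want to put a number on that wobble, here's a toy model of the mismatch: the chip grabs every line at one instant, but the CRT paints those lines one after another, so a tracked vertical edge gets smeared sideways during a pan. Every figure below is a made-up round number, not anybody's camera spec.

```python
# Toy model: a chip camera captures the whole frame at one instant,
# but a CRT displays it one scanning line at a time. If the eye is
# tracking a pan, each later line appears displaced a bit farther,
# so a straight table leg looks slanted -- reverse the pan and it
# slants the other way. Hence "rubber." All numbers are illustrative.

LINES_PER_FRAME = 480        # active lines, 480i-ish
FRAME_TIME_S = 1 / 29.97     # NTSC frame period
LINE_TIME_S = FRAME_TIME_S / LINES_PER_FRAME
PAN_SPEED_PX_PER_S = 600     # hypothetical pan rate across the screen

def apparent_offset(line):
    """Horizontal displacement of scan line `line` relative to line 0."""
    return PAN_SPEED_PX_PER_S * line * LINE_TIME_S

top_to_bottom_skew = apparent_offset(LINES_PER_FRAME - 1)
print(f"Top-to-bottom skew: {top_to_bottom_skew:.1f} px")  # about 20 px
```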
Then came LCDs and plasma TVs and DLPs and who knows what else. So a handful of viewers got washed-out pictures and strange colors and motion artifacts and low-brightness contouring or error-diffusion dots and other stuff like that there.
That wasn't a problem. Folks using those displays knew what they were getting into. And then those new display technologies got better. That's when the big problems began.
Let me rave for a second about the role of the CRT in TV monitoring. The NTSC color primaries were based on the phosphors in an early RCA picture tube. It ain't around anymore, and neither is NTSC green (except in those pesky old FCC regulations).
When you've been coming to the NAB show for more years than some of the floor walkers have been alive, you start to notice some changes. Ampex used to have one of the largest exhibits; now they've got one of the smallest. There ain't any RCA exhibit anymore (just an RCA TV in the DTV Hot Spot). No Gates, no CEI, no Vital, and no Conrac.
HDTV color primaries come from ITU-R Rec. 709. Those are based on older stuff eventually traceable to something called SMPTE C. The "C" stood for Conrac, the standard for control room monitoring for ages.
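For the record, here are the CIE x,y coordinates of the green primary in those three generations of standards, with a quick-and-dirty distance figure of my own to show how far the old FCC green sits from the other two (and how close SMPTE C and Rec. 709 sit to each other):

```python
from math import dist

# CIE 1931 x,y chromaticity of the green primary, per the
# published standards.
GREENS = {
    "NTSC 1953 (FCC)":     (0.21, 0.71),
    "SMPTE C (RP 145)":    (0.310, 0.595),
    "ITU-R Rec. 709 (HD)": (0.30, 0.60),
}

ref = GREENS["NTSC 1953 (FCC)"]
for name, xy in GREENS.items():
    print(f"{name:21s} x={xy[0]:.3f} y={xy[1]:.3f} "
          f"distance from NTSC green: {dist(ref, xy):.3f}")
```

SMPTE C and Rec. 709 greens land within about 0.011 of each other on that chart; the 1953 FCC green is more than ten times farther away from both.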
Now, then, back when LCDs couldn't match the colors of CRTs, there wasn't any big problem. If you needed to use an LCD for some reason, well, you knew you were making compromises. But now some folks, like Cine-tal and eCinema Systems, are making LCDs that do come awfully close to CRT colors.
If you want to stick one of those in front of your favorite "colorist," be my guest. But don't put it in my control room.
There's more to monitoring than color. Viewing angles ain't there yet, but I figure they will be someday, so that ain't my "impossible" part. There are some interlace and motion artifacts, and there are folks working on fixing those, too.
SLEEPER CELLS
No, the part I've got a problem with is latency: not how long it takes a liquid-crystal cell to switch from black to white and back, but how long it takes before that white or black (or gray or more saturated color) first appears. This hit me at the NAB show.
I wandered into the control room of the big mobile unit parked in Sonyland. The main monitors were CRTs. The camera monitors were LCDs, methinks, with maybe some vertically oriented plasmas along the sides, with someone's display controller breaking those up into smaller pictures.
They were playing something into most of the monitors. On the CRTs, it was in perfect lip-sync. On the LCDs, it looked like audio was leading just a wee bit. On the plasmas, there was no doubt that the audio was advanced.
It makes sense. Aside from whatever the plasma displays do on their own, the display controller takes some time to scale and arrange the different signals.
De-interlacing takes time. Picture-size change? More time. Color look-up tables? Add some more. Yes, I believe that someday, someone is going to be able to make an LCD or plasma look just like a CRT, but I ain't convinced it's going to happen without some fixed amount of delay.
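Here's a back-of-the-napkin tally of how those little delays pile up. Every number in it is hypothetical; I ain't quoting anybody's spec sheet.

```python
# Hypothetical per-stage delays in a flat-panel monitoring path.
# None of these numbers come from any real product.
FIELD_MS = 1000 / 59.94          # one 59.94 Hz field

stages_ms = {
    "de-interlace (field buffer)": 1 * FIELD_MS,
    "scale / multi-image layout":  1 * FIELD_MS,
    "color look-up table":         0.5,
    "panel response":              5.0,
}

total_ms = sum(stages_ms.values())
print(f"Total: {total_ms:.1f} ms "
      f"(~{total_ms / (2 * FIELD_MS):.1f} frames at 29.97 fps)")
```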
You can delay the audio to match, and delay the video to the faster monitors so they all match the slowest, but then, when the director yells "Take," is that going to be six frames late? Too bad time marches on.
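And here's the arithmetic behind that gripe, with invented latencies: pad every faster path (and the audio) up to the slowest display, and the whole control room ends up living that many frames behind the lens.

```python
# Hypothetical display latencies, in frames at 29.97 fps.
monitors = {"CRT": 0, "LCD wall": 2, "plasma strip": 6}

slowest = max(monitors.values())

# Pad every faster path (and the audio) up to the slowest one.
video_pad = {name: slowest - lat for name, lat in monitors.items()}
audio_pad = slowest

print("Added video delay per monitor:", video_pad)
print(f"Audio delay: {audio_pad} frames")
print(f"Director's 'Take' lands {slowest} frames "
      f"({slowest * 1000 / 29.97:.0f} ms) after the action did")
```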