The Role of Lenses in Virtual Environments
Canon’s HJ14ex4.3B wide-angle portable HDTV lens features advanced data connections that facilitate the creation of dynamic, visually appealing virtual studio environments.
SEATTLE—When 3D video production was gaining steam several years ago, much was made of the need for precisely matching zoom lenses for 3D shooting. Without matched lenses—if, for example, the left-eye lens remained perfectly optically centered during a zoom but its companion right-eye lens had its optical center drift during the same zoom—the viewer would experience a Class 5 headache as his eyes and brain tried to resolve the difference between the stereo image pair.
3D has not become ubiquitous in everyday television production, but the increasingly popular augmented and virtual reality techniques used in sports and news production require similar optical and mechanical perfection in zoom lenses. Rather than matching a pair of lenses, as 3D requires, AR and VR require matching the lens performance to the perfect optical world created by the rendering engine that generates the background or effect.
“Lenses are definitely a critical element in virtual and augmented reality,” said Thom Calabro, director of marketing and product development at Fujifilm Optical Devices Division.
“The preeminent qualification [for a lens used in AR or VR] is the ability to actually provide information on focusing and zooming coordinates out of the lens to a very, very precise degree to the actual software that is running the virtual reality,” said Chuck Westfall, technical advisor, Professional Engineering & Solutions Division at Canon U.S.A.
Fujifilm and Canon both build box-style and portable zoom lenses with digital controls that feed precise zoom, focus and iris setting information out to rendering engines.
POSITIONAL FEEDBACK
Electronic lens controls are not new. Fujifilm, Canon and other lens makers have long made analog-controlled lenses. These remote-control lenses have been fitted to cameras such as the weatherproof-housed beauty-shot cameras placed atop downtown buildings and towers. The analog controls for these lenses allow operators to zoom, focus and adjust iris settings, but offer no positional feedback other than what the operator sees on a video monitor.
As analog controls gave way to digital controls, it became possible for lenses to precisely report their zoom, focus and iris settings for use in rendering engines. Encoded lenses report the position of the zoom, focus and iris controls, that is, the rotating barrels of the lens.
Canon has a long history of producing a component called a “laser rotary encoder,” according to Westfall. Both Canon and Fujifilm now provide 16-bit lens encoders. “The value of [16-bit encoding] is that it’s able to get the positioning of the focusing and zooming to an accuracy of 0.1 micron, which is basically microscopic,” Westfall said.
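To make the numbers concrete, here is a minimal sketch, in Python, of how a raw 16-bit encoder count might be turned into a normalized barrel position. The function and interface are assumptions made purely for illustration; they do not represent Canon’s or Fujifilm’s actual lens data protocols.

# Illustrative only: converting a hypothetical raw 16-bit encoder count
# into a normalized 0.0-1.0 barrel position. Real lens protocols differ.
ENCODER_BITS = 16
MAX_COUNT = (1 << ENCODER_BITS) - 1  # 65,535 discrete steps end to end

def normalize(raw_count: int) -> float:
    """Map a raw encoder count (0..65535) to a 0.0-1.0 position."""
    if not 0 <= raw_count <= MAX_COUNT:
        raise ValueError("count outside the 16-bit range")
    return raw_count / MAX_COUNT

# A zoom barrel reading of 32768 sits almost exactly at mid-travel.
print(f"{normalize(32768):.5f}")  # 0.50001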
Virtual Reality vs. Augmented Reality
An example of augmented reality.
Virtual reality and augmented reality are often confused, and there is no clear line between them.
They’re differentiated by the amount of virtual material that’s rendered in a particular shot. A totally rendered background with perhaps only a presenter and a desk would be an example of virtual reality, while a football first-down line or the insertion of advertising billboard material behind a baseball batter would be an example of augmented reality. In short, if most of the image is real, it’s AR; if most of the image is rendered, it’s VR.
—Craig Johnston
Calabro provided another means of measuring the accuracy afforded by 16-bit encoding. “Sixteen-bit means you have nearly 65,000 data points end to end” to report the settings of zoom, focus and iris on the lenses, Calabro said. The rendering engines take these readings once per frame (along with pan, tilt, elevation and pedestal positioning information from the camera support equipment) to know how to create a matching effect or background.
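As a rough illustration of that per-frame data flow, the sketch below models a single tracking sample of the kind a rendering engine might consume. Every field name and unit here is an assumption made for the example, not any vendor’s actual tracking-data format.

from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One hypothetical per-frame reading combining lens and support data."""
    frame: int          # video frame number
    zoom_count: int     # raw 16-bit lens encoder value, 0..65535
    focus_count: int    # raw 16-bit lens encoder value, 0..65535
    iris_count: int     # raw 16-bit lens encoder value, 0..65535
    pan_deg: float      # from the camera support head
    tilt_deg: float     # from the camera support head
    elevation_m: float  # pedestal/column height

# At 59.94 fps the engine would receive roughly 60 such samples per second,
# one per frame, and use them to render a matching background or effect.
sample = TrackingSample(frame=1, zoom_count=40112, focus_count=18250,
                        iris_count=52000, pan_deg=12.4, tilt_deg=-3.1,
                        elevation_m=1.45)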
NO CHANGE FOR 4K
Westfall’s 0.1 micron and Calabro’s nearly 65,000 data points have been put into context by a recent Vinten whitepaper on AR and VR lens-data requirements for 4K production, written by Andrew Butler, strategic planning and projects manager for the Videocom division at Vitec Group. He has done the math and states: “It is unlikely lens manufacturers will have to change the encoding performance in designing new 4K lenses.”
The encoders, however, measure the position of the motors, or lens barrels, not the optical glass or aperture leaves. Any delay or backlash between those controls and the actual movement of the associated lens components will result in a positional error in the data sent to the rendering engine. In the case of a football first-down line, for example, it might appear to move upfield or downfield by a few inches as the zoom is engaged.
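A back-of-envelope sketch of that effect, with all numbers assumed purely for illustration: if the lens actually covers a slightly wider swath of the field than the encoder data implies, a graphic anchored away from frame center lands a few inches off.

def apparent_shift_inches(dist_from_center_yd: float,
                          reported_fov_yd: float,
                          actual_fov_yd: float) -> float:
    """Apparent on-field shift of a graphic anchored dist_from_center_yd from
    frame center when the engine renders with reported_fov_yd across the frame
    but the glass actually covers actual_fov_yd (simple linear projection)."""
    shift_yd = dist_from_center_yd * (actual_fov_yd / reported_fov_yd - 1)
    return shift_yd * 36  # yards to inches

# Assumed example: a first-down line 8 yards off frame center, with the real
# field of view 1 percent wider than reported, appears about 2.9 inches off.
print(round(apparent_shift_inches(8, 20.0, 20.2), 1))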
In Fujifilm’s top-of-the-line digital lenses, Calabro reports: “The motors are improved, the gearing is better, and our clutch system basically takes all the play out of the two, so that they’re extremely accurate. They start where they’re supposed to start and they stop when they’re supposed to stop.”
Westfall spoke to another lens anomaly that would cause a difference between the actual image delivered from a lens and what the rendering engine thought the image would be: focus breathing. It’s a phenomenon that’s especially pronounced in lower-cost, lower-quality zoom lenses. When the focus ring is turned, the size of objects in the image increases or decreases.
“We have technology built into broadcast lenses where we’re able to combine the zoom and focus together so that as you change one or the other, the relative size of the subject remains proportional the way it should,” Westfall said.
Both Canon and Fujifilm make a point of noting that just as they didn’t make special lenses for 3D, they don’t make special virtual or augmented reality lenses. Their 16-bit digital broadcast lenses are all built with the optical, mechanical and digital encoding specs that meet the requirements for VR or AR production.