At the Movies... Technology

As a fan of movies, and having completed my corollary sequence in Ohio University's School of Film, I have a great interest in movies and movie making. I'm always watching a movie and certainly appreciate the work involved in creative storytelling. Adding to my love of storytelling is the fact that the technology has now "evolved" (that might not be the word die-hard "film" people would use) into 4K and even 8K video. Besides my radio engineering career, I'm also a video engineer, currently converting a full facility from HD to 4K (technically UHD in this instance, but many people use the terms interchangeably). This Off the Beaten Path column is about the movies and the HD and 4K/UHD technology behind them.

First, a quick explanation of 4K/UHD video to get you up to speed. The easiest way to put it is that a UHD image is the equivalent of four HD images combined into one. The difference between 4K and UHD? 4K is what the movie people use for big displays in theaters; its aspect ratio is just a little wider than the standard TV widescreen ratio of 16:9, while UHD sticks to 16:9. Unlike common HDTV, 4K/UHD also has something thrown in called HDR, or High Dynamic Range. The interesting thing about HDR is that it's a bit like pre-emphasis in FM: if you use HDR with UHD/4K, you really need a monitor capable of decoding that HDR. Otherwise it's like encoding audio with dbx and then listening to it without running it back through a decoding circuit. It sounds (or in this case, looks) odd.
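If you want to see the pixel math behind that "four HD images" claim, here's a quick back-of-the-envelope sketch in Python. It's just my own illustration (not anything out of a standards document), using the common raster sizes: 1920x1080 for HD, 3840x2160 for UHD and 4096x2160 for theatrical 4K.

    # Back-of-the-envelope comparison of HD, UHD and theatrical 4K raster sizes.
    FORMATS = {
        "HD (1080p)": (1920, 1080),
        "UHD":        (3840, 2160),
        "DCI 4K":     (4096, 2160),
    }

    HD_PIXELS = 1920 * 1080

    for name, (w, h) in FORMATS.items():
        pixels = w * h
        print(f"{name:11s} {w}x{h}  aspect {w / h:.2f}:1  "
              f"{pixels / HD_PIXELS:.1f}x the pixels of HD")

Run it and you'll see UHD lands at exactly 4.0 times the pixels of HD with the same 1.78:1 (16:9) shape, while theatrical 4K comes out a touch wider at about 1.90:1.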

To take it a little further, an HDTV image streams over coax/BNC at 3 Gbps. To send 4K/UHD video over a single coax (or four of them if being done as "quad-link"), you need to push 12 Gbps. Oh, and the next generation of video, called 8K, requires 48 Gbps of bandwidth!
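Those link rates line up with the raw picture data pretty closely. Here's a rough sketch of the math, assuming 60 frames per second and 10-bit 4:2:2 sampling (which works out to 20 bits per pixel); those frame-rate and sampling assumptions are mine, and the actual SDI link rates run a little higher because they also carry blanking and ancillary data.

    # Rough uncompressed picture data rates, assuming 60 fps and 10-bit
    # 4:2:2 sampling (about 20 bits per pixel: 10 bits of luma plus
    # 10 bits of chroma on average). Real SDI links (roughly 3, 12 and
    # 48 Gbps) run a bit faster to carry blanking and ancillary data.
    BITS_PER_PIXEL = 20
    FPS = 60

    for name, w, h in [("HD 1080p", 1920, 1080),
                       ("UHD/4K",   3840, 2160),
                       ("8K",       7680, 4320)]:
        gbps = w * h * FPS * BITS_PER_PIXEL / 1e9
        print(f"{name:9s} ~{gbps:4.1f} Gbps of active picture")

That works out to roughly 2.5, 10 and 40 Gbps, which is why HD fits on a 3 Gbps link, UHD needs 12 Gbps (or quad-link 3 Gbps), and 8K pushes toward 48 Gbps.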

So that's the Cliff's Notes version of 4K/UHD, which is the current main display spec for movies. 8K is on the horizon, but on a typical consumer TV in a home, you won't see a big difference in the image unless your screen is extremely large, such as an 8-foot screen! Remember the days when a 25-inch RCA or Zenith console TV was considered a "big screen"?

The First Film Makers

Just like radio and TV history, film history becomes a little checkered and cloudy when you go back in time; who takes credit for what becomes a common issue with technology. Though Edison had a large influence in the U.S., Louis and Auguste Lumière ("The Lumière Brothers") are generally credited as the first true film makers. Louis had already created a successful photographic developing process as a teenager in 1881, and by 1895 the brothers had taken Edison's "peephole" film-viewing technique and given it the ability to be projected, which is the reason they're more frequently credited with inventing "film" (or something more closely resembling film as we know it today).

Early Movies

It's simply amazing to see where we've come from and where we're going with movie-making technology: from the earliest days of Edison and the Lumière brothers, to Charlie Chaplin, to Walt Disney and his multiplane camera, to George Lucas, and into today. We began with studios whose roofs were open because films needed more light than the lighting technology of the day could provide. Then came "rocking" sets. "Talkies" followed. Visual effects like forced perspective created size illusions, while matte paintings created incredible backgrounds. Now computer graphics and digital animation make "virtual everything" possible, from people to places, creating pretty much anything we want on film (from the truly believable to the unbelievable!). When you look at movie technology and consider all the invention behind it, a lot has happened in just over 100 years!

Technicolor

Here's an awesome video about the technology behind Technicolor. Technicolor was not the first color process available for movies, but it became the first great color film technology, and it held its own over the years and through the life of 35mm film.

Movie Mistakes

For fun, I've included this link about the worst on-set mistakes from Hollywood. Just like staying until the end of a Jackie Chan movie to see the outtakes, it's interesting to see things when they don't go right!

Movie Technology “Past Predictions”

This link is now six years old, but I included it because it was a look at the future of movie technology; now we can look back and see what came true and what hasn't. For instance, they talk about IMAX "with lasers," and the fact is that high-lumen projectors now incorporate laser technology! Long gone are the days of carbon-arc projection, and even xenon bulbs are starting to look like old technology. Instead of a single massive 6,000-watt bulb, projectors have transitioned to stacks of multiple bulbs (no more single point of failure from a "bulb blow-out") and to laser illumination. Just three years ago, I had the opportunity to see a 90,000 (!) lumen projector in Christie's digital labs in Kitchener, Ontario. Until then, 35,000 lumens was about the brightest output for a projector. To give you an idea of how bright 90,000 lumens is, if you stand close in the path of the light, dark clothing can start to smolder and ignite! Brightness hasn't pushed much past that point, likely due to the danger involved and the very limited need for something that bright, but laser technology itself has continued to advance.

More on Movie Technology

Since computers found their way into movie making, things have changed a lot. Here's another look at the technology today and where it's going.

Time to Think About 8K?

And here's a sneak peek at 8K video and its development. By the way, the next Olympics, in Japan, will be shot in 8K.

And finally ...

If you stumble across a good or unusual website that might be of interest, please don’t hesitate to send me the link and any info you might have about it. My email address is dan_slentz@yahoo.com.