New Angles on 2D and 3D Images

Shooting stereoscopic 3D involves many parameters: magnification, interaxial distance, toe-in angle (which can be zero), image-sensor-to-lens-axis shift, etc. To all of those, must we now add shutter angle (or exposure time)? The answer seems to be yes.

Unlike other posts here, this one will not have many pictures.  As the saying goes, “You had to be there.” There, in this case, was the SMPTE Technology Summit on Cinema (TSC) at the National Association of Broadcasters (NAB) convention in Las Vegas last month.

The reason you had to be there was the extraordinary projection facilities that had been assembled. There was stereoscopic, higher-than-HDTV-resolution, high-frame-rate projection. There was even laser-illuminated projection, but that will be the subject of a different post.

The subject of this post is primarily the very last session of the SMPTE TSC, which was called “High Frame Rate Stereoscopic 3D,” moderated by SMPTE engineering vice president (and senior vice president of technology of Warner Bros. Technical Operations) Wendy Aylsworth. It featured Marty Banks of the Visual Space Perception Laboratory at the University of California – Berkeley, Phil Oatley of Park Road Post, Nick Mitchell of Technicolor Digital Cinema, and Siegfried Foessel of Fraunhofer IIS.

You might not be familiar with Park Road Post. It’s located in New Zealand — a very particular part of New Zealand, within walking distance of Weta Digital, Weta Workshop, and Stone Street Studios. If that suggests a connection to the Lord of the Rings trilogy and other movies, it’s an accurate impression. So, when Peter Jackson chose to use a high frame rate for The Hobbit, Park Road Post arranged to demonstrate multiple frame rates. Because higher frame rates mean less exposure time per frame, they also arranged to test different shutter angles.

Video engineers are accustomed to discussing exposure times in terms of, well, times: 17 milliseconds, 10 milliseconds, etc. Percentages of the full-frame time (e.g., 50% shutter) and equivalent frame rates (e.g., 90 frames per second, or 90 fps) are also used. Cinematographers have usually expressed exposure times in terms of shutter angles.

Motion-picture film cameras (and some electronic cameras) have rotating shutters. The shutters need to be closed while the film moves and a new frame is placed into position. If the rotating shutter disk is a semicircle, that's said to be a 180-degree shutter, exposing the film for half of each frame time. By adjusting a movable portion of the disk, smaller openings (120-degree, 90-degree, etc.) may be easily achieved, as shown above <http://en.wikipedia.org/wiki/File:ShutterAngle.png>.
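The relationship between shutter angle, frame rate, and exposure time is simple arithmetic: the exposure is the fraction of the circle that is open, multiplied by the frame time. A minimal sketch (the function name is my own, not from any camera API):

```python
def exposure_time(fps, shutter_angle):
    """Exposure time in seconds for a rotary shutter.

    A shutter open for `shutter_angle` degrees out of 360 exposes
    the frame for that fraction of the frame time (1 / fps).
    """
    return (shutter_angle / 360.0) / fps

# A 180-degree shutter at 24 fps exposes each frame for half the
# frame time: 1/48 of a second, roughly 20.8 milliseconds.
print(exposure_time(24, 180) * 1000)
```

This is why the same exposure can be quoted three ways: a 180-degree shutter at 24 fps, a 50% shutter, or a 1/48-second exposure all describe the same thing.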

Certain things have long been known about shutters: The longer they are open, the more light gets to the film. For any particular film stock, exposure can be adjusted with shutter, iris, and optical filtering (and various forms of processing can also have an effect, and electronic cameras can have their gains adjusted). Shorter shutter times provide sharper individual frames when there is motion, but they also tend to portray the motion in a jerkier fashion. And there are specific shutter times that can be used to minimize the flicker of certain types of lighting or video displays.

That was what was commonly known about shutters before the NAB SMPTE TSC. And then came that last session.

Oatley, Park Road Post’s head of technology, showed some tests that had been shot stereoscopically at various frame rates and shutter angles. The scene was a sword fight, with flowing water and bright lights in the image. Some audience members noticed what appeared to be a motion-distortion problem. The swords seemed to bend. Oatley explained that the swords did bend. They were toy swords.

That left the real differences. Sequences were shown that were shot at different frame rates and at different shutter angles. As might be expected, higher frame rates seemed to make the images somewhat more “video” like (there are many characteristics of what might be called the look of traditional film-based motion pictures, and one of those is probably the 24-fps frame rate).

At each frame rate, however, the change from a 270-degree shutter angle to 360-degree made the pictures look much more video like. The effect appeared greater than that of increasing frame rate, and it occurred at all of the frame rates.

Foessel, head of the Fraunhofer Institute’s department of moving picture technologies, also showed the effects of different frame rates and shutter angles, but they were created differently. A single sequence at a boxing gym was shot with a pair of ARRI Alexa cameras in a Stereotec mirror rig, time synchronized, at 120 fps at a 356-degree shutter.

When the first of every group of five frames was isolated, the resulting sequence was the equivalent of material shot at 24 fps with a 71.2-degree shutter (the presentation called it 72-degree). If the first three of every five frames were combined, the result was roughly the equivalent of material shot at 24 fps with a 213.6-degree shutter (the presentation called it 216-degree). It’s roughly equivalent because there are two tiny moments of blackness that wouldn’t have been there in actual shooting with the larger shutter angle.
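The equivalence Foessel described can be checked with a few lines of arithmetic: keep some number of frames out of each group, sum their exposure, and express that exposure as a shutter angle at the slower frame rate. A sketch, assuming the 120-fps, 356-degree capture described above (the function name is illustrative, not from the presentation):

```python
def equivalent_angle(capture_fps, capture_angle, frames_kept, group_size):
    """Equivalent shutter angle at the derived (slower) frame rate when
    keeping `frames_kept` of every `group_size` frames.

    Each captured frame is exposed for (capture_angle / 360) / capture_fps
    seconds; the derived frame rate is capture_fps / group_size.
    """
    derived_fps = capture_fps / group_size
    total_exposure = frames_kept * (capture_angle / 360.0) / capture_fps
    return total_exposure * derived_fps * 360.0

# First of every five 120-fps frames -> 24 fps, 356/5 = 71.2 degrees.
print(equivalent_angle(120, 356, 1, 5))
# First three of every five frames -> 24 fps, 3 * 356/5 = 213.6 degrees.
print(equivalent_angle(120, 356, 3, 5))
```

The combined version is only "roughly" equivalent, as noted above, because the two closed-shutter gaps between the three captured frames would not exist in a genuine single exposure at the larger angle.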

As shown above, the expected effects of sharpness and motion judder were seen in the differently shuttered examples. But there was another effect. The stereoscopic depth appeared to be reduced in the larger-shutter-angle presentation.

Foessel had another set of demonstrations, as shown above. Both showed the equivalent of 60-fps with roughly a 180-degree shutter angle, but in one set the left- and right-eye views were time coincident, and in the other they alternated. In the alternating one, the boxers’ moving arms had a semitransparent, ghostly quality.

The SMPTE TSC wasn’t the only place where the effects of angles on stereoscopic 3D could be seen at NAB 2012. Another was at any of the many glasses-free displays. All of them had a relatively restricted viewing angle, though that angle was always far greater than the viewing angle of the only true live holographic video system ever shown at NAB.

That system was shown at NAB 2009 by NICT, Japan’s National Institute of Information and Communications Technology. It was smaller than an inch across, had an extremely restricted viewing angle, and, as can be seen above, was not exactly of entertainment quality. It also required an optics lab’s equipment for both shooting and display.

At NAB 2012, NICT was back. As at NAB 2009, they showed a multi-sensory apparatus, but this time it added 3D sound and olfactory stimulus. And, as at NAB 2009, they offered a means to view 3D without glasses.

This time, however, as shown above, instead of being poor quality, it was full HD; instead of measuring less than an inch, it measured 200 inches; and, instead of having the show's most-restricted no-glasses-3D viewing angle, it had the broadest. It also had something no other glasses-free 3D display at the show offered: the ability to look around objects by changing viewing angle, as shown below.

There was another big difference between the 2009 live hologram and the 2012 200-inch glasses-free system. There were no granite optical tables, first-surface mirrors, or lasers involved. The technology is so simple, it was explained in a single diagram (below).

Granted, aligning 200 projectors is not exactly trivial, and the rear-projection space required currently precludes installation in most homes. Despite its prototypical nature, however, NICT’s 200-inch, glasses-free, look-around-objects 3D system could be installed at a venue like a shopping center or sports arena today.

Of course, there is something else to be considered. The images shown were computer graphics. There doesn’t seem to be a 200-view camera rig yet.
