By Dave Stump, SMPTE Working Group on High Frame Rate Cinema
The SMPTE working group on High Frame Rate was established eight months ago, and is co-chaired by Michael Karagosian, Kommer Kleijn, a Belgian cinematographer, and me. The working group was formed partly in response to several studios in the creative community, which realized they would have to accommodate the eventual distribution of High Frame Rate (HFR) content. The studios distributing The Hobbit and Avatar 2 are keen to jump in and help work out solutions, but, in the end, the entire film community is going to have to grapple with the issues of HFR, including the exhibitors. Currently, SMPTE standards and recommended practices don't extend to higher frame rates or higher bit rates for compression, so the HFR working group was a necessary step toward that. There are currently 50 to 60 people in the working group, and a great deal of interest in this topic within the organization.
The working group is constructing a test plan and a shot list. We're trying to whittle that shot list down to something doable in size, without letting it grow exponentially. We hope to have help from the studios to sponsor some tests. We will need to test with the RED Epic, ARRI Alexa, and Sony F65; we'll test with all these cameras at a variety of frame rates, including 24, 25, 30, 48, 50 and 60 fps, as well as 72 and 120 fps, which will help to create a library of test material.
Our primary area of interest is going to be material that stresses compression, both JPEG 2000 and MPEG. That's because, ultimately, a DCP is a JPEG 2000 compressed file, and both JPEG and MPEG files are on our list of deliverables. What stresses compression is different for each scheme. For JPEG 2000, when you have fine detail in darker background areas behind faces in the foreground, the encoder tends to preserve the fine detail in the faces at the expense of the fine detail in the background areas.
MPEG has problems with fine detail in motion, across the frame or especially in rotation. If you took a camera and pointed it at a very large stadium crowd and rolled the camera as you panned it, you could really stress MPEG encoding.
I'm interested in testing all the parameters of high frame rate and their effects on 3D viewing. There are some motion artifacts that I'm interested in testing. For example, when footage is acquired synchronously, with both eyes captured at the same instant, and then played back asynchronously, with the eyes displayed one after the other, some subtle but funny things happen. With synchronously shot content played back asynchronously, objects moving across the frame change in depth, depending on which direction they are traveling and how fast they are moving across the frame. Objects moving from left to right can exhibit a different apparent z-depth than objects at the same distance from camera moving across the frame at the same speed, but from right to left.
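The direction-dependent depth shift described above can be sketched with a bit of arithmetic. This is a minimal illustration, assuming the two eyes are presented half a frame period apart; the function name, the assumed inter-eye delay, and the example speeds are all mine, not figures from the working group.

```python
# Sketch of the direction-dependent disparity error that arises when
# synchronously captured stereo footage is displayed with the eyes
# shown sequentially. All names and numbers here are illustrative
# assumptions, not measured values.

def disparity_error(velocity_px_per_s: float, frame_rate_hz: float) -> float:
    """Extra horizontal disparity (in pixels) picked up by an object
    moving across the frame, assuming the second eye is displayed half
    a frame period after the first. The sign follows the direction of
    travel, so left-to-right and right-to-left motion shift apparent
    depth in opposite directions."""
    inter_eye_delay_s = 1.0 / (2.0 * frame_rate_hz)  # assumed L/R offset
    return velocity_px_per_s * inter_eye_delay_s

# Same speed, opposite directions: equal and opposite disparity errors,
# so one object appears pushed back while the other is pulled forward.
print(disparity_error(480.0, 24.0))   # left-to-right at 24 fps -> 10.0 px
print(disparity_error(-480.0, 24.0))  # right-to-left at 24 fps -> -10.0 px
print(disparity_error(480.0, 120.0))  # same motion at 120 fps -> 2.0 px
```

Note how the error shrinks as the frame rate rises, which is one reason the interaction between frame rate and 3D playback is worth testing systematically.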
I'm concerned with HFR in both 3D and 2D, but by virtue of doing the testing in 3D, we end up with bonus 2D content as one eye of the stereo pair. This technique probably won't extend to 4K HFR testing, and I am looking forward to comparing 4K material shot through a 3D beam splitter to material shot clean. For now, we're going to try to do all the testing in 3D, because 3D is more relevant right now: all the big HFR movies coming out are in 3D.
Source: Creative COW