Fulldome 3D for Everyone
By Roberto Ziche, fddb The Fulldome Database
THE RESEARCH AND DEVELOPMENT
The Stereo Domemaster shader is the result of an innocent question asked by a friend at the Chabot Space and Science Center in the Oakland hills, California, where I had been volunteering for just a few months.
Was it possible to create 3D movies for a dome?
I immediately thought it would be possible, even though some obvious limitations were apparent right away. It seemed like a good challenge to take on and come up with a workable solution.
The first round of research into existing technology, and 3D imaging in general, found that some proprietary technology existed, but its adoption was very limited. The cost of enabling 3D projection in planetariums was high, but in my opinion it was the classic chicken-and-egg problem: as long as the production of 3D content was held back by the scarcity of tools, 3D on domes would never become mainstream.
So a free, open-source solution seemed like a good starting point. The missing production tool I chose to build was a lens shader for one of the most common 3D content creation pipelines: 3ds Max and mental ray.
A lens shader is a plug-in for mental ray that modifies the way a camera sees the scene. The way I wanted to use it was to overcome the limitations of the common two-camera rigs used in the production of traditional “flat screen” 3D movies.
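To make the idea concrete, here is a minimal sketch in plain C++ (my own illustration, not the actual mental ray shader API or the Domemaster shader source) of the kind of remapping a fisheye/domemaster lens shader performs: each pixel is turned into a ray direction on the hemisphere rather than the default pinhole-camera ray.

```cpp
// A minimal, hypothetical sketch of the remapping a domemaster-style lens
// shader performs: instead of the default pinhole-camera ray, each pixel
// gets a ray direction computed from an angular fisheye projection.
#include <cmath>
#include <optional>

constexpr double kPi = 3.14159265358979323846;

struct Vec3 { double x, y, z; };

// nx, ny: pixel position remapped to [-1, 1], with (0, 0) at the image center.
// fovDegrees: dome aperture, typically 180 for a full hemisphere.
std::optional<Vec3> domemasterRayDirection(double nx, double ny, double fovDegrees = 180.0)
{
    const double r = std::sqrt(nx * nx + ny * ny);
    if (r > 1.0)
        return std::nullopt;                        // outside the circular fisheye image

    const double phi   = std::atan2(ny, nx);        // azimuth around the optical axis
    const double theta = r * (fovDegrees * 0.5) * kPi / 180.0; // angle from the optical axis

    // Camera looks down +Z; the center of the image maps to the dome zenith.
    return Vec3{ std::sin(theta) * std::cos(phi),
                 std::sin(theta) * std::sin(phi),
                 std::cos(theta) };
}
```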
Armed with my high-school-level math and geometry knowledge, and after 15 years without writing a single line of C++ code, I enjoyed every single minute of those 6-7 months of research and development I squeezed out of my spare time.
THE DOME LIMITATIONS
3D on a flat surface, like your TV or movie theater screen, is the easiest form of 3D. I didn’t say “easy”, as it still requires a lot of understanding and tweaking at the capture (or rendering) and post-processing stages, but a flat screen is a limited viewing area with predictable viewing positions that can be readily understood and addressed.
The dome is a different beast.
Imagine a horizontal dome and, for the sake of simplicity, ignore the upper part and focus on the lower edge. It might help to think of a cylindrical screen, open at the top.
It’s easy to see that a standard fixed two-camera/two-projector setup wouldn’t work. If we simply extend a traditional screen and wrap it around the viewer, we create areas where the 3D effect is null because the cameras appear aligned.
Even worse, in the area on the back side (not shown in the illustration above) the effect is reversed, as the cameras effectively swap their positions.
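A rough numerical illustration of this, under my own simplifying assumptions (two fixed cameras separated along one axis, viewer at the center of the cylinder): the only part of the camera separation that produces parallax is the component perpendicular to the viewing direction, and that component falls off with the cosine of the viewing angle.

```cpp
// Hypothetical illustration of the wrap-around problem: with two cameras
// fixed along the X axis, the stereo baseline "seen" when looking toward
// azimuth theta shrinks with cos(theta), vanishes at +/-90 degrees, and
// becomes negative (left/right reversed) behind the viewer.
#include <cmath>
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;

int main()
{
    const double baseline = 6.5;  // assumed interocular distance, in cm
    for (int deg = 0; deg <= 180; deg += 45) {
        const double theta = deg * kPi / 180.0;
        // Component of the fixed camera separation perpendicular to the
        // viewing direction -- the part that actually produces parallax.
        const double effective = baseline * std::cos(theta);
        std::printf("azimuth %3d deg -> effective baseline %+6.2f cm\n", deg, effective);
    }
    // Prints +6.50 looking front, +0.00 at the sides, -6.50 at the back:
    // no depth at the sides, reversed depth behind.
    return 0;
}
```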
What’s the problem here? The rendering process can create the wrap-around image in one go (through a 360-degree lens shader), a common technique in 3D rendering, but in reality a human looking around a panorama turns their head. That difference is fundamental for the 3D effect to work.
If we could place many traditional 3D screens side by side, wrapping around the viewer, the effect would be maintained, because the camera/projector pairs would now rotate as the viewer’s head does. But this would require multiple renderings, one for each screen, or “slice”.
By reducing the width of each screen and increasing the number of slices, edge distortions and misalignments would be minimized, and with an infinite slicing of the cylinder we could create the perfect viewing experience.
Slicing is a common method used for real-life 3D capture of panoramic or dome images, where the number of slices has to stay manageable. But since we are talking about computer-generated images here, reaching an effectively infinite number of slices is perfectly possible, as sketched below.
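Here is a sketch of that limit case (again my own illustration under simplified cylindrical assumptions, not the shader’s actual source): in a renderer, every viewing direction can get its own pair of eye positions, offset perpendicular to that direction, which is what an infinite number of slices amounts to.

```cpp
// Sketch of the "infinite slices" idea: instead of two fixed camera
// positions, each rendered direction gets its own left/right eye
// position, offset perpendicular to that direction.  A lens shader can
// do this per pixel, something no physical two-camera rig can do.
#include <cmath>

struct Vec3 { double x, y, z; };

struct EyeRay {
    Vec3 origin;     // per-direction eye position
    Vec3 direction;  // viewing direction for this pixel
};

// azimuth: horizontal viewing angle for this pixel, in radians.
// eyeSeparation: interocular distance; leftEye selects which eye to render.
EyeRay stereoPanoramaRay(double azimuth, double eyeSeparation, bool leftEye)
{
    // Viewing direction in the horizontal plane (the cylindrical
    // simplification used in the thought experiment above).
    const Vec3 dir{ std::sin(azimuth), 0.0, std::cos(azimuth) };

    // Direction perpendicular to the view, still in the horizontal plane.
    const Vec3 right{ std::cos(azimuth), 0.0, -std::sin(azimuth) };

    // Offset the eye sideways; the pair rotates together with the view.
    const double offset = (leftEye ? -0.5 : 0.5) * eyeSeparation;
    const Vec3 origin{ right.x * offset, 0.0, right.z * offset };

    return EyeRay{ origin, dir };
}
```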
This is an extract from the first part of a five-part article:
Part 1 – Research and Development
Part 2 – Research and Development
Part 3 – Astronaut 3D – A Stereoscopic Fulldome Production
Part 4 – Domography + Creative Considerations
Part 5 – Maya and Softimage Fulldome Stereo Support
Source: fddb The Fulldome Database