This video shows a demo of a new approach for displaying 360° 6DOF Stereo VR imagery in Maya.
The “Z360 Maya Preview” example scene works by opening Z360 formatted imagery that is arranged in an over/under color + depthmap layout. When creating a Z360 formatted image, image sequence, or movie file, the 2D color equirectangular image is positioned at the top of the vertically stacked frame layout, and the depthmap is on the bottom of the frame:
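As a rough illustration of the over/under layout described above, the two halves of a Z360 frame can be separated with a simple split at the vertical midpoint. This is a minimal sketch, not part of the actual Z360 tools; the function name and the row-list representation of a frame are assumptions for the example:

```python
def split_z360_frame(frame):
    """Split a vertically stacked Z360 frame into its two halves.

    `frame` is a list of pixel rows: the top half holds the 2D color
    equirectangular image, the bottom half holds the depthmap.
    (Hypothetical helper for illustration only.)
    """
    if len(frame) % 2 != 0:
        raise ValueError("Z360 over/under frames have an even row count")
    mid = len(frame) // 2
    color_rows = frame[:mid]   # top: color equirectangular image
    depth_rows = frame[mid:]   # bottom: depthmap
    return color_rows, depth_rows


# Example: an 8-row frame splits into a 4-row color half and a 4-row depth half.
frame = [[("c", x, y) for x in range(4)] for y in range(4)] + \
        [[("d", x, y) for x in range(4)] for y in range(4)]
color, depth = split_z360_frame(frame)
```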
With a Z360 workflow you can create animated XYZ camera translations and rotations in post-production that remain omni-directional stereo compatible across the full 360° view.
This means you can create simulated slider rail / dolly based camera moves in your live action 360° stereo video scene with the exact amount of motion controlled using keyframe animation in a compositing package. You are free to dial in the amount of stereo camera IPD separation you want when the Z360 image is displayed on an HMD, or when it is processed into a traditional left and right view equirectangular image in a compositing package.
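To make the dial-in-the-IPD idea concrete: because the depthmap travels alongside the color image, a per-pixel horizontal stereo shift can be derived at display time from the viewer's chosen IPD and the stored depth. The following is a deliberately simplified depth-to-disparity sketch; the function name, the clamping behavior, and the linear `ipd / depth` model are assumptions for illustration, not the actual Z360 processing code:

```python
def disparity_shift(depth, ipd, focal_scale=1.0, max_shift=50.0):
    """Approximate the horizontal pixel shift for one eye from a depth sample.

    Simplified model: the shift grows with the chosen IPD and shrinks with
    scene depth, so near surfaces separate more than far ones. This is a
    hypothetical sketch of the idea, not the Z360 implementation.
    """
    if depth <= 0.0:
        return max_shift            # treat invalid depth as the nearest plane
    shift = focal_scale * ipd / depth
    return min(shift, max_shift)    # clamp extreme near-field disparity


# A nearer surface (or a larger IPD) yields a bigger stereo shift.
near = disparity_shift(depth=1.0, ipd=6.5)
far = disparity_shift(depth=10.0, ipd=6.5)
```

Because the shift is computed on playback rather than baked in, the same Z360 frame can serve any IPD value the viewer or compositor selects.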
In a Z360 system the stereo depth information isn’t baked into the left and right camera equirectangular views in advance.
This means you are free to apply extreme amounts of roll/pitch rotation correction without degrading the stereo depth quality in your panoramic stereo imagery. With Z360 you also don’t have to worry about stereo artifacts like cross-eyed stereo output or “brain shear”, which can occur when a traditional “non omni-directional stereo” view rotation effect is applied.
If you have the mOculus VR Plugin for Maya you can display the Z360 example scene in stereo live on your Oculus Rift or HTC VIVE HMD.
Download the Example Project
You can download the sample Z360 for Maya project here:
Note: Maya 2017 Update 3 or newer is required to use this scene file.
Read the Z360 for Maya Documentation
You can read the online documentation for the Z360 for Maya example project to learn more about the Z360 playback approach.
How do I Create Z360 Based Live Action Stereo 360° Footage?
The KartaVR for Fusion plugin is capable of stitching live action stereo 360° camera rig footage into the Z360 frame format using disparity mapping in Fusion Studio. You can also convert 2D panoramic images into Z360-based stereo 3D images using a manual rotoscoping approach.
If you want to work with live action 6DOF Stereo VR imagery the following KartaVR nodes are extremely useful in Fusion: Z360VRStereo, Z360VRDolly, Z360VRMesh3D. There is also a Z360Renderer3D node that can be used to render Fusion 3D workspace content directly into this stereoscopic VR frame format.