Creating Volumetric Fog in 6DoF 360VR 3D Using KartaVR
This video shows the results of a KartaVR-based VR workflow R&D experiment titled “West Dover Forest Z360 Disparity Depth Stitch“. The end goal was to create a node-based 6DoF Stereo 3D scene in KartaVR with depth-based volumetric fog layered into the shot.
The forest scene started out with six fisheye views (three sets of stereo pairs) that were filmed using a Nodal Ninja panoramic head mounted on a Jasper Engineering 12″ stereo slidebar.
The fisheye footage was loaded into KartaVR and a node-based stitch was done. A disparity mapping based approach made it possible to generate a LatLong formatted depthmap that matched the LatLong color image data.
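The disparity-to-depth step above relies on the standard rectified-stereo relationship where depth is inversely proportional to pixel disparity. Here is a minimal sketch of that conversion; the baseline and focal length values are illustrative placeholders, not the actual calibrated numbers from the slidebar rig:

```python
def disparity_to_depth(disparity_px, baseline_m=0.3, focal_px=800.0):
    """Rectified stereo: depth = (baseline * focal length) / disparity.
    baseline_m and focal_px are hypothetical example values."""
    return baseline_m * focal_px / disparity_px

# Large disparities map to nearby surfaces, small disparities to far ones.
d_near = disparity_to_depth(120.0)  # -> 2.0 metres
d_far = disparity_to_depth(12.0)    # -> 20.0 metres
```

Running this per-pixel over the matched stereo views is what lets a disparity map be turned into the LatLong depthmap that accompanies the color stitch.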
Next, an omni-stereo 6DoF workflow was used in KartaVR where the color and depthmap based LatLong views were merged into a combined 360° over/under image layout, with the color view on the top and the depthmap view on the bottom.
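The over/under RGBZ layout is simple to picture in code. This sketch models tiny LatLong frames as lists of pixel rows and stacks the color view above the depthmap view; the function names are illustrative, not the KartaVR node API:

```python
def z360_merge(color_rows, depth_rows):
    """Stack the color LatLong view (top) over the depthmap view (bottom)."""
    assert len(color_rows) == len(depth_rows), "views must match in height"
    return color_rows + depth_rows

def z360_split(over_under_rows):
    """Recover the color (top half) and depth (bottom half) views."""
    h = len(over_under_rows) // 2
    return over_under_rows[:h], over_under_rows[h:]

color = [[(255, 0, 0)] * 4, [(0, 255, 0)] * 4]   # a 4x2 color frame
depth = [[0.5] * 4, [0.9] * 4]                   # the matching 4x2 depthmap
frame = z360_merge(color, depth)                 # combined 4x4 RGBZ frame
c, z = z360_split(frame)                         # round-trips losslessly
```

Packing both views into one frame is what lets a single image stream carry the RGBZ data through the rest of the node graph.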
The KartaVR based Z360Merge, Z360VRDolly, and Z360Stereo nodes made it easy to convert the over/under formatted RGBZ data into a standard stereo pair of left and right eye LatLong views that could be displayed on an HMD or viewed in the standard viewer windows with anaglyph glasses.
The thing that makes KartaVR’s Z360 based 6DoF omni-stereo tools so useful is that you can use a Z360VRDolly node to freely rotate the color+depthmap based panorama on the yaw/pitch/roll axes. With the color and depth data kept separate this way, the quality of the stereo depth is not degraded, since the final left and right eye color LatLong views haven’t been generated yet.
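The yaw part of that rotation shows why working on the pre-stereo panorama is lossless: in a LatLong (equirectangular) projection, a pure yaw is just a horizontal wrap-around shift of every row, applied identically to the color and depth halves. A minimal sketch (the function name is hypothetical, not the Z360VRDolly control set):

```python
def yaw_rotate_latlong(rows, yaw_degrees):
    """Rotate a LatLong image about the yaw axis by circularly
    shifting each pixel row; no pixel data is created or destroyed."""
    width = len(rows[0])
    shift = int(round(width * yaw_degrees / 360.0)) % width
    return [row[shift:] + row[:shift] for row in rows]

pano = [[0, 1, 2, 3]]                    # a one-row, 4-pixel panorama
rotated = yaw_rotate_latlong(pano, 90)   # 90° yaw = quarter-width shift
```

Pitch and roll need a full spherical remap rather than a row shift, but the same principle holds: the RGBZ data is only resampled, never baked into fixed eye views.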
From there, the rotated Z360 frame is routed from the Z360VRDolly node into a Z360Stereo node, and a traditional pair of left and right eye color LatLong frames can be generated. With Z360 you have the ability to adjust the virtual camera spacing and the convergence setting to make a comfortable-to-view stereoscopic 3D image.
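A Z360Stereo-style node is doing a form of depth-image-based rendering: each pixel is shifted horizontally by a disparity derived from its depth, the virtual camera spacing, and the convergence distance. This is a simplified single-row model of that idea, not KartaVR's exact math:

```python
def make_stereo_pair(color_row, depth_row, spacing=0.065, convergence=2.0):
    """Generate left/right eye rows from a color row plus its depthmap.
    spacing and convergence are illustrative defaults (metres)."""
    width = len(color_row)
    left = list(color_row)
    right = list(color_row)
    for x, (c, z) in enumerate(zip(color_row, depth_row)):
        # Disparity is zero at the convergence plane, positive in front
        # of it, negative behind it; scaled into whole pixels here.
        disparity = int(round(spacing * width * (1.0 / z - 1.0 / convergence)))
        left[(x + disparity) % width] = c
        right[(x - disparity) % width] = c
    return left, right
```

Widening the spacing exaggerates the depth effect, while moving the convergence distance shifts which surfaces sit at the screen plane; pixels exactly at the convergence depth get zero disparity and appear identical in both eyes.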
Finally, since this was an R&D workflow test, CG-rendered volumetric fog was added in KartaVR to the live-action stereoscopic 360° panoramic forest scene. The depthmap data was used to place the fog at the correct depth levels throughout the LatLong image, and the fog was keyframe-animated to blow through the 6DoF scene over time, so the varying fog density is visible as the video clip plays.
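One way to picture depth-placed fog is an exponential fog model, a common compositing choice (the video's exact fog math isn't documented, so this is an assumption-level sketch): each pixel is blended toward the fog color by a factor that grows with its depthmap value.

```python
import math

def apply_fog(color, depth, fog_color=1.0, density=0.5):
    """Blend a pixel toward fog_color using its depth value.
    fog_amount is ~0 at the camera and approaches 1 in the distance."""
    fog_amount = 1.0 - math.exp(-density * depth)
    return color * (1.0 - fog_amount) + fog_color * fog_amount

near = apply_fog(0.2, depth=0.1)    # nearby pixel, nearly unchanged
far = apply_fog(0.2, depth=10.0)    # distant pixel, almost pure fog
```

Animating the density (or a noise field modulating it) over time is what gives the keyframed look of fog blowing through the scene, while the per-pixel depth lookup keeps the fog sitting behind near trees and in front of far ones.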