Steak Underwater "Reactor" package manager support was added, along with a new full-featured KartaVR freeware license that allows commercial use of the VR tools for $0.
macOS users of KartaVR can run the new "Video Snapshot" tool, which allows Fusion to capture live-action footage from HDMI/SDI/USB video sources to disk. The captured media is accessed inside Fusion using a managed Loader node that can be added to the foreground comp with a single click inside the "Video Snapshot" window.
The Video Snapshot tool could be used for stop-motion animation work. A VFX supervisor could use it to grab footage from a video camera to help with on-set production comp-viz work. An XR media producer could run a fast node-based 360VR stitching test in Fusion to verify that the footage being captured on location can be fine-stitched in post without any show-stopping issues.
Added an AcerWMRStereoRenderer3D Renderer3D macro that creates stereoscopic 3D 2880x1440px output from the Fusion 3D system. That interactively rendered output can be displayed directly on an Acer Windows Mixed Reality HMD on macOS/Win/Linux via a floating image view.
Added a ViewerAcerWMR2StereoOU node for displaying panoramic images on an Acer Windows Mixed Reality HMD on macOS/Win/Linux via a floating image view.
Added a DaVinci Resolve compatible set of example Fusion page compositing project files that can be accessed in the PathMap folder:
Reactor:/Deploy/Comps/KartaVR/KartaVR Example Comps.drp
Added Virtual Production tools that support exporting Fusion 3D workspace elements such as point clouds, cameras, and meshes to XYZ ASCII (.xyz), PLY ASCII (.ply), Maya ASCII (.ma), and Pixar OpenUSD ASCII (.usda) formats.
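Of these formats, XYZ ASCII is simple enough to sketch in a few lines: one space-separated "x y z" triplet per line. This is an illustrative writer for the file format only, not KartaVR's actual exporter:

```python
def write_xyz(points, path):
    """Write a point cloud as XYZ ASCII: one 'x y z' triplet per line.

    Minimal sketch of the format; KartaVR's own exporter also handles
    the richer PLY/Maya/USD ASCII formats.
    """
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"{x:.6f} {y:.6f} {z:.6f}\n")
```

A cloud written this way can be opened directly by most point-cloud viewers and DCC import tools.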
KartaVR now includes a collection of "Z360" nodes that work with panoramic 360° depthmap data, allowing you to create 6DoF stereo VR output inside Fusion. As part of this new 6DoF workflow, KartaVR also supports using Fusion Studio's "Disparity" node with the Z360 toolset to extract depth information from your live-action camera rig footage.
The Z360VRDolly node allows you to animate omni-directional-stereo-compatible XYZ rotation and translation effects inside an equirectangular 360°×180° panoramic image projection. This means you can now create slider-dolly-style motions in post-production from your stereo imagery.
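The rotation half of this is easy to illustrate: a pure yaw in an equirectangular projection amounts to a horizontal wrap-around shift of every pixel row, while pitch/roll and translation require full depth-aware reprojection. A minimal sketch of the yaw case only, not the node's actual implementation:

```python
def yaw_rotate_equirect(image, degrees):
    """Yaw-rotate an equirectangular image, given as a list of pixel rows.

    A pure yaw maps to a horizontal wrap-around shift of each row;
    pitch, roll, and XYZ translation need full reprojection instead.
    """
    w = len(image[0])
    # Convert the yaw angle to a whole-pixel shift, wrapping at 360°.
    shift = int(round(degrees / 360.0 * w)) % w
    return [row[-shift:] + row[:-shift] for row in image]
```

A 360° yaw shifts every row by exactly one full width, returning the original image.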
The Z360Stereo node makes it easy to convert over/under formatted color and depthmap data into a pair of new left- and right-eye stereo camera views.
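The over/under layout itself is simple to sketch: the two eye views are stacked vertically in one frame. This illustration assumes the common top=left-eye convention; it is not the Z360Stereo node's internals, which also perform the depthmap-driven view generation:

```python
def split_over_under(frame):
    """Split an over/under stereo frame (a list of pixel rows) into two views.

    Assumes the common top=left-eye, bottom=right-eye layout; some
    pipelines use the opposite convention, so check your footage.
    """
    h = len(frame) // 2
    return frame[:h], frame[h:]
```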
The Z360Mesh3D node takes the color + depthmap image data and creates a displaced environment sphere that lets you explore a simulated real-time volumetric VR version of the scene in Fusion's 3D workspace. Since the Z360Mesh3D node creates real geometry in the scene that updates per frame, you can move around freely with full XYZ rotation and translation controls. With this approach you can also place Fusion-based Alembic/FBX/OBJ meshes inside the same 3D scene, or add photogrammetry-generated elements, too.
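The displacement idea can be sketched directly: each equirectangular pixel defines a direction on the sphere, and the depth value scales that direction into a 3D vertex position. The coordinate conventions below are an illustrative assumption, not the node's internals:

```python
import math

def equirect_depth_to_points(depth, width, height):
    """Turn an equirectangular depthmap (row-major list of rows) into
    displaced sphere vertices: each pixel's latitude/longitude direction
    scaled by its depth value.

    Axis conventions here (y up, z forward) are an assumption for
    illustration only.
    """
    points = []
    for j in range(height):
        # Latitude runs from +pi/2 at the top row to -pi/2 at the bottom.
        lat = math.pi * (0.5 - (j + 0.5) / height)
        for i in range(width):
            # Longitude wraps the full 360° horizontally.
            lon = 2.0 * math.pi * ((i + 0.5) / width) - math.pi
            d = depth[j][i]
            x = d * math.cos(lat) * math.sin(lon)
            y = d * math.sin(lat)
            z = d * math.cos(lat) * math.cos(lon)
            points.append((x, y, z))
    return points
```

With a constant depth the result is an ordinary sphere; varying depth per pixel is what produces the volumetric displacement.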
The Z360DepthBlur node allows you to apply depth-of-field lens blurring effects to your panoramic imagery based on the Z360 depthmap data.
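For intuition, a toy model of depth-driven defocus: the blur radius grows with a pixel's distance from the focal depth, and each pixel is averaged over that radius. This is a simplified 1-D grayscale sketch, not KartaVR's actual algorithm, which operates on full panoramic images:

```python
def depth_of_field_row(colors, depths, focus, scale):
    """Apply a simple 1-D depth-driven box blur to a row of gray values.

    Each pixel's blur radius is proportional to |depth - focus|; 'scale'
    stands in for aperture size. Illustrative model only.
    """
    out = []
    n = len(colors)
    for i in range(n):
        r = int(scale * abs(depths[i] - focus))
        lo, hi = max(0, i - r), min(n, i + r + 1)
        window = colors[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

Pixels at the focal depth get a zero radius and stay sharp, while pixels far from it are averaged over a wide window.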
KartaVR has a new Send Media to Photoscan script that helps people working with photogrammetry (image-based modeling) workflows. This script instantly creates an AGI Photoscan project file from your selected Fusion Loader/Saver imagery. This makes for an efficient pipeline that allows you to key your greenscreen photogrammetry footage using Primatte in Fusion and then process the footage in AGI Photoscan with geometry-based alpha masking.
There is an accompanying Send Media to Photoscan YouTube video tutorial that shows the new toolset in action using studio shot footage.
A pair of nodes, ImageGridCreator and ImageGridExtractor, create and extract image sequences from a tiled image grid layout. This is handy if your photogrammetry or lightfield source imagery comes in a combined "sprite atlas" style image grid layout.
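The extraction side is a straightforward cropping loop: slice the grid into rows and columns and emit the tiles in row-major order. A minimal sketch of the idea (not the node's implementation):

```python
def extract_grid_tiles(image, rows, cols):
    """Extract tiles from a grid-layout image (a list of pixel rows).

    Returns the tiles in row-major order as a flat list; each tile is
    itself a list of pixel rows. Assumes the image divides evenly.
    """
    h = len(image) // rows      # tile height in pixels
    w = len(image[0]) // cols   # tile width in pixels
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tile = [row[c * w:(c + 1) * w] for row in image[r * h:(r + 1) * h]]
            tiles.append(tile)
    return tiles
```

The ImageGridCreator direction is the inverse: paste a sequence of frames back into one atlas in the same row-major order.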
Dig into the Example Projects
KartaVR now includes 72 Fusion example projects. Each one contains detailed descriptions of a panoramic compositing workflow. Explore the projects and learn new techniques that will take your VR project to the next level. There is also a fun roller coaster example that demonstrates how to render VR content directly in Fusion's 3D animation environment.
Import PTGui Project Files
You can now import a PTGui stitching project file into Fusion. The import creates, in seconds, a new composite with all of the nodes required to stitch your footage.
UV Pass Based High Speed Panoramic Conversions
KartaVR is able to dramatically simplify the process of building a fast and high quality UV pass based panoramic 360° video stitch. This UV Pass technique allows you to stitch and remap imagery between any image projection imaginable.
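The core of a UV pass remap is easy to state: the UV image stores, for each output pixel, the normalized source coordinate to sample, so any projection-to-projection mapping reduces to one lookup per pixel. A nearest-neighbor sketch of that idea (production stitches use filtered sampling and high-bit-depth UV data, and this is not KartaVR's implementation):

```python
def apply_uv_pass(source, uv):
    """Remap 'source' (a list of pixel rows) through a UV pass.

    uv[j][i] is a (u, v) pair in [0, 1] giving the normalized source
    coordinate for output pixel (i, j); sampling is nearest-neighbor.
    """
    sh, sw = len(source), len(source[0])
    out = []
    for row in uv:
        out_row = []
        for u, v in row:
            x = min(int(u * sw), sw - 1)  # clamp to the last column
            y = min(int(v * sh), sh - 1)  # clamp to the last row
            out_row.append(source[y][x])
        out.append(out_row)
    return out
```

Because the UV pass is computed once and then reused for every frame, the expensive projection math drops out of the per-frame cost, which is where the speed of this technique comes from.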