
Mawson’s Hut 360VR Matchmoved Camera Dolly

This blog post explores the new 360VR tools in the SynthEyes Pro matchmoving program. For this experiment I was interested in seeing how well the newest SynthEyes 1702 release could handle tracking “fulldome” style footage that was filmed with a 180° circular fisheye lens.

The source footage was filmed by Peter Morse in Antarctica at the Mawson’s Hut historic site in Cape Denison, Commonwealth Bay. It was shot using a custom-built, manually operated timelapse motion control rig that moved a Canon DSLR fitted with an 8mm Canon fisheye lens.

Check out Peter Morse’s Blog for more details on his time in Antarctica:

The timelapse dolly is a custom creation that is known as a “Hurley Dolly”:

For more information about Mawson’s Huts and the 1912 Australasian Antarctic Expedition:

180° Fisheye Video Tracking Workflow

This screenshot shows the process of tracking a 180° circular fisheye movie in SynthEyes using the new 360VR mode.

I started the post-production effort on this Mawson’s Hut shot by using my KartaVR for Fusion VR toolset to warp the 180° circular fisheye footage into an equirectangular 360°×180° image projection. The footage was then rendered to disk as a 1,776-frame TIFF image sequence with LZW compression applied.

The KartaVR for Fusion plug-in was used to apply a node-based “Fisheye2Equirectangular” panoramic transform to the 180° circular fisheye imagery.
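For readers curious about the math behind that kind of warp, here is a minimal sketch of a fisheye-to-equirectangular resample in Python with NumPy. It assumes an ideal equidistant 180° fisheye centred in the frame and uses nearest-neighbour sampling for brevity; real lenses need a measured distortion profile, which is the sort of thing the KartaVR node handles for you:

```python
import numpy as np

def fisheye_to_equirect(fish, out_h):
    """Resample a 180-degree circular (equidistant) fisheye image into a
    360x180 equirectangular frame. Pixels outside the lens's 180-degree
    field of view are left black."""
    fh, fw = fish.shape[:2]
    cx, cy = (fw - 1) / 2.0, (fh - 1) / 2.0     # fisheye image centre
    radius = min(cx, cy)                        # image circle radius

    out_w = out_h * 2
    u, v = np.meshgrid(np.arange(out_w), np.arange(out_h))
    lon = (u / out_w - 0.5) * 2.0 * np.pi       # longitude, -pi..pi
    lat = (0.5 - v / out_h) * np.pi             # latitude, -pi/2..pi/2

    # Direction vector for each output pixel; lens axis = +Z (lon=0, lat=0)
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = np.cos(lat) * np.cos(lon)

    theta = np.arccos(np.clip(dz, -1.0, 1.0))   # angle off the lens axis
    psi = np.arctan2(dy, dx)                    # angle around the axis

    r = theta / (np.pi / 2.0) * radius          # equidistant fisheye model
    sx = cx + r * np.cos(psi)
    sy = cy - r * np.sin(psi)                   # image rows grow downward

    valid = theta <= np.pi / 2.0                # inside the 180-degree FOV
    out = np.zeros((out_h, out_w) + fish.shape[2:], dtype=fish.dtype)
    sxi = np.clip(np.round(sx).astype(int), 0, fw - 1)
    syi = np.clip(np.round(sy).astype(int), 0, fh - 1)
    out[valid] = fish[syi[valid], sxi[valid]]
    return out
```

The outer halves of the equirectangular frame (the hemisphere behind the lens) stay black, which is exactly what you see when single-lens fulldome footage is converted this way.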

The equirectangular re-formatted footage was then loaded into SynthEyes Pro with the 360VR mode active. This resulted in a 500 MB SynthEyes .sni project file being created for this shot. The SynthEyes “Auto” tracking mode did a nice job on this footage since it was a smooth dolly motion and a majority of the tracker marks were visible for the entire length of the clip.

The SynthEyes footage loading dialog allowed me to specify that the media was a 360VR clip.

After SynthEyes tracked the clip, I manually added a coordinate system that aligned the scene origin to the orientation of the long wooden “plank” shelf visible as the timelapse camera tracks underneath it. This origin adjustment was done using the “*3” button in the SynthEyes “Coordinates” tab.

The SynthEyes Coordinates tab and its *3 button were used to align the scene’s origin and XYZ rotation axes so they were positioned relative to the objects in the frame.
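Conceptually, a three-point alignment of this kind can be sketched as building a new coordinate frame from three tracker positions: the first becomes the origin, the second sets an axis direction, and the third pins down the ground plane. This is only an illustration of the idea, not SynthEyes’ actual implementation:

```python
import numpy as np

def frame_from_three_points(origin_pt, axis_pt, plane_pt):
    """Right-handed frame: origin at origin_pt, +X toward axis_pt, and
    plane_pt lying in the X/Y ground plane. Returns (origin, R), where
    the rows of R are the new X/Y/Z axes in world coordinates."""
    origin_pt = np.asarray(origin_pt, float)
    x = np.asarray(axis_pt, float) - origin_pt
    x /= np.linalg.norm(x)                      # unit +X axis
    in_plane = np.asarray(plane_pt, float) - origin_pt
    z = np.cross(x, in_plane)                   # plane normal = new "up"
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                          # completes the frame
    return origin_pt, np.stack([x, y, z])

def to_local(p, origin_pt, R):
    """Express a world-space point p in the new coordinate system."""
    return R @ (np.asarray(p, float) - origin_pt)
```

With the frame defined, every solved tracker and camera key can be re-expressed in shelf-relative coordinates, which is what makes the exported scene line up sensibly in a 3D or compositing package.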

Several stand-in temporary polygon meshes of red cubes and blue teapot shapes were inserted into the panoramic scene using the SynthEyes “3D” tab. These placeholder meshes made it easier to see how accurately the scene was matchmoved and to check for drifting issues or camera wobble.

As a tip, if you experience wobbles when you load a SynthEyes scene into a compositing package, make sure you aren’t hitting an “off-by-one” error. This happens when your video footage is loaded with a frame-one or frame-zero starting frame and the keyframed camera path data uses the opposite setting.
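A toy sketch of the fix: re-index the keyframed path so its first key lines up with the footage’s starting frame. The dictionary layout here is hypothetical, just to show the one-frame shift:

```python
def shift_keyframes(path, offset):
    """Re-index a keyframed camera path stored as {frame: value} by a
    whole-frame offset. Use offset=-1 when the footage starts at frame 0
    but the imported tracking data starts at frame 1 (or +1 for the
    opposite mismatch)."""
    return {frame + offset: value for frame, value in path.items()}
```

If the wobble disappears after a one-frame shift like this, the mismatch was in the frame numbering, not in the track itself.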

The SynthEyes track was rendered to disk as a previz-quality 4K output using an OpenGL-based preview. This test clip was rendered quickly and easily inside SynthEyes by switching to the Summary tab and clicking the “Save Sequence” button. In the “Save Processed Image Sequence” dialog I enabled the [x] Meshes Included checkbox so the stand-in meshes would appear in the rendered footage.

The Save Sequence button in SynthEyes renders out a Playblast-like preview of your tracked scene.

The SynthEyes-rendered image sequence was then converted in Fusion Studio into an H.264 MP4 movie file at 60 FPS.
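Outside of Fusion, the same sequence-to-MP4 conversion is commonly done with ffmpeg. This sketch just builds the command-line argument list (the file names are hypothetical) so you can inspect it before running it with subprocess:

```python
def sequence_to_mp4(pattern, out_path, fps=60, crf=18):
    """Build an ffmpeg command that assembles a numbered image sequence
    into an H.264 MP4. The pattern uses printf-style numbering, e.g.
    'render.%04d.tif'. Pass the result to subprocess.run(cmd, check=True)."""
    cmd = [
        "ffmpeg",
        "-framerate", str(fps),        # input frame rate of the sequence
        "-i", pattern,
        "-c:v", "libx264",
        "-crf", str(crf),              # quality (lower = better/larger)
        "-pix_fmt", "yuv420p",         # broadest player compatibility
        out_path,
    ]
    return cmd
```

The yuv420p pixel format matters for 360 video in particular, since some players and upload pipelines reject the 4:4:4 output that image sequences otherwise produce.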

The last step for the project was to add the YouTube 360 spatial media metadata to the MP4 movie and then upload the video to the web.

Google’s “Spatial Media Metadata Injector” program was used to add the panoramic metadata to the video file.
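Under the hood, that metadata tool embeds a small block of Spherical Video V1 XML into the MP4’s moov box, which is what tells YouTube to treat the upload as a 360° video. A sketch of the XML it generates (the StitchingSoftware string here is just a placeholder):

```python
def spherical_v1_xml(software="SynthEyes + Fusion"):
    """Build the Spherical Video V1 metadata XML that spatial-media
    injector tools embed in an MP4 so players treat it as a 360 video."""
    return (
        '<?xml version="1.0"?>'
        '<rdf:SphericalVideo '
        'xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" '
        'xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">'
        '<GSpherical:Spherical>true</GSpherical:Spherical>'
        '<GSpherical:Stitched>true</GSpherical:Stitched>'
        f'<GSpherical:StitchingSoftware>{software}</GSpherical:StitchingSoftware>'
        '<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>'
        '</rdf:SphericalVideo>'
    )
```

Building the XML is the easy half; the injector also handles writing it into the correct MP4 box, so in practice you should let the tool do the whole job rather than hand-editing the file.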

SynthEyes Fusion Export Options

SynthEyes recently updated its exporter modules in the version 1702 release to make it easier to send completed SynthEyes tracking projects and footage to the Fusion compositing package.

Fusion 360VR Stabilization

If you have filmed live-action 360° video footage with handheld motion, you will love the new SynthEyes 360VR Stabilizer toolset. It can smooth wobbly, shaky VR shots into something gentle and pleasing for a viewer to watch on an HMD (head-mounted display).

The File > Export > 360VR > Fusion 360VR Stabilization menu item is used to run the new “fu360stab.szl” sizzle script. This tool makes it fast and efficient to apply a Fusion node based stabilization effect to your panoramic videos.
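To illustrate the idea behind this kind of stabilization, here is a deliberately simplified yaw-only sketch: low-pass filter the solved per-frame yaw, then counter-rotate each equirectangular frame by the residual. In equirectangular space a pure yaw rotation is an exact horizontal wrap-around shift, so np.roll does the job; the real SynthEyes tool of course handles full three-axis rotation:

```python
import numpy as np

def stabilize_yaw(frames, yaw_deg, window=15):
    """Yaw-only 360 stabilisation sketch. frames is a list of
    equirectangular images; yaw_deg is the solved per-frame camera yaw
    in degrees. Smooths yaw with a moving average and rolls each frame
    horizontally by the difference between smoothed and raw yaw."""
    yaw = np.asarray(yaw_deg, float)
    window |= 1                                  # force odd window size
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(yaw, pad, mode="edge")
    smooth = np.convolve(padded, kernel, mode="valid")   # low-pass yaw
    out = []
    for frame, raw, target in zip(frames, yaw, smooth):
        w = frame.shape[1]
        shift = int(round((target - raw) / 360.0 * w))   # degrees -> px
        out.append(np.roll(frame, shift, axis=1))        # counter-rotate
    return out
```

The window size is the trade-off knob: a wider window gives a calmer result but lets the view drift further from where the camera was actually pointing.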

SynthEyes 360VR Stabilization Export

Fusion 360VR Scene Exports

The SynthEyes developer, Russ Andersson, has really worked hard in the latest release to make the Fusion export workflow a truly seamless process. With a single click, SynthEyes is able to export all of your matchmoving / video tracking data into a new Fusion comp.

When the SynthEyes File > Export > Fusion Composition menu item is selected, the “fu7comp.szl” sizzle script is run. This tool exports either a regular matchmoving project or a 360VR matchmoving project directly to the Fusion compositing package.

SynthEyes Fusion Composition Export

The Fusion exporter tool automatically creates a node-based comp that includes the tracked cameras with animated motion paths, the original live-action background footage, point-cloud locators for the tracker markers in the scene, and exported .OBJ copies of the SynthEyes stand-in meshes. When this newly created Fusion .comp file is opened in Fusion (Free) or Fusion Studio, everything is loaded and ready to go, which dramatically speeds up the process.

The newest update adds 360VR support and a new [x] Export to Clipboard checkbox option that copies the Fusion node data to your system’s clipboard. You can then paste the tracking nodes into any of your current Fusion comps.

The SynthEyes “Fusion Composition” export dialog allows you to customize what elements will be added to the Fusion .comp.

Rendering 180° Circular Fisheye Content in Fusion Studio

This screenshot shows the result of rendering and compositing the Mawson’s Hut scene in Fusion Studio. The KartaVR plugin’s “Equirectangular2Fisheye” node made it easy to integrate the matchmoved CG elements into the original 180° circular fisheye background plate footage.

An environment reflection map was applied to the teapot and polygon cube meshes in Fusion. The elements were rendered in Fusion with the correct distortion so they could be placed over the 180° circular fisheye imagery with the help of the KartaVR for Fusion plug-in.
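The Equirectangular2Fisheye direction is simply the inverse of the ingest warp shown earlier. A minimal sketch, again assuming an ideal equidistant 180° fisheye and nearest-neighbour sampling:

```python
import numpy as np

def equirect_to_fisheye(equi, size):
    """Resample an equirectangular frame back into a size x size
    180-degree circular (equidistant) fisheye image, so rendered CG
    elements can be laid over the original fisheye plate."""
    eh, ew = equi.shape[:2]
    c = (size - 1) / 2.0                        # fisheye centre
    radius = c                                  # image circle radius
    sx, sy = np.meshgrid(np.arange(size), np.arange(size))
    px, py = sx - c, c - sy                     # centre and flip rows
    r = np.hypot(px, py)
    psi = np.arctan2(py, px)                    # angle around the axis

    theta = r / radius * (np.pi / 2.0)          # equidistant fisheye model
    dx = np.sin(theta) * np.cos(psi)            # ray direction per pixel
    dy = np.sin(theta) * np.sin(psi)
    dz = np.cos(theta)

    lon = np.arctan2(dx, dz)
    lat = np.arcsin(np.clip(dy, -1.0, 1.0))
    u = np.clip(((lon / (2 * np.pi)) + 0.5) * ew, 0, ew - 1).astype(int)
    v = np.clip((0.5 - lat / np.pi) * eh, 0, eh - 1).astype(int)

    out = np.zeros((size, size) + equi.shape[2:], dtype=equi.dtype)
    inside = r <= radius                        # the fisheye image circle
    out[inside] = equi[v[inside], u[inside]]
    return out
```

Because the CG elements pick up the same lens distortion as the plate on the way back, they sit convincingly inside the original circular fisheye frame.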