NextStage Pro

NextStage Pro exports camera tracking data as a Collada .dae file. This is a common format that most 3D applications support.

 

HD Capture

 

Tracking data captured in HD mode will export a camera attached to an empty null object. The null object contains all of the animation keyframes from the Kinect's infrared camera view. The camera object is offset 50mm to the right, matching the Kinect's color camera view. The camera's field of view is set automatically.

 

Sync Capture

 

Tracking data captured in Sync mode will export a camera attached to an empty null object, which is in turn attached to another empty null object.

 

The main null object contains all of the animation keyframes from the Kinect’s infrared camera view.

 

If the external camera has been calibrated, the second null object carries the external camera's offset position and rotation. If there is no calibration data, this null object's offset and rotation are set to zero.

 

If the external camera has been calibrated, the field of view will be set automatically. Otherwise it defaults to the equivalent of a 50mm lens.

 

Sync mode will also export the marker set as empty null objects. This is to help refine the camera calibration if necessary, and assist with subframe syncing.

 

Rotation Order

 

Rotation data from NextStage is exported as Euler angles.

 

Euler angles describe rotation using three angles, one for each axis. There is an order to how each angle is applied, and different rotation orders result in different rotations, even if the values are identical.
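
As a quick illustration (a minimal sketch using Blender's bundled mathutils module; the angle values are arbitrary), the same three angles applied in two different orders produce two different rotation matrices:

    from math import radians
    from mathutils import Euler

    # The same three angles, one per axis (X, Y, Z)
    angles = (radians(30.0), radians(45.0), radians(60.0))

    # Apply them in two different rotation orders
    m_xyz = Euler(angles, 'XYZ').to_matrix()
    m_zxy = Euler(angles, 'ZXY').to_matrix()

    # The two matrices differ, so the final orientations differ too
    print(m_xyz)
    print(m_zxy)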

 

The rotation order for NextStage is roll, pitch and yaw: the forwards/backwards axis, followed by the horizontal axis, and finally the vertical axis (which axes these map to depends on your 3D application's conventions). This order will need to be set manually in your 3D application.

 

In Blender the rotation order is YXZ.

 

In Maya the rotation order is ZXY.
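
If you prefer to script this rather than set it in the UI, a minimal sketch for Blender (assuming the imported camera and null objects are currently selected; object names depend on your export) would be:

    import bpy

    # Match NextStage's roll, pitch, yaw export order (YXZ in Blender)
    for obj in bpy.context.selected_objects:
        obj.rotation_mode = 'YXZ'

The Maya equivalent is setting each node's rotateOrder attribute, e.g. cmds.setAttr('camera1.rotateOrder', 2) for a hypothetical node named camera1, where 2 is Maya's enum value for zxy.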

 

If the rotation order is not properly set you will get inaccurate results when rendering.

 

Interpolating Framerates

 

The Kinect runs at exactly 30Hz, not the 29.97 frames per second that is commonly rounded to 30. This may seem like a small difference, but the two rates drift apart: after one minute, 30Hz data has produced 1800 samples while 29.97fps footage has only 1798.2 frames, a gap of nearly two frames.

 

If you are syncing the tracking data to an external camera, you will either have to capture footage at 30 frames per second exactly, or interpolate the data.

 

Most 3D applications will let you scale the timing of keyframe data using a dope sheet editor. Since the Kinect's 30Hz data runs slightly fast relative to 29.97fps footage, the keyframes need to be scaled down. With all the keyframes selected, scale everything by 0.999 (the result of 29.97 / 30).

 

If you need to interpolate to a lower frame rate like 24 or 25 frames per second, simply scale the keyframes by 0.8 (24/30) or 0.8333 (25/30) respectively.
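
In Blender this can also be scripted. The sketch below assumes the imported camera's animation is on the active object's action; it multiplies every keyframe's time, along with its Bezier handles, by the chosen factor:

    import bpy

    SCALE = 29.97 / 30.0  # or 24 / 30, 25 / 30, etc.

    action = bpy.context.object.animation_data.action
    for fcurve in action.fcurves:
        for key in fcurve.keyframe_points:
            # co.x is the keyframe's timeline position in frames;
            # multiplying scales everything relative to frame 0
            key.co.x *= SCALE
            key.handle_left.x *= SCALE
            key.handle_right.x *= SCALE
        fcurve.update()

For 23.976 footage in a 24fps Blender timeline, run it a second time with SCALE = 23.976 / 24.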

 

Some 3D applications can do this automatically. If a Blender project is set to 24 frames per second, NextStage data will be scaled automatically on import, treating it as 30Hz data in a 24fps timeline.

 

If you are working with 23.976 footage you will still need to scale those keyframes again by 0.999 (23.976/24), as Blender will not account for the difference between the two framerates.

 

Keep in mind that your 3D application needs the ability to place keyframes in between timeline frames. Some 3D applications will automatically snap keyframes to the nearest frame on the timeline, and won't let you make subframe adjustments. If possible, you will need to turn this feature off.

 

Syncing Footage

 

HD mode will export the same number of frames in the tracking data as were captured with the RGBA video. The first frames of each are identical, so there should be no need to resync them.

 

Tracking data captured in Sync mode will need to be synced manually. This can be done using the Kinect's reference audio track and the audio from the external camera recording.

 

Once the external camera is recording and NextStage is capturing, use a clapboard or slate to create a point of reference in both audio recordings. If you do not have a slate, clapping your hands together should suffice as a sync point.

 

In a video editing application, sync the Kinect's reference audio track to the external recording. The timeline framerate should match the external camera, not the Kinect.

 

Find how many frames the beginning of the Kinect audio is offset from the beginning of the external recording. The first frame of the Kinect audio corresponds to the first frame of the tracking data.

 

Adjust the position of the keyframes in your 3D application as needed. Make sure any keyframe interpolation happens before syncing.
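
As a sketch of applying the measured offset in Blender (OFFSET is a hypothetical value; use the frame count, including any fractional part, that you measured in your editing application):

    import bpy

    OFFSET = 42.5  # hypothetical offset in frames; subframe values are allowed

    action = bpy.context.object.animation_data.action
    for fcurve in action.fcurves:
        for key in fcurve.keyframe_points:
            # Shift the keyframe and its handles together along the timeline
            key.co.x += OFFSET
            key.handle_left.x += OFFSET
            key.handle_right.x += OFFSET
        fcurve.update()

The same script with a small fractional OFFSET can be used for the subframe refinement described below.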

 

If your 3D application allows you to import video clips, you can verify that the tracking is synced by importing the video as a background plate for the tracked camera.

 

You can further refine the sync by matching the exported marker set points to the markers visible in the external footage. If your 3D application allows for subframe adjustments, shift all the keyframes together until the markers line up.

 

 

Tracking Data