NextStage for Unity

“NextStage for Unity” is an add-on that lets users stream tracking data from NextStage Pro and project real-time 3D video into their Unity projects.

 

The beta version is included with NextStage Pro and can be used for virtual production, pre-visualization, augmented reality, virtual reality and mixed reality Unity projects.

 

Requirements

 

NextStage for Unity requires:

 

  • NextStage Pro 0.8c
  • Unity 5.3.2 or later
  • Kinect for Windows V2 or Kinect for Xbox One with Windows Adapter
  • Kinect for Windows Unity add-on
  • Windows 8 or 10

 

Installation


After importing the Kinect for Windows Unity add-on (see Requirements), install the “NextStageForUnity_Beta01a” Unity package included with NextStage Pro.


There are two prefabs to choose from depending on your Unity project.

 

NextStageHQ Prefab

 

This prefab projects the Kinect’s depth camera feed at full resolution. It is intended for standard, single-camera Unity scenes.


Add the prefab to your Unity scene. NextStageHQ includes a “virtualCamera” child object, so you can delete the default “Main Camera” in the scene.

 

NextStageVR Prefab

 

The NextStageVR prefab is designed for real-time mixed reality applications. It functions almost identically to NextStageHQ, but is optimized to run simultaneously with virtual reality headsets.


After adding the prefab to your virtual reality scene, keep the default main camera; this time the player’s view and the mixed reality view should remain separate.

 

On the NextStageVR “virtualCamera” object, “Target Eye” is set to None and the depth value is set to 1. The depth value should be higher than the main camera’s depth value.
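
For reference, the same settings can be applied from a script with Unity’s Camera API. The sketch below is only an illustration of what those two values mean; the component name is hypothetical, and the prefab already configures them for you:

    using UnityEngine;

    // Hypothetical example: applies the "virtualCamera" settings described above.
    public class VirtualCameraSetup : MonoBehaviour
    {
        void Start()
        {
            Camera cam = GetComponent<Camera>();
            cam.stereoTargetEye = StereoTargetEyeMask.None; // render to the monitor, not to either eye of the headset
            cam.depth = Camera.main.depth + 1f;             // keep the depth above the main (headset) camera
        }
    }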


With this setup, the computer monitor will display mixed reality from the virtual camera’s view, while the headset displays normally in virtual reality.

 

Lastly, create a new layer in your Unity project called “projector”, and remove the “projector” layer from the main camera’s culling mask. The ProjectorVR script will automatically try to assign the projected mesh to this layer, so it won’t show up in the virtual reality view.
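
For reference, here is a rough sketch of that layer setup in code. The component name is hypothetical, and ProjectorVR already performs the mesh assignment automatically; only the culling-mask change reflects the manual step above:

    using UnityEngine;

    // Hypothetical example of the "projector" layer setup described above.
    // Assumes a layer named "projector" already exists under Tags and Layers.
    public class ProjectorLayerSetup : MonoBehaviour
    {
        public Camera mainCamera;        // the player's headset camera
        public GameObject projectedMesh; // the mesh created by the projector

        void Start()
        {
            int projectorLayer = LayerMask.NameToLayer("projector");
            projectedMesh.layer = projectorLayer;             // what ProjectorVR tries to do automatically
            mainCamera.cullingMask &= ~(1 << projectorLayer); // hide the layer from the headset view
        }
    }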

 

Streaming from NextStage Pro

 

NextStage for Unity works by streaming the tracking data and scene space mask from NextStage Pro into Unity.


In the lower left corner of the settings tab, change the camera mode to “Stream”.

 

There are two numbers here that tell NextStage where to send the tracking data.


First is the IP address. By default it is set to your computer’s “local” IP address, so you don’t have to change anything.

 

The second number is the port, which by default is set to “5555”. Attached to the Projector object in Unity is a script called “TrackingServer”, which has a default port value of “5555”. As long as these port numbers match, Unity will receive the data sent by NextStage.
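
To give a sense of the receiving side, a listener like TrackingServer might open a socket on that port along the lines of the sketch below. This is not the actual TrackingServer source: it assumes a UDP transport and leaves the packet parsing as a placeholder, since NextStage’s wire format isn’t documented here:

    using System.Net;
    using System.Net.Sockets;
    using UnityEngine;

    // Hypothetical sketch of a tracking-data listener (not the real TrackingServer).
    public class TrackingListenerSketch : MonoBehaviour
    {
        public int port = 5555; // must match the port shown in NextStage Pro

        UdpClient client;

        void Start()
        {
            client = new UdpClient(port);
            client.BeginReceive(OnReceive, null);
        }

        void OnReceive(System.IAsyncResult result)
        {
            IPEndPoint source = new IPEndPoint(IPAddress.Any, 0);
            byte[] packet = client.EndReceive(result, ref source);
            // Parse the tracking data in "packet" here; the format depends on NextStage's protocol.
            client.BeginReceive(OnReceive, null); // keep listening for the next packet
        }

        void OnDestroy()
        {
            if (client != null) client.Close();
        }
    }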


In NextStage, press “Initialize” to confirm the address and port, then press “Stream” to begin sending tracking data.

 

Once you’re streaming from NextStage, press Play inside the Unity editor and you should see tracked video from the Kinect projected inside Unity.


Alignment

 

In the beta release you will need to manually align the projection with any other VR peripherals you may have. NextStage tracks at real-world scale, so 1 meter in real life is 1 unit in Unity.

 

Both the projector and the virtual camera are parented to the NextStageVR object, which corresponds to the origin point in NextStage. You can align the projection and tracking data by moving and rotating this object.
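
If you prefer to dial the alignment in from code, a small helper along these lines could apply a tuned offset to that root object. The component name and offset fields are hypothetical placeholders you would adjust by eye:

    using UnityEngine;

    // Hypothetical alignment helper: attach to the NextStageVR root object.
    // Offsets are in meters and degrees, since 1 Unity unit equals 1 real-world meter here.
    public class NextStageAlignment : MonoBehaviour
    {
        public Vector3 positionOffset = Vector3.zero;      // meters
        public Vector3 rotationOffsetEuler = Vector3.zero; // degrees

        void Start()
        {
            // Moving and rotating this root shifts the projector and the virtual camera together.
            transform.position += positionOffset;
            transform.rotation *= Quaternion.Euler(rotationOffsetEuler);
        }
    }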

 
