Mixed reality

The Varjo XR-3 and XR-1 Developer Edition headsets combine VR and AR capabilities into one device. Two cameras on the front plate of the headset allow you to see the real world with virtual objects placed inside it. Conversely, you can choose to place a smaller portion of the real world into a larger virtual environment.

Learn more about the main mixed reality features in the sections below.

Varjo XR-3 and Varjo XR-1 Developer Edition

On the front plate of the headset, both the XR-3 and the XR-1 Developer Edition feature two high-resolution 12-megapixel cameras used for mixed reality applications and video see-through. Both headsets operate at 90 Hz. The XR-1 Developer Edition provides a 94-degree circular viewing angle, while the XR-3 features an improved 106x94-degree (HxV) stereo field of view.

To provide a crisp image without overwhelming the data transfer, Varjo XR devices downscale the image in the areas the user is not focused on. When eye tracking is calibrated, the area you are looking at is always displayed at maximum resolution. If eye tracking is disabled, the maximum-resolution area is fixed at the center of the view. This works independently of the physical focus and peripheral displays, which remain static.
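
The same eye tracking data is also available to applications. As a minimal sketch of the native SDK's gaze calls (the function and type names below follow the Varjo native SDK; error handling is omitted for brevity):

    #include <Varjo.h>
    #include <cstdio>

    int main() {
        varjo_Session* session = varjo_SessionInit();
        if (!session) return 1;

        varjo_GazeInit(session);                // enable eye tracking for this session
        varjo_RequestGazeCalibration(session);  // start the calibration sequence in the headset

        // Poll once; a real application would read this every frame.
        varjo_Gaze gaze = varjo_GetGaze(session);
        if (gaze.status == varjo_GazeStatus_Valid) {
            std::printf("Gaze forward: %.3f %.3f %.3f\n",
                        gaze.gaze.forward.x, gaze.gaze.forward.y, gaze.gaze.forward.z);
        }

        varjo_SessionShutDown(session);
        return 0;
    }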

We refer to the real-world image inside the headset as video pass-through. Find out more under Defining camera render position.

Developing for Unreal and Unity

Varjo currently provides Unreal and Unity integrations for its XR devices, alongside the native SDK. To start developing with either engine, follow the installation instructions for Unreal and Unity. You can find code examples of implementations in the corresponding Unreal and Unity sections; any code included in these examples is freely reusable. Check the API documentation for the API endpoints related to video pass-through development.
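
For native development, the central call is enabling video pass-through rendering in the MR API. A minimal sketch, assuming the Varjo native SDK headers and an MR-capable headset (the application's frame loop would run between the two calls):

    #include <Varjo.h>
    #include <Varjo_mr.h>

    int main() {
        varjo_Session* session = varjo_SessionInit();
        if (!session) return 1;

        // Ask the compositor to render the camera image behind the application's layers.
        varjo_MRSetVideoRender(session, varjo_True);

        // ... application frame loop ...

        varjo_MRSetVideoRender(session, varjo_False);
        varjo_SessionShutDown(session);
        return 0;
    }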

If your existing project is written in Unreal or Unity, you can also port it to Varjo’s XR devices since both engines are supported. However, if your application was developed natively, you will need to rewrite it to support the Varjo API.

Since both the Varjo XR-3 and XR-1 Developer Edition headsets use the same API, there is no need to support them separately.
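
Instead of branching on the headset model, an application can ask the runtime at startup whether mixed reality features are present at all. A minimal sketch, assuming the native SDK's property API and its varjo_PropertyKey_MRAvailable key:

    #include <Varjo.h>
    #include <cstdio>

    int main() {
        varjo_Session* session = varjo_SessionInit();
        if (!session) return 1;

        varjo_SyncProperties(session);  // fetch the current device properties
        // The same check covers the XR-3, the XR-1 Developer Edition, and future MR devices.
        const bool mrAvailable = varjo_GetPropertyBool(session, varjo_PropertyKey_MRAvailable);
        std::printf("Mixed reality available: %s\n", mrAvailable ? "yes" : "no");

        varjo_SessionShutDown(session);
        return 0;
    }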

Aligning the physical world with the virtual world

At a high level, the real-world view is blended in as a background layer in the Varjo compositor. Wherever the virtual content rendered by the application is transparent, the video signal from the real world shows through instead.
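
In the native layer API, this blending is requested per submitted layer. The sketch below outlines only the submit step, assuming the application already runs the usual Varjo frame loop and has filled in its per-eye projection views elsewhere:

    #include <Varjo.h>
    #include <Varjo_layers.h>
    #include <cstdint>

    // Submit one alpha-blended layer for the current frame. 'views' is assumed to
    // be the per-eye view array the application has already populated.
    void submitAlphaBlendedLayer(varjo_Session* session, const varjo_FrameInfo* frameInfo,
                                 varjo_LayerMultiProjView* views, int32_t viewCount) {
        varjo_LayerMultiProj layer{};
        layer.header.type = varjo_LayerMultiProjType;
        // AlphaBlend tells the compositor to show the pass-through image
        // wherever this layer's output has alpha below 1.
        layer.header.flags = varjo_LayerFlag_BlendMode_AlphaBlend;
        layer.space = varjo_SpaceLocal;
        layer.viewCount = viewCount;
        layer.views = views;

        varjo_LayerHeader* layers[] = { &layer.header };
        varjo_SubmitInfoLayers submitInfo{};
        submitInfo.frameNumber = frameInfo->frameNumber;
        submitInfo.layerCount = 1;
        submitInfo.layers = layers;
        varjo_EndFrameWithLayers(session, &submitInfo);
    }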

You can align the virtual and real worlds using a Vive Tracker that you associate with your virtual model. For information on how to use a Vive Tracker, refer to the Benchmark example in the SDK folder. If you don’t have a Vive Tracker, you can simply move your virtual object into place using the controls in your application.
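
Whichever way you obtain the pose, the alignment itself reduces to a single transform: the virtual model's world pose is the tracker's world pose combined with a fixed tracker-to-model offset that you measure once. A small self-contained sketch of that math, using plain column-major 4x4 matrices rather than any particular engine's types:

    #include <array>

    using Mat4 = std::array<double, 16>;  // column-major 4x4 matrix

    // c = a * b, column-major.
    Mat4 multiply(const Mat4& a, const Mat4& b) {
        Mat4 c{};
        for (int col = 0; col < 4; ++col)
            for (int row = 0; row < 4; ++row)
                for (int k = 0; k < 4; ++k)
                    c[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
        return c;
    }

    // The tracker-to-model offset is measured once (for example, by noting where
    // the tracker is mounted on the physical cockpit). After that, the virtual
    // model follows the tracker automatically every frame.
    Mat4 modelWorldPose(const Mat4& trackerWorldPose, const Mat4& trackerToModel) {
        return multiply(trackerWorldPose, trackerToModel);
    }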

To have a static physical object occlude virtual content, first create a model of the real object (for example, a plane cockpit or a car) at 1:1 scale. You can then align this model with the real world and render it transparent, so that the real object is visible in its place.
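
In a renderer this masking is normally done in a shader, but the idea fits in a plain CPU sketch: wherever the 1:1 model covers a pixel, clear the virtual image's alpha so that the compositor (with alpha blending enabled, as above) shows the camera image there instead:

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // 'rgba' is the rendered virtual frame (4 bytes per pixel); 'occluderMask' has
    // one byte per pixel and is non-zero where the 1:1 occluder model was drawn.
    void punchPassThroughHoles(std::vector<std::uint8_t>& rgba,
                               const std::vector<std::uint8_t>& occluderMask) {
        for (std::size_t i = 0; i < occluderMask.size(); ++i) {
            if (occluderMask[i] != 0) {
                rgba[i * 4 + 3] = 0;  // alpha = 0 -> pass-through visible here
            }
        }
    }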