Mixed reality

Varjo XR-3 and Varjo XR-1 Developer Edition are devices that combine VR and AR capabilities. Two cameras on the front plate of the device allow you to see the real world with virtual objects or environments within it. You can also choose to bring only a portion of the real world into the virtual environment.

Learn more about the main mixed reality features in the sections below.

Varjo XR-3 and Varjo XR-1 Developer Edition

Varjo XR-3 and Varjo XR-1 Developer Edition both have two high-resolution 12-megapixel cameras on the front plate of the headset that can be used for mixed reality applications and see-through capabilities. Both operate at 90 Hz; the XR-1 provides a 94-degree circular viewing angle, while the XR-3 provides an improved 106×94-degree (H×V) stereo field of view.

To provide a crisp image without overwhelming data transfer, Varjo’s XR devices downscale the image only in areas where you are not focused. When eye tracking is calibrated, the area you are looking at is always rendered at the maximum possible resolution. If eye tracking is disabled, the maximum-resolution area is fixed at the center of the view. This works independently of the physical focus and peripheral displays, which remain static.
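If eye tracking has not been calibrated, an application can trigger the headset’s built-in calibration itself. Below is a minimal Unity sketch, assuming the VarjoEyeTracking class from Varjo’s Unity XR plugin (treat the exact method names as an assumption if your plugin version differs):

```csharp
using UnityEngine;
using Varjo.XR;

public class GazeCalibrationOnStart : MonoBehaviour
{
    void Start()
    {
        // Until calibration completes, the maximum-resolution area
        // stays fixed at the center of the view.
        if (!VarjoEyeTracking.IsGazeCalibrated())
        {
            // Shows the headset's built-in gaze calibration routine.
            VarjoEyeTracking.RequestGazeCalibration();
        }
    }
}
```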

We refer to the real-world image inside the headset as video pass-through. Read more in the Defining camera render position section.

Developing for Unreal and Unity

Varjo currently provides implementations for Unreal, Unity, and the native SDK for its XR devices. To start developing with the Unreal or Unity engine, follow the installation instructions on the corresponding Unreal and Unity pages. You can find code examples of implementations in the corresponding sections for Unreal and Unity; any code included in these examples is freely reusable. Check the API documentation for API endpoints related to video pass-through development.
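For example, video pass-through can be enabled and disabled from application code. A minimal Unity sketch, assuming the VarjoMixedReality class from Varjo’s Unity XR plugin:

```csharp
using UnityEngine;
using Varjo.XR;

public class PassThroughToggle : MonoBehaviour
{
    void OnEnable()
    {
        // Start streaming the front cameras into the compositor background.
        if (!VarjoMixedReality.StartRender())
        {
            Debug.LogWarning("Video pass-through is not available on this device.");
        }
    }

    void OnDisable()
    {
        VarjoMixedReality.StopRender();
    }
}
```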

If your existing project is written in Unreal or Unity, you can port it to Varjo’s XR devices, since both engines are supported. However, if your application was developed natively, you will need to rewrite it to support the Varjo API.

Both the Varjo XR-3 and Varjo XR-1 Developer Edition use the same API, so there is no need to write support for them separately.

Aligning the physical world with the virtual world

At a high level, the real-world view is blended in as a background layer in the Varjo compositor. Wherever the virtual content from the VR application is transparent, the video signal from the real world shows through.
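In Unity, for instance, this typically means clearing the camera to transparent black: any pixel not covered by opaque virtual content keeps an alpha of 0, so the compositor fills it with the camera image. A minimal sketch using only standard Unity APIs:

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class TransparentBackground : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        // Clear to fully transparent black; wherever nothing opaque is
        // drawn, alpha stays 0 and the real-world view shows through.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f);
    }
}
```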

You can align the virtual and real worlds using a Vive Tracker that you associate with your virtual model. To learn how to use the Vive Tracker, refer to the Benchmark example in the SDK folder. If you don’t have a Vive Tracker, you can position the virtual object manually using the controls in your application. To have a static physical object occlude virtual content (e.g., a plane cockpit), create a model of the real object (cockpit or car) on a 1:1 scale, align it with the real world, and render it transparent so that the real world shows through in those places (see the sketch below).
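As an illustration, the model can follow the tracked reference point with a measured offset. A minimal Unity sketch; the tracker transform and offsets are hypothetical placeholders for whatever drives your Vive Tracker pose (e.g., the Benchmark example):

```csharp
using UnityEngine;

public class AlignToTracker : MonoBehaviour
{
    // Transform driven by the Vive Tracker pose (hypothetical; supplied
    // elsewhere, e.g. by a TrackedPoseDriver or the Benchmark example).
    public Transform tracker;

    // Offset from the tracker mount point to the model origin,
    // measured on the real object.
    public Vector3 positionOffset;
    public Quaternion rotationOffset = Quaternion.identity;

    void LateUpdate()
    {
        // Re-apply the pose every frame so the 1:1 model stays
        // registered with the physical object.
        transform.SetPositionAndRotation(
            tracker.TransformPoint(positionOffset),
            tracker.rotation * rotationOffset);
    }
}
```

For the occlusion case, the same aligned 1:1 model can use a material that writes depth but no color (e.g., a shader with ColorMask 0), so it hides virtual objects behind it while leaving the alpha at 0 for the pass-through image.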