Mixed Reality development

This page is dedicated to development for XR-1 Developer Edition. It contains all the information and references specifically related to mixed reality content development. If you are planning on developing traditional VR applications for Varjo headsets, refer to the main development page.

WHAT IS VARJO XR-1 DEVELOPER EDITION

Varjo XR-1 Developer Edition is a device combining VR and AR capabilities. Two cameras on the front plate of the device allow you to see the real world with virtual objects or environments blended into it. Conversely, you can also choose to see only a portion of the real world within the virtual environment.

Varjo XR-1 Developer Edition has two high-resolution 12-megapixel cameras located on the front plate of the headset that can be used for mixed reality applications and see-through capabilities. They operate at 90 Hz and provide a 94-degree circular viewing angle. To provide a crisp image without overwhelming data transfer, XR-1 Developer Edition downscales the image only in the areas you are not focused on. If you have calibrated eye tracking, the area you are looking at will always be at the maximum possible resolution. If eye tracking is disabled, the maximum-resolution area is fixed at the center of the screen. This works independently of the physical focus and peripheral displays, which remain static. We refer to the real-world image in the headset as “video pass-through.” Read more about it in the Defining camera render position section.

ENABLING MIXED REALITY ON XR-1 DEVELOPER EDITION

Once you have decided to develop a mixed reality application in your engine, you first need a working Varjo VR implementation. To set that up, refer to the Varjo SDK instructions.

After that step is done, you can begin with mixed reality implementation.
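
The native examples on this page assume a valid session handle. The snippet below is a minimal sketch of that setup; m_session is the same member variable used in the examples that follow.

    // A minimal sketch: create the Varjo session that the MR calls below use.
    varjo_Session* m_session = nullptr;
    if (varjo_IsAvailable()) {
        m_session = varjo_SessionInit();
    }
    // ... run your render loop ...
    // When done: varjo_SessionShutDown(m_session);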

The video pass-through image is another layer in your application, just like the VR layer. By default, this layer is displayed behind the VR layer. However, by using depth estimation and occlusion, you can bring real objects in front of virtual objects. Read more about it in the Depth occlusion section.
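
As a brief preview of that section, enabling depth estimation is a single call. Note that for real objects to actually occlude virtual ones, your application must also submit a depth buffer with its layer; the call below is only the first half of the setup.

    // Sketch: ask the compositor to estimate depth from the pass-through
    // cameras so that nearby real objects can occlude virtual content.
    varjo_MRSetVideoDepthEstimation(m_session, varjo_True);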

Varjo Compositor can automatically blend virtual and video pass-through layers and align VR views. In order to see an image from video pass-through in your application, you need to perform the following steps:

* Check if MR is available:

    // Check whether a mixed reality capable device is connected.
    varjo_Bool mixedRealityAvailable = varjo_False;
    if (varjo_HasProperty(m_session, varjo_PropertyKey_MRAvailable)) {
        mixedRealityAvailable = varjo_GetPropertyBool(m_session, varjo_PropertyKey_MRAvailable);
    }

* Start video pass-through:

    // Enable rendering of the video pass-through image behind your VR layer.
    varjo_MRSetVideoRender(m_session, varjo_True);

* Modify the alpha channel of your VR layer according to your needs: wherever alpha is lower than 1.0, you can see the real-world image.

Note: The colors need to be premultiplied with alpha; otherwise, blending does not work as expected.
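
For example, if your application renders with Direct3D 11, a premultiplied-alpha blend state looks like the sketch below. This is illustrative only; device stands for your ID3D11Device and is not part of the Varjo API.

    // Sketch: premultiplied-alpha blending for a D3D11 renderer.
    // The color is already multiplied by alpha, so SrcBlend is ONE
    // rather than SRC_ALPHA.
    #include <d3d11.h>

    D3D11_BLEND_DESC desc = {};
    desc.RenderTarget[0].BlendEnable = TRUE;
    desc.RenderTarget[0].SrcBlend = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
    desc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_INV_SRC_ALPHA;
    desc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
    ID3D11BlendState* blendState = nullptr;
    device->CreateBlendState(&desc, &blendState);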

The full 12-megapixel resolution is too large to be streamed to the computer at 90 Hz in stereo, which is why we split the stream into two parts.

The first part is the “peripheral stream,” which is the full sensor image downscaled to 1008×1008 resolution. The second part is the “focus stream,” a non-downscaled 834×520 crop from the full-resolution sensor image. The crop position can be moved in real time, per frame, to any location in the image; currently it follows the user’s gaze, using eye tracking with sub-degree accuracy. In practice, you will always see the full-resolution image wherever you are looking. It is important to note that this functionality requires eye tracking to be enabled. Otherwise, the foveated video pass-through image will be fixed to the center of the view.
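As a rough back-of-envelope illustration (assuming 24-bit RGB purely for the arithmetic, not the actual transport format): two uncompressed 12-megapixel images at 90 Hz would require about 2 × 12,000,000 × 3 B × 90 ≈ 6.5 GB/s, whereas the split streams require only about 2 × (1008 × 1008 + 834 × 520) × 3 B × 90 ≈ 0.8 GB/s.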

Developers can access the uncompressed, distorted peripheral stream directly from the cameras. In addition to the raw data, distortion parameters are provided so you can undistort the image for your use cases. This stream is intended mainly for computer vision applications.

You can get the stream from the API as explained on the Native examples page.
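
As an illustration of what you can do with the distortion parameters, the sketch below undistorts a received frame with OpenCV. The cameraMatrix and distCoeffs inputs are assumptions here: you would construct them from the parameters the Varjo API reports, and Varjo’s distortion model may not map one-to-one onto OpenCV’s, so treat this as a starting point rather than a recipe.

    // Sketch: undistort a camera frame with OpenCV, assuming the Varjo
    // distortion parameters have been converted to an OpenCV-style
    // camera matrix and distortion coefficient vector.
    #include <opencv2/opencv.hpp>

    cv::Mat undistortFrame(const cv::Mat& distorted,
                           const cv::Mat& cameraMatrix,  // 3x3 intrinsics (assumed)
                           const cv::Mat& distCoeffs)    // distortion coefficients (assumed)
    {
        cv::Mat undistorted;
        cv::undistort(distorted, undistorted, cameraMatrix, distCoeffs);
        return undistorted;
    }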

ALIGNING PHYSICAL WORLD WITH VIRTUAL WORLD

At a high level, the real-world view is blended as a background layer in the Varjo Compositor. Wherever the virtual content from the VR application is transparent, the video signal from the real world will be visible.

You can align the virtual and real worlds using a Vive Tracker that you associate with your virtual model. To learn how to use a Vive Tracker, refer to the Benchmark example in the SDK folder. If you don’t have a Vive Tracker, you can simply move your virtual object to the position where you think it should be, using the controls in your application.

In order to have a static physical object occlude virtual content (such as a plane cockpit), you can create a model of the real object (the cockpit or car) at 1:1 scale. You then align this model with the real world and make it transparent, so that the real world is visible in those places.
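
For the tracker-based approach described above, the core of the alignment is a single transform composition, sketched below with GLM. Here worldFromTracker is the tracker pose you read each frame, and trackerFromModel is a fixed offset you measure once when attaching the tracker to the physical object; both names are illustrative and not part of any Varjo API.

    // Sketch: place a 1:1 virtual model on a tracked physical object.
    #include <glm/glm.hpp>

    glm::mat4 modelToWorld(const glm::mat4& worldFromTracker,   // live tracker pose
                           const glm::mat4& trackerFromModel)   // fixed, measured offset
    {
        // Compose the fixed offset with the live tracker pose each frame.
        return worldFromTracker * trackerFromModel;
    }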

HANDLING ERRORS

If the video pass-through cameras have been disconnected, you will see an error in Varjo Base and a pink color where the real-world view was supposed to be in the headset. You can replace the default pink color for the environment with something else, such as a skybox or an image with a logo. To do that, the app can listen for the MR device connected and disconnected events and switch its behavior accordingly:

    varjo_Event evt{};
    // Poll all pending events for this frame.
    while (varjo_PollEvent(m_session, &evt)) {
        switch (evt.header.type) {
            case varjo_EventType_MRDeviceStatus: {
                switch (evt.data.mrDeviceStatus.status) {
                    case varjo_MRDeviceStatus_Connected: {
                        printf("EVENT: Mixed reality device status: %s\n", "Connected");
                        // E.g., re-enable video pass-through rendering.
                        break;
                    }
                    case varjo_MRDeviceStatus_Disconnected: {
                        printf("EVENT: Mixed reality device status: %s\n", "Disconnected");
                        // E.g., switch to a fallback skybox or logo image.
                        break;
                    }
                }
            } break;
            default: break;
        }
    }

DEVELOPING FOR UNREAL AND UNITY

Varjo currently provides Unreal and Unity integrations, as well as a native SDK, for XR-1 Developer Edition. To start developing with the Unreal or Unity engine, follow the installation instructions on the corresponding Unreal and Unity pages. You can find code examples of the implementations in the corresponding sections for Unreal and Unity, and you can freely reuse any code shown in these examples. Check the API documentation for the API endpoints related to video pass-through development.

If your project is written in Unreal or Unity, it is possible to port it to the XR-1 Developer Edition, since both engines are supported. However, if your application is developed natively, you will need to rewrite it to support the Varjo API.