Introduction to Varjo SDK
If your project relies on your own engine that is not yet supported by Varjo headsets, you can add support for your engine using the Varjo SDK for Custom Engines. This documentation is intended to help you prepare your project to work on Varjo headsets. Jump to the relevant section:
- Going from another VR headset to Varjo headset
- Tracking and controllers
- Migrating from OpenVR
- Rendering
- Eye tracking
- Mixed Reality
- Error handling
- Threading
- Timing
- Events
- Properties
- Timewarp
Going from another VR headset to Varjo headset
When adapting your product for the Varjo headset, it is important to understand the difference between a “traditional” VR headset with conventional displays and the Varjo headset with the Bionic Display. You will need to use a much higher image resolution in order to use the Bionic Display to its full potential. Additionally, you will need to render two images per eye instead of a single image, since each Bionic Display consists of two overlapping displays. Details on how to render images are explained in the Rendering to Varjo headset section.
Tracking and controllers
Currently, the Varjo headset uses SteamVR™ Tracking technology. The API used for SteamVR Tracking is called OpenVR.
The OpenVR API can be used for controllers and trackers. To access the controllers in your application, initialize the SteamVR system as a background application. An example implementation of hand controller tracking can be found in the Benchmark example. In most cases, you can get your existing controller implementation working simply by changing the application type of your vr::IVRSystem from vr::EVRApplicationType::VRApplication_Scene to vr::EVRApplicationType::VRApplication_Background.
Please note that this IVRSystem should be used only for controllers and input, not for rendering or head pose. Using it for rendering or head pose may cause problems with the Varjo compositor and result in a suboptimal VR experience. Use only the controller tracking and controller input portions of the OpenVR API.
Important: You should always render the image using the view information provided by the Varjo API.
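A minimal sketch of this initialization is shown below. It assumes the standard OpenVR header openvr.h and reports initialization failures with vr::VR_GetVRInitErrorAsEnglishDescription; adapt it to your engine's input layer.

```cpp
// Sketch: initialize OpenVR as a background application so it is used only
// for controller and tracker input, never for rendering or head pose.
#include <openvr.h>
#include <cstdio>

vr::IVRSystem* initControllerInput()
{
    vr::EVRInitError initError = vr::VRInitError_None;
    // VRApplication_Background instead of VRApplication_Scene: the Varjo
    // compositor owns rendering and the head pose.
    vr::IVRSystem* vrSystem =
        vr::VR_Init(&initError, vr::EVRApplicationType::VRApplication_Background);
    if (initError != vr::VRInitError_None) {
        std::printf("OpenVR init failed: %s\n",
                    vr::VR_GetVRInitErrorAsEnglishDescription(initError));
        return nullptr;
    }
    return vrSystem;  // use only for controller poses and input state
}
```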
Origin override
As of the Varjo Base 2.4 release, the Varjo API can be used to reposition controllers and other trackables after the Override origin and direction feature has been used in Varjo Base. This is useful because the override feature applies only to the headset.
The varjo_GetTrackingToLocalTransform function returns a transform matrix that can be used to place trackables in the correct position and orientation. Apply it to a controller pose, for example, by multiplying the pose with this matrix.
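For illustration, the sketch below assumes the function returns a varjo_Matrix holding a 4×4 matrix as 16 doubles in column-major order and that the controller pose uses the same convention; verify both against Varjo_types.h and your engine's math library.

```cpp
// Sketch: move a controller pose from tracking space into the (possibly
// overridden) local space. Assumes varjo_Matrix stores a 4x4 matrix as 16
// doubles in column-major order; adjust to your engine's conventions.
#include <Varjo.h>
#include <Varjo_types.h>

static varjo_Matrix multiply(const varjo_Matrix& a, const varjo_Matrix& b)
{
    // Column-major 4x4 multiplication: result = a * b.
    varjo_Matrix r{};
    for (int col = 0; col < 4; ++col) {
        for (int row = 0; row < 4; ++row) {
            double sum = 0.0;
            for (int k = 0; k < 4; ++k) {
                sum += a.value[k * 4 + row] * b.value[col * 4 + k];
            }
            r.value[col * 4 + row] = sum;
        }
    }
    return r;
}

varjo_Matrix repositionControllerPose(varjo_Session* session,
                                      const varjo_Matrix& controllerPose)
{
    const varjo_Matrix trackingToLocal = varjo_GetTrackingToLocalTransform(session);
    // Left-multiplying maps the pose from tracking space to local space in a
    // column-major, column-vector convention; swap the order if yours differs.
    return multiply(trackingToLocal, controllerPose);
}
```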
Tracking plugins
Varjo Base supports tracking plugins that allow users to override the headset tracking and use other tracking systems in addition to SteamVR. See Third-party tracking for more information.
Migrating from OpenVR
OpenVR is an SDK and API developed by Valve. If you have read its documentation, you are most likely familiar with how it works and have already ported your software to work with it. The Varjo headset does not use OpenVR for headset tracking: you still need the OpenVR API for controller integration in your engine, but all headset-related functionality must be handled through the Varjo API.
Rendering
Learn the details about rendering on the Rendering to Varjo Devices page.
20/20 Eye tracking
You can use the full capabilities of the Varjo headset's eye tracking feature in your software. Get more familiar with eye tracking on the Eye tracking page. API endpoints for eye tracking can be found in the API endpoint guide, available in the Varjo SDK for Custom Engines package under varjo-sdk/docs.
Mixed Reality
Learn the details about mixed reality development on the Mixed reality development page.
Error handling
You can query Varjo errors with the varjo_GetError function. The error code refers to the first API call that has failed; subsequent API calls may fail as a cascading result without overriding the first error. Calling varjo_GetError also clears the error state. Check for errors at least once every frame, and use the error codes to inform the user about what went wrong.
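A minimal per-frame check could look like the sketch below; it assumes varjo_GetErrorDesc is available in your SDK version for turning the code into a readable message.

```cpp
// Sketch: check for Varjo errors once per frame and report them to the user.
#include <Varjo.h>
#include <cstdio>

void checkVarjoError(varjo_Session* session)
{
    const varjo_Error error = varjo_GetError(session);  // also clears the error state
    if (error != varjo_NoError) {
        std::printf("Varjo error %lld: %s\n",
                    static_cast<long long>(error), varjo_GetErrorDesc(error));
    }
}
```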
Threading
The Varjo API is thread safe.
However, graphics APIs impose additional constraints: for example, Direct3D calls use the immediate context of the provided graphics device and must not overlap with other threads that use the same context.
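If more than one of your threads can touch the Direct3D 11 immediate context of the device you passed to Varjo, one straightforward pattern is to serialize that access yourself. The mutex below is purely illustrative, not something the Varjo API requires you to implement in this exact way.

```cpp
// Sketch: serialize access to the D3D11 immediate context shared with Varjo.
#include <mutex>

std::mutex g_immediateContextMutex;  // guards every ID3D11DeviceContext call

void renderAndSubmit(/* ... */)
{
    std::lock_guard<std::mutex> lock(g_immediateContextMutex);
    // Issue Direct3D calls and Varjo calls that touch the immediate context
    // only while this lock is held, so they never overlap with other threads
    // using the same context.
}
```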
Timing
Varjo uses nanoseconds as a time unit. Absolute times are relative to an epoch that is constant during the execution of the program. The current time can be queried with the varjo_GetCurrentTime function.
To query the time for a frame, use the varjo_FrameGetDisplayTime function; it returns the estimated average moment at which the user perceives the displayed image.
Because measuring time in nanoseconds yields very large numbers, you should be aware of possible precision issues when casting to other types.
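For illustration, the sketch below keeps times as 64-bit varjo_Nanoseconds values and converts to floating point only for display; the varjo_FrameGetDisplayTime call is shown commented out because its exact usage depends on your frame loop and is assumed here.

```cpp
// Sketch: query times as 64-bit nanosecond values and avoid premature casts.
#include <Varjo.h>

void logTiming(varjo_Session* session)
{
    // Absolute time in nanoseconds, relative to an epoch fixed for this run.
    const varjo_Nanoseconds now = varjo_GetCurrentTime(session);

    // Convert to seconds only for display; a double cannot represent large
    // nanosecond counts exactly, so keep the raw integer for any logic.
    const double nowSeconds = static_cast<double>(now) / 1e9;
    (void)nowSeconds;

    // Display time for a frame (assumed usage, with a frame info object from
    // your frame loop at hand):
    // const varjo_Nanoseconds displayTime = varjo_FrameGetDisplayTime(session, frameInfo);
}
```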
Events
Varjo API uses events to notify users about changes to the system and user input.
Initialize a varjo_Event structure. It can be allocated on the stack, or you can use the varjo_AllocateEvent helper function to allocate it on the heap. Poll for events by calling varjo_PollEvent in a loop from your main loop or any other place that is called frequently. Varjo_events.h contains all available event types.
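A polling loop could look like the following sketch; the event types handled here (visibility and headset button) are just examples, so check Varjo_events.h for the full list.

```cpp
// Sketch: poll all pending Varjo events once per frame.
#include <Varjo.h>
#include <Varjo_events.h>

void pollEvents(varjo_Session* session)
{
    varjo_Event event{};  // stack-allocated; varjo_AllocateEvent is the heap alternative
    while (varjo_PollEvent(session, &event)) {
        switch (event.header.type) {
            case varjo_EventType_Visibility:
                // Pause or resume rendering based on event.data.visibility.visible.
                break;
            case varjo_EventType_Button:
                // React to headset button input via event.data.button.
                break;
            default:
                break;
        }
    }
}
```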
Properties
Properties can be used to query different Varjo system values.
- Call varjo_SyncProperties when you want to update the properties.
- Use varjo_HasProperty to check whether a specific property value exists. Each property has its own varjo_PropertyKey.
- Use varjo_GetProperty* to get the actual property value.
All available properties are listed in Varjo_types.h. Currently, there are only gaze tracking-related properties.
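As an example, a gaze-calibration check could follow the pattern below; varjo_PropertyKey_GazeCalibrated is used as an illustrative gaze-related key, so verify the exact key names against Varjo_types.h.

```cpp
// Sketch: sync properties once, then query individual values.
#include <Varjo.h>
#include <Varjo_types.h>

bool isGazeCalibrated(varjo_Session* session)
{
    // Take a snapshot of the current property values.
    varjo_SyncProperties(session);

    // Only read a property after confirming that it exists.
    if (varjo_HasProperty(session, varjo_PropertyKey_GazeCalibrated)) {
        return varjo_GetPropertyBool(session, varjo_PropertyKey_GazeCalibrated) != 0;
    }
    return false;
}
```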
Timewarp
Timewarp is a way of making sure that the rotation of the headset in VR and in real life stays in sync. The plane of the image is rotated as the headset rotation changes, regardless of the image frame rate. This helps prevent the user from feeling nauseous when looking around while wearing the headset.
If the application submits a depth buffer, an additional positional timewarp is applied over the image. This will greatly improve the smoothness of applications where the user is moving, especially in cases where the application isn’t rendering the full 60 frames per second. Due to the implementation of positional timewarp, you may see some artifacts around the edges of objects.
In Varjo Base 2.4 we introduced Motion Prediction, which is essentially an extension of positional timewarp. Motion prediction takes into account velocities in the image, either by approximating them or by using information submitted by the application. In comparison with positional timewarp, motion prediction will reduce artifacts even further in scenes that contain a lot of motion. This is especially useful in applications like simulators where the user experiences movement.