User Input
User interfaces are key to a mixed reality experience, and Varjo HMDs support compelling user interfaces and interaction with all common modalities. Our runtime supports controller, hand tracking, and eye tracking interactions, all of which can be used on devices with compatible hardware.
User input is handled through the OpenXR input system. This is a powerful system designed to cope with both current and future hardware without requiring games or applications to be updated. To make this work, the application and the OpenXR runtime need to cooperate, so it is important to understand the different concepts: what the application contributes and what the Varjo OpenXR runtime is responsible for. In short, the application defines all actions the user can perform (for instance pick up, menu select, or teleport) and suggests to the runtime which buttons, sensors, and other inputs would be suitable to activate those actions. The Varjo OpenXR runtime then maps them to the controller in use.
The Varjo OpenXR runtime also takes care of mapping actions to the available inputs on controllers the application was not specifically developed for. This provides a good path for forward compatibility with future hardware.
The more explicit the application is about the intended use of each action, the more forward compatibility with different controllers the runtime can enable. With that high-level picture in place, let's go through the concrete concepts.
The input system abstracts away the actual hardware and instead focuses on the interactions the developer wants to enable. The key concepts are actions, action sets, and suggested bindings. Actions are interactions that can be performed; action sets group actions together so they can easily be enabled or disabled in different contexts; and suggested bindings are the application developer's suggestions for which activators (such as buttons or air taps) the actions should be bound to.
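To make this concrete, below is a minimal sketch in C of how an application might define an action set, an action, and a suggested binding using the standard OpenXR API. The instance handle and the setup_input helper are hypothetical names; error handling and attaching the action set to the session (xrAttachSessionActionSets) are omitted for brevity.

```c
#include <openxr/openxr.h>
#include <string.h>

// Hypothetical helper: assumes 'instance' is a valid XrInstance created elsewhere.
void setup_input(XrInstance instance)
{
    // Create an action set that groups gameplay actions.
    XrActionSetCreateInfo setInfo = { XR_TYPE_ACTION_SET_CREATE_INFO };
    strcpy(setInfo.actionSetName, "gameplay");
    strcpy(setInfo.localizedActionSetName, "Gameplay");
    XrActionSet actionSet;
    xrCreateActionSet(instance, &setInfo, &actionSet);

    // Define an abstract "teleport" action; the runtime decides how it is triggered.
    XrActionCreateInfo actionInfo = { XR_TYPE_ACTION_CREATE_INFO };
    actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
    strcpy(actionInfo.actionName, "teleport");
    strcpy(actionInfo.localizedActionName, "Teleport");
    XrAction teleportAction;
    xrCreateAction(actionSet, &actionInfo, &teleportAction);

    // Suggest a binding for the Valve Index controller; the runtime may remap
    // the action to other controllers the application was not written for.
    XrPath profilePath, bindingPath;
    xrStringToPath(instance, "/interaction_profiles/valve/index_controller", &profilePath);
    xrStringToPath(instance, "/user/hand/right/input/a/click", &bindingPath);

    XrActionSuggestedBinding binding = { teleportAction, bindingPath };
    XrInteractionProfileSuggestedBinding suggested = { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
    suggested.interactionProfile = profilePath;
    suggested.suggestedBindings = &binding;
    suggested.countSuggestedBindings = 1;
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```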
Motion controller interaction
Our products support the Valve Index Controller, HTC Vive Wand and Khronos Simple Controller.
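For reference, these are the OpenXR interaction profile paths for the controllers listed above, as named by the OpenXR specification. An application typically suggests bindings for each profile it has been tuned for, and the runtime selects the profile matching the hardware in use.

```c
// Interaction profile paths for the supported controllers.
const char* supportedProfiles[] = {
    "/interaction_profiles/valve/index_controller",
    "/interaction_profiles/htc/vive_controller",
    "/interaction_profiles/khr/simple_controller",
};
```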
Eye tracking interaction
All Varjo headsets support eye tracking. The Varjo OpenXR runtime fully implements the XR_EXT_eye_gaze_interaction extension. See the official specification for details.
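As an illustration (not Varjo-specific sample code), the sketch below shows how an application might expose the gaze as a pose action and bind it to the eye gaze interaction profile defined by the extension. The instance and actionSet handles and the setup_eye_gaze helper are assumed names; the extension must have been enabled at instance creation.

```c
#include <openxr/openxr.h>
#include <string.h>

// Minimal sketch: assumes 'instance' was created with XR_EXT_eye_gaze_interaction
// enabled and 'actionSet' is an existing action set.
void setup_eye_gaze(XrInstance instance, XrActionSet actionSet)
{
    // The gaze is exposed as a pose action, just like a controller pose.
    XrActionCreateInfo actionInfo = { XR_TYPE_ACTION_CREATE_INFO };
    actionInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
    strcpy(actionInfo.actionName, "gaze_pose");
    strcpy(actionInfo.localizedActionName, "Gaze Pose");
    XrAction gazeAction;
    xrCreateAction(actionSet, &actionInfo, &gazeAction);

    // Bind it to the eye gaze interaction profile defined by the extension.
    XrPath profilePath, gazePath;
    xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction", &profilePath);
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose", &gazePath);

    XrActionSuggestedBinding binding = { gazeAction, gazePath };
    XrInteractionProfileSuggestedBinding suggested = { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
    suggested.interactionProfile = profilePath;
    suggested.suggestedBindings = &binding;
    suggested.countSuggestedBindings = 1;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    // At runtime, create an action space from gazeAction with xrCreateActionSpace
    // and locate it with xrLocateSpace each frame to get the gaze pose.
}
```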
Hand tracking
The Varjo XR-3 and VR-3 headsets support hand tracking. The Varjo OpenXR runtime fully implements the XR_EXT_hand_tracking extension. See the official specification for details.
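As an illustration, the sketch below shows how an application might create a hand tracker for the right hand and query its joint poses through the extension's API. The session, baseSpace, and predictedTime values are assumed to come from the application's normal frame loop; error handling is omitted.

```c
#include <openxr/openxr.h>
#include <stddef.h>

// Minimal sketch: assumes XR_EXT_hand_tracking was enabled at instance creation.
void locate_right_hand(XrInstance instance, XrSession session,
                       XrSpace baseSpace, XrTime predictedTime)
{
    // Extension entry points must be loaded through xrGetInstanceProcAddr.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = NULL;
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&xrCreateHandTrackerEXT);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction*)&xrLocateHandJointsEXT);

    // One tracker per hand; here the right hand with the default joint set.
    XrHandTrackerCreateInfoEXT createInfo = { XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT };
    createInfo.hand = XR_HAND_RIGHT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT handTracker;
    xrCreateHandTrackerEXT(session, &createInfo, &handTracker);

    // Query the 26 joint poses relative to baseSpace at the predicted display time.
    XrHandJointLocationEXT jointLocations[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = { XR_TYPE_HAND_JOINT_LOCATIONS_EXT };
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = jointLocations;

    XrHandJointsLocateInfoEXT locateInfo = { XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT };
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = predictedTime;
    xrLocateHandJointsEXT(handTracker, &locateInfo, &locations);

    if (locations.isActive) {
        // jointLocations[XR_HAND_JOINT_INDEX_TIP_EXT].pose now holds the index
        // fingertip pose, usable for e.g. poke or pinch interactions.
    }
}
```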