Mixed reality
Note: you are currently viewing documentation for a beta or an older version of Varjo
Varjo XR-1 Developer Edition is a device combining VR and AR capabilities. Two cameras on the front plate of the device allow you to see the real world with virtual objects or environments within it. Similarly, you can also choose to see only a portion of the real world in the virtual environment.
What is Varjo XR-1 Developer Edition
Varjo XR-1 Developer Edition has two high-resolution 12-megapixel cameras on the front plate of the headset that enable mixed reality applications and see-through capabilities. They operate at 90 Hz and provide a 94-degree circular field of view. To keep the image crisp without overwhelming the data transfer, XR-1 Developer Edition downscales the image only in the areas where you are not focused. If you have calibrated eye tracking, the area you are looking at is always rendered at the maximum possible resolution. If eye tracking is disabled, the maximum-resolution area is fixed at the center of the view. This works independently of the physical focus and peripheral displays, which remain static. We refer to the real-world image in the headset as “video pass-through.” Read more about it in the Defining camera render position section.
Developing for Unreal and Unity
Varjo currently provides Unreal and Unity integrations as well as a native SDK for XR-1 Developer Edition. To start developing with the Unreal or Unity engine, follow the installation instructions on the corresponding Unreal and Unity pages. You can find code examples of implementations in the corresponding sections for Unreal and Unity, and you can freely reuse any code shown in these examples. Check the API documentation for the API endpoints related to video pass-through development.
If your project is written in Unreal or Unity, you can port it to XR-1 Developer Edition, since both engines are supported. However, if your application was developed natively, you will need to rewrite it to use the Varjo API.
Aligning the physical world with the virtual world
At a high level, the real-world view is blended as a background layer in the Varjo compositor. Wherever the virtual content rendered by the VR application is transparent, the video feed from the real world shows through.
You can align the virtual and real worlds using a Vive Tracker associated with your virtual model. To learn how to use a Vive Tracker, refer to the Benchmark example in the SDK folder. If you do not have a Vive Tracker, you can simply move your virtual object to the position where you think it should be, using the controls in your application. To have a static physical object occlude virtual content (such as an airplane cockpit), create a model of the real object (the cockpit or car) at 1:1 scale. Align this model with the real world and make it transparent, so that the real world is visible in those areas.