This page explains methods you can use to create realistic-looking virtual objects in mixed reality.
On this page you can find the following topics:
- Matching VR output with video signal
- White balance
- Illuminating virtual objects with real world lighting
- Rendering shadows from virtual environments to real world
Matching VR output with video signal
For a realistic MR experience, the brightness and white balance of VR output should match those of the video pass-through image.
Subscribe to the DistortedColor stream that provides constantly updated camera parameter data. See the code example in MRExample -> DataStreamer.cpp
To match the brightness, configure the camera exposure settings in your VR content renderer to match the video pass-through cameras. The luminance (in cd/m2) that saturates a pixel in the video pass-through image is equal to 2^EV * cameraCalibrationConstant. See the code example in MRExample -> Scene.cpp
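As a minimal sketch of this relationship (the `ev` and `cameraCalibrationConstant` inputs are assumed to come from the DistortedColor stream metadata; the function name is illustrative):

```cpp
#include <cmath>

// Luminance (in cd/m2) that saturates a pixel in the video pass-through
// image, computed from the camera's current exposure value (EV) and the
// device-specific calibration constant delivered in the stream metadata.
double saturationLuminance(double ev, double cameraCalibrationConstant) {
    return std::pow(2.0, ev) * cameraCalibrationConstant;
}
```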
An alternative example for Unity HDRP can be found in the Unity XR plugin samples.
The DistortedColor stream metadata provides complete white balance normalization data to match VR colors and HDR cubemap with the video pass-through image.
VR content should be rendered with a fixed 6500K white point. White balance normalization can then be applied as a post-process step using the following formula:
finalColor = saturate(saturate(color * invCCM * diag(whiteBalanceColorGains)) * ccm)
where color is a 3D row vector, saturate(x) means clamping out-of-bounds values to the [0, 2^B - 1] range assuming a B-bit output image, and diag(x) is a diagonal matrix with the values of x on the diagonal.
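The formula can be sketched as follows for floating-point colors in the [0, 1] range, assuming `invCCM`, `ccm`, and the white balance color gains have been read from the stream's white balance normalization metadata (the matrix layout and helper names here are illustrative):

```cpp
#include <algorithm>
#include <array>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<Vec3, 3>;  // row-major 3x3 matrix

// Multiply a row vector by a 3x3 matrix: r_j = sum_i v_i * m[i][j].
Vec3 mul(const Vec3& v, const Mat3& m) {
    Vec3 r{0.0, 0.0, 0.0};
    for (int j = 0; j < 3; ++j)
        for (int i = 0; i < 3; ++i) r[j] += v[i] * m[i][j];
    return r;
}

// Clamp each channel to [0, 1] (floating-point equivalent of [0, 2^B - 1]).
Vec3 saturate(Vec3 v) {
    for (double& c : v) c = std::clamp(c, 0.0, 1.0);
    return v;
}

// finalColor = saturate(saturate(color * invCCM * diag(gains)) * ccm)
Vec3 whiteBalanceNormalize(const Vec3& color, const Mat3& invCCM,
                           const Vec3& gains, const Mat3& ccm) {
    Vec3 c = mul(color, invCCM);
    for (int i = 0; i < 3; ++i) c[i] *= gains[i];  // color * diag(gains)
    return saturate(mul(saturate(c), ccm));
}
```

With identity matrices and unit gains the transform leaves in-range colors unchanged, which is a convenient sanity check when wiring up the metadata.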
Illuminating virtual objects with real-world lighting
For realistic rendering, it is important to match the lighting of virtual objects with the real environment.
Subscribe to the EnvironmentCubemap stream that provides constantly updated environment lighting data. See the code example in MRExample -> DataStreamer.cpp
The environment cubemap stream contains metadata that can be used to adapt the VR output to match the video pass-through cameras. To match the brightness, the metadata contains brightnessNormalizationGain that is pre-calculated based on camera exposure parameters.
This is a simple brightness multiplier, where 1.0 means that the brightness will be unaffected. If the client application needs to express this in EV shift or EV 100, the following conversions should work:
EV = log2(brightnessNormalizationGain)
EV100 = -log2(brightnessNormalizationGain / 100.0)
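These conversions can be written directly as code (illustrative helper names; the gain is the brightnessNormalizationGain value from the stream metadata):

```cpp
#include <cmath>

// EV shift equivalent of the brightness normalization gain
// (a gain of 1.0 maps to an EV shift of 0).
double evShift(double brightnessNormalizationGain) {
    return std::log2(brightnessNormalizationGain);
}

// EV100 equivalent, per the conversion above.
double ev100(double brightnessNormalizationGain) {
    return -std::log2(brightnessNormalizationGain / 100.0);
}
```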
In the 2.4.0 release, the following changes were introduced:
The cubemap is now provided in HDR and contains accurate luminance for both direct and indirect sources of light.
The HDR cubemap now contains the environment’s luminance in cd/m2 with a 1:100 ratio. For example, a value of RGB=(1, 1, 1) corresponds to a luminance of 100 cd/m2.
The white point of the cubemap has been normalized to 6500K. See the previous chapter on how to match the white balance with the current video pass-through settings.
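As a small illustration of the two scale factors described above (both helpers are illustrative, not part of the Varjo API):

```cpp
// An HDR cubemap value of RGB = (1, 1, 1) corresponds to 100 cd/m2,
// so absolute luminance is the stored channel value scaled by 100.
double texelToLuminance(double channelValue) { return channelValue * 100.0; }

// Multiplying by brightnessNormalizationGain adapts VR content brightness
// to the current video pass-through exposure; 1.0 leaves it unchanged.
double adaptBrightness(double value, double brightnessNormalizationGain) {
    return value * brightnessNormalizationGain;
}
```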
An example for Unity HDRP can be found in the Unity XR plugin samples.
Environment Cubemap Modes
By default, the HDR cubemap is provided in a fixed 6500K color temperature with the intensity normalized so that RGB = (1, 1, 1) corresponds to a luminance of 100 cd/m2. This requires the client application to perform color and brightness adaptation as a post-process step to match the VR colors and HDR cubemap with the video pass-through image. However, it is also possible to configure the cubemap to be automatically adapted to match the video pass-through image.
In this mode, it is no longer necessary to perform any color correction steps in the post process. This mode offers a simpler way to use the cubemap, in case only the HDR cubemap is used for VR object lighting.
To change the cubemap mode, first lock the HDR cubemap configuration:
varjo_Bool wasLocked = varjo_Lock(m_session, varjo_LockType_EnvironmentCubemap);
Only one application can hold the lock at a time, so your application may fail to obtain it if another application is currently holding it. While the configuration is locked by someone else, any attempt to change the cubemap configuration will fail.
When you are holding the lock, you can change the cubemap configuration. The following example shows how to change it to an auto adaptation mode:
cubemapConfig.mode = varjo_EnvironmentCubemapMode_AutoAdapt;
Finally, unlock the cubemap configuration. Alternatively, if you want to prevent other applications from changing the cubemap mode, you can leave the configuration locked.
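Putting the steps together, the lock-configure-unlock sequence might look like the following sketch. The get/set entry points and the config struct name are assumptions based on the native MR API and should be verified against your SDK headers:

```cpp
// Sketch of the full sequence: lock, change mode, unlock.
varjo_Bool wasLocked = varjo_Lock(m_session, varjo_LockType_EnvironmentCubemap);
if (wasLocked) {
    // Assumed API: read the current config, modify it, and write it back.
    varjo_EnvironmentCubemapConfig cubemapConfig{};
    varjo_MRGetEnvironmentCubemapConfig(m_session, &cubemapConfig);
    cubemapConfig.mode = varjo_EnvironmentCubemapMode_AutoAdapt;
    varjo_MRSetEnvironmentCubemapConfig(m_session, &cubemapConfig);
    // Unlock, unless you want to keep other applications from changing the mode.
    varjo_Unlock(m_session, varjo_LockType_EnvironmentCubemap);
}
```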
Rendering shadows from virtual environments to real world
If your MR application does not include a virtual environment, your virtual objects will not cast shadows onto the real world.
To correct this, you can model a floor for your environment in the same position as the real floor. Typically it is enough to render a plane at floor level. You can then render the plane in black VR color and write the shadow intensity into the alpha channel.
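The effect can be sketched per pixel, assuming the compositor alpha-blends the VR layer over the pass-through image (this helper is illustrative, not part of the Varjo API):

```cpp
#include <algorithm>

// The shadow plane is rendered black with alpha = shadow intensity.
// When the compositor blends the VR layer over the pass-through image,
//   composited = vrColor * alpha + passthrough * (1 - alpha)
// a black vrColor simply darkens the pass-through pixel by the shadow amount.
double compositeShadow(double passthrough, double shadowIntensity) {
    const double vrColor = 0.0;  // black shadow color
    const double alpha = std::clamp(shadowIntensity, 0.0, 1.0);
    return vrColor * alpha + passthrough * (1.0 - alpha);
}
```

A shadow intensity of 0 leaves the pass-through pixel untouched, while an intensity of 1 darkens it fully.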