Chroma Key Masking
General
The Varjo Compositor calculates a chroma key mask into the alpha channel of the video texture. A video post-process shader provided by the user can modify this mask, which allows you to extend or restrict the original chroma key mask with an application-rendered mask. This additional mask can be tracked in 3D mixed reality space so that it always covers specified areas in the physical world.
One possible use case for extending the chroma key mask is to cover areas of the physical space that are difficult or impossible to handle with chroma keying alone, for example equipment, the ceiling, or lights. Covering these areas with the additional mask makes the compositor always display VR content there instead.
A use case for restricting the chroma key mask is to cover displays or dashboards with the additional mask. This prevents chroma keying in those areas, so the compositor always displays the video pass-through image there instead.
The additional application mask can be of any shape. It is rendered into an input texture for the video post-process shader, where the custom post-process shader uses it to modify the existing chroma key mask in the alpha channel of the video buffer.
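The exact shader interface depends on your graphics API, so the following is only a minimal CPU-side sketch of the per-pixel logic a post-process shader could apply. The alpha convention used here (0 means the pixel is chroma keyed and VR content shows through, 1 means video pass-through) and the function names are assumptions for illustration, not part of the Varjo API.

```cpp
#include <algorithm>

// Assumed convention: videoAlpha == 0 -> pixel is chroma keyed (VR visible),
// videoAlpha == 1 -> video pass-through. maskValue is sampled from the
// application-rendered mask texture in the range 0..1.

// Extend the chroma key mask: force VR content wherever the application
// mask covers the physical space (e.g. ceiling, lights, rigs).
float extendChromaKey(float videoAlpha, float maskValue)
{
    return std::min(videoAlpha, 1.0f - maskValue);
}

// Restrict the chroma key mask: force video pass-through wherever the
// application mask covers e.g. a display or dashboard, even if that area
// matches the chroma key color.
float restrictChromaKey(float videoAlpha, float maskValue)
{
    return std::max(videoAlpha, maskValue);
}
```

In an actual post-process shader the same combination would be done per pixel, writing the result back into the alpha channel of the video buffer.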
Example
See ChromaKeyMaskTool in the examples folder for how to implement chroma key masking in your own client application. It shows how to render the mask into the texture buffer and then use it in a video post-process shader to restrict or extend the chroma key mask. The example uses simple plane shapes, but it can easily be extended to render more complex meshes.
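Because the mask planes are rendered with the same view and projection as the VR content, they stay aligned with the tracked areas of the physical space. As a rough sketch of how such a plane could be described and turned into renderable geometry, assuming a hypothetical MaskPlane type (the Vec3 and MaskPlane structs below are illustrative and not part of the Varjo API):

```cpp
#include <array>

struct Vec3 { float x, y, z; };

// Hypothetical description of a rectangular mask area in tracking space.
struct MaskPlane {
    Vec3 center;   // center of the masked area (meters)
    Vec3 right;    // unit vector along the plane's width
    Vec3 up;       // unit vector along the plane's height
    float width;   // extent in meters
    float height;  // extent in meters
};

// Returns the four world-space corners (counter-clockwise) that the
// application rasterizes into the mask texture each frame, using the same
// view/projection matrices as its VR content.
std::array<Vec3, 4> maskPlaneCorners(const MaskPlane& p)
{
    const float hw = 0.5f * p.width;
    const float hh = 0.5f * p.height;
    auto corner = [&](float sx, float sy) {
        return Vec3{
            p.center.x + sx * hw * p.right.x + sy * hh * p.up.x,
            p.center.y + sx * hw * p.right.y + sy * hh * p.up.y,
            p.center.z + sx * hw * p.right.z + sy * hh * p.up.z,
        };
    };
    return { corner(-1, -1), corner(1, -1), corner(1, 1), corner(-1, 1) };
}
```

More complex masks can be built the same way by rendering arbitrary meshes into the mask texture instead of simple quads.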
Restrictions
Timing synchronization between video pass-through, VR content, and the rendered mask is not perfect. The application renders its VR content and the mask using the headset pose, but the mask is applied to a camera frame captured at a different time. The compositor then composes that camera frame with the VR content buffer it has at that point in time, so the two will not match exactly. We will work to improve temporal synchronization in future releases.