Masking with Unreal

When developing for Varjo XR headsets, you can mask specific objects to act as a “window” into the real world. This is especially useful if you want to see real-world elements such as instruments or controller devices in the virtual environment.

Varjo Compositor uses the alpha value of a layer to blend between the image from the client application and that from the video pass-through (VST) cameras. If VST rendering is enabled and the layer is not flagged as opaque, every pixel in the color buffer with an alpha value smaller than 1.0 will be blended with the VST image. If the color of the pixel in the color buffer is RGBA(0,0,0,0), only the VST image will be visible.
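The blend described above can be sketched as plain C++. This is an illustrative model only, not the Varjo API: `Rgba` and `BlendWithVst` are hypothetical names for the per-pixel operation the compositor conceptually performs.

```cpp
#include <cassert>

// Illustrative sketch (not the Varjo API): the per-pixel blend the
// compositor conceptually performs between the application image and
// the video pass-through (VST) image, driven by the layer's alpha.
struct Rgba {
    float r, g, b, a;
};

// alpha = 1.0 keeps only the application pixel; alpha = 0.0 keeps only
// the VST pixel; values in between blend linearly.
Rgba BlendWithVst(const Rgba& app, const Rgba& vst) {
    const float a = app.a;
    return {app.r * a + vst.r * (1.0f - a),
            app.g * a + vst.g * (1.0f - a),
            app.b * a + vst.b * (1.0f - a),
            1.0f};
}
```

With this model, an RGBA(0,0,0,0) application pixel yields exactly the VST camera color, matching the behavior described above.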

Follow these simple steps to add a VST mask to your Unreal project.

Creating a mixed reality mask

Enable mixed reality for the Varjo OpenXR plugin by following the steps in Mixed Reality with Unreal.

Start by adding a Static Mesh Actor to your scene. The mesh will act as a mask for the video pass-through image and can be any shape.

You can utilize the custom depth buffer to create the VST mask. Go to Project Settings > Rendering > Postprocessing and set Custom Depth-Stencil Pass to Enabled to turn on the custom depth buffer.

Once the custom depth buffer is set up, you have to make the mask mesh write into it. Select the created Static Mesh Actor MRMask and, in Details > Rendering, enable Render CustomDepth Pass.

In addition, in Lighting, switch off Cast Shadow.

In Rendering, switch off the following:

Visible in Reflection Captures

Visible in Real Time Sky Captures

Visible in Ray Tracing

Render in Main Pass

Render in Depth Pass

Receives Decal

The mask object is now ready. Follow the steps in Mixed Reality with Unreal to set up custom post processing, and open the post process material PP_MR in the Material Editor. Duplicate the SceneTexture:SceneDepth node and change its Scene Texture Id to CustomDepth.

Create a Material Parameter Collection and name it PP_MRParameters.

In Details > Material > Scalar Parameters, add a new element with the name MRMask.

Open the PP_MR material you created earlier.

Create a CollectionParameter node: go to General > Collection and select PP_MRParameters. In Parameter Name, select MRMask.

The logic for masking is simple: when CustomDepth is smaller than SceneDepth (i.e., the mask object is not occluded by a non-mask object), the output Opacity is 0.0; otherwise the output Opacity is 1.0. Here we have modified the post process material example from Mixed Reality with Unreal to enable masking.

If you want the mask to be always enabled, you can simply output 0.0 whenever CustomDepth is smaller than SceneDepth.
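The material's depth comparison can be expressed as a short C++ sketch. `MaskOpacity` is a hypothetical helper, not an Unreal API; the inputs stand in for the values sampled from the CustomDepth and SceneDepth buffers.

```cpp
#include <cassert>

// Sketch of the post process material's masking branch as plain C++
// (hypothetical helper, not an Unreal API). Inputs are the values
// sampled from the CustomDepth and SceneDepth buffers.
float MaskOpacity(float customDepth, float sceneDepth) {
    // Mask mesh is the closest surface: punch through to VST (alpha 0).
    // Otherwise keep the virtual scene fully opaque (alpha 1).
    return (customDepth < sceneDepth) ? 0.0f : 1.0f;
}
```

Pixels not covered by the mask mesh have a far-plane custom depth, so the comparison fails there and the virtual scene stays opaque.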

If you want to be able to control the mask, you can create a new Material Parameter Collection with a scalar value and use that in the post process material. In the example we use a scalar called MRMask.
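The scalar-controlled variant can be sketched the same way, assuming MRMask is 0.0 when the mask is off and 1.0 when it is on (the function name is again hypothetical):

```cpp
#include <cassert>

// Hypothetical extension of the masking logic with the MRMask scalar
// from PP_MRParameters (0.0 = mask off, 1.0 = mask on).
float MaskOpacityControlled(float customDepth, float sceneDepth, float mrMask) {
    const float masked = (customDepth < sceneDepth) ? 0.0f : 1.0f;
    // Lerp between "always fully opaque" (mask off) and the masked result.
    return masked * mrMask + 1.0f * (1.0f - mrMask);
}
```

In the material graph this corresponds to a Lerp node whose Alpha input is the MRMask collection parameter.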

Add a reference to this Material Parameter Collection in a blueprint and create a function for controlling the MRMask scalar parameter.

Now open BP_MRControls blueprint.

In the My Blueprint section, create a new function and name it UpdatePostProcessing.

Create a new variable and name it PPMaterialParameters. Set its Variable Type to Material Parameter Collection (Single).

Compile, and in Default Value, set PPMaterialParameters to PP_MRParameters.

Create a Boolean variable and name it MRMaskEnabled.

Recreate the logic from the reference image.

Now create a blueprint for toggling the mask and have it call the created function to set the scalar value.

To do that, open Project Settings > Engine > Input > Action Mappings, add a new action, name it MRMaskToggle, and bind it to the N key.

Please note that later versions of Unreal Engine 5 have deprecated Action Mappings; it is recommended to create an Enhanced Input Action instead.

Open BP_MRControls Event Graph and create a new InputAction MRMaskToggle.

Recreate the logic from the reference image.
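The Blueprint logic above can be summarized as a plain C++ sketch. The actual implementation is a Blueprint; the struct and member names below merely mirror the tutorial and are not engine code.

```cpp
#include <cassert>

// Plain C++ sketch of the BP_MRControls Blueprint logic (illustrative
// only; the names mirror the tutorial, but this is not engine code).
struct MRControls {
    bool MRMaskEnabled = false;
    float MRMaskValue = 0.0f;  // stands in for PP_MRParameters.MRMask

    // Equivalent of the UpdatePostProcessing function: push the
    // Boolean state into the scalar parameter the material reads.
    void UpdatePostProcessing() {
        MRMaskValue = MRMaskEnabled ? 1.0f : 0.0f;
    }

    // Equivalent of the InputAction MRMaskToggle event (key N):
    // flip the flag, then update the material parameter.
    void OnMRMaskToggle() {
        MRMaskEnabled = !MRMaskEnabled;
        UpdatePostProcessing();
    }
};
```

Each press of the N key flips MRMaskEnabled and writes the corresponding 0.0 or 1.0 into the MRMask scalar that the post process material reads.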

Click Play. When the mask is enabled, you should see the real world through the mask you created.