Eye tracking
The Varjo OpenXR runtime fully implements the XR_EXT_eye_gaze_interaction extension, which allows developers to access simple gaze data with all Varjo headsets using Unreal’s built-in OpenXREyeTracker plugin.
The current OpenXR extension provides support for combined gaze pose (position and rotation). Support for other Varjo eye tracking features, such as stereo gaze data, pupil data, and different calibration and gaze data polling modes, will be added later.
Using eye tracking with Unreal
Make sure that the OpenXREyeTracker plugin is enabled in your project. It should be enabled automatically if you have also enabled the Varjo OpenXR plugin.
When the OpenXREyeTracker plugin is enabled and your system is using the Varjo OpenXR runtime, you can access gaze origin and direction by using the GetGazeData blueprint node. Fixation point and confidence values are not currently supported with the OpenXREyeTracker plugin.
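If you are working in C++ rather than blueprints, the same data can be read through Unreal's EyeTracker module, which the GetGazeData blueprint node wraps. The following is a minimal sketch, assuming the EyeTracker module has been added to your project's Build.cs dependencies:

```cpp
// Minimal sketch: reading gaze data in C++ through Unreal's EyeTracker module.
// Assumes "EyeTracker" is listed in PublicDependencyModuleNames in Build.cs.
#include "CoreMinimal.h"
#include "EyeTrackerFunctionLibrary.h"
#include "EyeTrackerTypes.h"

void LogCurrentGaze()
{
    FEyeTrackerGazeData GazeData;

    // Returns true when the runtime provided valid gaze data.
    if (UEyeTrackerFunctionLibrary::GetGazeData(GazeData))
    {
        UE_LOG(LogTemp, Log, TEXT("Gaze origin: %s, direction: %s"),
            *GazeData.GazeOrigin.ToString(),
            *GazeData.GazeDirection.ToString());
    }
}
```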
Gaze data visualization
This example shows a simple way to visualize gaze data. Start by adding an empty actor to your scene and converting it to a blueprint.
Open the created blueprint.
We will visualize the gaze data with a simple gaze dot. First, add a Sphere component. As we intend to move the sphere with the gaze, make sure to set Transform > Mobility to Movable. Set a sensible scale for the sphere and assign a material. We also highly recommend that you disable any physics for the gaze indicator.
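The equivalent component setup can also be done in an actor's constructor. Below is a rough sketch, assuming a hypothetical AGazeDotActor class with a GazeDot component declared in its header:

```cpp
// Rough sketch of the same component setup in C++ (AGazeDotActor and its
// GazeDot member are hypothetical names for this example).
#include "Components/StaticMeshComponent.h"

AGazeDotActor::AGazeDotActor()
{
    PrimaryActorTick.bCanEverTick = true;

    // Declared in the header as: UPROPERTY() UStaticMeshComponent* GazeDot;
    GazeDot = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("GazeDot"));
    RootComponent = GazeDot;

    // The dot is moved every frame, so it must be movable.
    GazeDot->SetMobility(EComponentMobility::Movable);

    // Disable physics and collision so the indicator never pushes objects
    // around or blocks traces.
    GazeDot->SetSimulatePhysics(false);
    GazeDot->SetCollisionEnabled(ECollisionEnabled::NoCollision);

    // A sensible scale for a small gaze dot; the sphere mesh and material
    // can be assigned in a blueprint subclass.
    GazeDot->SetRelativeScale3D(FVector(0.1f));
}
```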
Open the Event Graph of the blueprint.
Use GetGazeData in the Tick event to fetch the latest gaze data. In this example we store it in a variable called GazeData, but you can also just break the EyeTrackerGazeData struct. The boolean returned by GetGazeData can be used to determine if the provided gaze data is valid. Let’s set the visibility of the sphere based on this value.
Note: Due to a bug in Unreal Engine 4.27, once gaze data has been valid, GetGazeData keeps returning true even if the gaze data is no longer valid.
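In C++, this step might look like the sketch below, continuing the hypothetical AGazeDotActor from above:

```cpp
// Sketch: fetch the latest gaze data every frame and show the gaze dot
// only while the data is valid.
#include "EyeTrackerFunctionLibrary.h"

void AGazeDotActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    FEyeTrackerGazeData GazeData;
    const bool bGazeValid = UEyeTrackerFunctionLibrary::GetGazeData(GazeData);

    // Equivalent of driving Set Visibility from GetGazeData's return value.
    GazeDot->SetVisibility(bGazeValid);
}
```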
We now have a sphere that is visible whenever the gaze data is valid. Next, let us make it follow the gaze.
Use the Break EyeTrackerGazeData node to access individual components of the gaze data struct.
There are several ways to visualize gaze data. In this example, we cast a ray along the gaze vector and place the sphere at the impact point if something was hit.
To achieve this, we can use the LineTraceByChannel node. Set Trace Channel to Visibility and make sure Ignore Self is enabled.
Set Gaze Origin as the start of the line segment and Gaze Origin + Gaze Direction * Max ray distance as the end of the line segment. In this example we use a maximum distance of 1000 units (10 meters).
Create a variable named GazeData and set its type to EyeTrackerGazeData.
Drag the GazeData variable into the Event Graph, then drag a pin from it and create a Break EyeTrackerGazeData node.
Drag a pin from its Gaze Direction output and create a Multiply (*) node. Convert the B pin to an Integer and set its value to 1000.
Use the return value of LineTraceByChannel to determine if a visible object was hit.
In case of a hit, break the Hit result and use SetWorldLocation to move the sphere to the location of the Impact Point.
If nothing was hit, we can hide the gaze dot or place the sphere at a fixed distance from the gaze origin. For the latter, you can simply set Gaze Origin + Gaze Direction * Distance as the new location of the sphere. In this example we use a fixed distance of 200 units (2 meters).
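Put together in C++, the trace-and-place logic might look like this sketch, again using the hypothetical AGazeDotActor and the same distances as the blueprint example:

```cpp
// Sketch: cast a ray along the gaze and place the dot at the impact point,
// or at a fixed distance when nothing was hit. This would be called from
// Tick whenever GetGazeData returned valid data.
#include "Engine/World.h"
#include "CollisionQueryParams.h"

void AGazeDotActor::UpdateGazeDot(const FEyeTrackerGazeData& GazeData)
{
    const float MaxRayDistance = 1000.0f;  // 10 meters
    const float FixedDistance = 200.0f;    // 2 meters

    const FVector Start = GazeData.GazeOrigin;
    const FVector End = Start + GazeData.GazeDirection * MaxRayDistance;

    // Equivalent of enabling Ignore Self on the LineTraceByChannel node.
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        // A visible object was hit: move the dot to the impact point.
        GazeDot->SetWorldLocation(Hit.ImpactPoint);
    }
    else
    {
        // Nothing was hit: place the dot at a fixed distance from the origin.
        GazeDot->SetWorldLocation(Start + GazeData.GazeDirection * FixedDistance);
    }
}
```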
If you now run the scene in VR Preview, you should see the dot follow your gaze, provided that eye tracking is calibrated and the gaze data is valid.
You can use a translucent material with depth testing disabled to make sure that the gaze dot is not occluded by opaque objects.