Eye tracking

For general information about eye tracking, see Eye tracking.

Using eye tracking with Varjo XR plugin

Calibration

To reliably track the user’s eye movements, eye tracking must first be calibrated. While the Varjo XR-3, VR-3, and Aero headsets support automatic one-dot calibration, we recommend requesting a separate calibration when more accurate eye tracking data is required. You can request calibration by calling a method in the VarjoEyeTracking class.

// Requests gaze calibration: "Fast" 5 dot calibration is used by default.
// Returns true if calibration was successfully requested.
bool VarjoEyeTracking.RequestGazeCalibration();

// Requests gaze calibration with a specific GazeCalibrationMode: allows you
// to choose between "Fast" and "Legacy" calibration.
// Returns true if calibration was successfully requested.
bool VarjoEyeTracking.RequestGazeCalibration(gazeCalibrationMode);
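
As a minimal sketch, a calibration request could be triggered from a Unity script like the one below. The key binding and the assumption that GazeCalibrationMode is nested inside VarjoEyeTracking are illustrative choices, not requirements of the plugin.

```csharp
using UnityEngine;
using Varjo.XR;

// Illustrative sketch: request a "Fast" gaze calibration when the user
// presses the space key. The key binding is this example's choice.
public class GazeCalibrationTrigger : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            bool requested = VarjoEyeTracking.RequestGazeCalibration(
                VarjoEyeTracking.GazeCalibrationMode.Fast);

            if (!requested)
            {
                Debug.LogWarning("Gaze calibration request failed.");
            }
        }
    }
}
```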

You can query a quality assessment of the current gaze calibration by calling the GetGazeCalibrationQuality method.

// Returns a score assessment of the currently used gaze calibration.
GazeCalibrationQuality VarjoEyeTracking.GetGazeCalibrationQuality();

The returned GazeCalibrationQuality struct contains a GazeEyeCalibrationQuality score for the left and right eye calibrations. These values indicate how well each eye passed the calibration procedure and how accurate the returned gaze data can be expected to be for that eye. Possible scores are Invalid (score not available), Low, Medium, and High.
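
The quality check described above might be used like this. The field names `left` and `right` on GazeCalibrationQuality are assumptions of this sketch; check the plugin's struct definition for the exact member names.

```csharp
using UnityEngine;
using Varjo.XR;

// Illustrative sketch: warn if either eye's calibration quality is poor,
// assuming GazeCalibrationQuality exposes per-eye fields named left/right.
public class CalibrationQualityCheck : MonoBehaviour
{
    public void CheckQuality()
    {
        var quality = VarjoEyeTracking.GetGazeCalibrationQuality();

        if (quality.left == VarjoEyeTracking.GazeEyeCalibrationQuality.Low ||
            quality.right == VarjoEyeTracking.GazeEyeCalibrationQuality.Low)
        {
            Debug.LogWarning("Gaze calibration quality is low; consider recalibrating.");
        }
    }
}
```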

Accessing eye tracking data

If eye tracking is calibrated, you can access eye tracking data by using either the XR Input subsystem or the methods provided in the VarjoEyeTracking class.

With the Input subsystem, the eye tracking data can be retrieved using the Eyes interface.
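
A minimal sketch of reading eye data through Unity's XR Input subsystem is shown below. It uses Unity's standard `InputDevices`, `CommonUsages.eyesData`, and `Eyes` APIs; the choice of polling in `Update` and reading only the fixation point are simplifications for illustration.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: read the fixation point via Unity's XR Input
// subsystem Eyes interface.
public class EyesInputExample : MonoBehaviour
{
    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.CenterEye);

        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.eyesData, out Eyes eyes) &&
            eyes.TryGetFixationPoint(out Vector3 fixationPoint))
        {
            Debug.Log($"Fixation point: {fixationPoint}");
        }
    }
}
```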

The methods in VarjoEyeTracking allow you to poll the eye tracking data at the full frequency supported by the headset, which is especially important for research scenarios. These methods provide eye tracking data frames as GazeData structs, which include the following information:

  • long frameNumber - A unique identifier of the frame at the time when the data was recorded.
  • long captureTime - A timestamp, in nanoseconds, of when the video frame was recorded by the eye tracking cameras.
  • GazeStatus status - A status for eye tracking.
  • GazeRay gaze - Gaze ray combined from both eyes.
  • float focusDistance - The distance between the eyes and the focus point, in meters. Values are between 0 and 2 meters.
  • float focusStability - Stability of the user’s focus. Values are between 0.0 and 1.0, where 0.0 indicates the least stable focus and 1.0 the most stable.
  • GazeEyeStatus leftStatus - A status for the left eye.
  • GazeRay left - Gaze ray for the left eye.
  • float leftPupilSize - Pupil size for the left eye, calculated according to the pupil size range detected by the headset. Values are between 0 and 1.
  • GazeEyeStatus rightStatus - A status for the right eye.
  • GazeRay right - Gaze ray for the right eye.
  • float rightPupilSize - Pupil size for the right eye, calculated according to the pupil size range detected by the headset. Values are between 0 and 1.

GazeStatus is a value for the eye tracking status of the headset as follows:

  • 0 – Data unavailable: User is not wearing the headset or eyes cannot be located
  • 1 – User is wearing the headset, but gaze tracking is being calibrated
  • 2 – Data is valid

The GazeRay struct contains the eye position in meters [origin (x, y, z)] and a normalized direction vector [forward (x, y, z)]. Gaze data is given in a left-handed coordinate system and is relative to the head pose.

GazeEyeStatus is a value for each eye as follows:

  • 0 – Eye is not tracked and not visible (e.g., the eye is shut)
  • 1 – Eye is visible but not reliably tracked (e.g., during a saccade or blink)
  • 2 – Eye is tracked but quality is compromised (e.g., the headset has moved after calibration)
  • 3 – Eye is tracked
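
Since the gaze ray is head-relative, it usually needs to be transformed into world space before use. The sketch below assumes the main camera's transform represents the head pose, which is typical for a Unity XR rig but is an assumption of this example.

```csharp
using UnityEngine;
using Varjo.XR;

// Illustrative sketch: transform the head-relative combined gaze ray into
// world space and visualize it, assuming Camera.main tracks the head pose.
public class GazeRayVisualizer : MonoBehaviour
{
    void Update()
    {
        VarjoEyeTracking.GazeData data = VarjoEyeTracking.GetGaze();

        // Only use the data when the status indicates it is valid.
        if (data.status == VarjoEyeTracking.GazeStatus.Valid)
        {
            Transform head = Camera.main.transform;
            Vector3 origin = head.TransformPoint(data.gaze.origin);
            Vector3 direction = head.TransformDirection(data.gaze.forward);

            // Draw the gaze ray out to the reported focus distance.
            Debug.DrawRay(origin, direction * data.focusDistance, Color.green);
        }
    }
}
```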

Gaze data should be polled frequently, as all frames older than 500 ms are discarded. Gaze data is polled using the following methods:

// Returns the latest gaze data frame.
GazeData VarjoEyeTracking.GetGaze();

// Writes all gaze data frames since the last call to this method in GazeData list.
// Returns the number of retrieved gaze data frames.
int VarjoEyeTracking.GetGazeList(out gazeDataList);
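
For research scenarios that need every frame, GetGazeList can be polled once per Update so that no frames are dropped between renders. The exact `out` parameter type is assumed here to be a `List<GazeData>`; adjust to the plugin's actual signature if it differs.

```csharp
using System.Collections.Generic;
using UnityEngine;
using Varjo.XR;

// Illustrative sketch: collect all gaze frames produced since the previous
// Update, e.g. for logging at the headset's full eye tracking rate.
public class GazeRecorder : MonoBehaviour
{
    private readonly List<VarjoEyeTracking.GazeData> recordedFrames =
        new List<VarjoEyeTracking.GazeData>();

    void Update()
    {
        int count = VarjoEyeTracking.GetGazeList(
            out List<VarjoEyeTracking.GazeData> newFrames);

        if (count > 0)
        {
            recordedFrames.AddRange(newFrames);
        }
    }
}
```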

Data stream options

The Varjo XR plugin provides several options for the eye tracking data stream. You can choose between smoothed (default, filter type Standard) and unfiltered gaze data (filter type None), and select a gaze output frequency of 100 Hz, 200 Hz, or the maximum frequency supported by the connected headset (default). See the general Eye tracking documentation for the frequencies supported by different headsets.

You can set the options using the following methods:

// Sets GazeOutputFilterType: None / Standard
// Returns true if output filter type was successfully set.
bool VarjoEyeTracking.SetGazeOutputFilterType(outputFilterType);

// Sets GazeOutputFrequency: MaximumSupported / Frequency100Hz / Frequency200Hz
// Returns true if output frequency was successfully set.
bool VarjoEyeTracking.SetGazeOutputFrequency(outputFrequency);

You can also query the currently set options using the following methods:

// Returns currently set GazeOutputFilterType
GazeOutputFilterType VarjoEyeTracking.GetGazeOutputFilterType();

// Returns currently set GazeOutputFrequency
GazeOutputFrequency VarjoEyeTracking.GetGazeOutputFrequency();
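
Putting the options together, a research-oriented setup might request unfiltered data at a fixed frequency at startup. The choice of None and 200 Hz below is only an example; whether these setters succeed depends on the connected headset.

```csharp
using UnityEngine;
using Varjo.XR;

// Illustrative sketch: configure an unfiltered 200 Hz gaze stream at startup.
public class GazeStreamSetup : MonoBehaviour
{
    void Start()
    {
        if (!VarjoEyeTracking.SetGazeOutputFilterType(
                VarjoEyeTracking.GazeOutputFilterType.None))
        {
            Debug.LogWarning("Failed to set gaze output filter type.");
        }

        if (!VarjoEyeTracking.SetGazeOutputFrequency(
                VarjoEyeTracking.GazeOutputFrequency.Frequency200Hz))
        {
            Debug.LogWarning("Failed to set gaze output frequency.");
        }
    }
}
```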

Eye tracking examples

Examples of using eye tracking can be found in the Eye tracking example of the Varjo Unity XR Plugin.