Varjo headsets feature the 20/20 Eye Tracker, our integrated eye tracking functionality. You can use eye tracking in your application and log gaze information for analytics. Eye tracking can also be used to interact with content; for example, the user can select an object or bring up additional information simply by looking at it.
For detailed instructions, refer to your engine’s documentation:
- Native eye tracking documentation
- Example of using eye tracking with Unity XR SDK
- Example of using eye tracking with Unreal
WHAT IS GAZE?
Gaze starts from the gaze origin (the eye) and follows the gaze vector (the direction of the gaze). The gaze vector is reported as a normalized (unit-length) vector. Understanding this concept is important for processing eye tracking data when developing for Varjo headsets.
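The gaze-ray concept above can be sketched in a few lines. This is a generic illustration, not Varjo SDK code; the function names are placeholders, and the axis convention in the example is assumed for demonstration only.

```python
import math

def normalize(v):
    """Return a unit-length copy of a 3D vector."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def gaze_point(origin, direction, distance):
    """Point along the gaze ray: origin + distance * normalized direction."""
    d = normalize(direction)
    return tuple(o + distance * c for o, c in zip(origin, d))

# A raw direction of any length normalizes to a unit gaze vector.
print(normalize((0.0, 0.0, 2.0)))                         # (0.0, 0.0, 1.0)
# The point the user is looking at, 1.5 m along the gaze ray.
print(gaze_point((0.0, 0.0, 0.0), (0.0, 0.0, 2.0), 1.5))  # (0.0, 0.0, 1.5)
```

Because the reported vector is already normalized, intersecting it with scene content reduces to a standard ray cast from the gaze origin.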
You can also record gaze data together with a video feed from the headset. For instructions, see the Gaze data logging page.
EYE TRACKER SPECIFICATIONS
Varjo headsets contain two eye tracking cameras, one for each eye. A combined gaze ray can be computed either from both eyes (the typical case) or from one eye (for example, when the other eye cannot be tracked). With the recommended computer configurations, you can expect a latency of about 20–30 ms from pupil change to the eye tracking result being available.
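The two-eyes-or-one fallback described above can be sketched as follows. This is a simplified illustration of the idea (averaging the per-eye rays), not the actual combination algorithm used by the Varjo runtime, and the function names are placeholders.

```python
import math

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def combined_gaze(left, right):
    """Combine per-eye gaze rays into a single ray.

    Each ray is an (origin, direction) pair, or None when that eye
    is not tracked. Returns None if neither eye is tracked.
    """
    tracked = [r for r in (left, right) if r is not None]
    if not tracked:
        return None                      # neither eye tracked
    if len(tracked) == 1:
        origin, direction = tracked[0]   # fall back to the tracked eye
        return origin, _normalize(direction)
    (lo, ld), (ro, rd) = tracked
    origin = tuple((a + b) / 2 for a, b in zip(lo, ro))
    direction = _normalize(tuple(a + b for a, b in zip(ld, rd)))
    return origin, direction

# Both eyes tracked: origins 64 mm apart, both looking straight ahead.
left = ((-0.032, 0.0, 0.0), (0.0, 0.0, 1.0))
right = ((0.032, 0.0, 0.0), (0.0, 0.0, 1.0))
print(combined_gaze(left, right))   # ray from between the eyes
print(combined_gaze(left, None))    # single-eye fallback
```

In a real application you would also check the per-eye tracking status reported by the SDK before trusting either ray.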
Varjo XR-3, VR-3 and Aero
- IPD range: 58–72 mm
- Gaze camera resolution: 640 × 400 px per camera
- Gaze tracking frequency: 100 Hz (default for native SDK) or 200 Hz (default for OpenXR and Unity XR SDK)
DEVELOPING WITH THE 20/20 EYE TRACKER
Before you launch a demo with eye tracking, make sure to enable Allow eye tracking in the System tab in Varjo Base.
Keep in mind that eye tracking must be recalibrated whenever the headset is taken off and put back on, even if the same person is using it. This is necessary because the headset may not be positioned exactly the same way on a person’s head every time. You can manage the calibration directly from your application.
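The recalibration rule above can be encoded as a small state gate in your application. This is a hypothetical helper, not part of the Varjo SDK; the event names are placeholders for whatever headset-removal and calibration-complete signals your engine exposes.

```python
class GazeCalibrationGate:
    """Track whether current gaze data can be trusted.

    Encodes the rule from the docs: calibration is invalidated every
    time the headset is taken off, even if the same person puts it
    back on, because the headset may sit differently on the head.
    """

    def __init__(self):
        self.calibrated = False

    def on_headset_removed(self):
        # Re-donning changes the headset's position, so any previous
        # calibration is stale.
        self.calibrated = False

    def on_calibration_finished(self):
        self.calibrated = True

    def gaze_is_valid(self):
        return self.calibrated

gate = GazeCalibrationGate()
gate.on_calibration_finished()
gate.on_headset_removed()      # same user takes the headset off
print(gate.gaze_is_valid())    # False -> request recalibration
```

When the gate reports invalid gaze, the application would trigger calibration through the SDK (or prompt the user) before using gaze data again.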