Varjo headsets come with the 20/20 Eye Tracker, our integrated eye tracking functionality. You can use eye tracking in your application and log gaze information for analytics. Eye tracking can also be used for interacting with content: for example, selecting objects or prompting additional information about a specific object simply by looking at it. For detailed instructions, refer to your engine’s documentation.
- Native eye tracking documentation
- Example of using eye tracking with Unity XR SDK
- Example of using eye tracking with Unity legacy plugin
- Example of using eye tracking with Unreal
WHAT IS GAZE?
Gaze starts from the gaze origin (the eye) and follows the gaze vector (the direction of the gaze). The gaze vector is reported as a normalized, unit-length direction. Understanding this concept is important when processing eye tracking data while developing for the Varjo headset. You can record gaze data together with the video feed from the headset; for instructions, see the Gaze data logging page.
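The origin-plus-normalized-vector model above can be sketched with plain vector math. This is an illustrative snippet, not the Varjo API; the origin and direction values are made up:

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length, as gaze vectors are reported."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def point_along_gaze(origin, direction, distance):
    """Point reached by following the normalized gaze vector
    from the gaze origin for the given distance (metres)."""
    d = normalize(direction)
    return tuple(o + distance * c for o, c in zip(origin, d))

# Hypothetical sample: gaze origin at the eye, looking slightly
# right and down; find where the gaze lands 2 m away.
origin = (0.0, 0.0, 0.0)
direction = (0.1, -0.1, 1.0)
hit = point_along_gaze(origin, direction, 2.0)
print(hit)
```

In practice you would feed the origin and direction reported by the eye tracker into a ray cast against your scene to find which object the user is looking at.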
EYE TRACKER SPECIFICATIONS
Varjo headsets contain two eye tracking cameras – one for each eye.
- IPD range – 61–73 mm
- Gaze camera resolution – 1280 × 800 px per camera
- Gaze tracking frequency – 100 Hz
- Eye tracking accuracy – 1°
- Eye tracking precision – 0.2°
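To put the angular figures above in context: an angular error θ at viewing distance d corresponds to a positional error of roughly d · tan(θ) on the target. A quick illustration in plain Python (the 2 m viewing distance is an assumed example, not part of the specification):

```python
import math

def angular_to_positional_error(distance_m, angle_deg):
    """Positional error on a target at the given distance
    caused by a given angular error: d * tan(theta)."""
    return distance_m * math.tan(math.radians(angle_deg))

# 1 degree accuracy at an assumed 2 m viewing distance:
print(angular_to_positional_error(2.0, 1.0))   # about 0.035 m
# 0.2 degree precision at the same distance:
print(angular_to_positional_error(2.0, 0.2))   # about 0.007 m
```

So at arm's-length-to-room-scale distances, the tracker resolves gaze to within a few centimetres, which is worth keeping in mind when sizing gaze-selectable targets.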
DEVELOPING WITH 20/20 EYE TRACKER
Before launching a demo that uses eye tracking, make sure Allow eye tracking is enabled in Varjo Base under the System tab.
While developing, keep in mind that every time the headset is taken off and put back on, eye tracking must be recalibrated, even for the same person. This is necessary because the position of the headset on the user’s face may differ each time it is worn. You can manage calibration directly from your application.