---
task_categories:
  - other
tags:
  - computer-vision
  - gaze-estimation
  - virtual-reality
---

# VRGaze: A Large-scale Dataset for VR Gaze Estimation

Paper | [Code](https://github.com/gazeshift3/gazeshift)

VRGaze is the first large-scale off-axis gaze estimation dataset for Virtual Reality (VR), introduced in the paper "GazeShift: Unsupervised Gaze Estimation and Dataset for VR".

## Dataset Summary

The dataset comprises 2.1 million near-eye infrared images collected from 68 participants. It is specifically designed to address data scarcity in VR gaze research, focusing on the off-axis camera configurations typical of modern headsets.

- **Images:** 2.1 million near-eye infrared images.
- **Participants:** 68 individuals.
- **Hardware:** Off-axis camera configurations common in modern VR systems.
- **Purpose:** Designed for unsupervised gaze representation learning and few-shot calibration.

## Dataset Sample

*Figure: a sample of VRGaze near-eye infrared images.*

## Usage

To use this dataset with the official GazeShift implementation, follow these steps:

### Installation

```shell
git clone https://github.com/gazeshift3/gazeshift
cd gazeshift
pip install -r requirements.txt
```

### Training

To reproduce the experiments on the VRGaze dataset, run the provided training script (ensure you update the dataset and output locations within the script):

```shell
bash Train.sh
```
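Before launching, open `Train.sh` and point it at your local copy of the dataset and a writable output directory. The variable names below are hypothetical placeholders for illustration only; check `Train.sh` for the script's actual names:

```shell
# Hypothetical placeholders -- edit the actual path variables in Train.sh.
DATA_ROOT=/path/to/VRGaze       # directory containing the VRGaze images
OUTPUT_DIR=/path/to/outputs     # where checkpoints and logs are written
```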

The model is expected to achieve a mean angular error of approximately 1.84° after per-person calibration (around 400K steps).
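For context, gaze estimation accuracy is conventionally reported as the mean angular error between predicted and ground-truth 3D gaze directions; the paper's exact evaluation protocol may differ in details such as averaging order:

```latex
\mathrm{AE}(\mathbf{g}, \hat{\mathbf{g}})
  = \arccos\!\left( \frac{\mathbf{g} \cdot \hat{\mathbf{g}}}
                         {\lVert \mathbf{g} \rVert \, \lVert \hat{\mathbf{g}} \rVert} \right),
\qquad
\mathrm{MAE} = \frac{1}{N} \sum_{i=1}^{N} \mathrm{AE}(\mathbf{g}_i, \hat{\mathbf{g}}_i)
```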