We present a method for estimating eye gaze direction that departs from conventional gaze estimation methods, most of which track specific optical phenomena such as corneal reflection and the Purkinje images. We employ an appearance manifold model, but instead of using a densely sampled spline to perform the nearest-manifold-point query, we retain the original set of sparse appearance samples and use linear interpolation among a small subset of them to approximate the nearest manifold point. The advantage of this approach is that, because only a sparse set of samples is stored, each sample can be a high-dimensional vector that retains more representational accuracy than the short vectors produced by dimensionality reduction methods. The algorithm was tested on a set of eye images labelled with ground-truth point-of-regard coordinates. We found that it estimates eye gaze with a mean angular error of 0.38 degrees, which is comparable to that of commercially available eye trackers.
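The interpolation step described in the abstract can be illustrated with a minimal sketch. This is our own NumPy illustration, not the authors' implementation: the function name estimate_gaze, the neighbourhood size k, and the least-squares weight computation are assumptions chosen to show how a query image could be approximated as a linear combination of a few stored appearance samples, with the same weights applied to their gaze labels.

import numpy as np

def estimate_gaze(query, samples, gazes, k=4):
    # query   : (d,)   flattened, normalized eye image
    # samples : (n, d) stored sparse appearance samples
    # gazes   : (n, 2) point-of-regard coordinates for each sample
    # k       : number of nearest samples used for interpolation

    # 1. Find the k appearance samples closest to the query in image space.
    dists = np.linalg.norm(samples - query, axis=1)
    idx = np.argsort(dists)[:k]
    neighbors = samples[idx]                      # (k, d)

    # 2. Solve for weights that best reconstruct the query as a linear
    #    combination of its neighbors (weights constrained to sum to 1),
    #    as in locally linear interpolation schemes.
    G = neighbors - query                         # centered neighbors
    C = G @ G.T                                   # (k, k) local Gram matrix
    C += 1e-6 * np.trace(C) * np.eye(k)           # regularize for stability
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()

    # 3. Apply the same weights to the gaze labels; the interpolated label
    #    approximates the nearest point on the appearance manifold.
    return w @ gazes[idx]

Because each stored sample is kept at full image resolution, the interpolation operates on high-dimensional vectors directly rather than on a reduced-dimensional embedding, which is the trade-off the abstract highlights.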
Appearance-based eye gaze estimation
01.01.2002
324251 bytes
Article (conference paper)
Electronic resource
English
Appearance-based Eye Gaze Estimation
British Library Conference Proceedings | 2002
Real-time gaze tracking with appearance-based models
British Library Online Contents | 2009

Auto-Calibrated Gaze Estimation Using Human Gaze Patterns
British Library Online Contents | 2017