
Real-time parallax error compensation in head-mounted eye trackers

A common problem of monocular head-mounted eye trackers is that they introduce gaze estimation errors when the distance between the point of regard (PoR) and the user differs from the distance at which the system was calibrated. This error arises because the scene camera and the eye are not co-axial (the parallax error). The standard compensation method assumes that all working planes lie at a finite set of distances and performs a separate calibration for each of these planes for each user. The distance of each fronto-parallel working plane must then be set manually before gaze estimation, which makes the approach most suitable for offline gaze analysis. A further assumption is that the working plane is fronto-parallel with respect to the scene camera, so errors are introduced when planes are viewed from other angles.
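To make the geometry concrete, the following sketch estimates the image-space gaze offset caused by the eye-camera baseline when the working distance deviates from the calibration distance. It assumes a simple pinhole model with a purely horizontal baseline; all names and numbers are illustrative assumptions, not part of the presented system.

    # Sketch of the parallax offset under a simple pinhole model with a
    # horizontal eye-to-camera baseline (illustrative assumption only).
    def parallax_offset_px(f_px, baseline_m, depth_m, calib_depth_m):
        """Approximate gaze error (pixels) in the scene image.

        f_px          -- scene camera focal length in pixels
        baseline_m    -- eye-to-scene-camera offset (non-coaxial baseline)
        depth_m       -- current distance to the working plane
        calib_depth_m -- distance at which the system was calibrated
        """
        # A point fixated at depth Z appears shifted by f*b/Z relative to
        # the eye's line of sight; calibration absorbs the shift at the
        # calibration depth, so the residual error is the difference of
        # the two shifts.
        return f_px * baseline_m * (1.0 / depth_m - 1.0 / calib_depth_m)

    # Example: 800 px focal length, 3 cm baseline, calibrated at 1 m,
    # now fixating a plane 0.5 m away -> about 24 px of error.
    print(parallax_offset_px(800.0, 0.03, 0.5, 1.0))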

A method for automatic, real-time compensation of parallax error in monocular head-mounted eye trackers is presented, which can be used when the PoR lies on a plane in the 3D environment (the fixation plane). The method employs a user-specific calibration at different depths to learn the depth compensation parameters and the error behavior. The next time the system is used, the PoR error can be interpolated and compensated for, given the depth of the plane the user is looking at. Scene depth can be obtained in real time using a calibrated scene camera. The relationship between the camera and world coordinate systems is given by x = (X·f)/Z, where x is a point in the image plane, X is the corresponding point in world coordinates, f is the focal length of the camera, and Z is the distance of the point from the camera (its depth). Z can be obtained for any point on the fixation plane from three known points on that plane and their detected correspondences in the image. Interaction with displays [Mardanbegi and Hansen 2011] is one application of this method: when the size of the display is known and the user is looking at it (the PoR lies in the display plane), the depth, and subsequently the parallax error, can be estimated. The error can thus be compensated for when the user views the display from different distances and angles.
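As a minimal sketch of the interpolation step, suppose the per-depth calibration yields a 2D correction offset at each calibration depth; the structure of the calibration data and all names below are hypothetical, since the abstract does not specify the parameterization.

    import numpy as np

    # Hypothetical calibration output: depths (m) and the measured PoR
    # error (x, y offsets in scene-image pixels) observed at each depth.
    calib_depths = np.array([0.5, 1.0, 2.0, 4.0])
    calib_errors = np.array([[24.0, 6.0],
                             [0.0, 0.0],
                             [-12.0, -3.0],
                             [-18.0, -4.5]])

    def compensate(por_xy, depth_m):
        """Correct a raw PoR estimate given the fixation-plane depth."""
        # Interpolate the learned error behavior at the observed depth
        # (np.interp clamps outside the calibrated range), then subtract
        # the predicted error from the raw PoR.
        err_x = np.interp(depth_m, calib_depths, calib_errors[:, 0])
        err_y = np.interp(depth_m, calib_depths, calib_errors[:, 1])
        return por_xy[0] - err_x, por_xy[1] - err_y

    print(compensate((512.0, 384.0), 1.5))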
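The display application also yields a direct instance of the pinhole relation: if an object of known physical extent (e.g. the display width) spans a measured number of pixels in the scene image, its depth follows from x = (X·f)/Z. The helper below is a hypothetical illustration of that arithmetic, not the authors' implementation.

    def depth_from_known_size(f_px, size_m, size_px):
        """Recover Z from the pinhole relation x = (X * f) / Z.

        An object of known width X (meters) that spans size_px pixels in
        the image lies at Z = X * f / size_px meters from the camera.
        """
        return size_m * f_px / size_px

    # Example: a 0.6 m wide display imaged at 400 px with an 800 px
    # focal length lies about 1.2 m from the scene camera.
    print(depth_from_known_size(800.0, 0.6, 400.0))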


References:

Mardanbegi, D., and Hansen, D. W. 2011. Mobile gaze-based screen interaction in 3D environments. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11), ACM, New York, NY, USA, Article 2, 4 pages.