
3D from Looking: Using Wearable Gaze Tracking for Hands-Free and Feedback-Free Object Modelling

Teesid Leelasawassuk and Walterio Mayol-Cuevas. 3D from Looking: Using Wearable Gaze Tracking for Hands-Free and Feedback-Free Object Modelling. In Proceedings of the 17th International Symposium on Wearable Computers (ISWC), pp. 105–112, June 2013. ISBN 978-1-4503-2127-3.

Abstract

This paper presents a method for estimating the 3D shape of an object being observed using wearable gaze tracking. Starting from a sparse environment map generated by a simultaneous localization and mapping (SLAM) algorithm, we use the gaze direction positioned in 3D to extract the model of the object under observation. By letting the user look at the object of interest, and without any feedback, the method determines 3D points-of-regard by back-projecting the user's gaze rays into the map. The 3D points-of-regard are then used as seed points for segmenting the object from captured images, and the resulting silhouettes are used to estimate the 3D shape of the object. We explore methods to remove outlier gaze points caused by the user saccading to non-object points, as well as methods for reducing the error in the shape estimation. Exploiting gaze information in this way enables the user of a wearable gaze tracker to perform tasks as complex as object modelling in a hands-free and even feedback-free manner.
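The core geometric step of the pipeline is the back-projection of 2D gaze samples into the sparse SLAM map to obtain 3D points-of-regard. The sketch below is not the authors' implementation; it is a minimal illustration assuming a calibrated scene camera with intrinsics K, a camera-to-world pose (R_wc, t_wc) provided by the SLAM system, a sparse map stored as an N x 3 NumPy array in metres, and a simple density filter standing in for the paper's saccade-outlier removal. All function names, parameters, and thresholds are hypothetical.

```python
import numpy as np

def gaze_ray_world(gaze_px, K, R_wc, t_wc):
    """Back-project a 2D gaze point (pixels) into a 3D ray in world coordinates.

    K    : 3x3 intrinsics of the scene camera
    R_wc : 3x3 rotation, camera-to-world
    t_wc : 3-vector, camera centre in world coordinates
    """
    # Homogeneous pixel -> normalized viewing direction in camera coordinates
    d_cam = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
    d_world = R_wc @ d_cam
    return t_wc, d_world / np.linalg.norm(d_world)


def point_of_regard(origin, direction, map_points, max_ray_dist=0.05):
    """Estimate the 3D point-of-regard as the sparse map point closest to the gaze ray.

    Returns None when no map point lies within `max_ray_dist` of the ray,
    e.g. when the gaze falls on unmapped structure.
    """
    rel = map_points - origin                       # vectors from camera centre to map points
    along = rel @ direction                         # signed distance along the ray
    perp = np.linalg.norm(rel - np.outer(along, direction), axis=1)
    perp[along <= 0] = np.inf                       # ignore points behind the camera
    idx = np.argmin(perp)
    return None if perp[idx] > max_ray_dist else map_points[idx]


def remove_saccade_outliers(pors, radius=0.10, min_neighbours=3):
    """Keep only points-of-regard supported by nearby fixations.

    Gaze samples taken while the user saccades away from the object tend to be
    isolated in 3D, whereas fixations on the observed object cluster tightly.
    """
    pors = np.asarray(pors)
    keep = []
    for i, p in enumerate(pors):
        neighbours = np.sum(np.linalg.norm(pors - p, axis=1) < radius) - 1
        if neighbours >= min_neighbours:
            keep.append(i)
    return pors[keep]
```

In use, each gaze sample and its associated camera pose would yield one candidate point-of-regard; the filtered cluster of such points then seeds the image segmentation from which the silhouettes, and ultimately the 3D shape, are computed.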
