COMBINING ABSOLUTE POSITIONING AND VISION FOR WIDE AREA AUGMENTED REALITY

Thomas Banwell, Andrew Calway. Combining Absolute Positioning and Vision for Wide Area Augmented Reality. International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications. May 2010. PDF, 386 Kbytes.

Abstract

One of the major limitations of vision-based mapping and localisation is its inability to scale and operate over wide areas. This restricts its use in applications such as Augmented Reality. In this paper we demonstrate that integrating a second, absolute positioning sensor addresses this problem, allowing independent local maps to be combined within a global coordinate frame. This is achieved by aligning trajectories from the two sensors, which enables estimation of the relative position, orientation and scale of each local map. The second sensor also provides the additional benefit of reducing the search space required for efficient relocalisation. Results illustrate the method working in an indoor environment using an ultrasound position sensor, building and combining a large number of local maps and successfully relocalising as users move arbitrarily within the map. To show the generality of the proposed method, we also demonstrate the system building and aligning local maps in an outdoor environment using GPS as the position sensor.
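The abstract does not give the alignment formulation in detail, but the step of estimating the relative position, orientation and scale of a local map from two time-synchronised trajectories is commonly solved with a closed-form similarity-transform fit (e.g. Umeyama's method). The sketch below is an illustrative assumption of how such an alignment could be computed; the function name `align_trajectories` and the use of NumPy are not from the paper.

```python
# Hedged sketch: align a local visual-map trajectory with an absolute
# positioning trajectory (ultrasound or GPS) to recover the similarity
# transform (scale s, rotation R, translation t) placing the local map
# in the global frame. Closed-form solution after Umeyama (1991).
import numpy as np

def align_trajectories(local_pts, global_pts):
    """Estimate s, R, t such that global ~= s * R @ local + t.

    local_pts, global_pts: (N, 3) arrays of time-synchronised camera
    positions from the visual map and the absolute positioning sensor.
    """
    mu_l = local_pts.mean(axis=0)
    mu_g = global_pts.mean(axis=0)
    X = local_pts - mu_l          # centred local trajectory
    Y = global_pts - mu_g         # centred global trajectory

    # Cross-covariance and its SVD give the optimal rotation.
    cov = Y.T @ X / len(local_pts)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1              # guard against a reflection solution
    R = U @ S @ Vt

    # Scale from the variance of the local trajectory, then translation.
    var_l = (X ** 2).sum() / len(local_pts)
    s = np.trace(np.diag(D) @ S) / var_l
    t = mu_g - s * R @ mu_l
    return s, R, t
```

In practice the correspondences would come from interpolating the absolute-position measurements at the timestamps of the visual keyframes before fitting the transform.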

Bibtex entry.
