
Current Projects

Real-Time Model-Based SLAM Using Line Segments


This work developed a monocular real-time SLAM system that extracts line segments on the fly and builds a wire-frame model of the scene to help tracking. Line segments provide viewpoint invariance and robustness to partial occlusion, whilst the model-based tracking is fast and efficient, reducing the problems associated with feature extraction and matching. A sketch of a line-segment front end is given below the reference.
  • A.P. Gee and W.W. Mayol. Real-Time Model-Based SLAM Using Line Segments [PDF]. To appear in LNCS proceedings of the 2nd International Symposium on Visual Computing, November 2006.
High Quality Video here.
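
As a rough illustration of the front end such a system needs, the snippet below extracts 2D line segments from a video frame. It uses OpenCV's Canny edge detector and probabilistic Hough transform purely as stand-ins; the published system has its own extractor, and the thresholds here are illustrative, not values from the paper.

    import cv2
    import numpy as np

    def extract_line_segments(frame):
        # Stand-in line-segment detector: Canny edges followed by a
        # probabilistic Hough transform (not the paper's extractor).
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                   threshold=40, minLineLength=30,
                                   maxLineGap=5)
        # Each row is an endpoint pair (x1, y1, x2, y2).
        return [] if segments is None else segments.reshape(-1, 4)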

Robust Real-Time SLAM Using Multiresolution Descriptors


We work intensively with the Real-Time Vision Group to develop robust visual mapping tools. This work, in collaboration with Denis Chekhlov, Mark Pupilli, and Andrew Calway, uses SIFT-like multiresolution descriptors within a coherent top-down framework. The resulting system outperforms previous methods in robustness to erratic motion and camera shake, and in its ability to recover from periods of measurement loss. A sketch of scale-guided matching is given below the reference.

  • Denis Chekhlov, Mark Pupilli, Walterio Mayol-Cuevas and Andrew Calway. Robust Real-Time Visual SLAM Using Scale Prediction and Exemplar Based Feature Description. To appear in Proc. CVPR 2007.
High Quality Video here.
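
To give a flavour of how a top-down filter can guide matching, the sketch below selects the stored pyramid level of a multiresolution descriptor from a scale predicted by camera-map geometry, and matches only at that level. The function names, octave spacing and threshold are illustrative assumptions, not details from the paper.

    import numpy as np

    def match_at_predicted_scale(stored_levels, live_descriptor,
                                 predicted_scale, base_scale=1.0,
                                 step=np.sqrt(2.0), max_dist=0.4):
        # Map the predicted feature scale to the nearest stored level.
        level = int(round(np.log(predicted_scale / base_scale)
                          / np.log(step)))
        level = min(max(level, 0), len(stored_levels) - 1)
        # Compare descriptors only at that level, avoiding a full
        # scale-space search.
        dist = np.linalg.norm(stored_levels[level] - live_descriptor)
        return dist < max_dist, level, dist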

A computational aid for people with no sight

In early 2006 we became involved in a multidisciplinary research programme, together with Oxford, Reading and Surrey Universities, to develop a system to assist severely visually impaired children. Pilot funding has been provided by VICTA.


People at Bristol: Oli Cooper and Walterio Mayol
Partners: Nicky Ragge (lead, Oxford), Philip Jackson (Surrey), Andrew Glennerster (Reading) and Simon Ungar (Surrey).

More Current Projects Coming Soon


Background (previous) Work

Hand Activity Detection with a wearable active camera.

This work with David Murray aimed at the automatic detection of hand activity as observed by a wearable camera. A probabilistic approach fuses different cues to infer the current activity based on the objects being manipulated; a toy cue-fusion sketch is given below the reference.
  • W.W. Mayol and D.W. Murray. Wearable Hand Activity Recognition for Event Summarization [PDF]. Proc. IEEE Int. Symposium on Wearable Computers (ISWC). Osaka, Japan, October 2005.
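
The fusion step can be pictured as a naive-Bayes combination of per-cue likelihoods. The sketch below is a toy version with made-up activity classes, cues and numbers, shown only to fix the idea; the actual cues and classes are those described in the paper.

    import numpy as np

    activities = ["drink", "write", "phone"]    # hypothetical classes
    prior = np.full(3, 1.0 / 3.0)               # uniform prior
    cue_likelihoods = [                         # p(cue | activity)
        np.array([0.7, 0.1, 0.2]),              # e.g. object cue
        np.array([0.5, 0.2, 0.3]),              # e.g. motion cue
    ]

    # Posterior over activities: prior times the product of cue
    # likelihoods, renormalised.
    posterior = prior.copy()
    for lik in cue_likelihoods:
        posterior *= lik
    posterior /= posterior.sum()
    print(activities[int(np.argmax(posterior))], posterior)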

Hand gesture and "grasping vector" detection.


This work aimed to detect hand gestures in real time as seen by an active wearable camera.
  • W.W. Mayol, A.J. Davison, B.J. Tordoff, N.D. Molton, and D.W. Murray. Interaction between hand and wearable camera in 2D and 3D environments [PDF]. Proc. British Machine Vision Conference 2004. London, UK, September 2004.

General Regression for Image Tracking.


This work with David Murray developed a method to track planar and near-planar image regions by posing the problem as one of statistical general regression. The method is fast and robust to occlusions and rapid object motion. A sketch of the regression-tracking idea is given below the reference.

  • W.W. Mayol and D.W. Murray. Tracking with General Regression. Machine Vision and Applications. In press.
HQ videos: movie1, movie2, movie3
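
The sketch below shows the regression-tracking idea in its simplest linear form: offline, known small warps of a reference patch yield pairs of parameter updates and intensity differences, from which a linear predictor is fit; online, one matrix-vector product maps the observed difference to a parameter correction. The paper's general-regression formulation is richer than this linear special case.

    import numpy as np

    def learn_regressor(intensity_diffs, param_updates):
        # Least-squares fit of a linear map A with dp ~ A @ di, from
        # training pairs generated by perturbing a reference patch.
        DI = np.asarray(intensity_diffs)    # (n_samples, n_pixels)
        DP = np.asarray(param_updates)      # (n_samples, n_params)
        X, *_ = np.linalg.lstsq(DI, DP, rcond=None)
        return X.T                          # A, shape (n_params, n_pixels)

    def track_step(A, reference_patch, current_patch):
        # Predict the warp-parameter correction from the current
        # intensity difference; in practice this is iterated per frame.
        di = (current_patch - reference_patch).ravel()
        return A @ di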

Annotating 3D scenes in real time using hand gestures.

This work used hand gesture recognition to add virtual objects to a scene whose structure is also recovered in real time. The system uses a passive wearable wide-angle camera and Andrew Davison's SLAM algorithm.

  • W.W. Mayol, A.J. Davison, B.J. Tordoff, N.D. Molton, and D.W. Murray. Interaction between hand and wearable camera in 2D and 3D environments [PDF]. Proc. British Machine Vision Conference 2004. London, UK, September 2004.

Video contains audio.

Simultaneous Localisation and Mapping with a Wearable Active Vision Camera.

This research, carried out together with Andrew Davison, presented the first real-time simultaneous localisation and mapping system for an active vision camera. The intended example application is remote collaboration, where a remote expert observes the world through the wearable robot and adds augmented reality annotations to be seen by the wearer. The standard EKF steps at the heart of such a system are sketched after the references below.
  • A.J. Davison, W.W. Mayol, and D.W. Murray. Real-Time Localisation and Mapping with Wearable Active Vision [PDF]. Proc. IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Tokyo, Japan, October 7-10, 2003.
  • W.W. Mayol, A.J. Davison, B.J. Tordoff, and D.W. Murray. Applying Active Vision and SLAM to Wearables. Proc. 11th Int. Symposium of Robotics Research (ISRR). Siena, Italy, 19-22 October 2003.
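
For reference, the two textbook EKF steps underlying such a system are sketched below in generic form. This is the standard filter only, not the paper's implementation, which additionally couples the filter to control of the active camera's fixation.

    import numpy as np

    def ekf_predict(x, P, f, F, Q):
        # Propagate state and covariance through the motion model f,
        # with Jacobian F and process noise Q.
        return f(x), F @ P @ F.T + Q

    def ekf_update(x, P, z, h, H, R):
        # Fuse a feature measurement z with its prediction h(x),
        # using measurement Jacobian H and noise R.
        y = z - h(x)                        # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new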