Current Projects
Real-Time Model-Based SLAM Using Line Segments

This work developed a monocular real-time SLAM system that extracts line segments on the fly and builds a wire-frame model of the scene to aid tracking. Line segments provide viewpoint invariance and robustness to partial occlusion, whilst the model-based tracking is fast and efficient, reducing the problems associated with feature extraction and matching.
Robust Real-Time SLAM Using Multiresolution Descriptors

We work intensively with the Real-Time Vision Group to develop robust visual mapping tools. This work, in collaboration with Denis Chekhlov, Mark Pupilli, and Andrew Calway, uses SIFT-like descriptors within a coherent top-down framework. The resulting system outperforms previous methods in robustness to erratic motion and camera shake, and in its ability to recover from periods of measurement loss.
A computational aid for people with no sight

In early 2006 we became involved in a multidisciplinary research programme, together with Oxford, Reading and Surrey Universities, to develop a system to assist severely visually impaired children. Pilot funding has been provided by VICTA.

People at Bristol: Oli Cooper and Walterio Mayol. Partners: Nicky Ragge (lead, Oxford), Philip Jackson (Surrey), Andrew Glennerster (Reading) and Simon Ungar (Surrey).
More current projects coming soon.
Background (previous) Work
Hand Activity Detection with a Wearable Active Camera

This work with David Murray aimed at the automatic detection of hand activity as observed by a wearable camera. A probabilistic approach fuses different cues to infer the current activity based on the objects being manipulated.
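One simple way to realise this kind of cue fusion is a naive-Bayes combination of per-cue likelihoods. The sketch below is purely illustrative: the activities, cues, and likelihood values are invented placeholders, not taken from the original system.

```python
# Minimal naive-Bayes cue fusion sketch. The activities, cues, and
# likelihood values are illustrative placeholders, not from this work.
ACTIVITIES = ["typing", "drinking", "writing"]

def fuse_cues(prior, cue_likelihoods):
    """Multiply per-cue likelihoods P(cue | activity) with the prior
    over activities, then renormalise to obtain a posterior."""
    post = list(prior)
    for lik in cue_likelihoods:
        post = [p * l for p, l in zip(post, lik)]
    total = sum(post)
    return [p / total for p in post]

uniform_prior = [1 / 3] * 3
object_cue = [0.7, 0.1, 0.2]   # e.g. an object detector reports a keyboard
motion_cue = [0.6, 0.2, 0.2]   # e.g. a hand-motion pattern classifier
posterior = fuse_cues(uniform_prior, [object_cue, motion_cue])
# posterior peaks on "typing", the activity both cues agree on
```

Each additional cue simply contributes another likelihood vector, so weak individual cues can still yield a confident fused estimate.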
Hand gesture and "grasping vector" detection

This work aimed at detecting hand gestures in real time as seen by an active wearable camera.
General Regression for Image Tracking

This work with David Murray developed a method to track planar and near-planar image regions by posing the problem as one of statistical general regression. The method is fast and robust to occlusions and rapid object motions.
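As a rough illustration of the regression idea (a hyperplane-style linear predictor, not the paper's actual formulation), the sketch below learns a linear map from patch intensity differences to a 2D translation on a synthetic image; all function names and parameters are assumptions.

```python
import numpy as np

def make_image(h=64, w=64, seed=0):
    # Synthetic test image, smoothed so intensity varies gently with position.
    rng = np.random.default_rng(seed)
    img = rng.random((h, w))
    for _ in range(10):  # cheap box-filter smoothing
        img = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                   + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0
    return img

def crop(img, x, y, size):
    # Flattened square patch with top-left corner (x, y).
    return img[y:y + size, x:x + size].ravel()

def train_predictor(img, x0, y0, size, max_shift=3):
    # Learn a linear map A from intensity differences to (dx, dy) by
    # regressing over all small integer shifts of the reference patch.
    ref = crop(img, x0, y0, size)
    diffs, shifts = [], []
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            diffs.append(crop(img, x0 + dx, y0 + dy, size) - ref)
            shifts.append((dx, dy))
    A, *_ = np.linalg.lstsq(np.array(diffs), np.array(shifts, float), rcond=None)
    return ref, A

def predict_shift(img, x, y, size, ref, A):
    # One regression step: map the current intensity difference to a shift.
    return (crop(img, x, y, size) - ref) @ A
```

A real tracker would iterate this prediction per frame and handle more general warps than pure translation; here, cropping the patch two pixels right and one down of the trained position should be reported as a shift of roughly (2, 1).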
Annotating 3D scenes in real time using hand gestures

This work used hand gesture recognition to add virtual objects on top of a scene that is itself recovered in real time. The system uses a passive wearable wide-angle camera and Andrew Davison's SLAM algorithm.
Simultaneous Localisation and Mapping with a Wearable Active Vision Camera

This research, carried out with Andrew Davison, presented the first real-time simultaneous localisation and mapping system for an active vision camera. The intended example application is remote collaboration, where a remote expert observes the world through the wearable robot and adds augmented-reality annotations to be seen by the wearer.