Research Project: Mobile and Wearable Computing | Computer Vision

Real-Time Camera Tracking Using Particle Filtering

Andrew Calway, Walterio Mayol-Cuevas, Mark Pupilli, Denis Chekhlov, Andrew Gee

The aim of this project is to develop robust algorithms for tracking the position and orientation (6 degrees of freedom) of a moving camera in real-time. We are particularly interested in applications in which the camera is held or worn by a user, such as on a PDA or a head-mounted display. This is a challenging area of research given that the motions are often rapid and unpredictable.

We have built a prototype vision-only system in which tracking is based on correspondences between visual features in the scene, and the ego-motion of the camera is estimated using a sequential Monte Carlo method in the form of a particle filter. This is a flexible and computationally efficient method for dealing with the non-linear and non-Gaussian nature of the problem. It results in robust tracking at real-time frame rates and has several key advantages over previous approaches, notably the ability to maintain track even in the presence of severe occlusion and camera shake. Our system is the first to demonstrate real-time camera tracking of this form using a sequential Monte Carlo method.
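To make the idea concrete, the following is a minimal sketch of one sampling-importance-resampling (SIR) step for 6-DoF camera pose tracking. It is an illustration only, not the project's implementation: the state layout (3-D translation plus a unit quaternion), the random-walk diffusion model, and all names and noise parameters are assumptions.

```python
import numpy as np

def normalize(q):
    """Renormalize a quaternion to unit length."""
    return q / np.linalg.norm(q)

def sir_step(particles, weights, measure_likelihood,
             trans_noise=0.01, rot_noise=0.005):
    """One particle-filter update for camera pose.

    particles: (n, 7) array, each row = [tx, ty, tz, qw, qx, qy, qz].
    measure_likelihood: callable scoring one particle against the
        current image measurements (hypothetical interface).
    """
    n = len(particles)
    # 1. Resample proportionally to the previous weights.
    idx = np.random.choice(n, size=n, p=weights)
    particles = particles[idx].copy()
    # 2. Diffuse: random-walk motion model (no odometry assumed).
    particles[:, :3] += np.random.normal(0.0, trans_noise, (n, 3))
    particles[:, 3:] += np.random.normal(0.0, rot_noise, (n, 4))
    particles[:, 3:] = np.apply_along_axis(normalize, 1, particles[:, 3:])
    # 3. Re-weight each pose hypothesis by how well the predicted
    #    features match the image, then normalise the weights.
    weights = np.array([measure_likelihood(p) for p in particles])
    weights /= weights.sum()
    return particles, weights
```

Because the posterior is represented by the whole particle set rather than a single Gaussian, multi-modal ambiguity during shake or occlusion is handled naturally: the cloud simply spreads until the measurements again favour one mode.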

Our current work is focused on further extensions of the approach, particularly in the areas of feature initialisation, robust SLAM, guided tracking using 3-D models, and the generation of super-resolution imagery from a roving camera.


The following videos illustrate the tracking performance of our approach. In each case we have augmented the scene with graphics to show the synchronisation with the video stream. Salient feature points, represented by image templates, are used as measurements to implement the tracking: first the corners of a pre-defined calibration pattern, and then salient features introduced automatically whilst simultaneously tracking (SLAM operation). Videos are encoded as MPEG-1 or DivX (the DivX versions are smaller and of better quality).

Example 1 below shows successful tracking using the calibration pattern whilst the camera undergoes rapid and wide-angled motions. The 3-D plot in the left-hand corner (the 'firework') shows the particles projected onto the 3-D translation sub-space, with white indicating particles with high weight. The histogram in the bottom-right shows the approximations to the evolving densities along each dimension within the filter (from bottom to top: 3-D translation, then 3-D rotation represented by a 4-element quaternion). The bottom-left sequence shows the features projected into each 'camera particle' for each frame; note how the particle clouds spread during camera shake as the filter widens its search for the best motion, and then shrink once it has latched on to the correct motion after the shaking has ceased. This is a key component of the filter's ability to robustly maintain and recover track during complex motion. Example 2 is similar, illustrating successful tracking and recovery during severe camera shake and total occlusion. Note that the filter recovers even though the camera continues to move during the occlusion, ending up significantly displaced from its position at the onset of occlusion.
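The 'camera particle' views mentioned above come from projecting known scene points through each hypothesised pose. A minimal sketch of that projection, assuming a simple pinhole model with illustrative intrinsics (focal length and image centre are placeholders, not our calibration):

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def project(point_w, t, q, focal=500.0, centre=(320.0, 240.0)):
    """Project a world point into the image of one camera particle.

    t, q: the particle's translation and orientation quaternion.
    """
    p_cam = quat_to_rot(q) @ (point_w - t)   # world -> camera frame
    u = focal * p_cam[0] / p_cam[2] + centre[0]
    v = focal * p_cam[1] / p_cam[2] + centre[1]
    return np.array([u, v])
```

Comparing the image template at each projected location against the live frame gives the per-particle likelihood; when the clouds spread during shake, these projections fan out across the image, which is exactly what the bottom-left sequence in the videos visualises.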

Example 1 (mpeg video)   Example 2 (mpeg video)
Example 1 (divX version)   Example 2 (divX version)

The two examples below illustrate the system successfully introducing new features into its scene map during tracking (SLAM operation). The new features then enable tracking to continue even when the initialisation features move out of view. The system also successfully maintains track during camera shake.
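The map-maintenance logic behind this SLAM operation can be sketched as follows. This is purely illustrative: the visibility threshold, the salient-point detector, and the back-projection used to seed a new feature's 3-D estimate are all assumptions, not the project's actual code.

```python
import numpy as np

def maintain_map(scene_map, visible_ids, detect_salient, backproject,
                 min_visible=8):
    """Add new features when too few map features remain visible.

    scene_map: dict mapping feature id -> 3-D position estimate.
    detect_salient: callable returning new salient image points (u, v).
    backproject: callable giving an initial 3-D estimate for an image
        point (refined by the filter over subsequent frames).
    """
    if len(visible_ids) >= min_visible:
        return scene_map          # enough measurements; nothing to do
    for uv in detect_salient():
        feat_id = len(scene_map)  # simple sequential id for the sketch
        scene_map[feat_id] = backproject(uv)
    return scene_map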

Example 3 (mpeg video)   Example 4 (mpeg video)
Example 3 (divX version)   Example 4 (divX version)


Details of the research in this project can be found in the following publication:


The following people are working on this project within the Mobile and Wearable Computing and Computer Vision Research Groups.

Walterio Mayol-Cuevas
Denis Chekhlov
Andrew Gee

Further Information

For more information on this work contact Andrew Calway or Mark Pupilli.