This paper summarises recent work on vision-based localisation of a moving camera using particle filtering. We are interested in real-time operation for mobile and wearable computing applications, in which the camera is worn or held by a user. Specifically, we aim for localisation algorithms that are robust both to the real-life motions associated with human activity and to the dynamic clutter encountered in real environments. Particle filtering offers greater generality than previous approaches, enabling it to handle the multi-modal uncertainties characteristic of such operating conditions. We present an overview of the methodology and experimental results for different tracking scenarios, both with and without prior knowledge of scene structure.
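To illustrate why particle filtering can represent the multi-modal uncertainties mentioned above, the following is a minimal sketch of a generic sampling-importance-resampling (SIR) particle filter. This is an assumption-laden toy example, not the paper's tracker: the 1-D state, the Gaussian motion and measurement noise values, and the function names are all illustrative choices, whereas the paper's filter would operate on a full camera pose.

```python
import numpy as np

# Illustrative sketch only: a generic 1-D SIR particle filter, not the
# paper's actual pose tracker. State dimension, noise levels, and the
# Gaussian likelihood are assumptions chosen for clarity.

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_noise=0.5, meas_noise=0.3):
    # Predict: diffuse each pose hypothesis with the motion model.
    particles = particles + rng.normal(0.0, motion_noise, size=particles.shape)
    # Update: reweight each hypothesis by its measurement likelihood.
    weights = weights * np.exp(-0.5 * ((particles - measurement) / meas_noise) ** 2)
    weights /= weights.sum()
    # Resample: concentrate particles on high-likelihood regions while
    # still allowing several separated modes to coexist in the set.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Track a slowly drifting 1-D "camera position" from noisy observations.
n = 500
particles = rng.uniform(-10.0, 10.0, n)
weights = np.full(n, 1.0 / n)
true_pos = 2.0
for _ in range(20):
    true_pos += 0.1
    z = true_pos + rng.normal(0.0, 0.3)   # noisy observation
    particles, weights = particle_filter_step(particles, weights, z)

estimate = float(np.mean(particles))
print(f"estimate = {estimate:.2f}, true = {true_pos:.2f}")
```

Because the posterior is carried as a set of weighted samples rather than a single Gaussian, the same machinery can maintain several competing pose hypotheses at once, which is the property the abstract credits for robustness to clutter and erratic motion.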