Dr Chris Renton


Bayesian Machine Vision

Machine vision is becoming an increasingly significant source of sensor information for robotics and autonomous vehicles. Due to the high dimensionality of the raw sensor data, preprocessing is typically employed to extract salient information, such as optic flow, from a sequence of images; these extracted quantities are then treated as virtual measurements. Such algorithms can introduce additional sources of uncertainty (e.g., false-positive outliers), which complicates data fusion within the Bayesian framework, since that framework requires fully characterised measurement uncertainties for each information source. This research aims to characterise the uncertainties due to the camera sensor and environmental texture, and to propagate these through an optic flow algorithm and a non-parametric spherical camera model to build measurement likelihood functions that can be used with confidence in data fusion applications.
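One way to accommodate false-positive outliers in a virtual measurement is to use a robust likelihood. The sketch below (an illustrative assumption, not the group's actual model; the mixture weight and outlier support are hypothetical parameters) models an optic-flow measurement as a mixture of an inlier Gaussian and a uniform outlier component:

```python
import numpy as np

def flow_log_likelihood(z, z_pred, cov, p_outlier=0.1, flow_range=100.0):
    """Robust log-likelihood of a measured optic-flow vector z.

    z, z_pred  : (2,) measured and predicted flow, pixels/frame
    cov        : (2, 2) inlier measurement covariance
    p_outlier  : assumed prior probability of an outlier
    flow_range : half-width of the uniform outlier support, pixels/frame
    """
    r = z - z_pred
    inv_cov = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    inlier = norm * np.exp(-0.5 * r @ inv_cov @ r)
    outlier = 1.0 / (2.0 * flow_range) ** 2  # uniform density over the support
    return np.log((1.0 - p_outlier) * inlier + p_outlier * outlier)

z_pred = np.array([1.0, 2.0])
cov = 0.25 * np.eye(2)
print(flow_log_likelihood(np.array([1.1, 2.1]), z_pred, cov))   # near the prediction
print(flow_log_likelihood(np.array([50.0, -40.0]), z_pred, cov))  # gross outlier
```

Unlike a pure Gaussian likelihood, the mixture keeps the log-likelihood of a gross outlier bounded, so a single false positive cannot dominate the fused estimate.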

Unified estimation and control

To achieve robust autonomy, there is a desire to build autonomous systems that can learn about themselves and their environment and make rational decisions under the associated uncertainties. This motivates jointly considering the problems of modelling, estimation and control within a unifying framework. Such an approach enables the uncertainties obtained during system identification to be propagated correctly to the state transition and measurement likelihood functions used in state estimation; the model and state uncertainties can then be considered jointly when making optimal control decisions.
One of the most promising unifying frameworks appears to be variational free-energy (VFE) minimisation (Friston, 2010). This framework has been shown to recover, as special cases, several well-known results: variational Bayes, Expectation-Maximisation, Bayesian state estimation, and optimal control, including the Hamilton-Jacobi-Bellman equation and reinforcement learning. This research aims to explore the model structures required to implement practical online VFE minimisation for model classes that include mobile robots and autonomous vehicles and their associated sensors.
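To make the VFE idea concrete, here is a minimal sketch on an assumed toy model (not the full framework): a variational Gaussian q(x) = N(m, s²) is fitted, by gradient descent on the free energy, to the posterior of a linear-Gaussian model with prior x ~ N(0, 1) and measurement y = x + noise of variance r. Because this toy model is conjugate, the exact posterior is known, so the fit can be checked in closed form:

```python
import numpy as np

def free_energy(m, s, y, r):
    # F = KL(q || prior) - E_q[log p(y | x)] + const, for Gaussian q, prior, likelihood
    kl = 0.5 * (s**2 + m**2 - 1.0 - np.log(s**2))
    expected_nll = ((y - m)**2 + s**2) / (2.0 * r)
    return kl + expected_nll

def minimise_vfe(y, r, lr=0.05, steps=2000):
    m, s = 0.0, 1.0
    for _ in range(steps):
        grad_m = m + (m - y) / r      # dF/dm
        grad_s = s - 1.0 / s + s / r  # dF/ds
        m -= lr * grad_m
        s -= lr * grad_s
    return m, s

y, r = 2.0, 0.5
m, s = minimise_vfe(y, r)
# The exact posterior has mean y/(1+r) and variance r/(1+r),
# which the variational fit should recover:
print(m, s**2)  # ≈ 1.333, 0.333
```

For non-conjugate models no such closed form exists, and iterative free-energy minimisation of this kind is the practical route.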

Infinite-dimensional mapping and planning

Traditional map-building tools for mobile robots include salient landmark maps and occupancy grids. Landmark maps encode a sparse representation of the environment, but pose challenges in salient feature (re-)identification and feature-to-landmark data association, and do not naturally handle line-of-sight occlusion. Grid maps overcome these issues by directly modelling the probability of occupancy of a large but finite number of discrete cells, but introduce unnatural spatial correlation structures. Building on emerging results for exact inference on stochastic processes, this research aims to develop new Bayesian mapping tools that model the environment as an infinite-dimensional occupancy field.
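A simple way to illustrate a continuous occupancy field (a standard Gaussian-process construction, offered here as a hedged stand-in for the exact stochastic-process inference described above) is to place a GP prior over a latent field f(x) and condition it on point observations labelled occupied (+1) or free (-1); the kernel length scale and noise level below are assumed values:

```python
import numpy as np

def sq_exp_kernel(a, b, length=0.5):
    # Squared-exponential kernel between two sets of 2-D points
    d2 = np.sum((a[:, None, :] - b[None, :, :])**2, axis=-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_occupancy(train_x, train_y, query_x, noise=0.1):
    K = sq_exp_kernel(train_x, train_x) + noise * np.eye(len(train_x))
    k_star = sq_exp_kernel(query_x, train_x)
    mean = k_star @ np.linalg.solve(K, train_y)
    # squash the latent field to a probability of occupancy
    return 1.0 / (1.0 + np.exp(-mean))

# Two occupied points and two free points in the plane:
X = np.array([[0.0, 0.0], [0.2, 0.0], [2.0, 2.0], [2.2, 2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
q = np.array([[0.1, 0.0], [2.1, 2.0]])
print(gp_occupancy(X, y, q))  # high near occupied points, low near free ones
```

In contrast to a grid map, the kernel makes the spatial correlation structure explicit and tunable, and the field can be queried at any point rather than only at cell centres.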

Trajectory planning for multi-agent missions, whether to achieve persistent operations or to complete a distributed task in the minimum possible time, is predominantly constrained by energy and range considerations. This research poses these problems as cyclic-horizon optimisation and infinite-dimensional integer programming problems, respectively, both of which have been found to admit tractable solutions.
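As a toy illustration of the minimum-time distributed-task objective (exhaustive search standing in for the integer program, on made-up cost numbers), the sketch below assigns tasks to agents so as to minimise the mission completion time, i.e. the makespan:

```python
import itertools

def min_makespan(task_costs, n_agents):
    """task_costs[i][a] = time for agent a to complete task i."""
    best = (float("inf"), None)
    for assign in itertools.product(range(n_agents), repeat=len(task_costs)):
        loads = [0.0] * n_agents
        for task, agent in enumerate(assign):
            loads[agent] += task_costs[task][agent]
        # mission time is set by the busiest agent
        best = min(best, (max(loads), assign))
    return best

costs = [[3.0, 5.0], [2.0, 2.0], [4.0, 1.0]]
makespan, assignment = min_makespan(costs, 2)
print(makespan, assignment)  # 3.0 (0, 1, 1)
```

Exhaustive search scales exponentially in the number of tasks, which is precisely why a tractable integer-programming formulation of such problems matters.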

