The Computational Optimisation Group has a 30-year track record of research in decision-making under uncertainty and in stochastic and robust optimisation. Current interests of the group include representing uncertainty in optimisation models, designing numerical optimisation algorithms and computational software frameworks, and applying these algorithms to models of energy production, capacity planning, manufacturing and distribution under uncertainty, as well as to financial engineering and risk management. The Statistical Machine Learning Group focuses on Bayesian methods; application areas include autonomous systems, robotics, time series analysis, reinforcement learning, and brain-machine interfaces. Several other groups within the Sargent Centre also have strands of research in computational optimisation.
Area | Academic | Expertise
---|---|---
Bayesian Optimisation | Deisenroth, Misener | Algorithm design, Optimisation for machine learning, Parameter estimation |
Optimisation-based Control | Bogle, Chachuat, Deisenroth, Dua | Dynamic optimisation, Multiparametric programming, Optimal Control |
Implementations & Software | Adjiman, Chachuat, Fraga, Kucherenko, Misener | Deterministic global optimisation, Global sensitivity analysis, Set-membership estimation, Mixed-integer nonlinear optimisation, Multi-objective optimisation |
Large-Scale Convex Optimisation | Parpas | Multiscale optimisation |
Machine Learning | Deisenroth | Gaussian processes, Reinforcement learning, Bayesian inference, Graphical models, Active learning/optimal design |
Mixed-Integer Nonlinear Optimisation | Adjiman, Chachuat, Misener | alphaBB, Algorithm design, Cutting planes, Deterministic global optimisation, Implementations & software, Multivariable relaxations |
Multilevel Optimisation | Adjiman, Rustem, Wiesemann | Bilevel optimisation, Minimax, Semi-infinite optimisation |
Optimisation under Uncertainty | Rustem, Wiesemann, Fraga | Chance-constrained systems, Risk management, Robust optimisation, Stochastic optimisation
Signal Processing | Deisenroth | Bayesian state estimation, System identification, Inference and learning in nonlinear dynamical systems |
Highlighted Project
Bayesian Optimisation with Dimension Scheduling: Application to Biological Systems
Highlighted Project Authors:
Doniyor Ulmasov, Caroline Baroukh (INRA, France), Benoit Chachuat, Marc Peter Deisenroth, Ruth Misener
Bayesian Optimisation (BO) is a data-efficient, global black-box method for optimising an expensive-to-evaluate fitness function; BO uses Gaussian Processes (GPs) to describe a posterior distribution over fitness functions given the available experiments. As in experimental design, an acquisition function is applied to this GP posterior to suggest the next (optimal) experiment.
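A minimal sketch of this loop, assuming minimisation of a fitness function, an expected-improvement acquisition function, random-candidate acquisition maximisation and scikit-learn's GP regressor, might look as follows; the function names and settings are illustrative, not the project's implementation:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    """Expected improvement (for minimisation) of candidates under the GP posterior."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    improvement = y_best - mu - xi
    z = improvement / sigma
    return improvement * norm.cdf(z) + sigma * norm.pdf(z)

def bayesian_optimisation(fitness, bounds, n_init=5, n_iter=25, seed=0):
    """Minimise an expensive black-box fitness over box bounds (d x 2 array)."""
    rng = np.random.default_rng(seed)
    d = bounds.shape[0]
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, d))
    y = np.array([fitness(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)                      # GP posterior over the fitness function
        # Cheap random-candidate maximisation of the acquisition function.
        X_cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, d))
        x_next = X_cand[np.argmax(expected_improvement(X_cand, gp, y.min()))]
        X = np.vstack([X, x_next])        # run the suggested "experiment"
        y = np.append(y, fitness(x_next))
    return X[np.argmin(y)], y.min()
```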
Dynamic models of biological processes allow us to test biological hypotheses while running fewer costly, real-world experiments. We consider estimating biological parameters (e.g., reaction rate kinetics) by minimising the squared error between model predictions and experimental data points. Specifically, we propose BO for efficient parameter estimation of a dynamic microalgae metabolism model (Baroukh et al., 2014). The forcing function is based on light exposure and nitrate input; experimental data have been collected for measurable outputs including lipids, carbohydrates, carbon organic biomass, nitrogen organic biomass and chlorophyll. Our method, however, is general and may be applied to any process model.
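For concreteness, the squared-error objective that BO would minimise might be assembled along these lines, where `simulate` is a hypothetical wrapper around the dynamic metabolism model (not part of the published code) returning the predicted outputs at the measurement times:

```python
import numpy as np

def squared_error_fitness(theta, simulate, t_obs, y_obs):
    """Sum of squared residuals between simulated and measured outputs.

    `simulate(theta, t_obs)` is a hypothetical wrapper around the dynamic model,
    returning predicted outputs (lipids, carbohydrates, ...) at the measurement
    times; `y_obs` holds the corresponding experimental data.
    """
    y_pred = simulate(theta, t_obs)
    return np.sum((y_pred - y_obs) ** 2)
```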
There are several timescales for collecting microalgae metabolism data: an experiment may take 10 days, while each model simulation of Baroukh et al. (2014) runs in a fraction of a second. BO is traditionally applied to functions with expensive evaluation costs, e.g., running a 10-day experiment, but the objective of this paper is testing biological hypotheses; we are specifically interested in running the simulation model many times for parameter estimation.
In biological parameter estimation, BO is challenging because the parameters interact nonlinearly and the broad parameter bounds result in a huge search space. Due to the high problem dimensionality (in this context, 10 parameters), balancing exploration against exploitation becomes more intricate and traditional Bayesian methods do not scale well. We therefore introduce a new Dimension Scheduling Algorithm (DSA) to deal with high-dimensional models. At each iteration, the DSA selects a random subset of dimensions and optimises the fitness function only along those dimensions. This reduces the necessary computation time and allows the dimension scheduling method to find good solutions faster than the traditional method. The increased computational speed stems from the reduced number of data points per GP and the reduced number of input dimensions in each GP; GPs scale linearly in the number of dimensions but cubically in the number of data points. Additionally, considering a limited number of dimensions at each iteration allows us to parallelise the algorithm easily.
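The sketch below illustrates the dimension-scheduling idea under simplifying assumptions (a single shared evaluation history, a fixed number of active dimensions and a lower-confidence-bound acquisition rule); it is meant as an illustration rather than the authors' implementation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def dimension_scheduling_bo(fitness, bounds, n_active=3, n_iter=50, seed=0):
    """Each iteration optimises over a random subset of `n_active` dimensions,
    holding the remaining coordinates at the incumbent (best-so-far) point."""
    rng = np.random.default_rng(seed)
    d = bounds.shape[0]
    x_best = rng.uniform(bounds[:, 0], bounds[:, 1])
    y_best = fitness(x_best)
    X_hist, y_hist = [x_best], [y_best]   # shared history keeps the sketch short
    for _ in range(n_iter):
        active = rng.choice(d, size=n_active, replace=False)  # schedule dimensions
        # GP over the active coordinates only: fewer input dimensions per GP.
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(np.array(X_hist)[:, active], np.array(y_hist))
        # Candidates vary only along the active dimensions.
        cand = np.tile(x_best, (1000, 1))
        cand[:, active] = rng.uniform(bounds[active, 0], bounds[active, 1],
                                      size=(1000, n_active))
        mu, sigma = gp.predict(cand[:, active], return_std=True)
        x_next = cand[np.argmin(mu - 2.0 * sigma)]   # lower-confidence-bound rule
        y_next = fitness(x_next)
        X_hist.append(x_next)
        y_hist.append(y_next)
        if y_next < y_best:
            x_best, y_best = x_next, y_next
    return x_best, y_best
```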
Compared with commercial parameter estimation software for biological models and with a traditional Bayesian Optimisation algorithm, our approach achieves strong performance in significantly fewer experiments and with reduced computation time. We also design and provide a graphical user interface (GUI), which allows untrained users to optimise any model that can be invoked through a command line. The framework removes the programming-language barrier by giving the user a straightforward interface to set BO parameters, observe the optimisation as the code runs, and examine the GP after the experiment has completed.
Highlighted Project References:
C. Baroukh, R. Munoz-Tamayo, J.-P. Steyer, O. Bernard, 2014. DRUM: A new framework for metabolic modeling under non-balanced growth. Application to the carbon metabolism of unicellular microalgae. PLoS ONE 9 (8), e104499.
D. Ulmasov, C. Baroukh, B. Chachuat, M. P. Deisenroth, R. Misener, 2015. Bayesian Optimization with Dimension Scheduling: Application to Biological Systems. arXiv preprint arXiv:1511.05385.
R. Misener, C. A. Floudas, 2014. ANTIGONE: Algorithms for coNTinuous / Integer Global Optimization of Nonlinear Equations. Journal of Global Optimization 59, 503-526.