Probabilistic Numerics @ MCQMC 2016
Probabilistic Numerics (PN) is an emerging research theme that straddles the disciplines of mathematical analysis, statistical science and numerical computation. This nascent field takes its inspiration from early work in the 1980s by Diaconis and O’Hagan, who posited that numerical computation should be viewed as a problem of statistical inference. Progress in the intervening years has focused on algorithm design for the probabilistic solution of ordinary differential equations and numerical quadrature, suggesting empirically that an appropriate statistical formulation can enhance the performance of, e.g., numerical quadrature routines. However, there has been little theoretical analysis of the mathematical principles underlying PN that deliver these improvements. This mini-symposium seeks to bring the area of PN to the MCQMC research community and to present recent mathematical analysis which elucidates the relationship between PN schemes, functional regression and QMC.
The vast amounts of data, in many different forms, becoming available to politicians, policy makers, technologists, and scientists of every hue present tantalising opportunities for advances never before considered feasible. Yet with these apparent opportunities has come an increase in the complexity of the mathematics required to exploit the data. These sophisticated mathematical representations are much more challenging to analyse, and increasingly computationally expensive to evaluate. This is a particularly acute problem for many tasks of interest, such as making predictions, since these require the extensive use of numerical solvers for linear algebra, optimization, integration or differential equations. These methods tend to be slow, owing to the complexity of the models, and can lead to solutions with high levels of uncertainty. This talk will introduce our contributions to an emerging area of research at the nexus of applied mathematics, statistical science and computer science, called “probabilistic numerics”. The aim is to consider numerical problems from a statistical viewpoint, and as such to provide numerical methods whose error can be quantified and controlled in a probabilistic manner. This philosophy will be illustrated on problems ranging from predictive policing via crime modelling to computer vision, where probabilistic numerical methods provide a rich and essential quantification of the uncertainty associated with such models and their computation.
One of the currently active branches of probabilistic numerics focuses on numerical integration. We will study one such probabilistic integration method, Bayesian quadrature, which provides solutions in the form of probability distributions (rather than point estimates) whose structure can provide additional insight into the uncertainty arising from the finite number of evaluations of the integrand. Strong mathematical guarantees will then be provided in the form of convergence and contraction rates in the number of function evaluations. Finally, we will compare Bayesian quadrature with non-probabilistic methods, including Monte Carlo and quasi-Monte Carlo methods, and illustrate its performance on applications ranging from computer graphics to oil field simulation.
This talk will build on the theme of probabilistic integration, with particular focus on distributions that are intractable, such as posterior distributions in Bayesian statistics. In this context we will discuss a novel approach to estimating posterior expectations using Markov chains. Under regularity conditions, this is shown to produce Monte Carlo quadrature rules that converge as O(n^(−1/2 − 2(a∧b) + ε)) at a cost that is cubic in n, where p (the posterior density) has 2a + 1 derivatives, the integrand f has a ∧ b derivatives, and ε > 0 can be arbitrarily small.
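For reference, the baseline this construction improves upon is the standard ergodic-average estimator, which converges only at the O(n^(−1/2)) Monte Carlo rate. A minimal sketch (not the estimator from the talk; names are illustrative), using random-walk Metropolis so that only an unnormalised density is needed:

```python
# Baseline: posterior expectation E[f(X)] via a random-walk Metropolis chain.
import numpy as np

def metropolis_expectation(log_p, f, x0=0.0, n=20000, step=1.0, seed=0):
    """Ergodic-average estimate of E[f(X)] under the (unnormalised) density p."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_p(x0)
    total = 0.0
    for _ in range(n):
        prop = x + step * rng.standard_normal()
        lp_prop = log_p(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
        total += f(x)
    return total / n                             # O(n^{-1/2}) convergence

# Unnormalised log-density of N(0,1); the true expectation E[x^2] is 1.
est = metropolis_expectation(lambda x: -0.5 * x * x, lambda x: x * x)
```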
Partial differential equations (PDEs) are a challenging class of problems which rarely admit closed-form solutions, forcing us to resort to numerical methods which discretise the problem to construct an approximate solution. We seek to phrase the solution of PDEs as a statistical inference problem, and to construct probability measures which quantify the epistemic uncertainty in the solution resulting from the discretisation. We explore the construction of such probability measures for the strong formulation of elliptic PDEs, and the connection between this and “meshless” methods. We then apply these probability measures in Bayesian inverse problems: parameter inference problems whose dynamics are often constrained by a system of PDEs. Sampling from the parameter posterior in such problems often involves replacing an exact likelihood, which depends on the unknown solution of the system, with an approximate one based on a numerical approximation. Such approximations have been shown to produce biased and overconfident posteriors when the error in the forward solver is not tightly controlled. We show how the uncertainty from a probabilistic forward solver can be propagated into the parameter posterior, thus permitting the use of coarser discretisations which still produce valid statistical inferences.