Convolutional Gaussian Processes

4-9 December 2017, Long Beach, California, USA

Advances in Neural Information Processing Systems 30 (NIPS 2017) (oral presentation)

Authors: Mark van der Wilk (Cambridge), Carl E. Rasmussen (Cambridge), James Hensman

Abstract: We present a practical way of introducing convolutional structure into Gaussian processes, making them more suited to high-dimensional inputs like images. The main contribution of our work is the construction of an inter-domain inducing point approximation that is well-tailored to the convolutional kernel. This allows us to gain the generalisation benefit of a convolutional kernel, together with fast but accurate posterior inference. We investigate several variations of the convolutional kernel, and apply it to MNIST and CIFAR-10, which have both been known to be challenging for Gaussian processes. We also show how the marginal likelihood can be used to find an optimal weighting between convolutional and RBF kernels to further improve performance. We hope that this illustration of the usefulness of a marginal likelihood will help automate discovering architectures in larger models.
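The convolutional kernel described in the abstract measures similarity between two images by comparing all pairs of image patches under a patch-response kernel, with an optional per-patch weighting. The sketch below is an illustrative NumPy implementation of that idea, not the paper's actual code (the authors' implementation builds on sparse inter-domain inducing-patch approximations, which are omitted here); the patch size, RBF patch kernel, and uniform weights are assumptions made for the example.

```python
import numpy as np

def extract_patches(image, patch_size):
    """Slide a patch_size x patch_size window over a (H, W) image, stride 1."""
    H, W = image.shape
    p = patch_size
    patches = [image[i:i + p, j:j + p].ravel()
               for i in range(H - p + 1)
               for j in range(W - p + 1)]
    return np.array(patches)  # shape: (num_patches, p * p)

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel between rows of a and rows of b."""
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def conv_kernel(x, xp, patch_size=3, weights=None):
    """Convolutional kernel: k(x, x') = sum_{p, p'} w_p w_{p'} k_rbf(x[p], x'[p']).

    With uniform weights this is the plain convolutional kernel; making the
    weights free parameters gives the weighted variant from the paper.
    """
    P, Pp = extract_patches(x, patch_size), extract_patches(xp, patch_size)
    if weights is None:
        weights = np.ones(len(P))
    return weights @ rbf(P, Pp) @ weights
```

Because the patch kernel is symmetric, the resulting image kernel is too, and evaluating it on identical inputs yields a positive value, as a valid covariance function must.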

Topics: Data Efficiency, Gaussian Processes, Probabilistic Modelling
