
Joost van Amersfoort

PhD, started 2018

Joost is a DPhil student in the OATML group in the Department of Computer Science at the University of Oxford, supervised by Yarin Gal and Yee Whye Teh. He is interested in representation learning, variational inference, and Bayesian methods. He also enjoys experimenting with new programming languages for ML, such as Swift and Julia. Previously, he spent two years at Twitter Cortex as part of the team formed from the Magic Pony acquisition. He obtained his MSc at the University of Amsterdam, working with Max Welling and Diederik Kingma. He is an Oxford-Google DeepMind scholar.


BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning

We develop BatchBALD, a tractable approximation to the mutual information between a batch of points and model parameters, which we use as an acquisition function to select multiple informative points jointly for the task of deep Bayesian active learning. BatchBALD is a greedy linear-time 1−1/e-approximate algorithm amenable to dynamic programming and efficient caching. We compare BatchBALD to the commonly used approach for batch data acquisition and find that the current approach acquires similar and redundant points, sometimes performing worse than randomly acquiring data. We finish by showing that, using BatchBALD to consider dependencies within an acquisition batch, we achieve new state-of-the-art performance on standard benchmarks, providing substantial data efficiency improvements in batch acquisition.
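The 1−1/e guarantee mentioned above comes from greedily maximising a submodular (diminishing-returns) objective. The following is a minimal sketch of that greedy loop, not the BatchBALD estimator itself: the toy coverage utility and the point names are illustrative stand-ins for the joint mutual information, which likewise gains little from adding a point redundant with those already selected.

```python
def greedy_maximize(candidates, utility, batch_size):
    """Greedily build a batch by repeatedly adding the candidate that
    most increases the (submodular) utility of the selected set."""
    selected = []
    for _ in range(batch_size):
        remaining = [c for c in candidates if c not in selected]
        best = max(remaining, key=lambda c: utility(selected + [c]))
        selected.append(best)
    return selected

# Toy submodular utility: how many distinct "concepts" a batch covers.
# Adding a redundant point gains nothing, mimicking how the joint
# mutual information discounts points similar to those already chosen.
points = {"a": {1, 2}, "b": {2, 3}, "c": {1, 2}, "d": {4}}

def coverage(batch):
    covered = set()
    for p in batch:
        covered |= points[p]
    return len(covered)

batch = greedy_maximize(list(points), coverage, 3)
# Point "c" duplicates "a" exactly, so the greedy batch skips it.
```

Under this toy utility the greedy batch is `["a", "b", "d"]`: the exact duplicate `"c"` is never worth adding, which is the redundancy-avoidance behaviour the abstract describes.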

Andreas Kirsch, Joost van Amersfoort, Yarin Gal
NeurIPS, 2019

Reproducibility and Code

Getting high accuracy on CIFAR-10 is not straightforward. This self-contained script gets to 94% accuracy with a minimal setup.

Joost van Amersfoort wrote a self-contained, 150-line script that trains a ResNet-18 to ~94% accuracy on CIFAR-10. Useful for obtaining a strong baseline with minimal tricks.

Joost van Amersfoort

Reproducing the results from "Do Deep Generative Models Know What They Don't Know?"

PyTorch implementation of Glow that reproduces the results from "Do Deep Generative Models Know What They Don't Know?" (Nalisnick et al.). Includes a pretrained model, evaluation notebooks, and training code!

Joost van Amersfoort

Blog Posts

25 OATML Conference and Workshop papers at NeurIPS 2019

We are glad to share the following 25 papers by OATML authors and collaborators, presented at the NeurIPS 2019 conference and workshops. …

Full post...

Angelos Filos, Sebastian Farquhar, Aidan Gomez, Tim G. J. Rudner, Zac Kenton, Lewis Smith, Milad Alizadeh, Tom Rainforth, Panagiotis Tigas, Andreas Kirsch, Clare Lyle, Joost van Amersfoort, Yarin Gal, 08 Dec 2019

Human in the Loop: Deep Learning without Wasteful Labelling

In active learning we use a "human in the loop" approach to data labelling, drastically reducing the amount of data that needs to be labelled and making machine learning applicable when labelling costs would otherwise be too high. In our paper [1] we present BatchBALD: a new practical method for choosing batches of informative points in deep active learning that avoids the labelling redundancies which plague existing methods. Our approach is based on information theory and expands on useful intuitions. We have also made our implementation available on GitHub at …
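The single-point BALD score that BatchBALD generalises can be sketched from Monte Carlo samples of the predictive distribution (e.g. from dropout): it is the entropy of the mean prediction minus the mean entropy of the individual predictions. This is an illustrative sketch with hand-picked toy probabilities, not the paper's implementation; the function names are our own.

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(q * math.log(q) for q in p if q > 0)

def bald_score(sampled_probs):
    """Mutual information between the label and the model parameters,
    estimated as H[mean prediction] - mean[H[each sampled prediction]]."""
    n = len(sampled_probs)
    k = len(sampled_probs[0])
    mean_p = [sum(s[c] for s in sampled_probs) / n for c in range(k)]
    return entropy(mean_p) - sum(entropy(s) for s in sampled_probs) / n

# Samples that agree: nothing to learn from a label, score near zero.
agree = [[0.9, 0.1], [0.9, 0.1]]
# Samples that are each confident but disagree: the label would tell us
# a lot about the parameters, so the score is high.
disagree = [[0.9, 0.1], [0.1, 0.9]]
```

Points are labelled in order of this score in plain BALD; the redundancy problem the post describes arises because the top-k points by this per-point score are often near-duplicates of one another, which is what scoring the batch jointly fixes.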

Full post...

Andreas Kirsch, Joost van Amersfoort, Yarin Gal, 24 Jun 2019


We are located at
Department of Computer Science, University of Oxford
Wolfson Building
Parks Road
Twitter: @OATML_Oxford
Github: OATML

Are you looking to do a PhD in machine learning? Did you do a PhD in another field and want to do a postdoc in machine learning? Would you like to visit the group?

How to apply