Freddie is a DPhil student working with Tom Rainforth, Yarin Gal and Seb Farquhar. His main interests are uncertainty estimation and unsupervised learning. Before joining OATML, he worked on deep learning for cognitive science with Brad Love, Brett Roads and Ed Grefenstette at UCL. He has degrees in machine learning (UCL) and mechanical engineering (Bristol).
Publications while at OATML
Continual learning is the process of developing new abilities while retaining existing ones. Sequential Bayesian inference over predictive functions is a natural framework for doing this, but applying it to deep neural networks is challenging in practice. To address the drawbacks of existing approaches, we formulate continual learning as sequential function-space variational inference. From this formulation, we derive a tractable variational objective that explicitly encourages a neural network to fit data from new tasks while also matching posterior distributions over functions inferred from previous tasks. Crucially, the objective is expressed purely in terms of predictive functions. This way, the parameters of the neural network can vary widely during training, allowing easier adaptation to new tasks than is possible with techniques that directly regularize parameters. We demonstrate that, across a range of task sequences, neural networks trained via sequential function-space variational inference... [full abstract]
Tim G. J. Rudner, Freddie Bickford Smith, Qixuan Feng, Yee Whye Teh, Yarin Gal
ICML Workshop on Theory and Foundations of Continual Learning, 2021
ICML Workshop on Subset Selection in Machine Learning, From Theory to Applications, 2021
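The objective described in the abstract combines a data-fit term on the new task with a term matching the current predictive distribution to the one inferred from previous tasks. As a minimal sketch only, assuming Gaussian predictive distributions evaluated at a set of context inputs (the function names and the univariate-Gaussian simplification are illustrative, not taken from the paper):

```python
import numpy as np

def gaussian_kl(mu_q, var_q, mu_p, var_p):
    """KL(q || p) between univariate Gaussians, computed elementwise."""
    return 0.5 * (np.log(var_p / var_q)
                  + (var_q + (mu_q - mu_p) ** 2) / var_p
                  - 1.0)

def sfsvi_objective(nll_new, mu_q, var_q, mu_prev, var_prev):
    """Toy function-space variational objective for continual learning:
    fit the new task's data (negative log-likelihood, nll_new) while
    keeping the current predictive distribution (mu_q, var_q) close to
    the one inferred from previous tasks (mu_prev, var_prev), with both
    distributions evaluated at shared context inputs."""
    kl = np.asarray(gaussian_kl(mu_q, var_q, mu_prev, var_prev)).sum()
    return nll_new + kl
```

When the current and previous predictive distributions agree at every context point, the KL term vanishes and the objective reduces to the new-task fit, so nothing penalizes adaptation; the parameters themselves never appear in the regularizer, which is the point the abstract emphasizes.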
OATML student Pascal Notin is co-organizing the 7th edition of the Workshop on Computational Biology (WCB) at ICML 2022, jointly with collaborators at Harvard, Columbia, Cornell and others. OATML students Neil Band, Freddie Bickford Smith, Jan Brauner, Andreas Kirsch and Lood Van Niekerk are part of the program committee.