
Lorenz Kuhn

PhD, started 2021

Lorenz is a DPhil student in Computer Science working with Prof. Yarin Gal at the University of Oxford. His main research interests include improving our theoretical understanding of deep learning, as well as making deep learning safer and more reliable for real-world use cases. Lorenz is a recipient of an FHI DPhil Scholarship and holds an MSc in Computer Science from ETH Zurich.

Previously, Lorenz conducted research on pruning and generalization in deep neural networks under the supervision of Prof. Yarin Gal and Prof. Andreas Krause, on medical recommendation systems with Prof. Ce Zhang, and on medical information retrieval with Prof. Carsten Eickhoff and Prof. Thomas Hofmann.

Additionally, Lorenz has worked on making very large language models more efficient at Cohere, on various data science projects at the Boston Consulting Group, and on medical recommendation systems at IBM Research and ETH Zurich.



Publications while at OATML:

Robustness to Pruning Predicts Generalization in Deep Neural Networks

Existing generalization measures that aim to capture a model's simplicity based on parameter counts or norms fail to explain generalization in overparameterized deep neural networks. In this paper, we introduce a new, theoretically motivated measure of a network's simplicity which we call prunability: the smallest fraction of the network's parameters that can be kept while pruning without adversely affecting its training loss. We show that this measure is highly predictive of a model's generalization performance across a large set of convolutional networks trained on CIFAR-10, does not grow with network size, unlike existing pruning-based measures, and exhibits high correlation with test set loss even in a particularly challenging double descent setting. Lastly, we show that the success of prunability cannot be explained by its relation to known complexity measures based on models' margin, flatness of minima and optimization speed, finding that our new measure is similar to ... [full abstract]


Lorenz Kuhn, Clare Lyle, Aidan Gomez, Jonas Rothfuss, Yarin Gal
arXiv
[paper]
More publications on Google Scholar.
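
To make the prunability measure from the abstract above concrete, here is a minimal, hypothetical PyTorch sketch. It assumes global magnitude pruning and an absolute tolerance on training loss; the function names, candidate fractions, and tolerance are illustrative assumptions, not the paper's implementation.

```python
# Sketch of "prunability": the smallest fraction of weights that can be kept
# (pruning the rest by magnitude) without the training loss rising by more
# than a tolerance. Illustrative only; the paper's exact procedure may differ.
import copy
import torch


def training_loss(model, loader, loss_fn, device="cpu"):
    """Average training loss of `model` over `loader`."""
    model.eval()
    total, n = 0.0, 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            total += loss_fn(model(x), y).item() * x.size(0)
            n += x.size(0)
    return total / n


def prune_to_fraction(model, keep_fraction):
    """Zero out all but the largest-magnitude `keep_fraction` of weights (global magnitude pruning)."""
    pruned = copy.deepcopy(model)
    weights = torch.cat([p.detach().abs().flatten() for p in pruned.parameters()])
    k = max(1, int(keep_fraction * weights.numel()))
    threshold = torch.topk(weights, k, largest=True).values.min()
    with torch.no_grad():
        for p in pruned.parameters():
            p.mul_((p.abs() >= threshold).float())
    return pruned


def prunability(model, loader, loss_fn, tolerance=0.01, fractions=None):
    """Smallest kept-fraction whose pruned training loss stays within `tolerance` of the original."""
    fractions = fractions or [f / 20 for f in range(1, 21)]  # 0.05, 0.10, ..., 1.0
    base_loss = training_loss(model, loader, loss_fn)
    for frac in sorted(fractions):
        pruned = prune_to_fraction(model, frac)
        if training_loss(pruned, loader, loss_fn) <= base_loss + tolerance:
            return frac
    return 1.0  # no pruning possible within the tolerance
```

A call such as prunability(model, train_loader, torch.nn.CrossEntropyLoss()) would return a value near 1 for a model whose loss degrades as soon as any weights are removed, and a small fraction for a highly prunable (and, per the paper, typically better-generalizing) one.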

Blog Posts

OATML Conference papers at NeurIPS 2022

OATML group members and collaborators are proud to present 8 papers at the NeurIPS 2022 main conference and 11 workshop papers. …

Full post...


Yarin Gal, Freddie Kalaitzis, Sören Mindermann, Lorenz Kuhn, Gunshi Gupta, Jannik Kossen, Pascal Notin, Andrew Jesson, Panagiotis Tigas, Tim G. J. Rudner, Sebastian Farquhar, Ilia Shumailov, 25 Nov 2022
