Posts

NeurIPS 2020: Hyperparameter Ensembles.

In our NeurIPS paper, we leverage the diversity that stems from models trained with different hyperparameters. This leads to state-of-the-art accuracy and more robust predictions. Hyper-deep ensembles expand on deep ensembles by ensembling over a larger space of hyperparameters. Hyper-batch ensembles are an efficient variant that builds on batch ensembles and shares parameters across ensemble members. Also check out the thread by Dustin Tran.
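
A minimal sketch of the hyper-deep-ensemble idea (not the paper's code): train one model per hyperparameter setting and average the predictive probabilities. A scikit-learn logistic regression stands in for a deep network to keep the example self-contained, and the hyperparameter grid is made up for illustration.

```python
# Minimal sketch of the hyper-deep-ensemble idea: train one model per
# hyperparameter setting and average the predictive probabilities.
# A logistic regression stands in for a deep network; the hyperparameter
# grid below is illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, y_train, X_test = X[:400], y[:400], X[400:]

hyperparams = [{"C": 0.01}, {"C": 0.1}, {"C": 1.0}, {"C": 10.0}]  # source of ensemble diversity

probs = []
for hp in hyperparams:
    member = LogisticRegression(max_iter=1000, **hp).fit(X_train, y_train)
    probs.append(member.predict_proba(X_test))

ensemble_probs = np.mean(probs, axis=0)  # averaged predictive distribution over the ensemble
```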

ICML 2020: Bayesian Neural Networks and the Cold Posterior.

In our ICML 2020 paper, we tackle the question: How Good is the Bayes Posterior in Deep Neural Networks Really? We cast doubt on the current understanding of Bayes posteriors in popular deep neural networks. We find that cold posteriors improve the performance of BNNs but sharply deviate from the Bayesian paradigm. We put forward several hypotheses that could explain the cold posterior effect and evaluate them through experiments.
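
For reference, the cold posterior rescales the posterior energy U(θ) by a temperature T, with T < 1 ("cold") deviating from the Bayes posterior at T = 1:

```latex
% Cold posterior: the posterior energy U(\theta) is tempered with T < 1,
% which deviates from the Bayes posterior recovered at T = 1.
p_T(\theta \mid \mathcal{D}) \propto \exp\big(-U(\theta)/T\big),
\qquad
U(\theta) = -\sum_{i=1}^{n} \log p(y_i \mid x_i, \theta) - \log p(\theta).
```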

AISTATS 2020: Automated Augmented Conjugate Inference.

In our AISTATS paper, we propose a method that automatically transforms non-conjugate Gaussian process models into conjugate ones. In the transformed model, inference is easy, much faster, and more stable. No more need for long manual derivations of complete conditional distributions and their expectations (although I enjoy them from time to time).
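
Schematically, the augmentation pattern looks as follows (a generic sketch in my own notation, not the paper's exact construction): the non-conjugate likelihood is written as a marginal over auxiliary variables ω such that, conditioned on ω, it is Gaussian, and hence conjugate, in the latent GP value f.

```latex
% Generic augmentation pattern (sketch): auxiliary variables \omega render the
% likelihood Gaussian in the latent GP value f.
% \beta and \gamma are placeholder statistics determined by the augmentation.
p(y \mid f) = \int p(y \mid f, \omega)\, p(\omega)\, d\omega,
\qquad
p(y \mid f, \omega) \propto \exp\!\Big(\beta(y, \omega)\, f - \tfrac{1}{2}\, \gamma(y, \omega)\, f^{2}\Big).
```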

I joined Google Brain Berlin.

Great news: I started a postdoctoral researcher position at Google Brain Berlin. I’m excited to work on topics around Bayesian deep learning. Stay tuned! :-)

UAI'19 paper accepted

We have a new paper on augmentation variational inference, to appear at UAI 2019 in Tel Aviv. We consider Gaussian process multi-class classification and propose a new multi-class likelihood with two benefits: it leads to well-calibrated uncertainty estimates and admits an efficient, conditionally conjugate latent-variable augmentation. The code can be found in our Julia GP package AugmentedGaussianProcesses.

NeurIPS'18 Workshop Presentations

This year at NeurIPS in Montreal, I have three workshop papers, which I will present together with my colleagues.

AAAI'19 paper accepted

Our new paper Efficient Gaussian Process Classification Using Polya-Gamma Data Augmentation got accepted at AAAI in Honolulu, Hawaii. We propose an augmented perspective on a GP classification model based on Polya-Gamma variables. The augmented model is conjugate, and inference is much faster than in previous approaches. We publish the code in our Julia package AugmentedGaussianProcesses – a library for fast inference in non-conjugate GP models based on data augmentation.
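
The augmentation rests on the Polya-Gamma integral identity of Polson, Scott and Windle (2013):

```latex
% Polya-Gamma identity (Polson, Scott & Windle, 2013), with \kappa = a - b/2:
\frac{\big(e^{\psi}\big)^{a}}{\big(1 + e^{\psi}\big)^{b}}
= 2^{-b}\, e^{\kappa\psi} \int_{0}^{\infty} e^{-\omega \psi^{2}/2}\, \mathrm{PG}(\omega \mid b, 0)\, d\omega.
```

Conditioned on ω, the integrand is Gaussian in ψ, which is what makes the augmented GP model conjugate.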

ICML'18 paper accepted

Our paper Quasi-Monte Carlo Variational Inference was accepted at ICML. In this paper, we explore the use of Quasi-Monte Carlo sampling to obtain low-variance gradient estimators for Monte Carlo variational inference.
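
As a minimal sketch of the idea (assuming the standard reparameterization setup; an illustration using SciPy's Sobol sampler, not the paper's exact estimator): draw the base samples from a randomized low-discrepancy sequence instead of i.i.d. uniforms, then map them through the usual inverse-CDF and location-scale transforms.

```python
# Sketch: quasi-Monte Carlo base samples for a reparameterized Gaussian q(z) = N(mu, sigma^2).
# A scrambled Sobol sequence replaces i.i.d. uniform draws to reduce estimator variance.
import numpy as np
from scipy.stats import norm, qmc

d, n = 2, 64                       # latent dimension, number of samples (power of 2 for Sobol)
mu, sigma = np.zeros(d), np.ones(d)

sobol = qmc.Sobol(d=d, scramble=True, seed=0)
u = sobol.random(n)                # (n, d) quasi-random points in (0, 1)^d
eps = norm.ppf(u)                  # map to standard normal via the inverse CDF
z = mu + sigma * eps               # reparameterized samples from q(z)

# z can now be plugged into any Monte Carlo estimate of the ELBO or its gradient;
# the low-discrepancy structure of u is what lowers the variance.
```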

AISTATS'18 paper accepted

I am happy that our paper Generalized Dynamic Topic Models got accepted at AISTATS in Lanzarote. We generalize the classic topic model to topics that evolve over time. The time dynamics are modeled by Gaussian processes. Choosing different kernels for the GPs allows for discovering different temporal patterns in the text corpus.
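
As a toy illustration of the kernel choice (my own sketch, not the paper's implementation): a squared-exponential kernel encodes smoothly drifting topics, while a periodic kernel encodes recurring, e.g. seasonal, topic patterns.

```python
# Toy sketch: different GP kernels over time encode different topic dynamics.
import numpy as np

def rbf_kernel(t1, t2, lengthscale=1.0):
    """Squared-exponential kernel: smooth, slowly drifting topic evolution."""
    return np.exp(-0.5 * (t1[:, None] - t2[None, :]) ** 2 / lengthscale ** 2)

def periodic_kernel(t1, t2, period=1.0, lengthscale=1.0):
    """Periodic kernel: recurring (e.g. seasonal) topic patterns."""
    d = np.pi * np.abs(t1[:, None] - t2[None, :]) / period
    return np.exp(-2.0 * np.sin(d) ** 2 / lengthscale ** 2)

t = np.linspace(0.0, 5.0, 100)
rng = np.random.default_rng(0)
jitter = 1e-6 * np.eye(len(t))
smooth_topic_trend = rng.multivariate_normal(np.zeros(len(t)), rbf_kernel(t, t) + jitter)
seasonal_topic_trend = rng.multivariate_normal(np.zeros(len(t)), periodic_kernel(t, t) + jitter)
```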

Netflix Travel Award / NIPS'17

Our workshop paper Scalable Logit Gaussian Process Classification was selected as a contributed talk at the Advances in Approximate Bayesian Inference Workshop at NIPS. Moreover, I am very happy to have received the Netflix Travel Award.