Roeterseilandcampus - Building A, Street: Nieuwe Achtergracht 129-B, Room: A2.11. Due to limited room capacity, attendance is on a first-come, first-served basis.
Bayesian modelling is powerful, but waiting for MCMC to converge can be a frustrating experience, especially for high-dimensional or multimodal models. The field of deep learning, however, has achieved tremendous breakthroughs by exploiting massively parallel computation on GPUs and TPUs. It seemed as if these developments would pass Bayesians by, since traditional MCMC is an inherently serial algorithm. However, new software platforms like JAX, combined with inference algorithms that exploit parallel computation, have the potential to be faster than even CmdStan. In today's lecture I will discuss how Bayesian statisticians can benefit from these developments. I will show how to do Bayesian modelling and inference in the JAX framework, and I will showcase an inference algorithm that exploits parallel computation: the confusingly named Sequential Monte Carlo (SMC) algorithm. I will demonstrate these ideas with models we use in our lab, such as Generalized Wishart Processes for dynamic correlation structures and Gaussian process mixture models for clustering children's learning behaviour, where we achieved impressive speed-ups.
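As a taste of the kind of parallelism discussed above, here is a minimal, illustrative sketch (not the lab's actual code) of how JAX lets many MCMC chains advance in lock-step: a vectorised random-walk Metropolis sampler targeting a standard normal, where every array operation is batched over the chain dimension and would run in parallel on a GPU or TPU. The target density, step size, and chain counts are all assumptions chosen for the demo.

```python
import jax
import jax.numpy as jnp


def log_target(x):
    # Standard normal log-density, up to an additive constant (demo target).
    return -0.5 * x ** 2


def mh_step(carry, key):
    # One Metropolis step applied to *all* chains at once.
    x, logp = carry
    key_prop, key_acc = jax.random.split(key)
    proposal = x + 0.5 * jax.random.normal(key_prop, x.shape)
    logp_prop = log_target(proposal)
    # Vectorised accept/reject: each chain decides independently.
    accept = jnp.log(jax.random.uniform(key_acc, x.shape)) < logp_prop - logp
    x_new = jnp.where(accept, proposal, x)
    logp_new = jnp.where(accept, logp_prop, logp)
    return (x_new, logp_new), x_new


def run_chains(key, n_chains=1024, n_steps=500):
    # All chains start at zero; lax.scan iterates the (already batched) step.
    x0 = jnp.zeros(n_chains)
    keys = jax.random.split(key, n_steps)
    _, samples = jax.lax.scan(mh_step, (x0, log_target(x0)), keys)
    return samples  # shape (n_steps, n_chains)


samples = jax.jit(run_chains)(jax.random.PRNGKey(0))
```

The serial bottleneck here is only the number of steps, not the number of chains: doubling `n_chains` costs almost nothing on an accelerator, which is exactly the regime SMC is designed to exploit.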