2-day course: Fundamentals of Bayesian Inference using Probabilistic Programming

Date and time: 1 June 14.00-17.00 and 7 June 14.00-17.00 CEST (UTC+2), 2022
Course lecturer: Assoc. Prof. David Broman, KTH and Digital Futures
Title: Fundamentals of Bayesian Inference using Probabilistic Programming

Where: Digital Futures hub, Osquars Backe 5, floor 2 at KTH main campus OR via Zoom
Directions: https://www.digitalfutures.kth.se/contact/how-to-get-here/

Format: Two 3-hour seminar lectures plus optional homework
Cost: The course is free of charge, but registration is required.

Note that a maximum of 50 participants can attend onsite at the Digital Futures hub; places are allocated on a first-come, first-served basis.

There will also be an option to participate via Zoom, but onsite participation is preferred. Registration is required for both onsite and online participation; for online participants, a Zoom link will be sent upon registration.

Please register here: https://www.kth.se/form/6254058494395f32cdb8aaed

This course is a collaboration between TECoSA and Digital Futures.

Welcome to the world of probabilistic programming and Bayesian inference!

In this short course/tutorial (2 × 3 h) you will learn the fundamentals of Bayesian inference using a relatively new paradigm called probabilistic programming. No prior knowledge of Bayesian theory is required.

You will learn the basic intuitions behind common inference algorithms, as well as how to design and write small models (probabilistic programs) in popular probabilistic programming languages such as WebPPL and Stan.
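
To give a flavour of what such a model looks like, here is a minimal sketch of a Bayesian update for a coin's bias, written in plain Python/NumPy rather than WebPPL or Stan (an illustration only, not part of the course material):

    # Illustration only: inferring a coin's bias from observed flips,
    # using a simple grid approximation of Bayes' theorem.
    import numpy as np

    # "Assume": a uniform prior over candidate bias values.
    grid = np.linspace(0.0, 1.0, 101)
    prior = np.ones_like(grid) / len(grid)

    # "Observe": 7 heads in 10 flips; likelihood of the data for each candidate bias.
    heads, flips = 7, 10
    likelihood = grid**heads * (1.0 - grid)**(flips - heads)

    # Bayes' theorem: the posterior is proportional to likelihood times prior.
    posterior = likelihood * prior
    posterior /= posterior.sum()

    print("posterior mean bias:", (grid * posterior).sum())  # roughly 0.67

In a probabilistic programming language, the prior and the observations are expressed directly in the program, and the inference algorithm is provided by the language runtime.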

The tutorial will be highly interactive: participants will run experiments on their own computers. We recommend participating onsite if possible, to make the interactive experience as good as possible, but an online option will also be provided.

Key content

Fundamental probabilistic modelling, the observe and assume concepts, basic distributions, likelihoods, probabilities, Bayes’ theorem, conjugate priors, delayed sampling, the intuitions behind various inference algorithms (importance sampling, sequential Monte Carlo (SMC), and Markov chain Monte Carlo (MCMC) methods), well-known models such as Latent Dirichlet Allocation (LDA), and the concept of universal probabilistic programming.
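
As a small illustration of one of these intuitions, importance sampling can be sketched in a few lines of plain Python/NumPy (again an illustration only, reusing the same hypothetical coin-bias example as above):

    # Illustration only: importance sampling for the coin-bias example.
    # Draw candidate biases from the prior and weight each draw by how well
    # it explains the observed data (7 heads in 10 flips).
    import numpy as np

    rng = np.random.default_rng(0)
    heads, flips = 7, 10

    samples = rng.uniform(0.0, 1.0, size=10_000)                 # draws from the prior
    weights = samples**heads * (1.0 - samples)**(flips - heads)  # likelihood weights
    weights /= weights.sum()                                     # normalise

    print("posterior mean bias:", np.sum(weights * samples))     # roughly 0.67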

If you have any questions, please do not hesitate to contact David Broman at dbro@kth.se.