PM520 Advanced Statistical Computing

This course introduces students to the theory and hands-on programming underlying advanced statistical computing. Modern statistical inference requires computational approaches that consider numerical stability (e.g., solving linear systems, the “logsumexp” trick), optimization techniques (e.g., gradient descent, natural gradient descent), automatic differentiation, and scalability in the face of ever-increasing data (e.g., variational inference). Both theory and hands-on examples will be covered through lectures and lab exercises. The audience for this course includes second-year and beyond Biostatistics graduate students, as well as graduate students from other Divisions, Departments, or Schools interested in designing and implementing computational inferential tools for their research.
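As a small taste of the numerical-stability theme, here is a minimal sketch of the “logsumexp” trick in plain NumPy; the function and example values are illustrative only and are not taken from the course materials.

```python
import numpy as np

def logsumexp(x):
    """Numerically stable log(sum(exp(x))).

    Subtracting the maximum before exponentiating keeps the
    intermediate exponentials in a representable range, so the
    naive form log(sum(exp(x))) does not overflow for large inputs.
    """
    x = np.asarray(x, dtype=float)
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

# The naive evaluation overflows for these values; the stabilized one does not.
x = np.array([1000.0, 1000.5, 999.0])
print(logsumexp(x))  # roughly 1001.4
```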

Learning Objectives

This course focuses on advanced statistical computing and Bayesian inference using Python. By the end of this course, students will be able to:

  • Use GitHub and Google Colab to organize and share code, software, and worked examples.
  • Implement differentiable programs using Python + JAX (see the sketch after this list).
  • Implement numerically stable algorithms for statistical computing and inference using Python.
  • Discuss the theory of exponential families, statistical divergences, and natural gradient descent.
  • Derive the “evidence lower bound” (ELBO) for KL-based variational inference objectives.
  • Contrast variational approaches with sampling approaches for Bayesian inference.
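The sketch below is a minimal illustration of the JAX and gradient-descent objectives, assuming a toy least-squares problem; the function names and toy data are hypothetical and not part of the course.

```python
import jax
import jax.numpy as jnp

def loss(beta, X, y):
    """Mean squared error for a linear model; a stand-in objective."""
    resid = y - X @ beta
    return jnp.mean(resid ** 2)

# jax.grad builds the gradient of `loss` with respect to its first argument
# via automatic differentiation.
grad_loss = jax.grad(loss)

def gradient_descent(X, y, step_size=0.1, num_steps=200):
    """Plain gradient descent on the MSE objective."""
    beta = jnp.zeros(X.shape[1])
    for _ in range(num_steps):
        beta = beta - step_size * grad_loss(beta, X, y)
    return beta

# Toy data: y depends linearly on two features.
key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (100, 2))
true_beta = jnp.array([2.0, -1.0])
y = X @ true_beta
print(gradient_descent(X, y))  # approaches [2.0, -1.0]
```

Because jax.grad differentiates the objective with respect to its first argument, the same pattern carries over to any differentiable loss, which is the sense in which the course treats programs as differentiable objects.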
