- Lecture 1: Intro and Probability
- Lecture 2: Probability, Distributions, and Frequentism
- Lab 1: Frequentism
- Lecture 3: Expectations, the Laws, and Monte Carlo
- Lecture 4: Sampling
- Lab 2: Python, Math, and Stratification
- Lecture 5: Machine Learning
- Lecture 6: Gradient Descent
- Lab 3: PyTorch, Regressions, and Artificial Neural Networks
- Lecture 7: Machine Learning and Backpropagation
- Lecture 8: Neural Nets and Information Theory
- Lab 4: PyTorch and Artificial Neural Networks (contd.)
- Lecture 9: Information Theory, Deviance, and Global Optimization
- Lecture 10: Annealing, Markov, and Metropolis
- Lab 5: Simulated Annealing
- Lecture 11: Metropolis to Bayes
- Lecture 12: Bayes
- Lab 6: Sampling and Bayes
- Lecture 13: Bayes
- Lecture 14: Convergence and Gibbs
- Lab 7: Sampling and Hierarchical Models
- Lecture 15: Linear Regression
- Lecture 16: Gaussian Processes
- Lab 8: Regression and GP
- Lecture 17: Augmentation, Slice Sampling, and HMC
- Lecture 18: HMC, Normal-Normal Hierarchical
- Lab 9: Normal-Normal Hierarchicals
- Lecture 19: Posterior Predictive Checks and GLMs
- Lecture 20: Decisions, Model Comparison, and GLMs
- Lab 10: Prosocial Chimps GLM
- Lecture 21: Decisions, Model Comparison, GLMs, Ensembles, and Workflow
- Lecture 22: Workflow and Mixtures
- Lab 11: Mixtures and log-sum-exp marginals
- Lecture 23: EM and Mixtures
- Lecture 24: Expectation Maximization and Variational Inference
- Lab 12: Mixtures and Correlation
- Lecture 25: Variational Bayes and Generative Models
- Lecture 26: Recap