HMC
- The Idea of Hamiltonian Monte Carlo
[LINK]
MCMC
- Formal Tests for Convergence
[LINK]
- The Idea of Hamiltonian Monte Carlo
[LINK]
advi
aic
autocorrelation
- Convergence and Coverage
[LINK]
- Intro to Gibbs Sampling
[LINK]
- A Gibbs sampler with lots of autocorrelation
[LINK]
backpropagation
- Maximum Likelihood Estimation
[LINK]
bayes risk
- Classification Risk
[LINK]
- Utility or Risk - Back to optimization
[LINK]
bayes' theorem
bayesian
- Bayesian Statistics
[LINK]
- Sampling with $\sigma$
[LINK]
- Bayesian Regression
[LINK]
- Formal Tests for Convergence
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
- Hierarchical Models
[LINK]
- Identifiability
[LINK]
- Levels of Bayesian Analysis
[LINK]
- The normal model
[LINK]
- The normal model with pymc
[LINK]
- Priors
[LINK]
- From the normal model to regression
[LINK]
- Regression with custom priors
[LINK]
- Sufficient Statistics and Exchangeability
[LINK]
- Imputation and Convergence
[LINK]
- Unidentifiability, the ridge, and the lasso
[LINK]
bayesian p-values
bayesian updating
- Bayesian Regression
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
bayesian workflow
- Bayesian Workflow in the 0-inflated model
[LINK]
bernoulli distribution
- Distributions example - elections
[LINK]
- Distributions
[LINK]
beta
- Gibbs with a conjugate conditional
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
- Hierarchical Models
[LINK]
beta-binomial
- Gibbs with a conjugate conditional
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
- Hierarchical Models
[LINK]
bias
bias-variance tradeoff
binomial
- Entropy
[LINK]
- Distributions example - elections
[LINK]
- Gibbs with a conjugate conditional
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
- Hierarchical Models
[LINK]
binomial regression
- Lab 7 - Bayesian inference with PyMC3.
[LINK]
bioassay
- Lab 7 - Bayesian inference with PyMC3.
[LINK]
boltzmann distribution
- From Annealing To Metropolis
[LINK]
- Simulated Annealing
[LINK]
bootstrap
box loop
box-muller
calculus
- Calculus for optimization
[LINK]
canonical distribution
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
cavi
cdf
centering
- Poisson Regression - tools on islands part 2
[LINK]
central limit theorem
- Sampling and the Central Limit Theorem
[LINK]
- Distributions example - elections
[LINK]
- Monte Carlo Integrals
[LINK]
classical mechanics
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
classification
combinatoric optimization
complexity parameter
- Regularization
[LINK]
- Learning bounds and the Test set
[LINK]
- Why do we need validation?
[LINK]
conditional
conjugate prior
- Bayesian Regression
[LINK]
- Gibbs with a conjugate conditional
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
- The normal model
[LINK]
- The normal model with pymc
[LINK]
- Priors
[LINK]
- From the normal model to regression
[LINK]
- Sufficient Statistics and Exchangeability
[LINK]
convex
- Convexity and Jensen's Inequality
[LINK]
correlation
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Correlations
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
counterfactual plot
- Poisson Regression - tools on islands part 2
[LINK]
covariance
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Correlations
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
cross-entropy
- Divergence and Deviance
[LINK]
cross-validation
curse of dimensionality
- From Annealing To Metropolis
[LINK]
data augmentation
- Data Augmentation
[LINK]
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
- Slice Sampling
[LINK]
decision risk
- Classification Risk
[LINK]
- Utility or Risk - Back to optimization
[LINK]
decision theory
- Classification Risk
[LINK]
- The beta-binomial model of globe-tossing
[LINK]
- Utility or Risk - Back to optimization
[LINK]
detailed balance
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
- Markov Chains and MCMC
[LINK]
- From Annealing To Metropolis
[LINK]
- Metropolis-Hastings
[LINK]
deterministic error
deviance
dic
discrepancy
discrete sampling
- Sampling from a discrete distribution
[LINK]
- Metropolis-Hastings
[LINK]
discriminative models
- Machine learning - ERM and Bayesian
[LINK]
distributions
- Expectations and the Law of Large Numbers
[LINK]
divergences
- Gelman Schools and Hierarchical Pathology
[LINK]
- Gelman Schools Theory for Topics about Restaurants
[LINK]
- L, epsilon, and other tweaking
[LINK]
effective sample size
- A Gibbs sampler with lots of autocorrelation
[LINK]
elbo
elections
- Distributions example - elections
[LINK]
empirical bayes
empirical distribution
- Distributions example - elections
[LINK]
empirical risk minimization
- Machine learning - ERM and Bayesian
[LINK]
- Learning a model
[LINK]
- Noisy Learning
[LINK]
- Learning bounds and the Test set
[LINK]
- Why do we need validation?
[LINK]
energy
- Exploring Hamiltonian Monte Carlo
[LINK]
- The Idea of Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
- Simulated Annealing
[LINK]
entropy
equilibrium
- From Annealing To Metropolis
[LINK]
ergodicity
estimation risk
- Utility or Risk - Back to optimization
[LINK]
exchangeability
- Hierarchical Models
[LINK]
- Sufficient Statistics and Exchangeability
[LINK]
expectations
- Expectations and the Law of Large Numbers
[LINK]
- Basic Monte Carlo
[LINK]
- Importance Sampling
[LINK]
exponential distribution
- Maximum Likelihood Estimation
[LINK]
exponential family
- Sufficient Statistics and Exchangeability
[LINK]
- Formal Tests for Convergence
[LINK]
- Imputation and Convergence
[LINK]
frequentist statistics
full-data likelihood
gamma
- Sufficient Statistics and Exchangeability
[LINK]
gaussian mixture model
- Variational Inference
[LINK]
- Gaussian Mixture Model with ADVI
[LINK]
- Marginalizing over Discretes
[LINK]
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Mixture Models, and types of learning
[LINK]
gaussian process
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
gelman-rubin
- Lab 7 - Bayesian inference with PyMC3.
[LINK]
generative model
- Machine learning - ERM and Bayesian
[LINK]
- Mixture Models, and types of learning
[LINK]
geweke
- Lab 7 - Bayesian inference with PyMC3.
[LINK]
gibbs sampler
- Data Augmentation
[LINK]
- Gibbs with a conjugate conditional
[LINK]
- Gibbs from Metropolis-Hastings
[LINK]
- Intro to Gibbs Sampling
[LINK]
- A Gibbs sampler with lots of autocorrelation
[LINK]
glm
- Poisson Regression - tools on islands part 2
[LINK]
- Generalized Linear Models
[LINK]
- Poisson and 0-inflated
[LINK]
- Bayesian Workflow in the 0-inflated model
[LINK]
- Prosocial Chimps
[LINK]
global minimum
gradient descent
hamiltonian monte carlo
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
hierarchical
- Poisson Regression - tools on islands part 2
[LINK]
- Lab 7 - Bayesian inference with PyMC3.
[LINK]
- Gelman Schools and Hierarchical Pathology
[LINK]
- Gelman Schools Theory for Topics about Restaurants
[LINK]
- Hierarchical Models
[LINK]
- L, epsilon, and other tweaking
[LINK]
hierarchical normal-normal model
- Gelman Schools and Hierarchical Pathology
[LINK]
- Gelman Schools Theory for Topics about Restaurants
[LINK]
hoeffding's inequality
- Learning bounds and the Test set
[LINK]
- Why do we need validation?
[LINK]
hypothesis space
identifiability
- Identifiability
[LINK]
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Unidentifiability, the ridge, and the lasso
[LINK]
importance sampling
imputation
- Imputation and Convergence
[LINK]
in-sample
independence
inference
- Maximum Likelihood Estimation
[LINK]
- Maximum Likelihood Estimation
[LINK]
- Inference for GPs
[LINK]
infinite-basis
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
integration
interaction-term
- Poisson Regression - tools on islands part 2
[LINK]
irreducible
- Markov Chains and MCMC
[LINK]
- From Annealing To Metropolis
[LINK]
jeffreys prior
jensen's inequality
- Divergence and Deviance
[LINK]
- Convexity and Jensen's Inequality
[LINK]
kernel
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Inference for GPs
[LINK]
kernel trick
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
kernelized regression
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
kidney cancer
kl-divergence
label-switching
lasso
- Regularization
[LINK]
- Unidentifiability, the ridge, and the lasso
[LINK]
latent variables
- The EM algorithm
[LINK]
- Variational Inference
[LINK]
- Mixture Models, and types of learning
[LINK]
law of large numbers
- Expectations and the Law of Large Numbers
[LINK]
- Basic Monte Carlo
[LINK]
- Monte Carlo Integrals
[LINK]
- Learning a model
[LINK]
leapfrog
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
likelihood
- Divergence and Deviance
[LINK]
likelihood-ratio
- Divergence and Deviance
[LINK]
linear regression
- Maximum Likelihood Estimation
[LINK]
- Gradient Descent and SGD
[LINK]
- Regularization
[LINK]
link-function
- Generalized Linear Models
[LINK]
- Poisson and 0-inflated
[LINK]
- Bayesian Workflow in the 0-inflated model
[LINK]
lkj prior
log-likelihood
log-sum-exp trick
- Marginalizing over Discretes
[LINK]
logistic regression
- Maximum Likelihood Estimation
[LINK]
- Maximum Likelihood Estimation
[LINK]
- Lab 3 - Pytorch
[LINK]
loocv
loss function
- The beta-binomial model of globe-tossing
[LINK]
lotus
- Expectations and the Law of Large Numbers
[LINK]
- Basic Monte Carlo
[LINK]
- Monte Carlo Integrals
[LINK]
machine learning
- Machine learning - ERM and Bayesian
[LINK]
map
- Bayesian Statistics
[LINK]
- Sampling with $\sigma$
[LINK]
- Unidentifiability, the ridge, and the lasso
[LINK]
marginal
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Distributions
[LINK]
- Gaussian Mixture Model with ADVI
[LINK]
- Inference for GPs
[LINK]
- Probability
[LINK]
marginal energy distribution
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
marginalizing over discretes
- Gaussian Mixture Model with ADVI
[LINK]
- Marginalizing over Discretes
[LINK]
markov chain
maxent
maximum likelihood
- The EM algorithm
[LINK]
- Maximum Likelihood Estimation
[LINK]
- Maximum Likelihood Estimation
[LINK]
- Frequentist Stats
[LINK]
mcmc
- Wrongly combining rejection with sampling
[LINK]
- Bayesian Statistics
[LINK]
- Sampling with $\sigma$
[LINK]
- Convergence and Coverage
[LINK]
- Data Augmentation
[LINK]
- Sampling from a discrete distribution
[LINK]
- Gibbs with a conjugate conditional
[LINK]
- Gibbs from Metropolis-Hastings
[LINK]
- Identifiability
[LINK]
- Intro to Gibbs Sampling
[LINK]
- Simulated Annealing
[LINK]
- Markov Chains and MCMC
[LINK]
- Metropolis-Hastings
[LINK]
- Priors
[LINK]
- Regression with custom priors
[LINK]
- Slice Sampling
[LINK]
- Imputation and Convergence
[LINK]
- A Gibbs sampler with lots of autocorrelation
[LINK]
- Unidentifiability, the ridge, and the lasso
[LINK]
mcmc engineering
- Formal Tests for Convergence
[LINK]
- The normal model with pymc
[LINK]
- From the normal model to regression
[LINK]
- Imputation and Convergence
[LINK]
mean-field approximation
mercer's theorem
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
metropolis
- Wrongly combining rejection with sampling
[LINK]
- Convergence and Coverage
[LINK]
- Sampling from a discrete distribution
[LINK]
- Gibbs from Metropolis-Hastings
[LINK]
- Markov Chains and MCMC
[LINK]
- From Annealing To Metropolis
[LINK]
- Metropolis-Hastings
[LINK]
- Priors
[LINK]
metropolis-hastings
- Wrongly combining rejection with sampling
[LINK]
- Convergence and Coverage
[LINK]
- Gibbs from Metropolis-Hastings
[LINK]
- Simulated Annealing
[LINK]
- Markov Chains and MCMC
[LINK]
- Metropolis-Hastings
[LINK]
microcanonical distribution
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
minibatch sgd
mixture model
- The EM algorithm
[LINK]
- Variational Inference
[LINK]
- Gaussian Mixture Model with ADVI
[LINK]
- Marginalizing over Discretes
[LINK]
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Poisson and 0-inflated
[LINK]
- Bayesian Workflow in the 0-inflated model
[LINK]
- Mixture Models, and types of learning
[LINK]
mlp
- MLP as universal approximator
[LINK]
model averaging
- Model Comparison continued
[LINK]
- Model Comparison
[LINK]
- Prosocial Chimps
[LINK]
- Utility or Risk - Back to optimization
[LINK]
model checking
model comparison
- Poisson Regression - tools on islands part 2
[LINK]
- Model Comparison continued
[LINK]
- Model Comparison
[LINK]
monte-carlo
multiple varying intercept
multivariate normal
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Correlations
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
neural network
- MLP as universal approximator
[LINK]
non-centered hierarchical model
- Gelman Schools and Hierarchical Pathology
[LINK]
- Gelman Schools Theory for Topics about Restaurants
[LINK]
- L, epsilon, and other tweaking
[LINK]
normal distribution
- The EM algorithm
[LINK]
- Entropy
[LINK]
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Sampling and the Central Limit Theorem
[LINK]
- Variational Inference
[LINK]
- ADVI
[LINK]
- Distributions example - elections
[LINK]
- Gaussian Mixture Model with ADVI
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
- The Inverse Transform
[LINK]
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Monte Carlo Integrals
[LINK]
normal-normal model
- Bayesian Statistics
[LINK]
- Sampling with $\sigma$
[LINK]
- Bayesian Regression
[LINK]
- Gelman Schools and Hierarchical Pathology
[LINK]
- Gelman Schools Theory for Topics about Restaurants
[LINK]
- The normal model
[LINK]
- The normal model with pymc
[LINK]
- Priors
[LINK]
- From the normal model to regression
[LINK]
normalization
- Wrongly combining rejection with sampling
[LINK]
nuts
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
- Poisson Regression - tools on islands part 2
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
optimization
out-of-sample error
overdispersion
- Poisson Regression - tools on islands part 2
[LINK]
parametric model
- Maximum Likelihood Estimation
[LINK]
- Maximum Likelihood Estimation
[LINK]
partial pooling
pdf
plug-in approximation
- The beta-binomial model of globe-tossing
[LINK]
pmf
poisson distribution
- Poisson Regression - tools on islands part 2
[LINK]
- Sampling from a discrete distribution
[LINK]
- Sufficient Statistics and Exchangeability
[LINK]
poisson regression
- Poisson Regression - tools on islands part 2
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
- Generalized Linear Models
[LINK]
- Poisson and 0-inflated
[LINK]
- Bayesian Workflow in the 0-inflated model
[LINK]
poisson-gamma
- Sufficient Statistics and Exchangeability
[LINK]
posterior
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Box's Loop
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
posterior predictive
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Inference for GPs
[LINK]
- Geographic Correlation and Oceanic Tools
[LINK]
- Prosocial Chimps
[LINK]
- Model checking
[LINK]
priors
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Inference for GPs
[LINK]
- Regression with custom priors
[LINK]
probabilistic modeling
- Divergence and Deviance
[LINK]
- Box's Loop
[LINK]
- The Idea of Hamiltonian Monte Carlo
[LINK]
probability
- Expectations and the Law of Large Numbers
[LINK]
- Probability
[LINK]
probability rules
proposal
proposal matrix
- Sampling from a discrete distribution
[LINK]
pymc3
- Inference for GPs
[LINK]
- Marginalizing over Discretes
[LINK]
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- The normal model with pymc
[LINK]
- From the normal model to regression
[LINK]
pymc3 potentials
pytorch
random variables
rat tumors
regression
- The idea behind the GP
[LINK]
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
- Poisson Regression - tools on islands part 2
[LINK]
- Bayesian Regression
[LINK]
- Lab 7 - Bayesian inference with PyMC3.
[LINK]
- Inference for GPs
[LINK]
- Generalized Linear Models
[LINK]
- Poisson and 0-inflated
[LINK]
- Bayesian Workflow in the 0-inflated model
[LINK]
- From the normal model to regression
[LINK]
- Regression with custom priors
[LINK]
regularization
rejection sampling
- Wrongly combining rejection with sampling
[LINK]
- Basic Monte Carlo
[LINK]
- Rejection Sampling
[LINK]
rejection sampling on steroids
representer theorem
- Gaussian Processes and 'Non-parametric' Bayes
[LINK]
ridge
- Regularization
[LINK]
- Unidentifiability, the ridge, and the lasso
[LINK]
sampling
- Wrongly combining rejection with sampling
[LINK]
- Sampling and the Central Limit Theorem
[LINK]
- Basic Monte Carlo
[LINK]
- Importance Sampling
[LINK]
- The Inverse Transform
[LINK]
- Rejection Sampling
[LINK]
sampling and priors
sampling as marginalization
- The beta-binomial model of globe-tossing
[LINK]
sampling distribution
- Sampling and the Central Limit Theorem
[LINK]
- Frequentist Stats
[LINK]
- Gelman Schools and Hierarchical Pathology
[LINK]
- Gelman Schools Theory for Topics about Restaurants
[LINK]
- Learning a model
[LINK]
- Noisy Learning
[LINK]
sampling distribution of variance
- Sampling and the Central Limit Theorem
[LINK]
semi-supervised learning
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Mixture Models, and types of learning
[LINK]
sgd
simulated annealing
slice sampling
standard error
- Sampling and the Central Limit Theorem
[LINK]
stationarity
- Markov Chains and MCMC
[LINK]
- From Annealing To Metropolis
[LINK]
statistical mechanics
- Entropy
[LINK]
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
- Simulated Annealing
[LINK]
step-size
- Convergence and Coverage
[LINK]
- L, epsilon, and other tweaking
[LINK]
stochastic noise
stratification
- Stratification for Variance Reduction
[LINK]
- Stratification Example
[LINK]
sufficient statistics
- Sufficient Statistics and Exchangeability
[LINK]
supervised learning
- Classification Risk
[LINK]
- Machine learning - ERM and Bayesian
[LINK]
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Mixture Models, and types of learning
[LINK]
switchpoint
- Imputation and Convergence
[LINK]
temperature
test error
- Learning bounds and the Test set
[LINK]
- Why do we need validation?
[LINK]
test statistic
testing set
- Learning bounds and the Test set
[LINK]
- Why do we need validation?
[LINK]
training error
training set
- The EM algorithm
[LINK]
- Learning bounds and the Test set
[LINK]
- Why do we need validation?
[LINK]
transition distribution
- Exploring Hamiltonian Monte Carlo
[LINK]
- L, epsilon, and other tweaking
[LINK]
transition matrix
travelling salesman problem
true-belief model
- Utility or Risk - Back to optimization
[LINK]
type-2 mle
- Distributions example - elections
[LINK]
- Distributions
[LINK]
- Monte Carlo Integrals
[LINK]
- Regression with custom priors
[LINK]
unsupervised learning
- Types of learning and MCMC
[LINK]
- Mixtures and MCMC
[LINK]
- Mixture Models, and types of learning
[LINK]
validation error
- Why do we need validation?
[LINK]
variance
variance reduction
- Stratification for Variance Reduction
[LINK]
- Stratification Example
[LINK]
variational inference
varying intercept
- Poisson Regression - tools on islands part 2
[LINK]
- Prosocial Chimps
[LINK]
waic
- Poisson Regression - tools on islands part 2
[LINK]
- Model Comparison continued
[LINK]
- Model Comparison
[LINK]
x-likelihood
z-posterior
zero-inflated
- Poisson and 0-inflated
[LINK]
- Bayesian Workflow in the 0-inflated model
[LINK]