This talk is an introduction to kernel methods and Gaussian processes (GPs). We show where these methods, as well as aspects of their underlying theory, are useful in modern machine learning. To start off, we introduce the connection between GPs and kernel ridge regression (KRR). We then focus on basic Bayesian statistics as a foundation for uncertainty quantification. Finally, we show two examples of approximating the posterior for uncertainty quantification in neural networks, and we finish with two applications of deep kernel learning.
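As a minimal illustration of the GP–KRR connection, the sketch below checks numerically that the GP posterior mean coincides with the KRR estimate when the KRR regularization parameter equals the GP's observation-noise variance. It assumes scikit-learn, an RBF kernel with unit length scale, and a hypothetical noise variance `sigma2`; none of these choices come from the talk itself.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=30)

sigma2 = 0.1  # assumed observation-noise variance (illustrative choice)

# GP posterior mean: k(x, X) (K + sigma^2 I)^{-1} y.
# optimizer=None keeps the kernel hyperparameters fixed.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0), alpha=sigma2, optimizer=None
)
gp.fit(X, y)

# KRR estimate: k(x, X) (K + lambda I)^{-1} y with lambda = sigma^2.
# gamma = 1 / (2 * length_scale^2) matches the GP's RBF kernel above.
krr = KernelRidge(kernel="rbf", gamma=0.5, alpha=sigma2)
krr.fit(X, y)

# The two predictors agree up to numerical precision.
X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
print(np.allclose(gp.predict(X_test), krr.predict(X_test)))  # True
```

Under this parameter matching, the two models solve the same linear system, so their predictions agree; the GP additionally provides a posterior variance, which is what the later parts of the talk build on for uncertainty quantification.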