These notes cover the continuous optimization theory and algorithms behind machine learning. Topics include unconstrained optimization (gradient descent, Newton and quasi-Newton methods, conjugate gradient), constrained optimization (KKT conditions, constraint qualifications, augmented Lagrangians, ADMM), proximal methods, sequential quadratic programming, and interior-point methods with self-concordant barriers.
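As a first taste of the topics above, fixed-step gradient descent on a one-dimensional quadratic can be sketched as follows (a minimal illustration; the objective, step size, and iteration count are chosen for this example, not taken from the notes):

```python
# Minimal gradient descent sketch (illustrative example only):
# minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Run gradient descent with a fixed step size lr from x0."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move against the gradient direction
    return x

x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
# x_star converges toward the minimizer x = 3
```

For this quadratic the gradient is Lipschitz with constant L = 2, so any fixed step size below 2/L = 1 yields convergence; the chosen lr = 0.1 is comfortably inside that range.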
