
CO 466

Continuous Optimization

This course covers the continuous optimization theory that underlies much of machine learning.

These notes cover continuous optimization theory and algorithms. Topics include unconstrained optimization (gradient descent, Newton and quasi-Newton methods, conjugate gradient), constrained optimization (KKT conditions, constraint qualification, augmented Lagrangians, ADMM), proximal methods, sequential quadratic programming, and interior-point methods with self-concordant barriers.
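As a small illustration of the first topic in the list, here is a minimal sketch of fixed-step gradient descent on a toy quadratic; the objective, step size, and iteration count are illustrative choices, not taken from the course.

```python
# Minimal gradient descent sketch (illustrative; not from the course notes).

def grad_descent(grad, x0, step=0.1, iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimizer is x* = 3.
x_star = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this strongly convex quadratic, each step contracts the distance to the minimizer by a constant factor, so the iterates converge linearly to 3.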

Licensed under CC BY-NC-SA 4.0