Mathematics & Statistics (Sci) : Line search methods, including steepest descent and Newton's (and quasi-Newton) methods. Trust-region methods, the conjugate gradient method, solving nonlinear equations, theory of constrained optimization including a rigorous derivation of the Karush-Kuhn-Tucker conditions, convex optimization including duality and sensitivity. Interior-point methods for linear and conic programming.
Terms: Winter 2020
Instructors: Hoheisel, Tim (Winter)
Prerequisite: Undergraduate background in analysis and linear algebra, with the instructor's approval
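As a taste of the line-search material listed above, here is a minimal sketch of steepest descent with an Armijo backtracking line search. The function names, constants, and the quadratic test problem are illustrative choices, not material taken from the course itself.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is (nearly) zero
            break
        d = -g                        # steepest-descent direction
        t, c, beta = 1.0, 1e-4, 0.5   # initial step, Armijo constant, shrink factor
        # Backtrack until the Armijo sufficient-decrease condition holds:
        # f(x + t d) <= f(x) + c * t * grad(x)^T d
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Illustrative test problem: minimize the convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = steepest_descent(f, grad, np.zeros(2))
```

For this well-conditioned quadratic, the iterates converge to the solution of A x = b; Newton's method, by contrast, would reach it in a single step, which is one motivation for the Newton and quasi-Newton topics above.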