Computing and Optimization
Fall 2020, Princeton University (undergraduate course)
(This is the Fall 2020 version of this course. You can also access the current version, or the Fall 2019, Fall 2018, Fall 2017, Fall 2016, Fall 2015, and Fall 2014 versions.)
Useful links
- Zoom (password has been emailed to registered students)
- Lectures (Tue/Thu 1:30pm-2:50pm EST) and Review Sessions (schedule on Slide 38 of Lecture 15). Join here.
- You can follow our live notes during lecture.
- All 10 office hours and 6 precepts (see schedule here). Join here.
- Piazza (used only for Q&A)
- The course syllabus
- Blackboard
- Download MATLAB
- Download CVX
- Acknowledgments
Lectures
The notes below summarize most of what I cover during lecture. Please complement them with your own notes. Some lectures take one class session to cover; others take two. Zoom recordings of our lectures will be posted on Blackboard a few hours after each lecture.
- Lecture 1: Let's play two games! (Optimization, P and NP.)
[pdf]
- Lecture 2: What you should remember from linear algebra and multivariate calculus.
[pdf]
- Lecture 3: Unconstrained optimization, least squares, optimality conditions.
[pdf]
- Lecture 4: Convex optimization I.
[pdf]
- Lecture 5: Convex optimization II.
[pdf]
- CVX: Basic examples.
[m]
- Lecture 6: Applications in statistics and machine learning: LASSO + Support vector machines (SVMs).
[pdf]
- Lecture 7: Root finding and line search. Bisection, Newton, and secant methods.
[pdf]
- Lecture 8: Gradient descent methods, analysis of steepest descent, convergence and rates of convergence, Lyapunov functions for proving convergence.
[pdf]
- Lecture 9: Multivariate Newton, quadratic convergence, Armijo stepsize rule, nonlinear least squares and the Gauss-Newton algorithm.
[pdf]
- Lecture 10: Conjugate direction methods, solving linear systems, Leontief economy.
[pdf]
- Lecture 11: Linear programming: applications, geometry, and the simplex algorithm.
[pdf]
- Lecture 12: Duality + robust linear programming.
[pdf]
- Lecture 13: Semidefinite programming + SDP relaxations for nonconvex optimization.
[pdf]
- Lecture 14: A working knowledge of computational complexity theory for an optimizer.
[pdf]
- Lecture 15: Limits of computation + course recap.
[pdf]
Problem sets and exams
Solutions are posted on Blackboard.
- Homework 1: Modeling Sudoku, perfect numbers, and a review of linear algebra, multivariate calculus, and MATLAB.
[pdf]
- Homework 2: Image compression and remembering John Conway, local and global minima, positive semidefinite matrices, copositive matrices.
[pdf], [conway.jpg]
- Homework 3: Radiation treatment planning, regression with different penalties, minimizers of convex problems, and quasiconvex functions.
[pdf], [treatment_planning_data]
- Homework 4: Support vector machines, Hillary or Bernie, convergence of Newton's method.
[pdf], [HWSVM], [Hillary_vs_Bernie]
- Practice Midterms
[pdf], [pdf], [pdf], [pdf]
- Midterm
[pdf]
- Homework 5: Theory/applications split in a course, Newton fractals.
[pdf]
- Homework 6: Orbit of the Earth and daily temperature in NYC, New gym for Princeton, Lyapunov functions.
[pdf], [Circledraw.m], [plotgrid.m], [princetoncampus.jpeg], [TemperatureNewYork.mat]
- Homework 7: Weak duality in LP and SDP, Minimum fuel optimal control, nearest correlation matrix.
[pdf]
- Homework 8: End-of-semester party at AAA's + Doodle and scheduling + SDP relaxations + NP-completeness.
[pdf], [Party_people_in_the_house_tonight.mat], [Doodle_matrix.mat]
- Practice Finals
[pdf], [pdf], [pdf], [pdf], [pdf]
- Final Exam
[pdf]