Convex and Conic Optimization    Spring 2021, Princeton University (graduate course)

(This is the Spring 2021 version of this course. For previous versions, click here.)

Useful links
  • A. Ben-Tal and A. Nemirovski, Lecture Notes on Modern Convex Optimization [link]
  • S. Boyd and L. Vandenberghe, Convex Optimization [link]
  • M. Laurent and F. Vallentin, Semidefinite Optimization [link]
  • R. Vanderbei, Linear Programming and Extensions [link]

The lecture notes below summarize most of what I cover on the whiteboard during class. Please complement them with your own notes.
Some lectures take one class session to cover; others take two.

  • Lecture 1: A taste of P and NP: scheduling on Doodle + maximum cliques and the Shannon capacity of a graph.
  • Lecture 2: Mathematical background.

  • Lecture 3: Local and global minima, optimality conditions, the AM-GM inequality, least squares.
  • Lecture 4: Convex sets and functions, epigraphs, quasiconvex functions, convex hulls, Carathéodory's theorem, convex optimization problems.

  • Lecture 5: Separating hyperplane theorems, the Farkas lemma, and strong duality of linear programming.

  • Lecture 6: Bipartite matching, minimum vertex cover, König's theorem, totally unimodular matrices and integral polyhedra.

  • Lecture 7: Characterizations of convex functions, strict and strong convexity, optimality conditions for convex problems.
  • Lecture 8: Convexity-preserving rules, convex envelopes, support vector machines.
  • Lecture 9: LP, QP, QCQP, SOCP, SDP.

  • Lecture 10: Some applications of SDP in dynamical systems and eigenvalue optimization.
  • Lecture 11: Some applications of SDP in combinatorial optimization: stable sets, the Lovász theta function, and the Shannon capacity of graphs.

  • Lecture 12: Nonconvex quadratic optimization and its SDP relaxation, the S-Lemma.

  • Lecture 13: Computational complexity in numerical optimization.

  • Lecture 14: Complexity of local optimization, the Motzkin-Straus theorem, matrix copositivity.

  • Lecture 15: Sum of squares programming and relaxations for polynomial optimization.

  • Lecture 16: Robust optimization.

  • Lecture 17: Convex relaxations for NP-hard problems with worst-case approximation guarantees.

  • Lecture 18: Approximation algorithms (ctnd.), limits of computation, concluding remarks.
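Several of the early topics above (optimality conditions, least squares in Lecture 3) can be previewed with a small computation. Here is a minimal pure-Python sketch, not from the course materials: it fits a line y ≈ a·x + b by solving the 2×2 normal equations of the least-squares problem directly; the data is made up for illustration.

```python
# Least-squares line fit: minimize sum_i (a*x_i + b - y_i)^2.
# Setting the gradient to zero gives the 2x2 normal equations,
# which we solve in closed form by Cramer's rule.
def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Normal equations: [sxx sx; sx n] [a; b] = [sxy; sy]
    det = sxx * n - sx * sx
    a = (sxy * n - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, b

# Data lying exactly on y = 2x + 1; the fit recovers a = 2, b = 1.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

Because the objective is convex (indeed strongly convex when the x's are not all equal), the stationary point found by the normal equations is the unique global minimizer, which is the connection to Lectures 3 and 7.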

Problem sets and exams

Solutions are posted on Blackboard. 

  • Homework 1: Image compression and SVD, matrix norms, existence of optimal solutions, descent directions, dual and induced norms, properties of positive semidefinite matrices.
    [pdf] [conway.jpg]
  • Homework 2: Convex analysis true/false questions, optimal control, theory-applications split in a course.

  • Homework 3: Support vector machines (Hillary or Bernie?), norms defined by convex sets, totally unimodular matrices, radiation treatment planning.

  • Practice midterms.
    See Blackboard.

  • Midterm:

  • Homework 4: A nuclear program for peaceful reasons, distance geometry, stability of a pair of matrices, SDPs with rational data and irrational feasible solutions.
  • Homework 5: The Lovász sandwich theorem, SDP and LP relaxations for the stable set problem, Shannon capacity.

  • Homework 6:  

  • Practice final:

  • Final exam:
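The stable-set themes of Homework 5 can be explored by brute force on the 5-cycle C5, the classic example where the stability number is 2 while the Shannon capacity is √5 (computed by Lovász via the theta function). A minimal sketch, with the graph encoding and helper names my own:

```python
from itertools import combinations

# Edge set of the 5-cycle C5 on vertices {0, ..., 4}.
edges = {(i, (i + 1) % 5) for i in range(5)}

def is_stable(s):
    # A vertex subset is stable (independent) if no two of its
    # vertices are joined by an edge.
    return all((a, b) not in edges and (b, a) not in edges
               for a, b in combinations(s, 2))

# alpha(C5): the size of a largest stable set, by enumeration.
alpha = max(len(s) for r in range(6)
            for s in combinations(range(5), r) if is_stable(s))
```

Here alpha comes out to 2, while the SDP-based Lovász theta function of C5 equals √5 ≈ 2.236, sandwiching the (hard-to-compute) quantities between tractable bounds, which is the point of the LP and SDP relaxations in the homework.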