Convex and Conic Optimization    Spring 2018, Princeton University (graduate course)

(This is the Spring 2018 version of this course. For the Spring 2017 version click here. For the Spring 2016 version click here. For the Spring 2015 version click here.)

Useful links
  • A. Ben-Tal and A. Nemirovski, Lecture Notes on Modern Convex Optimization [link]
  • S. Boyd and L. Vandenberghe, Convex Optimization [link]
  • M. Laurent and F. Vallentin, Semidefinite Optimization [link]
  • R. Vanderbei, Linear Programming and Extensions [link]

The lecture notes below summarize most of what I cover on the whiteboard during class. Please complement them with your own notes.
Some lectures take one class session to cover; others take two.

  • Lecture 1: A taste of P and NP: scheduling on Doodle + maximum cliques and the Shannon capacity of a graph.
  • Lecture 2: Mathematical background.

  • Lecture 3: Local and global minima, optimality conditions, the AM-GM inequality, least squares.
  • Lecture 4: Convex sets and functions, epigraphs, quasiconvex functions, convex hulls, Caratheodory's theorem, convex optimization problems.

  • Lecture 5: Separating hyperplane theorems, the Farkas lemma, and strong duality of linear programming.

  • Lecture 6: Bipartite matching, minimum vertex cover, Konig's theorem, totally unimodular matrices and integral polyhedra.

  • Lecture 7: Characterizations of convex functions, strict and strong convexity, optimality conditions for convex problems.
  • Lecture 8: Convexity-preserving rules, convex envelopes, support vector machines.
  • Lecture 9: LP, QP, QCQP, SOCP, SDP.

  • Lecture 10: Some applications of SDP in dynamical systems and eigenvalue optimization.
  • Lecture 11: Some applications of SDP in combinatorial optimization: stable sets, the Lovasz theta function, and Shannon capacity of graphs.

  • Lecture 12: Nonconvex quadratic optimization and its SDP relaxation, the S-Lemma.

  • Lecture 13: Computational complexity in numerical optimization.

  • Lecture 14: Complexity of local optimization, the Motzkin-Straus theorem, matrix copositivity.

  • Lecture 15: Sum of squares programming and relaxations for polynomial optimization.

  • Lecture 16: Robust optimization.

  • Lecture 17: TBA (guest lecture by Dr. Oktay Gunluk, IBM Watson Research Center)

  • Lecture 18: Convex relaxations for NP-hard problems with worst-case approximation guarantees.

  • Lecture 19: Approximation algorithms (ctnd.), limits of computation, concluding remarks.
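As a quick taste of the material in Lecture 3, the least-squares problem is the simplest convex optimization problem with a closed-form solution. A minimal sketch in NumPy (for illustration only; not part of the course materials):

```python
import numpy as np

# Least squares: minimize ||Ax - b||^2 over x, a convex quadratic problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

# The optimality condition (gradient = 0) gives the normal equations
# A^T A x = A^T b, solvable directly when A has full column rank.
x = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against NumPy's built-in least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_lstsq))
```

The same problem reappears in Lecture 7 as an instance where the first-order optimality condition is both necessary and sufficient, by convexity.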

Problem sets and exams

Solutions are posted on Blackboard. 

  • Homework 1: Image compression and SVD, matrix norms, optimality conditions, properties of positive semidefinite matrices.
    [pdf], [Shams.jpg]
  • Homework 2:  Convex analysis, symmetries and convex optimization, minimum fuel optimal control, theory-application split in a course.

  • Practice midterm 1.
    [pdf], [pdf]
  • Midterm 1.

  • Homework 3: Support vector machines (Hillary or Bernie?), norms defined by convex sets, totally unimodular matrices, radiation treatment planning, Farkas lemma.
    [pdf], [treatment_planning_data.m], [Hillary_vs_Bernie.mat]

  • Homework 4: 

  • Homework 5: 

  • Practice midterm 2.

  • Midterm 2.

  • Homework 6: 

  • Practice final. 

  • Final exam.
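The image-compression exercise in Homework 1 rests on the truncated SVD: by the Eckart-Young theorem, keeping the k largest singular values gives the best rank-k approximation in Frobenius norm. A minimal NumPy sketch of the idea (the homework itself uses the MATLAB/image files above; this is only an illustration):

```python
import numpy as np

# Best rank-k approximation of a matrix via the truncated SVD.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 40))  # stand-in for a grayscale image

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 10
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The Frobenius-norm error equals the norm of the discarded singular values.
err = np.linalg.norm(M - M_k, 'fro')
print(np.isclose(err, np.linalg.norm(s[k:])))
```

Storing U[:, :k], s[:k], and Vt[:k, :] takes k(m + n + 1) numbers instead of mn, which is the source of the compression.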