CS 229r. Proofs, Beliefs, and Algorithms through the lens of Sum of Squares

In this graduate seminar we will cover recent research on the use of mathematical programming for problems arising from optimization, machine learning, computational complexity and more, with a particular focus on the Parrilo-Lasserre "Sum of Squares" semidefinite programming hierarchy. We will discuss both lower and upper bounds, as well as how such mathematical programs give rise to a general theory of computational difficulty, computation vs. sample size tradeoffs, and computational analogs of Bayesian probabilities.

More concretely, we will touch on some of the following topics:

* Upper and lower bounds for various average-case problems, including random constraint satisfaction, planted clique, and problems arising from machine learning.
* Speeding up Sum of Squares algorithms.
* Relation to Khot's Unique Games Conjecture (UGC) and the SoS approach to refuting the UGC.
* Can SoS be the optimal algorithm in some settings? What are the candidate algorithms that could do better? In what settings might *linear* programming already be optimal? What kinds of implications could such optimality entail?
* Semidefinite extension complexity and relations to communication complexity.
* Relation to statistical physics, and to algorithms such as belief propagation and survey propagation.
* Relation to quantum information theory, quantum entanglement, and the log rank conjecture.
* Reducing asymptotic SoS questions to finite size via symmetry, and relation to Razborov's flag algebras and Turán problems.

However, this is a fast-moving research area and our plans may change as new results, as well as new understandings of old results, come to light. The course will not require much mathematical background beyond so-called "mathematical maturity". However, some familiarity with notions such as convexity, linear programming duality, separation oracles, eigenvalues/eigenvectors, and positive semidefiniteness could be helpful.
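As a small, informal illustration of how these notions come together (a toy example only, not course material): a sum-of-squares certificate that a polynomial is nonnegative amounts to exhibiting a positive semidefinite Gram matrix, and searching for such a matrix is a semidefinite program. For instance, for $p(x,y) = 2x^2 - 2xy + y^2$ one can write

\[
  p(x,y) \;=\; \begin{pmatrix} x & y \end{pmatrix}
  \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix}
  \begin{pmatrix} x \\ y \end{pmatrix},
  \qquad
  \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix} \succeq 0,
\]

and factoring the Gram matrix gives the explicit certificate

\[
  p(x,y) \;=\; x^2 + (x-y)^2 \;\ge\; 0 .
\]

The degree-$d$ levels of the hierarchy generalize this search to Gram matrices over higher-degree monomials, subject to the constraints of the problem at hand.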