Frederic Koehler

Hi! I am currently a third-year graduate student at MIT. Before coming to MIT, I was an undergrad at Princeton. My current research interests include learning theory and related areas: probability theory, optimization, statistical physics, etc.

To reach me: [first initial][last name]@mit.edu

Publications and Preprints

  1. Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective, joint with Vishesh Jain and Andrej Risteski. [arXiv:1808.07226]
  2. Learning Restricted Boltzmann Machines via Influence Maximization, joint with Guy Bresler, Ankur Moitra, and Elchanan Mossel. [arXiv:1805.10262]
  3. Representational Power of ReLU Networks and Polynomial Kernels: Beyond Worst-Case Analysis, joint with Andrej Risteski. [arXiv:1805.11405]
  4. The Vertex Sample Complexity of Free Energy is Polynomial, joint with Vishesh Jain and Elchanan Mossel. Conference on Learning Theory (COLT) 2018. [arXiv:1802.06129]
  5. The Mean-Field Approximation: Information Inequalities, Algorithms, and Complexity, joint with Vishesh Jain and Elchanan Mossel. Conference on Learning Theory (COLT) 2018. [arXiv:1802.06126]
  6. Information theoretic properties of Markov random fields, and their algorithmic applications, joint with Linus Hamilton and Ankur Moitra. Neural Information Processing Systems (NIPS) 2017. [arXiv:1705.11107]
  7. Busy Time Scheduling on a Bounded Number of Machines, joint with Samir Khuller. Algorithms and Data Structures Symposium (WADS) 2017. (Full version, slides)
  8. Provable algorithms for inference in topic models, joint with Sanjeev Arora, Rong Ge, Tengyu Ma, and Ankur Moitra. International Conference on Machine Learning (ICML) 2016. [arXiv:1605.08491]
  9. Optimal batch schedules for parallel machines, joint with Samir Khuller. Algorithms and Data Structures Symposium (WADS) 2013. (Full version)