Hi! I am currently a third-year graduate student at MIT. Before coming to MIT, I was an undergraduate at Princeton. My research interests include learning theory and related topics: probability theory, optimization, statistical physics, etc.
To reach me: [first initial][last name]@mit.edu
Publications and Preprints
- Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective, joint with Vishesh Jain and Andrej Risteski. [arXiv:1808.07226]
- Learning Restricted Boltzmann Machines via Influence Maximization, joint with Guy Bresler, Ankur Moitra, and Elchanan Mossel. [arXiv:1805.10262]
- Representational Power of ReLU Networks and Polynomial Kernels: Beyond Worst-Case Analysis, joint with Andrej Risteski. [arXiv:1805.11405]
- The Vertex Sample Complexity of Free Energy is Polynomial, joint with Vishesh Jain and Elchanan Mossel. Conference on Learning Theory (COLT) 2018. [arXiv:1802.06129]
- The Mean-Field Approximation: Information Inequalities, Algorithms, and Complexity, joint with Vishesh Jain and Elchanan Mossel. Conference on Learning Theory (COLT) 2018. [arXiv:1802.06126]
- Information theoretic properties of Markov random fields, and their algorithmic applications, joint with Linus Hamilton and Ankur Moitra. Neural Information Processing Systems (NIPS) 2017. [arXiv:1705.11107]
- Busy Time Scheduling on a Bounded Number of Machines, joint with Samir Khuller. Algorithms and Data Structures Symposium (WADS) 2017. (Full Version, slides)
- Provable algorithms for inference in topic models, joint with Sanjeev Arora, Rong Ge, Tengyu Ma, and Ankur Moitra. International Conference on Machine Learning (ICML) 2016. [arXiv:1605.08491]
- Optimal batch schedules for parallel machines, joint with Samir Khuller. Algorithms and Data Structures Symposium (WADS) 2013.