Subscribe to the mailing list to receive talk announcements.
For more information, contact Laurent Demanet
Fall 2025
Fall semester: 4:30pm-5:30pm in Room 2-190.
Date | Speaker | Abstract |
---|---|---|
September 11 | Shay Moran | Differentially Private Linear Algebra. Abstract: Differential privacy (DP) has emerged as a powerful framework for designing algorithms that protect sensitive data. In this talk, I will present our work at the intersection of differential privacy and linear algebra, introducing efficient DP algorithms for fundamental algebraic tasks: solving systems of linear equations over arbitrary fields, solving linear inequalities over the reals, and computing affine spans and convex hulls. Our algorithms for equalities are strongly polynomial, while those for inequalities are only weakly polynomial—and this gap is provably inherent. As applications, we obtain the first efficient DP algorithms for learning halfspaces and affine subspaces. The talk will not assume prior familiarity with differential privacy; I will begin with a review of the definition. |
October 9 | Eitan Tadmor | Swarm-Based Gradient Descent: A Multi-Agent Approach for Non-Convex Optimization. Abstract: We discuss a novel class of swarm-based gradient descent (SBGD) methods for non-convex optimization. The swarm consists of agents, each identified with a position x and a mass m. There are two key ingredients in the SBGD dynamics: (i) persistent transition of mass from agents at high ground to those at lower ground; and (ii) a time-stepping protocol with step size decreasing in m. The interplay between positions and masses leads to a dynamic distinction between 'leaders' and 'explorers': heavier agents lead the swarm near local minima with small time steps; lighter agents use larger time steps to explore the landscape in search of an improved global minimum, reducing the overall loss of the swarm. Convergence analysis and numerical simulations demonstrate the effectiveness of the SBGD method as a global optimizer. |
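The two SBGD ingredients in the abstract above (mass flowing downhill between agents, and step sizes that shrink with mass) can be illustrated with a toy sketch. This is a hypothetical simplified scheme for illustration only, not the speaker's actual algorithm: here mass is transferred from higher-loss agents to the current best agent, and each agent's step size is `h_max * (1 - m)` so heavy "leaders" take small steps while light "explorers" take large ones.

```python
import numpy as np

def sbgd(loss, grad, X0, steps=200, h_max=0.4, eta=0.05):
    """Toy swarm-based gradient descent sketch (hypothetical dynamics).

    loss : callable mapping a position (d,) to a scalar
    grad : callable mapping a position (d,) to its gradient (d,)
    X0   : initial agent positions, shape (n, d)
    """
    X = np.array(X0, dtype=float)
    n = len(X)
    m = np.full(n, 1.0 / n)              # agent masses, summing to 1

    for _ in range(steps):
        f = np.array([loss(x) for x in X])
        b = int(np.argmin(f))            # current leader (lowest loss)

        # (i) mass flows from agents at high ground to the leader
        dm = eta * m * (f > f[b])
        m = m - dm
        m[b] += dm.sum()

        # (ii) step size decreases with mass: heavy leaders refine
        # local minima, light explorers roam the landscape
        h = h_max * (1.0 - m)
        X = X - h[:, None] * np.array([grad(x) for x in X])

    f = np.array([loss(x) for x in X])
    return X[int(np.argmin(f))]
```

For example, on a smooth convex test loss the swarm's best agent converges toward the minimizer; on a multi-well landscape, the light explorers give the swarm a chance to escape poor local minima.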