Ten lectures by mathematics faculty members on interesting topics from both classical and modern mathematics. All lectures are accessible to students with a calculus background and an interest in mathematics. At each lecture, reading and exercises are assigned. Students prepare these for discussion in a weekly problem session.
Organizer: Prof. Jörn Dunkel
Teaching assistant: Jae Hee Lee (jaehelee@mit.edu)
Attending lectures in person is strongly recommended.
Lectures are held MWF 1:00-2:30 at 2-190.
Note: Some lectures will be recorded and made available through Panopto by 3 PM the following day for students with MIT credentials.
Homework is due one week after the corresponding lecture and is submitted on Gradescope.
Recitations are held every Thursday, 10:30-12:00 and 1:00-2:30, in 2-147. Both sessions cover the same material.
Office hours are available by appointment (via jaehelee@mit.edu), in addition to the recitations.
Grading is strictly P/D/F. To receive a passing grade, you must attend lectures and demonstrate solid effort on the problem sets.
Monday, January 8: Laurent Demanet
Compressed Sensing
Abstract: Compressed sensing is a signal processing idea that combines sparse recovery with random sampling. It is among the most important discoveries in applied math in the 2000s.
Lecture Notes: Notes
Suggested Reading: Chen, Donoho, Saunders, Atomic decomposition by basis pursuit, 1998, and republished in 2001.
Exercises: Problem set due January 15
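As a concrete illustration of the basic idea (a minimal Python sketch, assuming SciPy is available; the dimensions and setup are chosen here for illustration and are not taken from the lecture): recover a sparse vector from far fewer random linear measurements than unknowns by basis pursuit, i.e., l1 minimization, posed as a linear program.

import numpy as np
from scipy.optimize import linprog

# Recover a k-sparse vector x of length n from m << n random measurements
# y = A x by minimizing ||x||_1 subject to A x = y (basis pursuit).
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5

x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x_true

# Linear-programming formulation: write x = u - v with u, v >= 0 and
# minimize sum(u + v) subject to A(u - v) = y.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
print("recovery error:", np.linalg.norm(x_hat - x_true))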
Wednesday, January 10: Jeremy Kepner & Hayden Jananthan
Mathematics of Big Data & Machine Learning
Abstract: Big Data describes a new era in the digital age where the volume, velocity, and variety of data created across a wide range of fields (e.g., internet search, healthcare, finance, social media, defense, ...) is increasing at a rate well beyond our ability to analyze the data. Machine Learning has emerged as a powerful tool for transforming this data into usable information. Many technologies (e.g., spreadsheets, databases, graphs, linear algebra, deep neural networks, ...) have been developed to address these challenges. The common theme amongst these technologies is the need to store and operate on data as whole collections instead of as individual data elements. This talk describes the common mathematical foundation of these data collections, associative arrays, which applies across a wide range of applications and technologies. Associative arrays unify and simplify Big Data and Machine Learning. Understanding these mathematical foundations allows the student to see past the surface differences among Big Data and Machine Learning applications and technologies, and to leverage their core mathematical similarities to solve the hardest Big Data and Machine Learning challenges.
Lecture Notes: Notes
Exercises: Problem set due January 17
Contact Dr. Jeremy Kepner and Dr. Hayden Jananthan for inquiries regarding potential UROP opportunities.
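As a rough illustration of the associative-array idea in the abstract (a toy Python sketch; the function names and interface are made up for illustration and are not the D4M or GraphBLAS API): an associative array can be modeled as a map from (row key, column key) pairs to values, with element-wise addition and an array product that generalizes matrix multiplication to arbitrary (e.g., string-valued) indices.

from collections import defaultdict

def aa_add(A, B):
    # Element-wise sum of two associative arrays (dicts of (row, col) -> value).
    C = defaultdict(float)
    for d in (A, B):
        for key, val in d.items():
            C[key] += val
    return dict(C)

def aa_matmul(A, B):
    # Associative-array product: C[i, k] = sum over j of A[i, j] * B[j, k].
    C = defaultdict(float)
    for (i, j), a in A.items():
        for (j2, k), b in B.items():
            if j == j2:
                C[(i, k)] += a * b
    return dict(C)

# Row and column keys can be arbitrary strings, e.g. documents x words.
A = {("doc1", "cat"): 2.0, ("doc1", "dog"): 1.0, ("doc2", "dog"): 3.0}
B = {("cat", "animal"): 1.0, ("dog", "animal"): 1.0}
print(aa_matmul(A, B))   # {('doc1', 'animal'): 3.0, ('doc2', 'animal'): 3.0}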
Friday, January 12: Tristan Ozuch
Gauss' Theorema Egregium
Abstract: In this lecture, motivated by the problem of cartography and the rigidity of pizza slices, we will introduce the basics of metric and Riemannian geometry, with a focus on the most important quantity of the theory: curvature. We will finally state Gauss' Theorema Egregium and discuss some of its most basic but striking consequences.
Lecture Notes + Exercises: Problem set due January 19
The topics in this lecture may be pursued further in 18.950: Differential Geometry.
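A small worked example of what the theorem buys (illustrative; not taken from the lecture notes): a sphere of radius R, with metric ds^2 = R^2 dtheta^2 + R^2 sin^2(theta) dphi^2, has Gaussian curvature K = 1/R^2 at every point, while the flat plane has K = 0. The Theorema Egregium says that K is determined by the metric alone, that is, by length measurements made within the surface, so no map from a piece of the sphere to the plane can preserve all distances. This is why every flat map of the Earth must distort lengths, angles, or areas somewhere.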
Monday, January 15: No lecture (Institute holiday)
Wednesday, January 17: Jörn Dunkel
Chaos in discrete and continuous dynamical systems
Abstract: We will discuss three famous examples of chaotic dynamics.
Lecture Notes: Notes
Matlab code: Lorenz, Hénon–Heiles
Exercises: Problem set due January 29
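A minimal Python sketch of the kind of experiment the linked Matlab code supports (illustrative and not the course's code; the parameters sigma = 10, rho = 28, beta = 8/3 are the classic chaotic choice): integrate the Lorenz system from two nearly identical initial conditions and watch the trajectories separate.

import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z), x * y - beta * z]

# Two trajectories whose initial conditions differ by 1e-8 in z.
t_span = (0.0, 40.0)
t_eval = np.linspace(*t_span, 4000)
sol1 = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-9, atol=1e-12)
sol2 = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0 + 1e-8], t_eval=t_eval, rtol=1e-9, atol=1e-12)

# Sensitive dependence on initial conditions: the separation grows from 1e-8
# to roughly the size of the attractor.
separation = np.linalg.norm(sol1.y - sol2.y, axis=0)
print("final separation at t = 40:", separation[-1])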
Friday, January 19: Peter Shor
Continued Fractions
Abstract: Continued fractions are a topic in number theory with applications to rational approximation of real numbers. We will first explain what a continued fraction is, prove some basic theorems about continued fractions, and then show how they can be used to find good rational approximations.
Lecture Notes: Notes
Suggested reading: Aaron Pollack's notes
Exercises: Problem set due January 26
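A short Python sketch of the basic construction (illustrative; not part of the assigned exercises): compute the continued fraction expansion [a0; a1, a2, ...] of a real number and its convergents p_k/q_k via the standard recurrences p_k = a_k p_{k-1} + p_{k-2} and q_k = a_k q_{k-1} + q_{k-2}; the convergents are the good rational approximations mentioned in the abstract.

from math import floor, pi

def continued_fraction(x, n_terms=8):
    # Continued fraction expansion of x: repeatedly take the integer part
    # and invert the fractional remainder.
    terms = []
    for _ in range(n_terms):
        a = floor(x)
        terms.append(a)
        frac = x - a
        if frac == 0:
            break
        x = 1.0 / frac
    return terms

def convergents(terms):
    # Convergents p_k / q_k from the recurrences above.
    p_prev, p = 1, terms[0]
    q_prev, q = 0, 1
    convs = [(p, q)]
    for a in terms[1:]:
        p, p_prev = a * p + p_prev, p
        q, q_prev = a * q + q_prev, q
        convs.append((p, q))
    return convs

terms = continued_fraction(pi)          # [3, 7, 15, 1, 292, ...]
for p, q in convergents(terms):
    print(f"{p}/{q} = {p / q:.10f}, error = {abs(p / q - pi):.2e}")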
Monday, January 22: Jon Kelner
Random Walks, Discrete Harmonic Functions, and Electrical Circuits
Abstract: In this talk, we'll study a natural question about a random walk on an undirected graph: if you start at some vertex x, what is the probability that you hit vertex a before vertex b? We will use this to motivate the definition of a discrete harmonic function. We will then answer our question by using the properties of discrete harmonic functions to develop a surprising relationship between the probabilistic behavior of random walks and the currents and voltages in related electrical circuits.
Exercises: Problem set due January 29
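A small Python sketch of the computation behind the lecture's question (illustrative; the example graph and function name are made up): the hitting probability h(x) = P(reach a before b, starting from x) satisfies h(a) = 1, h(b) = 0, and at every other vertex h(x) equals the average of h over the neighbors of x. That makes h a discrete harmonic function, and it can be computed by solving one linear system.

import numpy as np

def hit_before(adj, a, b):
    # adj maps each vertex 0..n-1 to its list of neighbors.
    n = len(adj)
    A = np.zeros((n, n))
    rhs = np.zeros(n)
    for x in range(n):
        if x == a:
            A[x, x], rhs[x] = 1.0, 1.0       # boundary condition h(a) = 1
        elif x == b:
            A[x, x], rhs[x] = 1.0, 0.0       # boundary condition h(b) = 0
        else:
            A[x, x] = 1.0                     # h(x) - average over neighbors = 0
            for y in adj[x]:
                A[x, y] -= 1.0 / len(adj[x])
    return np.linalg.solve(A, rhs)

# Path graph 0 - 1 - 2 - 3 - 4 with a = 4, b = 0: the gambler's-ruin
# answer is h(x) = x/4.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(hit_before(adj, a=4, b=0))   # [0.  0.25 0.5  0.75 1.  ]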
Wednesday, January 24: Keaton Burns
Numerical Simulations with Exponential Accuracy
Abstract: Numerical methods for differential equations are a major area of applied mathematics. Their development brings together approximation theory, numerical linear algebra, parallel algorithms, and the theory of differential equations to build powerful computational tools for scientists and engineers. In this talk, we will examine high-fidelity numerical methods for partial differential equations (PDEs), particularly those with relevance to fluid dynamical problems like climate and biophysical modeling. First, we will review traditional finite difference methods for solving elliptic equations. We will then examine spectral methods, which offer much greater accuracy than traditional grid-based techniques. Our discussion will include spectral approximation theory with Fourier series and orthogonal polynomials, and how to use these ideas to build fast and accurate numerical solvers for a wide range of PDEs.
Exercises: Problem set due January 31
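A minimal Python sketch contrasting the two approaches in the abstract (illustrative; the test function exp(sin x) is a standard textbook choice, not necessarily the lecture's): differentiate a smooth periodic function with a second-order centered finite difference and with a Fourier spectral derivative, and compare the maximum errors as the grid is refined.

import numpy as np

def derivative_errors(N):
    x = 2 * np.pi * np.arange(N) / N
    f = np.exp(np.sin(x))
    exact = np.cos(x) * f

    h = 2 * np.pi / N
    fd = (np.roll(f, -1) - np.roll(f, 1)) / (2 * h)       # centered finite difference

    k = np.fft.fftfreq(N, d=1.0 / N)                       # integer wavenumbers
    spectral = np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

    return np.max(np.abs(fd - exact)), np.max(np.abs(spectral - exact))

# The finite-difference error decays like N^(-2); the spectral error decays
# exponentially until it reaches machine precision.
for N in (8, 16, 32, 64):
    fd_err, sp_err = derivative_errors(N)
    print(f"N = {N:3d}   FD error = {fd_err:.2e}   spectral error = {sp_err:.2e}")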
Friday, January 26: John Bush
The fluid dynamics of COVID transmission
Abstract: We review advances in the theoretical modeling of the fluid mechanics of respiratory disease transmission. We demonstrate how they have led to a better understanding of the various modes of disease transmission and their relative importance in outdoor and indoor settings. We discuss the accompanying evolution of safety guidelines intended to combat the spread of COVID-19.
Lecture Notes: Notes
Suggested Reading + Exercises: Problem set due February 2
Monday, January 29: Daniel Alvarez-Gavela
The Hairy Ball Theorem
Abstract: The Hairy Ball Theorem states that you can't comb a hairy ball. More precisely, every tangent vector field on the sphere must have at least one zero. We will discuss this result, as well as some generalizations which point to a connection between zeros of vector fields and topology.
Suggested Reading: Milnor's article
Exercises: Problem set due February 2
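For readers curious about the generalization hinted at in the abstract, one standard statement connecting zeros of vector fields to topology is the Poincaré-Hopf theorem (mentioned here as background; the lecture may frame things differently): for a vector field with isolated zeros on a compact surface, the sum of the indices of the zeros equals the Euler characteristic of the surface. Since the sphere has Euler characteristic 2, which is nonzero, every tangent vector field on it must vanish somewhere, while the torus, with Euler characteristic 0, does admit a nowhere-vanishing vector field.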
Wednesday, January 31: Roman Bezrukavnikov
Trees, Groups, and Approximations
Abstract: As explained in Peter Shor's lecture, close rational approximations of an irrational number are obtained from its continued fraction expansion. Given an approximation procedure, one expects it to work particularly well in specific, structured cases and less well in the generic case. Here, however, it turns out that a beautiful structure arises from considering numbers where the approximation is as bad as possible. In particular, the very worst example is provided by the Golden Ratio. More general ones can be read off from integer solutions to the Markov equation a^2+b^2+c^2=3abc, which can be organized into a tree known as the Markov tree.
Suggested Reading: E. Bombieri, "Continued fractions and the Markoff tree" (2007); M. Aigner, "Markov's Theorem and 100 Years of the Uniqueness Conjecture" (Sections 1.4, 2.1, 2.2, 3.1, 4.1)
Exercises: Problem set
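A short Python sketch of the tree structure (illustrative; not part of the problem set): if (a, b, c) is an integer solution of a^2 + b^2 + c^2 = 3abc, then so is (a, b, 3ab - c), since c and 3ab - c are the two roots of the same quadratic in the third variable. Iterating these swaps starting from (1, 1, 1) generates the Markov tree.

def markov_tree(depth):
    triples = set()

    def grow(a, b, c, level):
        triple = tuple(sorted((a, b, c)))
        if triple in triples or level > depth:
            return
        triples.add(triple)
        grow(a, b, 3 * a * b - c, level + 1)   # swap the third coordinate
        grow(a, 3 * a * c - b, c, level + 1)   # swap the second coordinate
        grow(3 * b * c - a, b, c, level + 1)   # swap the first coordinate

    grow(1, 1, 1, 0)
    return sorted(triples)

for a, b, c in markov_tree(depth=5):
    assert a * a + b * b + c * c == 3 * a * b * c
    print((a, b, c))
# Small triples include (1, 1, 1), (1, 1, 2), (1, 2, 5), (1, 5, 13), (2, 5, 29), ...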
Friday, February 2: Paul Seidel
The Stokes Phenomenon
Abstract: You may consider linear differential equations boring, or they may evoke horrible memories of 18.03. Things become more interesting when one considers differential equations with poles. The more straightforward part of the theory (regular singularities) is sometimes taught in engineering-oriented classes, but beyond lies a wild country of divergent series and approximations that jump. This is one of the few subjects where you can see very abstractly oriented research rub elbows with traditional applied math. I will try to give an introduction to this, with examples (partly to remind myself; the phenomenon is so weird that I keep forgetting what's going on).
Lecture Notes: Available on Canvas
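A standard illustration of the divergent series the abstract alludes to (a Python sketch; this example is chosen here and is not necessarily the one used in the lecture): the Stieltjes integral F(x), defined as the integral of e^(-t)/(1 + x t) over t from 0 to infinity, has the divergent asymptotic expansion F(x) ~ sum over n of (-1)^n n! x^n as x -> 0+. Its partial sums first approach F(x) and then blow up; truncating near the smallest term ("optimal truncation") gives the best accuracy.

import numpy as np
from math import factorial
from scipy.integrate import quad

x = 0.1
F, _ = quad(lambda t: np.exp(-t) / (1 + x * t), 0, np.inf)

# Partial sums of the divergent asymptotic series sum_n (-1)^n n! x^n.
partial = 0.0
for n in range(30):
    partial += (-1) ** n * factorial(n) * x ** n
    print(f"n = {n:2d}   partial sum = {partial: .8f}   error = {abs(partial - F):.2e}")
# The error shrinks until roughly n ~ 1/x = 10, then grows without bound.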