IAP 2020 Classes

Non-credit activities and classes:

Check out the IAP pages at http://web.mit.edu/iap/listings/

For-credit subjects:

Check out the course catalog at http://student.mit.edu/catalog/m18a.html. You can use the Subject Search functionality to limit the search to IAP listings or find Math's IAP offerings here: http://student.mit.edu/catalog/search.cgi?search=18&when=J. Our main offerings in Mathematics are:

18.02A Calculus

  • Prof Zhiwei Yun and staff
  • Dates: Jan. 6 - 31
  • MTWRF12
  • TR10-11.30 (2-132, 2-135, 2-136, 2-142) or TR2-3.30 (2-132, 2-135, 2-136, 2-142) +final
  • 54-100

12 units (only 6 will count toward the IAP credit limit)

This is the second half of 18.02A and can be taken only by students who took the first half in the fall term; it covers the remaining material in 18.02.

18.031 System Functions and the Laplace Transform

  • Dr. Diego Cifuentes
  • Jan. 13 - 31
  • MWF 1-3pm
    (with an extra meeting Tues Jan 21, in place of the MLK Day holiday on the 20th)
  • 2-131

3 units (P/D/F graded)

Studies basic continuous control theory as well as representation of functions in the complex frequency domain. Covers generalized functions, unit impulse response, and convolution; and Laplace transform, system (or transfer) function, and the pole diagram. Includes examples from mechanical and electrical engineering.
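
For reference (standard material, not an addition to the official description), the one-sided Laplace transform at the heart of the subject is

$$ F(s) = \mathcal{L}\{f\}(s) = \int_0^\infty f(t)\,e^{-st}\,dt, $$

and for a linear time-invariant system with input $x(t)$, output $y(t)$, and rest initial conditions, the system (or transfer) function is the ratio $Y(s)/X(s)$ of the transformed output to the transformed input; the pole diagram plots its poles in the complex $s$-plane.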

18.095 Mathematics Lecture Series

  • MWF1-2.30
  • R10.30-12 (2-131) or R1-2.30 (2-131)
  • 2-190

6 units (P/D/F graded)

Ten lectures by mathematics faculty members on interesting topics from both classical and modern mathematics. All lectures accessible to students with calculus background and an interest in mathematics. At each lecture, reading and exercises are assigned. Students prepare these for discussion in a weekly problem session.

Course Website

Lecture Schedule

Monday, January 6: Prof Haynes Miller, Knots and Numbers

Is a granny knot really different from a square knot? I'll explain how a mathematician addresses this question, and then focus attention on the related but less knotty theory of tangles. We'll end with a verification of a theorem of Horst Schubert by means of a square dance.

Wednesday, January 8: Prof David Vogan, Sixty-six miles per hour: breaking the commutative law by just a bit

One of the things that makes arithmetic easy is the commutative law. You've learned by now to manage without commutativity when you multiply matrices, where AB has very little to do with BA. I'll talk about places where AB is only a little different from BA, and some of the things that this lets you do.
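
As a quick reminder of how badly commutativity can fail for matrices (a standard example, not part of the abstract): with

$$ A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad B = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \qquad AB = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \neq \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} = BA. $$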

Friday, January 10: Chris Rackauckas, Scientific Machine Learning

Machine learning (ML) and scientific computing have previously lived in separate worlds, with one focusing on training neural networks and the other solving partial differential equations. A recently emerging discipline, scientific ML or physics-informed learning, has been bucking the trend by integrating elements of ML into scientific computing workflows. This course will be a project-based dive into scientific ML.

Monday, January 13: Prof Scott Sheffield, Tug of War and Infinity Laplacian

I will discuss several games whose analysis involves interesting mathematics. First, in the mathematical version of tug of war, play begins at a game position $x_0$. At each turn a coin is tossed, and the winner gets to move the game position to any point within $\epsilon$ units of the current point. (One can imagine the two players are holding a rope, and the "winner" of the coin toss is the one who gets a foothold and then has the chance to pull one step in any desired direction.) Play ends when the game position reaches a boundary set, and player two pays player one the value of a "payoff function" defined on the boundary set.

So... what is the optimal strategy? How much does player one expect to win (in the $\epsilon \to 0$ limit) when both players play optimally? We will answer these questions and also explain how this game is related to the "infinity Laplacian," to "optimal Lipschitz extension theory," and to a random-turn version of a game called Hex.
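
For readers who want to see the operator named in the title (standard background, not part of the abstract), the infinity Laplacian is the degenerate nonlinear operator

$$ \Delta_\infty u = \sum_{i,j} \partial_i u \,\partial_j u \,\partial^2_{ij} u, $$

and, roughly speaking, the value functions of $\epsilon$-tug of war converge as $\epsilon \to 0$ to the infinity-harmonic function, i.e. the solution of $\Delta_\infty u = 0$ with the payoff as boundary data.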

Wednesday, January 15: Prof Philippe Rigollet, Statistical and Computational Optimal Transport

Optimal transport is a fundamental concept in probability theory which defines a geometry on the space of measures via optimal couplings. Over the past few years, optimal transport has been applied to images, point clouds, and other objects which can be viewed as probability distributions, leading to breakneck advances across machine learning, computer vision, computer graphics, and computational biology. The increasing scale of modern data presents two challenges: to provide theoretical guarantees for the performance of optimal transport techniques, and to develop faster algorithms for solving optimal transport problems in practice. In this lecture, we investigate both the statistical and computational aspects of optimal transport and highlight an application to single cell genomic data analysis.
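
For concreteness (a standard formulation, not part of the abstract), the Kantorovich optimal transport problem between probability measures $\mu$ and $\nu$ with cost $c$ is

$$ \mathrm{OT}(\mu, \nu) = \inf_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, d\pi(x,y), $$

where $\Pi(\mu,\nu)$ is the set of couplings, i.e. joint distributions with marginals $\mu$ and $\nu$; taking $c(x,y) = \|x-y\|^p$ and a $p$-th root gives the Wasserstein distance $W_p$.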

Friday, January 17: Prof Jeremy Kepner, Mathematics of Big Data & Machine Learning

Big Data describes a new era in the digital age where the volume, velocity, and variety of data created across a wide range of fields (e.g., internet search, healthcare, finance, social media, defense, ...) is increasing at a rate well beyond our ability to analyze the data. Machine Learning has emerged as a powerful tool for transforming this data into usable information. Many technologies (e.g., spreadsheets, databases, graphs, linear algebra, deep neural networks, ...) have been developed to address these challenges. The common theme amongst these technologies is the need to store and operate on data as whole collections instead of as individual data elements. This talk describes the common mathematical foundation of these data collections (associative arrays) that applies across a wide range of applications and technologies. Associative arrays unify and simplify Big Data and Machine Learning. Understanding these mathematical foundations allows the student to see past the differences that lie on the surface of Big Data and Machine Learning applications and technologies and leverage their core mathematical similarities to solve the hardest Big Data and Machine Learning challenges.
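
To give a loose illustration of the "operate on whole collections" idea (a minimal Haskell sketch, not the lecture's actual formalism): one can model an associative array as a map from keys to values and define arithmetic a whole collection at a time rather than element by element.

    -- Illustrative sketch only (not from the talk): associative arrays
    -- as key -> value maps, combined collection-at-a-time.
    import qualified Data.Map as M

    type Assoc k v = M.Map k v

    -- "Addition" of two associative arrays: take the union of the keys,
    -- summing values wherever a key appears in both.
    addA :: (Ord k, Num v) => Assoc k v -> Assoc k v -> Assoc k v
    addA = M.unionWith (+)

    main :: IO ()
    main = do
      let a = M.fromList [("alice", 1), ("bob", 2)] :: Assoc String Int
          b = M.fromList [("bob", 3), ("carol", 4)]
      print (addA a b)  -- fromList [("alice",1),("bob",5),("carol",4)]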

Wednesday, January 22: Prof Elchanan Mossel, Mathematical aspects of voting

I will survey some mathematical aspects of voting, beginning with its paradoxical nature, observed by mathematicians in the 18th century (the early days of modern democracy), and moving on to very modern aspects such as gerrymandering, opinion campaigns, and polarization.

Friday, January 24: Prof Paul Seidel, The Lefschetz fixed point number

Solomon Lefschetz lost both hands in an industrial accident, successfully retrained from engineer to pure mathematician, and made crucial contributions to algebraic geometry and topology. Among them is the Lefschetz fixed point number. A fixed point of a continuous map f is a point x such that f(x) = x. The Lefschetz number is a topological quantity which counts (and thereby can be used to prove the existence of) fixed points. The lecture will introduce you to this number and its properties.
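
For reference (standard background, not part of the abstract): for a continuous map $f : X \to X$ on a compact triangulable space, the Lefschetz number is

$$ L(f) = \sum_{k \ge 0} (-1)^k \,\mathrm{tr}\big( f_* : H_k(X;\mathbb{Q}) \to H_k(X;\mathbb{Q}) \big), $$

and the Lefschetz fixed point theorem says that if $L(f) \neq 0$ then $f$ has a fixed point; for $f = \mathrm{id}$, $L(f)$ is the Euler characteristic of $X$.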

Monday, January 27: Prof Chenyang Xu, Elliptic function and elliptic curve

Elliptic functions were studied by many giants in the history of mathematics. To understand them, mathematicians developed a geometric theory of their geometric counterparts, called elliptic curves. This can arguably be called the starting point of algebraic geometry. We aim to give a survey of the basic theory.
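
For orientation (a standard fact, not part of the abstract): the Weierstrass elliptic function $\wp$ attached to a lattice satisfies the differential equation

$$ (\wp')^2 = 4\wp^3 - g_2\,\wp - g_3, $$

so $z \mapsto (\wp(z), \wp'(z))$ parametrizes the cubic curve $y^2 = 4x^3 - g_2 x - g_3$; this is the basic bridge between elliptic functions and elliptic curves that the survey refers to.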

Wednesday, January 29: Justin Solomon, Expanding the Scope and Dimensionality of Field Design

The design of vector fields and frame fields is a computational problem central to applications in scientific computing and computer graphics. Recent models and algorithms for this problem employ ideas from topology and discrete differential geometry to design fields that are smooth and obey special constraints that arise in applications like quadrilateral remeshing and physical simulation. But, many of these techniques only work for static design problems on two-dimensional surfaces. In this talk, I will describe our efforts to tackle field design problems with broader applications and in dimensions larger than two. Along the way, we will introduce modern tools from computational geometry processing, numerical differential geometry, and optimization.

18.S096 Special Subject in Mathematics: Applications of Scientific Machine Learning

  • Dr. Christopher Rackauckas
  • Jan 6 - 31
  • MTWR 1-3pm
  • 2-139

6 units

Machine learning and scientific computing have previously lived in separate worlds, with one focusing on training neural networks for applications like image processing and the other solving partial differential equations defined in climate models. However, a recently emerging discipline, called scientific machine learning or physics-informed learning, has been bucking the trend by integrating elements of machine learning into scientific computing workflows. These recent advances enhance the toolboxes of both scientific computing and machine learning practitioners, accelerating previous workflows and yielding data-efficient learning techniques ("machine learning with small data").

This course will be a project-based dive into scientific machine learning, going directly to the computational tools to learn the practical aspects of "doing" scientific machine learning. Students will get hands-on experience building programs which:

  • Train data-efficient physics-informed neural networks (see the sketch after this list)
  • Accelerate scientific models using surrogate methods like neural networks
  • Solve hundred-dimensional partial differential equations using recurrent neural networks
  • Solve classical machine learning problems like image classification more efficiently with neural ordinary differential equations
  • Use machine learning and data-driven techniques to automatically discover physical models from data
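
To make the first bullet concrete (a generic sketch of the idea, not the course's specific formulation): for an ODE $u'(t) = f(u(t), t)$, a physics-informed neural network $u_\theta$ is trained by minimizing a loss that penalizes violation of the equation at collocation points $t_i$ together with mismatch against whatever data is available,

$$ \mathcal{L}(\theta) = \sum_i \Big\| \frac{du_\theta}{dt}(t_i) - f\big(u_\theta(t_i), t_i\big) \Big\|^2 + \lambda \sum_j \big\| u_\theta(t_j) - u^{\mathrm{data}}_j \big\|^2, $$

with the derivative of the network computed by automatic differentiation.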

The class will culminate in a project where students apply these techniques to a scientific problem of their choosing. This project may be tied to one's ongoing research interests (this is recommended!).

Note that the difference from the recent 18.337 (https://github.com/mitmath/18337) is that 18.337 focuses on the mathematical and computational underpinnings of how software frameworks train scientific machine learning algorithms. In contrast, this course will focus on the applications of scientific machine learning, looking at the current set of methodologies from the literature and learning how to train these against scientific data using existing software frameworks. Consult https://mitmath.github.io/18337/lecture15/diffeq_machine_learning for a sneak preview of the problems one will get experience solving.

18.S097 Special Subject in Mathematics: Programming with Categories

  • Drs. David Spivak and Brendan Fong
  • Jan 7 - 31
  • MTWRF 2-3
  • 4-163

4 units (P/D/F graded)

In this course we explain how category theory—a branch of mathematics known for its ability to organize the key abstractions that structure much of the mathematical universe—has become useful for writing elegant and maintainable code. In particular, we'll use examples from the Haskell programming language to motivate category-theoretic constructs, and then explain these constructs from a more abstract and inclusive viewpoint. Hands-on programming exercises will be used to demonstrate categorical ideas like "the universal property of products" in working Haskell code. A rough list of topics includes:

  1. Sets, types, categories, functors, natural transformations
  2. Universal constructions and associated data types
  3. Adjunctions and cartesian closed categories
  4. Algebras, catamorphisms, anamorphisms
  5. Monads, comonads, Kleisli arrows
  6. Monoids, monoidal categories, lax monoidal functors, applicatives
  7. Profunctors, (co)ends, optics

We will assume no background knowledge on the part of the student, starting from scratch on both the programming and the mathematics. The course website can be found at http://brendanfong.com/programmingcats.html
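
As an illustration of the kind of exercise mentioned above (a minimal sketch, not taken from the course materials), here is the universal property of products written out in Haskell: any pair of maps out of a common source factors through the product via its two projections.

    -- Minimal sketch (not from the course materials): the universal
    -- property of the product type (a, b).
    module ProductSketch where

    -- the two projections out of the product type
    proj1 :: (a, b) -> a
    proj1 (x, _) = x

    proj2 :: (a, b) -> b
    proj2 (_, y) = y

    -- the mediating morphism: given f :: c -> a and g :: c -> b,
    -- pairing f g is the unique map c -> (a, b) satisfying
    --   proj1 . pairing f g == f   and   proj2 . pairing f g == g
    pairing :: (c -> a) -> (c -> b) -> c -> (a, b)
    pairing f g x = (f x, g x)

    -- tiny usage example: pairing length null "hello" == (5, False)
    example :: (Int, Bool)
    example = pairing length null "hello"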

18.S997 Special Subject in Mathematics: Introduction to Discrete Geometry

  • Dr. Zilin Jiang
  • Jan 21 - 31
  • TWRF 10-12
  • 2-139

4 units

Questions in Discrete Geometry typically involve simple geometric objects such as points, lines, circles, and planes. More complicated objects such as convex polytopes are investigated too. In this course, we provide an introductory tour of several classical results in Discrete Geometry, such as Carathéodory's theorem, Helly's theorem, and Tverberg's theorem, their colorful/topological generalizations, and applications in combinatorics. For the most part the material is not very new, but the presentation has been strongly influenced by recent developments. Assumes prior knowledge of linear algebra and point-set topology.
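
To give the flavor (a standard statement, not part of the course description), Helly's theorem says: if $X_1, \dots, X_n$ are convex sets in $\mathbb{R}^d$ with $n \ge d+1$, and every $d+1$ of them have a point in common, then all $n$ of them have a point in common.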