
IAP 2019 Classes

Non-credit activities and classes:

Check out the IAP pages.

For-credit subjects:

Check out the course catalog. You can use the Subject Search functionality to limit the search to IAP listings, or find Math's IAP offerings there. Our main offerings in Mathematics are:

18.02A Calculus

Prof. Davesh Maulik and staff

Dates: Jan. 7 - Feb. 1

Lectures: MTWRF12
Recitation: TR10-11.30 (2-132, 2-136, 2-142) or TR2-3.30 (2-132, 2-136, 2-146)

Room: 54-100

This is the second half of 18.02A and can be taken only by students who took the first half in the fall term; it covers the remaining material in 18.02.

18.031 System Functions and the Laplace Transform

Dr. Philip Pearce

Dates: Jan. 14 - Feb. 1

Lectures: MWF 10am-12noon
(with an extra meeting Tues Jan 22, in place of the MLK Day holiday on the 21st)

Room: 2-131

Studies basic continuous control theory as well as representation of functions in the complex frequency domain. Covers generalized functions, unit impulse response, and convolution; and Laplace transform, system (or transfer) function, and the pole diagram. Includes examples from mechanical and electrical engineering.
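As a concrete sketch of the pole diagram and unit impulse response (using an illustrative damped-oscillator system chosen here, not an example taken from the course):

```python
import numpy as np

# Illustrative system: a damped oscillator x'' + 2x' + 5x = f(t),
# whose system (transfer) function is H(s) = 1 / (s^2 + 2s + 5).
coeffs = [1.0, 2.0, 5.0]        # denominator of H(s)
poles = np.roots(coeffs)        # pole diagram: poles at -1 +/- 2j

# Partial fractions give the unit impulse response w(t) = (1/2) e^{-t} sin(2t):
t = np.linspace(0.0, 10.0, 1001)
w = 0.5 * np.exp(-t) * np.sin(2 * t)
```

The real part of the poles (-1) sets the decay rate of w(t) and the imaginary part (±2) sets the oscillation frequency — the link between the pole diagram and time-domain behavior that the subject develops.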

18.095 Mathematics Lecture Series

Lecture: MWF1-2.30
Recitation: R10.30-12 (2-131) or R1-2.30 (2-131)

Room: 2-190

Ten lectures by mathematics faculty members on interesting topics from both classical and modern mathematics. All lectures accessible to students with calculus background and an interest in mathematics. At each lecture, reading and exercises are assigned. Students prepare these for discussion in a weekly problem session.

Here is a tentative list of speakers and dates:

  • Monday, January 7: Prof. Vadim Gorin

    Persistent random walk and the telegraph equation

    Abstract: We will discuss how the evolution of a random walker on the square grid leads to a second order partial differential equation known as the telegraph equation.

  • Wednesday, January 9: Prof. Peter Shor


  • Friday, January 11: Prof. Kasso A. Okoudjou

    Calculus on fractals


  • Monday, January 14: Prof. Gilbert Strang

    The Functions of Deep Learning

    Abstract: We show how the layered neural net architecture of deep learning produces continuous piecewise linear functions as approximations to the unknown map from input to output. A combinatorial formula counts the number of linear pieces in a typical learning function.

  • Wednesday, January 16: Prof. Steven Johnson


  • Friday, January 18: Prof. John Bush

    Surface tension

    Abstract: Surface tension is a property of fluid interfaces that leads to myriad subtle and striking effects in nature and technology. We describe a number of surface-tension-dominated systems and how to rationalize their behavior via mathematical modeling. Particular attention is given to the influence of surface tension on biological systems.

  • Wednesday, January 23: Dr. Jeremy Kepner

    Mathematics of Big Data & Machine Learning

    Abstract: Big Data describes a new era in the digital age where the volume, velocity, and variety of data created across a wide range of fields (e.g., internet search, healthcare, finance, social media, defense, ...) is increasing at a rate well beyond our ability to analyze the data. Machine Learning has emerged as a powerful tool for transforming this data into usable information. Many technologies (e.g., spreadsheets, databases, graphs, linear algebra, deep neural networks, ...) have been developed to address these challenges. The common theme amongst these technologies is the need to store and operate on data as whole collections instead of as individual data elements. This talk describes the common mathematical foundation of these data collections (associative arrays) that applies across a wide range of applications and technologies. Associative arrays unify and simplify Big Data and Machine Learning. Understanding these mathematical foundations allows the student to see past the differences that lie on the surface of Big Data and Machine Learning applications and technologies and leverage their core mathematical similarities to solve the hardest Big Data and Machine Learning challenges. Supplementary lectures, text, and software are available online.

  • Friday, January 25: Prof. Justin Solomon

    Transport, Geometry, and Computation

    Abstract: Optimal transport is a mathematical tool that links probability to geometry. In this talk, we will show how transport can be brought from theory to practice, with applications in machine learning and computer graphics.

  • Monday, January 28: Prof. Scott Sheffield


  • Wednesday, January 30: Dr. Chris Rackauckas


18.S097 Special Subject in Mathematics: Applied Category Theory

Drs. David Spivak and Brendan Fong

Dates: Jan 14 - Feb 1

Lecture: MTWRF 2-3

Room: 2-142

Category theory is a relatively new branch of mathematics that has transformed much of pure math research. The technical advance is that category theory provides a framework in which to organize formal systems and by which to translate between them, allowing one to transfer knowledge from one field to another. But this same organizational framework also has many compelling examples outside of pure math.

In this course we provide an introductory tour of category theory, with a viewpoint toward modelling real-world phenomena. The course will begin with the notion of poset, and introduce central categorical ideas such as functor, natural transformation, (co)limit, adjunction, the adjoint functor theorem, and the Yoneda lemma in that context. We'll then move to enriched categories, profunctors, monoidal categories, operads, and toposes. Applications to resource theory, databases, codesign, signal flow graphs, and dynamical systems will help ground these notions, providing motivation and a touchstone for intuition. The aim of the course is to provide an overview of the breadth of research in applied category theory, so as to invite further study.
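As a first taste of the poset-as-category viewpoint (a toy illustration, not course material): a poset is a category with at most one morphism between any two objects, and a functor between posets is exactly a monotone map — something we can check by brute force on a finite piece.

```python
def is_monotone(f, elems, leq_src, leq_tgt):
    """Check that f preserves order: a <= b in the source implies f(a) <= f(b)."""
    return all(leq_tgt(f(a), f(b))
               for a in elems for b in elems if leq_src(a, b))

divides = lambda a, b: b % a == 0
elems = range(1, 30)

# The identity is a functor from (N, divides) to (N, <=): a divisor is smaller.
ok = is_monotone(lambda n: n, elems, divides, lambda a, b: a <= b)
# Negation reverses order, so it is not monotone, hence not a functor.
bad = is_monotone(lambda n: -n, elems, divides, lambda a, b: a <= b)
```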