18.095 - Mathematics Lecture Series, IAP 2018

This course consists of independent lectures given by faculty members of the mathematics department at MIT.
The lecturers assign homework problems related to the material presented.

Organizer and contact: Prof. Alan Edelman

Teaching assistant: Umut Varolgunes - umutvgzzzzzzz@mit.edu (z's removed)

Info

Lectures are held MWF 1-2:30 in room 2-190.

Homework is due each Friday by 4pm and must be turned in at the pset boxes next to 4-174 (to the right of HQ). P-sets will be posted on this site shortly after each lecture.

Recitations are held every Thursday at 10:30am and 1pm in room 2-131 (both sessions cover the same material).

Office hours are W 3-4pm, in 2-238.

This course is offered with the P/D/F grading option. To receive a passing grade, we ask that you attend lectures and put forth an effort on the problem sets.
Homework will be collected every Friday, graded, and returned to you the following week.
You will be able to check your grades and registration status at the Stellar website for the class.

Class schedule

Lecture 1. M, Jan 8, Heather Macbeth: Tangent developables
Abstract:
A piece of paper can be twisted into a cone, or a cylinder. Euler, in 1772, showed that there is a third kind of surface that can be made in this way: the "tangent developables." I will explain what tangent developables are, and show how to find the pieces of paper that twist into them. Bring scissors!
- Notes from the class
- Tangent surfaces of some curves: helix (image, Wolfram CDF Player interactive demonstration), twisted cubic (image, Java plugin interactive demonstration), rational curve of degree four (image, Java plugin interactive demonstration), unspecified curve (video).
- Homework (due Jan 12)
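For readers who want to experiment beyond the linked demonstrations, here is a short Python sketch (parameter values and function names are illustrative, not from the lecture) that samples points on the tangent developable of a helix, i.e. the surface gamma(t) + u * gamma'(t) swept out by the tangent lines of the curve:

```python
import numpy as np

def helix(t, r=1.0, c=0.2):
    """Helix gamma(t) = (r cos t, r sin t, c t)."""
    return np.array([r * np.cos(t), r * np.sin(t), c * t])

def helix_tangent(t, r=1.0, c=0.2):
    """Derivative gamma'(t) of the helix."""
    return np.array([-r * np.sin(t), r * np.cos(t), c])

def tangent_developable(t, u, r=1.0, c=0.2):
    """Point gamma(t) + u * gamma'(t) on the tangent developable."""
    return helix(t, r, c) + u * helix_tangent(t, r, c)

# Sample a patch of the surface (u > 0 gives one sheet of the developable).
ts = np.linspace(0.0, 4.0 * np.pi, 60)
us = np.linspace(0.01, 1.0, 20)
patch = np.array([[tangent_developable(t, u) for u in us] for t in ts])
print(patch.shape)  # (60, 20, 3)
```

Feeding `patch` to any 3D surface plotter reproduces pictures like the linked helix demonstration.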

Lecture 2. W, Jan 10, Gilbert Strang: Linear algebra and neural networks
- Lecture notes
- Playing with neural nets
- Slides
- Homework (due Jan 12)

Lecture 3. F, Jan 12, Steven Johnson: The brachistochrone problem and the calculus of variations
- Computational examples and exercises
- Notes from the class
- Homework is the two problems at the end of the linked page (due Jan 19)
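The brachistochrone's answer, the cycloid, can be checked numerically. The sketch below (constants chosen for illustration, not taken from the course materials) compares the closed-form descent time along a cycloid x = a(theta - sin theta), y = a(1 - cos theta), namely T = theta_1 * sqrt(a/g), with the descent time along the straight chord to the same endpoint:

```python
import math

g = 9.8   # gravitational acceleration (m/s^2)
a = 1.0   # cycloid radius parameter (illustrative)

# Endpoint: the lowest point of the cycloid, reached at theta = pi,
# i.e. (x1, y1) = (pi * a, 2 * a), with y measured downward.
x1, y1 = math.pi * a, 2.0 * a

# Descent time along the cycloid: T = theta_1 * sqrt(a / g), theta_1 = pi.
t_cycloid = math.pi * math.sqrt(a / g)

# Descent time along the straight chord, sliding from rest:
# uniform acceleration g * (y1 / L) along a chord of length L.
L = math.hypot(x1, y1)
t_line = math.sqrt(2.0 * L**2 / (g * y1))

print(t_cycloid, t_line)  # the cycloid is faster
```

The chord formula comes from constant acceleration g sin(angle) = g * y1 / L over distance L; the cycloid formula follows from ds/v = sqrt(a/g) d(theta) along the curve.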

Lecture 4. W, Jan 17, Jeremy Kepner: Mathematics of Big Data & Machine Learning
Abstract:
Big Data describes a new era in the digital age where the volume, velocity, and variety of data created across a wide range of fields (e.g., internet search, healthcare, finance, social media, defense, ...) is increasing at a rate well beyond our ability to analyze the data. Machine Learning has emerged as a powerful tool for transforming this data into usable information. Many technologies (e.g., spreadsheets, databases, graphs, linear algebra, deep neural networks, ...) have been developed to address these challenges. The common theme amongst these technologies is the need to store and operate on data as whole collections instead of as individual data elements. This lecture describes the common mathematical foundation of these data collections (associative arrays) that apply across a wide range of applications and technologies. Associative arrays unify and simplify Big Data and Machine Learning. Understanding these mathematical foundations allows the student to see past the differences that lie on the surface of Big Data and Machine Learning applications and technologies and leverage their core mathematical similarities to solve the hardest Big Data and Machine Learning challenges.
- Homework is to read about backpropagation and write a page-long summary of what you understand - use of actual equations is preferred, and you can restrict yourself to ReLU. (due Jan 19)
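As a starting point for the homework, here is a minimal sketch (the shapes, seed, and names are illustrative, not the lecture's notation) of a one-hidden-layer ReLU network whose gradients are computed by backpropagation and checked against a numerical derivative:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny one-hidden-layer network with ReLU and squared-error loss.
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))
x = rng.normal(size=(1, 3))
y = np.array([[1.0]])

def loss(W1, W2):
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    return 0.5 * float(((h @ W2 - y) ** 2).sum())

# Forward pass, keeping intermediates for the backward pass.
z = x @ W1
h = np.maximum(z, 0.0)
err = h @ W2 - y

# Backpropagation: apply the chain rule layer by layer.
dW2 = h.T @ err
dh = err @ W2.T
dz = dh * (z > 0)                        # ReLU derivative is 0 or 1
dW1 = x.T @ dz

# Check one entry of dW1 against a forward-difference derivative.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
numeric = (loss(W1p, W2) - loss(W1, W2)) / eps
print(abs(numeric - dW1[0, 0]))          # should be tiny
```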

Lecture 5. F, Jan 19, Dan Stroock: Some thoughts about taking square roots
Abstract:
The traditional algorithm for computing square roots is cumbersome. In this lecture, I will discuss an alternative approach, one that involves some elementary number theory.
- Lecture notes
- Homework is the exercise at the end of the notes (due Jan 26)
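The lecture's number-theoretic alternative is in the notes; for comparison, here is a sketch of the familiar iterative approach (Heron's method, a special case of Newton's method), with an illustrative stopping tolerance:

```python
def heron_sqrt(a, tol=1e-12):
    """Approximate sqrt(a), a > 0, by Heron's iteration x <- (x + a/x) / 2."""
    x = a if a >= 1 else 1.0          # any positive starting guess works
    while abs(x * x - a) > tol * a:
        x = 0.5 * (x + a / x)
    return x

print(heron_sqrt(2.0))  # ~1.41421356...
```

Each iteration roughly doubles the number of correct digits, which is what makes the classical digit-by-digit algorithm look cumbersome by comparison.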

Lecture 6. M, Jan 22, Juan Pablo Vielma: Modeling and Solving Discrete Optimization Problems in Practice
Abstract:
Discrete optimization can model a wide range of problems in business, science and engineering. Some discrete optimization problems can be very computationally challenging in theory. However, solutions with guaranteed quality can sometimes be found quickly in practice using state-of-the-art solvers. In this lecture we will see how the power of these solvers can be easily accessed through the Julia-based modeling language JuMP. We will also give an overview of the mathematical concepts and tools that are the basis for the effectiveness of these solvers.
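JuMP itself is Julia-based; as a language-agnostic illustration of what a discrete optimization problem looks like, here is a toy 0/1 knapsack (the data are invented for illustration) solved by exhaustive search in Python. Real solvers handle far larger instances with techniques such as branch-and-bound and cutting planes rather than enumeration:

```python
from itertools import product

# Toy 0/1 knapsack: maximize total value subject to a weight capacity.
values = [10, 13, 7, 8]
weights = [3, 4, 2, 3]
capacity = 6

best_value, best_choice = 0, None
for choice in product([0, 1], repeat=len(values)):   # all 2^n assignments
    weight = sum(w * c for w, c in zip(weights, choice))
    value = sum(v * c for v, c in zip(values, choice))
    if weight <= capacity and value > best_value:
        best_value, best_choice = value, choice

print(best_value, best_choice)
```

The exponential 2^n enumeration is exactly the "challenging in theory" part; the lecture is about how solvers avoid it in practice.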

Lecture 7. W, Jan 24, Thomas Beck: Eigenvalues of the Laplacian on domains
Abstract:
We will define eigenvalues and eigenfunctions of the Laplacian on intervals and two dimensional domains, and see in particular why they play a role in understanding the vibrations of an elastic string or membrane. In certain cases these eigenfunctions can be computed explicitly, and we will see to what extent the eigenvalues (vibrating modes) determine the membrane itself.
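For the simplest explicit case: on the interval [0, 1] with fixed ends (Dirichlet conditions), the eigenfunctions of -d^2/dx^2 are sin(n*pi*x) with eigenvalues (n*pi)^2. A quick numerical sanity check (an illustrative sketch, not from the lecture) using a finite-difference Laplacian:

```python
import numpy as np

# Discretize -u'' on [0, 1] at N-1 interior points with Dirichlet conditions.
N = 500
h = 1.0 / N
main = 2.0 * np.ones(N - 1) / h**2
off = -1.0 * np.ones(N - 2) / h**2
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

numeric = np.sort(np.linalg.eigvalsh(A))[:3]
exact = np.array([(n * np.pi) ** 2 for n in (1, 2, 3)])
print(numeric, exact)  # numeric values close to (n*pi)^2 for n = 1, 2, 3
```

The agreement improves as h shrinks, since the finite-difference eigenvalues are (4/h^2) * sin(n*pi*h/2)^2, which tend to (n*pi)^2.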

Lecture 8. F, Jan 26, Scott Sheffield

Lecture 9. M, Jan 29, Philippe Rigollet: The Birthday Problem(s)
Abstract:
What is the probability that two students in this class are born on the same day (independently of the year)? This is the birthday problem. We are going to answer this question based on the assumption that the probability of being born on a given day is the same for all 365 days of the year. What if this assumption is not true? How many students does it take to test whether it holds? We'll also give an algorithm that does this. The class will use elementary probability.
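Under the equal-likelihood assumption the answer has a simple closed form: the chance that k people all have distinct birthdays is the product of (365 - i)/365 for i = 0, ..., k-1, and the collision probability is one minus that. A short sketch:

```python
def birthday_collision_prob(k, days=365):
    """Probability that at least two of k people share a birthday,
    assuming all `days` birthdays are equally likely and independent."""
    p_all_distinct = 1.0
    for i in range(k):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

print(birthday_collision_prob(23))  # ~0.507
```

Already with 23 people the probability of a shared birthday exceeds one half.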

Lecture 10. W, Jan 31, Joern Dunkel