Readings and topics
References
- The class textbook: Numerical Methods by Anne Greenbaum and Tim Chartier
- A fun reference: Insight Through Computing by Charles Van Loan and K.-Y. Daisy Fan
Lecture 40
Newton's method, the secant method, the fixed point form of
a nonlinear equation, and a review before the final midterm.
- Reading
- Chapter 4 Sections 4.3, 4.4.3, 4.5
- Handout
- Nonlinear equations (Pages 3, 4)
- Slides
- Lecture 40
- Julia
- Lecture 40 (html)
(ipynb)
- Notes
- Lecture 40
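The Newton iteration from this lecture can be sketched in a few lines of Julia; the test function and starting point below are illustrative, not from the course materials:

```julia
# Newton's method for f(x) = 0: x_{k+1} = x_k - f(x_k) / f'(x_k).
function newton(f, fp, x0; tol=1e-12, maxit=50)
    x = x0
    for _ in 1:maxit
        dx = f(x) / fp(x)      # Newton step
        x -= dx
        abs(dx) <= tol && return x
    end
    return x                   # may not have converged after maxit steps
end

# Example: the positive root of x^2 - 2 is sqrt(2).
root = newton(x -> x^2 - 2, x -> 2x, 1.0)
```

The secant method replaces f'(x_k) with the difference quotient through the last two iterates, trading the derivative evaluation for a slightly slower (superlinear) convergence rate.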
Lecture 39
Introduction to nonlinear equations
- Reading
- Chapter 4 Sections 4.1, 4.2
- Handout
- Nonlinear equations
- Slides
- Lecture 39
- Julia
- Lecture 39 (Bisection) (html)
(ipynb)
- Notes
- Lecture 39
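The bisection method from this lecture fits in a few lines of Julia; the test equation cos(x) = x is illustrative. Bisection assumes f(a) and f(b) have opposite signs:

```julia
# Bisection: repeatedly halve an interval [a, b] with sign(f(a)) != sign(f(b)).
function bisection(f, a, b; tol=1e-10)
    fa = f(a)
    while b - a > tol
        m = (a + b) / 2
        fm = f(m)
        if sign(fm) == sign(fa)
            a, fa = m, fm      # root is in the right half
        else
            b = m              # root is in the left half
        end
    end
    return (a + b) / 2
end

root = bisection(x -> cos(x) - x, 0.0, 1.0)
```

Each step halves the bracket, so the error decreases by a guaranteed factor of 2 per function evaluation.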
Lecture 38
Stiff equations, implicit methods, and BVPs.
- Reading
- Chapter 11 Section 11.4
- Chapter 13 Section 13.1
- Handout
- Intro to BVPs
- BVPs questions
- Julia
- Lecture 38 (Backwards Euler) (html)
(ipynb)
- Lecture 38 (BVPs) (html)
(ipynb)
- Slides
- Lecture 38
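For the linear test problem y' = λy, the backward (implicit) Euler step can be solved in closed form, which makes the stability story easy to see; the parameter values below are illustrative:

```julia
# Backward Euler on y' = λ*y: y_{k+1} = y_k + h*λ*y_{k+1}, so
# y_{k+1} = y_k / (1 - h*λ). For λ < 0 this decays for ANY step size h,
# unlike forward Euler, which needs |1 + h*λ| <= 1.
function backward_euler(λ, y0, h, T)
    y = y0
    for _ in 1:round(Int, T/h)
        y = y / (1 - h*λ)
    end
    return y
end

y = backward_euler(-50.0, 1.0, 0.1, 1.0)   # stiff: forward Euler blows up here
```

For a nonlinear f, each implicit step instead requires solving a (generally nonlinear) equation for y_{k+1}, e.g. with Newton's method.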
Lecture 37
Absolute stability, consistency, stability, convergence, and local truncation error.
- Reading
- Chapter 11 Section 11.4
- Handout
- Intro to ODEs (Pages 3-4)
- Slides
- Lecture 37
- Julia
- Lecture 37 (html)
(ipynb)
- Notes
- Lecture 37
Lecture 36
Solving ODEs on a computer.
- Reading
- Chapter 11 Sections 11.1, 11.2
- Slides
- Lecture 36
- Handout
- Intro to ODEs (Pages 1-2)
- Julia
- Lecture 36 (html)
(ipynb)
- Lecture 36 Example ODE (html)
(ipynb)
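The basic time-stepping idea from this lecture is forward Euler; here is a minimal sketch in Julia, with an illustrative test problem whose exact solution is known:

```julia
# Forward Euler for y' = f(t, y): y_{k+1} = y_k + h*f(t_k, y_k).
function euler(f, t0, y0, h, n)
    t, y = t0, y0
    for _ in 1:n
        y += h * f(t, y)
        t += h
    end
    return y
end

# y' = -y, y(0) = 1 has exact solution exp(-t); step to t = 1.
yT = euler((t, y) -> -y, 0.0, 1.0, 0.01, 100)
```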
Lecture 35
We saw the error analysis of a Newton-Cotes formula, and also looked
at the error analysis of the piecewise trapezoidal rule. I mentioned
Simpson's rule (the three-point Newton-Cotes rule); see the book for
more information.
- Reading
- Chapter 10 Sections 10.1, 10.3
- Slides
- Lecture 35
- Notes
- Lecture 35
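The piecewise (composite) trapezoidal rule analyzed in this lecture can be sketched as follows; the integrand is illustrative:

```julia
# Composite trapezoidal rule on [a, b] with n equal subintervals:
# interior nodes get weight h, the two endpoints get weight h/2.
function trapezoid(f, a, b, n)
    h = (b - a) / n
    s = (f(a) + f(b)) / 2
    for i in 1:n-1
        s += f(a + i*h)
    end
    return h * s
end

I = trapezoid(sin, 0.0, pi, 1000)   # exact value is 2
```

Consistent with the error analysis, the error is O(h^2): doubling n should cut the error by roughly a factor of 4.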
Lecture 34
(QUIZ!) We continued talking about quadrature rules and looked at deriving
Newton-Cotes formulas.
- Reading
- Chapter 10 Sections 10.1, 10.3
- Notes
- Lecture 34
Lecture 33
An introduction to numerical integration, the method of undetermined
coefficients and quadrature.
- Reading
- Chapter 10 Sections 10.1, 10.3
- Slides
- Lecture 33
- Notes
- Lecture 33
Lecture 32
On the ill-conditioning of numerical differentiation, high-dimensional
polynomial interpolation, and how to analytically differentiate
expressions for polynomials.
- Reading
- Chapters 8 and 9
- Slides
- Lecture 32
- Julia
- Lecture 32 (html)
(ipynb)
- Notes
- Lecture 32
Lecture 31
The TA went over the last exam and quiz.
Lecture 30
Error combos in numerical differentiation and Richardson extrapolation.
- Slides
- Lecture 30
- Reading
- Chapter 9 Section 9.1, 9.2
- Julia
- Lecture 30 (html)
(ipynb)
- Notes
- Lecture 30
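Richardson extrapolation from this lecture, applied to the forward difference; the point and function are illustrative:

```julia
# Forward difference D(h) = (f(x+h) - f(x))/h approximates f'(x) with O(h)
# truncation error. Since D(h) = f'(x) + c*h + O(h^2), the combination
# 2*D(h/2) - D(h) cancels the leading error term and is O(h^2) accurate.
D(f, x, h) = (f(x + h) - f(x)) / h

f, x, h = sin, 1.0, 0.1
d1 = D(f, x, h)            # first-order estimate of cos(1)
r  = 2*D(f, x, h/2) - d1   # Richardson-extrapolated, second-order estimate
```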
Lecture 29
We saw truncation error for numerical differentiation and how to derive
an accuracy estimate using Taylor series.
- Reading
- Chapter 9 Section 9.1
- Slides
- Lecture 29
- Notes
- Lecture 29
Lecture 28
The quiz! Then we saw the Barycentric form of the
Lagrange interpolant, as well as some error analysis.
Finally, we briefly discussed piecewise polynomial interpolation.
- Reading
- Chapter 8 Section 8.5
- Slides
- Lecture 28
- Julia
- Lecture 28 (html)
(ipynb)
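The barycentric form of the Lagrange interpolant from this lecture can be sketched as below; the nodes and data are illustrative (samples of a polynomial the interpolant must reproduce exactly):

```julia
# Barycentric Lagrange interpolation: precompute the weights
# w_j = 1 / prod_{k != j} (x_j - x_k), then evaluate
# p(t) = sum_j (w_j/(t - x_j)) y_j  /  sum_j (w_j/(t - x_j)).
function barycentric(xs, ys, t)
    w = [1 / prod(xs[j] - xs[k] for k in eachindex(xs) if k != j)
         for j in eachindex(xs)]
    num = den = 0.0
    for j in eachindex(xs)
        xs[j] == t && return ys[j]   # t is a node: return the data value
        c = w[j] / (t - xs[j])
        num += c * ys[j]
        den += c
    end
    return num / den
end

xs = [0.0, 1.0, 2.0, 3.0]
ys = xs.^2                       # samples of x^2; the interpolant is x^2
p = barycentric(xs, ys, 1.5)     # should be 1.5^2 = 2.25
```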
Lecture 27
Today, we saw Lagrange interpolation.
- Reading
- Chapter 8 Section 8.1, 8.2
- Handout
- Polynomial Forms
- Slides
- Lecture 27
- Notes
- Lecture 27
- Julia
- Lecture 27 (html)
(ipynb)
Lecture 26
Introduction to applied mathematics, polynomial interpolation, and ApproxFun.
- Reading
- Chapter 8 Section 8.1, 8.2
- Handout
- Polynomial Approximation Intro
- Slides
- Lecture 26
- Julia
- Lecture 26 (html)
(ipynb)
- Lecture 26 (ApproxFun) (html)
(ipynb)
- Software
- Chebfun and its Julia counterpart
- ApproxFun
Lecture 25
The midterm
Lecture 24
We saw the power method convergence rate, the block power method, and then
reviewed for the midterm.
- Slides
- Lecture 24
- Notes
- Lecture 24
Lecture 23
We started covering eigenvalues of matrices and saw the basis for
the power method..
- Reading
- Chapter 12 Section 12.1 (Eigenvalue methods)
- Handouts
- Eigenvalues (Pages 1-2)
- Julia
- Lecture 23 Eigenvectors of a string (html)
(ipynb)
- Slides
- Lecture 23
- Notes
- Lecture 23
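A minimal sketch of the power method from this lecture; the matrix is illustrative:

```julia
using LinearAlgebra

# Power method: repeatedly apply A and normalize. The iterate converges to
# the dominant eigenvector (when the largest eigenvalue is simple and the
# start is not orthogonal to it), and the Rayleigh quotient estimates the
# corresponding eigenvalue.
function power_method(A, x0; iters=100)
    x = x0 / norm(x0)
    λ = 0.0
    for _ in 1:iters
        y = A * x
        λ = x' * y          # Rayleigh quotient estimate of the eigenvalue
        x = y / norm(y)
    end
    return λ, x
end

A = [2.0 1.0; 1.0 3.0]
λ, v = power_method(A, [1.0, 0.0])
# the largest eigenvalue of A is (5 + sqrt(5))/2 ≈ 3.618
```

The convergence rate, covered in Lecture 24, is governed by the ratio |λ₂/λ₁| of the two largest eigenvalues.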
Lecture 22
We finished our coverage of condition numbers for linear systems, saw
matrix norms, and got another brief introduction to iterative methods. See
the handout for much more detail.
- Reading
- Chapter 7 Section 7.4 (Conditioning)
- Chapter 12 Section 12.2 (Iterative methods)
- Julia
- Lecture 21 Conditioning and Stability (html)
(ipynb)
- Handouts
- Iterative methods
- Slides
- Lecture 22
- Notes
- Lecture 22
Lecture 21
We finished up with least squares using the QR decomposition of a matrix. (This
involved a quick study of orthogonal matrices and how they generalize the
notion of a rotation.) We had a quick aside about iterative methods for linear
systems that covered what you need for the homework, then we dove
into conditioning and stability.
- Reading
- Chapter 7 Section 7.6 (Least squares)
- Gram-Schmidt process
- Chapter 7 Section 7.4 (Conditioning of linear systems)
- Handouts
- Iterative Methods
- Conditioning and Stability
- Julia
- Lecture 21 Conditioning and Stability (html)
(ipynb)
- Slides
- Lecture 21
- Notes
- Lecture 21
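Solving a least squares problem via QR, as in this lecture; the data below is an illustrative line fit with an exact answer:

```julia
using LinearAlgebra

# min_x ||Ax - b||: factor A = QR with Q orthogonal, then solve R*x = Q'*b.
A = [1.0 0.0; 1.0 1.0; 1.0 2.0; 1.0 3.0]   # columns: intercept, slope
b = [1.0, 3.0, 5.0, 7.0]                   # exactly b = 1 + 2t at t = 0:3
F = qr(A)
x = F \ b        # uses the factorization: solves R*x = Q'*b
```

Because Q is orthogonal it preserves norms, which is why the QR route is better conditioned than forming the normal equations A'A x = A'b.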
Lecture 20
Today, we continued our coverage of least squares problems and the geometry
of a simple example, and saw the QR factorization of a matrix and its
relationship to Gram-Schmidt.
- Reading
- Chapter 7 Section 7.6 (Least squares)
- Gram-Schmidt process
- Slides
- Lecture 20
- Handouts
- Intro to Least Squares
Lecture 19
Today we covered PageRank as a linear system and least squares problems.
- Reading
- Chapter 7 Section 7.6
- Slides
- Lecture 19
- Julia
- Lecture 19 Galileo's problem (html)
(ipynb)
- Handouts
- PageRank as a linear system
- Intro to Least Squares
Lecture 18
We counted the number of operations in Gaussian Elimination,
reviewed the midterm, and saw an important property of LU. See
the slides and the videos online.
- Reading
- Chapter 7 Section 7.3
- Slides
- Lecture 18
- Handouts
- Flops in LU
Lecture 17
We saw code for Gaussian Elimination with LU and Pivoting.
- Reading
- Chapter 7 Section 7.3
- Slides
- Lecture 17
- Julia
- Lecture 17 (html)
(ipynb)
- Notes
- Lecture 17
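A minimal sketch of Gaussian elimination with partial pivoting producing P*A = L*U, in the spirit of the lecture code (the lecture's implementation may differ in detail; the matrix is illustrative):

```julia
using LinearAlgebra   # for the identity I

# LU with partial pivoting: at each column, swap the row with the largest
# pivot to the top, then eliminate below it, recording multipliers in L.
function lu_pivot(A)
    n = size(A, 1)
    U = copy(float.(A))
    L = Matrix{Float64}(I, n, n)
    p = collect(1:n)                            # row permutation
    for k in 1:n-1
        m = argmax(abs.(U[k:n, k])) + k - 1     # pivot row
        U[[k, m], :] = U[[m, k], :]
        L[[k, m], 1:k-1] = L[[m, k], 1:k-1]     # swap stored multipliers
        p[[k, m]] = p[[m, k]]
        for i in k+1:n
            L[i, k] = U[i, k] / U[k, k]
            U[i, :] -= L[i, k] * U[k, :]
        end
    end
    return L, U, p       # so that L*U equals A[p, :]
end

A = [2.0 1.0 1.0; 4.0 3.0 3.0; 8.0 7.0 9.0]
L, U, p = lu_pivot(A)
```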
Lecture 16
We went over Linear Systems of equations, where they come from, and how
to use Gaussian Elimination and the LU factorization to solve them!
In class next time, we will see code to implement these operations in Julia.
- Reading
- Chapter 7 Section 7.2
- Julia
- Lecture 16 (html)
(ipynb)
- Notes
- Lecture 16
Lecture 15
This lecture covered the memory hierarchy and high-performance matrix-matrix multiplication.
- Reading
- HowToOptimizeGEMM
- Memory hierarchy
- Slides
- Lecture 15
- Notes
- Lecture 15
Lecture 14
Today, we saw an introduction to matrix methods, why they
are important, where they occur, and a brief introduction
to matrix-matrix multiplication.
- Reading
- Chapter 7 Section 7.0, 7.1
- Chapter 2
- Chapter 1
- Slides
- Lecture 14
- Notes
- Lecture 14
Lecture 13
The first midterm!
Lecture 12
We reviewed the class so far! This included a brief summary of topics,
a review of HWs 1 and 2, and a set of potential questions for the midterm.
- Slides
- Lecture 12
Lecture 11
We reviewed the problems on the Quiz. Then we went back into Monte Carlo
methods and gave a visual explanation for how Monte Carlo integration works.
This then led to a discussion of how to compute the variance properly.
- Reading
- Chapter 3
- Slides
- Lecture 11
- Julia
- Lecture 11 (Monte Carlo) (html)
(ipynb)
- Lecture 11 (Variance) (html)
(ipynb)
Lecture 10
We reviewed some of the basics of probability, random variables, and expectations.
We saw how to compute Pi via a Monte Carlo method, and saw how to turn an
integral into an expectation so we can use Monte Carlo. Then we saw the CLT.
- Reading
- Chapter 3
- Slides
- Lecture 10
- Notes
- Lecture 10
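The π computation from this lecture can be sketched in a few lines; the seed and sample count below are illustrative:

```julia
using Random

# Monte Carlo estimate of pi: the fraction of uniform random points in the
# unit square that land inside the quarter circle approximates pi/4.
Random.seed!(1)       # for reproducibility
n = 1_000_000
hits = count(_ -> rand()^2 + rand()^2 <= 1, 1:n)
piest = 4 * hits / n
```

By the CLT, the error of this estimator shrinks like 1/sqrt(n), so each extra digit of accuracy costs about 100x more samples.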
Lecture 9
Introduction to Monte Carlo methods via three examples: the Monty Hall problem,
a pop-quiz on a property of the unit circle, and Google's PageRank problem.
- Reading
- Chapter 3
- Slides
- Lecture 9
- Julia
- Lecture 9 (Monty Hall) (html)
(ipynb)
- Lecture 9 (Circle points) (html)
(ipynb)
- Lecture 9 (PageRank random surfer) (html)
(ipynb)
Lecture 8
Today we had a quiz on floating point arithmetic, and then we
saw a few guidelines on how to ensure good floating point computations,
and why you must always be careful!
- Reading
- Chapter 5
- Once dead by Richard Phillips (See intro to Chapter 55, page 208)
- Julia
- Lecture 8 (Accurate norms) (html)
(ipynb)
- Slides
- Lecture 8
Lecture 7
Today we covered how computers do floating point arithmetic, IEEE rounding
modes, and the guarantees of IEEE floating point arithmetic.
- Reading
- Chapter 5 - Section 5.5 and 5.6
- Further reading
- The mathematics of the Intel Floating Point Bug
- Numerical Computing with IEEE Floating Point Arithmetic by Michael Overton
- Julia
- Lecture 6 (Floating point numbers) (html)
(ipynb)
(Holdover from last time!)
- Slides
- Lecture 7
- Notes
- Lecture 7
Lecture 6
Today, we saw an intro to floating point arithmetic.
- Slides
- Lecture 6
- Notes
- Lecture 6
- Reading
- Chapter 5
Lecture 5
We did a quick intro to LaTeX, and more Julia, including control flow,
functions, and plotting. Then we reviewed the quiz.
- Slides
- Lecture 5
- Julia
- Lecture 4 (control flow examples) (html)
(ipynb)
- Lecture 5 (plotting) (html)
(ipynb)
- Lecture 5 (functions) (html)
(ipynb)
- Lecture 5 (functions, filled-in) (html)
(ipynb)
- Lecture 4 (script) (html)
(ipynb)
- Latex
- Simple Latex Document
- Simple Latex Document Compiled
- Edited Homework 1
- Edited Homework 1 Compiled
- Edited Homework 1 Extra Files
Lecture 4
In this lecture, we did a long introduction to the Julia language.
- Slides
- Lecture 4
- Reading
- Chapter 2 (but using our Julia conversion below)
- Chapter 2 converted: Using Julia (html)
(ipynb)
- Chapter 2 converted: Plotting in Julia (html)
(ipynb)
- Julia
- Lecture 4 (control flow examples) (html)
(ipynb)
Lecture 3
We had a quiz and saw how to use the Julia language.
- Slides
- Lecture 3
- Reading
- Chapter 2
- Julia
- Chapter 2 converted: Using Julia (html)
(ipynb)
- Chapter 2 converted: Plotting in Julia (html)
(ipynb)
- Lecture 3 in class commands (html)
(ipynb)
Lecture 2
We reviewed what happens in mathematical modeling and how we take a problem
from an initial statement into a compute model. We saw two examples: the
XKCD Raptor Problem as well as Google's PageRank problem.
- Slides
- Lecture 2
- Notes
- Lecture 2
- Reading
- Chapter 1
- Julia
- Lecture 2: The XKCD Raptor Problem (html)
(ipynb)
Lecture 1
We reviewed the syllabus, and saw the importance of computing.
- Slides
- Lecture 1
- Reading
- Syllabus
- Chapter 1
- Julia
- Lecture 1: Polygon Midpoints (html)
(ipynb)