
Numerical Analysis

David Gleich

Purdue University

Spring 2021

Course number CS-51400, MATH-51400

Online due to COVID-19 Pandemic

Social distance, wear a mask.

Zoom, Tuesday and Thursday, 10:30-11:45


Homework 4

Please answer the following questions in complete sentences in a typed manuscript and submit your solution on Gradescope by April 5th around 5am, our usual deadline.

Problem 0: Homework checklist

Problem 1: Gautschi Exercise 3.29 + more

  1. Derive the 2-point Gauss-Hermite quadrature rule.

  2. Example application (from http://ice.uchicago.edu/2012_presentations/Faculty/Judd/Quadrature_ICE11.pdf). An investor holds one bond that will be worth $1$ in the future and equity whose value is $e^Z$, where $Z \sim N(\mu, \sigma^2)$. (So this means that the log of the value of the equity is a normally distributed random variable.) The investor's future wealth is the random number $W = 1 + e^Z$, and the expected utility is $E[u(W)]$, where $u$ is a utility function; we'll use $u(W) = W^{1-\gamma}/(1-\gamma)$ with $\gamma > 1$. (This is a concave utility function because having more money doesn't give you all that much more utility.) Suppose also that values for $\mu$ and $\sigma$ are given. We want to find the expected utility to the investor! This involves evaluating the integral $$E[u(W)] = \int_{-\infty}^{\infty} u(1 + e^z)\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(z-\mu)^2/(2\sigma^2)}\, dz.$$ Write a computer program to use Gauss-Hermite quadrature to approximate the value of this integral. You need to justify the number of points you use for the approximation.
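As a minimal sketch of how a Gauss-Hermite approach might look, here is a NumPy version using `hermgauss`. The CRRA utility form and the parameter values `mu`, `sigma`, `gamma` below are placeholder assumptions, not the assignment's specified values; substitute the ones given in class.

```python
import numpy as np

# Placeholder parameters (assumptions, not the assignment's values).
mu, sigma, gamma = 0.1, 0.25, 2.0

def u(W):
    # Assumed CRRA utility u(W) = W^(1-gamma) / (1-gamma).
    return W**(1.0 - gamma) / (1.0 - gamma)

def expected_utility(n):
    # n-point Gauss-Hermite rule for the weight exp(-t^2); the change of
    # variables z = mu + sqrt(2)*sigma*t turns E[u(1+e^Z)], Z ~ N(mu, sigma^2),
    # into (1/sqrt(pi)) * sum_i w_i * u(1 + exp(z_i)).
    t, w = np.polynomial.hermite.hermgauss(n)
    z = mu + np.sqrt(2.0) * sigma * t
    return np.dot(w, u(1.0 + np.exp(z))) / np.sqrt(np.pi)

for n in (2, 4, 8, 16):
    print(n, expected_utility(n))
```

Comparing successive values of `n` is one way to justify the number of points used.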

Problem 2: Multivariate quadrature

In this problem, we will investigate multivariate quadrature for integrals such as $$\int_{-1}^{1} \int_{-1}^{1} f(x,y)\, dy\, dx$$ using tensor product rules. Let $x_i$ and $w_i$ be the nodes and weights of an $n$-point 1-dimensional Gauss-Legendre rule. Then the multidimensional quadrature rule is: $$\int_{-1}^{1} \int_{-1}^{1} f(x,y)\, dy\, dx \approx \sum_{i=1}^{n} \sum_{j=1}^{n} w_i w_j f(x_i, x_j).$$ We can derive this as follows: $$\int_{-1}^{1} \int_{-1}^{1} f(x,y)\, dy\, dx \approx \sum_{i=1}^{n} w_i \int_{-1}^{1} f(x_i, y)\, dy \approx \sum_{i=1}^{n} w_i \sum_{j=1}^{n} w_j f(x_i, x_j).$$ Of course, this makes it clear we don't have to use the same number of points to integrate in $x$ and $y$! So in general, let $x_i, w_i$ be an $n_1$-point Gauss-Legendre quadrature rule for the $x$-variable and $y_j, v_j$ be an $n_2$-point Gaussian quadrature rule for the $y$-variable.
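The tensor-product construction above can be sketched in a few lines with NumPy's `leggauss` (the function name `gauss2d` is mine, not part of the assignment):

```python
import numpy as np

def gauss2d(f, n1, n2):
    # Tensor-product Gauss-Legendre rule on [-1,1] x [-1,1]:
    # n1 points in x, n2 points in y.
    x, wx = np.polynomial.legendre.leggauss(n1)
    y, wy = np.polynomial.legendre.leggauss(n2)
    X, Y = np.meshgrid(x, y, indexing="ij")   # all (x_i, y_j) pairs
    W = np.outer(wx, wy)                      # products w_i * v_j
    return np.sum(W * f(X, Y))

# Example: integral of x^2 * y^4 over the square is (2/3)*(2/5) = 4/15,
# which a 3-point rule in each variable captures exactly.
print(gauss2d(lambda x, y: x**2 * y**4, 3, 3))
```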

  1. Implement a computer code to perform two-dimensional Gauss-Legendre quadrature using this type of construction. Your code should allow a user to input $n_1$ and $n_2$ to determine the number of points in each variable.

  2. Use your code to estimate the given integrals using $n$ points in each dimension.

  3. This is an open-ended question that requires you to investigate. You can also find the answer in many textbooks, but if you do so, make sure you document your sources and demonstrate the effect that is claimed. We saw in class that an $n$-point Gaussian quadrature rule exactly integrates polynomials up to degree $2n-1$. In this problem, I want you to generate a conjecture about the degree of exactness of multidimensional Gaussian quadrature. You should use your code from part 1, along with carefully constructed examples, to support a statement such as:

    My evidence suggests that 2d Gauss-Legendre quadrature will exactly 
    integrate two-dimensional functions $f(x,y)$ when ...
    

    Here are some helpful ideas that may play a role.

    • The total degree of a multidimensional polynomial is the maximum over its terms of the sum of the degrees in each variable. So $x^2 y^3 + x$ has total degree $5$.

    • Another type of degree is the largest degree in each variable, so $x^2 y^2 + xy$ involves polynomials of degree $2$ only.

Problem 3: Monte Carlo, Gauss-Legendre, and Clenshaw-Curtis Quadrature

In this problem, we'll study how different quadrature methods converge on a variety of problems. For a technical paper on this idea, see Trefethen, Is Gauss Quadrature Better than Clenshaw-Curtis? SIAM Review, 2008. In this problem, we'll be studying and reproducing Fig. 2 from that paper.

The functions are the six integrands from Fig. 2 of Trefethen's paper, all defined on the region $[-1,1]$.

The quadrature methods are Monte Carlo, Gauss-Legendre, and Clenshaw-Curtis quadrature.

  1. By whatever method you want, determine the exact values of these 6 integrals. (Hint: Wolfram Alpha is okay!)

  2. Write a program to create the Jacobi matrix for Gauss-Legendre quadrature and show the eigenvalues of the matrix for a representative value of $n$.

  3. Implement a computer program for Clenshaw-Curtis quadrature. (Hint: you can copy Trefethen's routines, with attribution, but you must explain the steps. Julia implementations are available too.)

  4. Implement a computer program for Gauss-Legendre quadrature.

  5. Note that the Monte Carlo method is a randomized algorithm, so the result will change if you run it again. Each run is called a trial or sample. Use a computer implementation of Monte Carlo integration to study how much the values can vary between one trial and the next. Compute the variance over trials, for a range of sample sizes, for each of the above functions.

  6. Prepare 6 figures like those in Trefethen's paper for the 6 functions, except use Monte Carlo integration instead of Newton-Cotes.

  7. Empirically determine how computing the Gaussian quadrature nodes and weights scales as a function of $n$, the number of points, in terms of CPU time. (Hint: consider a wide range of values of $n$.) You should be able to justify your final scaling in terms of the runtime of a known algorithm.

  8. (Optional) There are some recent developments in fast quadrature methods that produce the nodes and weights much more quickly. In Julia, these are implemented in the FastGaussQuadrature package and the gausslegendre function; in Matlab, the function legpts.m from Chebfun implements them (an old simple version is here: http://www.mathworks.com/matlabcentral/fileexchange/23972-chebfun-v4-old-version--please-download-current-version-instead/content/chebfun/legpts.m). Determine the simplest equation that describes the empirical scaling of these routines for finding the quadrature points and weights. (Hint: consider a wide range of values of $n$.)
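The Jacobi-matrix construction asked for in part 2 can be sketched as follows, assuming the standard Golub-Welsch setup: for Legendre polynomials the symmetrized recurrence gives off-diagonal entries $\beta_k = k/\sqrt{4k^2-1}$ and a zero diagonal, and the eigenvalues of the resulting tridiagonal matrix are the quadrature nodes. The function name `legendre_jacobi` is mine.

```python
import numpy as np

def legendre_jacobi(n):
    # Symmetric tridiagonal Jacobi matrix for the Legendre weight on [-1,1]:
    # zero diagonal, off-diagonal beta_k = k / sqrt(4k^2 - 1), k = 1..n-1.
    k = np.arange(1, n)
    beta = k / np.sqrt(4.0 * k**2 - 1.0)
    return np.diag(beta, 1) + np.diag(beta, -1)

# Eigenvalues of the Jacobi matrix are the Gauss-Legendre nodes.
J = legendre_jacobi(5)
print(np.linalg.eigvalsh(J))
```

A quick sanity check is to compare these eigenvalues against the nodes returned by `numpy.polynomial.legendre.leggauss`.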

Problem 4: Gautschi Exercise 4.1

The following sequences converge to $0$ as $n \to \infty$:

Indicate the type of convergence for each sequence in terms of its order and rate.

Problem 5: Implementing a core routine

  1. Using any method we've seen to solve a scalar nonlinear equation (bisection, false position, secant), develop a routine to compute $\sqrt{x}$ using only addition, subtraction, multiplication, and division (and basic control structures) to numerical precision. (Use double precision.)

  2. Compare the results of your method to the Matlab/Julia/Python function sqrt. Comment on any differences that surprise you.