\documentclass[11pt]{article}
\usepackage{fullpage,amsmath,amssymb}
\usepackage[pdftex]{graphicx}
\usepackage[pdftex,colorlinks=true,plainpages=false]{hyperref}
\newcommand{\T}{^\mathsf{T}}

\begin{document}

\begin{center}
{\Large COMPUTER SCIENCE 314}\\
{\Large Numerical Methods} \\
{\large SPRING 2013} \\
ASSIGNMENT \# 5 (25 points) \\
{\small March 7}
\end{center}

\subsection*{Announcements}
\begin{itemize}
\item Exam 5 is on Thursday, April 4, from noon to 12:25 pm. It covers material from this homework, Sections~4.1--4.5 and 4.9 of the textbook, and Chapter~5 of the class notes.
\item The final exam is scheduled for Saturday, May 4, 1:00--3:00 pm in Grissom Hall 180.
\end{itemize}

\subsection*{Due Thursday, March 28 at noon.}
\begin{enumerate}
\item[0.] Read Sections 4.1--4.5 and 4.9 of the textbook.
\item (18 points) Consider the nonlinear function
\[ f(x) = \frac{1}{64}(x + 3)^2(x - 2)(x - 6). \]
\begin{enumerate}
\item (3 points) Make a reasonable plot of $f$, by hand if you like, and mark the locations of the roots and the points $x$ where $f$ has zero slope. Label each of these with its precise numerical value. Now consider the application of Newton-Raphson to $f$. There exists a special pair of values $\alpha < \beta$ such that Newton-Raphson alternates between them. Show their approximate locations.
\item (1 point) What is the rate of convergence of Newton-Raphson to each root?
\item (3 points) Code a MATLAB {\tt function x = newton(x)}, or its Python equivalent, that performs one Newton iteration for $f$. To avoid a pointless division by zero, do the Newton update only if $f$ is nonzero at the current iterate. (Alternatively, remove common factors from the numerator and denominator.) Also code your function so that it works (independently) for a column vector of values $x$; e.g., use {\tt find} ({\tt where} in NumPy) instead of {\tt if} for testing. {\em Hand in} your code.
\item (3 points) Using dot markers, plot the effect of the initial iterate $x_0$ on the final {\tt newton} iterate for 1101 equally spaced values of $x_0$ in the interval $[-4, 7]$: if convergence is to the $m$th root from the left (not counting multiplicities), plot the value $m$; if convergence fails, e.g., due to a vanishing slope, plot the value 4. Convergence is defined to mean that the value $x$ obtained after 100 iterations is exactly equal to one of the three roots after rounding $x$ with the {\tt single} function. For plotting clarity, invoke {\tt axis([-4 7 0 5])}. {\em Hand in} your code and your plot.
\item (2 points) Find and print the set of initial iterates $x_0$ that lead to an iteration that fails by reaching a point at which the slope of $f$ is zero. Do this by using {\tt fzerotx} to run {\tt newton} backwards 1 to 10 iterations from each point at which the slope of $f$ is zero. {\em Hand in} your code and your numerical results.
\item (2 points) Calculate the values $\alpha < \beta$ such that Newton-Raphson alternates between them. Use your {\tt newton} function and {\tt fzerotx}. You may wish to do a plot or two before selecting a search interval. {\em Hand in} your code and your numerical results.
\item (4 points) Write a script {\tt secant} that applies secant iterations to $f$, beginning with $x_0 = -2$ and $x_1 = 5$ and continuing until two successive iterates are exactly equal. (In general this might never happen, but it works here because of the special form of $f$.)
Which iterates are {\em not} closer to the final iterate than they are to one of the other two roots? Your algorithm should evaluate $f$ only once per iteration. {\em Hand in} your code and your numerical results.
\end{enumerate}

\item (3 points) Let $f(x)$ be a function defined for all $x$ with a root at $r = 0.69$ and with positive first and second derivatives for all $x$. (That is, the function is strictly increasing and convex, i.e., concave up.)
\begin{enumerate}
\item Does the secant method converge for any initial guesses $x_0 < x_1 < 0.69$? For any initial guesses $x_0 > x_1 > 0.69$?
\item If $x_0 = 1$ and $x_1 = 0$ ({\em take note of the indices!\/}), what can we say about the relative positions of the iterates $x_0$, $x_1$, $x_2$, $x_3$ and the root $r = 0.69$? In other words, arrange these five values from least to greatest.
\end{enumerate}

\item (4 points) In a golden section search, suppose we have values
\[ x_0 = a, \quad x_1 = a + r^2 h, \quad x_2 = a + rh, \quad x_3 = a + h, \]
where $r = 2/(1 + \sqrt{5})$ and
\[ f(x_0) \geq f(x_1) \leq f(x_3). \]
How do we choose the new $x'_0$, $x'_1$, $x'_2$, $x'_3$ based on the value of $f(x_2)$? Give formulas.
\newcounter{myenumi}\setcounter{myenumi}{\value{enumi}}
\end{enumerate}

\newpage

\subsection*{Problem not to hand in}
A solution will, however, be provided.
\begin{enumerate}
\setcounter{enumi}{\value{myenumi}}
\item Consider the use of the bisection method to locate a root of a continuous function $f(x)$ known to satisfy $f(1.5) < 0$ and $f(2) > 0$. Determine the number of times we {\em MUST} evaluate $f(x)$ to obtain an approximation to a root of $f(x)$ that is guaranteed to have an error less than $0.01$.
\end{enumerate}

\subsection*{Learning objectives}
\begin{itemize}
\item Program the bisection method to find a zero of a function to a given tolerance or to machine accuracy.
\item Calculate the number of iterations needed by the bisection method to obtain a root of given accuracy.
\item Perform or program iterations of Newton's method and of the secant method.
\item For simple examples, determine to which root, if any, Newton's method and the secant method converge.
\item Define convergence rate. Give the rates of convergence of Newton's method and the secant method.
\item Give a precise high-level description of inverse interpolation and its application to zero-finding.
\item Perform or program iterations of golden section search and successive parabolic interpolation.
\end{itemize}
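For reference, and as one possible way of meeting the first learning objective above, here is a minimal bisection sketch in Python (one of the two languages used in this assignment). The names {\tt f}, {\tt a}, {\tt b}, and {\tt tol} are placeholders, and the commented test call merely reuses the sign data $f(1.5) < 0 < f(2)$ from the problem above with an illustrative function.

\begin{verbatim}
def bisect(f, a, b, tol):
    """Approximate a zero of a continuous f with a sign change on [a, b].

    The bracket is halved until its half-length is at most tol, so the
    returned midpoint is within tol of a root.
    """
    fa = f(a)
    if fa == 0.0:                      # the left endpoint is already a root
        return a
    while (b - a) / 2.0 > tol:
        m = a + (b - a) / 2.0          # midpoint of the current bracket
        fm = f(m)
        if fm == 0.0:                  # landed exactly on a root
            return m
        if (fa > 0) == (fm > 0):       # same sign at a and m: root in [m, b]
            a, fa = m, fm
        else:                          # sign change on [a, m]
            b = m
    return a + (b - a) / 2.0

# Illustrative call, matching f(1.5) < 0 < f(2) above:
# bisect(lambda x: x**2 - 3, 1.5, 2.0, 0.01)   # approximately 1.732
\end{verbatim}

\end{document}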