
Computational Methods in Optimization

David Gleich

Purdue University

Spring 2017

Course number: CS-52000

Time: Tuesday and Thursday, 1:30-2:45pm

Location: Lawson B134


Homework 7

Please answer the following questions in complete sentences and submit the solution on Blackboard by the due date there.

Problem 0: List your collaborators.

Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

Problem 1

(Adapted from Nocedal and Wright 3.1; any typos are my own.) Backtracking line search is a simple method that attempts to balance sufficient decrease with a long step length by testing a sequence of step lengths $\alpha_0, \rho \alpha_0, \rho^2 \alpha_0, \ldots$ (for some $0 < \rho < 1$) and accepting the first one that satisfies sufficient decrease, which is the condition $g(\alpha) \le g(0) + c_1 \alpha g'(0)$ for the line search function $g(\alpha) = f(x + \alpha p)$ we discussed in class.
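In code, the loop looks roughly like the following minimal sketch in Python. Here `f`, `x`, `p`, and `gfx` (the gradient at `x`) are assumed inputs, and the defaults for `alpha0`, `rho`, and `c1` are illustrative choices, not values prescribed by the assignment.

```python
import numpy as np

def backtracking(f, x, fx, gfx, p, alpha0=1.0, rho=0.5, c1=1e-4):
    """Return the first alpha in alpha0, rho*alpha0, rho^2*alpha0, ...
    satisfying sufficient decrease: f(x + alpha*p) <= fx + c1*alpha*gfx'p.
    Assumes p is a descent direction, so gfx'p < 0 and the loop terminates."""
    alpha = alpha0
    slope = np.dot(gfx, p)  # g'(0) for g(alpha) = f(x + alpha*p)
    while f(x + alpha * p) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha
```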

  1. Implement a backtracking line search routine to select the step length and add this to our simple gradient descent code from Lecture 9; one possible setup is sketched after this list. (Note, you are free to implement gradient descent using your own code too; adapting our code is suggested only for convenience.)

  2. Prepare a plot of the step lengths (the values of $\alpha$ that were accepted) when we run gradient descent with this line search from the point $x_0 = (1.2, 1.2)^T$ and also the point $x_0 = (-1.2, 1)^T$ (the two starting points in Nocedal and Wright 3.1).

  3. Implement Newton's method using line search as well. (Don't worry about singularity in the Hessian or any of the real issues that come up with Newton -- just use the search direction $p = -\nabla^2 f(x)^{-1} \nabla f(x)$.)

  4. Prepare a plot of the step lengths as in part 2.

  5. Discuss any notable differences or similarities.
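For concreteness, here is a minimal sketch of parts 1 and 3 that builds on the backtracking routine above. It assumes the Rosenbrock function as the objective (as in Nocedal and Wright 3.1); substitute whatever objective the Lecture 9 code uses. The accepted step lengths are recorded so they can be plotted for parts 2 and 4.

```python
import numpy as np

def rosen(x):
    return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

def rosen_grad(x):
    return np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                     200 * (x[1] - x[0]**2)])

def rosen_hess(x):
    return np.array([[1200 * x[0]**2 - 400 * x[1] + 2, -400 * x[0]],
                     [-400 * x[0], 200.0]])

def minimize(x0, newton=False, tol=1e-8, maxit=50000):
    """Gradient descent (or Newton's method) with the backtracking line
    search above. Returns the final iterate and the accepted step lengths."""
    x, alphas = np.array(x0, dtype=float), []
    for _ in range(maxit):
        g = rosen_grad(x)
        if np.linalg.norm(g) < tol:
            break
        # steepest descent, or the Newton direction p = -hess(x)^{-1} grad(x)
        p = -np.linalg.solve(rosen_hess(x), g) if newton else -g
        alpha = backtracking(rosen, x, rosen(x), g, p)
        alphas.append(alpha)
        x = x + alpha * p
    return x, alphas
```

Plotting `alphas` against the iteration number, for both starting points and with and without `newton=True`, gives the data for the plots in parts 2 and 4.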

Problem 2

Describe what would happen if you used backtracking line search on a strongly convex quadratic: $f(x) = \frac{1}{2} x^T A x - b^T x$, where $A$ is symmetric positive definite. Be as precise as possible, and think about comparing with exact line search, which we worked out for this case in class. Note, the answer I'm hoping for here is a case-style analysis.
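For reference, the exact line search step for this quadratic (the one worked out in class, up to differences in notation) minimizes $g(\alpha) = f(x + \alpha p)$ in closed form:

$$g'(\alpha) = \nabla f(x + \alpha p)^T p = \nabla f(x)^T p + \alpha \, p^T A p = 0
\quad\Longrightarrow\quad
\alpha^* = -\frac{\nabla f(x)^T p}{p^T A p},$$

so for the steepest descent direction $p = -\nabla f(x) = -(Ax - b)$ this reduces to $\alpha^* = \frac{\nabla f(x)^T \nabla f(x)}{\nabla f(x)^T A \nabla f(x)}$. Comparing the backtracking sequence $\alpha_0, \rho\alpha_0, \rho^2\alpha_0, \ldots$ against $\alpha^*$ is one way to organize the case analysis.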