$\newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \DeclareMathOperator{\rank}{rank} \DeclareMathOperator*{\minimize}{minimize} \DeclareMathOperator*{\maximize}{maximize} \DeclareMathOperator{\subjectto}{subject to} \newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\MIN}[2]{\begin{array}{ll} \minimize_{#1} & {#2} \end{array}} \newcommand{\MINone}[3]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\MINthree}[5]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \\ & {#4} \\ & {#5} \end{array}} \newcommand{\MAX}[2]{\begin{array}{ll} \maximize_{#1} & {#2} \end{array}} \newcommand{\MAXone}[3]{\begin{array}{ll} \maximize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} 
\newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mSigma}{\ensuremath{\mat{\Sigma}}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{m}} \newcommand{\vn}{\vec{n}} \newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}} \newcommand{\vlambda}{\vecalt{\lambda}}$

# Homework 3

Please answer the following questions in complete sentences in a clearly prepared manuscript and submit the solution by the due date on Blackboard (around Sunday, September 16th, 2018).

Remember that this is a graduate class. There may be elements of the problem statements that require you to fill in appropriate assumptions. You are also responsible for determining what evidence to include. An answer alone is rarely sufficient, but an overly verbose description is not required either. Use your judgement to focus your discussion on the most interesting pieces. The answer to "should I include 'something' in my solution?" will almost always be: Yes, if you think it helps support your answer.

## Problem 0: Homework checklist

• Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

• Make sure you have included your source-code and prepared your solution according to the most recent Piazza note on homework submissions.

## Problem 1: Getting Poisson's equations to converge with Richardson

In class we showed that the Richardson method for $\mA \vx = \vb$ will converge for any symmetric positive definite matrix as long as $\alpha < 2 / \rho(\mA)$. We have not yet determined ways of computing $\rho(\mA)$, however.

In the previous homework (HW2), you found that setting $\alpha = 1$ did not result in a method that converged for Poisson's equation with $n = 10$.

1. Experimentally or theoretically, find a value of $\alpha$ such that using the Richardson method will converge for Poisson's equation (as used in HW2) with $n=10$.

2. What happens when you try $n=20$? Does the same value of $\alpha$ work? If not, find a new one that does.

3. Experimentally, find a value of $\alpha$ that takes the fewest iterations to produce a solution with a relative residual of at most $10^{-5}$ for both $n=10$ and $n=20$.

4. Prove that using any $0 < \alpha < 1/4$ will result in a method that converges.
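For the experimental parts, a minimal sketch of the Richardson iteration may be a useful starting point. The matrix below assumes HW2's Poisson matrix is the standard two-dimensional five-point-stencil discretization on an $n \times n$ interior grid; substitute your own matrix from HW2 if it differs.

```python
import numpy as np

def poisson_2d(n):
    """5-point-stencil Laplacian on an n x n interior grid (a guess at
    the HW2 setup; swap in your own matrix if it differs)."""
    T = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    return np.kron(np.eye(n), T) + np.kron(T, np.eye(n))

def richardson(A, b, alpha, tol=1e-5, maxit=50000):
    """Richardson iteration x <- x + alpha*(b - A x); returns x and the
    iteration count to reach a relative residual of tol."""
    x = np.zeros_like(b)
    for it in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, it
        x = x + alpha * r
    return x, maxit

A = poisson_2d(10)
b = np.ones(A.shape[0])
# For this matrix rho(A) < 8, so any alpha < 1/4 converges; 1/8 is one choice.
x, its = richardson(A, b, alpha=1/8)
print(its)
```

Timing the loop for several values of $\alpha$ is one way to attack part 3.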

## Problem 2: Norms

Let $f(\vx)$ be any vector norm for $\vx \in \RR^{1}$. Show that $f(\vx) = \alpha |x_1|$ for some value of $\alpha > 0$.

## Problem 3

Consider the following function: $f(\mA) = \max_{i,j} |A_{i,j}|$.

1. Show that $f$ is a matrix norm. (Very easy!)

2. Show that $f$ does not satisfy the sub-multiplicative property.

3. Show that there exists $\sigma > 0$ such that $g(\mA) = \sigma f(\mA)$ is a sub-multiplicative matrix norm.

4. Extra tough problem for the adventurous! Not graded. This problem has a relatively easy proof related to something we saw in class. But making it fully formal requires a few technicalities that are easy to get tripped up on. Now let $\normof{\mA}$ be an arbitrary matrix norm. Show that there exists $\sigma > 0$ such that $h(\mA) = \sigma \normof{\mA}$ is a sub-multiplicative matrix norm.
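For part 2, a quick numerical check can suggest a counterexample before you write the proof. The snippet below assumes the intended $f$ is the entrywise max norm $f(\mA) = \max_{i,j}|A_{i,j}|$ (an assumption about the definition above; adapt it if your $f$ differs):

```python
import numpy as np

# Assumes f(A) = max_ij |A_ij|, the entrywise max norm -- an assumption;
# change this if the problem's f is defined differently.
def f(A):
    return np.max(np.abs(A))

A = np.ones((2, 2))           # all-ones 2x2 matrix, so f(A) = 1
print(f(A @ A), f(A) * f(A))  # f(A A) = 2 exceeds f(A) f(A) = 1
```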

## Problem 4: Solving the PageRank linear system of equations via Richardson

We will derive the following at some point in the class, but you have everything you need to solve this right now. Let $\mP$ be a matrix that is entry-wise non-negative ($P_{i,j} \ge 0$) and also has columns that sum to $1$ ($\sum_{i} P_{i,j} = 1$).

Informally, the matrix $\mP$ describes the probability of moving around a graph, just like the matrix $\mT$ described the probability of moving around the game of Candyland. The interpretation of $P_{i,j}$ is the probability of moving from node $j$ to node $i$.

The PageRank linear system is $(\eye - \alpha \mP) \vx = (1-\alpha) \vv$, where $0 < \alpha < 1$, $v_i \ge 0$, and $\sum_i v_i = 1$. The solution $x_{i}$ can be interpreted as the asymptotic fraction of time a random walk spends at a node of a graph when, at each step, it moves in the graph (according to $\mP$) with probability $\alpha$ and it jumps according to $\vv$ with probability $(1-\alpha)$.

Show that we can solve the PageRank linear system using the Richardson method.
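As a numerical sanity check to accompany your proof: assuming the system is the standard $(\eye - \alpha \mP)\vx = (1-\alpha)\vv$, Richardson with step length $1$ gives the update $\vx \leftarrow \alpha \mP \vx + (1-\alpha)\vv$. A sketch on a made-up 3-node chain (the matrix $\mP$ below is hypothetical):

```python
import numpy as np

# A tiny column-stochastic P (hypothetical 3-node example): entries are
# nonnegative and each column sums to 1.
P = np.array([[0.0, 0.5, 1/3],
              [0.5, 0.0, 1/3],
              [0.5, 0.5, 1/3]])
alpha = 0.85
n = P.shape[0]
v = np.ones(n) / n            # uniform jump vector

# Richardson with step length 1 on (I - alpha P) x = (1 - alpha) v:
# x <- x + (b - A x) = alpha P x + (1 - alpha) v.
x = v.copy()
for _ in range(1000):
    xnew = alpha * (P @ x) + (1 - alpha) * v
    if np.linalg.norm(xnew - x, 1) < 1e-10:
        x = xnew
        break
    x = xnew
print(x, x.sum())             # entries stay nonnegative and sum to 1
```

Each update preserves $\sum_i x_i = 1$, which is consistent with the probabilistic interpretation above.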