$\newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \DeclareMathOperator{\rank}{rank} \DeclareMathOperator*{\minimize}{minimize} \DeclareMathOperator*{\maximize}{maximize} \DeclareMathOperator{\subjectto}{subject to} \newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\MIN}[2]{\begin{array}{ll} \minimize_{#1} & {#2} \end{array}} \newcommand{\MINone}[3]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\MINthree}[5]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \\ & {#4} \\ & {#5} \end{array}} \newcommand{\MAX}[2]{\begin{array}{ll} \maximize_{#1} & {#2} \end{array}} \newcommand{\MAXone}[3]{\begin{array}{ll} \maximize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} 
\newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mSigma}{\ensuremath{\mat{\Sigma}}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{m}} \newcommand{\vn}{\vec{n}} \newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}} \newcommand{\vlambda}{\vecalt{\lambda}}$

Homework 2

Please answer the following questions in complete sentences in a clearly prepared, typed manuscript, and submit the solution by the due date on Blackboard (Monday, September 11th, 2017, early in the morning).

Problem 0: Homework checklist

• Please identify anyone, whether or not they are in the class, with whom you discussed your homework. This problem is worth 1 point, but on a multiplicative scale.

• Make sure you have included your source-code and prepared your solution according to the most recent Piazza note on homework submissions.

Problem 1: Prove or disprove

For the following questions, either prove that the statement is correct, or show a counter-example.

1. The product of two diagonal matrices is diagonal.

2. The product of two upper triangular matrices is upper triangular.

3. The product of two symmetric matrices is symmetric.

4. The product of two orthogonal matrices is orthogonal.

5. The product of two square, full rank matrices is full rank.
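
Before attempting a proof, a quick numerical experiment (a sketch with NumPy, not a proof) can suggest which statements admit counter-examples. For instance, here are checks of two of the statements on random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric matrices: is their product symmetric?
A = rng.standard_normal((3, 3)); A = A + A.T
B = rng.standard_normal((3, 3)); B = B + B.T
print(np.allclose(A @ B, (A @ B).T))  # False for generic A, B

# Random upper-triangular matrices: is their product upper triangular?
U1 = np.triu(rng.standard_normal((3, 3)))
U2 = np.triu(rng.standard_normal((3, 3)))
print(np.allclose(U1 @ U2, np.triu(U1 @ U2)))  # True
```

A random experiment can only disprove a statement (by producing a counter-example); a `True` outcome still requires a proof.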

Problem 2

There are a tremendous number of matrix norms that arise. An interesting class are called the orthogonally invariant norms. Norms in this class satisfy $\normof{\mU \mA \mV} = \normof{\mA}$ for all square orthogonal matrices $\mU$ and $\mV$. Recall that a square matrix $\mU$ is orthogonal when $\mU^T \mU = \mI$, i.e., $\mU^{-1} = \mU^T$.

Show that $\normof{ \mA }_2$ is orthogonally invariant.
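
As a sanity check before writing the proof, orthogonal invariance of the 2-norm can be verified numerically (a sketch; the orthogonal factors here come from QR factorizations of random matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Build orthogonal U and V from QR factorizations of random matrices
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# The 2-norm (largest singular value) should be unchanged by U and V
print(np.linalg.norm(A, 2))
print(np.linalg.norm(U @ A @ V, 2))
```

The two printed values agree to floating-point accuracy; the proof itself should explain why, via the singular values of $\mU \mA \mV$.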

Problem 3

Consider the following function:

1. Show that $f$ is a matrix norm. (Very easy!)

2. Show that $f$ does not satisfy the sub-multiplicative property.

3. Show that there exists $\sigma > 0$ such that $g(\mA) = \sigma f(\mA)$ is a sub-multiplicative matrix norm.

4. Extra tough problem for the adventurous! Not graded. This problem has a relatively easy proof related to something we saw in class, but making it fully formal requires a few technicalities that are easy to get tripped up on. Now let $\normof{\mA}$ be an arbitrary matrix norm. Show that there exists $\sigma > 0$ such that $h(\mA) = \sigma \normof{\mA}$ is a sub-multiplicative matrix norm.
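
To see how a matrix norm can fail sub-multiplicativity, it helps to experiment with a concrete function. The one below, the maximum absolute entry, is a hypothetical stand-in (not necessarily the $f$ from the problem statement), but it exhibits the failure cleanly:

```python
import numpy as np

def f(A):
    """Maximum absolute entry of A (a hypothetical example function)."""
    return np.abs(A).max()

# For the 2-by-2 all-ones matrix J: f(J) = 1, but f(J @ J) = 2,
# so f(J @ J) > f(J) * f(J) and sub-multiplicativity fails.
J = np.ones((2, 2))
print(f(J @ J))       # 2.0
print(f(J) * f(J))    # 1.0
```

The all-ones matrix is a useful test case because multiplication accumulates entries while the maximum entry of each factor stays small.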

Problem 4

Let $\normof{\mA}$ be a matrix norm and let $\vk \not= 0$ be a real-valued vector. Consider the function $f(\vx) = \normof{\vx \vk^T}$.

1. Show that $f$ is a vector norm.

2. Show that if $\normof{\mA}$ is a sub-multiplicative matrix norm, then the vector norm $f$ is consistent with the matrix norm. That is, $f(\mA \vx) \le \normof{\mA} \, f(\vx)$ for all $\mA$ and $\vx$.
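
A numerical sketch of the consistency inequality, using the Frobenius norm (which is sub-multiplicative) as $\normof{\cdot}$ and $f(\vx) = \normof{\vx \vk^T}$ as the induced vector norm, following one common version of this construction:

```python
import numpy as np

rng = np.random.default_rng(2)
k = rng.standard_normal(3)          # fixed nonzero vector
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

def f(v):
    """Vector norm built from the matrix norm: f(v) = ||v k^T||_F."""
    return np.linalg.norm(np.outer(v, k), 'fro')

# Consistency: f(A x) <= ||A||_F * f(x)
print(f(A @ x) <= np.linalg.norm(A, 'fro') * f(x))  # True
```

The experiment only checks one random instance; the proof should use sub-multiplicativity applied to the product $\mA (\vx \vk^T)$.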

Problem 5 (Choice 1)

Note that you only have to do one of the two choices for problem 5.

In class we mentioned that the function is a vector norm. Use a computer to prepare a 3D plot of the unit ball for this norm, where $\vx \in \RR^{3}$. We may award up to 5 bonus points for an excellent graphic.
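
One way to generate the surface of a unit ball is to sample random directions and rescale each to norm one. The sketch below uses the 1-norm as a stand-in for whichever norm was given in class; swap in the correct norm before plotting:

```python
import numpy as np

rng = np.random.default_rng(3)

def norm(v):
    """Stand-in norm (the 1-norm). Replace with the norm from class."""
    return np.abs(v).sum()

# Sample random directions and rescale each point to unit norm,
# so every point lies on the boundary of the unit ball.
dirs = rng.standard_normal((2000, 3))
pts = dirs / np.array([norm(d) for d in dirs])[:, None]

# pts[:, 0], pts[:, 1], pts[:, 2] can now be fed to a 3D scatter
# plot (e.g., matplotlib's mplot3d) to visualize the unit ball.
```

Rescaling samples is simple but gives uneven coverage; a mesh over the sphere of directions usually produces a cleaner surface plot.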

Problem 5 (Choice 2)

Note that you only have to do one of the two choices for problem 5.

Let $f(\vx)$ be a vector norm. If $\vx \in \RR^{1}$ (that is, $\vx$ is a scalar $x$), show that $f(x) = \alpha |x|$ for some constant $\alpha > 0$, where $|x|$ is the absolute value function.