# Network & Matrix Computations

### CIVL 2123

$\newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} \newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{l}} \newcommand{\vn}{\vec{n}} 
\newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}}$

# Homework 2

## Problem 1: Shortest path computations

In class (and on the in-class quiz) we saw how to compute single-source shortest-path distances by using a matrix over a semiring. Recall the algorithm from class: with $\oplus = \min$ and $\otimes = +$, repeated semiring matrix-vector products $\vd\itn{k+1} = \mA \vd\itn{k}$ converge to the vector of single-source shortest-path distances.
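As a concrete reminder, the semiring iteration can be sketched in Python as follows. This is only a sketch under my own conventions (dense list-of-lists adjacency, the function name `sssp_minplus`), not the exact code from class:

```python
import math

INF = math.inf

def sssp_minplus(adj, source):
    """Single-source shortest paths by repeated matrix-vector products
    in the (min, +) semiring, where oplus = min and otimes = +.
    adj[i][j] is the weight of edge i -> j, or INF if there is no edge."""
    n = len(adj)
    d = [INF] * n
    d[source] = 0.0
    # After k products, d[i] holds the shortest distance from `source`
    # to i over paths with at most k edges; n-1 products suffice.
    for _ in range(n - 1):
        d = [min([d[i]] + [d[j] + adj[j][i] for j in range(n)])
             for i in range(n)]
    return d

# Tiny example: edges 0 -> 1 (weight 1), 1 -> 2 (weight 2), 0 -> 2 (weight 5).
adj = [[INF, 1,   5],
       [INF, INF, 2],
       [INF, INF, INF]]
print(sssp_minplus(adj, 0))  # [0.0, 1.0, 3.0]
```

The update `d[j] + adj[j][i]` is exactly the semiring product $(\mA^T \otimes \vd)_i$, with the extra `d[i]` term playing the role of a zero-weight self-loop.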

a) Show how to adapt this to the problem of all-pairs shortest paths.

b) Just think about it. The following question is open-ended: I don’t have an answer, nor do I think there is one right answer.
Just write down some of your thoughts about the question. This is a chance for you to explore the material yourself.

If you wanted to approximate all-pairs shortest paths less expensively, what would you do? (Please don’t describe how to implement the Floyd-Warshall procedure; think about approximating the shortest paths, and doing so in a matrix sense.)

## Problem 2

Kleinberg’s HITS algorithm on a graph computes a hub-score and an authority-score for each vertex. Let $G=(V,E)$ be a directed graph. Then a hub should point to many authorities, and so we set the hub-score of a vertex to:

$$ h_i = \sum_{j \,:\, (i,j) \in E} a_j. $$

Likewise, an authority should be pointed to by many hubs:

$$ a_i = \sum_{j \,:\, (j,i) \in E} h_j. $$

Let $\mA$ be the adjacency matrix, then

$$ \vh = \mA \va \qquad \text{and} \qquad \va = \mA^T \vh. $$

If we iterate these two, we find:

$$ \vh = \mA \mA^T \vh \qquad \text{and} \qquad \va = \mA^T \mA \va. $$

In other words, we are looking for eigenvectors of the matrices $\mA \mA^T$ and $\mA^T \mA$, respectively. As we’ve stated these, the eigenvalue would have to be $1$. In the HITS algorithm, we keep iterating the updates:

$$ \va\itn{k+1} = \mA^T \vh\itn{k} \qquad \text{and} \qquad \vh\itn{k+1} = \mA \va\itn{k+1}, $$

and then normalizing the results so that $\vh\itn{k+1}$ and $\va\itn{k+1}$ have unit 2-norm. What this does is run the power method on the matrices $\mA \mA^T$ and $\mA^T \mA$, simultaneously. We can use the Perron-Frobenius theorem to help us understand when this will have a unique solution.
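A minimal sketch of this normalized iteration might look like the following. The dense list-of-lists adjacency representation and the function name `hits` are my own choices, not part of the problem statement:

```python
import math

def hits(adj, iters=100):
    """Sketch of the HITS iteration: a <- A^T h, then h <- A a, then
    normalize both vectors to unit 2-norm.  adj[i][j] = 1 if there is
    an edge i -> j.  This runs the power method on A A^T (hubs) and
    A^T A (authorities) simultaneously."""
    n = len(adj)
    h = [1.0 / math.sqrt(n)] * n   # uniform unit-norm starting vector
    a = [0.0] * n
    for _ in range(iters):
        # authority update: a_i = sum of h_j over in-neighbors j of i
        a = [sum(adj[j][i] * h[j] for j in range(n)) for i in range(n)]
        # hub update: h_i = sum of a_j over out-neighbors j of i
        h = [sum(adj[i][j] * a[j] for j in range(n)) for i in range(n)]
        na = math.sqrt(sum(x * x for x in a)) or 1.0
        nh = math.sqrt(sum(x * x for x in h)) or 1.0
        a = [x / na for x in a]
        h = [x / nh for x in h]
    return h, a

# One hub (vertex 0) pointing at two authorities (vertices 1 and 2):
h, a = hits([[0, 1, 1],
             [0, 0, 0],
             [0, 0, 0]])
# h is approximately [1, 0, 0]; a is approximately [0, 0.707, 0.707]
```

The `or 1.0` guards simply avoid dividing by zero when a score vector vanishes (e.g. a graph with no edges).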

Recall that a matrix has a unique Perron vector (a positive eigenvector whose eigenvalue equals the spectral radius) when the matrix is irreducible and aperiodic.

a) Give an example that shows an irreducible graph where $\mA \mA^T$ or $\mA^T \mA$ is reducible.

b) Give an example that shows an irreducible, aperiodic graph where $\mA \mA^T$ or $\mA^T \mA$ is reducible.

c) What this means is that the HITS algorithm may easily get into trouble with respect to having a unique solution, because the Perron vector of $\mA \mA^T$ or $\mA^T \mA$ may not be unique even when it is unique for $\mA$. So let’s see what happens: implement the HITS algorithm for a graph.

d) Try at least four or five small graphs. Can you find a graph that is sensitive to the starting vector? If not, show that these graphs have a unique Perron vector by computing their eigenvalues.

## Problem 3

a) Describe an $O(|E| d)$ algorithm to check whether a graph is aperiodic, where the only operation allowed with the graph is a matrix-vector product $\mA \vv$.

b) Describe how to use this algorithm to check whether or not $\mA^T \mA$ is irreducible.

c) Design an improved HITS algorithm that returns the hubs and authority vectors along with a flag that indicates whether or not the vectors are unique. Use as little extra work and memory as possible.

d) Open question. Think about this question and write down your thoughts, or come chat with me about it. Is there a semi-ring operation that will check if a graph is aperiodic? Note that the operations greatest common divisor $= \otimes$ and least common multiple $= \oplus$ form a semi-ring over the non-negative integers. At the very least, work out what the $0$ and $1$ elements are in this semi-ring. See http://marcpouly.ch/pdf/internal_100712.pdf for some other semi-rings that might be useful.

## Problem 4: More semi-rings

In this problem, we’ll work through a few things with semi-rings.

a) Show that $a \oplus b = \max(a,b)$ and $a \otimes b = \min(a,b)$ is a semi-ring and identify the $0$ and $1$ elements.

b) Show that an $n$-by-$n$ permutation matrix is an orthogonal matrix in any semi-ring.

c) Open question. Is there a semi-ring that implements Boruvka’s algorithm for finding a minimum spanning tree? This would be a nice algorithm to implement in Hadoop or Pregel. Note that there is a semi-ring algorithm to compute minimum spanning trees using the semi-ring from part (a); this was described in “Minimum-cost spanning tree as a path-finding problem” by Maggs and Plotkin.

## Problem 5: Markov chain state space classification

In this problem, you’ll get some experience classifying the states of a Markov chain.

I’ll update this with the example soon; check back on Tuesday the 27th.

## Problem 6: Find a paper

Find a paper on networks and matrix computations that you’d like to present in a 30-minute in-class presentation. Please contact me with questions.