# Network & Matrix Computations

### CIVL 2123

$\newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} \newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{m}} \newcommand{\vn}{\vec{n}} 
\newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}}$

# Homework 3

These problems are due in class on November 3rd.

## Problem 1 (5 pts)

For a simple, undirected graph, show that the sum of the eigenvalues of its adjacency matrix must be equal to 0.
(This is a quick problem.)
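Before writing the proof, you can sanity-check the claim numerically on a small example. The sketch below (plain numpy; the 4-cycle graph is an invented illustration, not part of the problem) computes the eigenvalues of one simple undirected graph's adjacency matrix and confirms they sum to zero.

```python
import numpy as np

# Adjacency matrix of a small simple, undirected graph (a 4-cycle):
# symmetric, with a zero diagonal since simple graphs have no self-loops.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

eigenvalues = np.linalg.eigvalsh(A)  # eigvalsh: symmetric input, real eigenvalues
print(eigenvalues)                   # [-2.  0.  0.  2.] for the 4-cycle
print(eigenvalues.sum())             # ~0, up to floating-point roundoff
```

Of course, one example is not a proof; the problem asks you to show this for every simple undirected graph.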

## Problem 2 (10 pts)

You’ve recently been hired as a senior researcher at Bingle!, a new scrappy search startup. They want to test out some algorithms for weakly preferential PageRank because they don’t think that resetting “everywhere” from a dangling node is a realistic model. However, they only have a routine to compute a strongly preferential PageRank vector. That is, they have a program `pagerank(alpha,v)` that solves the system

$$(\eye - \alpha(\mPbar + \vv \vd^T)) \vx = (1-\alpha) \vv$$

on their distributed MapReduce cluster for their current crawl of the web. The matrix $\mPbar$ is a column sub-stochastic matrix from a hand-tuned random walk process on their web graph, and $\vd^T = \ve^T - \ve^T\mPbar$ is the dangling node indicator vector.
It’s very time-consuming and expensive to change that code (and so, for this problem, you cannot). They’ve asked you to determine if there is another way to compute weakly preferential PageRank.

Thankfully, you’ve taken CS-59000 NMC at Purdue, and this is your chance to shine and earn your yearly bonus early. Show how to compute a weakly preferential PageRank vector using their `pagerank(alpha,v)` routine and one additional routine `mult(x)`, which computes $\mPbar \vx$. Psst, the weakly preferential PageRank system is:

$$(\eye - \alpha(\mPbar + \vu \vd^T)) \vx = (1-\alpha) \vv,$$

where $\vu$ is a dangling-node distribution different from $\vv$ (for instance, the uniform vector $\ve/n$).

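To get a feel for the two models before deriving the reduction, it can help to solve both systems side by side on a toy example. The sketch below is an illustration only: the 4-node matrix, `alpha`, and `v` are invented, and it assumes the weakly preferential variant replaces the dangling correction $\vv$ with the uniform vector $\ve/n$. It solves each linear system directly with numpy rather than through `pagerank(alpha,v)`.

```python
import numpy as np

n, alpha = 4, 0.85
# Hand-made column sub-stochastic matrix; column 3 is all zero (a dangling node).
Pbar = np.array([[0.0, 0.5, 0.0, 0.0],
                 [0.5, 0.0, 0.0, 0.0],
                 [0.5, 0.5, 0.0, 0.0],
                 [0.0, 0.0, 1.0, 0.0]])
d = np.ones(n) - Pbar.sum(axis=0)   # dangling indicator, here [0, 0, 0, 1]
v = np.array([0.4, 0.3, 0.2, 0.1])  # teleportation vector (illustrative)
u = np.ones(n) / n                  # uniform dangling correction (assumed weak variant)

I = np.eye(n)
# Strongly preferential: dangling mass resets according to v.
x_strong = np.linalg.solve(I - alpha * (Pbar + np.outer(v, d)), (1 - alpha) * v)
# Weakly preferential (assumed form): dangling mass resets uniformly.
x_weak = np.linalg.solve(I - alpha * (Pbar + np.outer(u, d)), (1 - alpha) * v)

print(x_strong.sum(), x_weak.sum())  # both are probability vectors: sums ~1
```

Both corrected matrices are column stochastic, so both solutions sum to 1, yet the two vectors differ whenever the graph has dangling nodes and $\vu \ne \vv$; the problem asks you to bridge that gap using only the two routines Bingle! provides.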
## Problem 3 (10 pts)

Google recently announced a change in how they process rel=nofollow links on the web. Please read about it here: <http://www.mattcutts.com/blog/pagerank-sculpting/>. Mathematically describe how this change affects how Google constructs the matrix $\mP$ for PageRank.

## Problem 4 (10 pts)

Bingle! is back with another problem. One of their interns implemented a super-duper fast routine to compute a PseudoRank vector $\vy$ satisfying:

$$(\eye - \alpha \mPbar) \vy = (1-\alpha) \vv.$$
This intern claimed that they could replace all their PageRank computations with this routine if they just normalized $\vy$, i.e., $\vx = \vy/\normof{\vy}$. Unfortunately, the intern was hired by the White House to help manage the government’s data efforts and left without justifying this result. There is apparently no time left to handle Bingle!’s emails marked URGENT!

And so, they come to you. Can you solve the mystery of the normalization? To be precise, show that $\normof{\vy}$ is exactly the right normalization for $\vy$.
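Before writing the proof, you can check the intern's claim numerically. The sketch below is an illustration under stated assumptions: the toy matrix, `alpha`, and `v` are invented, and the norm is taken to be the 1-norm (for a nonnegative $\vy$ this is just `y.sum()`). It solves the PseudoRank system above, solves the strongly preferential PageRank system from Problem 2 for comparison, and compares the normalized PseudoRank vector against PageRank.

```python
import numpy as np

n, alpha = 4, 0.85
# Illustrative column sub-stochastic matrix; column 3 is all zero (dangling).
Pbar = np.array([[0.0, 0.5, 0.0, 0.0],
                 [0.5, 0.0, 0.0, 0.0],
                 [0.5, 0.5, 0.0, 0.0],
                 [0.0, 0.0, 1.0, 0.0]])
d = np.ones(n) - Pbar.sum(axis=0)   # dangling indicator
v = np.array([0.4, 0.3, 0.2, 0.1])  # teleportation vector (illustrative)
I = np.eye(n)

# PseudoRank: no dangling correction at all.
y = np.linalg.solve(I - alpha * Pbar, (1 - alpha) * v)
# Strongly preferential PageRank, for comparison.
x = np.linalg.solve(I - alpha * (Pbar + np.outer(v, d)), (1 - alpha) * v)

print(np.allclose(y / np.linalg.norm(y, 1), x))  # True: normalizing recovers x
```

The numbers agree on this example; your job is to show algebraically why dividing by $\normof{\vy}$ works in general.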

## Problem 5 (5 pts)

Tell me your preferred ranking of the following days for the in-class presentations:

- November 15th
- November 22nd
- November 29th
- December 1st

Rank each day from 1-4. There is no guarantee I’ll be able to assign you to the day you’d like, but we’ll see what we can do.

Due to scheduling details at the end of class, we will have to adjust the length of the presentations. We need two people to volunteer to split their presentation over two days (so we can do 2.5 presentations a day). If you’d be willing to split your presentation, please let me know.