$\newcommand{\eps}{\varepsilon} \newcommand{\kron}{\otimes} \DeclareMathOperator{\diag}{diag} \DeclareMathOperator{\trace}{trace} \DeclareMathOperator{\rank}{rank} \DeclareMathOperator*{\minimize}{minimize} \DeclareMathOperator*{\maximize}{maximize} \DeclareMathOperator{\subjectto}{subject to} \newcommand{\mat}[1]{\boldsymbol{#1}} \renewcommand{\vec}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{\vecalt}[1]{\boldsymbol{#1}} \newcommand{\conj}[1]{\overline{#1}} \newcommand{\normof}[1]{\|#1\|} \newcommand{\onormof}[2]{\|#1\|_{#2}} \newcommand{\MIN}[2]{\begin{array}{ll} \minimize_{#1} & {#2} \end{array}} \newcommand{\MINone}[3]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\MINthree}[5]{\begin{array}{ll} \minimize_{#1} & {#2} \\ \subjectto & {#3} \\ & {#4} \\ & {#5} \end{array}} \newcommand{\MAX}[2]{\begin{array}{ll} \maximize_{#1} & {#2} \end{array}} \newcommand{\MAXone}[3]{\begin{array}{ll} \maximize_{#1} & {#2} \\ \subjectto & {#3} \end{array}} \newcommand{\itr}[2]{#1^{(#2)}} \newcommand{\itn}[1]{^{(#1)}} \newcommand{\prob}{\mathbb{P}} \newcommand{\probof}[1]{\prob\left\{ #1 \right\}} \newcommand{\pmat}[1]{\begin{pmatrix} #1 \end{pmatrix}} \newcommand{\bmat}[1]{\begin{bmatrix} #1 \end{bmatrix}} \newcommand{\spmat}[1]{\left(\begin{smallmatrix} #1 \end{smallmatrix}\right)} \newcommand{\sbmat}[1]{\left[\begin{smallmatrix} #1 \end{smallmatrix}\right]} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\eye}{\mat{I}} \newcommand{\mA}{\mat{A}} \newcommand{\mB}{\mat{B}} \newcommand{\mC}{\mat{C}} \newcommand{\mD}{\mat{D}} \newcommand{\mE}{\mat{E}} \newcommand{\mF}{\mat{F}} \newcommand{\mG}{\mat{G}} \newcommand{\mH}{\mat{H}} \newcommand{\mI}{\mat{I}} \newcommand{\mJ}{\mat{J}} \newcommand{\mK}{\mat{K}} \newcommand{\mL}{\mat{L}} \newcommand{\mM}{\mat{M}} \newcommand{\mN}{\mat{N}} \newcommand{\mO}{\mat{O}} \newcommand{\mP}{\mat{P}} \newcommand{\mQ}{\mat{Q}} \newcommand{\mR}{\mat{R}} \newcommand{\mS}{\mat{S}} 
\newcommand{\mT}{\mat{T}} \newcommand{\mU}{\mat{U}} \newcommand{\mV}{\mat{V}} \newcommand{\mW}{\mat{W}} \newcommand{\mX}{\mat{X}} \newcommand{\mY}{\mat{Y}} \newcommand{\mZ}{\mat{Z}} \newcommand{\mLambda}{\mat{\Lambda}} \newcommand{\mSigma}{\ensuremath{\mat{\Sigma}}} \newcommand{\mPbar}{\bar{\mP}} \newcommand{\ones}{\vec{e}} \newcommand{\va}{\vec{a}} \newcommand{\vb}{\vec{b}} \newcommand{\vc}{\vec{c}} \newcommand{\vd}{\vec{d}} \newcommand{\ve}{\vec{e}} \newcommand{\vf}{\vec{f}} \newcommand{\vg}{\vec{g}} \newcommand{\vh}{\vec{h}} \newcommand{\vi}{\vec{i}} \newcommand{\vj}{\vec{j}} \newcommand{\vk}{\vec{k}} \newcommand{\vl}{\vec{l}} \newcommand{\vm}{\vec{m}} \newcommand{\vn}{\vec{n}} \newcommand{\vo}{\vec{o}} \newcommand{\vp}{\vec{p}} \newcommand{\vq}{\vec{q}} \newcommand{\vr}{\vec{r}} \newcommand{\vs}{\vec{s}} \newcommand{\vt}{\vec{t}} \newcommand{\vu}{\vec{u}} \newcommand{\vv}{\vec{v}} \newcommand{\vw}{\vec{w}} \newcommand{\vx}{\vec{x}} \newcommand{\vy}{\vec{y}} \newcommand{\vz}{\vec{z}} \newcommand{\vpi}{\vecalt{\pi}} \newcommand{\vlambda}{\vecalt{\lambda}}$

# Extra Credit

This project consists of four parts. You may do just the first part, or the first part along with any of the other parts. It is due on Wednesday December 13 at noon.

Overall note: do not expect generous partial credit. We expect research-level work on these problems; this is not like a homework problem. The questions require insight into the problem and the methods, and we expect solutions that go above and beyond the stated requirements.

• 25 - a fully complete solution (unlikely)
• 20 - a mostly complete solution (likely)
• 13 - a "good" but incomplete solution (likely), one that failed to investigate some aspect of your method (e.g., if something looks weird and you didn't examine it, expect to lose points). A standard homework-level solution would likely fall here; we are expecting more on this project.

Do not attempt a part unless you intend to complete it fully. We are not interested in incomplete, partial, or "opportunistic" solutions or ideas.

The solutions should be self-contained and self-verifying. That is, you should present all the evidence you can that you have correctly implemented your ideas. We will not run any of your code, but you should provide it for reference and show how you verified that it is correct.

## Project part 0. Collaboration policy

There is a cost for collaboration on this project. First, you must declare collaborations by Monday December 4th at 5pm; declarations cannot be changed afterwards. Second, if you collaborate in a group of $k$ people, each member receives only $1/k$ of the total points. (So if you collaborate with one other person, you will each receive $1/2$ of the total points.) The group submits a single result.

## Project part 1. (25 points, required for subsequent parts)

The block Lanczos procedure is a generalization of the Lanczos method that operates on a block of $p$ vectors at a time. A helpful description of the method can be found in Figure 1 of the paper:

Roger G. Grimes, John G. Lewis, and Horst D. Simon. A Shifted Block Lanczos Algorithm for Solving Sparse Symmetric Generalized Eigenproblems. SIAM J. Matrix Anal. Appl. 15(1):228--272, 1994. https://doi.org/10.1137/S0895479888151111

Your goal in this part is to implement block Lanczos and develop a solver for a symmetric linear system of equations akin to CG and MINRES. This solver need not be "fully efficient" (see the next part, where that is the goal). More specifically, it may perform $O(k^3)$ and $O(nk^2)$ work and use $O(nk)$ storage once $k$ vectors have been computed.
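To fix ideas, here is a minimal NumPy sketch of one plausible block Lanczos iteration (following the structure of Figure 1 in Grimes, Lewis, and Simon, but with our own function and variable names). It ignores breakdown and deflation when a residual block is rank deficient, and it makes no attempt at the efficiency targets above; it is a reference point, not a complete solution.

```python
import numpy as np

def block_lanczos(A, V0, steps):
    """Sketch of block Lanczos for symmetric A with an n-by-p starting
    block V0 that has orthonormal columns. Returns the basis blocks and
    the block-tridiagonal coefficients. Rank deficiency of the residual
    block (deflation) is NOT handled here."""
    n, p = V0.shape
    V = [V0]                      # basis blocks V_0, V_1, ...
    Vprev = np.zeros((n, p))      # V_{-1} = 0
    Bprev = np.zeros((p, p))      # B_{-1} = 0
    Alphas, Betas = [], []
    for _ in range(steps):
        # three-term block recurrence: W = A V_j - V_{j-1} B_{j-1}^T
        W = A @ V[-1] - Vprev @ Bprev.T
        Aj = V[-1].T @ W          # p-by-p diagonal block (symmetric)
        W = W - V[-1] @ Aj
        Q, Bj = np.linalg.qr(W)   # V_{j+1} B_j = residual block
        Alphas.append(Aj)
        Betas.append(Bj)
        Vprev, Bprev = V[-1], Bj
        V.append(Q)
    return V, Alphas, Betas
```

In exact arithmetic the stacked basis $[\,V_0\ V_1\ \cdots\,]$ is orthonormal and projects $\mA$ to a block tridiagonal matrix with diagonal blocks $\mA_j$ and subdiagonal blocks $\mB_j$; checking both properties numerically is one way to provide the correctness evidence requested below.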

Compare your solver, with and without symmetric preconditioning, to CG and MINRES in terms of both relative error and residual on at least three different linear systems of equations. The key metrics are the number of matrix-vector products the methods use and the relative residual the solutions attain. You must investigate how the choice of block size $p$ affects the results. You must have at least one problem with over 10k rows and columns. (Note that I'd encourage you to seek out more systems for a more comprehensive analysis. Using only three may not be sufficient for all points; using more may also not be sufficient. The goal is to understand trade-offs. As a starting point, I would normally look at 10-15 different problems from the SuiteSparse Matrix Collection: https://sparse.tamu.edu/)
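One convenient way to count matrix-vector products when comparing against SciPy's CG and MINRES is to wrap the matrix in a `LinearOperator` whose `matvec` increments a counter. This is only a suggested measurement harness, not a required approach; the helper name is ours.

```python
import numpy as np
import scipy.sparse.linalg as spla

def counted_operator(A):
    """Wrap a matrix so every matrix-vector product is counted.
    Returns the wrapped operator and a mutable counter dict."""
    count = {"matvecs": 0}
    def mv(x):
        count["matvecs"] += 1
        return A @ x
    op = spla.LinearOperator(A.shape, matvec=mv, dtype=float)
    return op, count

# Example usage on a small SPD system:
A = np.diag(np.arange(1.0, 11.0))
b = np.ones(10)
op, count = counted_operator(A)
x, info = spla.cg(op, b)
```

The same wrapped operator can be passed to `spla.minres` and to your block solver, so all methods are charged identically per product.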

To pick the initial set of $p$ vectors, make sure to include the right-hand side $\vb$. For the remaining $p-1$ vectors, use random normal vectors. (See part 3 if you want to investigate this further.)
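Concretely, one way to form such a starting block (assuming NumPy; the function name is ours) is to put $\vb$ in the first column, append $p-1$ random normal columns, and orthonormalize with a QR factorization so the block is ready for the iteration:

```python
import numpy as np

def starting_block(b, p, rng=None):
    """Build an n-by-p orthonormal starting block whose range contains b:
    first column is b, the remaining p-1 columns are random normals."""
    rng = np.random.default_rng() if rng is None else rng
    n = b.shape[0]
    X = np.column_stack([b, rng.standard_normal((n, p - 1))])
    Q, _ = np.linalg.qr(X)
    return Q
```

Because $\vb$ spans the first column of the factored block, projecting $\vb$ onto the range of the returned $\mat{Q}$ recovers $\vb$ exactly, which is a quick sanity check on the construction.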

Note that we expect some evidence that you have correctly implemented Block Lanczos as well as your linear solver.

## Project part 2. (25 points, optional)

Implement all the recurrence relationships necessary to achieve an efficient implementation that does no more than $O(n)$ work (amortized) per matrix-vector operation. (This part can be combined with the previous part, so you don't need to develop an "inefficient" solution for part 1 if you do this part -- although I would highly recommend you do so, as the "inefficient" version can be used to verify that your "efficient" version is correct.)

Note, we haven't actually worked this out to make 100% sure it's possible, but we did enough analysis to convince ourselves it is. Your task is to complete the algebra and analysis -- if you get stuck and feel that it is impossible, write up why for possible partial credit.

Again, we expect evidence that you have achieved your goals in terms of improved runtime and efficiency. We also expect evidence that the efficient implementation produces the same results as the straightforward one. Full points will also require you to use your solver on a system with at least 100k rows and show that you can run at least 10000 total mat-vecs in a reasonable amount of time.

## Project part 3. (25 points, optional)

Try using polynomials in $\mA$ of degree up to $p-1$ applied to $\vb$ to pick the initial set of $p$ vectors, i.e., a block spanning $\vb, \mA\vb, \ldots, \mA^{p-1}\vb$. Describe the empirical relationship between the iterates for a test problem. Then prove a statement that would justify that this is a good idea or a bad idea. (Using $p = 2, 3, 4, 5$ would be quite sufficient to see the effect you should see...)
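Under the reading that the $p$ starting vectors span the Krylov space $\vb, \mA\vb, \ldots, \mA^{p-1}\vb$, a minimal NumPy sketch of the construction (function name ours; no scaling or re-orthogonalization safeguards) is:

```python
import numpy as np

def krylov_block(A, b, p):
    """Orthonormal n-by-p starting block spanning the Krylov space
    b, Ab, ..., A^{p-1} b. Monomial columns can be badly conditioned
    for large p; a single QR suffices for the small p tested here."""
    cols = [b]
    for _ in range(p - 1):
        cols.append(A @ cols[-1])
    Q, _ = np.linalg.qr(np.column_stack(cols))
    return Q
```

For the small values $p = 2, \ldots, 5$ suggested above, the monomial columns remain numerically independent on reasonable test problems, but part of the investigation is whether and why this choice interacts with the block iteration differently than random initialization.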

## Project part 4. (25 points, optional)

Evaluate different initializations of the block Lanczos vectors. For this task, we expect an evaluation of 5-7 different strategies for picking the initial set of vectors for block Lanczos -- each of which must have some intuition for why it might succeed. (So understanding the solution of part 3 could be helpful, but it is not required.) Points will be awarded for clearly articulating trade-offs, advantages, etc.