There are many common types of matrix and vector structure.
The identity matrix is written:
\[
I = \begin{bmatrix} 1 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1 \end{bmatrix}.
\]
This matrix is rarely written with its explicit dimension, as that can almost always be inferred from context. That is to say, the dimension of the identity matrix is whatever it needs to be so that the matrix equation makes sense. For clarity, we might sometimes write $I_n$ to denote the $n \times n$ identity matrix explicitly. Thus,
\[
I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.
\]
The identity matrix has the property that $IA = A$ and $AI = A$ for any matrix $A$. It's like multiplying by 1.
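As a quick sanity check of this property, here is a sketch in Python/NumPy (the class uses Matlab; the NumPy translation is my own substitution):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
I = np.eye(2)  # the 2 x 2 identity matrix

# Multiplying by the identity leaves A unchanged, just like multiplying by 1.
assert np.allclose(I @ A, A)
assert np.allclose(A @ I, A)
```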
We denote the $i$th column of the identity matrix by $e_i$:
\[
e_i = \begin{bmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix} \leftarrow \text{$i$th position}.
\]
For instance,
\[
e_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \text{ in three dimensions}.
\]
Using these vectors, we can write the $i$th column of any matrix $A$ as $A e_i$.
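To see $A e_i$ extract a column, here is a small NumPy sketch (again my substitution for the Matlab mentioned later in these notes):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# e_2, the second column of the 3 x 3 identity (index 1 in 0-based indexing)
e2 = np.eye(3)[:, 1]

# A @ e2 picks out the second column of A.
print(A @ e2)  # [2. 5. 8.]
```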
It is, perhaps, alarming that $e_i$ is frequently used without specifying its dimension. However, just like the identity matrix above, it is almost always possible to work out the dimension from context. If we believe it is helpful to specify it, we'll use $e_i^{(n)}$ for the vector of length $n$.
Finally, the vector $e$ will be used to denote the vector of all ones:
\[
e = \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}.
\]
Except sometimes it will be used as an error vector for a problem; context will make the usage clear.
Some people use the matrix $J$ to represent the matrix of all ones: $J = e e^T$. I don't think we'll need that in this class, however.
Throughout these examples, the matrix $A$ is $n \times n$, and its elements are $A_{ij}$.
Diagonal

We call a matrix diagonal if all of the non-zero entries are those where $i = j$, that is, on the "diagonal" of the matrix.
Examples

\[
D = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{bmatrix}
\]
is a diagonal matrix. The identity matrix is another diagonal matrix.
Operationally

We use the $\operatorname{diag}$ operator to extract the diagonal from a matrix. For instance, $\operatorname{diag}(A) = \begin{bmatrix} A_{11} & A_{22} & \cdots & A_{nn} \end{bmatrix}^T$. This can also be used to "create" a matrix: $D = \operatorname{diag}(d)$ has $D_{ii} = d_i$ and $D_{ij} = 0$ for $i \neq j$. So the diag operation gets a bit overloaded.
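The same overloading appears in NumPy's `diag` (and Matlab's `diag` behaves analogously): a matrix argument extracts the diagonal, a vector argument builds a diagonal matrix. A sketch:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

d = np.diag(A)  # matrix in: extracts the diagonal as a vector
D = np.diag(d)  # vector in: builds a diagonal matrix from that vector

print(d)  # [1 5 9]
```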
Triangular

We call a matrix upper triangular if all of the non-zero entries are those where $i \le j$. That is, it looks like an upper triangle. A matrix is lower triangular if all of the non-zero entries are those where $i \ge j$. A matrix is triangular if it is either upper or lower triangular.
Examples

$\begin{bmatrix} 1 & 2 \\ 0 & 3 \end{bmatrix}$ is upper triangular. $\begin{bmatrix} 1 & 0 \\ 2 & 3 \end{bmatrix}$ is lower triangular.
Operationally

The operators triu and tril are sometimes used to denote the upper and lower triangular parts of a matrix. They are implemented in Matlab, which is useful!
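NumPy provides the same pair of operators under the same names, so a sketch of their behavior (my NumPy translation of the Matlab functions):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

U = np.triu(A)  # upper triangular part: zeros out entries below the diagonal
L = np.tril(A)  # lower triangular part: zeros out entries above the diagonal

# The two parts overlap only on the diagonal, so U + L double-counts it.
assert np.allclose(U + L - np.diag(np.diag(A)), A)
```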
Symmetric

We call a matrix symmetric if $A = A^T$, or equivalently, $A_{ij} = A_{ji}$.
Examples

$\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$ is symmetric.
Notes
The generalization of symmetric to complex-valued matrices is called Hermitian. A matrix is Hermitian if $A = A^*$ for the complex adjoint or Hermitian operator ${}^*$, that is, $A_{ij} = \bar{A}_{ji}$ for the complex conjugate "bar".
Any real-valued matrix $A$ can be written as: $A = S + K$, where $S$ is symmetric and $K$ is skew-symmetric. A matrix is skew-symmetric if $K = -K^T$. In this case, $S = \tfrac{1}{2}(A + A^T)$ and $K = \tfrac{1}{2}(A - A^T)$. If $A$ is symmetric, then $S = A$ and $K = 0$.
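The decomposition above is easy to check numerically; a NumPy sketch:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

S = (A + A.T) / 2  # symmetric part
K = (A - A.T) / 2  # skew-symmetric part

assert np.allclose(S, S.T)    # S is symmetric
assert np.allclose(K, -K.T)   # K is skew-symmetric
assert np.allclose(S + K, A)  # together they recover A
```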
Orthogonal

A matrix $Q$ is orthogonal if $Q^T Q = I$. We'll get into why orthogonal matrices are special soon.
Examples

The identity matrix is orthogonal. We'll see more about orthogonal matrices soon -- it's a very special structure!
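A rotation matrix is another classic orthogonal matrix; a NumPy sketch verifying $Q^T Q = I$ (my own example, not from the notes):

```python
import numpy as np

# A 2 x 2 rotation by an arbitrary angle theta.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))  # Q^T Q = I, so Q is orthogonal
```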
Permutation

A permutation matrix "shuffles" the elements of a vector. Each column of a permutation matrix is a vector $e_i$ (a column of the identity matrix), and a permutation matrix must also be orthogonal.
Examples

\[
P = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix}.
\]
This matrix expresses the permutation that sends $(x_1, x_2, x_3)$ to $(x_2, x_3, x_1)$. We can see this by:
\[
P \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_2 \\ x_3 \\ x_1 \end{bmatrix}.
\]
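A NumPy sketch of a permutation matrix shuffling a vector, and a check that it is orthogonal (my illustrative $P$, not necessarily the one from lecture):

```python
import numpy as np

# Each row and column of P is a column of the identity matrix.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])

x = np.array([10, 20, 30])
print(P @ x)  # [20 30 10]: the entries get shuffled

assert np.allclose(P.T @ P, np.eye(3))  # permutation matrices are orthogonal
```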
Sparse matrices are those where the vast majority of the elements in the matrix are zero. For instance:
\[
A = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 2 \\ 0 & 0 & 0 & 0 \end{bmatrix}.
\]
Of the $16$ entries in the matrix, there are only two non-zeros. Later in the class, we'll see how to take advantage of this structure.
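As a preview of the idea, one common trick is to store only the non-zero entries as (row, column, value) triplets -- the "coordinate" form. A NumPy sketch on a hypothetical $4 \times 4$ matrix with two non-zeros (the example matrix is my own):

```python
import numpy as np

A = np.array([[0, 0, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 2],
              [0, 0, 0, 0]])

# Keep just the (row, col, value) triplets for the non-zero entries.
rows, cols = np.nonzero(A)
vals = A[rows, cols]
triplets = list(zip(rows.tolist(), cols.tolist(), vals.tolist()))

print(triplets)  # [(1, 0, 1), (2, 3, 2)] -- 2 triplets instead of 16 entries
```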