
$\require{cancel} \newcommand{\Ket}[1]{\left|{#1}\right\rangle} \newcommand{\Bra}[1]{\left\langle{#1}\right|} \newcommand{\Braket}[1]{\left\langle{#1}\right\rangle} \newcommand{\Rsr}[1]{\frac{1}{\sqrt{#1}}} \newcommand{\RSR}[1]{1/\sqrt{#1}} \newcommand{\Verti}{\rvert} \newcommand{\HAT}[1]{\hat{\,#1~}} \DeclareMathOperator{\Tr}{Tr}$

Eigendecomposition

First created in February 2019

In this context, the type of a matrix is implied by its symbol:

  • $N$ is a normal matrix ($NN^\dagger=N^\dagger N$).

  • $U$ is a unitary matrix ($U^\dagger=U^{-1}$, so $UU^\dagger=U^\dagger U=I$; a unitary matrix is therefore also normal).

  • $\Lambda$ is a diagonal matrix with the eigenvalues of the subject matrix on its diagonal ($\Lambda_{ii}=\lambda_i$).

When we say "basis $B$", we mean the basis formed by the column vectors of $B$.


A useful representation:

$Q=\big[\Ket{q_1}~\Ket{q_2}~\ldots~\Ket{q_n}\big],~~ Q^\dagger =\begin{bmatrix}\Bra{q_1}\\\vdots\\\Bra{q_n}\end{bmatrix} ,~~ QQ^\dagger =\sum_k\Ket{q_k}\Bra{q_k} ,~~ Q^\dagger Q =\big[\Braket{q_i\Verti q_j}\big] .$
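These identities can be checked numerically. The sketch below (assuming NumPy is available; the random matrix is only illustrative) builds $QQ^\dagger$ as a sum of outer products and $Q^\dagger Q$ as the Gram matrix of inner products:

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

cols = [Q[:, k] for k in range(3)]                    # the kets |q_k>
sum_outer = sum(np.outer(q, q.conj()) for q in cols)  # sum_k |q_k><q_k|
gram = np.array([[np.vdot(qi, qj) for qj in cols] for qi in cols])  # [<q_i|q_j>]

assert np.allclose(Q @ Q.conj().T, sum_outer)
assert np.allclose(Q.conj().T @ Q, gram)
```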

Unitary


If the $\Ket{q_i}$s are orthonormal, i.e. $\Braket{q_i\Verti q_j}=\delta_{ij}$, projecting $\Ket\psi$ onto each $\Ket{q_i}$ gives the vector in the $Q$ basis: $\Ket{\psi'} =\begin{bmatrix} \Braket{q_1\Verti\psi}\\ \Braket{q_2\Verti\psi}\\ \vdots\\ \Braket{q_n\Verti\psi}\\ \end{bmatrix} =Q^\dagger\Ket\psi ,$

which is the same vector as seen from $Q$, i.e. how much of $\Ket\psi$ each $\Ket{q_i}$ carries. This is useful in change of basis.


$Q\Ket{\psi'}=QQ^\dagger\Ket\psi=\Ket\psi$ gives back the original vector.

It follows that $QQ^\dagger=I,~~Q^\dagger=Q^{-1}$, i.e. $Q$ is unitary (without involving eigenvectors).

Conversely, $I=Q^\dagger Q =\begin{bmatrix}\Bra{q_1}\\\vdots\\\Bra{q_n}\end{bmatrix}\big[\Ket{q_1}~\Ket{q_2}~\ldots~\Ket{q_n}\big] =\big[\Braket{q_i\Verti q_j}\big]~,$ and $\Ket{q_i}$s are orthonormal.
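As a numerical illustration (assuming NumPy; the QR factorisation is just a convenient way to obtain orthonormal columns), the coordinates $\Ket{\psi'}=Q^\dagger\Ket\psi$ recover $\Ket\psi$ via $Q\Ket{\psi'}$:

```python
import numpy as np

rng = np.random.default_rng(1)
# QR factorisation yields a matrix Q with orthonormal columns.
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi_prime = Q.conj().T @ psi                   # components <q_i|psi> in the Q basis
assert np.allclose(Q @ psi_prime, psi)         # Q Q† = I recovers |psi>
assert np.allclose(Q.conj().T @ Q, np.eye(4))  # orthonormal columns <=> Q unitary
```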


For a unitary matrix $U$, since $\left(U^\dagger\right)^\dagger=\left(U^{-1}\right)^\dagger=\left(U^\dagger\right)^{-1}$, $U^\dagger$ is also unitary, and therefore the rows of $U$ (the duals of the columns of $U^\dagger$) are orthonormal as well. So the following are equivalent:

  • $U$ is unitary.

  • Columns of $U$ are orthonormal.

  • Rows of $U$ are orthonormal.

  • $U$ is an isometry (it preserves length).

More can be found in http://www.math.tamu.edu/~dallen/m640_03c/lectures/chapter4.pdf (p.159).


$U=\big[\Ket{u_1}~\Ket{u_2}~\ldots~\Ket{u_n}\big],~~ U\Ket\psi =\sum_i\psi_i\Ket{u_i},~~ \lVert U\Ket\psi\rVert^2 =(\Bra\psi U^\dagger)(U\Ket\psi) =\Braket{\psi\Verti\psi} =\lVert\Ket\psi\rVert^2 .$

A unitary operation preserves the vector's length (isometry).

Conversely, if $\lVert U\Ket\psi\rVert^2 =\lVert\Ket\psi\rVert^2$ for every $\Ket\psi$, then $\Bra\psi U^\dagger U\Ket\psi =\Bra\psi I\Ket\psi .$

$0 =\Bra\psi U^\dagger U\Ket\psi-\Bra\psi I\Ket\psi =\Bra\psi\left(U^\dagger U-I\right)\Ket\psi$ for every $\Ket\psi$. Since $U^\dagger U-I$ is Hermitian, this forces $U^\dagger U=I$.
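A quick numerical check of the isometry property (assuming NumPy; the random unitary is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
# A random unitary from a QR factorisation.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

# ||U|psi>|| == |||psi>||  for a unitary U (isometry).
assert np.isclose(np.linalg.norm(U @ psi), np.linalg.norm(psi))
```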


Eigenvalues of a unitary matrix are of unit length (and therefore $\lvert\det(U)\rvert=1$).

Proof: If $U\Ket\psi=\lambda\Ket\psi$, as $\lVert U\Ket\psi\rVert =\lVert\lambda\Ket\psi\rVert =\lvert\lambda\rvert\cdot\lVert\Ket\psi\rVert =\lVert\Ket\psi\rVert,~~ \lvert\lambda\rvert=1 .$
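Numerically (assuming NumPy), the eigenvalues of a random unitary indeed lie on the unit circle:

```python
import numpy as np

rng = np.random.default_rng(3)
U, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

lam = np.linalg.eigvals(U)
assert np.allclose(np.abs(lam), 1.0)           # every eigenvalue has |lambda| = 1
assert np.isclose(abs(np.linalg.det(U)), 1.0)  # hence |det(U)| = 1
```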


Eigenvectors of a unitary matrix are orthogonal.

Proof:

A unitary matrix is invertible, so all its eigenvalues are non-zero (full rank). For an eigenvalue of multiplicity $m$, you can find $m$ orthogonal eigenvectors in the $m$-dimensional eigenspace.

For two distinct eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\Ket{v_1}$ and $\Ket{v_2}$ respectively,

$\Braket{v_1\Verti v_2} =\Braket{v_1\Verti U^\dagger U\Verti v_2} =\left(\Bra{v_1}U^\dagger\right)\left(U\Ket{v_2}\right) =\lambda_1^*\lambda_2\Braket{v_1\Verti v_2} ,~~ \text{so } \lambda_1^*\lambda_2=1\text{ or } \Braket{v_1\Verti v_2}=0 .$

If $\lambda_1^*\lambda_2=1$, then since $\lambda_1^*\lambda_1=\lvert\lambda_1\rvert^2=1$ and $\lambda_1^*\ne 0$, we would have $\lambda_1=\lambda_2$, contrary to our assumption. Therefore $\Braket{v_1\Verti v_2}=0$: $\Ket{v_1}$ and $\Ket{v_2}$ are orthogonal.
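A numerical check (assuming NumPy): a random unitary has distinct eigenvalues, so its computed eigenvectors come out orthonormal:

```python
import numpy as np

rng = np.random.default_rng(4)
U, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

lam, V = np.linalg.eig(U)  # columns of V are (normalised) eigenvectors
gram = V.conj().T @ V      # [<v_i|v_j>]
# eigenvalues are distinct here, so the eigenvectors are orthonormal
assert np.allclose(gram, np.eye(4), atol=1e-8)
```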


For a unitary $U$, let its eigenvalues be $\{\lambda_1,\lambda_2,\ldots,\lambda_n\}$ and its normalised eigenvectors be $\{\Ket{v_1},\Ket{v_2},\ldots,\Ket{v_n}\}$, which are orthonormal.

We have $\displaystyle U=\sum_{k=1}^n\lambda_k\Ket{v_k}\Bra{v_k}.$ (eigendecomposition)

Proof:

Given that the $\Ket{v_k}$ are orthonormal, for an arbitrary vector $\Ket\psi$, $\left(\sum_{k=1}^n\Ket{v_k}\Bra{v_k}\right)\Ket\psi =\sum_{k=1}^n\Braket{v_k\Verti\psi}\Ket{v_k} =\Ket\psi.~~ \therefore \sum_{k=1}^n\Ket{v_k}\Bra{v_k}=I .$

Let $U\Ket{v_k}=\lambda_k\Ket{v_k},$ where $k\in[1,n].~ U =U\sum_{k=1}^n\Ket{v_k}\Bra{v_k} =\sum_{k=1}^nU\Ket{v_k}\Bra{v_k} =\sum_{k=1}^n\lambda_k\Ket{v_k}\Bra{v_k} .$
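The decomposition can be verified numerically (assuming NumPy): rebuilding $U$ from its eigenpairs as $\sum_k\lambda_k\Ket{v_k}\Bra{v_k}$ reproduces the original matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

lam, V = np.linalg.eig(U)
# sum_k lambda_k |v_k><v_k|
U_rebuilt = sum(l * np.outer(V[:, k], V[:, k].conj()) for k, l in enumerate(lam))
assert np.allclose(U_rebuilt, U)
```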


Example: $H=\begin{bmatrix}\Rsr2&\Rsr2\\\Rsr2&-\Rsr2\end{bmatrix}$ with eigenvector $\Ket{\psi_+}=\sqrt{1-\Rsr2}\left(\Ket0+\Rsr2(\Ket0+\Ket1)\right), \lambda_+=1$ and $\Ket{\psi_-}=\sqrt{1-\Rsr2}\left(\Ket1-\Rsr2(\Ket0-\Ket1)\right), \lambda_-=-1$.

$\lambda_+\Ket{\psi_+}\Bra{\psi_+}+\lambda_-\Ket{\psi_-}\Bra{\psi_-} =(1-\Rsr2)(\Ket0+\Ket+)(\Bra0+\Bra+)-(1-\Rsr2)(\Ket1-\Ket-)(\Bra1-\Bra-)$

$=(1-\Rsr2)\left(\begin{bmatrix}1+\Rsr2\\\Rsr2\end{bmatrix}\begin{bmatrix}1+\Rsr2&\Rsr2\end{bmatrix} -\begin{bmatrix}-\Rsr2\\1+\Rsr2\end{bmatrix}\begin{bmatrix}-\Rsr2&1+\Rsr2\end{bmatrix}\right) =(1-\Rsr2)\left(\begin{bmatrix}\frac{3}{2}+\sqrt2&\frac{1}{2}+\Rsr2\\\frac{1}{2}+\Rsr2&\frac{1}{2}\end{bmatrix} -\begin{bmatrix}\frac{1}{2}&-\frac{1}{2}-\Rsr2\\-\frac{1}{2}-\Rsr2&\frac{3}{2}+\sqrt2\end{bmatrix}\right)$

$=(1-\Rsr2)\begin{bmatrix}1+\sqrt2&1+\sqrt2\\1+\sqrt2&-1-\sqrt2\end{bmatrix} =\Rsr2\begin{bmatrix}1&1\\1&-1\end{bmatrix} =H.$
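The same example, checked numerically (assuming NumPy). The eigenvectors here are real, so the outer products need no conjugation:

```python
import numpy as np

s = 1 / np.sqrt(2)
H = np.array([[s, s], [s, -s]])

# eigenvectors as given above, with normalisation sqrt(1 - 1/sqrt(2))
n = np.sqrt(1 - s)
psi_p = n * np.array([1 + s, s])   # |0> + |+>, normalised;  lambda_+ = +1
psi_m = n * np.array([-s, 1 + s])  # |1> - |->, normalised;  lambda_- = -1

assert np.isclose(np.linalg.norm(psi_p), 1.0)
assert np.allclose(H @ psi_p, psi_p)
assert np.allclose(H @ psi_m, -psi_m)
# lambda_+ |psi_+><psi_+| + lambda_- |psi_-><psi_-| = H
assert np.allclose(np.outer(psi_p, psi_p) - np.outer(psi_m, psi_m), H)
```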

Normality

If $NN^\dagger=N^\dagger N$, then $N$ is a normal matrix.

(Unitary matrix $U$ is a special type of normal matrices with $UU^\dagger=U^\dagger U=I$.)

Let $N=\big[\Ket{n_1}~\Ket{n_2}~\ldots~\Ket{n_m}\big]\text{ and } N^\dagger =\begin{bmatrix}\Bra{n_1}\\\vdots\\\Bra{n_m}\end{bmatrix} ,~~ \text{we have } NN^\dagger =\sum_k\Ket{n_k}\Bra{n_k} ,~~ N^\dagger N =\big[\Braket{n_i\Verti n_j}\big] .$


For a normal matrix $N$, the following are true:

  • There is a unitary $U$ and a diagonal $D$ such that $N=UDU^{-1}$ ($N$ is unitarily diagonalisable).

  • The eigenvectors of $N$ can be chosen orthogonal (and they span the space).

  • $\sum_{i,j}\lvert N_{ij}\rvert^2=\sum_k\lvert\lambda_k\rvert^2.$

More can be found in http://www.math.tamu.edu/~dallen/m640_03c/lectures/chapter6.pdf (p.198).
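These properties can be spot-checked numerically (assuming NumPy; a Hermitian matrix is used as a convenient concrete example of a normal matrix):

```python
import numpy as np

rng = np.random.default_rng(6)
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
N = B + B.conj().T                                  # Hermitian, hence normal
assert np.allclose(N @ N.conj().T, N.conj().T @ N)  # normality

lam = np.linalg.eigvals(N)
# Frobenius identity: sum_{ij} |N_ij|^2 = sum_k |lambda_k|^2
assert np.isclose(np.sum(np.abs(N) ** 2), np.sum(np.abs(lam) ** 2))
```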



WORKING



$NN^\dagger =\sum_k\Ket{n_k}\Bra{n_k} =\big[\textstyle\sum_k(n_k)_i(n_k)_j^*\big],~~ N^\dagger N =\big[\Braket{n_i\Verti n_j}\big] =\big[\textstyle\sum_k(n_i)_k^*(n_j)_k\big] .$


Are these equivalent?

  • $QQ^\dagger=Q^\dagger Q.$ (i.e. $Q$ is normal.)

  • $\big|\Ket{q_i}\big|=1.$

  • If $A$ is normal, $B=UAU^\dagger$ is normal too.

Eigenvalues


Are these equivalent?

  • $\Ket{q_i}$s are orthonormal.

  • $QQ^\dagger=Q^\dagger Q=I.$ (i.e. $Q$ is unitary.)

  • $A=Q\Lambda Q^\dagger.$

  • $\sigma(Q)\subseteq\{z:\lvert z\rvert=1\}.$ (i.e. the eigenvalues of $Q$ are of unit length, and $\lvert\det(Q)\rvert=1.$)


$\boxed{A=U\Lambda U^{-1}=U\Lambda U^\dagger}~$ where $\Lambda$ is the diagonal matrix with the eigenvalues of $A$ on its diagonal, and the corresponding orthonormal eigenvectors are the columns of the unitary $U$. (This form requires orthonormal eigenvectors; the general form $A=Q\Lambda Q^{-1}$ derived next only requires independent ones.)

Let us start with $Q\Lambda=\big[\lambda_1\Ket{q_1}~\lambda_2\Ket{q_2}~\ldots~\lambda_n\Ket{q_n}\big]$.

$AQ =\big[A\Ket{q_1}~A\Ket{q_2}~\ldots~A\Ket{q_n}\big] =\big[\lambda_1\Ket{q_1}~\lambda_2\Ket{q_2}~\ldots~\lambda_n\Ket{q_n}\big] =Q\Lambda.$

If $A$ has $n$ linearly independent eigenvectors (eigenvalues may repeat), we can use them as the columns of $Q$ to make $Q$ invertible.

$\therefore~\boxed{A=Q\Lambda Q^{-1}}$, under the following conditions:

  • $A$ has $n$ linearly independent eigenvectors (eigenvalues can be degenerate), so that $Q$ is invertible.

  • $\Lambda$ is the diagonal matrix with the eigenvalues of $A$ on its diagonal.

  • The columns of $Q$ are eigenvectors of $A$, in the same order as the eigenvalues in $\Lambda$.
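A numerical illustration (assuming NumPy) of both sides of the condition: a diagonalisable matrix with non-orthogonal eigenvectors satisfies $A=Q\Lambda Q^{-1}$, while a Jordan block shows that non-zero eigenvalues alone are not enough:

```python
import numpy as np

# A diagonalisable (non-normal) matrix: A = Q Lambda Q^{-1} with
# linearly independent but non-orthogonal eigenvectors.
A = np.array([[2.0, 1.0], [0.0, 3.0]])
lam, Q = np.linalg.eig(A)
assert np.allclose(Q @ np.diag(lam) @ np.linalg.inv(Q), A)

# Caveat: J below has the single non-zero eigenvalue 1 but only one
# independent eigenvector, so no invertible eigenvector matrix exists.
J = np.array([[1.0, 1.0], [0.0, 1.0]])
lamJ, QJ = np.linalg.eig(J)
assert np.isclose(abs(np.linalg.det(QJ)), 0.0, atol=1e-6)
```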


$A^{-1} =(Q\Lambda Q^{-1})^{-1} =Q\Lambda^{-1}Q^{-1}$ (when no eigenvalue is zero).

$A^n =(Q\Lambda Q^{-1})^n =(Q\Lambda Q^{-1})(Q\Lambda Q^{-1})\ldots(Q\Lambda Q^{-1}) =Q\Lambda^nQ^{-1} .$

For a polynomial function $f(x)=\sum_{i=0}^na_ix^i,~~ f(A) =\sum_{i=0}^na_iA^i =\sum_{i=0}^na_i(Q\Lambda^iQ^{-1}) =Q\left(\sum_{i=0}^na_i\Lambda^i\right)Q^{-1} =Q~f(\Lambda)~Q^{-1} .$
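Numerically (assuming NumPy; the symmetric matrix and the polynomial are arbitrary examples), $f(A)=Q\,f(\Lambda)\,Q^{-1}$, and the same pattern covers powers and the inverse:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # symmetric, hence diagonalisable
lam, Q = np.linalg.eig(A)
Qinv = np.linalg.inv(Q)

def f(x):
    return 1 + 2 * x + x ** 2           # f(x) = 1 + 2x + x^2 (elementwise on lam)

fA = np.eye(2) + 2 * A + A @ A          # f(A) computed directly
assert np.allclose(Q @ np.diag(f(lam)) @ Qinv, fA)

# powers and the inverse follow the same pattern
assert np.allclose(Q @ np.diag(lam ** 3) @ Qinv, A @ A @ A)
assert np.allclose(Q @ np.diag(1 / lam) @ Qinv, np.linalg.inv(A))
```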


Given $Q=\big[\Ket{q_1}~\Ket{q_2}~\ldots~\Ket{q_n}\big]=\sum_i\Ket{q_i}\Bra{I_i},~~ Q^\dagger =\begin{bmatrix}\Bra{q_1}\\\vdots\\\Bra{q_n}\end{bmatrix} =\sum_i\Ket{I_i}\Bra{q_i} ,$

$QQ^\dagger =\sum_{i,j}\Ket{q_i}\Braket{I_i\Verti I_j}\Bra{q_j}=\sum_i\Ket{q_i}\Bra{q_i}.~~ Q^\dagger Q =\sum_{i,j}\Ket{I_i}\Braket{q_i\Verti q_j}\Bra{I_j} =\sum_{i,j}\Braket{q_i\Verti q_j}\Ket{I_i}\Bra{I_j} =\big[\Braket{q_i\Verti q_j}\big] .$

This form can be read as a change of basis: $A\Ket\psi=Q\Lambda Q^{-1}\Ket\psi$ applies $Q^{-1}$ to $\Ket\psi$, multiplies by the eigenvalue matrix $\Lambda$, then changes back to the original basis.

If the $\Ket{q_i}$s are orthonormal, $Q^\dagger Q =\begin{bmatrix}\Bra{q_1}\\\vdots\\\Bra{q_n}\end{bmatrix} \big[\Ket{q_1}~\Ket{q_2}~\ldots~\Ket{q_n}\big] =\begin{bmatrix} |q_1|^2&0&\ldots&0\\ 0&|q_2|^2&\ldots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\ldots&|q_n|^2 \end{bmatrix} =I =QQ^\dagger ,$ since each $|q_i|=1$ and $Q$ is square.

Normality

If $A$ is similar to a diagonal matrix $D$, we have $A=PDP^{-1}$.

If $P$ is unitary, $P^\dagger=P^{-1},~A^\dagger =(PDP^{-1})^\dagger =(P^{-1})^\dagger D^\dagger P^\dagger =PD^\dagger P^{-1} .$

So $AA^\dagger =(PDP^{-1})(PD^\dagger P^{-1}) =PDD^\dagger P^{-1} =PD^\dagger DP^{-1} =(PD^\dagger P^{-1})(PDP^{-1}) =A^\dagger A .~~$ (Diagonal matrices commute.)

When $AA^\dagger=A^\dagger A$, we say that $A$ is a normal matrix.
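A numerical check of this direction (assuming NumPy): any $A=PDP^{-1}$ with unitary $P$ and diagonal $D$ commutes with its adjoint:

```python
import numpy as np

rng = np.random.default_rng(7)
P, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
D = np.diag(rng.normal(size=3) + 1j * rng.normal(size=3))  # any diagonal D

A = P @ D @ P.conj().T                              # P unitary => P^{-1} = P†
assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A is normal
```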


Spectral Theorem: $A$ is normal if and only if there exists a unitary matrix $U$ and a diagonal matrix $D$ such that $A=UDU^\dagger$.

Proof: One direction was proven above ($A=UDU^\dagger$ with $U$ unitary implies $A$ normal). It remains to show that if $A$ is normal, then $A=UDU^\dagger$.

 
