$\require{cancel} \newcommand{\Ket}[1]{\left|{#1}\right\rangle} \newcommand{\Bra}[1]{\left\langle{#1}\right|} \newcommand{\Braket}[1]{\left\langle{#1}\right\rangle} \newcommand{\Rsr}[1]{\frac{1}{\sqrt{#1}}} \newcommand{\RSR}[1]{1/\sqrt{#1}} \newcommand{\Verti}{\rvert} \newcommand{\HAT}[1]{\hat{\,#1~}} \DeclareMathOperator{\Tr}{Tr}$
First created in May 2020
For a unitary matrix $U$, since $\left(U^\dagger\right)^\dagger=\left(U^{-1}\right)^\dagger=\left(U^\dagger\right)^{-1}$, $U^\dagger$ is also unitary, and therefore the rows of $U$ (duals of the columns of $U^\dagger$) are orthonormal as well.
The following statements are equivalent:
$U$ is unitary.
Columns of $U$ are orthonormal.
Rows of $U$ are orthonormal.
$U$ is an isometry (it preserves length).
More can be found in http://www.math.tamu.edu/~dallen/m640_03c/lectures/chapter4.pdf (p.159).
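These equivalent properties are easy to check numerically; a minimal NumPy sketch, using the Hadamard matrix as an assumed example unitary:

```python
import numpy as np

# Hadamard matrix as a concrete example unitary (an assumed choice).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# U is unitary: U^dagger U = I, hence the columns are orthonormal.
assert np.allclose(H.conj().T @ H, np.eye(2))
# U U^dagger = I as well, hence the rows are also orthonormal.
assert np.allclose(H @ H.conj().T, np.eye(2))

# U is an isometry: it preserves the length of any vector.
v = np.array([0.6, 0.8j])
assert np.isclose(np.linalg.norm(H @ v), np.linalg.norm(v))
```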
The eigenvalues and eigenvectors of a unitary matrix have the following properties:
Eigenvalues of a unitary matrix are of unit length, in the form of $e^{i\theta}$ where $\theta\in[0,2\pi).$
Eigenvectors of a unitary matrix corresponding to distinct eigenvalues are orthogonal, and an orthonormal eigenbasis can always be chosen, i.e. for normalised $\Ket{\lambda_k},~$ $\Braket{\lambda_i|\lambda_j}=\delta_{ij}$.
Note: See Eigendecomposition for detailed proof if interested.
Given the eigendecomposition of an observable $A=\sum_k\lambda_k\Ket k\Bra k,$ the expectation value of an $A$ measurement on $\Ket\psi$ is $\sum_k\lambda_k p_k,$ where $p_k$ is the probability of measuring $\lambda_k$.
Since $p_k=|\Braket{\psi|k}|^2=\Braket{\psi|k}\Braket{k|\psi},$ the expectation value can be written as $\sum_k\lambda_k\Braket{\psi|k}\Braket{k|\psi} =\Bra{\psi}\left(\sum_k\lambda_k\Ket k\Bra k\right)\Ket\psi =\Braket{\psi|A|\psi} .$
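The identity $\sum_k\lambda_k p_k=\Braket{\psi|A|\psi}$ can be checked numerically; a small NumPy sketch, with $A=Z$ and an arbitrary assumed state:

```python
import numpy as np

A = np.array([[1, 0], [0, -1]], dtype=complex)   # observable Z
psi = np.array([0.6, 0.8j])                      # assumed normalised state

# Eigendecomposition route: sum_k lambda_k * p_k with p_k = |<k|psi>|^2.
evals, evecs = np.linalg.eigh(A)                 # columns of evecs are |k>
p = np.abs(evecs.conj().T @ psi) ** 2
expect_sum = np.sum(evals * p)

# Compact route: <psi|A|psi>.
expect_braket = (psi.conj() @ A @ psi).real
assert np.isclose(expect_sum, expect_braket)
```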
e.g.
$\Ket0\Bra0+\Ket1\Bra1=I.$
$\Ket+\Bra+-\Ket-\Bra- =\frac{1}{2}(\Ket0\Bra0+\Ket0\Bra1+\Ket1\Bra0+\Ket1\Bra1) -\frac{1}{2}(\Ket0\Bra0-\Ket0\Bra1-\Ket1\Bra0+\Ket1\Bra1) =\Ket0\Bra1+\Ket1\Bra0 =X.$
$\Rsr2(\Ket0+i\Ket1)\Rsr2(\Bra0-i\Bra1)-(\Rsr2(\Ket0-i\Ket1))(\Rsr2(\Bra0+i\Bra1))\\ \qquad=\frac{1}{2}(\Ket0\Bra0-i\Ket0\Bra1+i\Ket1\Bra0+\Ket1\Bra1) -\frac{1}{2}(\Ket0\Bra0+i\Ket0\Bra1-i\Ket1\Bra0+\Ket1\Bra1) =-i\Ket0\Bra1+i\Ket1\Bra0 =Y.$
$\Ket0\Bra0-\Ket1\Bra1=Z.$
In Bloch sphere notation, $\Ket\psi=\alpha\Ket0+\beta\Ket1,$ where $\alpha=\cos(\theta/2),~\beta=e^{i\phi}\sin(\theta/2).$
$\Braket{\psi|I|\psi}=1.$
$\Braket{\psi|X|\psi} =(\bar\alpha\Bra0+\bar\beta\Bra1)(\alpha\Ket1+\beta\Ket0) =\bar\alpha\beta+\alpha\bar\beta =\cos(\theta/2)\sin(\theta/2)\left(e^{i\phi}+e^{-i\phi}\right) =\sin(\theta)\cos(\phi).$
$\Braket{\psi|Y|\psi} =(\bar\alpha\Bra0+\bar\beta\Bra1)(i\alpha\Ket1-i\beta\Ket0) =-i\bar\alpha\beta+i\alpha\bar\beta =-i\cos(\theta/2)\sin(\theta/2)\left(e^{i\phi}-e^{-i\phi}\right) =\sin(\theta)\sin(\phi).$
$\Braket{\psi|Z|\psi} =(\bar\alpha\Bra0+\bar\beta\Bra1)(\alpha\Ket0-\beta\Ket1) =|\alpha|^2-|\beta|^2 =\cos^2(\theta/2)-\sin^2(\theta/2) =\cos\theta.$
$\Braket{\psi|H|\psi} =\Rsr2(\bar\alpha\Bra0+\bar\beta\Bra1)(\alpha\Ket0+\alpha\Ket1+\beta\Ket0-\beta\Ket1) =\Rsr2\left(\bar\alpha(\alpha+\beta)+\bar\beta(\alpha-\beta)\right) =\Rsr2\left(|\alpha|^2-|\beta|^2+\bar\alpha\beta+\alpha\bar\beta\right) =\Rsr2(\cos\theta+\sin\theta\cos\phi) .$
In each case, the expectation value is the projection of the Bloch vector of the state onto the rotation axis of the operator.
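These four expectation values can be verified numerically; a NumPy sketch with arbitrary assumed Bloch angles:

```python
import numpy as np

theta, phi = 1.1, 0.7          # arbitrary Bloch angles (assumed values)
alpha = np.cos(theta / 2)
beta = np.exp(1j * phi) * np.sin(theta / 2)
psi = np.array([alpha, beta])  # |psi> = alpha|0> + beta|1>

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def expval(A):
    """Expectation value <psi|A|psi> (real for Hermitian A)."""
    return (psi.conj() @ A @ psi).real

assert np.isclose(expval(X), np.sin(theta) * np.cos(phi))
assert np.isclose(expval(Y), np.sin(theta) * np.sin(phi))
assert np.isclose(expval(Z), np.cos(theta))
assert np.isclose(expval(H), (np.cos(theta) + np.sin(theta) * np.cos(phi)) / np.sqrt(2))
```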
Observe these relationships:
$I\Ket\psi=\Ket\psi.$
$X\Rsr2(\Ket0+\Ket1)=\Rsr2(\Ket1+\Ket0)=\Rsr2(\Ket0+\Ket1).$ Eigenvalue is $+1$.
$X\Rsr2(\Ket0-\Ket1)=\Rsr2(\Ket1-\Ket0)=-\Rsr2(\Ket0-\Ket1).$ Eigenvalue is $-1$.
$Y\Rsr2(\Ket0+i\Ket1)=\Rsr2(i\Ket1+i(-i)\Ket0)=\Rsr2(\Ket0+i\Ket1).$ Eigenvalue is $+1$.
$Y\Rsr2(\Ket0-i\Ket1)=\Rsr2(i\Ket1-i(-i)\Ket0)=-\Rsr2(\Ket0-i\Ket1).$ Eigenvalue is $-1$.
$Z\Ket0=\Ket0.$ Eigenvalue is $+1$.
$Z\Ket1=-\Ket1.$ Eigenvalue is $-1$.
$H\sqrt{1-\Rsr2}\left(\Ket0+\Rsr2(\Ket0+\Ket1)\right) =\sqrt{1-\Rsr2}\left(\Rsr2(\Ket0+\Ket1)+\Ket0\right) .$ Eigenvalue is $+1$.
$H\sqrt{1-\Rsr2}\left(\Ket1-\Rsr2(\Ket0-\Ket1)\right) =\sqrt{1-\Rsr2}\left(\Rsr2(\Ket0-\Ket1)-\Ket1\right) .$ Eigenvalue is $-1$.
Note:
For $\Ket0+\Ket+=(1+\Rsr2)\Ket0+\Rsr2\Ket1$, norm is $\sqrt{(1+\Rsr2)^2+\frac{1}{2}}=\sqrt{2+\sqrt2}$. Normalisation factor is $\frac{1}{\sqrt{2+\sqrt2}}=\frac{\sqrt{2-\sqrt2}}{\sqrt{4-2}}=\sqrt{1-\Rsr2}$.
So for $H$, the eigenvector corresponding to eigenvalue $+1$ is $\sqrt{1-\Rsr2}\Big(\Ket0+\Ket+\Big)$.
For $\Ket1-\Ket-=-\Rsr2\Ket0+(1+\Rsr2)\Ket1$, it has the same norm as above.
So for $H$, the eigenvector corresponding to eigenvalue $-1$ is $\sqrt{1-\Rsr2}\Big(\Ket1-\Ket-\Big)$.
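As a numerical sanity check of the normalisation factor and the two $H$ eigenvalue equations above (NumPy sketch):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
n = np.sqrt(1 - 1 / np.sqrt(2))      # normalisation factor derived above

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus, minus = (ket0 + ket1) / np.sqrt(2), (ket0 - ket1) / np.sqrt(2)

v_plus = n * (ket0 + plus)           # eigenvalue +1
v_minus = n * (ket1 - minus)         # eigenvalue -1

assert np.isclose(np.linalg.norm(v_plus), 1)
assert np.isclose(np.linalg.norm(v_minus), 1)
assert np.allclose(H @ v_plus, v_plus)
assert np.allclose(H @ v_minus, -v_minus)
```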
From the above, the "head" of a Pauli axis is the eigenvector of the Pauli operator with eigenvalue $+1$, and the "tail" of it is its eigenvector with eigenvalue $-1$.
Generally speaking, if $\Ket\psi$ is an eigenstate of $A$ with eigenvalue $\lambda$, then $B\Ket\psi$ is an eigenstate of $BAB^{-1}$ with the same eigenvalue.
Proof: $A\Ket\psi=\lambda\Ket\psi~\Rightarrow~\left(BAB^{-1}\right)\left(B\Ket\psi\right)=BA\Ket\psi=\lambda\left(B\Ket\psi\right).$
Examples:
For two $n$-dimensional unitary matrices $A$ and $B$, write their eigenvalue equations as $A\Ket{\psi_i}=\alpha_i\Ket{\psi_i}$ and $B\Ket{\phi_j}=\beta_j\Ket{\phi_j}$.
$(A\otimes B)\Ket{\psi_i\phi_j} =A\Ket{\psi_i}\otimes B\Ket{\phi_j} =\alpha_i\Ket{\psi_i}\otimes\beta_j\Ket{\phi_j} =\alpha_i\beta_j\Ket{\psi_i\phi_j} .$
Therefore, the $n^2$ eigenvalues of $A\otimes B$ are $\alpha_i\beta_j$ where $i,j=1,2\ldots,n,$ and their corresponding eigenvectors are $\Ket{\psi_i\phi_j}$.
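This product structure of the eigenvalues can be verified numerically. The sketch below draws random unitaries as the Q factor of a complex Gaussian matrix (a standard construction; the helper name is mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    """Random n x n unitary: the Q factor of a complex Gaussian matrix."""
    q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    return q

A, B = random_unitary(2), random_unitary(2)
a, va = np.linalg.eig(A)   # columns of va are the eigenvectors |psi_i>
b, vb = np.linalg.eig(B)   # columns of vb are the eigenvectors |phi_j>

# |psi_i phi_j> = |psi_i> (x) |phi_j> is an eigenvector of A (x) B
# with eigenvalue alpha_i * beta_j.
AB = np.kron(A, B)
for i in range(2):
    for j in range(2):
        v = np.kron(va[:, i], vb[:, j])
        assert np.allclose(AB @ v, a[i] * b[j] * v)
```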
Let us examine some simple two-qubit operators.
$Z$ has 2 eigenvectors: $\Ket0$ with eigenvalue $+1$, and $\Ket1$ with eigenvalue $-1$.
Therefore, there are 4 eigenvectors for $Z\otimes Z$: $\Ket{00}$ and $\Ket{11}$ with eigenvalue $+1$, and $\Ket{01}$ and $\Ket{10}$ with eigenvalue $-1$.
Please note that there are two eigenvalues, $+1$ and $-1$, each with multiplicity 2. That means all vectors of the form $\alpha_1\Ket{00}+\beta_1\Ket{11}$ form a 2-dimensional eigenspace with eigenvalue $+1$, and those of the form $\alpha_2\Ket{01}+\beta_2\Ket{10}$ one with eigenvalue $-1$. (In practice a state needs to be normalised, but not when it is considered as a subspace.)
A controlled-Z operation $\begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&-1 \end{bmatrix}$ can be considered as $\begin{bmatrix}I&0\\0&Z\end{bmatrix}=I\oplus Z.~~$ ($\oplus$ is direct sum.)
For two single-qubit states $\Ket{q_0}=[a,b]^T$ and $\Ket{q_1}=[c,d]^T,~ \Ket{q_0q_1} =(a\Ket0+b\Ket1)(c\Ket0+d\Ket1) =ac\Ket{00}+ad\Ket{01}+bc\Ket{10}+bd\Ket{11} =\begin{bmatrix}ac\\ad\\bc\\bd\end{bmatrix}$,
$cZ\Ket{q_0q_1} =\begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&-1 \end{bmatrix} \begin{bmatrix}ac\\ad\\bc\\bd\end{bmatrix} =\begin{bmatrix}ac\\ad\\bc\\-bd\end{bmatrix} =ac\Ket{00}+ad\Ket{01}+bc\Ket{10}-bd\Ket{11},~$ which is consistent.
Similarly, there are 4 eigenvectors for $cZ=I\oplus Z$: $\Ket{00}$, $\Ket{01}$ and $\Ket{10}$ with eigenvalue $+1$, and $\Ket{11}$ with eigenvalue $-1$.
Again there is degeneracy: eigenvalue $+1$ has multiplicity 3, so vectors of the form $\alpha\Ket{00}+\beta\Ket{01}+\gamma\Ket{10}$ form a 3-dimensional eigenspace.
You may consider $\Ket{q_0}$ as split into two subspaces, $\Ket0$ and $\Ket1$: the first subspace applies $I$ to $\Ket{q_1}$, and the second applies $Z$.
For $\Ket{q_0q_1q_2}, ccZ =\begin{bmatrix} I&0&0&0\\ 0&I&0&0\\ 0&0&I&0\\ 0&0&0&Z \end{bmatrix} =I\oplus I\oplus I\oplus Z,$ corresponding to the four subspaces of $\Ket{q_0q_1}$.
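The direct-sum picture is easy to check numerically; a sketch with a small hand-rolled `direct_sum` helper (the helper is mine, not a NumPy function):

```python
import numpy as np

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

def direct_sum(*blocks):
    """Block-diagonal direct sum of square matrices."""
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    k = 0
    for b in blocks:
        m = b.shape[0]
        out[k:k + m, k:k + m] = b
        k += m
    return out

cZ = direct_sum(I2, Z)              # controlled-Z = I (+) Z
ccZ = direct_sum(I2, I2, I2, Z)     # doubly-controlled Z = I (+) I (+) I (+) Z

# cZ flips the sign of |11> only; ccZ flips the sign of |111> only.
assert np.allclose(cZ, np.diag([1.0, 1, 1, -1]))
assert np.allclose(ccZ, np.diag([1.0, 1, 1, 1, 1, 1, 1, -1]))
```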
For example, $X$ has 2 eigenvectors: $\Ket+$ with eigenvalue $+1$, and $\Ket-$ with eigenvalue $-1$.
Therefore, there are 4 eigenvectors for $X\otimes X$: $\Ket{++}$ and $\Ket{--}$ with eigenvalue $+1$, and $\Ket{+-}$ and $\Ket{-+}$ with eigenvalue $-1$.
Note: There are two eigenvalues, $+1$ and $-1$, each with multiplicity 2, so there are two 2-dimensional eigenspaces.
A controlled-X operation $\begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&0&1\\ 0&0&1&0 \end{bmatrix}$ can be considered as $\begin{bmatrix}I&0\\0&X\end{bmatrix}=I\oplus X.$
$cX\Ket{q_0q_1} =\begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&0&1\\ 0&0&1&0 \end{bmatrix} \begin{bmatrix}ac\\ad\\bc\\bd\end{bmatrix} =\begin{bmatrix}ac\\ad\\bd\\bc\end{bmatrix} =ac\Ket{00}+ad\Ket{01}+bd\Ket{10}+bc\Ket{11},~$ which is consistent.
Similarly, there are 4 eigenvectors for $cX=I\oplus X$: $\Ket{00}$, $\Ket{01}$ and $\Ket{1+}=\Rsr2(\Ket{10}+\Ket{11})$ with eigenvalue $+1$, and $\Ket{1-}=\Rsr2(\Ket{10}-\Ket{11})$ with eigenvalue $-1$.
Please note that the eigenvectors are orthonormal; an orthonormal eigenbasis exists for any unitary operation (like $cX$), since unitary matrices are normal.
Another interesting point concerns the multiplicity of eigenvalue $+1$. Although the linear combination $\alpha\Ket{00}+\beta\Ket{01}+\gamma\Rsr2(\Ket{10}+\Ket{11})$ contains all four basis states, it only spans a 3-dimensional space, because the coefficients of $\Ket{10}$ and $\Ket{11}$ are locked together, removing one degree of freedom. So a more proper representation of the 3-D eigenspace is $\alpha\Ket{00}+\beta\Ket{01}+\gamma\Ket{1+}$.
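A numerical check of the four $cX$ eigenvectors and their orthonormality (NumPy sketch):

```python
import numpy as np

cX = np.array([[1, 0, 0, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 1],
               [0, 0, 1, 0]], dtype=float)

s = 1 / np.sqrt(2)
ket00 = np.array([1.0, 0, 0, 0])
ket01 = np.array([0.0, 1, 0, 0])
ket1p = np.array([0.0, 0, s, s])    # |1+>, eigenvalue +1
ket1m = np.array([0.0, 0, s, -s])   # |1->, eigenvalue -1

for v in (ket00, ket01, ket1p):
    assert np.allclose(cX @ v, v)
assert np.allclose(cX @ ket1m, -ket1m)

# The four eigenvectors form an orthonormal set.
V = np.column_stack([ket00, ket01, ket1p, ket1m])
assert np.allclose(V.T @ V, np.eye(4))
```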
Consider $U\Ket\psi=e^{i\theta}\Ket\psi$, and the circuit
circ.h(0); circ.cU(0,psi);
$(I_2\oplus U)\Ket{+\psi} =\Rsr2\left(\Ket0\Ket\psi+e^{i\theta}\Ket1\Ket\psi\right) =\Rsr2\left(\Ket0+e^{i\theta}\Ket1\right)\otimes\Ket\psi .$
As you see, the phase applied to $\Ket\psi$ is now on $q_0$.
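This phase-kickback effect can be verified numerically; the sketch below uses a diagonal $U$ whose $\Ket0$ eigenvector carries phase $e^{i\theta}$ (the specific $U$ and $\theta$ are assumed for illustration):

```python
import numpy as np

theta = 0.9                                # assumed phase
U = np.diag([np.exp(1j * theta), 1.0])     # |0> is an eigenvector with e^{i theta}
psi = np.array([1.0, 0.0])                 # eigenstate of U

# cU = I (+) U, applied to |+>|psi>.
cU = np.block([[np.eye(2), np.zeros((2, 2))],
               [np.zeros((2, 2)), U]])
plus = np.array([1, 1]) / np.sqrt(2)
state = cU @ np.kron(plus, psi)

# The phase is kicked back onto the control qubit:
# (|0> + e^{i theta}|1>)/sqrt(2) (x) |psi>.
kicked = np.kron(np.array([1, np.exp(1j * theta)]) / np.sqrt(2), psi)
assert np.allclose(state, kicked)
```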
For an operator $R_\phi(\theta)$ that rotates an arbitrary vector about $\Ket\phi$ by $\theta$,
$\boxed{R_\phi(\theta)=\left(\sum_P\Braket{P}_\phi P(\theta)\right)+K_\phi(\theta)}~,$
where $P$ denotes the three Pauli operators $X$, $Y$, and $Z$, $\Braket{P}_\phi:=\Braket{\phi|P|\phi}$, and the K-Delta operator $K_\phi(\theta):=I\cos(\theta/2)\left(1-\sum_P\Braket{P}_\phi\right).$
It is apparent that $K_\phi(\pi)=0,$ so we have $R_\phi(\pi)=\Braket{X}_\phi X+\Braket{Y}_\phi Y+\Braket{Z}_\phi Z.$
Also, if $\theta=0,~R_\phi(0)=\left(\sum_P\Braket{P}_\phi\right)I+I\left(1-\sum_P\Braket{P}_\phi\right)=I.$ This makes sense.
If $\Ket\phi$ is the $+1$ eigenvector of a Pauli operator $P$, then $\Braket{P}_\phi=1$ while the other two expectation values vanish, so $R_\phi(\theta)$ reduces to $P(\theta)$ and $K_\phi(\theta)=I\cos(\theta/2)\left(1-\Braket{P}_\phi\right)=0$.
Let us define $\Ket\xi$ being a state "in between" the three Pauli poles $+X$, $+Y$ and $+Z$. By symmetry, $\Braket{X}_\xi=\Braket{Y}_\xi=\Braket{Z}_\xi=\Rsr3.$
$K_\xi(\theta)=I\cos(\theta/2)\left(1-\sqrt3\right).$ After all, $K$ does have its time of being non-trivial.
As shown in Rotation - Matrix Exponential,
$R_x(\theta) =e^{-i{\theta\over 2}X} =\begin{bmatrix} \cos{\theta\over 2} & -i\sin{\theta\over 2}\\ -i\sin{\theta\over 2} & \cos{\theta\over 2} \end{bmatrix} .~~ R_x(\pi)=-iX .$
$R_y(\theta) =e^{-i{\theta\over 2}Y} =\begin{bmatrix} \cos{\theta\over 2} & -\sin{\theta\over 2}\\ \sin{\theta\over 2} & \cos{\theta\over 2} \end{bmatrix} .~~ R_y(\pi)=-iY .$
$R_z(\theta) =e^{-i{\theta\over 2}Z} =\begin{bmatrix} e^{-i\theta/2} & 0\\ 0 & e^{i\theta/2} \end{bmatrix} .~~ R_z(\pi)=-iZ .$
As shown in Kous Delta Operator:
The Bloch rotation Operator about $\Ket\psi$ is $\boxed{~R_\psi(\theta) =\Braket{X}_\psi R_x(\theta)+\Braket{Y}_\psi R_y(\theta)+\Braket{Z}_\psi R_z(\theta)+\delta_\psi(\theta)~},$ where $\Braket{Q}_\psi$ is the expectation value of measurement on Pauli axis $Q$, and $\delta_\psi(\theta)$ is the Kous Delta Operator. $\boxed{\delta_\psi(\theta):=I\cos\frac{\theta}{2}\left(1-\Braket{X}_\psi-\Braket{Y}_\psi-\Braket{Z}_\psi\right) .}$
$\delta_\psi(\theta)=0$ when $\Ket\psi$ is the $+1$ eigenstate of $X$, $Y$ or $Z$ (i.e. $\Ket+$, $\Rsr2(\Ket0+i\Ket1)$ or $\Ket0$), or when $\theta=\pi.$
Let us define $B_\psi:=\Braket{X}_\psi X+\Braket{Y}_\psi Y+\Braket{Z}_\psi Z,$ so that $R_\psi(\pi)=\Braket{X}_\psi R_x(\pi)+\Braket{Y}_\psi R_y(\pi)+\Braket{Z}_\psi R_z(\pi)=-iB_\psi.~~~~ B_\psi^2=I,$ since $\Braket{X}_\psi^2+\Braket{Y}_\psi^2+\Braket{Z}_\psi^2=1$ (the Bloch vector is a unit vector) and distinct Pauli matrices anticommute.
Given:
$R_\psi(\theta) =e^{-i\frac{\theta}{2}B_\psi} =I\cos\frac{\theta}{2}-iB_\psi\sin\frac{\theta}{2} .~~~~ \left(\text{If }A^2=I,~~e^{i\alpha A}=\cos(\alpha)~I+i\sin(\alpha)~A .\right)$
Ref: "Rotation: Bloch Sphere" and http://www.vcpc.univie.ac.at/~ian/hotlist/qc/talks/bloch-sphere-rotations.pdf, in which $B$ is represented by $\HAT n\cdot\vec{\,\sigma~}.$
$R_x(\theta) =e^{-i\frac{\theta}{2}X} =I\cos\frac{\theta}{2}-iX\sin\frac{\theta}{2} ,$
$R_y(\theta) =e^{-i\frac{\theta}{2}Y} =I\cos\frac{\theta}{2}-iY\sin\frac{\theta}{2} ,$
$R_z(\theta) =e^{-i\frac{\theta}{2}Z} =I\cos\frac{\theta}{2}-iZ\sin\frac{\theta}{2} .$
$\mathrm{RHS} =\left(\Braket{X}_\psi I\cos\frac{\theta}{2}-i\Braket{X}_\psi X\sin\frac{\theta}{2}\right) +\left(\Braket{Y}_\psi I\cos\frac{\theta}{2}-i\Braket{Y}_\psi Y\sin\frac{\theta}{2}\right) +\left(\Braket{Z}_\psi I\cos\frac{\theta}{2}-i\Braket{Z}_\psi Z\sin\frac{\theta}{2}\right) +\delta_\psi(\theta) .$
$=\Big(\big(\Braket{X}_\psi+\Braket{Y}_\psi+\Braket{Z}_\psi\big)~I\cos\frac{\theta}{2}+\delta_\psi(\theta)\Big) -i\Big(\Braket{X}_\psi X+\Braket{Y}_\psi Y+\Braket{Z}_\psi Z\Big)\sin\frac{\theta}{2} =I\cos\frac{\theta}{2}-iB_\psi\sin\frac{\theta}{2}=\mathrm{LHS~~~~QED}.$
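The whole identity, including the Kous Delta term, can also be checked numerically for an arbitrary axis state (NumPy sketch; the angles are assumed values):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

th_axis, ph_axis = 0.8, 2.1          # Bloch angles of the axis state (assumed)
psi = np.array([np.cos(th_axis / 2),
                np.exp(1j * ph_axis) * np.sin(th_axis / 2)])
eX, eY, eZ = [(psi.conj() @ P @ psi).real for P in (X, Y, Z)]

def R(P, t):
    """Axis rotation e^{-i t P / 2} = I cos(t/2) - i P sin(t/2)."""
    return I2 * np.cos(t / 2) - 1j * P * np.sin(t / 2)

t = 1.3                              # rotation angle (assumed)
B = eX * X + eY * Y + eZ * Z
lhs = I2 * np.cos(t / 2) - 1j * B * np.sin(t / 2)          # R_psi(t)
delta = I2 * np.cos(t / 2) * (1 - eX - eY - eZ)            # Kous Delta Operator
rhs = eX * R(X, t) + eY * R(Y, t) + eZ * R(Z, t) + delta
assert np.allclose(lhs, rhs)
assert np.allclose(B @ B, I2)        # B_psi squared is the identity
```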