
$\require{cancel} \newcommand{\Ket}[1]{\left|{#1}\right\rangle} \newcommand{\Bra}[1]{\left\langle{#1}\right|} \newcommand{\Braket}[1]{\left\langle{#1}\right\rangle} \newcommand{\Rsr}[1]{\frac{1}{\sqrt{#1}}} \newcommand{\RSR}[1]{1/\sqrt{#1}} \newcommand{\Verti}{\rvert} \newcommand{\HAT}[1]{\hat{\,#1~}} \DeclareMathOperator{\Tr}{Tr}$

Space Mapping

First created in September 2018

Mapping between two spaces

$\left\{\Ket{v_1},\Ket{v_2},\ldots,\Ket{v_m}\right\}$ is a basis of $V$, $\left\{\Ket{w_1},\Ket{w_2},\ldots,\Ket{w_n}\right\}$ is a basis of $W$, and $A:V\mapsto W$ is a linear mapping.

As the mapping is linear, if we define for each basis vector $\Ket{v_j}\in V$ the value of $A\Ket{v_j}=\Ket{\psi_j}\in W$, the mapping $A$ is well defined.

Each $\Ket{\psi_j}$ can be written as a linear combination of the basis vectors of $W$: $\Ket{\psi_j}=\sum_{i=1}^n a_{ij}\Ket{w_i}$. We therefore have $\boxed{A\Ket{v_j}=\sum_{i=1}^na_{ij}\Ket{w_i}.}$


So far we have not mentioned the word "matrix", but this mapping can be represented conveniently by a matrix. If $\left\{\Ket{v_j}\right\}$ represents the standard basis,

the mapping can be expressed as $\displaystyle A\Ket{v_j} =\begin{bmatrix}\\\Ket{w_1}&\Ket{w_2}&\ldots&\Ket{w_n}\\\\\end{bmatrix} ~~ \begin{bmatrix} a_{11}&a_{12}&\ldots&a_{1m}\\ a_{21}&a_{22}&\ldots&a_{2m}\\ \vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&\ldots&a_{nm}\\ \end{bmatrix} ~~ \Ket{v_j} .$

Here $\Ket{v_j}$ is a standard basis vector (i.e. $[0\,0\,\ldots\,1\,\ldots\,0]^T$ with the $1$ in position $j$). So $\Ket{v_j}$ "picks out" the $j^\mathrm{th}$ column of the $[a_{ij}]$ matrix, which produces the linear combination $\sum_{i=1}^na_{ij}\Ket{w_i}$.


From the above, one may see that $A =\begin{bmatrix}\\\Ket{w_1}&\Ket{w_2}&\ldots&\Ket{w_n}\\\\\end{bmatrix} ~~ \begin{bmatrix} a_{11}&a_{12}&\ldots&a_{1m}\\ a_{21}&a_{22}&\ldots&a_{2m}\\ \vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&\ldots&a_{nm}\\ \end{bmatrix}$

$=\Big[\sum_{i=1}^na_{i1}\Ket{w_i}~~~\sum_{i=1}^na_{i2}\Ket{w_i}~~~\ldots~~~\sum_{i=1}^na_{im}\Ket{w_i}\Big] =\Big[\Ket{\psi_1}~~~\Ket{\psi_2}~~~\ldots~~~\Ket{\psi_m}\Big] .$

This means $A$ is the collection of the $\Ket{\psi_j}$, the images of the basis vectors of $V$ in $W$, placed side by side as columns.
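As a quick numerical illustration (a minimal NumPy sketch, not part of the original notebook), $A$ can be built column by column from the images of the standard basis vectors, and multiplying by a standard basis vector indeed picks out the corresponding column:

import numpy as np

# Images of the standard basis vectors |v_1>, |v_2> of a 2-dimensional V
# under some linear map A : V -> W (here W is also 2-dimensional).
psi_1 = np.array([0, 1])              # A|v_1>
psi_2 = np.array([1, 0])              # A|v_2>

# A is simply the images placed side by side as columns.
A = np.column_stack([psi_1, psi_2])

# A standard basis vector "picks out" the corresponding column.
v_2 = np.array([0, 1])                # |v_2> in the standard basis
print(A @ v_2)                        # -> [1 0], i.e. A|v_2> = psi_2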

If the $\Ket{w_i}$ are standard basis vectors as well, then the first matrix becomes the identity matrix and $A=[a_{ij}]$.

If the $\Ket{w_i}$ are non-standard, $\boxed{A=\Big[\Ket{\psi_1}~~~\Ket{\psi_2}~~~\ldots~~~\Ket{\psi_m}\Big]}$ still holds, but the "coefficient operator" is $[a_{ij}]$ alone: $[a_{ij}]\Ket v$ is the column vector of coefficients of $A\Ket v$ with respect to the basis $\left\{\Ket{w_i}\right\}$.

In short, the "coefficient operator" with respect to $\left\{\Ket{w_i}\right\}$ is $[a_{ij}]=\left[\Ket{w_i}\right]^{-1}A$, where $\left[\Ket{w_i}\right]$ is the matrix whose columns are the $\Ket{w_i}$. When $\left\{\Ket{w_i}\right\}$ is the standard basis, $\left[\Ket{w_i}\right]^{-1}=I$ and $A=[a_{ij}]$.


Example 1: If $A$ maps $\Ket0$ to $\Ket1$ and $\Ket1$ to $\Ket0$, the image of $\left\{\Ket0,\Ket1\right\}$ is $\left\{\Ket1,\Ket0\right\}$. $A=\Big[\Ket1~~~\Ket0\Big]=\begin{bmatrix}0&1\\1&0\end{bmatrix}=X.$

Example 2: With the same $A$ as in Example 1, if $\Ket{w_1}=\Ket+$ and $\Ket{w_2}=\Ket-,~ \left[\Ket{w_i}\right]=H,~~ A=\left[\Ket{w_i}\right][a_{ij}] =\begin{bmatrix}\Rsr2&\Rsr2\\\Rsr2&-\Rsr2\end{bmatrix} \begin{bmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{bmatrix} =\begin{bmatrix}0&1\\1&0\end{bmatrix} .$

As $[a_{ij}]=\left[\Ket{w_i}\right]^{-1}A=H^{-1}X,~~ \begin{bmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{bmatrix} =\begin{bmatrix}\Rsr2&\Rsr2\\-\Rsr2&\Rsr2\end{bmatrix} =HX ,~~$ which (reading right to left) maps $\left\{\Ket0,\Ket1\right\}$ to $\left\{\Ket1,\Ket0\right\}$ by $X$ and then to $\left\{\Ket-,\Ket+\right\}$ by $H$. Note: $H^{-1}=H.$
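Both examples can be checked numerically. A minimal NumPy sketch, using the standard matrix representations of $X$ and $H$:

import numpy as np

X = np.array([[0, 1],
              [1, 0]])
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)      # columns are |+> and |->

# Example 2: coefficient matrix [a_ij] with respect to {|+>, |->}
a = np.linalg.inv(H) @ X                  # equals H @ X, since H^{-1} = H
print(np.allclose(a, H @ X))              # True
print(np.allclose(H @ a, X))              # True: A = [|w_i>] [a_ij] recovers X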


Let $\left\{\Ket{v_i}\right\},~\left\{\Ket{w_j}\right\}\text{ and }\left\{\Ket{x_k}\right\}$ be bases of spaces $V, W$ and $X$ of dimensions $m, n$ and $p$, with linear mappings $A:V\mapsto W$ and $B:W\mapsto X$.

If $A\Ket v=\Ket w$ and $B\Ket w=\Ket x$, then $(BA)\Ket v=\Ket x.$


Proof: It suffices to check a basis vector $\Ket{v_i}$; the result extends to any $\Ket v$ by linearity. Here $A$ is $n\times m$ and $B$ is $p\times n$.

$A\Ket v=\Ket w$ means $\displaystyle A\Ket{v_i} =\begin{bmatrix}\\\Ket{w_1}&\Ket{w_2}&\ldots&\Ket{w_n}\\\\\end{bmatrix} ~~ \begin{bmatrix} a_{11}&a_{12}&\ldots&a_{1m}\\ a_{21}&a_{22}&\ldots&a_{2m}\\ \vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&\ldots&a_{nm}\\ \end{bmatrix} ~~ \Ket{v_i} .$

$B\Ket w=\Ket x$ means $\displaystyle B\Ket{w_j} =\begin{bmatrix}\\\Ket{x_1}&\Ket{x_2}&\ldots&\Ket{x_p}\\\\\end{bmatrix} ~~ \begin{bmatrix} b_{11}&b_{12}&\ldots&b_{1n}\\ b_{21}&b_{22}&\ldots&b_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ b_{p1}&b_{p2}&\ldots&b_{pn}\\ \end{bmatrix} ~~ \Ket{w_j} .$

$\ldots$


WORKING BEGIN


$(BA)\Ket v=\Ket x$ means $\displaystyle (BA)\Ket{v_i} =\begin{bmatrix}\\\Ket{x_1}&\Ket{x_2}&\ldots&\Ket{x_p}\\\\\end{bmatrix} ~~ \begin{bmatrix} b_{11}&b_{12}&\ldots&b_{1n}\\ b_{21}&b_{22}&\ldots&b_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ b_{p1}&b_{p2}&\ldots&b_{pn}\\ \end{bmatrix} ~~ \begin{bmatrix} a_{11}&a_{12}&\ldots&a_{1m}\\ a_{21}&a_{22}&\ldots&a_{2m}\\ \vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&\ldots&a_{nm}\\ \end{bmatrix} ~~ \Ket{v_i} .$

$B\Ket w=B(A\Ket v)$: since $[a_{ij}]\Ket{v_i}$ is the coefficient column of $A\Ket{v_i}$ with respect to $\left\{\Ket{w_j}\right\}$, and $[b_{kj}]$ sends those coefficients to coefficients with respect to $\left\{\Ket{x_k}\right\}$, $\displaystyle B(A\Ket{v_i}) =\begin{bmatrix}\\\Ket{x_1}&\Ket{x_2}&\ldots&\Ket{x_p}\\\\\end{bmatrix} ~~ \begin{bmatrix} b_{11}&b_{12}&\ldots&b_{1n}\\ b_{21}&b_{22}&\ldots&b_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ b_{p1}&b_{p2}&\ldots&b_{pn}\\ \end{bmatrix} ~~ \begin{bmatrix} a_{11}&a_{12}&\ldots&a_{1m}\\ a_{21}&a_{22}&\ldots&a_{2m}\\ \vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&\ldots&a_{nm}\\ \end{bmatrix} ~~ \Ket{v_i} ,$ which matches the expression for $(BA)\Ket{v_i}$ above.

Element-wise derivation: again consider a basis vector $\Ket{v_i}$; the result generalises to $\Ket v$ by linearity. $A$ is $n\times m$ and $B$ is $p\times n$. (Note that the indexing and notation of $A$ and $B$ here are not necessarily the same as above.)

$A\Ket{v_i}=\sum_{j=1}^na_{ji}\Ket{w_j}$ and $B\Ket{w_j}=\sum_{k=1}^pb_{kj}\Ket{x_k}$, with $i\in[1,m],~j\in[1,n],~k\in[1,p]$.

Let $p\times m$ matrix $C=BA$, so $c_{ki}=\sum_{j=1}^nb_{kj}a_{ji}.$

$(BA)\Ket{v_i} =B\left(\sum_{j=1}^na_{ji}\Ket{w_j}\right) =\sum_{j=1}^na_{ji}B\Ket{w_j} =\sum_{j=1}^na_{ji}\left(\sum_{k=1}^pb_{kj}\Ket{x_k}\right) =\sum_{k=1}^p\left(\sum_{j=1}^nb_{kj}a_{ji}\right)\Ket{x_k} =\sum_{k=1}^pc_{ki}\Ket{x_k} .$

$\displaystyle C\Ket{v_i} =\begin{bmatrix}\\\Ket{x_1}&\Ket{x_2}&\ldots&\Ket{x_p}\\\\\end{bmatrix} ~~ \begin{bmatrix} c_{11}&c_{12}&\ldots&c_{1m}\\ c_{21}&c_{22}&\ldots&c_{2m}\\ \vdots&\vdots&\ddots&\vdots\\ c_{p1}&c_{p2}&\ldots&c_{pm}\\ \end{bmatrix} ~~ \Ket{v_i} ,$ so $(BA)\Ket{v_i}=C\Ket{v_i}$ for every basis vector, and hence $(BA)\Ket v=\Ket x$ whenever $A\Ket v=\Ket w$ and $B\Ket w=\Ket x$.


WORKING END
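The composition rule proved above can also be sanity-checked numerically. A minimal NumPy sketch with random coefficient matrices and randomly chosen (almost surely invertible) bases for $W$ and $X$:

import numpy as np

rng = np.random.default_rng(0)
m, n, p = 3, 4, 2

a = rng.normal(size=(n, m))           # A|v_j> = sum_i a_ij |w_i>
b = rng.normal(size=(p, n))           # B|w_j> = sum_k b_kj |x_k>
Wmat = rng.normal(size=(n, n))        # columns: a basis {|w_i>} of W
Xmat = rng.normal(size=(p, p))        # columns: a basis {|x_k>} of X

# Matrices of A and B acting on standard-coordinate vectors
A = Wmat @ a
B = Xmat @ b @ np.linalg.inv(Wmat)

# The coefficient matrix of BA (w.r.t. {|v_j>} and {|x_k>}) is b @ a
c = b @ a
print(np.allclose(B @ A, Xmat @ c))   # True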


Completeness Relation

For any orthonormal basis $\left\{\Ket i\right\}$, $\sum_i\Ket i\Bra i=I$. E.g. for $\left\{\Ket+,\Ket-\right\}$: $\Ket+\Bra++\Ket-\Bra- ={1\over2}\left(\Ket0\Bra0+\Ket0\Bra1+\Ket1\Bra0+\Ket1\Bra1\right) +{1\over2}\left(\Ket0\Bra0-\Ket0\Bra1-\Ket1\Bra0+\Ket1\Bra1\right) =\Ket0\Bra0+\Ket1\Bra1 =I .$

Proof: For any vector $\Ket v=\sum_j c_j\Ket j,~~ \left(\sum_i\Ket i\Bra i\right)\Ket v =\left(\sum_i\Ket i\Bra i\right)\sum_j c_j\Ket j =\sum_i\sum_j c_j\Ket i\Braket{i|j} =\sum_i c_i\Ket i+\sum_i\Ket i\sum_{j\ne i}c_j\Braket{i|j} =\sum_j c_j\Ket j+0 =\Ket v ,$ using orthonormality $\Braket{i|j}=\delta_{ij}$.
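A quick numerical check of the completeness relation for the $\left\{\Ket+,\Ket-\right\}$ basis (NumPy sketch):

import numpy as np

plus  = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)

# |+><+| + |-><-| should be the 2x2 identity
completeness = np.outer(plus, plus.conj()) + np.outer(minus, minus.conj())
print(np.allclose(completeness, np.eye(2)))   # True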


Find the eigenvalues of $H$:

$\left|H-\lambda I\right| =\left(\left(\Rsr2-\lambda\right)\left(-\Rsr2-\lambda\right)-{1\over2}\right) =\left(-{1\over2}+\lambda^2-{1\over2}\right)=0.~~\therefore~\lambda=\pm 1 .$

$H\Ket v=\pm\Ket v,~~ H[v_0~~v_1]^T=\pm[v_0~~v_1]^T,~~ \Rsr2(v_0+v_1)=\pm v_0~\text{and}~\Rsr2(v_0-v_1)=\pm v_1 .$

Case $+$: $v_1=\left(\sqrt2-1\right)v_0~\text{and}~v_0=\left(\sqrt2+1\right)v_1$ (the two equations are linearly dependent).

Normalise: $v_0^2+v_1^2=v_0^2(1+2-2\sqrt2+1)=1,~ \Ket v=\left[{1\over\sqrt{4-2\sqrt2}}~~{1\over\sqrt{4+2\sqrt2}}\right]^T =\cos\pi/8\Ket0+\sin\pi/8\Ket1 =\mathrm{Bloch}(\pi/4,0) .$

Case $-$: $v_1=\left(-\sqrt2-1\right)v_0~\text{and}~v_0=\left(-\sqrt2+1\right)v_1$ (again linearly dependent, now with $v_0v_1<0$).

Normalise: $v_0^2+v_1^2=v_0^2(1+2+2\sqrt2+1)=1,~ \Ket v=\left[{1\over\sqrt{4+2\sqrt2}}~~{-1\over\sqrt{4-2\sqrt2}}\right]^T =\cos 3\pi/8\Ket0-\sin 3\pi/8\Ket1 =\mathrm{Bloch}(3\pi/4,\pi) .$
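The eigenvalues and eigenvectors found above can be verified numerically (NumPy sketch; numpy.linalg.eigh is used since $H$ is Hermitian, and eigenvector signs may differ by an overall phase):

import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

vals, vecs = np.linalg.eigh(H)        # eigenvalues in ascending order
print(vals)                           # [-1.  1.]

# Closed-form eigenvectors from the working above
v_plus  = np.array([np.cos(np.pi/8),   np.sin(np.pi/8)])     # eigenvalue +1
v_minus = np.array([np.cos(3*np.pi/8), -np.sin(3*np.pi/8)])  # eigenvalue -1
print(np.allclose(np.abs(vecs[:, 1]), np.abs(v_plus)))       # True
print(np.allclose(np.abs(vecs[:, 0]), np.abs(v_minus)))      # True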

Code Section

In [ ]:
# Initialisation

import sys
sys.path.append('../')
# qtol is the site's own helper module; it presumably re-exports the Qiskit
# names used below (QuantumRegister, ClassicalRegister, QuantumCircuit, ...).
from qtol import *
In [ ]:
# Iteration

# Number of qubits
qbNum = 1

# Define the Quantum and Classical Registers
q = QuantumRegister(qbNum)
c = ClassicalRegister(qbNum)

qc = QuantumCircuit(q, c)

# Preparation: identity gate on the qubit register
# (qc.iden is the API of the Qiskit release used here; newer releases call it qc.id)
qc.iden(q)

# Circuit building
# ...

# Finalisation
# ...

show_me(qc, q, c, show_latex=True, show_bloch_vector=True, show_histogram=True)

circuit_drawer(qc)

Kept

$\Ket v=\sum_{j=1}^m a_j\Ket{v_j}~,~~~ \Ket w=\sum_{i=1}^n b_i\Ket{w_i}~,~~~ A\left(\sum_{j=1}^m a_j\Ket{v_j}\right) =\sum_{j=1}^m a_jA\Ket{v_j} .$

 
