
These are the solutions for Chapter 2: Introduction to Quantum Mechanics.

I decided not to solve problems 2.1–2.50, since they are mostly linear algebra, but I might come back to them later. Problems 2.72–2.81 are missing as well, at least for now.


2.51

We have to verify that $H^{\dagger}H = \mathbb{I}$.

We know that $H = \frac{1}{\sqrt 2}\left(\begin{smallmatrix}1 & 1\newline 1 & -1\end{smallmatrix}\right)$, which is real and symmetric, so that $H^\dagger = \frac{1}{\sqrt 2}\left(\begin{smallmatrix}1 & 1\newline 1 & -1\end{smallmatrix}\right)^\dagger = H$.

Then $$\begin{align*} H^\dagger H = H^2 &= \frac{1}{2}\begin{pmatrix} 1&1\newline 1&-1 \end{pmatrix} \begin{pmatrix} 1&1\newline 1&-1\end{pmatrix}\newline &= \frac{1}{2}\begin{pmatrix} 2 & 0\newline 0 & 2\end{pmatrix} = \Bbb{I}. \end{align*}$$
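As a quick numerical sanity check, we can verify $H^\dagger = H$ and $H^\dagger H = \Bbb I$ directly. A minimal sketch in plain Python (no external libraries), with hand-rolled 2×2 matrix helpers:

```python
# Numerical check of Problem 2.51: the Hadamard matrix satisfies H† = H and H†H = I.
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

def dagger(M):
    # Conjugate transpose of a 2x2 matrix.
    return [[complex(M[j][i]).conjugate() for j in range(2)] for i in range(2)]

def matmul(A, B):
    # Product of two 2x2 matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

HdH = matmul(dagger(H), H)  # ≈ identity matrix, up to floating-point error
```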

2.52

Thanks to the previous exercise, we know that $H^\dagger H = \Bbb{I}$, and since $H^\dagger H = H^2$, then $H^2 = \Bbb{I}$.

2.53

We solve $$ \begin{align*} \det(H-\lambda\Bbb I) &= \det \begin{pmatrix} \frac{1}{\sqrt 2}-\lambda & \frac{1}{\sqrt 2}\newline \frac{1}{\sqrt 2}&-\frac{1}{\sqrt 2}-\lambda \end{pmatrix} = 0\newline \Rightarrow \lambda &= \pm 1. \end{align*} $$

For $\lambda_+ = 1$ and $\ket\psi_+ = \left(\begin{smallmatrix}\alpha\newline\beta\end{smallmatrix}\right)$ we have the eigenvector condition $H\ket\psi_+ = \lambda_+\ket\psi_+ $. We get $$ \begin{matrix} \Rightarrow & \frac{1}{\sqrt 2}\alpha + \frac{1}{\sqrt 2}\beta = \alpha & \text{and} & \frac{1}{\sqrt 2}\alpha - \frac{1}{\sqrt 2}\beta = \beta. \end{matrix} $$ Both equations reduce to $\beta = (\sqrt 2 - 1)\alpha$, so $\ket\psi_+ = \begin{pmatrix}1\newline \sqrt2 -1\end{pmatrix}$ before normalisation.

For $\lambda_{-} = -1$ we can do an analogous analysis yielding $\ket\psi_{-} = \begin{pmatrix}1\newline -\sqrt2 -1\end{pmatrix}$ before normalisation.
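Both eigenpairs can be checked numerically; a plain-Python sketch:

```python
# Check Problem 2.53: (1, √2−1) and (1, −√2−1) are eigenvectors of H
# with eigenvalues +1 and −1 respectively (before normalisation).
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

def apply(M, v):
    # Apply a 2x2 matrix to a 2-vector.
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

v_plus = [1.0, math.sqrt(2) - 1]    # unnormalised eigenvector for λ = +1
v_minus = [1.0, -math.sqrt(2) - 1]  # unnormalised eigenvector for λ = −1
```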

2.54

Suppose $A$ and $B$ are both Hermitian and they commute. Since they are commuting Hermitian operators, they are simultaneously diagonalisable: $$\begin{matrix} A = \sum_i a_i \ket i\bra i & \text{and} & B = \sum_i b_i \ket i\bra i, \end{matrix} $$ where {$\ket i$} is their common eigenbasis. We note that the sum $A + B = \sum_i (a_i + b_i)\ket i\bra i$ is Hermitian too (and thus diagonalisable).

Exponentiating $A$ and $B$ we have $$\Rightarrow\begin{matrix} \exp A = \sum_i e^{a_i} \ket i\bra i & \text{and} & \exp B = \sum_i e^{b_i} \ket i\bra i, \end{matrix} $$ so that $$\begin{align*} \exp A\exp B &= \sum_{ij} e^{a_i}e^{b_j}\ket i\braket{i\vert j}\bra j\hspace{0.5cm} (\braket{i\vert j} = \delta_{ij})\newline &= \sum_i e^{a_i+b_i}\ket i\bra i, \end{align*} $$ which is just $\exp(A+B)$. We can verify that $\exp B\exp A$ gives the same result.
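In the common eigenbasis everything is diagonal, so the identity reduces to $e^{a_i}e^{b_i} = e^{a_i+b_i}$ eigenvalue by eigenvalue. A small numerical sketch with illustrative (made-up) eigenvalues:

```python
# Check exp(A)exp(B) = exp(A+B) for simultaneously diagonalised A and B
# (Problem 2.54): it suffices to compare the eigenvalues e^{a_i} e^{b_i}
# and e^{a_i + b_i}. The values a_i, b_i below are illustrative.
import math

a = [2.0, -1.0, 0.5]  # eigenvalues a_i of A
b = [0.3, 4.0, -2.0]  # eigenvalues b_i of B (same eigenbasis)

lhs = [math.exp(ai) * math.exp(bi) for ai, bi in zip(a, b)]  # exp(A) exp(B)
rhs = [math.exp(ai + bi) for ai, bi in zip(a, b)]            # exp(A + B)
```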

2.55

We know that $$ U(t_1,t_2) \equiv \exp\left[\frac{-i\cal{H}(t_2-t_1)}{\hbar}\right], $$ where the Hamiltonian $\cal H$ is a Hermitian operator. Using its spectral decomposition it has the form $$ U(t_1,t_2) = \sum_E \exp\left[\frac{-iE(t_2-t_1)}{\hbar}\right]\ket E\bra E, $$ so that one of the unitary conditions becomes $$ \begin{align*} U(t_1,t_2)^\dagger U(t_1,t_2) &= \left(\sum_E \exp\left[\frac{-iE(t_2-t_1)}{\hbar}\right]\ket E\bra E\right)^\dagger\left(\sum_{E'} \exp\left[\frac{-iE'(t_2-t_1)}{\hbar}\right]\ket{E'}\bra {E'}\right)\newline &= \sum_{E,E'}\exp\left(\frac{iE(t_2-t_1)}{\hbar}\right)\exp\left(\frac{-iE'(t_2-t_1)}{\hbar}\right)\ket{E}\braket {E\vert E'}\bra {E'}\newline &= \sum_{E} \exp\left[\frac{iE(t_2-t_1)}{\hbar}+\frac{-iE(t_2-t_1)}{\hbar}\right]\ket E\bra E\newline &= \sum_E\ket E \bra E = \Bbb I. \end{align*} $$ One can follow exactly the same procedure to show that $U(t_1,t_2)U(t_1,t_2)^\dagger = \Bbb I$. Therefore $U(t_1,t_2)$ is unitary.
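In the energy eigenbasis $U(t_1,t_2)$ is diagonal with entries $e^{-iE(t_2-t_1)/\hbar}$, so unitarity amounts to each diagonal phase having modulus $1$. A sketch (with $\hbar = 1$ and illustrative energies):

```python
# Check U†U = I for U = Σ_E exp(−iE Δt/ħ) |E⟩⟨E| (Problem 2.55). In the energy
# eigenbasis U is diagonal, so it is enough to check each diagonal entry of U†U
# equals 1. Units chosen so ħ = 1; the energy values are illustrative.
import cmath

energies = [0.0, 1.0, 2.5, -0.7]  # illustrative eigenvalues E of the Hamiltonian
dt = 0.37                         # t2 − t1

u = [cmath.exp(-1j * E * dt) for E in energies]  # diagonal entries of U
udag_u = [x.conjugate() * x for x in u]          # diagonal entries of U†U
```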

2.56

Since unitary $U$ satisfies $UU^\dagger = \Bbb I$, it is a normal operator and has a spectral decomposition $U = \sum_j u_j\ket j\bra j$, where $u_j$ are its corresponding eigenvalues. We also know that because $U$ is unitary, its eigenvalues $u_j$ have the form $e^{i\theta_j}$ for values $\theta_j \in [0,2\pi)$. This means that $$ \begin{align*} -i\log U &= -i\sum_j\log(u_j)\ket j\bra j\newline &= -i\sum_j\log(e^{i\theta_j})\ket j\bra j \newline &= \sum_j\theta_j\ket j\bra j. \end{align*} $$ Since $\theta_j\in\Bbb R$, the operator $K$ defined by $K \coloneqq \sum_j\theta_j\ket j\bra j$ is diagonal in the {$\ket j$} basis with real entries, so $K = K^\dagger$, which proves Hermiticity.
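Again everything is diagonal in the eigenbasis of $U$, so $K = -i\log U$ just maps each phase $e^{i\theta_j}$ to the real number $\theta_j$. A sketch with illustrative phases, kept inside $(-\pi,\pi)$ so the principal branch of the complex logarithm recovers $\theta_j$ itself:

```python
# Check that K = −i log U has real eigenvalues, hence is Hermitian
# (Problem 2.56). Diagonal sketch: each eigenvalue e^{iθ} of U should map to
# the real number θ. The phases are illustrative and chosen inside (−π, π)
# so that cmath.log's principal branch returns iθ directly.
import cmath

thetas = [0.3, 1.7, -2.0]
u = [cmath.exp(1j * t) for t in thetas]  # eigenvalues of a unitary U
k = [-1j * cmath.log(x) for x in u]      # eigenvalues of K = −i log U
```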

2.57

We have two sets of measurement operators {$M_m$} and {$L_l$}. Suppose we have a system in an initial state $\ket\psi$ and we apply the second set of measurements, measuring the value $l$. This means that the state of the system is now $$\ket{\psi_l} = \frac{L_l\ket\psi}{\sqrt{\bra\psi L_l^\dagger L_l\ket\psi}}. $$ If we apply the first set of measurements now and measure $m$, the state of the system becomes

$$ \begin{align*} \ket{\psi_{m|l}} &= \frac{M_m}{\sqrt{\bra{\psi_l} M_m^\dagger M_m\ket{\psi_l}}}\ket{\psi_l}\newline &= \frac{M_m}{\sqrt{\bra{\psi_l} M_m^\dagger M_m\ket{\psi_l}}}\frac{L_l\ket{\psi}}{\sqrt{\bra\psi L_l^\dagger L_l\ket\psi}} \newline &=\frac{M_m L_l}{\sqrt{\bra{\psi} L_l^\dagger M_m^\dagger M_m L_l\ket{\psi}}}\ket\psi. \end{align*} $$ We can now define the measurement operators {$N_n$} $\equiv$ {$M_m L_l$}, such that each outcome $n$ corresponds to a unique pair $(m,l)$. Then, if we apply this new set of measurements to the initial state $\ket\psi$, the system ends up in the state $$ \begin{align*} \ket{\psi_n} &= \frac{M_m L_l}{\sqrt{\bra{\psi} L_l^\dagger M_m^\dagger M_m L_l\ket{\psi}}}\ket\psi\newline &= \ket{\psi_{m|l}}. \end{align*} $$

We can verify that {$N_n$} indeed constitutes a measurement set: $$ \begin{align*} \sum_n N_n^\dagger N_n &= \sum_{m,l} L_l^\dagger M_m^\dagger M_m L_l\newline &= \sum_l L_l^\dagger \left(\sum_m M_m^\dagger M_m\right)L_l\newline &= \sum_l L_l^\dagger L_l = \Bbb I. \end{align*} $$
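The completeness relation can be checked for a concrete pair of measurements; the choice below (computational-basis projectors for {$M_m$} and $\ket\pm$ projectors for {$L_l$}) is purely illustrative:

```python
# Check Σ_n N_n†N_n = I for N_n = M_m L_l (Problem 2.57), using two
# illustrative projective measurements on a qubit:
#   M: computational-basis projectors |0><0|, |1><1|
#   L: Hadamard-basis projectors |+><+|, |-><-|
M = [[[1, 0], [0, 0]],
     [[0, 0], [0, 1]]]
L = [[[0.5, 0.5], [0.5, 0.5]],
     [[0.5, -0.5], [-0.5, 0.5]]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(A):
    return [[complex(A[j][i]).conjugate() for j in range(2)] for i in range(2)]

N = [matmul(m, l) for m in M for l in L]  # the composed operators N_n

total = [[0, 0], [0, 0]]                  # accumulate Σ_n N_n† N_n
for n in N:
    p = matmul(dagger(n), n)
    for i in range(2):
        for j in range(2):
            total[i][j] += p[i][j]
```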

2.58

Since the system is already in an eigenstate of the observable $M$, $M\ket\psi = m\ket\psi$. Then $$ \begin{align*} \langle M \rangle &= \bra\psi M \ket\psi = m\newline \langle M^2 \rangle &= \bra\psi M^2 \ket\psi = m\bra\psi M \ket\psi = m^2, \end{align*} $$ so that $\Delta(M) = \sqrt{\langle M^2 \rangle - \langle M \rangle^2} = \sqrt{m^2-m^2} = 0$.
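A concrete numerical instance, with the illustrative choice $M = Z$, $\ket\psi = \ket 0$, $m = 1$:

```python
# Check Problem 2.58: measuring an observable on one of its eigenstates gives
# expectation m and standard deviation 0. Illustrative choice: M = Z, ψ = |0⟩,
# eigenvalue m = 1.
import math

Z = [[1, 0], [0, -1]]
psi = [1.0, 0.0]

def apply(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def inner(u, v):
    return sum(complex(ui).conjugate() * vi for ui, vi in zip(u, v))

exp_M = inner(psi, apply(Z, psi))             # ⟨M⟩ = m
exp_M2 = inner(psi, apply(Z, apply(Z, psi)))  # ⟨M²⟩ = m²
delta = math.sqrt((exp_M2 - exp_M**2).real)   # Δ(M) = 0
```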

2.59

We know that $X\ket x = \ket{x\oplus 1}$, where $\oplus$ denotes addition modulo 2. Thus:

  1. $\langle X\rangle = \bra{0}X\ket{0} = \braket{0\vert 1} = 0$,
  2. $\langle X^2\rangle = \bra{0}X^2\ket{0} = \bra{0}X\ket{1} = \braket{0\vert 0} = 1$,
  3. $\Delta(X) = \sqrt{\langle X^2\rangle-\langle X\rangle^2} = 1$.
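The three items above can be verified numerically with the Pauli $X$ matrix and $\ket 0 = (1,0)^T$:

```python
# Check Problem 2.59: for the observable X in the state |0⟩,
# ⟨X⟩ = 0, ⟨X²⟩ = 1, and Δ(X) = 1.
import math

X = [[0, 1], [1, 0]]
zero = [1.0, 0.0]  # the state |0⟩

def apply(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def inner(u, v):
    return sum(complex(ui).conjugate() * vi for ui, vi in zip(u, v))

exp_X = inner(zero, apply(X, zero))             # ⟨X⟩
exp_X2 = inner(zero, apply(X, apply(X, zero)))  # ⟨X²⟩
delta = math.sqrt((exp_X2 - exp_X**2).real)     # Δ(X)
```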

2.60

Suppose $\vec v$ is any real three-dimensional unit vector. Then $$ \begin{align*} \vec v \cdot \vec\sigma &= v_1\begin{pmatrix}0&1\newline 1&0\end{pmatrix}+v_2\begin{pmatrix}0&-i\newline i&0\end{pmatrix}+v_3\begin{pmatrix}1&0\newline 0&-1\end{pmatrix}\newline &= \begin{pmatrix}v_3 & v_1-iv_2\newline v_1+iv_2 & -v_3\end{pmatrix}. \end{align*} $$ Solving $$ \begin{align*} \det(\vec v \cdot \vec\sigma-\lambda\Bbb I) &= \det \begin{pmatrix} v_3-\lambda & v_1-iv_2\newline v_1+iv_2 & -v_3-\lambda \end{pmatrix} = 0\newline \Rightarrow \lambda^2 &- |\vec v|^2 = 0\newline \therefore \lambda &= \pm 1, \end{align*} $$ where we used $|\vec v| = 1$. To find the projectors we first need to find the associated eigenvectors, since $P_{\pm} = \ket{\lambda_\pm}\bra{\lambda_\pm}$.
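Equivalently, $\mathrm{tr}(\vec v\cdot\vec\sigma) = 0$ and $(\vec v\cdot\vec\sigma)^2 = |\vec v|^2\,\Bbb I = \Bbb I$ together force the eigenvalues to be $\pm 1$. A numerical sketch with an illustrative unit vector:

```python
# Check (v·σ)² = I and tr(v·σ) = 0 for a real unit vector v (Problem 2.60);
# together these force the eigenvalues of v·σ to be ±1. The vector below is
# an illustrative choice.
v = [0.6, 0.0, 0.8]  # real unit vector: 0.36 + 0 + 0.64 = 1

sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]

vs = [[v[0] * sx[i][j] + v[1] * sy[i][j] + v[2] * sz[i][j]
       for j in range(2)] for i in range(2)]  # the matrix v·σ

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sq = matmul(vs, vs)           # should equal the identity
trace = vs[0][0] + vs[1][1]   # should equal 0
```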

For $\lambda_+ = 1$:

2.61

2.62

2.63

2.64

2.65

2.66

2.67

2.68

2.69

2.70

2.71