This is the fourth installment of a condensed summary of linear algebra theory following Axler’s text.
We continue with isometries and singular value decomposition.
An operator $T \in \L(V)$ is called positive if $T$ is self-adjoint and
\[\inner{Tv}{v} \geq 0\]for all $v \in V$.
If $V$ is a complex vector space, then $\inner{Tv}{v} \in \R$ for all $v \in V$ implies $T$ is self-adjoint, so self-adjointness can be dropped from the definition.
(Example.) If $U$ is a subspace of $V$, then the orthogonal projection $P_U$ is a positive operator. Let $\dim U = m$ and $\dim U\uperp = M$, let $e_1, \dots, e_m$ be an orthonormal basis of $U$, and let $f_1, \dots, f_M$ be an orthonormal basis of $U\uperp$. We have $P_U(v) = \inner{v}{e_1}e_1 + \dots + \inner{v}{e_m}e_m$. Since $V = U \oplus U\uperp$, we can write $v = P_U(v) + \sum_{j=1}^M a_j f_j$, and the second summand is orthogonal to every vector in $U$. Therefore $\inner{P_U(v)}{v} = \inner{P_U(v)}{P_U(v)} = \norm{P_U(v)}^2 \geq 0$. To check self-adjointness, we have, for any $v, w \in V$, that
\[\begin{align*} \inner{P_U(v)}{w} &= \inner{P_U(v)}{P_U(w) + (w - P_U(w))} \\ &= \inner{P_U(v)}{P_U(w)} \\ &= \inner{v}{P_U(w)}, \end{align*}\]where $w - P_U(w) \in U\uperp$.
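The projection example can be checked numerically. The sketch below (an illustration, not from the text) builds the matrix of $P_U$ for a one-dimensional subspace $U = \operatorname{span}\{u\}$ of $\R^3$ and verifies both defining properties of a positive operator; the vector `u` is an arbitrary choice.

```python
import numpy as np

# Illustrative sketch: the orthogonal projection onto U = span{u} in R^3
# is a positive operator. The choice of u is arbitrary.
rng = np.random.default_rng(0)
u = np.array([1.0, 2.0, 2.0])
e1 = u / np.linalg.norm(u)          # orthonormal basis of U (here dim U = 1)
P = np.outer(e1, e1)                # matrix of P_U: v -> <v, e1> e1

# Self-adjointness: P equals its transpose (real case).
assert np.allclose(P, P.T)

# Positivity: <P v, v> = ||P v||^2 >= 0 for random v.
for _ in range(100):
    v = rng.standard_normal(3)
    assert (P @ v) @ v >= -1e-12
    assert np.isclose((P @ v) @ v, np.linalg.norm(P @ v) ** 2)
```

The key identity $P_U^2 = P_U$ also holds, since projecting twice changes nothing.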
An operator $R$ is called a square root of an operator $T$ if $R^2 = T$.
Let $T \in \L(V)$. Then the following are equivalent: (1) $T$ is positive; (2) $T$ is self-adjoint and all eigenvalues of $T$ are nonnegative; (3) $T$ has a positive square root; (4) $T$ has a self-adjoint square root; (5) there exists $R \in \L(V)$ such that $T = R\adj R$.
(Proof.) We prove $(1) \implies (2) \implies \dots \implies (5) \implies (1).$ We have $(1) \implies (2)$ because $T$ is self-adjoint by definition of positive, and if $v$ is an eigenvector of $T$ with eigenvalue $\lambda$, then $\inner{Tv}{v} = \lambda \norm{v}^2 \geq 0$ implies $\lambda \geq 0$.
Now assume $(2)$. Then by the Spectral Theorem, $T$ has an orthonormal basis $e_k$ of eigenvectors. We are therefore allowed to construct a linear map $R$ defined by its action on the basis with
\[R(e_k) = \sqrt{\lambda_k} e_k,\]where $Te_k = \lambda_k e_k$. This $R$ is a square root of $T$. Let $v = a_1 e_1 + \dots + a_n e_n$ and $w = b_1 e_1 + \dots + b_n e_n$. We have
\[\begin{align*} \inner{Rv}{w} &= \sum_ { k=1} ^n a_k \sqrt{\lambda_k}\inner{e_k}{w} \\ &= \sum_ {j=1} ^ n \sum_ { k=1} ^n a_k \overline{b_j} \sqrt{ \lambda_k}\inner{e_k}{e_j} \\ &= \sum_{k=1}^n a_k \overline{b_k} \sqrt{\lambda_k}, \end{align*}\]and the same expansion shows $\inner{v}{Rw} = \sum_{k=1}^n a_k \overline{b_k} \sqrt{\lambda_k}$, since each $\sqrt{\lambda_k}$ is real. Hence $R$ is self-adjoint. Moreover $\inner{Rv}{v} = \sum_{k=1}^n \sqrt{\lambda_k} |a_k|^2 \geq 0$, so $R$ is positive.
Now $(3) \implies (4)$ by definition, so assume $(4)$. If $R$ is a self-adjoint square root of $T$, then $T = RR = R\adj R$. Now assume $(5)$. If there exists $R$ such that $T=R\adj R$, then $T\adj = (R\adj R)\adj = R\adj R = T$, so $T$ is self-adjoint. Furthermore, for $v \in V$, we have $\inner{R\adj R v}{v} = \inner{Rv}{Rv} \geq 0$ by positivity of the inner product. Hence $T$ is positive, which completes the cycle.
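The $(2) \implies (3)$ construction is concrete enough to run. The sketch below (illustrative names, not from the text) builds a positive $T$ via condition (5) as $A\adj A$, diagonalizes it with an orthonormal eigenbasis, and forms $R$ by taking square roots of the eigenvalues:

```python
import numpy as np

# Sketch of the (2) => (3) construction: diagonalize a positive T with an
# orthonormal eigenbasis and take square roots of the eigenvalues.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
T = A.T @ A                          # T = A* A is positive (condition (5))

lam, Q = np.linalg.eigh(T)           # columns of Q: orthonormal eigenvectors e_k
lam = np.clip(lam, 0.0, None)        # clear tiny negative round-off before sqrt
R = Q @ np.diag(np.sqrt(lam)) @ Q.T  # R e_k = sqrt(lambda_k) e_k

assert np.allclose(R @ R, T)         # R is a square root of T
assert np.allclose(R, R.T)           # R is self-adjoint
assert np.all(np.linalg.eigvalsh(R) >= -1e-10)  # with nonnegative eigenvalues
```

The three assertions correspond to conditions (3), (4), and (2) of the theorem, respectively.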
Every positive operator on $V$ has a unique positive square root.
(Proof.) Let $T \in \L(V)$ be a positive operator. Suppose $v \in V$ is an eigenvector of $T$. Then there exists $\lambda \geq 0$ such that $Tv = \lambda v$. Let $R$ be a positive square root of $T$. We want to show that $Rv = \sqrt{\lambda} v$. This will imply that $R$ is uniquely determined, because the Spectral Theorem asserts that $V$ has a basis of eigenvectors of $T$.
We know by the Spectral Theorem that $V$ has an orthonormal basis of eigenvectors $e_k$ of $R$. Each eigenvalue of $R$ is a nonnegative real, so we may write the eigenvalue corresponding to $e_j$ as $\sqrt{\lambda_j}$ for some $\lambda_j \geq 0$. If $v = a_1 e_1 + \dots + a_n e_n$, then we have $R^2v = \lambda_1 a_1 e_1 + \dots + \lambda_n a_n e_n$. But $R^2v = Tv = \lambda a_1 e_1 + \dots + \lambda a_n e_n$. Comparing coefficients gives $(\lambda_j - \lambda)a_j = 0$ for all $j$, so $v = \sum_{\lambda_j = \lambda} a_j e_j$. Therefore $Rv = \sum_{\lambda_j = \lambda} \sqrt{\lambda_j}\, a_j e_j = \sqrt{\lambda}\, v$, as desired.
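The uniqueness argument predicts that any eigenvector of $T$ with eigenvalue $\lambda$ is automatically an eigenvector of the positive square root with eigenvalue $\sqrt{\lambda}$. A quick numerical sketch (setup is illustrative) confirms this:

```python
import numpy as np

# Numerical sketch of the uniqueness argument: an eigenvector v of a
# positive T with eigenvalue lam satisfies R v = sqrt(lam) v for the
# positive square root R built from T's eigendecomposition.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
T = A.T @ A                                   # a positive operator

lam, Q = np.linalg.eigh(T)
lam = np.clip(lam, 0.0, None)                 # clear tiny negative round-off
R = Q @ np.diag(np.sqrt(lam)) @ Q.T           # the positive square root of T

v = Q[:, -1]                                  # eigenvector of T, eigenvalue lam[-1]
assert np.allclose(T @ v, lam[-1] * v)
assert np.allclose(R @ v, np.sqrt(lam[-1]) * v)   # R v = sqrt(lambda) v
```

Any other candidate square root would have to act the same way on every eigenvector of $T$, which is exactly why the positive square root is unique.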
(Isometries preserve norms.) An operator $S \in \L(V)$ is called an isometry if
\[\norm{Sv} = \norm{v}\]for all $v \in V$.
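A rotation of the plane is the standard first example of an isometry. The sketch below (angle chosen arbitrarily) checks norm preservation on random vectors, and also checks the identity $S\adj S = I$, a standard equivalent characterization of isometries:

```python
import numpy as np

# Sketch: a rotation matrix is an isometry of R^2 -- it preserves every norm.
theta = 0.7                                   # arbitrary angle (illustrative)
S = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(3)
for _ in range(100):
    v = rng.standard_normal(2)
    assert np.isclose(np.linalg.norm(S @ v), np.linalg.norm(v))

# Equivalent characterization: S* S = I.
assert np.allclose(S.T @ S, np.eye(2))
```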