Random math: 19

Source of most content: Gunhee Cho

Diagonalization

Recall from RM 18 that an orthonormal basis (ONB) consisting of eigenvectors of $T$ exists iff $T$ is normal.

Corollary

\[A \in Mat_{n \times n} (\mathbb{C}) \text{ is normal iff } \exists \text{ unitary } Q \text{ s.t. } Q^*AQ \text{ is a diagonal matrix.} \nonumber\]

Proof

$(\Rightarrow)$ Since $A$ is normal, the lin. op. $L_A : \mathbb{C}^n \rightarrow \mathbb{C}^n $ is also normal. Thus, there exists an ONB $\beta = \{ v_1, \cdots, v_n \}$ of $\mathbb{C}^n $ consisting of eigenvectors of $L _A $, which is equivalent to $[ L_A ] _{\beta} $ being a diagonal matrix. Let $Q = [ Id ]^{\epsilon _n} _\beta$, the matrix whose columns are $v_1, \cdots, v_n$; since these columns form an ONB, $Q$ is unitary and $Q^{-1} = Q^*$. Then, $ [ L_A ] _\beta = [ Id ]^\beta _{\epsilon _n} [ L_A ]^{\epsilon _n} _{\epsilon _n} [ Id ]^{\epsilon _n} _\beta = Q^{-1} A Q = Q^* A Q$, so $Q^* A Q$ is a diagonal matrix. $(\Leftarrow)$ If $Q^* A Q = D$ is diagonal, then $A = QDQ^*$ and $AA^* = QDD^*Q^* = QD^*DQ^* = A^*A$, so $A$ is normal.
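A quick numerical illustration of this corollary, as a numpy sketch (the random unitary $Q$ and diagonal $D$ below are arbitrary choices, not from the notes): $A = QDQ^*$ is normal, and $Q$ diagonalizes it back.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random unitary Q via QR factorization of a complex Gaussian matrix.
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(Z)

# A = Q D Q^* is normal for any diagonal D (the "if" direction).
D = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
A = Q @ D @ Q.conj().T

print(np.allclose(A @ A.conj().T, A.conj().T @ A))  # A normal: True
print(np.allclose(Q.conj().T @ A @ Q, D))           # Q^* A Q diagonal: True
```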

NB1; Let $A \in Mat_{n \times n} (\mathbb{R})$. Then, $A$ is self-adjoint iff $A=A^T$, i.e., $A$ is symmetric.
NB2; If $A\in Mat_{n \times n} (\mathbb{R})$ is symmetric, then $A$ is normal. But note that the converse is not true (e.g., a rotation matrix is normal but not symmetric).
NB3; If $A\in Mat_{n \times n} (\mathbb{C})$ satisfies $A=A^H$, then $A$ is normal. But a complex symmetric $A$ (i.e., $A = A^T$) need not be normal.
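Both cautionary remarks can be checked numerically; a small numpy sketch (the matrices are ad-hoc counterexamples chosen for this note):

```python
import numpy as np

# NB2: a rotation matrix is normal (it is orthogonal) but not symmetric.
t = 0.7
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
print(np.allclose(R @ R.T, R.T @ R))  # True: normal
print(np.allclose(R, R.T))            # False: not symmetric

# NB3: a complex symmetric matrix need not be normal.
A = np.array([[0, 1],
              [1, 1j]])
print(np.allclose(A, A.T))                          # True: symmetric
print(np.allclose(A @ A.conj().T, A.conj().T @ A))  # False: not normal
```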

Lemma

\[\text{Assume } T: V \rightarrow V \text{ is self-adjoint (S-A). Then, } \nonumber\]
  1. Every eigenvalue of $T$ is real.
  2. Let $V$ be f.d.i.p.s over $\mathbb{R}$. Then, $\varphi_T (t)$ splits. (thus we can use Schur’s thm)

Proof

  1. Let $v \ne 0$ with $T(v) = \lambda v$. Since $T$ is self-adjoint, it is normal, so $T^* (v) = \bar{\lambda} v$. Then, $\lambda v = T(v) = T^* (v) = \bar{\lambda} v$ $\Leftrightarrow (\lambda - \bar{\lambda})v = 0 \Leftrightarrow \lambda = \bar{\lambda}$, i.e., $\lambda \in \mathbb{R}$.
  2. By the G-S process, there exists an ONB $\beta$ of $V$. Let $A = [ T ] _\beta $. Then, extend $L_A$ to $\hat{L}_A: \mathbb{C}^n \rightarrow \mathbb{C}^n \ (X \mapsto AX ) $, where clearly $\hat{L}_A \mid _{\mathbb{R}^n} = L_A$. Let $\epsilon$ denote the standard basis of $\mathbb{C}^n$. Then, $[ \hat{L}_A ] _\epsilon = A = A^H$ since $A^H =( [ T ] _\beta )^H \stackrel{\text{ONB } \beta}{=} [ T^* ] _\beta \stackrel{\text{S-A } T}{=} [ T ] _\beta = A$. Thus $\hat{L}_A$ is also self-adjoint. Over $\mathbb{C}$, $\varphi _{\hat{L} _A} (t)$ splits, and by 1 every root $\lambda_i$ is real, so $\varphi _{ L_A }(t) = \varphi _{\hat{L} _A} (t) = \prod _{i}(t-\lambda_i )$ with all $\lambda_i \in \mathbb{R}$. This implies that $ \varphi _T (t) = \varphi _{ L_A }(t) $ splits over $\mathbb{R}$.
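The lemma is easy to see numerically for a real symmetric matrix; a numpy sketch (the random matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
S = B + B.T  # real symmetric, hence self-adjoint

# Lemma part 1: every eigenvalue is real, so the characteristic
# polynomial splits over R (part 2).
lam = np.linalg.eigvals(S)
print(np.allclose(lam.imag, 0))  # True
```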

Theorem

\[\text{Given a f.d.i.p.s } V \text{ over } \mathbb{R} \text{ and } T: V \rightarrow V, \text{ the following hold: } \nonumber\]
  1. There exists an ONB $\beta$ consisting of eigenvectors of $T$ s.t. $ [ T ] _\beta$ is a diagonal matrix iff $T$ is self-adjoint.
  2. $A \in Mat_{n\times n} (\mathbb{R})$ is symmetric iff $\exists$ orthogonal matrix $Q$ s.t. $Q^\top A Q$ is a diagonal matrix.

Proof

    • $(\Rightarrow)$ $ [ T^* ] _\beta \stackrel{\text{ONB } \beta}{=} ([ T ] _\beta)^H = diag(\bar{\lambda} _1, \cdots, \bar{\lambda} _n ) \stackrel{\lambda _i \in \mathbb{R}}{=} diag(\lambda _1, \cdots,\lambda _n ) = [ T ] _\beta $, where the $\lambda _i$ are real because they are the diagonal entries of the real matrix $[ T ] _\beta$. Thus, $T= T^*$, i.e., $T$ is S-A.
    • $(\Leftarrow)$ Since $T$ is S-A, by the above lemma-2, $\varphi_T (t)$ splits. By Schur’s thm, there exists an ONB $\beta$ s.t. $ [ T ] _\beta $ is an upper triangular matrix. Then, $ [ T ] _\beta \stackrel{\text{S-A } T}{=} [ T^* ] _\beta \stackrel{\text{ONB } \beta}{=} ([ T ] _\beta)^H$, and an upper triangular matrix that equals its conjugate transpose must be diagonal, so $ [ T ] _\beta $ is a diagonal matrix.
  2. Follows from statement 1 applied to $L_A$, exactly as in the above Corollary, together with NB1.
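In numpy, statement 2 is exactly what np.linalg.eigh computes for a symmetric matrix; a minimal sketch (the random symmetric $A$ is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2  # real symmetric

lam, Q = np.linalg.eigh(A)  # columns of Q: an ONB of eigenvectors
print(np.allclose(Q.T @ Q, np.eye(4)))         # Q is orthogonal
print(np.allclose(Q.T @ A @ Q, np.diag(lam)))  # Q^T A Q is diagonal
```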

Unitary and Orthogonal

Def. Let $(V, \langle , \rangle)$ be f.d.i.p.s over $F$. Then, $T$ is called unitary (when $F = \mathbb{C}$) or orthogonal (when $F = \mathbb{R}$) if $T\circ T^* = T^* \circ T = Id$.

Proposition

\[\text{TFAE} \nonumber\]
  1. $\forall x \in V, \ \Vert T(x) \Vert = \Vert x \Vert $.
  2. $ T T^* = T^* T = Id$, i.e., $T$ is unitary/orthogonal.
  3. $\forall x,y \in V$, $< T(x), T(y) > = < x, y> $, i.e., $T$ preserves the inner product.
  4. Let $\beta = \{ v_1, \cdots, v_n \}$ be ONB. Then, $T(\beta) = \{ T(v_1), \cdots, T(v_n) \}$ is also ONB.
  5. There exists ONB $\beta$ s.t. $T (\beta)$ is ONB of $V$.

Proof

The proof relies on the following lemma, which is proved at the end.

\[\text{Assume } T: V \rightarrow V \text{ is S-A. If } \forall x \in V, \ < x, T(x) > = 0 \text{ then, } T = 0. \nonumber\]
  1. $ (1 \Rightarrow 2) $: NTS $Id - T^* T = 0$. Since $(Id - T^* T)^* = Id - T^* T$, it is S-A, so by the above lemma it suffices to show $ \forall x \in V, < x, (Id - T ^* T)(x) > = 0$. Here, $ < x,x > - < x, T^* T(x) > = < x, x > - < T(x) , T(x) > = \Vert x \Vert^2 - \Vert T(x) \Vert^2 \stackrel{1}{=} 0$. Thus $T^* T = Id$, and since $V$ is f.d., this also forces $T T^* = Id$.
  2. $ (2 \Rightarrow 3) $: $ < T(x), T(y) > = < x, T^* T(y) > \stackrel{2}{=} < x, y >$.
  3. $ (3 \Rightarrow 4)$ : $ < T(v_i), T(v_j) > \stackrel{3}{=} < v_i, v_j > = \delta_{ij}$, so $\{ T(v_i) \}$ is an orthonormal set of $n = \dim V$ vectors, hence an ONB.
  4. $ (4 \Rightarrow 5)$ : By the G-S process, there exists an ONB $\beta$ of $V$; by 4, $T(\beta)$ is then an ONB, so 5 holds.
  5. $ (5 \Rightarrow 1)$ : Let $\beta=\{ v_1, \cdots, v_n \}$ be an ONB s.t. $T(\beta)$ is an ONB. $\forall x \in V$ we can write $x = \sum_{i=1}^n a_i v_i$. Then, $ \Vert T(x) \Vert ^2 = < T(x), T(x) > = < \sum_{i=1}^n a_i T(v_i), \sum_{j=1}^n a_j T(v_j) > = \sum_{i=1}^n \sum_{j=1}^n a_i \bar{a}_j < T(v_i), T(v_j) > \stackrel{5}{=} \sum_{i=1}^n \sum_{j=1}^n a_i \bar{a}_j < v_i, v_j > = < \sum_{i=1}^n a_i v_i, \sum_{j=1}^n a_j v_j > = < x, x > = \Vert x \Vert^2$.

Proof of lemma

Since $T$ is S-A, there exists an ONB $\beta = \{ v_1, \cdots, v_n \}$ consisting of eigenvectors. By Asm, $\forall i, \ 0 = < v_i, T(v_i) > = < v_i, \lambda_i v_i > = \bar{\lambda}_i < v_i, v_i > = \bar{\lambda}_i$. Thus, $\forall i, \lambda_i = 0 \implies [ T ] _\beta = 0$, i.e., $T = 0$.

Corollary

\[\text{TFAE} \nonumber\]
  1. $\forall x \in \mathbb{R}^n \text{ or } \mathbb{C}^n,\ \Vert Ax \Vert = \Vert x \Vert$.
  2. $A$ is unitary/orthogonal.
  3. $\forall x,y , \ < Ax, Ay > = < x, y >$.
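A numpy sketch checking the three equivalent conditions for a random unitary matrix (all names below are ad-hoc; np.vdot conjugates its first argument, matching a complex inner product):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(Z)  # random unitary matrix

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

print(np.allclose(Q.conj().T @ Q, np.eye(n)))                # 2: Q^* Q = Id
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # 1: norm preserved
print(np.isclose(np.vdot(Q @ x, Q @ y), np.vdot(x, y)))      # 3: <Qx,Qy> = <x,y>
```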

Projection

Def. Let $(V, \langle , \rangle)$ be f.d.i.p.s over $F$. Then, $T: V\rightarrow V$ is called projection if

  1. $V = \text{Im}(T) \oplus \ker (T)$.
  2. For $x \in \text{Im} (T), \ T(x) = x$, i.e., $T \mid_{\text{Im}T}=Id$.

Def. Let $(V, \langle , \rangle)$ be f.d.i.p.s over $F=\mathbb{R}$ or $\mathbb{C}$. Then, a projection $T: V\rightarrow V$ is called orthogonal if $\ker (T) = \text{Im}(T)^\perp$.

Proposition

\[T: V\rightarrow V \text{ is a projection iff } T\circ T = T^2 = T. \nonumber\]

Proof

  1. $(\Rightarrow)$ Since $T$ is a projection, $V= \text{Im}(T) \oplus \ker(T)$, so any $v \in V$ can be written $v = x + y$ with $x\in \text{Im}(T)$ and $y\in\ker(T)$. Then, $T(v) = T(x) + T(y) \stackrel{y\in\ker(T)}{=} T(x) \stackrel{x\in\text{Im}(T)}{=} x$, hence $T^2(v) = T(x) = x = T(v)$.
  2. $(\Leftarrow)$ First, if $x \in \text{Im}(T)$, then $x = T(y)$ for some $y \in V$, so $T(x) = T(T(y)) \stackrel{T^2 = T}{=} T(y) = x$; thus $T \mid_{\text{Im}T} = Id$. Next, it suffices to show $\text{Im}(T) \cap \ker(T) = \{ 0 \}$ to prove $V = \text{Im}(T) \oplus \ker(T)$, by the dimension formula $\dim V = \dim \text{Im}(T) + \dim \ker (T)$. If $x \in \text{Im}(T) \cap \ker(T)$, then $x = T(x) = 0$ by the first step.
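A projection need not be orthogonal; a minimal numpy sketch with a hand-picked oblique projection (the matrix is an ad-hoc example):

```python
import numpy as np

# P projects onto Im(P) = span{(1,0)} along ker(P) = span{(1,-1)},
# which are not perpendicular, so P is an oblique projection.
P = np.array([[1., 1.],
              [0., 0.]])

print(np.allclose(P @ P, P))  # True: P^2 = P, a projection
print(np.allclose(P, P.T))    # False: P != P^*, so not orthogonal
```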

\[T: V\rightarrow V \text{ is an orthogonal projection iff } T\circ T = T = T^*. \nonumber\]

Proof

  1. $(\Leftarrow)$ Since $T \circ T = T$, $T$ is a projection by the previous proposition. Moreover, $\ker(T) \stackrel{NB1}{=} (\text{Im}(T^* ))^\perp \stackrel{ T=T^* }{=} (\text{Im}(T))^\perp$, so the projection is orthogonal.
  2. $(\Rightarrow)$ For any $x, y \in V$, let $x= x_1 + x_2$ and $y= y_1 + y_2$ for $x_1, y_1 \in \text{Im}(T)$ and $x_2, y_2 \in \ker (T)$. Then, $< T(x), y > = < x_1, y_1+y_2 > = < x_1, y_1 >$ (since $y_2 \in \ker(T) = (\text{Im}(T))^\perp$) and $< T^* (x), y > = < x, T(y) > = < x_1 + x_2, y_1 > = < x_1 , y_1 > $ (since $x_2 \perp y_1$). Thus, $ T = T^* $, and $T \circ T = T$ holds because $T$ is a projection.

NB1; For f.d.i.p.s $V$, $ \ker (T) = (\text{Im}(T^* ))^\perp$: $x \in \ker(T) \Leftrightarrow T(x) = 0 \Leftrightarrow \forall y \in V, < T(x), y > = 0 \Leftrightarrow \forall y \in V, < x, T^* (y) > = 0 \Leftrightarrow x \perp \text{Im}(T^* ) \Leftrightarrow x \in (\text{Im}(T^* ))^\perp$
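Orthogonal projections are easy to build in numpy: if the columns of $U$ are an ONB of a subspace $W$, then $P = UU^*$ is the orthogonal projection onto $W$. A sketch (the subspace is random and ad-hoc):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 2))
U, _ = np.linalg.qr(X)  # orthonormal basis of W = col(X)
P = U @ U.T             # orthogonal projection onto W

print(np.allclose(P @ P, P))  # True: P^2 = P
print(np.allclose(P, P.T))    # True: P = P^*, so orthogonal
```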

Spectral Theorem

\[\text{Given a f.d.i.p.s } (V, \langle , \rangle) \text{ over } F, \text{ let } T: V\rightarrow V \text{ be a linear operator. Let } \{ \lambda_i \}_{i=1}^k \text{ be the mutually distinct eigenvalues of } T \nonumber\] \[\text{and } \{ W_i \}_{i=1}^k \text{ be the eigenspaces of } T \text{ corresponding to the } \lambda_i, \text{ i.e., } W_i = \ker (T- \lambda_i Id). \nonumber\] \[\text{Assume } T \text{ is normal (if } F = \mathbb{C} \text{) or self-adjoint (if } F = \mathbb{R} \text{). Then, the following hold: } \nonumber\]
  1. $V = W_1 \oplus \ldots \oplus W_k$.
  2. $W_i \perp W_j$ if $i \ne j$.
  3. $\forall i, \ W_i^\perp = W_1 \oplus \ldots \oplus W_{i-1} \oplus W_{i+1} \oplus \ldots \oplus W_k$.
  4. Define $T_i : V \rightarrow V \left( (x_1, \ldots, x_k) \mapsto (0, \ldots, 0, x_i, 0, \ldots, 0) \right)$ with respect to the decomposition in 1. Then, each $T_i$ is an orthogonal projection satisfying

    a. $T_i \circ T_j = \delta_{ij}T_i$,
    b. $\sum_{i=1}^k T_i= Id$,
    c. $\sum_{i=1}^k \lambda_i T_i = T$.

NB2; This indeed implies that there exists an orthonormal basis consisting of eigenvectors; the proof is given in RM 18.
NB3; For each $i$, let $\beta_i $ be an orthonormal basis of $ W_i $, and set $\beta=\cup_{i=1}^k \beta_i$. Then, $ [ T ] _{\beta} = diag( \lambda _1 I _{n _1}, \ldots, \lambda _k I _{n_k} ) $ is block diagonal, where $n_i = \dim W_i$.
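A numpy sketch of the spectral decomposition for a small symmetric matrix with a repeated eigenvalue (the matrix and the hard-coded list of distinct eigenvalues are ad-hoc choices for illustration):

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 3.]])  # eigenvalues: 1 (simple) and 3 (double)

lam, Q = np.linalg.eigh(A)
distinct = [1.0, 3.0]

# Orthogonal projection T_i onto each eigenspace W_i: group the
# eigenvector columns belonging to the same eigenvalue.
Ts = [Q[:, np.isclose(lam, mu)] @ Q[:, np.isclose(lam, mu)].T for mu in distinct]

print(np.allclose(sum(Ts), np.eye(3)))                           # b: sum T_i = Id
print(np.allclose(sum(m * T for m, T in zip(distinct, Ts)), A))  # c: sum λ_i T_i = T
print(np.allclose(Ts[0] @ Ts[1], np.zeros((3, 3))))              # a: T_i T_j = 0, i != j
```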