Random math: 16

Source of most content: Gunhee Cho

Inner product space

Def. Given a finite dimensional vector space $V$ over $F$, we call a function $\langle\ , \rangle: V \times V \rightarrow F$ an inner product if it satisfies:

  1. (Linearity w.r.t. 1st component) $\forall x,y,z \in V, \langle x+y, z\rangle = \langle x, z\rangle + \langle y, z\rangle.$
  2. (Linearity w.r.t. 1st component) $\forall c \in F, \forall x,y \in V, \ \langle cx, y \rangle = c\langle x, y\rangle.$
  3. (Conjugate symmetry) $\forall x,y \in V, \langle x, y \rangle = \overline{ \langle y, x \rangle}.$
  4. (Positive-definite) $\langle x, x \rangle > 0$ if $x\ne 0$; $x=0 \leadsto \langle x, x \rangle = 0$.

The inner product introduces the concepts of length, distance, and angle.

NB1; Let $F=\mathbb{R}$, then $\langle, \rangle$ is symmetric and bilinear, i.e., it is linear w.r.t. 1st component and 2nd component.
NB2; Let $F=\mathbb{C}$, then $\langle, \rangle$ is neither symmetric nor bilinear, since $\langle x, cy \rangle = \overline{c} \langle x, y\rangle$; instead it is Hermitian (conjugate linear in the 2nd component).
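
NB1 and NB2 are easy to verify numerically. The following is a minimal NumPy sketch (ours, not from the source notes; the helper name `ip` is an assumption) checking the four axioms and the conjugate-linearity in the 2nd component for the standard inner product $\langle x, y \rangle = \sum_i x_i \overline{y_i}$ on $\mathbb{C}^3$:

```python
import numpy as np

# Standard inner product on C^n with the convention used above:
# linear in the 1st slot, conjugate-linear in the 2nd.
def ip(x, y):
    return np.sum(x * np.conj(y))

rng = np.random.default_rng(0)
x = rng.normal(size=3) + 1j * rng.normal(size=3)
y = rng.normal(size=3) + 1j * rng.normal(size=3)
z = rng.normal(size=3) + 1j * rng.normal(size=3)
c = 2 - 3j

assert np.isclose(ip(x + y, z), ip(x, z) + ip(y, z))    # 1. additivity, 1st slot
assert np.isclose(ip(c * x, y), c * ip(x, y))           # 2. homogeneity, 1st slot
assert np.isclose(ip(x, y), np.conj(ip(y, x)))          # 3. conjugate symmetry
assert ip(x, x).real > 0                                # 4. positive-definite
assert np.isclose(ip(x, c * y), np.conj(c) * ip(x, y))  # NB2: conjugate-linear in 2nd slot
```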

Def. Given an inner product space $(V, \langle,\rangle )$, define the norm of $x$ by $\Vert x \Vert := \sqrt{\langle x, x \rangle}$ and the distance between $x$ and $y$ by $d(x,y) := \Vert x - y \Vert$.
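
As a quick sanity check of these definitions, a short sketch (ours, assuming the standard inner product on $\mathbb{R}^n$):

```python
import numpy as np

def norm(x):
    # ||x|| = sqrt(<x, x>)
    return np.sqrt(np.sum(x * x))

def dist(x, y):
    # d(x, y) = ||x - y||
    return norm(x - y)

x, y = np.array([3.0, 4.0]), np.array([0.0, 0.0])
print(norm(x), dist(x, y))  # 5.0 5.0, the familiar Euclidean length
```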

Def. Given a finite dimensional inner product space $(V, \langle , \rangle)$, let $S \leqslant V$. Define the orthogonal complement of $S$ as $S^\perp := \{v\in V \mid \langle v, s \rangle =0\ \forall s \in S \}$, i.e., the set of all $v$ with $v \perp S$.

e.g.) $\{0\}^\perp = V$ and $V^\perp = \{ 0\}$.
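
Numerically, $S^\perp$ can be computed as a null space: $\langle v, s \rangle = 0$ for every spanning vector $s$ iff $Av = 0$, where the rows of $A$ span $S$. A small sketch (ours; the choice $S = span\{e_1, e_2\} \subseteq \mathbb{R}^3$ is an assumption for illustration, so $S^\perp$ should come out as the $z$-axis):

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],    # rows span S
              [0.0, 1.0, 0.0]])
_, sv, Vt = np.linalg.svd(A)
rank = np.sum(sv > 1e-12)
S_perp_basis = Vt[rank:]          # remaining right-singular vectors span the null space
print(S_perp_basis)               # ~ [[0. 0. 1.]]
```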

Proposition

\[\text{Let } S= \{ v_1, \cdots, v_k \},\ v_i \ne 0, \text{ be an orthogonal subset of } V \text{, i.e., } \forall i \ne j, \langle v_i, v_j \rangle = 0. \nonumber\] \[\text{Then, } S \text{ is linearly independent and } \forall y \in span(S),\ y = \sum_{i=1}^k a_i v_i, \text{ where } a_i = \frac{\langle y, v_i \rangle}{\langle v_i, v_i\rangle}. \nonumber\]

NB3; With an inner product, we can compute the value of each coefficient explicitly.

Proof

  1. Suppose $a_1 v_1 + \cdots + a_k v_k =0,\ a_i \in F$. Applying the coefficient computation in 2. below to $y=0$ gives $a_i = \frac{\langle 0, v_i \rangle}{\langle v_i, v_i\rangle} = 0$. Thus, $S$ is linearly independent.
  2. $\langle y, v_i \rangle = \langle \sum_{j=1}^k a_j v_j , v_i \rangle \stackrel{linearity}{=} \sum_{j=1}^k \langle a_j v_j , v_i \rangle \stackrel{orthogonality}{=} \langle a_i v_i , v_i \rangle = a_i \langle v_i , v_i \rangle.$ Since $v_i \ne 0$, $\langle v_i, v_i \rangle \ne 0$, so $a_i = \frac{\langle y, v_i \rangle}{\langle v_i, v_i \rangle}$.
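
A quick numerical illustration of the coefficient formula (our sketch; the orthogonal, non-orthonormal set in $\mathbb{R}^3$ is an assumption for illustration):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])    # mutually orthogonal, but not unit length
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])

y = 3.0 * v1 - 2.0 * v2 + 0.5 * v3            # some y in span(S)

# a_i = <y, v_i> / <v_i, v_i> recovers the coefficients
coeffs = [np.dot(y, v) / np.dot(v, v) for v in (v1, v2, v3)]
print(coeffs)                                  # [3.0, -2.0, 0.5]
```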

Proposition

\[\text{Given a finite dimensional inner product space } (V, \langle , \rangle), \text{ let } W \leqslant V. \text{ Then, } W^\perp \leqslant V \text{ and } V = W \oplus W^\perp. \nonumber\]


NB4; For $v \in V$, we can write $v= w+ w'$ with $w\in W,\ w' \in W^\perp$; since the sum is direct, this expression is unique. Then, $w$ is the unique vector in $W$ that is closest to $v$, i.e., $ \Vert v -w \Vert \leq \Vert v- x \Vert, \forall x \in W$: indeed, $v-w \in W^\perp$ and $w-x \in W$, so $\Vert v-x \Vert^2 = \Vert v-w \Vert^2 + \Vert w-x \Vert^2 \geq \Vert v-w \Vert^2$ by Pythagoras.

Proof

  1. Since $0 \in W^\perp$, $W^\perp \ne \emptyset$. Take any $v_1, v_2 \in W^\perp$ and $c \in F$, then $\forall w \in W, \ \langle v_1 + c v_2, w\rangle = \langle v_1, w\rangle + c\langle v_2, w \rangle = 0 + c\cdot 0 =0$, i.e., $v_1 + c v_2 \in W^\perp$. Therefore, $W^\perp \leqslant V$.
  2. To prove $V= W \oplus W^\perp$, we need to show (a) $V=W+W^\perp$ and (b) $W \cap W^\perp = \{ 0 \}$.
    • (a) Let $\beta = \{ v_1, \cdots, v_k \}$ be an orthonormal basis of $W$ with $\dim W = k$, i.e., $\langle v_i, v_j \rangle = 0$ for $i \ne j$ and $\Vert v_i \Vert =1, \forall i$ (such a basis exists by the Gram–Schmidt process below). Take any $v \in V$ and define $u:= \sum_{i=1}^k \langle v, v_i \rangle v_i \in W$. Then, $\forall i, \langle v - u, v_i \rangle = \langle v -\sum_{j=1}^k \langle v, v_j \rangle v_j , v_i \rangle$ $= \langle v, v_i \rangle- \sum_{j=1}^k \langle v, v_j \rangle \langle v_j , v_i \rangle \stackrel{\text{orthonormal}}{=} \langle v, v_i\rangle - \langle v, v_i\rangle \langle v_i, v_i\rangle = 0$. Thus $v-u \perp W$, i.e., $v-u \in W^\perp$, and $v = u + (v-u)$, which implies that $V= W+ W^\perp$.
    • (b) Take any $x \in W \cap W^\perp$. Then, $ \langle x, x\rangle = 0$ which implies that $ x= 0$, i.e., $W \cap W^\perp = \{ 0\}$.
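
The proof of (a) is constructive: $u = \sum_i \langle v, v_i \rangle v_i$ is the orthogonal projection of $v$ onto $W$. A minimal sketch (ours; taking $W$ to be the first two coordinate directions of $\mathbb{R}^4$ is an assumption for illustration) checking both $v - u \perp W$ and the closest-vector property from NB4:

```python
import numpy as np

rng = np.random.default_rng(1)
B = [np.array([1.0, 0.0, 0.0, 0.0]),   # orthonormal basis of W
     np.array([0.0, 1.0, 0.0, 0.0])]

v = rng.normal(size=4)
u = sum(np.dot(v, b) * b for b in B)   # u = sum_i <v, v_i> v_i, in W

# v - u is orthogonal to W ...
assert all(np.isclose(np.dot(v - u, b), 0.0) for b in B)

# ... and u is at least as close to v as any other point of W we sample
for _ in range(5):
    x = sum(rng.normal() * b for b in B)
    assert np.linalg.norm(v - u) <= np.linalg.norm(v - x)
```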

Riesz representation

\[\text{ Consider } V^* = \{ f: V \rightarrow F \mid f \text{ is linear} \}. \text{ Take any } g \in V^*. \text{ Then, } \exists! y\in V \text{ s.t. } g=g_y, \text{ where } g_y: x \mapsto \langle x, y \rangle \nonumber\]


NB5; $g_y(x) = \langle x, y \rangle$ is linear since $\langle , \rangle$ is linear w.r.t. 1st component.
NB6; $\Phi: V^* \rightarrow V \ (g_y \mapsto y)$ is a bijection. When $F=\mathbb{R}$ it is a VS isomorphism; when $F=\mathbb{C}$ it is conjugate-linear, since $c\,g_y = g_{\overline{c}y}$.

Proof

Choose an orthonormal basis $\beta = \{ v_1, \cdots, v_n \}$ of $V$.

  1. (Uniqueness) Suppose $y_1, y_2$ exist. Then, $g_{y_1} = g_{y_2}= g$, i.e., $\forall x \in V, \ \langle x , y_1 \rangle = \langle x, y_2 \rangle$ $ \leadsto \forall x \in V, \ \langle x , y_1 - y_2 \rangle =0$. Take $x = y_1 - y_2$, then $ \Vert y_1 - y_2 \Vert =0 \Leftrightarrow y_1 = y_2$.
  2. (Existence) If $y$ exists, then it can be represented in the basis as $y=\sum_{i=1}^n \langle y, v_i \rangle v_i.$ Thus, let $y = \sum_{i=1}^n \overline{g(v_i)}v_i$. Then, $\forall i, \ g_y(v_i) = \langle v_i, y \rangle = \langle v_i, \sum_{j=1}^n \overline{g(v_j)}v_j\rangle = \sum_{j=1}^n g(v_j) \langle v_i, v_j\rangle = g(v_i)$. Since $g_y$ and $g$ are both linear and agree on the basis $\beta$, $g_y(x) = g(x), \forall x \in V$.

(Existence; an alternative proof that avoids expanding $g$ in a basis) Let $g \in V^*$. If $g=0$, then take $y=0$ which leads to $\forall x \in V, \ g(x)=\langle x, 0 \rangle = 0$. So assume $g \ne 0$ and let $M = \ker g$. Since $g \ne 0$, $M \ne V$, so $M^\perp \ne \{ 0 \}$ by $V = M \oplus M^\perp$. Take $t \in M^\perp$ s.t. $\Vert t \Vert = 1$; then $g(t) \ne 0$, since $t \ne 0$ and $M \cap M^\perp = \{0\}$ imply $t \notin \ker g$. Then, let $y= \overline{g(t)}\,t$. Since $\forall x \in V, \ g(x-\frac{g(x)}{g(t)}t)=g(x) - \frac{g(x)}{g(t)}g(t)=0 \leadsto$ $x-\frac{g(x)}{g(t)}t \in \ker g = M$. Since $y \in M^\perp $ $\leadsto 0 = \langle x-\frac{g(x)}{g(t)}t, y \rangle = \langle x, y \rangle - \langle \frac{g(x)}{g(t)}t, \overline{g(t)}t \rangle = \langle x, y \rangle - g(x)\Vert t \Vert^2 = \langle x, y \rangle - g(x)$. Thus, $\forall x \in V,\ g(x) = \langle x, y \rangle = g_y(x)$.
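
The first existence proof is directly implementable: $y = \sum_i \overline{g(v_i)}\, v_i$. A minimal sketch (ours; the functional $g(x) = \sum_i a_i x_i$ for a fixed $a$ and the use of the standard basis of $\mathbb{C}^4$ are assumptions for illustration):

```python
import numpy as np

n = 4
rng = np.random.default_rng(2)

a = rng.normal(size=n) + 1j * rng.normal(size=n)
g = lambda x: a @ x                    # a linear functional on C^n

# y = sum_i conj(g(e_i)) e_i, built from the standard orthonormal basis
E = np.eye(n)
y = sum(np.conj(g(E[i])) * E[i] for i in range(n))

x = rng.normal(size=n) + 1j * rng.normal(size=n)
assert np.isclose(g(x), np.sum(x * np.conj(y)))   # g(x) == <x, y>
```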

Gram–Schmidt process

Given a finite dimensional inner product space $V$ over $F$ and a linearly independent subset $S= \{ w_1, \cdots, w_k\} \subseteq V$, the Gram–Schmidt process transforms $S$ into an orthogonal basis for $span(S)$.

  1. Put $v_1 := w_1$.
  2. Put $v_2 := w_2 - \frac{\langle w_2, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1$, where $\frac{\langle w_2, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1$ is the projection of $w_2$ onto $v_1$.
  3. In general, put $v_i := w_i -\sum_{j=1}^{i-1} \frac{\langle w_i, v_j \rangle}{\langle v_j, v_j \rangle} v_j$ (see the sketch below).
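
A minimal implementation sketch of these steps (ours; real case only, so the conjugation in the inner product can be ignored):

```python
import numpy as np

def gram_schmidt(W):
    """Orthogonalize a linearly independent list of real vectors."""
    V = []
    for w in W:
        # Subtract the projection of w onto each previously built v_j
        v = w - sum(np.dot(w, v_j) / np.dot(v_j, v_j) * v_j for v_j in V)
        V.append(v)
    return V

S = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, 0.0, 1.0]),
     np.array([0.0, 1.0, 1.0])]
V = gram_schmidt(S)

# all pairwise inner products vanish
for i in range(len(V)):
    for j in range(i + 1, len(V)):
        assert np.isclose(np.dot(V[i], V[j]), 0.0)
```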

Let $S' = \{ v_1, \cdots, v_k \}$ be the result of the G-S process. Then, we can show that $S'$ is an orthogonal subset of $V$ by mathematical induction (M.I.).

  1. $ \langle v_1, v_2 \rangle = \langle v_1, w_2 - \frac{\langle w_2, v_1 \rangle}{\langle v_1, v_1 \rangle}v_1 \rangle = \langle v_1, w_2 \rangle - \frac{\langle v_1, w_2 \rangle}{\langle v_1, v_1 \rangle} \langle v_1, v_1 \rangle = 0.$
  2. Assume $\{ v_1, \cdots, v_m \}$ is mutually orthogonal for some $m < k$, i.e., $\langle v_i, v_j \rangle=0, \forall 1 \leq i < j \leq m$. Then, $\forall 1 \leq i \leq m, \ \langle v_i, v_{m+1} \rangle = \langle v_i, w_{m+1} - \sum_{j=1}^{m} \frac{\langle w_{m+1}, v_j \rangle}{\langle v_j, v_j \rangle} v_j \rangle = \langle v_i, w_{m+1} \rangle - \frac{\langle v_i, w_{m+1} \rangle}{\langle v_i, v_i \rangle} \langle v_i, v_i \rangle = 0 \because \langle v_i, v_j \rangle=0$ for $j \ne i$.
    Thus, $S'$ is a mutually orthogonal subset of $V$. Moreover, each $v_i \ne 0$: otherwise $w_i \in span\{v_1, \cdots, v_{i-1}\} = span\{w_1, \cdots, w_{i-1}\}$, contradicting the linear independence of $S$. Hence $S'$ is linearly independent by the above proposition, i.e., $S'$ is an orthogonal basis for $span(S)$.
\[\therefore \text{Let } V \text{ be a finite dimensional inner product space. Then, } V \text{ always has an orthogonal basis.} \nonumber\]
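
In particular, dividing each vector of an orthogonal basis by its norm yields an orthonormal basis, which is what the proofs above (e.g., of $V = W \oplus W^\perp$ and the Riesz representation) rely on. A short sketch (ours):

```python
import numpy as np

V = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]   # orthogonal basis of R^2
U = [v / np.linalg.norm(v) for v in V]              # normalize each vector
print(np.round([[u1 @ u2 for u2 in U] for u1 in U], 12))  # Gram matrix = identity
```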