Random math: 14

Source of most content: Gunhee Cho

Determinant

Let $A = (a_{ij}) \in Mat_{n\times n}(F),\ B = (b_{jk}) \in Mat_{n\times n}(F)$ and $C=A \times B = (c_{ij}),$ where $c_{ij} = \sum_{k=1}^n a_{ik}b_{kj}$ and $F$ is a field. Hence, finding $A^{-1}$ is the same as finding $(b_{kj})$ s.t. $f_{ij} = \sum_{k=1}^n a_{ik}b_{kj}$, where $f_{ij}=\delta_{ij}$ is the Kronecker delta $\leadsto AA^{-1}=A^{-1}A=I_n$.
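
A minimal numpy sketch of this entrywise product and of $A^{-1}$ as the unknown entries $(b_{kj})$ (the matrix $A$ below is an arbitrary illustrative choice):

```python
import numpy as np

# Entrywise matrix product: c_ij = sum_k a_ik * b_kj
def matmul(A, B):
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
    return C

A = np.array([[2.0, 1.0], [1.0, 1.0]])
B = np.linalg.inv(A)      # the (b_kj) we are looking for
print(matmul(A, B))       # ~ identity I_2, i.e., f_ij = delta_ij
```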

Def. (Laplace expansion) $\det A = \sum_{k=1}^n a_{kj}M_{kj}$ (expansion along the $j$-th column), where $M_{kj}=(-1)^{k+j}N_{kj}$ is a cofactor and $N_{kj}$ is the $(k,j)$ minor of $A$, i.e., the determinant of the submatrix obtained by deleting row $k$ and column $j$.

Take $b_{kj}:= M_{jk} = (M^\top)_{kj}$. Then, $\sum_{k=1}^n a_{ik}b_{kj}= \sum_{k=1}^n a_{ik}M_{jk}=\delta_{ij}\det A$: for $i=j$ this is the Laplace expansion along the $i$-th row, and for $i \ne j$ it is the determinant of the matrix obtained by replacing row $j$ of $A$ with row $i$, which has two equal rows and hence vanishes. Hence, $\sum_{k=1}^n a_{ik}\big(\frac{1}{\det A}(M^\top)_{kj}\big)=f_{ij},$ i.e., $\frac{1}{\det A}M^\top$ is the answer that we want to find. Define $c_{ij} := M_{ji}$, i.e., $C=(c_{ij})=M^\top$ is the adjugate matrix. Then, $A^{-1} = \frac{1}{\det A}C=\frac{1}{\det A} M^\top$. Thus, $\det A$ must be non-zero for $A^{-1}$ to exist.
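
The construction can be checked numerically; the following sketch (assuming numpy, with an arbitrary illustrative $A$) builds the cofactor matrix $M$ from the $(k,j)$ minors and recovers $A^{-1} = \frac{1}{\det A}M^\top$:

```python
import numpy as np

def minor(A, k, j):
    """(k, j) minor N_kj: determinant of A with row k and column j deleted."""
    sub = np.delete(np.delete(A, k, axis=0), j, axis=1)
    return np.linalg.det(sub)

def adjugate_inverse(A):
    n = A.shape[0]
    M = np.array([[(-1) ** (k + j) * minor(A, k, j)   # cofactor M_kj
                   for j in range(n)] for k in range(n)])
    return M.T / np.linalg.det(A)                     # A^{-1} = (1/det A) M^T

A = np.array([[2.0, 1.0, 0.0], [0.0, 3.0, 1.0], [1.0, 0.0, 1.0]])
print(np.allclose(adjugate_inverse(A) @ A, np.eye(3)))  # True
```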

Def. For $A=(a_{ij}) \in Mat_{n\times n}(F)$, $\det A = \sum_{\sigma\in S_n} sgn(\sigma) \prod_{i=1}^n a_{i\sigma(i)}$ is called the determinant of $A$ (Leibniz formula).
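
A brute-force sketch of this permutation sum (assuming numpy and itertools; $sgn$ is computed by counting inversions), checked against np.linalg.det:

```python
import numpy as np
from itertools import permutations

def sgn(sigma):
    """Sign of a permutation: +1/-1 according to the parity of inversions."""
    inv = sum(1 for i in range(len(sigma)) for j in range(i + 1, len(sigma))
              if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def det_leibniz(A):
    n = A.shape[0]
    return sum(sgn(s) * np.prod([A[i, s[i]] for i in range(n)])
               for s in permutations(range(n)))

A = np.random.rand(4, 4)
print(np.isclose(det_leibniz(A), np.linalg.det(A)))  # True
```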

NB1; $\det A$ is the signed volume spanned by the row vectors of $A$.
NB2; $\det A^\top = \sum_{\sigma\in S_n} sgn(\sigma) \prod_{i=1}^n a_{\sigma(i)i}$. Since $\sigma$ is bijective, reindexing the product gives $\det A^\top = \sum_{\sigma\in S_n} sgn(\sigma) \prod_{i=1}^n a_{i\sigma^{-1}(i)}$. Since $S_n$ is a group, $\sigma \mapsto \nu := \sigma^{-1}$ is a bijection of $S_n$, and $sgn(\sigma^{-1}) = sgn(\sigma)$. Therefore, $\det A^\top = \sum_{\nu \in S_n} sgn(\nu) \prod_{i=1}^n a_{i\nu(i)}= \det A$.
NB3; From NB2, $\det A$ is the signed volume spanned by the column vectors of $A$.
NB4; $A, B \in Mat_{n \times n}(F),\ \det (A\times B) = \det A \times \det B$.
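
NB2 and NB4 are easy to sanity-check numerically (a sketch assuming numpy; the random matrices are illustrative):

```python
import numpy as np

A, B = np.random.rand(5, 5), np.random.rand(5, 5)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))        # NB2
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))         # NB4
```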

Def. There are three types of elementary row operations:

  1. Row multiplication: multiply a row by a non-zero constant.
  2. Row switching: interchange two rows.
  3. Row addition: add a constant multiple of one row to another row.

Def. Two matrices are row equivalent if one can be changed to the other by elementary row operations.
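
Each elementary row operation is left multiplication by an elementary matrix, namely the matrix obtained by applying that operation to $I$; a small numpy sketch (the matrices below are illustrative choices):

```python
import numpy as np

n = 3
E1 = np.eye(n); E1[0, 0] = 5.0     # row multiplication: 5 * row 0
E2 = np.eye(n)[[1, 0, 2]]          # row switching: swap rows 0 and 1
E3 = np.eye(n); E3[2, 0] = -2.0    # row addition: row 2 += -2 * row 0

A = np.arange(9.0).reshape(3, 3)
print(E1 @ A)  # each operation is left multiplication by an elementary matrix
print(E2 @ A)
print(E3 @ A)
```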

Proposition

\[\text{Let } A \in Mat_{n \times n} (F). \text{ The following are equivalent:} \nonumber\]
  1. $A$ has rank $n$, i.e., $A$ has full rank $(\dim \text{Im}\, A = n)$.
  2. $AX=0$ has only the trivial solution, i.e., $X=0$.
  3. $\det A \ne 0$.
  4. $A$ is invertible, i.e., $\exists A^{-1}$ s.t. $AA^{-1} = A^{-1}A = I$.
  5. $A$ is row equivalent to $I$.
  6. $AX=B$ has a unique solution for any $n\times 1$ matrix $B$.
  7. Row vectors of $A$ are linearly independent.
  8. Column vectors of $A$ are linearly independent.

Proof

  1. (1) $\Leftrightarrow$ (2)
    • $(\Rightarrow)$ Let $A: F^n \rightarrow F^n$ be a linear transformation over $F$ (linear operator). $A$ having rank $n$ implies $\dim \ker A = 0$ by the rank-nullity theorem, so $AX=0$ has only the trivial solution $X=0$.
    • $(\Leftarrow)$ $AX=0$ has only the trivial solution $\Leftrightarrow \dim \ker A = 0 \leadsto \dim \text{Im}\, A = n$ by the rank-nullity theorem.
  2. (3) $\Leftrightarrow$ (4)
    • $(\Rightarrow)$ If $\det A \ne 0,\ \exists \frac{1}{\det A} M^\top =: A^{-1}$.
    • $(\Leftarrow)$ If $A^{-1}$ exists, then $\det A \cdot \det A^{-1} = \det I_n = 1$ by NB4, so $\det A \ne 0$.
  3. (4) $\Leftrightarrow$ (5): If $A$ is row equivalent to $I$, applying the same elementary row operations to the augmented matrix $(A \mid I)$ yields $(I \mid A^{-1})$, so $A^{-1}$ exists; conversely, an invertible $A$ can be reduced to $I$ by Gaussian elimination.
  4. (1) $\Leftrightarrow$ (6): $\dim \text{Im}\, A = n$ implies that the column vectors of $A$ form a basis of $F^n$ $\Leftrightarrow$ every $B \in F^n$ has a unique representation in that basis, i.e., $AX = B$ has a unique solution.
  5. (2) $\Leftrightarrow$ (7),(8): $AX=0$ reads $\sum_{k=1}^n x_k A^k = 0$, where $A^k$ is the $k$-th column vector, i.e., a linear relation among the columns. It has only the trivial solution $X=0$ $\Leftrightarrow$ the column vectors are linearly independent. The row case is analogous, applying the same argument to $A^\top$ (the rows of $A$ are the columns of $A^\top$).
  6. (7),(8) $\Leftrightarrow$ (5): if the row vectors are linearly independent, Gaussian elimination reduces $A$ to $I$ by elementary row operations; conversely, elementary row operations preserve linear independence of the rows, and the rows of $I$ are linearly independent.
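
The equivalences can be sanity-checked numerically for a concrete matrix (a sketch assuming numpy; $A$ below is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])                 # an invertible example
print(np.linalg.matrix_rank(A) == 2)                   # (1) full rank
print(not np.isclose(np.linalg.det(A), 0.0))           # (3) det A != 0
print(np.allclose(A @ np.linalg.inv(A), np.eye(2)))    # (4) invertible
b = np.array([1.0, 2.0])
print(np.linalg.solve(A, b))                           # (6) unique solution of AX = b
```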

Coordinate change matrix

Def. Let $V$ be a vector space over $F$ with $\dim V = n < \infty$ and $Id:V\rightarrow V \ (v \mapsto v)$ be the identity map. Choose two bases $\beta, \gamma$ for $V$. Then, $[ v ] _\gamma = [ Id ] _{\beta}^\gamma [ v ] _\beta$, where $[ Id ] _{\beta}^\gamma$ is called the coordinate change matrix from $\beta$ to $\gamma$.

Let $\beta = \{ v_1, \cdots, v_n \}, \gamma = \{ w_1, \cdots, w_n \}$. Then, for any $x \in V, \ x = \sum_{i=1}^n a_i v_i = \sum_{i=1}^n b_i w_i$. Let $v_j = \sum _{k=1}^n c _{kj}w _k$ for $1 \leq j \leq n \leadsto [ Id ] _{\beta}^\gamma = (c _{ij}) _{1 \leq i,j \leq n}.$ Then, $x = \sum _{j=1}^n a_j v_j= \sum _{j=1}^n a_j \big(\sum _{k=1}^n c _{kj}w _k\big) = \sum _{k=1}^n \big(\sum _{j=1}^n c _{kj} a _j\big) w _k$ $\Leftrightarrow b _k = \sum _{j=1}^n c _{kj} a _j$ for $1 \leq k \leq n \leadsto [ x ] _\gamma = [ Id ] _\beta^\gamma [ x ] _\beta.$
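
A concrete sketch of this computation in $\mathbb{R}^2$ (assuming numpy; the bases $\beta, \gamma$ below are arbitrary illustrative choices): column $j$ of $[ Id ] _\beta^\gamma$ holds the $\gamma$-coordinates of $v_j$, so it solves $WC = V$, where $V, W$ have the $\beta$-, $\gamma$-vectors as columns.

```python
import numpy as np

# Illustrative bases for R^2 (chosen for this sketch)
V = np.array([[1.0, 1.0], [0.0, 1.0]])   # columns are v_1, v_2 (basis beta)
W = np.array([[1.0, 0.0], [1.0, 2.0]])   # columns are w_1, w_2 (basis gamma)

# Column j of [Id]_beta^gamma holds the gamma-coordinates of v_j,
# i.e., W @ C = V, so C = W^{-1} V.
C = np.linalg.solve(W, V)

x_beta = np.array([3.0, -1.0])           # [x]_beta
x = V @ x_beta                           # x itself, in standard coordinates
print(np.allclose(W @ (C @ x_beta), x))  # [x]_gamma = C [x]_beta recovers x
```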

Proposition

\[\text{Let } V \text{ be a VS}/F \text{ with } \dim V = n < \infty, \text{ and let } \beta, \gamma \text{ be two bases for } V. \text{ Then, the following hold:} \nonumber\]
  1. $[ Id ] _\beta := [ Id ] _\beta^\beta = I_n$.
  2. $[ Id ] _\beta^\gamma = ([ Id ] _\gamma^\beta)^{-1}$. (Every coordinate change matrix is invertible)
  3. Every invertible matrix in $Mat_{n\times n}(F)$ is a coordinate change matrix in $F^n$.

Proof

  1. It is obvious from the definition of matrix representation.
  2. Consider $(V, \beta) \stackrel{Id}{\rightarrow} (V, \gamma) \stackrel{Id}{\rightarrow} (V, \beta)$. By the composition rule for matrix representations (NB5), $[Id \circ Id] _\beta^\beta = [ Id ] _\gamma^\beta \times [ Id ] _\beta^\gamma = I_n$.
  3. Let $A=(a_{ij}) \in Mat_{n \times n}(F)$ be an invertible matrix. Let $\beta = (A^1, \cdots, A^n),$ where $A^i$ is the $i$-th column vector of $A$. Then, $A$ is invertible iff $\beta$ is linearly independent in $F^n$, i.e., iff $\beta$ is a basis. Let $\epsilon_n=(e_1, \cdots, e_n)$ be the standard basis. Then, $[ Id ] _\beta^{\epsilon _n} = A,$ since $Id(A^j)=A^j = a _{1j}e _1 + \cdots + a _{nj}e _n$.
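
A quick numerical illustration of (2) and (3) (assuming numpy; $A$ below is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 5.0]])   # invertible: det = -1
# (3): A is the coordinate change matrix from beta = (columns of A)
# to the standard basis, since Id(A^j) = A^j = sum_i a_ij e_i.
beta_coords = np.array([2.0, 1.0])       # [x]_beta
x_std = A @ beta_coords                  # [x]_epsilon = A [x]_beta
# (2): the reverse change of coordinates is the inverse matrix.
print(np.allclose(np.linalg.inv(A) @ x_std, beta_coords))  # True
```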

Similarity

Def. Let $A, B \in Mat_{n \times n}(F)$. $A$ is similar to $B$ if there exists an invertible matrix $P \in Mat_{n \times n}(F)$ s.t. $B=P^{-1}AP$, and this is denoted by $A \sim B$.


Proposition

\[B \sim A \text{ iff there exists a basis } \beta \text{ for } F^n \text{ s.t. } [ L_A ]_\beta = B, \text{ where } L_A:F^n \rightarrow F^n \ (X \mapsto AX). \nonumber\]

Proof

  1. $(\Rightarrow)$ Let $B=P^{-1}AP=P^{-1}[ L_A ] _{\epsilon_n}^{\epsilon_n} P$ (the equality holds by the definition of matrix representation). Let $\beta = \{P^1, \cdots, P^n \},$ where $P^i$ is the $i$-th column vector of $P$; since $P$ is invertible, its column vectors are linearly independent, so $\beta$ is a basis. Then, $P = [ Id ] _\beta^{\epsilon_n}$. Thus, $B= [ Id ] _{\epsilon_n}^\beta [ L_A ] _{\epsilon_n}^{\epsilon_n} [ Id ] _\beta^{\epsilon_n} = [ L_A ] _{\beta}^{\beta} =: [ L_A ] _{\beta}.$
  2. $(\Leftarrow)$ Let $[ L_A ] _\beta = B$. Then, $[ L_A ] _\beta = [ Id ] _{\epsilon_n}^\beta [ L_A ] _{\epsilon_n}^{\epsilon_n} [ Id ] _\beta^{\epsilon_n} = P^{-1} [ L_A ] _{\epsilon_n}^{\epsilon_n} P = P^{-1} A P \Leftrightarrow B \sim A$.
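
A numerical sketch of the proposition (assuming numpy; $A$ and $P$ below are arbitrary illustrative choices): $B = P^{-1}AP$ represents $L_A$ in the basis formed by the columns of $P$.

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])
P = np.array([[1.0, 1.0], [1.0, 2.0]])   # invertible: det = 1
B = np.linalg.inv(P) @ A @ P             # B = P^{-1} A P, so B ~ A

# B represents L_A in the basis beta = (columns of P):
# applying B in beta-coordinates and mapping back agrees with applying A.
x_beta = np.array([1.0, -1.0])
print(np.allclose(P @ (B @ x_beta), A @ (P @ x_beta)))  # True
```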