Cool mathematical facts
An example of a distill-style blog post highlighting key insights
Matrix Insights
- Remarkable Fact 1: When $A$ and $B$ are $n\times m$ matrices, the products $A^\top B$ (of size $m\times m$) and $B A^\top$ (of size $n\times n$) share the same nonzero eigenvalues. In particular, if $m>n$, the spectrum of $A^\top B$ contains $m-n$ additional zero eigenvalues. This follows from the characteristic-polynomial identity $\det(A^\top B-\lambda I_m)=(-\lambda)^{m-n}\det(B A^\top-\lambda I_n)$.
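This shared-spectrum fact is easy to check numerically. A minimal sketch with numpy (the dimensions and random matrices are arbitrary choices): the two spectra agree on their nonzero part, and the larger product carries the extra zeros.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 5  # m > n
A = rng.standard_normal((n, m))
B = rng.standard_normal((n, m))

# A^T B is m x m while B A^T is n x n, yet they share all nonzero eigenvalues;
# the larger product carries m - n extra zero eigenvalues.
eig_big = np.sort(np.abs(np.linalg.eigvals(A.T @ B)))    # m magnitudes, ascending
eig_small = np.sort(np.abs(np.linalg.eigvals(B @ A.T)))  # n magnitudes, ascending
```

Comparing magnitudes sidesteps the arbitrary ordering of complex eigenvalues returned by `eigvals`.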
Valuable Linear Algebra Identities:
- Equation 1: \(\operatorname{Tr}(A^k)=\sum_{i=1}^n \lambda_i(A)^k\)
- Equation 2: \(\operatorname{Tr}((A^*A)^k) = \sum_{i=1}^n \sigma_i(A)^{2k}\)
- Equation 3:
\(\log|\det(A-z I_n)|=\sum_{i=1}^n \log|\lambda_i(A)-z|=\sum_{i=1}^n\log \sigma_i(A-z I_n)\)
- Equation 4 (base times height), where $X_1,\dots,X_n$ denote the rows of $A$:
\(\log|\det(A)|=\sum_{i=1}^n \log\operatorname{dist}(X_i,\operatorname{span}(X_1,\dots,X_{i-1}))\)
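Identities 1, 2, and 4 can be verified numerically. In the sketch below (numpy, arbitrary random matrix), the row-by-row distances of Equation 4 are obtained from a QR factorization of $A^\top$, whose diagonal entries $|R_{ii}|$ are exactly those distances.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 3
A = rng.standard_normal((n, n))

lam = np.linalg.eigvals(A)
sig = np.linalg.svd(A, compute_uv=False)

# Identity 1: Tr(A^k) equals the sum of k-th powers of the eigenvalues.
lhs1 = np.trace(np.linalg.matrix_power(A, k))
rhs1 = np.sum(lam ** k).real

# Identity 2: Tr((A^* A)^k) equals the sum of 2k-th powers of the singular values.
lhs2 = np.trace(np.linalg.matrix_power(A.T @ A, k))
rhs2 = np.sum(sig ** (2 * k))

# Identity 4 (base times height): log|det A| equals the sum of log distances of
# each row to the span of the previous rows; |R_ii| from QR of A^T gives them.
R = np.linalg.qr(A.T, mode='r')
log_det_via_dists = np.sum(np.log(np.abs(np.diag(R))))
log_det_direct = np.log(np.abs(np.linalg.det(A)))
```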
Mathematical Concepts
- Lindeberg’s Exchange Method: This method, explained in detail in Tao’s presentation, involves two key steps:
- The Gaussian Case: Demonstrating the validity of the law when all underlying random variables are Gaussian.
- Invariance: Showing that the limiting distribution remains unchanged when non-Gaussian random variables are replaced by Gaussian random variables.
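The invariance step can be illustrated in the simplest setting, the central limit theorem: the expectation of a smooth statistic of a normalized sum barely moves when non-Gaussian (here Rademacher) entries are swapped for Gaussian ones. A minimal numpy sketch; the test function $\cos$ and the sample sizes are arbitrary choices, not part of the original argument.

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 400, 20000
f = np.cos  # any smooth bounded test function

# Same statistic, computed once with Rademacher entries and once with Gaussians.
rademacher = rng.choice([-1.0, 1.0], size=(trials, n))
gaussian = rng.standard_normal((trials, n))

mean_rad = f(rademacher.sum(axis=1) / np.sqrt(n)).mean()
mean_gauss = f(gaussian.sum(axis=1) / np.sqrt(n)).mean()
gap = abs(mean_rad - mean_gauss)  # small: the limit is insensitive to the swap
```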
- Weyl Inequality: For symmetric $n\times n$ matrices $S$ and $T$, with eigenvalues sorted in the same order, the Weyl inequality states:
\begin{align}
\max_i |\lambda_i(S) -\lambda_i(T)| \le \|S - T\|_{op}
\end{align}
This inequality is a powerful tool for bounding the eigenvalues of perturbed matrices.
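A quick numerical check of the inequality, with numpy and an arbitrary random symmetric matrix plus a small symmetric perturbation:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
M = rng.standard_normal((n, n))
S = (M + M.T) / 2                    # symmetric matrix
E = rng.standard_normal((n, n))
E = (E + E.T) / 2 * 0.01             # small symmetric perturbation
T = S + E

lam_S = np.linalg.eigvalsh(S)        # eigenvalues, ascending
lam_T = np.linalg.eigvalsh(T)
op_norm = np.linalg.norm(S - T, 2)   # operator (spectral) norm

# Weyl: every eigenvalue moves by at most the operator norm of the perturbation.
max_shift = np.max(np.abs(lam_S - lam_T))
```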
- Hoffman-Wielandt Inequality: For symmetric $n\times n$ matrices $A$ and $B$, with eigenvalues sorted in the same order, the total eigenvalue deviation is controlled by the Frobenius norm of the difference:
\begin{align}
\sum_{i=1}^n (\lambda_i(A)-\lambda_i(B))^2 \le \|A-B\|_F^2
\end{align}
For more information, refer to Djalil Chafaï’s blog and Tao’s blog.
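This bound is also easy to check numerically (numpy sketch, arbitrary random symmetric matrices):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
B = rng.standard_normal((n, n)); B = (B + B.T) / 2

# eigvalsh returns eigenvalues in ascending order, the matching order
# required by the Hoffman-Wielandt inequality.
lam_A = np.linalg.eigvalsh(A)
lam_B = np.linalg.eigvalsh(B)

lhs = np.sum((lam_A - lam_B) ** 2)
rhs = np.linalg.norm(A - B, 'fro') ** 2
```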
- Davis-Kahan Theorem: When $S$ and $T$ are symmetric matrices and the $i$-th eigenvalue of $S$ is well-separated from the rest of the spectrum, the Davis-Kahan theorem states:
\begin{align}
\min_{j\neq i}|\lambda_i(S)-\lambda_j(S)|\ge \delta
\implies \sin\angle(v_i(S),v_i(T)) \le \frac{\|S-T\|_{op}}{\delta}
\end{align}
This result provides insights into the closeness of eigenvectors.
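A numerical illustration with numpy: the choice of a diagonal $S$ with unit eigenvalue gaps and a small symmetric perturbation is arbitrary, made so that the gap $\delta$ is known exactly.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5
S = np.diag(np.arange(n, dtype=float))   # eigenvalues 0,...,4, so delta = 1 at i = 2
E = rng.standard_normal((n, n)); E = (E + E.T) / 2 * 0.01
T = S + E

i = 2
delta = 1.0  # min_{j != i} |lambda_i(S) - lambda_j(S)|

# Eigenvectors for the i-th (ascending) eigenvalue of S and of T.
v_S = np.linalg.eigh(S)[1][:, i]
v_T = np.linalg.eigh(T)[1][:, i]

cos_angle = abs(v_S @ v_T)
sin_angle = np.sqrt(max(0.0, 1 - cos_angle ** 2))
bound = np.linalg.norm(S - T, 2) / delta  # Davis-Kahan bound on sin of the angle
```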
Convexity in Symmetric Matrices
- Convexity of Trace Exponential Function: The trace exponential function, defined as $f(X):=\operatorname{tr}\exp(X)$, is convex on the space of symmetric matrices.
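Convexity can be spot-checked along a segment between two symmetric matrices. A numpy sketch (midpoint convexity only, with arbitrary random symmetric matrices; $\operatorname{tr}\exp$ is computed from the spectral decomposition):

```python
import numpy as np

def tr_exp(X):
    # tr exp(X) for symmetric X: sum of exp of the eigenvalues.
    return np.sum(np.exp(np.linalg.eigvalsh(X)))

rng = np.random.default_rng(6)
n = 5
X = rng.standard_normal((n, n)); X = (X + X.T) / 2
Y = rng.standard_normal((n, n)); Y = (Y + Y.T) / 2

# Midpoint convexity: f((X+Y)/2) <= (f(X) + f(Y)) / 2.
lhs = tr_exp((X + Y) / 2)
rhs = (tr_exp(X) + tr_exp(Y)) / 2
```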