Eigenvalues and Eigenvectors

Matrices aren’t just grids of numbers—they’re powerful tools for performing linear transformations. Stretching, rotating, reflecting, and projecting: matrix operations reshape space and reveal structure in data, geometry, and beyond. 🔄

For a square matrix A \in \mathbb{R}^{n \times n}, a non-zero vector \mathbf{v} \in \mathbb{R}^n is an eigenvector of A if there exists a scalar \lambda \in \mathbb{R} such that
A \mathbf{v} = \lambda \mathbf{v}

Here, \lambda is called the eigenvalue associated with the eigenvector \mathbf{v}.

Eigenvectors identify the directions that remain invariant under A: applying A to an eigenvector only rescales it, without rotating it off its line.

Eigenvalues indicate the factor by which vectors along those directions are stretched or compressed; a negative \lambda additionally flips the direction.
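
As a quick numerical check, here is a minimal sketch (assuming NumPy is available; the matrix A below is an arbitrary illustration) that computes the eigenpairs of a 2 \times 2 matrix and verifies the defining relation A \mathbf{v} = \lambda \mathbf{v} directly:

```python
import numpy as np

# An arbitrary 2x2 example matrix (any square matrix would do here).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Check A v = lambda v up to floating-point tolerance.
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.4f}, eigenvector = {v}")
```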

Characteristic Equation

Rearranging A \mathbf{v} = \lambda \mathbf{v} into (A - \lambda I)\mathbf{v} = \mathbf{0}, a non-zero solution \mathbf{v} can exist only when A - \lambda I is singular. The eigenvalues therefore satisfy the characteristic equation:

\det(A - \lambda I) = 0

For each eigenvalue \lambda, the corresponding eigenvectors are the non-zero solutions \mathbf{v} to:

(A - \lambda I) \mathbf{v} = \mathbf{0}
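
For instance, take the same matrix used in the code sketch above,

A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}

Its characteristic equation is

\det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3) = 0

so the eigenvalues are \lambda_1 = 3 and \lambda_2 = 1. Solving (A - 3I)\mathbf{v} = \mathbf{0} gives the eigenvector \mathbf{v}_1 = (1, 1)^\top, and (A - I)\mathbf{v} = \mathbf{0} gives \mathbf{v}_2 = (1, -1)^\top, matching the numerical results above.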