Matrices aren’t just grids of numbers—they’re powerful tools for performing linear transformations. Stretching, rotating, reflecting, and projecting: matrix operations reshape space and reveal structure in data, geometry, and beyond. 🔄
Linear Transformation
A transformation (also called a function or mapping) \(T : \mathbb{R}^n \rightarrow \mathbb{R}^m\) is a rule that assigns to each vector \(\mathbf{x} \in \mathbb{R}^n\) a vector \(T(\mathbf{x}) \in \mathbb{R}^m\).
Every \(m \times n\) matrix \(A\) defines a linear transformation via matrix-vector multiplication:
\[
T(\mathbf{x}) = A\mathbf{x}
\]
The set \(\mathbb{R}^n\) is called the domain of \(T\), and the set \(\mathbb{R}^m\) is called the codomain of \(T\).
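To see this in action, here is a minimal sketch using NumPy (an assumed dependency; the matrix and vector are illustrative, not from the original) that applies \(T(\mathbf{x}) = A\mathbf{x}\) with \(A\) chosen as a rotation matrix, one of the transformations mentioned above:

```python
import numpy as np

# A 2x2 rotation matrix: rotates vectors in R^2 by 90 degrees counterclockwise.
theta = np.pi / 2
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0])  # a vector in the domain R^2

# T(x) = Ax: applying the transformation is just matrix-vector multiplication.
T_x = A @ x
print(T_x)  # approximately [0., 1.] -- the unit x-vector rotated onto the y-axis
```

Here both the domain and codomain are \(\mathbb{R}^2\); in general an \(m \times n\) matrix maps the domain \(\mathbb{R}^n\) into the codomain \(\mathbb{R}^m\).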
Given a matrix \(A\) of dimensions \(m \times n\), we can compute the matrix-vector product \(A\mathbf{x}\) only when the number of columns in \(A\) matches the number of entries in the vector \(\mathbf{x}\).
That is, if \(A\) is an \(m \times n\) matrix and \(\mathbf{x}\) is an \(n \times 1\) column vector, the multiplication is valid and the result will be an \(m \times 1\) column vector.
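For instance, a \(2 \times 3\) matrix maps a vector in \(\mathbb{R}^3\) to one in \(\mathbb{R}^2\):
\[
\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}
\begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix}
=
\begin{bmatrix} 1\cdot 1 + 2\cdot 0 + 3\cdot 2 \\ 4\cdot 1 + 5\cdot 0 + 6\cdot 2 \end{bmatrix}
=
\begin{bmatrix} 7 \\ 16 \end{bmatrix}
\]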
If the dimensions do not align, the product is undefined.
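The same rule shows up at the code level. As a sketch in NumPy (again an assumed dependency, with illustrative values), multiplying with mismatched shapes raises an error:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])      # shape (2, 3): m = 2 rows, n = 3 columns

x = np.array([1, 0, 2])        # shape (3,): n entries -- dimensions align
print(A @ x)                   # [ 7 16] -- an m-entry result, matching the worked example

y = np.array([1, 0])           # shape (2,): does not match A's 3 columns
try:
    A @ y                      # undefined product: shapes (2, 3) and (2,) are incompatible
except ValueError as e:
    print("Shapes do not align:", e)
```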