Matrix Operations

Matrices aren’t just grids of numbers—they’re powerful tools for performing linear transformations. Stretching, rotating, reflecting, and projecting: matrix operations reshape space and reveal structure in data, geometry, and beyond. 🔄

Linear Transformation

A transformation (also called a function or mapping) \(T : \mathbb{R}^n \rightarrow \mathbb{R}^m\) is a rule that assigns to each vector \(\mathbf{x} \in \mathbb{R}^n\) a vector \(T(\mathbf{x}) \in \mathbb{R}^m\).

A matrix \(A\) defines a linear transformation through matrix–vector multiplication:

\[ T(\mathbf{x}) = A\mathbf{x} \]

The set \(\mathbb{R}^n\) is called the domain of \(T\), and the set \(\mathbb{R}^m\) is called the codomain of \(T\).
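As a quick illustration, here is a minimal NumPy sketch of a matrix acting as a transformation \(T(\mathbf{x}) = A\mathbf{x}\); the rotation matrix and test vector are chosen purely for demonstration:

```python
import numpy as np

# A 90-degree counterclockwise rotation of the plane, written as a matrix.
# (The specific matrix and test vector are illustrative choices.)
theta = np.pi / 2
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0])   # a vector in the domain R^2

# T(x) = A @ x sends x to a vector in the codomain R^2
print(A @ x)               # approximately [0., 1.]
```

Here both the domain and codomain are \(\mathbb{R}^2\), and the transformation rotates every input vector by 90 degrees.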

Given a matrix \(A\) with dimensions \(m \times n\) and a vector \(\mathbf{x} \in \mathbb{R}^n\), the matrix–vector product is computed as

\[ A\mathbf{x} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n \\ a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n \\ \vdots \\ a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n \end{bmatrix} = \mathbf{b} \]

\[ A\mathbf{x} = \mathbf{b} \]
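For instance, taking a concrete \(2 \times 3\) matrix (chosen here purely for illustration) and a vector in \(\mathbb{R}^3\):

\[ \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} = \begin{bmatrix} 1\cdot 1 + 2\cdot 0 + 3\cdot 2 \\ 4\cdot 1 + 5\cdot 0 + 6\cdot 2 \end{bmatrix} = \begin{bmatrix} 7 \\ 16 \end{bmatrix} \]

Here a \(2 \times 3\) matrix takes a vector in \(\mathbb{R}^3\) and produces a vector in \(\mathbb{R}^2\).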

To perform the matrix multiplication \(A\mathbf{x}\), the number of columns in \(A\) must match the number of entries in the vector \(\mathbf{x}\).
That is, if \(A\) is an \(m \times n\) matrix and \(\mathbf{x}\) is an \(n \times 1\) column vector, the multiplication is valid and the result will be an \(m \times 1\) column vector.
If the dimensions do not align, the product is undefined.
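The same compatibility rule can be checked in code. Below is a minimal NumPy sketch (the helper `matvec` and the example matrix and vectors are assumptions made for illustration) that verifies the number of columns of \(A\) matches the number of entries in \(\mathbf{x}\) before multiplying:

```python
import numpy as np

def matvec(A: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Compute A @ x after checking that the shapes are compatible."""
    m, n = A.shape
    if x.shape != (n,):
        raise ValueError(f"columns of A ({n}) must match entries of x ({x.shape[0]})")
    return A @ x  # the result is an m-dimensional vector

A = np.array([[1, 2, 3],
              [4, 5, 6]])                # a 2 x 3 matrix

print(matvec(A, np.array([1, 0, 2])))    # valid: prints [ 7 16 ]
# matvec(A, np.array([1, 0]))            # invalid: raises ValueError
```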