1.2 Vector Operations

Linear algebra operations are the building blocks of data transformations. Whether you’re scaling a vector or multiplying matrices, these operations make abstract ideas concrete—and power everything from graphics to machine learning!

Vector Addition

Two vectors can be added only when they are of the same length. The addition is element-wise, and the resulting vector is also of the same length. \bold{u}+\bold{v} =\begin{bmatrix}u_1\\u_2\\\vdots \\ u_n\end{bmatrix}+\begin{bmatrix}v_1\\v_2\\\vdots \\v_n\end{bmatrix}=\begin{bmatrix}u_1+v_1\\u_2+v_2\\\vdots \\u_n+v_n\end{bmatrix}.

Geometry of Vector Addition

Adding vectors places them head-to-tail, and the result is the diagonal of the parallelogram they form.

\bold{u} + \bold{v} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} + \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} u_1 + v_1 \\ u_2 + v_2 \end{bmatrix}
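As a quick check, here is a minimal sketch of vector addition using NumPy (the example vectors are arbitrary):

```python
import numpy as np

u = np.array([1, 2])
v = np.array([3, -1])

# Element-wise addition; u and v must have the same length
print(u + v)  # [4 1]
```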

Vector Subtraction

Two vectors can be subtracted only when they are of the same length. The subtraction is element-wise, and the resulting vector is also of the same length. \bold{u}-\bold{v} =\begin{bmatrix}u_1\\u_2\\\vdots \\ u_n\end{bmatrix}-\begin{bmatrix}v_1\\v_2\\\vdots \\v_n\end{bmatrix}=\begin{bmatrix}u_1-v_1\\u_2-v_2\\\vdots \\u_n-v_n\end{bmatrix}.

Geometry of Vector Subtraction

Subtracting vectors gives the vector that points from the tip of \bold{v} to the tip of \bold{u}.

\bold{u} - \bold{v} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} - \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} u_1 - v_1 \\ u_2 - v_2 \end{bmatrix}
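Subtraction works the same way; a minimal NumPy sketch with the same arbitrary example vectors:

```python
import numpy as np

u = np.array([1, 2])
v = np.array([3, -1])

# Element-wise subtraction; the result points from the tip of v to the tip of u
print(u - v)  # [-2  3]
```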

Scalar Multiplication

For a scalar k\in\mathbb{R}, the scalar multiplication of a vector \bold{u}\in\mathbb{R}^n by k multiplies each element by k. k\bold{u} =k\begin{bmatrix}u_1\\u_2\\\vdots \\ u_n\end{bmatrix}=\begin{bmatrix}ku_1\\ku_2\\\vdots \\ku_n\end{bmatrix}.

Geometry of Scalar Multiplication

Multiplying a vector by a scalar stretches or shrinks its length, and reverses its direction if the scalar is negative.

k\bold{u} = k \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = \begin{bmatrix} k u_1 \\ k u_2 \end{bmatrix}
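A minimal sketch of scalar multiplication in NumPy; note how a negative scalar flips the direction:

```python
import numpy as np

u = np.array([1, 2])

# Element-wise scaling by k
print(2 * u)   # [2 4]   -- stretched to twice the length
print(-1 * u)  # [-1 -2] -- same length, opposite direction
```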

Dot Product

Given two vectors \bold{u},\bold{v}\in\mathbb{R}^n, their dot product \langle\bold{u},\bold{v}\rangle (also known as the inner product) is the sum of the products of the elements at the same position: \langle\bold{u},\bold{v}\rangle=\sum_{i=1}^n u_iv_i.

Dot products are a useful tool in data science with a variety of practical applications. For example, if \bold{u} represents a set of features and \bold{w} a set of corresponding weights, then their inner product \langle \bold{u}, \bold{w}\rangle gives us the weighted sum of the features.
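As an illustration of the weighted-sum use case, here is a minimal sketch (the feature and weight values are made up for the example):

```python
import numpy as np

features = np.array([3.0, 1.5, 2.0])  # hypothetical feature values
weights = np.array([0.5, 0.3, 0.2])   # hypothetical weights

# Dot product = weighted sum of the features
print(np.dot(features, weights))  # 3.0*0.5 + 1.5*0.3 + 2.0*0.2 ≈ 2.35
```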

Geometry of Dot Product

After normalizing two vectors to unit length, their dot product equals the cosine of the angle between them.
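A minimal sketch of this computation via normalization, using np.linalg.norm:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# Normalize to unit length, then the dot product is cos(theta)
u_hat = u / np.linalg.norm(u)
v_hat = v / np.linalg.norm(v)
print(np.dot(u_hat, v_hat))  # ≈ 0.7071, i.e. cos(45 degrees)
```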

Implement in Python
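One possible from-scratch implementation of all four operations, sketched with plain Python lists (the NumPy operators shown above do the same work more efficiently):

```python
def add(u, v):
    """Element-wise vector addition."""
    assert len(u) == len(v), "vectors must have the same length"
    return [ui + vi for ui, vi in zip(u, v)]

def subtract(u, v):
    """Element-wise vector subtraction."""
    assert len(u) == len(v), "vectors must have the same length"
    return [ui - vi for ui, vi in zip(u, v)]

def scale(k, u):
    """Multiply every element of u by the scalar k."""
    return [k * ui for ui in u]

def dot(u, v):
    """Sum of the products of elements at the same position."""
    assert len(u) == len(v), "vectors must have the same length"
    return sum(ui * vi for ui, vi in zip(u, v))

# Usage with arbitrary example vectors
u, v = [1, 2, 3], [4, 5, 6]
print(add(u, v))       # [5, 7, 9]
print(subtract(u, v))  # [-3, -3, -3]
print(scale(2, u))     # [2, 4, 6]
print(dot(u, v))       # 32
```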