The dot product is the most common way to define an inner product between elements of \R^n (n-dimensional vectors).

Definition
Let \mathbf x=(x_1,\ldots,x_n)\in\R^n and \mathbf y=(y_1,\ldots,y_n)\in\R^n . We define the dot product \mathbf x\cdot\mathbf y between \mathbf x and \mathbf y by

\mathbf x\cdot\mathbf y=\sum_{i=1}^n x_iy_i=x_1y_1+\cdots+x_ny_n
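For example, in \R^3 , taking \mathbf x=(1,2,3) and \mathbf y=(4,-1,2) gives

\mathbf x\cdot\mathbf y=(1)(4)+(2)(-1)+(3)(2)=4-2+6=8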

Note that some texts use the symbol \langle\mathbf x,\mathbf y\rangle to denote the dot product between \mathbf x and \mathbf y , preserving the inner-product notation.

The dot product is one of three common types of multiplication involving vectors; the others are the cross product (defined for vectors in \R^3 ) and scalar multiplication, the latter coming from the vector-space structure of \R^n .

\R^n as an inner-product space

We will now prove that the dot product \cdot:\R^n\times\R^n\to\R turns \R^n into an inner-product space. There are four statements to prove, namely, given any \mathbf x,\mathbf y,\mathbf z\in\R^n and any scalar \alpha\in\R , the following hold:

  1. \mathbf x\cdot\mathbf y=\mathbf y\cdot\mathbf x
  2. (\alpha\mathbf x)\cdot\mathbf y=\alpha(\mathbf x\cdot\mathbf y)
  3. (\mathbf x+\mathbf y)\cdot\mathbf z=\mathbf x\cdot\mathbf z+\mathbf y\cdot\mathbf z
  4. \mathbf x\cdot\mathbf x\ge0 with equality if and only if \mathbf x=\mathbf 0
Proof.
  1. \mathbf x\cdot\mathbf y=\sum_{i=1}^n x_iy_i=\sum_{i=1}^n y_ix_i=\mathbf y\cdot\mathbf x
  2. (\alpha\mathbf x)\cdot\mathbf y=\sum_{i=1}^n(\alpha x_i)y_i=\alpha\sum_{i=1}^n x_iy_i=\alpha(\mathbf x\cdot\mathbf y) , where in the second step we factored \alpha out of the sum
  3. (\mathbf x+\mathbf y)\cdot\mathbf z=\sum_{i=1}^n(x_i+y_i)z_i=\sum_{i=1}^n(x_iz_i+y_iz_i)=\sum_{i=1}^n x_iz_i+\sum_{i=1}^n y_iz_i=\mathbf x\cdot\mathbf z+\mathbf y\cdot\mathbf z
  4. \mathbf x\cdot\mathbf x=\sum_{i=1}^n x_i^2=x_1^2+\cdots+x_n^2 . Since each x_i^2\ge0 (i=1,\ldots,n), we have \mathbf x\cdot\mathbf x\ge0 , as required. If \mathbf x=\mathbf 0 , then clearly \mathbf x\cdot\mathbf x=\sum_{i=1}^n 0^2=0 . Conversely, suppose \mathbf x\cdot\mathbf x=0 , that is, x_1^2+\cdots+x_n^2=0 . If some x_i\ne0 , then x_i^2>0 , so \mathbf x\cdot\mathbf x>0 , a contradiction; hence \mathbf x=\mathbf 0

This completes the proof.
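As a concrete check of property 2, take \mathbf x=(1,2) , \mathbf y=(3,4) and \alpha=2 . Then \mathbf x\cdot\mathbf y=3+8=11 , while (\alpha\mathbf x)\cdot\mathbf y=(2,4)\cdot(3,4)=6+16=22=2\cdot11=\alpha(\mathbf x\cdot\mathbf y) , as the proof predicts.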

Euclidean norm and \R^n as a metric space

Once we have defined the dot product between elements of Euclidean n-space, we may define a map \|\cdot\|:\R^n\to\R whose value at \mathbf x\in\R^n is called the norm of \mathbf x .

Definition
If \mathbf x=(x_1,\ldots,x_n)\in\R^n , we define the norm of \mathbf x , denoted by \|\mathbf x\| , by

\|\mathbf x\|=\sqrt{\mathbf x\cdot\mathbf x}=\sqrt{\sum_{i=1}^n x_i^2}
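For instance, if \mathbf x=(3,4)\in\R^2 , then \|\mathbf x\|=\sqrt{3^2+4^2}=\sqrt{25}=5 .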

One can show that if \mathbf x\in\R^n and \mathbf y\in\R^n , then \|\mathbf y-\mathbf x\| is a valid distance between \mathbf x and \mathbf y , and hence turns \R^n into a metric space. In fact, this metric space is complete, meaning that every Cauchy sequence of elements in \R^n converges to some point in \R^n .
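Explicitly, the induced distance is

d(\mathbf x,\mathbf y)=\|\mathbf y-\mathbf x\|=\sqrt{\sum_{i=1}^n(y_i-x_i)^2}

so, for example, the distance between (1,2) and (4,6) in \R^2 is \sqrt{3^2+4^2}=5 .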

Angles between two elements

The dot product can be used to determine the angle \theta between two nonzero elements \mathbf x,\mathbf y\in\R^n : \cos(\theta)=\frac{\mathbf x\cdot\mathbf y}{\|\mathbf x\|\,\|\mathbf y\|}
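For example, with \mathbf x=(1,0) and \mathbf y=(1,1) in \R^2 , we get \cos(\theta)=\frac{1}{1\cdot\sqrt2}=\frac{1}{\sqrt2} , so \theta=\pi/4 (45 degrees).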

Orthogonality

Two elements of an inner-product space are said to be orthogonal if and only if their inner product is 0. In \R^n this translates to: \mathbf x and \mathbf y in \R^n are orthogonal if and only if \mathbf x\cdot\mathbf y=0 . Note that the zero vector is orthogonal to every vector.
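For example, (1,2) and (-2,1) are orthogonal in \R^2 , since (1)(-2)+(2)(1)=0 ; similarly, the standard basis vectors \mathbf e_i of \R^n are pairwise orthogonal.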
