7.1 Vector Spaces
A vector space ($\mathbf{V}$, $\mathbb{F}$) is a set of vectors $\mathbf{V}$, a set of scalars $\mathbb{F}$, and two operations (vector addition and scalar multiplication) that satisfy the following properties:
Vector Addition
Associative: $\vec{u} + (\vec{v} + \vec{w}) = (\vec{u} + \vec{v}) + \vec{w}$ for any $\vec{v}, \vec{u}, \vec{w} \in \mathbf{V}$.
Commutative: $\vec{u} + \vec{v} = \vec{v} + \vec{u}$ for any $\vec{v}, \vec{u} \in \mathbf{V}$.
Additive Identity: There exists an additive identity $\vec{0} \in \mathbf{V}$ such that $\vec{v} + \vec{0} = \vec{v}$ for any $\vec{v} \in \mathbf{V}$.
Additive Inverse: For any $\vec{v} \in \mathbf{V}$, there exists $-\vec{v} \in \mathbf{V}$ such that $\vec{v} + (-\vec{v}) = \vec{0}$. We call $-\vec{v}$ the additive inverse of $\vec{v}$.
Closure under vector addition: For any two vectors $\vec{v}, \vec{u} \in \mathbf{V}$, their sum $\vec{v} + \vec{u}$ must also be in $\mathbf{V}$.
Scalar Multiplication
Associative: $\alpha(\beta\vec{v}) = (\alpha\beta)\vec{v}$ for any $\vec{v} \in \mathbf{V}$, $\alpha, \beta \in \mathbb{F}$.
Multiplicative Identity: There exists $1 \in \mathbb{F}$ where $1 \cdot \vec{v} = \vec{v}$ for any $\vec{v} \in \mathbf{V}$. We call $1$ the multiplicative identity.
Distributive in vector addition: $\alpha(\vec{u} + \vec{v}) = \alpha\vec{u} + \alpha\vec{v}$ for any $\alpha \in \mathbb{F}$ and $\vec{u}, \vec{v} \in \mathbf{V}$.
Distributive in scalar addition: $(\alpha + \beta)\vec{v} = \alpha\vec{v} + \beta\vec{v}$ for any $\alpha, \beta \in \mathbb{F}$ and $\vec{v} \in \mathbf{V}$.
Closure under scalar multiplication: For any vector $\vec{v} \in \mathbf{V}$ and scalar $\alpha \in \mathbb{F}$, the product $\alpha\vec{v}$ must also be in $\mathbf{V}$.
You have already seen vector spaces before! For example, $(\mathbb{R}^n, \mathbb{R})$ is the vector space of all $n$-dimensional real vectors. Using the definitions of vector addition and scalar multiplication from the previous notes, you could show that it satisfies all of the properties above. In fact, matrices also form a vector space $(\mathbb{R}^{n \times m}, \mathbb{R})$, since they fulfill all of the properties above as well; in this class, however, we will generally only deal with vector spaces containing vectors in $\mathbb{R}^n$ or $\mathbb{C}^n$.
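As a quick numerical illustration (not a proof), a short NumPy script can spot-check each axiom above for $(\mathbb{R}^3, \mathbb{R})$. The particular vectors, scalars, and random seed below are arbitrary choices for the sketch:

```python
import numpy as np

# Spot-check the vector space axioms for (R^3, R) on a few sample
# vectors and scalars. Passing these checks illustrates the axioms;
# it does not prove them for all inputs.
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))            # three sample vectors in R^3
a, b = 2.0, -0.5                                 # two sample scalars

assert np.allclose(u + (v + w), (u + v) + w)     # addition is associative
assert np.allclose(u + v, v + u)                 # addition is commutative
assert np.allclose(v + np.zeros(3), v)           # additive identity
assert np.allclose(v + (-v), np.zeros(3))        # additive inverse
assert np.allclose(a * (b * v), (a * b) * v)     # scaling is associative
assert np.allclose(1.0 * v, v)                   # multiplicative identity
assert np.allclose(a * (u + v), a * u + a * v)   # distributes over vector addition
assert np.allclose((a + b) * v, a * v + b * v)   # distributes over scalar addition
print("all axioms hold on these samples")
```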
Additional Resources For more on vector spaces, read Strang pages 123 - 125 and try Problem Set 3.1.
In Schaum's, read pages 112-114 and try problems 4.1, 4.2, and 4.71 to 4.76. Extra: Read and Understand Polynomial Spaces, Spaces of Arbitrary "Field."
7.1.1 Bases
We can use a set of vectors to define a vector space. We call this set of vectors a basis, which we define formally below:
Definition 7.1 (Basis):
Given a vector space $(V, \mathbb{F})$, a set of vectors $\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$ is a basis of the vector space if it satisfies the following two properties:
$\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n$ are linearly independent vectors
For any vector $\vec{v} \in V$, there exist scalars $\alpha_1, \alpha_2, \dots, \alpha_n \in \mathbb{F}$ such that $\vec{v} = \alpha_1\vec{v}_1 + \alpha_2\vec{v}_2 + \dots + \alpha_n\vec{v}_n$.
Intuitively, a basis of a vector space is the minimum set of vectors needed to represent all vectors in the vector space. If a set of vectors is linearly dependent and “spans” the vector space, it is still not a basis because we can remove at least one vector from the set and the resulting set will still span the vector space.
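The two defining properties can be checked numerically: $n$ vectors in $\mathbb{R}^n$ form a basis exactly when the matrix with those vectors as columns has rank $n$. The helper function below is a minimal sketch of this test; its name and the sample sets are illustrative choices, not from the notes:

```python
import numpy as np

def is_basis(vectors):
    """Return True if the given list of 1-D arrays (all of length n)
    is a basis of R^n: exactly n vectors, and the matrix with them as
    columns has full rank (linearly independent and spanning)."""
    A = np.column_stack(vectors)
    n = A.shape[0]
    return len(vectors) == n and np.linalg.matrix_rank(A) == n

e1, e2, e3 = np.eye(3)               # rows of the identity: standard basis of R^3
print(is_basis([e1, e2, e3]))        # True
print(is_basis([e1, e2, e1 + e2]))   # False: third vector is dependent
print(is_basis([e1, e2]))            # False: too few vectors to span R^3
```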
The next natural question to ask is: Given a vector space, is the basis unique? Intuitively, it is not because multiplying one of the vectors in a given basis by a nonzero scalar will not affect the linear independence or span of the vectors. We could alternatively construct another basis by replacing one of the vectors with the sum of itself and any other vector in the set.
To illustrate this mathematically, suppose $\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$ is a basis for the vector space we are considering. Then
$$\{\alpha\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$$
where $\alpha \neq 0$ is also a basis because, just as we've seen in Gaussian elimination row operations, multiplying a row by a nonzero constant does not change the linear independence or dependence of the rows. We can generalize this to say that multiplying a vector by a nonzero scalar also does not change the linear independence of the set of vectors. In addition, we know that
$$\operatorname{span}(\{\alpha\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}) = \operatorname{span}(\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\})$$
because any vector in $\operatorname{span}(\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\})$ can be created as a linear combination of the set $\{\alpha\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$ by dividing the scale factor on $\vec{v}_1$ by $\alpha$. We can use a similar argument to show that $\{\vec{v}_1 + \vec{v}_2, \vec{v}_2, \dots, \vec{v}_n\}$ is also a basis for the same vector space.
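Both modifications, scaling one basis vector by a nonzero $\alpha$ and replacing one vector with the sum of itself and another, can be sketched numerically: neither changes the rank, so the span and independence are preserved. The starting basis and $\alpha$ below are arbitrary choices:

```python
import numpy as np

# Start from the standard basis of R^3 and apply the two modifications
# discussed above; full rank (3) means each modified set is still a basis.
v1, v2, v3 = np.eye(3)
alpha = -2.5                                   # any nonzero scalar works

original = np.column_stack([v1, v2, v3])
scaled   = np.column_stack([alpha * v1, v2, v3])   # scale one basis vector
summed   = np.column_stack([v1 + v2, v2, v3])      # replace v1 with v1 + v2

for M in (original, scaled, summed):
    assert np.linalg.matrix_rank(M) == 3
print("all three sets are bases of R^3")
```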
Example 7.1 (Vector space $(\mathbb{R}^3, \mathbb{R})$): Let's try to find a basis for the vector space $(\mathbb{R}^3, \mathbb{R})$. We want to find a set of vectors that can represent any vector of the form $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$ where $a, b, c \in \mathbb{R}$. One basis could be the set of standard unit vectors:
$$\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\}$$
The set of vectors is linearly independent, and we can represent any vector $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$ in the vector space as a linear combination of the three vectors:
$$\begin{bmatrix} a \\ b \\ c \end{bmatrix} = a\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + b\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} + c\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$$
Alternatively, we could show that
is a basis for the vector space.
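Finding the coefficients of a vector in a given basis amounts to solving a linear system: if the basis vectors are the columns of $B$, the coordinates $\vec{\alpha}$ of $\vec{v}$ satisfy $B\vec{\alpha} = \vec{v}$. The vector $\vec{v}$ and the alternative basis $B$ below are illustrative choices for the sketch:

```python
import numpy as np

# In the standard basis, the coordinates of [a, b, c] are just a, b, c.
v = np.array([4.0, -1.0, 2.0])
e1, e2, e3 = np.eye(3)
assert np.allclose(v[0] * e1 + v[1] * e2 + v[2] * e3, v)

# For a different basis (columns of B), solve B @ alpha = v for the
# coordinates of v in that basis. This B is one arbitrary basis of R^3.
B = np.column_stack([[1, 0, 0], [1, 1, 0], [1, 1, 1]])
alpha = np.linalg.solve(B, v)
assert np.allclose(B @ alpha, v)         # the coordinates reconstruct v
print("v represented in both bases")
```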
Now that we have defined bases, we can define the dimension of a vector space.
Definition 7.2 (Dimension): The dimension of a vector space is the number of basis vectors.
Since each basis vector can be scaled by one coefficient, we can think of the dimension of a space as the minimum number of parameters needed to describe an element of that space. The dimension can also be thought of as the degrees of freedom of your space – that is, the number of parameters that can be varied when describing a member of that space.
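Numerically, the dimension of the span of a set of vectors equals the rank of the matrix whose columns are those vectors. The sketch below (with arbitrarily chosen vectors) shows three vectors in $\mathbb{R}^3$ whose span is only a two-dimensional plane:

```python
import numpy as np

# Three vectors in R^3, but the third is a combination of the first two,
# so only two parameters are free: the span has dimension 2.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2                         # linearly dependent on v1 and v2

dim = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))
print(dim)                               # 2: the span is a plane inside R^3
```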
Example 7.2 (Dimension of $(\mathbb{R}^3, \mathbb{R})$): Previously, we identified a basis
$$\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\}$$
for the vector space $(\mathbb{R}^3, \mathbb{R})$. The basis consists of three vectors, so the dimension of the vector space is three.
Note that a vector space can have many bases, but each basis must have the same number of vectors.
We will not prove this rigorously, but let's sketch the argument. Suppose a basis for the vector space we're considering has $n$ vectors. This means that the minimum number of vectors we can use to represent all vectors in the vector space is $n$, because the vectors in the basis would not be linearly independent if the vector space could be represented with fewer vectors. Any set with fewer than $n$ vectors therefore cannot be a basis: it does not have enough vectors to span the vector space, so some vectors in the vector space cannot be expressed as a linear combination of the vectors in the set. In addition, any set with more than $n$ vectors must be linearly dependent and therefore cannot be a basis. Combining the two arguments, we have that any other set of vectors that forms a basis for the vector space must have exactly $n$ vectors.
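Both halves of the argument can be sketched numerically for $\mathbb{R}^3$: a matrix of four column vectors has rank at most 3 (so the columns are dependent), and a matrix of two column vectors has rank at most 2 (so the columns cannot span $\mathbb{R}^3$). The random samples below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
four = rng.standard_normal((3, 4))       # 4 vectors in R^3, as columns
two  = rng.standard_normal((3, 2))       # 2 vectors in R^3, as columns

# Too many vectors: rank cannot exceed 3, so 4 vectors must be dependent.
assert np.linalg.matrix_rank(four) <= 3
# Too few vectors: rank cannot exceed 2, so 2 vectors cannot span R^3.
assert np.linalg.matrix_rank(two) < 3
print("only sets of exactly 3 vectors can be bases of R^3")
```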
We introduced quite a few terms in this lecture note, and we'll see how we can connect these with our understanding of matrices in the next lecture note!
Additional Resources For more on bases, read Strang pages 167 - 171 and try Problem Set 3.4. Extra: Read Sections on Matrix and Function Space.
In Schaum's, read pages 124-126 and pages 127-129. Try Problems 4.24 to 4.28, 4.97 to 4.103, and 4.33 to 4.40.
7.2 Practice Problems
These practice problems are also available in an interactive form on the course website.
True or False: $\left\{ \begin{bmatrix} -3 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 0 \end{bmatrix}, \begin{bmatrix} 5 \\ 2 \end{bmatrix} \right\}$ spans $\mathbb{R}^2$.
True or False: $\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \begin{bmatrix} 5 \\ -2 \\ 1 \end{bmatrix}, \begin{bmatrix} -3 \\ 6 \\ 5 \end{bmatrix} \right\}$ is a basis for $\mathbb{R}^3$.
The following vectors span $\mathbb{R}^3$:
Which vectors of this set form a basis for $\mathbb{R}^3$?
(a) $\vec{x}_1, \vec{x}_2, \vec{x}_3, \vec{x}_4, \vec{x}_5$
(b) $\vec{x}_1, \vec{x}_3, \vec{x}_5$
(c) $\vec{x}_1, \vec{x}_2, \vec{x}_4$
(d) $\vec{x}_1, \vec{x}_3, \vec{x}_4, \vec{x}_5$