Linear Algebra

Time for a little intro to (finite-dimensional) linear algebra. This requires a few preliminary definitions. A group is a nonempty set {G} with an operation {\star:G\times G\rightarrow G} satisfying

  • Associativity: For all {a,b,c\in G}, {(a\star b)\star c = a\star (b\star c)}.
  • Existence of identity: There exists {e\in G} such that for all {a\in G}, {e\star a=a\star e=a}.
  • Existence of inverse elements: For each {a\in G}, there exists {a^{-1}\in G} such that {a\star a^{-1}=a^{-1}\star a=e}.

For convenience we will write {ab} for {a\star b} when the operation is assumed. If in addition

\displaystyle a\star b = b\star a

for all {a,b\in G}, we say that {G} is an abelian group.
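
For example, {\mathbb{Z}} under ordinary addition is an abelian group: the identity is {0} and the inverse of {a} is {-a}. Similarly, the nonzero rationals under multiplication form an abelian group with identity {1} and inverse {1/a}.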

Now suppose {F} is a non-empty set. If {F} is an abelian group under operation {\oplus} (i.e. “addition”) with identity {0} and {F\setminus\{0\}} is an abelian group under operation {\otimes} (i.e. “multiplication”) with identity {1} (with {1\ne 0}), we say that {F} is a field provided that

  • For all {a,b,c\in F}, {a\otimes(b\oplus c) = (a\otimes b)\oplus(a\otimes c)}.
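
For example, {\mathbb{Q}}, {\mathbb{R}}, and {\mathbb{C}} with the usual addition and multiplication are fields. On the other hand, {\mathbb{Z}} is not a field: {2} has no multiplicative inverse in {\mathbb{Z}}.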

Now, a nonempty set {V} is a vector space over the field {F} if {V} is an abelian group under {+} with identity {0_V} and

  • For all {a,b\in F} and {v\in V}, {a(bv) = (ab)v}.
  • For all {v\in V}, {1v=v} (where {1} is again the multiplicative identity)
  • For all {a\in F} and {u,v\in V}, {a(u+v) = au+av}.
  • For all {a,b\in F} and {v\in V}, {(a+b)v = av+bv}.
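
The standard example is {F^n}, the set of {n}-tuples of elements of {F} with componentwise addition and scalar multiplication {a(v_1,\ldots,v_n)=(av_1,\ldots,av_n)}; in particular, {\mathbb{R}^2} and {\mathbb{R}^3} are vector spaces over {\mathbb{R}}.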

We call elements of {V} vectors and elements of {F} scalars. I’ll skip the tedious (but important!) things, such as proving that the additive/multiplicative identity in a group/field is unique.

On to a very fundamental definition: A finite set of vectors {\{v_1,v_2,\ldots, v_k\}\subset V} is said to be linearly independent if the equation

\displaystyle c_1v_1 + \cdots + c_kv_k = 0

implies that {c_1=\cdots=c_k=0}. If a set of vectors is not linearly independent, it is said to be linearly dependent. From here on I will write “(in)dependent” for “linearly (in)dependent.”
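
For example, in {\mathbb{R}^2} the set {\{(1,0),(0,1)\}} is independent, since {c_1(1,0)+c_2(0,1)=(c_1,c_2)=(0,0)} forces {c_1=c_2=0}. The set {\{(1,0),(0,1),(1,1)\}} is dependent, since {(1,0)+(0,1)-(1,1)=(0,0)}.

A basic (but important) consequence of this definition: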

Theorem 1 If {S=\{v_1, \ldots, v_k\}} is independent, then any subset of {S} is independent.

Proof: Let {T\subset S} and assume without loss of generality that {T=\{v_1, \ldots, v_m\}} for some {m\leqslant k} (we can always renumber the elements in {S}). Then if

\displaystyle c_1v_1+\ldots+c_mv_m=0,

we have

\displaystyle c_1v_1 + \ldots + c_mv_m + c_{m+1}v_{m+1} + \ldots + c_kv_k=0,

where {c_{m+1}=\cdots =c_k=0}. By independence of {S}, it follows that {c_1=\cdots=c_m=0}, so that {T} is independent. \Box

Another consequence:

Theorem 2 If {S\subset V} and {0_V\in S}, then {S} is dependent.

Proof: Let {S=\{v_1, \ldots, v_k, 0_V\}}. Then

\displaystyle 0v_1 + \cdots + 0v_k + 1\cdot 0_V=0_V,

which means that {S} is dependent. \Box

By a similar argument, we can see that if {S} is dependent, then any {T\supset S} is also dependent.

A new definition: If {S\subset V}, the span of {S} is the set of all linear combinations of elements of {S}, i.e.

\displaystyle \mathrm{Span}(S) = \left\{\sum_{v\in S}c_vv:c_v\in F \right\}.
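
For example, in {\mathbb{R}^2} we have {\mathrm{Span}(\{(1,0)\})=\{(c,0):c\in\mathbb{R}\}}, the {x}-axis, while {\mathrm{Span}(\{(1,0),(0,1)\})=\mathbb{R}^2}, since {(a,b)=a(1,0)+b(0,1)}.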

It is clear that for any {S\subset V}, {0_V\in\mathrm{Span}(S)}, for {0_V = \sum_{v\in S}0\cdot v} (if {S=\varnothing}, we interpret this as the empty sum, which is again {0_V}). One basic result about span:

Theorem 3 For any {S\subset V}, {S\subset\mathrm{Span}(S)\subset V}. For any {T\subset S\subset V}, {\mathrm{Span}(T) \subset \mathrm{Span}(S)}.

Proof: If {u\in S}, then

\displaystyle  u = 1\cdot u + \sum_{v\in S\setminus\{u\}}0v\in\mathrm{Span}(S).

If {w\in\mathrm{Span}(T)},

\displaystyle w=\sum_{v\in T}c_vv = \sum_{v\in T}c_vv + \sum_{v\in S\setminus T}0v\in\mathrm{Span}(S).

\Box

Now for a theorem that is less trivial:

Theorem 4 If {S\subset V} is dependent, there is a set {T\subset S} such that {T} is independent, and {\mathrm{Span}(T) = \mathrm{Span}(S)}.

Proof: We construct the set {T} as follows. If {S=\{0_V\}}, take {T=\varnothing}. Otherwise, since {S} is dependent, there is an element {v\in S} such that

\displaystyle \sum_{w\in S\setminus\{v\}}c_ww + c_vv = 0,

with {c_v\ne0}. Let {S'=S\setminus\{v\}}. Then we can write

\displaystyle v = \sum_{w\in S'}-\frac{c_w}{c_v}w,

so that {v\in\mathrm{Span}(S')}. Replace {S} by {S'} and repeat until no such vector {v} can be found, i.e. until the remaining set admits no nontrivial relation {\sum c_ww=0}; this terminates since {S} is a finite set. Call the resulting set {T}. By construction {T\subset S} and {T} is independent. Further, at each step the discarded vector lies in the span of the remaining vectors, so the span never changes: {\mathrm{Span}(T)=\mathrm{Span}(S)}. \Box
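
Since this proof is really an algorithm, here is a minimal computational sketch of it in Python, assuming the vectors live in {\mathbb{R}^n} and are stored as numpy arrays. A matrix-rank test stands in for the dependence check, and the function name prune_to_independent is just for illustration.

import numpy as np

def prune_to_independent(vectors, tol=1e-10):
    # Mirror the proof of Theorem 4: while some vector lies in the span
    # of the others (the rank is unchanged when it is removed), discard it.
    # Each removal leaves the span of the list unchanged.
    T = [np.asarray(v, dtype=float) for v in vectors]
    changed = True
    while changed and T:
        changed = False
        for i in range(len(T)):
            rest = T[:i] + T[i + 1:]
            rank_with = np.linalg.matrix_rank(np.vstack(T), tol=tol)
            rank_without = np.linalg.matrix_rank(np.vstack(rest), tol=tol) if rest else 0
            if rank_with == rank_without:  # T[i] is in Span(rest)
                del T[i]
                changed = True
                break
    return T

# {(1,0), (0,1), (1,1)} is dependent; two independent vectors remain.
S = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
T = prune_to_independent(S)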

The next post will introduce the concepts of subspace and basis.
