**UNIT II**

**VECTOR SPACES**

**2.1 The axioms of a vector space**

**Definition 2.1.1:** Let **K** be a set of numbers. We shall say that **K** is a **field** if it satisfies the following conditions:

a) If x, y are elements of K, then x + y and xy are also elements of K.

b) If x is an element of K, then -x is also an element of K. Furthermore, if x ≠ 0, then x⁻¹ is also an element of K.

c) 0 and 1 are elements of K.

**Example 2.1.1:** The set of all real numbers ℜ and the set of all complex numbers ℂ are fields.

**Activity 2.1.1:** Are ℤ (the set of all integers) and ℚ (the set of all rational numbers) fields?

**Remark:** The essential thing about a field is that its elements can be added and multiplied, and that these operations obey the usual rules of arithmetic.

**Definition 2.1.2:** A **vector space V over a field K** is a set of objects which can be added and can be multiplied by elements of K, satisfying the following properties:

V1) For any u, v ∈ V and a ∈ K, we have i) u + v ∈ V and ii) au ∈ V.

V2) For any u, v, w ∈ V, (u + v) + w = u + (v + w).

V3) There is an element of V, denoted by O (called the zero element), such that O + u = u + O = u for all elements u of V.

V4) For u ∈ V, there exists -u ∈ V such that u + (-u) = O.

V5) For u, v ∈ V, we have u + v = v + u.

V6) For u, v ∈ V and a ∈ K, a(u + v) = au + av.

V7) For u ∈ V and a, b ∈ K, (a + b)u = au + bu and (ab)u = a(bu).

V8) For u ∈ V, 1u = u.

**Activity 2.1.2:** What is the name given for each of the above properties?

Other properties of a vector space can be deduced from the above eight properties. For example, the property 0u = O can be proved as follows: since 0u + u = (0 + 1)u = 1u = u, adding -u to both sides of 0u + u = u gives 0u = O.

**2.2 Examples of different models of a vector space**

1) Consider the sets ℜ2 and ℜ (that is, V = ℜ2 over K = ℜ).

2) Let V = ℜ2 and K = ℂ.

**Activity 2.2.1:** Which of the following are vector spaces?

a) ℂ over ℜ

b) ℂn over ℂ

c) ℚn over ℚ

d) ℜn over ℂ

**2.3 Subspaces, Linear Combinations and Generators**

**Definition 2.3.1:** Suppose V is a vector space over K and W is a non-empty subset of V. If, under the addition and scalar multiplication that are defined on V, W is also a vector space, then we call W a **subspace** of V.

Using this definition and the axioms of a vector space, we can easily prove the following: a non-empty subset W of a vector space V is a **subspace** of V if

i) **W** is closed under addition; that is, if u, w ∈ W, then u + w ∈ W, and

ii) **W** is closed under scalar multiplication; that is, if u ∈ W and a ∈ K, then au ∈ W.

Then, as ∅ ≠ W ⊆ V, properties V1 - V8 are satisfied for the elements of W. Hence W itself is a vector space over K, and we call W a **subspace** of V.

**Example 2.3.1:** Consider H = {(x, y): x, y ∈ ℜ and x + 4y = 0}. H is a subset of the vector space ℜ2 over ℜ. To show that H is a subspace, it is enough to show that the above two properties hold in H.

Let u = (x1, y1) and w = (x2, y2) be in H. Then x1 + 4y1 = 0 and x2 + 4y2 = 0.

Now u + w = (x1 + x2, y1 + y2) and (x1 + x2) + 4(y1 + y2) = (x1 + 4y1) + (x2 + 4y2) = 0 + 0 = 0, which shows u + w ∈ H.

For a ∈ ℜ, au = (ax1, ay1) and (ax1) + 4(ay1) = a(x1 + 4y1) = a · 0 = 0, which shows au ∈ H.

∴ H is a subspace of ℜ2.

**Activity 2.3.1:** Take any vector A in ℜ3. Let W be the set of all vectors B in ℜ3 with B · A = 0. Discuss whether W is a subspace of ℜ3 or not.
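The two-part subspace test used in Example 2.3.1 can also be explored numerically. The sketch below is a minimal illustration, assuming NumPy is available; the helper names `in_H` and `samples` are ours, not part of the notes. It samples a few elements of H = {(x, y): x + 4y = 0} and checks that sums and scalar multiples still satisfy the defining equation.

```python
import numpy as np

def in_H(v, tol=1e-9):
    """Membership test for H = {(x, y) : x + 4y = 0}."""
    x, y = v
    return abs(x + 4 * y) < tol

# Build a few elements of H by choosing y freely and setting x = -4y.
samples = [np.array([-4 * y, y]) for y in (1.0, -2.0, 0.5)]

# Closure under addition: every pairwise sum must satisfy x + 4y = 0.
for u in samples:
    for w in samples:
        assert in_H(u + w)

# Closure under scalar multiplication: every scalar multiple must satisfy it too.
for a in (2.0, -3.5, 0.0):
    for u in samples:
        assert in_H(a * u)

print("sampled sums and scalar multiples all lie in H")
```

Of course, checking finitely many samples proves nothing by itself; the algebraic argument in Example 2.3.1 is what establishes closure for all elements of H.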
**Definition 2.3.2:** Let v1, v2, ..., vn be elements of a vector space V over K, and let x1, x2, ..., xn be elements of K. Then an expression of the form

x1v1 + x2v2 + ... + xnvn

is called a **linear combination** of v1, v2, ..., vn.

**Example 2.3.2:** The sum 2(3, 1) + 4(-1, 2) + (1, 0) is a linear combination of (3, 1), (-1, 2) and (1, 0). As this sum is equal to (3, 10), we say that (3, 10) is a linear combination of the three ordered pairs.

**Activity 2.3.2:**

i. Take two elements v1 and v2 of ℜ3. Let **W** be the set of all linear combinations of v1 and v2. Show that W is a subspace of ℜ3.

ii. Generalize i) by showing that the set W generated by elements v1, v2, ..., vn of a vector space **V** is a subspace of V.

**2.4 Linear dependence and independence of vectors**

**Definition 2.4.1:** Let **V** be a vector space over K. Elements v1, v2, ..., vn of **V** are said to be **linearly independent** if and only if the following condition is satisfied: whenever a1, a2, ..., an are in K such that a1v1 + a2v2 + ... + anvn = O, then ai = 0 for all i = 1, 2, ..., n.

If the above condition does not hold, the vectors are called **linearly dependent**. In other words, v1, v2, ..., vn are linearly dependent if and only if there are numbers a1, a2, ..., an, at least one of them non-zero, such that a1v1 + a2v2 + ... + anvn = O.

**Example 2.4.1:** Consider v1 = (1, -1, 1), v2 = (2, 0, -1) and v3 = (2, -2, 2).

i) a1v1 + a2v2 = a1(1, -1, 1) + a2(2, 0, -1) = (a1 + 2a2, -a1, a1 - a2). If this equals (0, 0, 0), then a1 = 0 from the second component, and then a2 = 0 from the first. Hence v1 and v2 are linearly independent.

ii) a1v1 + a2v3 = a1(1, -1, 1) + a2(2, -2, 2). Taking a1 = 2 and a2 = -1, we get 2(1, -1, 1) + (-1)(2, -2, 2) = O. As the constants are not all equal to zero, v1 and v3 are linearly dependent.

**Activity 2.4.1:** Show that v1, v2 and v3 are also linearly dependent.

**Remark:** If vectors are linearly dependent, at least one of them can be written as a linear combination of the others.

**Activity 2.4.2:** Show that (1, 0, 0, ..., 0), (0, 1, 0, ..., 0), ..., (0, 0, 0, ..., 1) are linearly independent vectors in ℜn.

**2.5 Bases and dimension of a vector space**

**Definition 2.5.1:** If elements e1, e2, ..., en of a vector space **V** are linearly independent and generate **V**, then the set B = {e1, e2, ..., en} is called a **basis** of **V**. We shall also say that the elements e1, e2, ..., en **constitute** or **form** a basis of **V**.

**Example 2.5.1:**

1) Show that e1 = (0, -1) and e2 = (2, 1) form a basis of ℜ2.

Solution: we have to show that

i) e1 and e2 are linearly independent, and

ii) they generate ℜ2, i.e. every element (x, y) of ℜ2 can be written as a linear combination of e1 and e2.

i) a1e1 + a2e2 = O ⇒ a1(0, -1) + a2(2, 1) = (0, 0) ⇒ 2a2 = 0 and -a1 + a2 = 0 ⇒ a2 = 0 and a1 = 0.

∴ e1 and e2 are linearly independent.

ii) (x, y) = a1e1 + a2e2 ⇒ (x, y) = (0, -a1) + (2a2, a2) ⇒ x = 2a2 and y = -a1 + a2 ⇒ a2 = x/2 and a1 = a2 - y ... (*)

Therefore, given any (x, y), we can find a1 and a2 from (*), and (x, y) can be written as a linear combination of e1 and e2 as (x, y) = (x/2 - y)e1 + (x/2)e2.

Note that {(1, 0), (0, 1)} is also a basis of ℜ2. Hence a vector space can have two or more bases. Find other bases of ℜ2.

2) Do e1 = (2, 1, 0) and e2 = (1, 1, 0) form a basis of ℜ3?

Consider (3, 4, 2) = a1(2, 1, 0) + a2(1, 1, 0). The third component of any such combination is 0, so (3, 4, 2) cannot be written as a linear combination of e1 and e2; they do not generate ℜ3. Hence {(2, 1, 0), (1, 1, 0)} is not a basis of ℜ3.

The vectors E1 = (1, 0, 0), E2 = (0, 1, 0), E3 = (0, 0, 1) are linearly independent, and every element (x, y, z) of ℜ3 can be written as (x, y, z) = x(1, 0, 0) + y(0, 1, 0) + z(0, 0, 1) = xE1 + yE2 + zE3. Hence {E1, E2, E3} is a basis of ℜ3.

Note that the set of elements E1 = (1, 0, 0, ..., 0), E2 = (0, 1, 0, ..., 0), ..., En = (0, 0, 0, ..., 1) is a basis of ℜn. It is called the **standard basis**.
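The independence test of Definition 2.4.1 can be carried out mechanically: vectors in ℜm are linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors. The sketch below is a small NumPy illustration of this criterion (the helper name `linearly_independent` is ours), applied to the vectors of Example 2.4.1 and Activity 2.4.1.

```python
import numpy as np

def linearly_independent(*vectors):
    """Return True if the given vectors in R^m are linearly independent.

    Stack the vectors as columns of a matrix A; they are independent
    exactly when rank(A) equals the number of vectors.
    """
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1 = np.array([1, -1, 1])
v2 = np.array([2, 0, -1])
v3 = np.array([2, -2, 2])   # note v3 = 2*v1

print(linearly_independent(v1, v2))      # True  (Example 2.4.1 i)
print(linearly_independent(v1, v3))      # False (Example 2.4.1 ii)
print(linearly_independent(v1, v2, v3))  # False (Activity 2.4.1)
```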
Let B = {e1, e2, ..., en} be a basis of **V**. Since B generates **V**, any u in **V** can be represented as u = a1e1 + a2e2 + ... + anen. Since the ei are linearly independent, such a representation is unique. We call (a1, a2, ..., an) the **coordinate vector** of u with respect to the basis B, and we call ai the **i-th coordinate**.

**Example 2.5.2:**

1) In 1) of Example 2.5.1, the coordinate vector of (4, 3) with respect to the basis {(0, -1), (2, 1)} is (-1, 2). But with respect to the standard basis it is (4, 3). Find the coordinates of (4, 3) in some other basis of ℜ2.

2) Consider the set **V** of all polynomial functions f: ℜ → ℜ which are of degree less than or equal to 2. Every element of **V** has the form f(x) = bx2 + cx + d, where b, c, d ∈ ℜ. **V** is a vector space over ℜ (show).

Clearly, e1 = x2, e2 = x and e3 = 1 are in **V**, and

a1e1 + a2e2 + a3e3 = O (O is the zero function) ⇒ a1x2 + a2x + a3 = 0 for all x ⇒ a1 = a2 = a3 = 0,

which shows e1, e2 and e3 are linearly independent. Moreover,

bx2 + cx + d = a1e1 + a2e2 + a3e3 for all x ⇒ bx2 + cx + d = a1x2 + a2x + a3 ⇒ b = a1, c = a2 and d = a3.

Thus e1, e2 and e3 generate **V**.

∴ {x2, x, 1} is a basis of **V**, and the coordinate vector of an element f(x) = bx2 + cx + d is (b, c, d). The coordinate vector of x2 - 3x + 5 is (1, -3, 5).

**Activity 2.5.1:** Show that the polynomials E1, E2, E3 form a basis of the vector space **V** defined in 2) of Example 2.5.2. What is the coordinate vector of **f(x) = 2x2 - 5x + 6** with respect to the basis **{E1, E2, E3}**?

E = {(1, 0, 0), (0, 1, 0), (0, 0, 1)} and B = {(-1, 1, 0), (-2, 0, 2), (1, 1, 1)} are bases of ℜ3, and each has three elements. Can you find a basis of ℜ3 having two elements? Four elements?

The main result of this section is that any two bases of a vector space have the same number of elements. To prove this, we use the following theorem.

**Theorem 2.5.1:** Let V be a vector space over the field K. Let {v1, v2, ..., vn} be a basis of V. If w1, w2, ..., wm are elements of V, where m > n, then w1, w2, ..., wm are linearly dependent.

**Proof:** (reading assignment)

**Theorem 2.5.2:** Let V be a vector space and suppose that one basis B has n elements and another basis W has m elements. Then m = n.

**Proof:** As B is a basis, m > n is impossible; otherwise, by Theorem 2.5.1, W would be a linearly dependent set, which contradicts the fact that W is a basis. Similarly, as W is a basis, n > m is also impossible. Hence n = m.

**Definition 2.5.2:** Let V be a vector space having a basis consisting of n elements. We shall say that n is the **dimension** of V. It is denoted by **dim V**.

**Remarks:**

1. If V = {O}, then V doesn't have a basis, and we shall say that dim V is zero.

**Example 2.5.3:**

1) ℜ3 over ℜ has dimension 3. In general, ℜn over ℜ has dimension n.

2) ℜ over ℜ has dimension 1. In fact, {1} is a basis of ℜ, because a · 1 = 0 ⇒ a = 0, and any number x ∈ ℜ has the unique expression x = x · 1.

**Definition 2.5.3:** The set of elements {v1, v2, ..., vn} of a vector space V is said to be a **maximal set of linearly independent elements** if v1, v2, ..., vn are linearly independent and, given any element w of V, the elements w, v1, v2, ..., vn are linearly dependent.

**Example 2.5.4:** In ℜ3, {(1, 0, 0), (0, 1, 1), (0, 2, 1)} is a maximal set of linearly independent elements.
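Finding a coordinate vector, as in 1) of Example 2.5.2, amounts to solving the linear system a1e1 + ... + anen = u for the scalars ai. The sketch below is a minimal NumPy illustration (the helper name `coordinates` is ours); it reproduces the coordinate vector of (4, 3) with respect to the basis {(0, -1), (2, 1)} and with respect to the standard basis.

```python
import numpy as np

def coordinates(u, basis):
    """Coordinate vector of u with respect to `basis` (a list of vectors).

    Solves E @ a = u, where the basis vectors form the columns of E.
    """
    E = np.column_stack(basis)
    return np.linalg.solve(E, u)

u = np.array([4.0, 3.0])
B = [np.array([0.0, -1.0]), np.array([2.0, 1.0])]

print(coordinates(u, B))                      # [-1.  2.], i.e. a1 = -1, a2 = 2
print(coordinates(u, [np.array([1.0, 0.0]),
                      np.array([0.0, 1.0])])) # [4. 3.] with the standard basis
```

Note that `np.linalg.solve` requires a square, invertible matrix, which is exactly the case when the chosen vectors really do form a basis.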
We now give criteria which allow us to tell when elements of a vector space constitute a basis.

**Theorem 2.5.3:** Let **V** be a vector space and let {v1, v2, ..., vn} be a maximal set of linearly independent elements of **V**. Then {v1, v2, ..., vn} is a basis of **V**.

**Proof:** It suffices to show that v1, v2, ..., vn generate **V** (why?). Let w ∈ V. Then w, v1, v2, ..., vn are linearly dependent (why?). Hence there exist numbers a0, a1, a2, ..., an, not all 0, such that

a0w + a1v1 + a2v2 + ... + anvn = O.

In particular, a0 ≠ 0 (why?). Therefore, solving for w,

w = -(a1/a0)v1 - (a2/a0)v2 - ... - (an/a0)vn.

This proves that w is a linear combination of v1, v2, ..., vn.

**Theorem 2.5.4:** Let dim **V** = n, and let v1, v2, ..., vn be linearly independent elements of **V**. Then {v1, v2, ..., vn} is a basis of **V**.

**Proof:** According to Theorem 2.5.1, any n + 1 elements of V are linearly dependent, so {v1, v2, ..., vn} is a maximal set of linearly independent elements of **V**. Hence it is a basis by Theorem 2.5.3.

**Corollary 2.5.1:** Let W be a subspace of V. If dim W = dim V, then V = W.

**Proof:** Exercise

**2.6 Direct sum and direct product of subspaces**

Let **V** be a vector space over the field K, and let **U**, **W** be subspaces of **V**. We define the **sum** of **U** and **W** to be the subset of **V** consisting of all sums **u + w** with u ∈ U and w ∈ W. We denote this sum by **U + W**; it is a subspace of **V**. Indeed, if u1, u2 ∈ U and w1, w2 ∈ W, then

(u1 + w1) + (u2 + w2) = (u1 + u2) + (w1 + w2) ∈ U + W.

If c ∈ K, then c(u + w) = cu + cw ∈ U + W. Finally, O = O + O ∈ U + W. This proves that U + W is a subspace.

**Definition 2.6.1:** A vector space V is a **direct sum** of U and W if for every element v in V there exist unique elements u ∈ U and w ∈ W such that v = u + w.

**Theorem 2.6.1:** Let V be a vector space over the field K, and let U, W be subspaces. If U + W = V and U ∩ W = {O}, then V is the direct sum of U and W.

**Proof:** Exercise

**Note:** When V is the direct sum of subspaces U, W we write V = U ⊕ W.

**Theorem 2.6.2:** Let V be a finite dimensional vector space over the field K. Let W be a subspace. Then there exists a subspace U such that V is the direct sum of W and U.

**Proof:** Exercise

**Theorem 2.6.3:** If **V** is a finite dimensional vector space over the field K and is the direct sum of subspaces U, W, then **dim V = dim U + dim W**.

**Proof:** Exercise

**Remark:** We can also define V as a direct sum of more than two subspaces. Let W1, W2, ..., Wr be subspaces of V. We shall say that V is their **direct sum** if every element v of V can be expressed in a unique way as a sum v = w1 + w2 + ... + wr with wi in Wi.

Suppose now that U, W are arbitrary vector spaces over the field K (i.e. not necessarily subspaces of some vector space). We let U × W be the set of all pairs (u, w) whose first component is an element u of U and whose second component is an element w of W. We define the addition of such pairs componentwise, namely, if (u1, w1) ∈ U × W and (u2, w2) ∈ U × W, we define

(u1, w1) + (u2, w2) = (u1 + u2, w1 + w2).

If c ∈ K, we define the product c(u1, w1) by c(u1, w1) = (cu1, cw1). It is then immediately verified that U × W is a vector space, called the **direct product** of U and W.

**Note:** If n is a positive integer written as a sum of two positive integers, n = r + s, then we see that Kn is the direct product Kr × Ks, and dim(U × W) = dim U + dim W.
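Theorem 2.6.2 can be made concrete in ℜn: starting from a basis of W, keep adjoining standard basis vectors that raise the rank until n independent vectors are reached; the adjoined vectors span a subspace U with ℜn = W ⊕ U. The sketch below is a rough illustration under that idea, assuming NumPy; the helper name `complement_basis` is ours and is not from the notes.

```python
import numpy as np

def complement_basis(W_basis, n):
    """Greedily extend a basis of a subspace W of R^n to a basis of R^n.

    Returns the adjoined standard basis vectors; they span a subspace U
    with R^n = W (+) U, illustrating Theorem 2.6.2.
    """
    chosen = list(W_basis)
    extra = []
    for e in np.eye(n):
        candidate = np.column_stack(chosen + [e])
        if np.linalg.matrix_rank(candidate) > len(chosen):
            chosen.append(e)
            extra.append(e)
    return extra

# W = {(x1, x2, 0)} has basis {(1, 0, 0), (0, 1, 0)}.
W_basis = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
U_basis = complement_basis(W_basis, 3)
print(U_basis)   # the single vector (0, 0, 1); U = {(0, 0, x3)}, dim W + dim U = 3
```

The printed complement also matches the dimension count of Theorem 2.6.3: dim W + dim U = 2 + 1 = 3 = dim ℜ3.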
**Example 2.6.1:** Let V = ℜ3, U = {(0, 0, x3): x3 ∈ ℜ} and W = {(x1, x2, 0): x1, x2 ∈ ℜ}. Show that V is the direct sum of W and U.

**Solution:** U and W are subspaces of V. Their sum is

U + W = {(x1, x2, x3): x1, x2, x3 ∈ ℜ} = ℜ3 = V,

so V = U + W. Their intersection is U ∩ W = {(0, 0, 0)}. Therefore, by Theorem 2.6.1, V is the direct sum of W and U.

**Activity 2.6.1:**

1. Let V = ℜ3, U = {(x1, 0, x3): x1, x3 ∈ ℜ} and W = {(0, x2, 0): x2 ∈ ℜ}. Show that V is the direct sum of W and U.

2. Let V = ℜ3, U = {(x1, x2, 0): x1, x2 ∈ ℜ} and W = {(0, 0, x3): x3 ∈ ℜ}. Show that V is the direct sum of W and U.

**Exercise 2.1**

1. Let K be the set of all numbers which can be written in the form a + b√2, where a, b are rational numbers. Show that K is a field.

2. Show that the following sets form subspaces:

a) The set of all (x, y) in ℜ2 such that x = y

b) The set of all (x, y) in ℜ2 such that x - y = 0

c) The set of all (x, y, z) in ℜ3 such that x + y = 3z

d) The set of all (x, y, z) in ℜ3 such that x = y and z = 2y

3. If U and W are subspaces of a vector space V, show that U ∩ W and U + W are subspaces.

4. Decide whether the following vectors are linearly independent or not (over ℜ):

a) (π, 0) and (0, 1)

b) (-1, 1, 0) and (0, 1, 2)

c) (0, 1, 1), (0, 2, 1), and (1, 5, 3)

5. Find the coordinates of X with respect to the vectors A, B and C:

a) X = (1, 0, 0), A = (1, 1, 1), B = (-1, 1, 0), C = (1, 0, -1)

b) X = (1, 1, 1), A = (0, 1, -1), B = (1, 1, 0), C = (1, 0, 2)

6. Prove: the vectors (a, b) and (c, d) in the plane are linearly dependent if and only if ad - bc = 0.

7. Find a basis and the dimension of the subspace of ℜ4 generated by {(1, -4, -2, 1), (1, -3, -1, 2), (3, -8, -2, 7)}.

8. Let W be the space generated by the polynomials x3 + 3x2 - x + 4 and 2x3 + x2 - 7x - 7. Find a basis and the dimension of W.

9. Let V = {(a, b, c, d) ∈ ℜ4: b - 2c + d = 0} and W = {(a, b, c, d) ∈ ℜ4: a = d, b = 2c}. Find a basis and the dimension of

a) V b) W c) V ∩ W

10. What is the dimension of the space of 2 × 2 matrices? Give a basis for this space. Answer the same question for the space of n × m matrices.

11. Find the dimensions of the following:

a) The space of n × n matrices all of whose elements are 0 except possibly the diagonal elements

b) The space of n × n upper triangular matrices

c) The space of n × n symmetric matrices

d) The space of n × n diagonal matrices

12. Let V be a subspace of ℜ3. What are the possible dimensions for V? Show that if V ≠ ℜ3, then either V = {0}, or V is a straight line passing through the origin, or V is a plane passing through the origin.
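For exercises such as 7, where a basis and the dimension of a spanned subspace are asked for, a hand computation can be checked numerically. The sketch below is a minimal NumPy aid (the helper name `basis_from_generators` is ours): it greedily keeps those generators that raise the rank, so the kept vectors form one possible basis of the generated subspace and their count is its dimension.

```python
import numpy as np

def basis_from_generators(generators):
    """Select a maximal linearly independent subset of the generators.

    The vectors kept form a basis of the subspace they generate; the
    number kept is the dimension of that subspace.
    """
    basis = []
    for v in generators:
        candidate = np.column_stack(basis + [v])
        if np.linalg.matrix_rank(candidate) > len(basis):
            basis.append(v)
    return basis

# Generators from Exercise 7.
gens = [np.array([1, -4, -2, 1]),
        np.array([1, -3, -1, 2]),
        np.array([3, -8, -2, 7])]

B = basis_from_generators(gens)
print(len(B))   # dimension of the generated subspace of R^4
print(B)        # one choice of basis, taken from the generators themselves
```

This only checks the answer; the exercise still expects the row-reduction or linear-combination argument to be written out by hand.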