# Tema 12: Espacios Vectoriales

## Summary

This document is a lecture or study guide on vector spaces, covering linear varieties (subspaces), maps between vector spaces, and the isomorphism theorems, together with the related definitions, theorems, and proofs.

## Full Transcript


# Academia DEIMOS

## Oposiciones:

- Secundaria
- Diplomados en Estadística del Estado

## Contact:

- Phone: 669 31 64 06 - 91 479 23 42
- Website: www.academiadeimos.es
- Email: [email protected]
- Email: [email protected]

# Tema 12: Espacios Vectoriales

## Topics:

- Vector Spaces
- Linear Varieties
- Maps between Vector Spaces
- Isomorphism Theorems

## Section 12.1: The Concept of a Vector Space

## Section 12.2: Vector Subspaces

### Section 12.2.1: Definition

- A **subspace**, or **linear variety**, is a subset U of a vector space V that is **itself a vector space under the same operations**.
- **Equivalently**: for any vectors *u*, *v* in U and scalars *λ*, *µ* in K, the combination *λu* + *µv* is also in U.

### Section 12.2.2: Observations

1. The zero vector (0) belongs to every subspace.
2. Both {0} and V are improper subspaces.
3. All other subspaces are proper subspaces.
4. Every linear combination of vectors of U is again in U.

### Section 12.2.3: Definition

- The **subspace spanned by a set** S is the set of all linear combinations of vectors of S.
- It is the smallest subspace that contains S.

### Section 12.2.4: Intersection and Sum of Subspaces

1. **Intersection**: The intersection of any family of subspaces is a subspace.
   - **Union**: The union of subspaces is **not**, in general, a subspace.
2. **Sum**: The sum of subspaces is the set of all vectors that can be written as sums of vectors taken from each subspace.
   - **Direct Sum**: The sum is direct if every vector of the sum has a unique representation as such a sum. For two subspaces, this happens exactly when their intersection is {0}.
3. **Supplementary Subspaces**: Two subspaces are supplementary if their sum is the whole space and their intersection is {0}.

## Section 12.3: Linear Dependence

### Section 12.3.1: Definitions

- A set of vectors is **linearly independent** if the only linear combination of them equal to 0 is the one in which every coefficient is 0.
- A set of vectors is **linearly dependent** if some linear combination of them equals 0 with at least one non-zero coefficient.

### Section 12.3.2: Observations

1. A set of vectors is linearly dependent if and only if one of the vectors is a linear combination of the others.
2. Any set containing the zero vector is linearly dependent.
3. A set consisting of a single non-zero vector is linearly independent.
4. Adding vectors to a linearly dependent set leaves it linearly dependent.
5. Removing vectors from a linearly independent set leaves it linearly independent: every subset of an independent set is independent.

## Section 12.4: Finite-Dimensional Spaces

### Section 12.4.1: Definition

- A vector space is **finite-dimensional** if it has a finite spanning set, i.e. a finite set of vectors whose linear combinations generate every vector of the space.

### Section 12.4.2: Fundamental Theorem of Linear Independence

- **Statement**: In a finite-dimensional space, a linearly independent set cannot have more vectors than a spanning set.
- **Proof sketch**:
  - Suppose a linearly independent set I had more vectors than a spanning set G.
  - Express the first vector of I as a linear combination of the vectors of G; since that vector is non-zero, at least one coefficient must be non-zero.
  - Exchange the corresponding vector of G for that vector of I, producing a new spanning set of the same size.
  - Repeating the exchange for each vector of I exhausts G before I, forcing some vector of I to be a linear combination of the others, which is a contradiction.

### Section 12.4.3: Definition

- A **basis** of a finite-dimensional vector space is a linearly independent spanning set.

### Section 12.4.4: Theorem: Existence of Bases

- **Statement**: Every finite spanning set contains a basis.

### Section 12.4.5: Theorem: Dimension of a Space

- **Statement**: Every basis of a finite-dimensional vector space has the same number of vectors.
- **Proof**:
  - If two bases had different numbers of vectors, one would have more vectors than the other.
  - But by the fundamental theorem, a linearly independent set cannot have more vectors than a spanning set.
  - Therefore, both bases have the same number of vectors; this common number is defined as the **dimension** of the space.

### Section 12.4.6: Consequences

1. A spanning set with as many vectors as the dimension of the space is a basis.
2. A linearly independent set with as many vectors as the dimension of the space is a basis.

### Section 12.4.7: Theorem: Extension of a Linearly Independent Set

- **Statement**: Any linearly independent set can be extended to a basis by adjoining suitable vectors from an existing basis until it has as many vectors as the dimension of the space.

### Section 12.4.8: Dimension of a Subspace

- The dimension of a subspace is less than or equal to the dimension of the whole space.
- If a subspace has the same dimension as the whole space, it equals the whole space.

### Section 12.4.9: Coordinates

- If a vector is written as a linear combination of the vectors of a basis, the coefficients of that combination are called the **coordinates** of the vector with respect to that basis.

## Section 12.5: Linear Transformations

### Section 12.5.1: Definition

- A **homomorphism**, or **linear transformation**, from a vector space V to a vector space W is a map that preserves vector addition and scalar multiplication.

### Section 12.5.2: Properties

1. The image of the zero vector of V is the zero vector of W.
2. If a set of vectors is linearly dependent in V, then their images are linearly dependent in W.
3. The composition of two linear transformations is a linear transformation.

### Section 12.5.3: Determination of a Linear Transformation

- A linear transformation from a finite-dimensional space V to a space W is uniquely determined by the images of the vectors of a basis of V.

### Section 12.5.4: Kernel and Image

- **Kernel**: The set of all vectors of V that map to the zero vector of W.
- **Image**: The set of all vectors of W that are the image of some vector of V.

## Section 12.6: Isomorphisms

- **Injective (one-to-one)**: A linear transformation is injective if and only if its kernel is {0}.
- **Surjective (onto)**: A linear transformation is surjective if and only if its image is all of W.
- **Bijective (one-to-one and onto)**: A linear transformation is bijective if it is both injective and surjective.
- **Isomorphism**: A bijective linear transformation. It preserves the structure of the vector spaces, which are then essentially the same.
- **Automorphism**: An isomorphism from a vector space to itself.

## Section 12.7: Quotient Vector Space

### Section 12.7.1: Quotient Space

- Given a subspace U of V, the quotient space V/U is formed by the equivalence classes of vectors of V, where two vectors are equivalent if their difference lies in U.
- The equivalence class of a vector u is denoted **u + U**.
- Operations in the quotient vector space:
  - (u + U) + (v + U) = (u + v) + U
  - λ(u + U) = λu + U

### Section 12.7.2: Theorem: Dimension of a Quotient Space

- dim(V/U) = dim(V) − dim(U).

### Section 12.7.3: Isomorphism Theorems

1. **V/ker f is isomorphic to Im f**
2. **(U1 + U2) / U1 is isomorphic to U2 / (U1 ∩ U2)**
3. **(V / U1) / (U2 / U1) is isomorphic to V / U2**

### Section 12.7.4: Consequences

1. **dim(ker f) + dim(Im f) = dim(V)**
2. **Grassmann's Formula**: **dim(U1 + U2) + dim(U1 ∩ U2) = dim(U1) + dim(U2)**
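The definitions of linear independence (Section 12.3) and coordinates (Section 12.4.9) can be checked numerically. A minimal sketch in Python with NumPy, assuming vectors of R³; the helper name `is_independent` and the sample vectors are illustrative choices, not part of the original notes:

```python
import numpy as np

def is_independent(vectors):
    # A set is linearly independent iff the matrix whose columns
    # are the vectors has rank equal to the number of vectors.
    m = np.column_stack(vectors)
    return np.linalg.matrix_rank(m) == len(vectors)

u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
u3 = u1 + 2 * u2  # a linear combination, so {u1, u2, u3} is dependent

# {u1, u2} is independent; adding u3 makes the set dependent.
assert is_independent([u1, u2])
assert not is_independent([u1, u2, u3])

# Coordinates of v with respect to a basis B (Section 12.4.9):
# solve B x = v, where the basis vectors are the columns of B.
B = np.column_stack([[1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [1.0, 1.0, 1.0]])
v = np.array([3.0, 2.0, 1.0])
coords = np.linalg.solve(B, v)  # coordinates of v in the basis B
```

Using the rank of the column matrix mirrors the definition directly: a non-trivial combination equal to 0 exists exactly when the columns fail to reach full rank.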
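The two consequences in Section 12.7.4 can likewise be verified numerically. A sketch assuming a linear map f: R⁴ → R³ represented by an example matrix A, and two subspaces of R³ spanned by the columns of B1 and B2; the fact that, when the columns of B1 and B2 are independent, dim(U1 ∩ U2) equals the nullity of the block matrix [B1 | −B2] is a standard computation, not something stated in the notes:

```python
import numpy as np

rank = np.linalg.matrix_rank

# Rank-nullity: dim(ker f) + dim(Im f) = dim(V).
# f: R^4 -> R^3 represented by a 3x4 matrix A (example values).
A = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 1.0, 2.0, 0.0]])
dim_V = A.shape[1]        # dimension of the domain
dim_im = rank(A)          # dim(Im f) = rank of A
dim_ker = dim_V - dim_im  # nullity = number of free variables
assert dim_ker + dim_im == dim_V

# Grassmann: dim(U1 + U2) + dim(U1 ∩ U2) = dim(U1) + dim(U2).
B1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # U1 = span{e1, e2}
B2 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # U2 = span{e2, e3}
d1, d2 = rank(B1), rank(B2)
d_sum = rank(np.hstack([B1, B2]))  # dim(U1 + U2): rank of all generators
# With independent columns in B1 and B2, the intersection dimension
# is the nullity of [B1 | -B2]: each null vector pairs coefficients
# x, y with B1 @ x = B2 @ y, i.e. a vector in U1 ∩ U2.
d_int = (B1.shape[1] + B2.shape[1]) - rank(np.hstack([B1, -B2]))
assert d_sum + d_int == d1 + d2
```

Here U1 ∩ U2 is the line spanned by e2, so both sides of Grassmann's formula equal 4.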
