Linear Algebra PDF

58 Pages·2017·0.462 MB·English
LINEAR ALGEBRA

SIMON WADSLEY

Contents
1. Vector spaces
  1.1. Definitions and examples
  1.2. Linear independence, bases and the Steinitz exchange lemma
  1.3. Direct sum
2. Linear maps
  2.1. Definitions and examples
  2.2. Linear maps and matrices
  2.3. The first isomorphism theorem and the rank-nullity theorem
  2.4. Change of basis
  2.5. Elementary matrix operations
3. Duality
  3.1. Dual spaces
  3.2. Dual maps
4. Bilinear forms (I)
5. Determinants of matrices
6. Endomorphisms
  6.1. Invariants
  6.2. Minimal polynomials
  6.3. The Cayley-Hamilton theorem
  6.4. Multiplicities of eigenvalues and Jordan normal form
7. Bilinear forms (II)
  7.1. Symmetric bilinear forms and quadratic forms
  7.2. Hermitian forms
8. Inner product spaces
  8.1. Definitions and basic properties
  8.2. Gram–Schmidt orthogonalisation
  8.3. Adjoints
  8.4. Spectral theory

Date: Michaelmas 2016.

Lecture 1

1. Vector spaces

Linear algebra can be summarised as the study of vector spaces and linear maps between them. This is a second 'first course' in linear algebra. That is to say, we will define everything we use, but will assume some familiarity with the concepts (picked up from the IA course Vectors & Matrices, for example).

1.1. Definitions and examples.

Examples.
(1) For each non-negative integer n, the set R^n of column vectors of length n with real entries is a vector space (over R). An (m×n)-matrix A with real entries can be viewed as a linear map R^n → R^m via v ↦ Av. In fact, as we will see, every linear map R^n → R^m is of this form. This is the motivating example and can be used for intuition throughout this course. However, it comes with a specified system of co-ordinates, given by taking the various entries of the column vectors. A substantial difference between this course and Vectors & Matrices is that we will work with vector spaces without a specified set of co-ordinates. We will see a number of advantages to this approach as we go.
(2) Let X be a set and R^X := {f : X → R} be equipped with the addition given by (f+g)(x) := f(x)+g(x) and the multiplication by scalars (in R) given by (λf)(x) := λ(f(x)). Then R^X is a vector space (over R), in some contexts called the space of scalar fields on X. More generally, if V is a vector space over R then V^X := {f : X → V} is a vector space in a similar manner — a space of vector fields on X.
(3) If [a,b] is a closed interval in R then C([a,b], R) := {f ∈ R^[a,b] : f is continuous} is an R-vector space, by restricting the operations on R^[a,b]. Similarly C^∞([a,b], R) := {f ∈ C([a,b], R) : f is infinitely differentiable} is an R-vector space.
(4) The set of (m×n)-matrices with real entries is a vector space over R.

Notation. We will use F to denote an arbitrary field. However, the schedules only require consideration of R and C in this course. If you prefer, you may understand F to always denote either R or C (and the examiners must take this view).

What do our examples of vector spaces above have in common? In each case we have a notion of addition of 'vectors' and scalar multiplication of 'vectors' by elements of R.

Definition. An F-vector space is an abelian group (V, +) equipped with a function F × V → V; (λ, v) ↦ λv such that
(i) λ(µv) = (λµ)v for all λ, µ ∈ F and v ∈ V;
(ii) λ(u+v) = λu + λv for all λ ∈ F and u, v ∈ V;
(iii) (λ+µ)v = λv + µv for all λ, µ ∈ F and v ∈ V;
(iv) 1v = v for all v ∈ V.

Note that this means that we can add, subtract and rescale elements in a vector space, and these operations behave in the ways that we are used to. Note also that in general a vector space does not come equipped with a co-ordinate system, or notions of length, volume or angle. We will discuss how to recover these later in the course. At that point particular properties of the field F will be important.

Convention. We will always write 0 to denote the additive identity of a vector space V. By slight abuse of notation we will also write 0 to denote the vector space {0}.

Exercise.
(1) Convince yourself that all the vector spaces mentioned thus far do indeed satisfy the axioms for a vector space.
(2) Show that for any v in any vector space V, 0v = 0 and (−1)v = −v.

Definition. Suppose that V is a vector space over F. A subset U ⊂ V is an (F-linear) subspace if
(i) for all u₁, u₂ ∈ U, u₁ + u₂ ∈ U;
(ii) for all λ ∈ F and u ∈ U, λu ∈ U;
(iii) 0 ∈ U.

Remarks.
(1) It is straightforward to see that U ⊂ V is a subspace if and only if U ≠ ∅ and λu₁ + µu₂ ∈ U for all u₁, u₂ ∈ U and λ, µ ∈ F.
(2) If U is a subspace of V then U is a vector space under the inherited operations.

Examples.
(1) {(x₁, x₂, x₃)ᵀ ∈ R³ : x₁ + x₂ + x₃ = t} is a subspace of R³ if and only if t = 0.
(2) Let X be a set. We define the support of a function f : X → F to be supp f := {x ∈ X : f(x) ≠ 0}. Then {f ∈ F^X : |supp f| < ∞} is a subspace of F^X, since we can compute supp 0 = ∅, supp(f+g) ⊂ supp f ∪ supp g, and supp λf = supp f if λ ≠ 0.

Definition. Suppose that U and W are subspaces of a vector space V over F. Then the sum of U and W is the set U + W := {u + w : u ∈ U, w ∈ W}.

Proposition. If U and W are subspaces of a vector space V over F then U ∩ W and U + W are also subspaces of V.

Proof. Certainly both U ∩ W and U + W contain 0. Suppose that v₁, v₂ ∈ U ∩ W, u₁, u₂ ∈ U, w₁, w₂ ∈ W, and λ, µ ∈ F. Then λv₁ + µv₂ ∈ U ∩ W and
λ(u₁ + w₁) + µ(u₂ + w₂) = (λu₁ + µu₂) + (λw₁ + µw₂) ∈ U + W.
So U ∩ W and U + W are subspaces of V. □

Lecture 2

Definition. Suppose that V is a vector space over F and U is a subspace of V. Then the quotient space V/U is the abelian group V/U equipped with the scalar multiplication F × V/U → V/U given by
λ(v + U) = (λv) + U
for λ ∈ F and v + U ∈ V/U.

Proposition. V/U with the above structure is an F-vector space.

Proof. First suppose v₁ + U = v₂ + U ∈ V/U. Then (v₁ − v₂) ∈ U and so
λv₁ − λv₂ = λ(v₁ − v₂) ∈ U
for each λ ∈ F, since U is a subspace. Thus λv₁ + U = λv₂ + U and the scalar multiplication function F × V/U → V/U is well-defined.
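The well-definedness argument just given can be sanity-checked numerically. Below is a minimal Python sketch (pure standard library; the helper names are mine, not from the notes), taking V = Q³ and U the subspace {x : x₁ + x₂ + x₃ = 0} from the earlier example, and checking that scaling two representatives of the same coset again gives representatives of a single coset.

```python
from fractions import Fraction as Fr

def in_U(x):
    """Membership test for U = {x in Q^3 : x1 + x2 + x3 = 0}."""
    return sum(x) == 0

def add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def scale(lam, v):
    return tuple(lam * a for a in v)

def neg(v):
    return scale(Fr(-1), v)

# Two representatives of the same coset v + U: v2 = v1 + u with u in U.
v1 = (Fr(1), Fr(2), Fr(3))
u = (Fr(2), Fr(-5), Fr(3))
assert in_U(u)
v2 = add(v1, u)

# Same coset: v1 - v2 lies in U ...
assert in_U(add(v1, neg(v2)))

# ... and the scaled representatives still differ by an element of U,
# so lambda*(v + U) := (lambda*v) + U does not depend on the representative.
lam = Fr(5, 7)
assert in_U(add(scale(lam, v1), neg(scale(lam, v2))))
```

Of course this checks a single sample rather than proving anything; the proof above is the general argument.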
Notice that each of the four axioms (i)-(iv) that must be verified to show that this scalar multiplication makes V/U into a vector space is now an almost immediate consequence of the fact that the same axiom holds for the scalar multiplication on V. For example, to see (i) we observe that for v + U ∈ V/U and λ, µ ∈ F,
λ(µ(v + U)) = λ(µv + U) = λ(µv) + U = (λµ)v + U = (λµ)(v + U). □

1.2. Linear independence, bases and the Steinitz exchange lemma.

Definition. Let V be a vector space over F and S ⊂ V a subset of V. Then the span of S in V is the set of all finite F-linear combinations of elements of S,
⟨S⟩ := {Σᵢ₌₁ⁿ λᵢsᵢ : λᵢ ∈ F, sᵢ ∈ S, n ≥ 0}.

Remarks.
(1) ⟨S⟩ only consists of finite linear combinations of elements of S.
(2) For any subset S ⊂ V, ⟨S⟩ is the smallest subspace of V containing S.

Example. Suppose that V is R³. If S = {(1,0,0)ᵀ, (0,1,1)ᵀ, (1,2,2)ᵀ} then ⟨S⟩ = {(a,b,b)ᵀ : a, b ∈ R}. Note also that every subset of S of order 2 has the same span as S.

Example. Let X be a set and for each x ∈ X define δₓ : X → F by δₓ(y) = 1 if y = x and δₓ(y) = 0 if y ≠ x. Then ⟨δₓ : x ∈ X⟩ = {f ∈ F^X : |supp f| < ∞}.

Definition. Let V be a vector space over F and S ⊂ V.
(i) We say that S spans V if V = ⟨S⟩.
(ii) We say that S is linearly independent (LI) if, whenever Σᵢ₌₁ⁿ λᵢsᵢ = 0 with λᵢ ∈ F and sᵢ distinct elements of S, it follows that λᵢ = 0 for all i. If S is not linearly independent then we say that S is linearly dependent (LD).
(iii) We say that S is a basis for V if S spans and is linearly independent. If V has a finite basis we say that V is finite-dimensional.

Note that it is not yet clear that if V is finite-dimensional then all bases must have the same size. So we cannot define the dimension of V yet, even when it is known to be finite-dimensional. Fixing this is our next main goal.

Example. Suppose that V is R³ and S = {(1,0,0)ᵀ, (0,1,1)ᵀ, (1,2,2)ᵀ}. Then S is linearly dependent, since
1·(1,0,0)ᵀ + 2·(0,1,1)ᵀ + (−1)·(1,2,2)ᵀ = 0.
Moreover S does not span V, since (0,0,1)ᵀ is not in ⟨S⟩. However, every subset of S of order 2 is linearly independent and forms a basis for ⟨S⟩.

Remark. Note that no linearly independent set can contain the zero vector, since 1·0 = 0.

Convention. The span of the empty set ⟨∅⟩ is the zero subspace 0. Thus the empty set is a basis of 0. One may consider this to be not so much a convention as the only reasonable interpretation of the definitions of span, linearly independent and basis in this case.

Lemma. A subset S of a vector space V over F is linearly dependent if and only if there exist s₀, s₁, ..., sₙ ∈ S distinct and λ₁, ..., λₙ ∈ F such that s₀ = Σᵢ₌₁ⁿ λᵢsᵢ.

Proof. Suppose that S is linearly dependent, so that Σ λᵢsᵢ = 0 for some sᵢ ∈ S distinct and λᵢ ∈ F with λⱼ ≠ 0, say. Then
sⱼ = Σᵢ≠ⱼ (−λᵢ/λⱼ) sᵢ.
Conversely, if s₀ = Σᵢ₌₁ⁿ λᵢsᵢ then (−1)s₀ + Σᵢ₌₁ⁿ λᵢsᵢ = 0. □

Proposition. Let V be a vector space over F. Then S ⊂ V is a basis for V if and only if every element v ∈ V can be written uniquely as v = Σ_{s∈S} λₛs with λₛ ∈ F and all but finitely many λₛ = 0.

Remark. Note that Σ_{s∈S} λₛs makes sense whenever all but finitely many λₛ are zero, since we are then only summing finitely many non-zero terms — if you are concerned, you may define
Σ_{s∈S} λₛs := Σ_{s∈S, λₛ≠0} λₛs
in this case. We will here and elsewhere abbreviate 'all but finitely many' as almost all.

Proof. First we observe that, by definition, S spans V if and only if every element v of V can be written in at least one way as v = Σ_{s∈S} λₛs with λₛ ∈ F and almost all λₛ = 0. So it suffices to show that S is linearly independent if and only if there is at most one such expression for every v ∈ V.
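The R³ example above can be checked by direct arithmetic. Here is a short Python sketch (plain tuples; the helper name is mine) verifying the dependence relation, and hence exhibiting two distinct representations of 0 of the kind the uniqueness statement rules out for a basis.

```python
def comb(coeffs, vecs):
    """The finite linear combination sum_i coeffs[i] * vecs[i] in R^3."""
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vecs)) for k in range(3))

S = [(1, 0, 0), (0, 1, 1), (1, 2, 2)]

# The dependence relation 1*s1 + 2*s2 + (-1)*s3 = 0 from the example:
assert comb([1, 2, -1], S) == (0, 0, 0)

# So 0 has (at least) two expressions as a linear combination of S:
assert comb([0, 0, 0], S) == comb([1, 2, -1], S)

# S also fails to span R^3: every combination has equal 2nd and 3rd
# entries, so (0,0,1) is not in <S>.
x = comb([3, -2, 5], S)
assert x[1] == x[2]
```

The second assertion is exactly the "two ways to write 0" phenomenon used in the proof of the proposition.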
Suppose that S is linearly independent and v = Σ_{s∈S} λₛs = Σ_{s∈S} µₛs with λₛ, µₛ ∈ F and almost all zero. Then Σ_{s∈S} (λₛ − µₛ)s = 0. Thus, by definition of linear independence, λₛ − µₛ = 0 for all s ∈ S, and so λₛ = µₛ for all s.
Conversely, if S is linearly dependent then we can write
Σ_{s∈S} λₛs = 0 = Σ_{s∈S} 0·s
for some λₛ ∈ F almost all zero but not all zero. Thus there are two ways to write 0 as an F-linear combination of the s. □

The following result is necessary for a good notion of dimension for vector spaces.

Theorem (Steinitz exchange lemma). Let V be a vector space over F. Suppose that S = {e₁, ..., eₙ} is a linearly independent subset of V and T ⊂ V spans V. Then there is a subset D of T of order n such that (T\D) ∪ S spans V. In particular |S| ≤ |T|.

This is sometimes stated as follows (with the assumption that T is finite).

Corollary. If {e₁, ..., eₙ} ⊂ V is linearly independent and {f₁, ..., fₘ} spans V, then n ≤ m and, possibly after reordering the fᵢ, {e₁, ..., eₙ, fₙ₊₁, ..., fₘ} spans V.

Lecture 3

Corollary (Corollary of Steinitz exchange lemma). Let V be a vector space with a basis of order n.
(a) Every basis of V has order n.
(b) Any n LI vectors in V form a basis for V.
(c) Any n vectors in V that span V form a basis for V.
(d) Any set of linearly independent vectors in V can be extended to a basis for V.
(e) Any finite spanning set in V contains a basis for V.

Proof. Suppose that S = {e₁, ..., eₙ} is a basis for V.
(a) Suppose that T is another basis of V. Since S spans V and any finite subset of T is linearly independent, |T| ≤ n. Since T spans and S is linearly independent, |T| ≥ n. Thus |T| = n as required.
(b) Suppose T is a LI subset of V of order n. If T did not span we could choose v ∈ V\⟨T⟩. Then T ∪ {v} is a LI subset of V of order n+1, a contradiction.
(c) Suppose T spans V and has order n.
If T were LD we could find t₀, t₁, ..., tₘ in T distinct such that t₀ = Σᵢ₌₁ᵐ λᵢtᵢ for some λᵢ ∈ F. Thus V = ⟨T⟩ = ⟨T\{t₀}⟩, so T\{t₀} is a spanning set for V of order n−1, a contradiction.
(d) Let T = {t₁, ..., tₘ} be a linearly independent subset of V. Since S spans V, we can find s₁, ..., sₘ in S such that (S\{s₁, ..., sₘ}) ∪ T spans V. Since this set has order (at most) n, it is a basis containing T.
(e) Suppose that T is a finite spanning set for V, and let T′ ⊂ T be a subset of minimal size that still spans V. If |T′| = n we're done by (c). Otherwise |T′| > n and so T′ is LD, as S spans. Thus there are t₀, ..., tₘ in T′ distinct such that t₀ = Σ λᵢtᵢ for some λᵢ ∈ F. Then V = ⟨T′⟩ = ⟨T′\{t₀}⟩, contradicting the minimality of T′. □

Exercise. Prove that (e) holds for any spanning set in a finite-dimensional V.

We prove the theorem by replacing elements of T by elements of S one by one.

Proof of the Steinitz exchange lemma. Suppose that we've already found a subset Dᵣ of T of order 0 ≤ r < n such that Tᵣ := (T\Dᵣ) ∪ {e₁, ..., eᵣ} spans V (the case r = 0 is clear, and the case r = n is the result). Then we can write
eᵣ₊₁ = Σᵢ₌₁ᵏ λᵢtᵢ
with λᵢ ∈ F and tᵢ ∈ Tᵣ. Since {e₁, ..., eᵣ₊₁} is linearly independent, there must be some 1 ≤ j ≤ k such that λⱼ ≠ 0 and tⱼ ∉ {e₁, ..., eᵣ}. Let Dᵣ₊₁ = Dᵣ ∪ {tⱼ} and
Tᵣ₊₁ = (T\Dᵣ₊₁) ∪ {e₁, ..., eᵣ₊₁} = (Tᵣ\{tⱼ}) ∪ {eᵣ₊₁}.
Now
tⱼ = (1/λⱼ)eᵣ₊₁ − Σᵢ≠ⱼ (λᵢ/λⱼ)tᵢ,
so tⱼ ∈ ⟨Tᵣ₊₁⟩ and ⟨Tᵣ₊₁⟩ = ⟨Tᵣ₊₁ ∪ {tⱼ}⟩ ⊃ ⟨Tᵣ⟩ = V.
Now we can inductively construct D = Dₙ with the required properties. □

Definition. If a vector space V over F is finite-dimensional with basis S, we define the dimension of V by dim_F V = dim V = |S|.

Remarks.
(1) By the last corollary, the dimension of a finite-dimensional space V does not depend on the choice of basis S. However, the dimension does depend on F. For example, C has dimension 1 viewed as a vector space over C (since {1} is a basis) but dimension 2 viewed as a vector space over R (since {1, i} is a basis).
(2) If we wanted to be more precise then we could define the dimension of an infinite-dimensional space to be the cardinality of any basis for V. We have not proven enough to see that this would be well-defined; in fact there are no problems.

Lemma. If V is finite-dimensional and U ⊂ V is a subspace, then U is also finite-dimensional and dim U ≤ dim V.

Proof. Let S ⊂ U be a LI subset of U of maximal possible size. Then every finite subset of S has size at most dim V (by the Steinitz exchange lemma). Thus |S| ≤ dim V. If u ∈ U\⟨S⟩ then S ∪ {u} is LI, contradicting the maximality of S. Thus U = ⟨S⟩ and S is a basis for U. □

Proposition. If V is a finite-dimensional vector space over F and U is a subspace, then dim V = dim U + dim V/U.

Proof. Since dimension is defined in terms of bases, and at this stage of the course we have no way to compute it except by finding bases and counting the number of elements, we must find suitable bases. The key idea is to be careful about how we choose our bases.

Slogan. When choosing bases always choose the best basis for the job.

Let {u₁, ..., uₘ} be a basis for U and extend it to a basis {u₁, ..., uₘ, vₘ₊₁, ..., vₙ} for V. It suffices to show that S := {vₘ₊₁ + U, ..., vₙ + U} is a basis for V/U.
Suppose that v + U ∈ V/U. Then we can write
v = Σᵢ₌₁ᵐ λᵢuᵢ + Σⱼ₌ₘ₊₁ⁿ µⱼvⱼ.
Thus
v + U = Σᵢ₌₁ᵐ λᵢ(uᵢ + U) + Σⱼ₌ₘ₊₁ⁿ µⱼ(vⱼ + U) = 0 + Σⱼ₌ₘ₊₁ⁿ µⱼ(vⱼ + U),
and so S spans V/U.
To show that S is LI, suppose that
Σⱼ₌ₘ₊₁ⁿ µⱼ(vⱼ + U) = 0.
Then Σⱼ₌ₘ₊₁ⁿ µⱼvⱼ ∈ U, so we can write Σⱼ₌ₘ₊₁ⁿ µⱼvⱼ = Σᵢ₌₁ᵐ λᵢuᵢ for some λᵢ ∈ F. Since the set {u₁, ..., uₘ, vₘ₊₁, ..., vₙ} is LI, we deduce that each µⱼ (and λᵢ) is zero, as required. □

Corollary. If U is a proper subspace of a finite-dimensional V then dim U < dim V.

Proof. Since U is proper, V/U is non-zero and so the empty set does not span V/U. Thus dim V/U > 0 and dim U = dim V − dim V/U < dim V. □

Exercise. Prove this last corollary directly by strengthening the proof that U ≤ V implies dim U ≤ dim V.

Lecture 4

1.3. Direct sum. There are two related notions of direct sum of vector spaces, and the distinction between them can often cause confusion to newcomers to the subject. The first is sometimes known as the internal direct sum and the latter as the external direct sum. However, it is common to gloss over the difference between them.

Definition. Suppose that V is a vector space over F and U and W are subspaces of V. Recall that the sum of U and W is defined to be
U + W = {u + w : u ∈ U, w ∈ W}.
We say that V is the (internal) direct sum of U and W, written V = U ⊕ W, if V = U + W and U ∩ W = 0. Equivalently, V = U ⊕ W if every element v ∈ V can be written uniquely as u + w with u ∈ U and w ∈ W. We also say that U and W are complementary subspaces in V.

Example. Suppose that V = R³ and
U = {(x₁, x₂, x₃)ᵀ : x₁ + x₂ + x₃ = 0}, W₁ = ⟨(1,1,1)ᵀ⟩ and W₂ = ⟨(1,0,0)ᵀ⟩;
then V = U ⊕ W₁ = U ⊕ W₂. Note in particular that U does not have only one complementary subspace in V.

Definition. Given any two vector spaces U and W over F, the (external) direct sum U ⊕ W of U and W is defined to be the set of pairs {(u, w) : u ∈ U, w ∈ W}, with addition given by
(u₁, w₁) + (u₂, w₂) = (u₁ + u₂, w₁ + w₂)
and scalar multiplication given by λ(u, w) = (λu, λw).

Exercise. Show that U ⊕ W is a vector space over F with the given operations, and that it is the internal direct sum of its subspaces {(u, 0) : u ∈ U} and {(0, w) : w ∈ W}.

More generally we can make the following definitions.

Definition.
If U₁, ..., Uₙ are subspaces of V then V is the (internal) direct sum of U₁, ..., Uₙ, written
V = U₁ ⊕ ··· ⊕ Uₙ = ⊕ᵢ₌₁ⁿ Uᵢ,
if every element v of V can be written uniquely as v = Σᵢ₌₁ⁿ uᵢ with uᵢ ∈ Uᵢ.

Definition. If U₁, ..., Uₙ are any vector spaces over F, their (external) direct sum is the vector space
⊕ᵢ₌₁ⁿ Uᵢ := {(u₁, ..., uₙ) : uᵢ ∈ Uᵢ}
with natural coordinate-wise operations.

From now on we will drop the adjectives 'internal' and 'external' from 'direct sum'.

2. Linear maps

2.1. Definitions and examples.

Definition. Suppose that U and V are vector spaces over a field F. Then a function α : U → V is a linear map if
(i) α(u₁ + u₂) = α(u₁) + α(u₂) for all u₁, u₂ ∈ U;
(ii) α(λu) = λα(u) for all u ∈ U and λ ∈ F.

Notation. We write L(U, V) for the set of linear maps U → V.

Remarks.
(1) We can combine the two parts of the definition into one: α is linear if and only if α(λu₁ + µu₂) = λα(u₁) + µα(u₂) for all λ, µ ∈ F and u₁, u₂ ∈ U. Linear maps should be viewed as functions between vector spaces that respect their structure as vector spaces.
(2) If α is a linear map then α is a homomorphism of the underlying abelian groups. In particular α(0) = 0.
(3) If we want to stress the field F then we will say a map is F-linear. For example, complex conjugation defines an R-linear map from C to C, but it is not C-linear.

Examples.
(1) Let A be an n×m matrix with coefficients in F — write A ∈ M_{n,m}(F). Then α : F^m → F^n; α(v) = Av is a linear map. To see this, let λ, µ ∈ F and u, v ∈ F^m. As usual, let A_{ij} denote the ijth entry of A and uⱼ (resp. vⱼ) the jth coordinate of u (resp. v). Then for 1 ≤ i ≤ n,
(α(λu + µv))ᵢ = Σⱼ₌₁ᵐ A_{ij}(λuⱼ + µvⱼ) = λα(u)ᵢ + µα(v)ᵢ,
so α(λu + µv) = λα(u) + µα(v) as required.
(2) If X is any set and g ∈ F^X then m_g : F^X → F^X; m_g(f)(x) := g(x)f(x) for x ∈ X is linear.
(3) For all x ∈ [a,b], δₓ : C([a,b], R) → R; f ↦ f(x) is linear.
(4) I : C([a,b], R) → C([a,b], R); I(f)(x) = ∫ₐˣ f(t) dt is linear.
(5) D : C^∞([a,b], R) → C^∞([a,b], R); (Df)(t) = f′(t) is linear.
(6) If α, β : U → V are linear and λ ∈ F, then α + β : U → V given by (α+β)(u) = α(u) + β(u) and λα : U → V given by (λα)(u) = λ(α(u)) are linear. In this way L(U, V) is a vector space over F.

Definition. We say that a linear map α : U → V is an isomorphism if there is a linear map β : V → U such that βα = id_U and αβ = id_V.

Lecture 5

Lemma. Suppose that U and V are vector spaces over F. A linear map α : U → V is an isomorphism if and only if α is a bijection.

Proof. Certainly an isomorphism α : U → V is a bijection, since it has an inverse as a function between the underlying sets U and V. Suppose that α : U → V is a linear bijection and let β : V → U be its inverse as a function. We must show that β is also linear. Let λ, µ ∈ F and v₁, v₂ ∈ V. Then
αβ(λv₁ + µv₂) = λαβ(v₁) + µαβ(v₂) = α(λβ(v₁) + µβ(v₂)).
Since α is injective, it follows that β(λv₁ + µv₂) = λβ(v₁) + µβ(v₂), i.e. β is linear as required. □

Proposition. Suppose that α : U → V is an F-linear map.
(a) If α is injective and S ⊂ U is linearly independent, then α(S) ⊂ V is linearly independent.
(b) If α is surjective and S ⊂ U spans U, then α(S) spans V.
(c) If α is an isomorphism and S is a basis, then α(S) is a basis.

Proof. (a) Suppose α is injective, S ⊂ U and α(S) is linearly dependent. Then there are s₀, ..., sₙ ∈ S distinct and λ₁, ..., λₙ ∈ F such that
α(s₀) = Σᵢ₌₁ⁿ λᵢα(sᵢ) = α(Σᵢ₌₁ⁿ λᵢsᵢ).
Since α is injective it follows that s₀ = Σᵢ₌₁ⁿ λᵢsᵢ, and so S is LD.
(b) Now suppose that α is surjective, S ⊂ U spans U, and let v ∈ V. There is u ∈ U such that α(u) = v, and there are s₁, ..., sₙ ∈ S and λ₁, ..., λₙ ∈ F such that Σ λᵢsᵢ = u. Then Σ λᵢα(sᵢ) = v. Thus α(S) spans V.
(c) Follows immediately from (a) and (b). □
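For matrix maps the lemma above is very concrete: α(v) = Av is a bijection exactly when A is invertible, and the inverse function β is again given by a matrix, hence linear. A small Python sketch for a 2×2 example over Q (pure standard library; the matrix and helper name are my own choices):

```python
from fractions import Fraction as Fr

def matvec(A, v):
    """Apply the linear map v -> Av."""
    return tuple(sum(a * x for a, x in zip(row, v)) for row in A)

# alpha: Q^2 -> Q^2 given by an invertible matrix A (det A = 1)
A = [[Fr(2), Fr(1)],
     [Fr(1), Fr(1)]]

# The inverse matrix, giving the map beta:
B = [[Fr(1), Fr(-1)],
     [Fr(-1), Fr(2)]]

# beta.alpha = id and alpha.beta = id on sample vectors, so alpha is an
# isomorphism with inverse beta.
for v in [(Fr(3), Fr(5)), (Fr(-1), Fr(0)), (Fr(7), Fr(-2))]:
    assert matvec(B, matvec(A, v)) == v
    assert matvec(A, matvec(B, v)) == v

# And beta is itself linear, as the lemma predicts:
v1, v2, lam = (Fr(1), Fr(2)), (Fr(0), Fr(3)), Fr(4)
lhs = matvec(B, tuple(lam * a + b for a, b in zip(v1, v2)))
rhs = tuple(lam * a + b for a, b in zip(matvec(B, v1), matvec(B, v2)))
assert lhs == rhs
```

By part (c) of the proposition, this α therefore carries any basis of Q² to a basis of Q².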

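The proof of the Steinitz exchange lemma in Section 1.2 is effectively an algorithm: express the next eᵣ₊₁ in terms of the current spanning set and swap it in for a suitable tⱼ with non-zero coefficient. Here is a Python sketch over Q (pure standard library; the solver and function names are my own, not from the notes):

```python
from fractions import Fraction

def solve(vs, target):
    """Coefficients c with sum_j c[j]*vs[j] == target, or None if target
    is not in the span of vs.  Gaussian elimination over Q."""
    n, m = len(target), len(vs)
    # Augmented matrix whose columns are the vectors vs, then the target.
    A = [[Fraction(vs[j][i]) for j in range(m)] + [Fraction(target[i])]
         for i in range(n)]
    row, pivots = 0, []
    for col in range(m):
        piv = next((r for r in range(row, n) if A[r][col] != 0), None)
        if piv is None:
            continue
        A[row], A[piv] = A[piv], A[row]
        A[row] = [x / A[row][col] for x in A[row]]
        for r in range(n):
            if r != row and A[r][col] != 0:
                A[r] = [a - A[r][col] * b for a, b in zip(A[r], A[row])]
        pivots.append(col)
        row += 1
    if any(A[r][m] != 0 for r in range(row, n)):
        return None                      # inconsistent: not in the span
    coeffs = [Fraction(0)] * m
    for r, col in enumerate(pivots):
        coeffs[col] = A[r][m]
    return coeffs

def steinitz(S, T):
    """Replace |S| elements of the spanning list T by the elements of the
    linearly independent list S, following the exchange proof."""
    T_cur = list(T)
    for r, e in enumerate(S):
        c = solve(T_cur, e)              # e_{r+1} in terms of T_r
        # pick j with nonzero coefficient and t_j not among e_1, ..., e_r
        j = next(j for j, cj in enumerate(c)
                 if cj != 0 and T_cur[j] not in S[:r])
        T_cur[j] = e                     # swap t_j out for e_{r+1}
    return T_cur

S = [(1, 1, 0), (0, 1, 1)]               # linearly independent
T = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]    # spans Q^3
T_new = steinitz(S, T)
assert all(e in T_new for e in S)
# (T \ D) ∪ S still spans: each standard basis vector lies in its span.
assert all(solve(T_new, b) is not None for b in T)
```

This is a sketch under the lemma's hypotheses (S linearly independent, T spanning); the proof guarantees that the index j sought in `steinitz` always exists.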