
Linear Algebra (PDF)

436 pages · 2016 · 5.623 MB · English

Preview: Linear Algebra

Linear Algebra

David Cherney, Tom Denton, Rohit Thomas and Andrew Waldron
Edited by Katrina Glaeser and Travis Scrimshaw
First Edition. Davis, California, 2013.

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Contents

1  What is Linear Algebra?
   1.1  Organizing Information
   1.2  What are Vectors?
   1.3  What are Linear Functions?
   1.4  So, What is a Matrix?
        1.4.1  Matrix Multiplication is Composition of Functions
        1.4.2  The Matrix Detour
   1.5  Review Problems

2  Systems of Linear Equations
   2.1  Gaussian Elimination
        2.1.1  Augmented Matrix Notation
        2.1.2  Equivalence and the Act of Solving
        2.1.3  Reduced Row Echelon Form
        2.1.4  Solution Sets and RREF
   2.2  Review Problems
   2.3  Elementary Row Operations
        2.3.1  EROs and Matrices
        2.3.2  Recording EROs in (M|I)
        2.3.3  The Three Elementary Matrices
        2.3.4  LU, LDU, and PLDU Factorizations
   2.4  Review Problems
   2.5  Solution Sets for Systems of Linear Equations
        2.5.1  The Geometry of Solution Sets: Hyperplanes
        2.5.2  Particular Solution + Homogeneous Solutions
        2.5.3  Solutions and Linearity
   2.6  Review Problems

3  The Simplex Method
   3.1  Pablo's Problem
   3.2  Graphical Solutions
   3.3  Dantzig's Algorithm
   3.4  Pablo Meets Dantzig
   3.5  Review Problems

4  Vectors in Space, n-Vectors
   4.1  Addition and Scalar Multiplication in R^n
   4.2  Hyperplanes
   4.3  Directions and Magnitudes
   4.4  Vectors, Lists and Functions: R^S
   4.5  Review Problems

5  Vector Spaces
   5.1  Examples of Vector Spaces
        5.1.1  Non-Examples
   5.2  Other Fields
   5.3  Review Problems

6  Linear Transformations
   6.1  The Consequence of Linearity
   6.2  Linear Functions on Hyperplanes
   6.3  Linear Differential Operators
   6.4  Bases (Take 1)
   6.5  Review Problems

7  Matrices
   7.1  Linear Transformations and Matrices
        7.1.1  Basis Notation
        7.1.2  From Linear Operators to Matrices
   7.2  Review Problems
   7.3  Properties of Matrices
        7.3.1  Associativity and Non-Commutativity
        7.3.2  Block Matrices
        7.3.3  The Algebra of Square Matrices
        7.3.4  Trace
   7.4  Review Problems
   7.5  Inverse Matrix
        7.5.1  Three Properties of the Inverse
        7.5.2  Finding Inverses (Redux)
        7.5.3  Linear Systems and Inverses
        7.5.4  Homogeneous Systems
        7.5.5  Bit Matrices
   7.6  Review Problems
   7.7  LU Redux
        7.7.1  Using LU Decomposition to Solve Linear Systems
        7.7.2  Finding an LU Decomposition
        7.7.3  Block LDU Decomposition
   7.8  Review Problems

8  Determinants
   8.1  The Determinant Formula
        8.1.1  Simple Examples
        8.1.2  Permutations
   8.2  Elementary Matrices and Determinants
        8.2.1  Row Swap
        8.2.2  Row Multiplication
        8.2.3  Row Addition
        8.2.4  Determinant of Products
   8.3  Review Problems
   8.4  Properties of the Determinant
        8.4.1  Determinant of the Inverse
        8.4.2  Adjoint of a Matrix
        8.4.3  Application: Volume of a Parallelepiped
   8.5  Review Problems

9  Subspaces and Spanning Sets
   9.1  Subspaces
   9.2  Building Subspaces
   9.3  Review Problems

10 Linear Independence
   10.1  Showing Linear Dependence
   10.2  Showing Linear Independence
   10.3  From Dependent Independent
   10.4  Review Problems

11 Basis and Dimension
   11.1  Bases in R^n
   11.2  Matrix of a Linear Transformation (Redux)
   11.3  Review Problems

12 Eigenvalues and Eigenvectors
   12.1  Invariant Directions
   12.2  The Eigenvalue-Eigenvector Equation
   12.3  Eigenspaces
   12.4  Review Problems

13 Diagonalization
   13.1  Diagonalizability
   13.2  Change of Basis
   13.3  Changing to a Basis of Eigenvectors
   13.4  Review Problems

14 Orthonormal Bases and Complements
   14.1  Properties of the Standard Basis
   14.2  Orthogonal and Orthonormal Bases
        14.2.1  Orthonormal Bases and Dot Products
   14.3  Relating Orthonormal Bases
   14.4  Gram-Schmidt & Orthogonal Complements
        14.4.1  The Gram-Schmidt Procedure
   14.5  QR Decomposition
   14.6  Orthogonal Complements
   14.7  Review Problems

15 Diagonalizing Symmetric Matrices
   15.1  Review Problems

16 Kernel, Range, Nullity, Rank
   16.1  Range
   16.2  Image
        16.2.1  One-to-one and Onto
        16.2.2  Kernel
   16.3  Summary
   16.4  Review Problems

17 Least Squares and Singular Values
   17.1  Projection Matrices
   17.2  Singular Value Decomposition
   17.3  Review Problems

A  List of Symbols
B  Fields
C  Online Resources
D  Sample First Midterm
E  Sample Second Midterm
F  Sample Final Exam
G  Movie Scripts
   G.1   What is Linear Algebra?
   G.2   Systems of Linear Equations
   G.3   Vectors in Space, n-Vectors
   G.4   Vector Spaces
   G.5   Linear Transformations
   G.6   Matrices
   G.7   Determinants
   G.8   Subspaces and Spanning Sets
   G.9   Linear Independence
   G.10  Basis and Dimension
   G.11  Eigenvalues and Eigenvectors
   G.12  Diagonalization
   G.13  Orthonormal Bases and Complements
   G.14  Diagonalizing Symmetric Matrices
   G.15  Kernel, Range, Nullity, Rank
   G.16  Least Squares and Singular Values

Index

1  What is Linear Algebra?

Many difficult problems can be handled easily once relevant information is organized in a certain way. This text aims to teach you how to organize information in cases where certain mathematical structures are present. Linear algebra is, in general, the study of those structures. Namely,

   Linear algebra is the study of vectors and linear functions.

In broad terms, vectors are things you can add, and linear functions are functions of vectors that respect vector addition. The goal of this text is to teach you to organize information about vector spaces in a way that makes problems involving linear functions of many variables easy. (Or at least tractable.)

To get a feel for the general idea of organizing information, of vectors, and of linear functions, this chapter has brief sections on each. We start here in hopes of putting students in the right mindset for the odyssey that follows; the later chapters cover the same material at a slower pace. Please be prepared to change the way you think about some familiar mathematical objects, and keep a pencil and a piece of paper handy!

1.1  Organizing Information

Functions of several variables are often presented in one line, such as

   f(x, y) = 3x + 5y.

But let's think carefully: what is the left-hand side of this equation doing? Functions and equations are different mathematical objects, so why is the equal sign necessary?
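As an aside not in the original text, the claim that a linear function "respects vector addition" can be spot-checked numerically for f(x, y) = 3x + 5y. The sample vectors u, v and the scalar c below are arbitrary choices of mine:

```python
# Sketch: check that f(x, y) = 3x + 5y respects vector addition,
# i.e. f(u + v) = f(u) + f(v), and scalar multiplication, f(c*u) = c*f(u).

def f(x, y):
    return 3 * x + 5 * y

u = (1, 2)
v = (4, -3)
c = 7

# Add u and v component-wise, then apply f: same as applying f to each first.
assert f(u[0] + v[0], u[1] + v[1]) == f(*u) + f(*v)
# Scaling the input scales the output by the same factor.
assert f(c * u[0], c * u[1]) == c * f(*u)
```

The same check fails for a rule like g(x, y) = 3x + 5y + 1 (the constant term breaks additivity), which is one way to see that "linear" here is more restrictive than "the graph is a line".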
A Sophisticated Review of Functions

If someone says "Consider the function of two variables 7β − 13b", we do not quite have all the information we need to determine the relationship between inputs and outputs.

Example 1 (Of organizing and reorganizing information)

You own stock in 3 companies: Google, Netflix, and Apple. The value V of your stock portfolio as a function of the numbers of shares s_N, s_G, s_A you own in these companies is

   24 s_G + 80 s_A + 35 s_N.

Here is an ill-posed question: what is V of the column of numbers

   (1)
   (2)
   (3) ?

The column of three numbers is ambiguous! Is it meant to denote

• 1 share of G, 2 shares of N and 3 shares of A?
• 1 share of N, 2 shares of G and 3 shares of A?

Do we multiply the first number of the input by 24 or by 35? No one has specified an order for the variables, so we do not know how to calculate an output associated with a particular input.[1] A different notation for V can clear this up; we can denote V itself as an ordered triple of numbers that reminds us what to do to each number from the input.

[1] Of course we would know how to calculate an output if the input were described in a tedious form such as "1 share of G, 2 shares of N and 3 shares of A", but that is unacceptably tedious! We want to use ordered triples of numbers to concisely describe inputs.
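The ambiguity in Example 1 can be made concrete in code. This is an illustrative sketch, not from the book: the per-share values 24, 80, and 35 appear in the example, but the function names are mine, and assigning 24 to Google, 35 to Netflix, and 80 to Apple is my assumption about which price goes with which company.

```python
# Illustrative sketch (not from the book). Per-share prices 24, 80, 35 are
# from Example 1; the assignment 24 -> Google, 35 -> Netflix, 80 -> Apple
# is an assumption made for this sketch.

def V_ordered(shares):
    # A bare triple: the caller must simply *know* the slot order (G, N, A).
    s_G, s_N, s_A = shares
    return 24 * s_G + 35 * s_N + 80 * s_A

def V_labelled(shares):
    # Labelling each number by company removes the ambiguity entirely.
    return 24 * shares["G"] + 35 * shares["N"] + 80 * shares["A"]

# The two readings of the column (1, 2, 3) give different portfolio values:
reading_1 = V_labelled({"G": 1, "N": 2, "A": 3})  # 1 G, 2 N, 3 A
reading_2 = V_labelled({"N": 1, "G": 2, "A": 3})  # 1 N, 2 G, 3 A
assert reading_1 != reading_2
```

Once an order for the variables has been agreed, V_ordered((1, 2, 3)) is unambiguous; fixing such an order is exactly the role the "ordered triple" notation for V plays in the text.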
