
Theory and Application of the Linear Model PDF

237 pages · 35.295 MB · English

Preview Theory and Application of the Linear Model

AN INTRODUCTION TO LINEAR STATISTICAL MODELS

Volume I

FRANKLIN A. GRAYBILL
Professor of Mathematical Statistics
Colorado State University
Fort Collins, Colorado

McGRAW-HILL BOOK COMPANY, INC.
New York  Toronto  London  1961

TO JEANNE

Copyright © 1961 by the McGraw-Hill Book Company, Inc. Printed in the United States of America. All rights reserved. This book, or parts thereof, may not be reproduced in any form without permission of the publishers.

Library of Congress Catalog Card Number 60-9841
ISBN 07-024331-x

Preface

This book was written with the intention of fulfilling three needs: (1) for a theory textbook in experimental statistics, for undergraduates or first-year graduate students; (2) for a reference book in the area of regression, correlation, least squares, experimental design, etc., for consulting statisticians with limited mathematical training; and (3) for a reference book for experimenters with limited mathematical training who use statistics in their research. This is not a book on mathematics, neither is it an advanced book on statistical theory. It is intended to be an introductory mathematical treatment of topics which are important to experimenters, statistical consultants, and those who are training to be statistical consultants. The only mathematics that is required is generally obtained before the senior year in college. The mathematical level of this book is about the same as "An Introduction to the Theory of Statistics" by A. M. Mood, except for the fact that a great deal of use is made of matrix theory. Chapter 1 is a review of the mathematical concepts, especially matrices, that will be used in the book.
For students who have had a course in matrix algebra, Chapter 1 can be omitted, since in most cases the theorems are stated without proof and simple problems are given to illustrate them. Chapter 2 is a review of those statistical concepts which are usually obtained in a first course in mathematical statistics. In these two chapters the important concepts are presented in the form of theorems so that they can be referred to as they are needed in later chapters. In the remainder of the book some of the proofs are omitted, but most of them are presented in detail, since this book was written for those who will have limited mathematical training.

Among other things, the statistical consultant is involved in aiding in the interpretation of observations that are collected by experimenters. This interpretation can often be materially improved by the use of "statistical models." This use involves three parts. The first is the specification of the structure of the observations, or, in other words, the selection of a model to represent the real-world situation that the observations describe. The second part is the mathematical treatment of the model to obtain results about various components of it. This is generally in the nature of point estimation, interval estimation, hypothesis testing, etc. The third part is the use of the results of the mathematical treatment of the model to make decisions in the real world. This book is concerned only with the second part, namely, the mathematical treatment of statistical models, and no attempt is made to justify any model for a given real-world situation. Five important statistical models are discussed, which are the basis for many of the applications in statistics, and these five models are presented from the point of view of infinite-model theory. Throughout the book emphasis is placed on the power of tests and on the width of confidence intervals.

A set of notes from which this book was taken has served as a two-semester course for seniors and first-year graduate students at Oklahoma State University; the prerequisites are one year of calculus, one semester of matrix algebra, one year of introductory statistical theory from a book such as "Introduction to the Theory of Statistics," by A. M. Mood, or "Statistical Theory in Research," by R. L. Anderson and T. A. Bancroft, and one year of statistical methods. Many of the students who were not statistics majors had no formal course in matrix theory, but some had audited a matrix course, and some were self-taught. These students were required to work the pertinent problems in Chapter 1.

Volume II will contain such topics as sample size, multiple comparisons, multivariate analysis of variance, response surfaces, discriminant functions, partially balanced incomplete block designs, orthogonal latin squares, randomization theory, split-plot models, and some nonlinear models.

This book was written with the direct and indirect help of many people. I owe a great debt to my mother, Mrs. Lula Graybill, and to my two sisters and their husbands, Mr. and Mrs. Wayne Carr and Mr. and Mrs. Homer Good, for their help and encouragement. Had it not been for them, I probably would not have been able to attend college. I want to thank Professor Carl E. Marshall for introducing me to statistics and for encouraging me to do further work in this field, and Professor Oscar Kempthorne, first for his help during my graduate work at Iowa State College and later for reading the entire manuscript. I also want to express my appreciation to Professor L. Wayne Johnson, the head of the Mathematics Department of Oklahoma State University, who made personnel available for typing and proofreading of the manuscript, and to David Weeks and V. Seshadri, graduate students in statistics at Oklahoma State University, who read the entire manuscript.

I wish to thank the typists who helped to put my handwritten notes into typescript: Miss Nancy Adams, Miss Carolyn Hundley, Mrs. Carlotta Fine, and Miss Biruta Stakle.

I am indebted to Mr. F. W. B—, Director of the National Bureau of Standards at Boulder, Colorado, for permission to reproduce a portion of the material in NBS report 5069, "Graphs and Tables of the Percentage Points F(v1, v2, P) for the Fisher-Snedecor Variance Ratio," to be published by Lewis E. Vogler and Kenneth A. Norton.

I am also indebted to Professor E. S. Pearson, J. Neyman, P. C. Tang, and University College for permission to reprint the tables from "The Power Function of the Analysis of Variance Tests with Tables and Illustrations for Their Use," from Statistical Research Memoirs, volume I.

I am also indebted to the Biometrika Trustees for permission to reproduce a portion of "Tables of Percentage Points of the Inverted Beta (F) Distribution" by M. Merrington and C. M. Thompson.

Finally, I wish to express my thanks to my wife, Jeanne, who relieved me of most of the household chores during the many evenings that I worked on the manuscript.

Franklin A. Graybill

Contents

Preface  vii

Chapter 1. Mathematical Concepts  1
1.1 Matrices  1
1.2 Quadratic Forms  3
1.3 Determinants  6
1.4 Miscellaneous Theorems on Matrices  6
1.5 The Derivatives of Matrices and Vectors  11
1.6 Idempotent Matrices  12
1.7 Maxima, Minima, and Jacobians  17

Chapter 2. Statistical Concepts  27
2.1 Sample Space, Random Variable, Frequency Function  27
2.2 Statistical Inference  32
2.3 Point Estimation  33
2.4 Interval Estimation  37
2.5 Testing Hypotheses  39
2.6 Conclusion  46

Chapter 3. The Multivariate Normal Distribution  48
3.1 Definition  48
3.2 Marginal Distributions  51
3.3 Moments of the Multivariate Normal  53
3.4 Linear Functions of Normal Variables  56
3.5 Independence  57
3.6 The Conditional Distribution  62
3.7 Additional Theorems  67

Chapter 4. Distribution of Quadratic Forms  74
4.1 Introduction  74
4.2 Noncentral Chi-square  75
4.3 Noncentral F  77
4.4 Distribution of Quadratic Forms  82
4.5 Independence of Quadratic Forms  84
4.6 Independence of Linear and Quadratic Forms  86
4.7 Expected Value of Quadratic Forms  87
4.8 Additional Theorems  88

Chapter 5. Linear Models  93
5.1 Introduction  93
5.2 Linear Models  96
5.3 Model Classification  103

Chapter 6. Model 1: The General Linear Hypothesis of Full Rank  106
6.1 Introduction  106
6.2 Point Estimation  110
6.3 Interval Estimation  120
6.4 Tests of Hypotheses  128

Chapter 7. Computing Techniques  149
7.1 Introduction  149
7.2 Solving a System of Symmetric Equations  149
7.3 Computing the Inverse of a Symmetric Matrix  155

Chapter 8. Polynomial or Curvilinear Models  165
8.1 Introduction  165
8.2 Estimating and Testing Coefficients in a Polynomial Model  165
8.3 Finding the Degree of a Polynomial that Describes a Given Set of Data  166
8.4 Orthogonal Polynomials  172
8.5 Repeated Observations for Each X  183

Chapter 9. Model 2: Functional Relationships  186
9.1 Introduction and Definitions  186
9.2 Point Estimation  187
9.3 Interval Estimation and Tests of Hypotheses  193

Chapter 10. Model 3: Regression Models  195
10.1 Introduction  195
10.2 Case 1: The Multivariate Normal  197
10.3 Correlation  206
10.4 Case 2, Model 3  216
10.5 Case 3, Model 3  217

Chapter 11. Model 4: Experimental Design Models  223
11.1 Introduction  223
11.2 Point Estimation  226
11.3 Interval Estimation  241
11.4 Testing Hypotheses  241
11.5 Normal Equations and Computing  245
11.6 Optimum Properties of the Tests of Hypotheses  250

Chapter 12. The Cross-classification or Factorial Model  254
12.1 Introduction  254
12.2 The One-way Classification Model  255
12.3 Two-way Cross Classification: No Interaction  258
12.4 N-way Cross Classification: No Interaction  261
12.5 Two-way Classification with Interaction  265
12.6 N-way Classification with Interaction  272
12.7 2^n Factorial Models  279
12.8 Using Interaction Sum of Squares for Error Sum of Squares  283

Chapter 13. Two-way Classification with Unequal Numbers in Subclasses  287
13.1 Two-way Classification Model with No Interaction  287
13.2 Computing Instructions  302

Chapter 14. Incomplete Block Models  306
14.1 Introduction  306
14.2 Point Estimation  309
14.3 Interval Estimation and Tests of Hypotheses  312
14.4 Computing  313

Chapter 15. Some Additional Topics about Model 4  318
15.1 Introduction  318
15.2 Assumptions for Model 4  318
15.3 Tests of Hypotheses  320
15.4 Test for Additivity  324
15.5 Transformation  332

Chapter 16. Model 5: Variance Components; Point Estimation  337
16.1 One-way Classification: Equal Subclass Numbers  338
16.2 The General Balanced Case of Model 5  347
16.3 Two-way Classification  348
16.4 The Balanced Twofold and Fourfold Nested Classification of Model 5  349
16.5 One-way Classification with Unequal Numbers in the Groups  351
16.6 Twofold Nested Classification in Model 5 with Unequal Numbers in the Subclasses  354
16.7 The Unbalanced Two-way Classification in Model 5  359
16.8 The Balanced Incomplete Block in Model 5  362

Chapter 17. Model 5: Variance Components; Interval Estimation and Tests of Hypotheses  368
17.1 Distribution of a Linear Combination of Chi-square Variates  368
17.2 Tests of Hypotheses  374
17.3 Ratio of Variances  378

Chapter 18. Mixed Models  383
18.1 Covariance  383
18.2 Two-way Classification Model with Interaction and with Fixed and Random Effects  396
18.3 Balanced Incomplete Block with Blocks Random: Recovery of Interblock Information  403

Appendix. Tables  421

Index  461
1  Mathematical Concepts

1.1 Matrices

The theory of linear statistical models that will be developed in this book will require some of the mathematical techniques of calculus, matrix algebra, etc. In this chapter we shall give some of the important mathematical theorems that are necessary to develop this theory. Most of the theorems in this chapter will be given without proof; for some, however, the proof will be supplied.

A matrix A will have elements denoted by a_ij, where i refers to the row and j to the column. If A denotes a matrix, then A' will denote the transpose of A, and A^-1 will denote the inverse of A. The symbol |A| will be used to denote the determinant of A. The identity matrix will be denoted by I, and 0 will denote the null matrix, i.e., a matrix whose elements are all zeros. The dimension of a matrix is the number of its rows by the number of its columns. For example, a matrix A of dimension n × m, or an n × m matrix A, will be a matrix A with n rows and m columns. If m = 1, the matrix will be called an n × 1 vector. The rank of the matrix A will sometimes be denoted by ρ(A).

Given the matrices A = (a_ij) and B = (b_ij), the product AB = C = (c_ij) is defined as the matrix with pq-th element equal to Σ_{s=1}^{n} a_ps b_sq. For AB to be defined, the number of columns in A must equal the number of rows in B. For A + B to be defined, A and B must have the same dimension; A + B = C gives c_ij = a_ij + b_ij. If k is a scalar and A is a matrix, then kA means the matrix such that each element is the corresponding element of A multiplied by k.

A diagonal matrix D is defined as a square matrix whose off-diagonal elements are all zero; that is, if D = (d_ij), then d_ij = 0 if i ≠ j.

Theorem 1.1  The transpose of A' equals A; that is, (A')' = A.
Theorem 1.2  The inverse of A^-1 is A; that is, (A^-1)^-1 = A.
Theorem 1.3  The transpose and inverse symbols may be permuted; that is, (A')^-1 = (A^-1)'.
Theorem 1.4  (AB)' = B'A'.
Theorem 1.5  (AB)^-1 = B^-1 A^-1 if A and B are each nonsingular.
Theorem 1.6  A scalar commutes with every matrix; that is, kA = Ak.
Theorem 1.7  For any matrix A we have IA = AI = A.
Theorem 1.8  All diagonal matrices of the same dimension are commutative.
Theorem 1.9  If D1 and D2 are diagonal matrices, then the product is diagonal; that is, D1 D2 = D2 D1 = D, where D is diagonal. The ith diagonal element of D is the product of the ith diagonal element of D1 and the ith diagonal element of D2.
Theorem 1.10  If X and Y are vectors and if A is a nonsingular matrix and if the equation Y = AX holds, then X = A^-1 Y.
Theorem 1.11  The rank of the product AB of the two matrices A and B is less than or equal to the rank of A and is less than or equal to the rank of B.
Theorem 1.12  The rank of the sum A + B is less than or equal to the rank of A plus the rank of B.
Theorem 1.13  If A is an n × n matrix and if |A| = 0, then the rank of A is less than n. (|A| denotes the determinant of the matrix A.)
Theorem 1.14  If the rank of A is less than n, then the rows of A are not independent; likewise, the columns of A are not independent. (A is n × n.)
Theorem 1.15  If the rank of A is m ≤ n, then the number of linearly independent rows is m; also, the number of linearly independent columns is m. (A is n × n.)
Theorem 1.16  If A'A = 0, then A = 0.
Theorem 1.17  The rank of a matrix is unaltered by multiplication by a nonsingular matrix; that is, if A, B, and C are matrices such that AB and BC exist and if A and C are nonsingular, then ρ(AB) = ρ(BC) = ρ(B).
Theorem 1.18  If the product AB of two square matrices is 0, then A = 0 or B = 0 or A and B are both singular.
Theorem 1.19  If A and B are n × n matrices of rank r and s, respectively, then the rank of AB is greater than or equal to r + s − n.
Theorem 1.20  The rank of AA' equals the rank of A'A equals the rank of A equals the rank of A'.

1.2 Quadratic Forms

If Y is an n × 1 vector with ith element y_i and if A is an n × n matrix with ij-th element equal to a_ij, then the quadratic form Y'AY is defined as Σ_{i=1}^{n} Σ_{j=1}^{n} y_i y_j a_ij. The rank of the quadratic form Y'AY is defined as the rank of the matrix A. The quadratic form Y'AY is said to be positive definite if and only if Y'AY > 0 for all vectors Y where Y ≠ 0. A quadratic form Y'AY is said to be positive semidefinite if and only if Y'AY ≥ 0 for all Y, and Y'AY = 0 for some vector Y ≠ 0. The matrix A of a quadratic form Y'AY is said to be positive definite (semidefinite) when the quadratic form is positive definite (semidefinite). If C is an n × n matrix such that C'C = I, then C is said to be an orthogonal matrix, and C' = C^-1.

Consider the transformation from the vector Z to the vector Y by the matrix P such that Y = PZ. Then Y'AY = (PZ)'A(PZ) = Z'P'APZ. Thus, by the transformation Y = PZ, the quadratic form Y'AY is transformed into the quadratic form Z'(P'AP)Z.

Theorem 1.21  If P is a nonsingular matrix and if A is positive definite (semidefinite), then P'AP is positive definite (semidefinite).
Theorem 1.22  A necessary and sufficient condition for the symmetric matrix A to be positive definite is that there exist a nonsingular matrix P such that A = PP'.
Theorem 1.23  A necessary and sufficient condition that the symmetric matrix

    A = [a_11 a_12 ... a_1n
         a_21 a_22 ... a_2n
         ...................
         a_n1 a_n2 ... a_nn]

be positive definite is that the following inequalities hold:

    a_11 > 0,   |a_11 a_12; a_21 a_22| > 0,   ...,   |A| > 0.
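The positive-definiteness criteria above lend themselves to a quick numerical check. The following sketch (not from the book; the matrices and the use of numpy are my own illustrative choices) builds a matrix as PP' per Theorem 1.22, confirms the leading-principal-minor test of Theorem 1.23, and applies the congruence transformation of Theorem 1.21:

```python
import numpy as np

# A symmetric matrix built as P P' with P nonsingular, so by
# Theorem 1.22 it must be positive definite.
P = np.array([[2.0, 0.0], [1.0, 3.0]])
A = P @ P.T

# Theorem 1.23: every leading principal minor must be positive.
minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
assert all(m > 0 for m in minors)

# Direct check of the definition: Y'AY > 0 for random nonzero Y.
for _ in range(100):
    Y = np.random.randn(2)
    if np.linalg.norm(Y) > 1e-8:
        assert Y @ A @ Y > 0

# Theorem 1.21: Q'AQ stays positive definite for nonsingular Q.
Q = np.array([[1.0, 2.0], [0.0, 1.0]])
B = Q.T @ A @ Q
assert all(np.linalg.det(B[:k, :k]) > 0 for k in range(1, 3))
```

Note that the minor test is stated for symmetric matrices only; A = PP' is symmetric by construction.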
A necessary and sufficient condition that there ex1sts ~n orthogonal ~ransformatiOñ C such that C' A C C' A C + Theorem 1.24 If A is an n x m matrix of rank m < n, then i· . ·a ·n , d CJ.. tA .t sC·~ fnacree d aalll lt hdeia Ago; naarle iss ytmhmate tAri ic A, i itb f eo lslyowmsm teht1r a1.t'c Af oAr 2a ilsl' A' A is positive definite and AA' is positive semidefinite. symme r1c l an only if A; and A, commute. ¡ i + Theorem 1.25 If A is an n x m matrix of rank k < m and + Theorem 1.33 Let an n X n matrix C be written k < n, then A' A and AA' are each positive semidefinite. + Theorem 1.26 If C is an orthogonal matrix, and if the trans formation Y = CZ is made on Y'Y, we get Y'Y = Y'IY = Z'C'ICZ = Z'C'CZ = Z'Z. In order to develop the theory of quadratic forms, it is necessary to define a characteristic root of a matrix. A characteristic root of a pX x=I = pO . maTthriex v eAc t.ios r aX sisc aclaalrle Ad tshuec chh athraactt eAriXsti c= v e.cAt.Xo r foofr thsoer mnea tvreixc tAor. whl ere C; is the ith row of c. Thus e i 1-8 the t ranspose o f an n ·x 1 coffiu~n vt ~ctoCr. The following two conditions are necessary and It follows that, if A is a characteristic root of A, then AX - .A.X = O· su cien .ior to be orthogonal: and (A - .A.l)X =O. Thus, A is a scalar such that the above homo ( l) geneous set of equations has a nontrivial solution, i.e., a solution other for ali i -=!= .i than X =O. It is known from elementary matrix theory that this (2) for ali i implies IA - .HI =O. Thus the characteristic root of a matrix A could be defined as a scalar A such that IA - .A.11 =O. It is easily That is t~ s~y, any two rows of any orthogonal matrix are ortho - onal (the1r mner product is zero) and th . g seen that IA - .UI is a pth degree polynomial in .A.. This polynomial row with itself is unity. ' e mner product of any is called the characteristic polynomial, and its roots are the characteristic ::¡ roots of the matrix A. 
We shall noi.v give a few theorems concerning characteristic roots, characteristic vectors, and characteristic poly Theore~ nomials. In this book the elements of a matrix will be real. • 1.34 • L· etP C x ! -- (. . . be P rows of an n X n orthogonal + Theorem 1.27 The number of nonzero characteristic roots of a matrix A is equal to the rank of A. Cv + Theorem 1.28 The characteristic roots of A are identical with the matrix. That is to sav let e( i' ; characteristic roots of CAC-1• lf C is an orthogonal matrix, it vectors suc'Ii that e¡¿ ~ O 2'. ·~. ' e,, be the transpose~ of p follows that A and CAC' have identical characteristic roots. (i -- 1, -? , ... , ?) . ~hi en there exJi s-t nl ,- 2,p · ·v ·e c, tpo)r sa nL ds ue~.eh. t=i 1t + Theorem 1.29 The characteristic roots of a symmetric matrix C; fJ = O for all i andJ and f~f. = 1 (i = 1 2 i , 1ª are real; i.e., if A =A', the characteristic polynomial of IA - .U¡ I· f i· ,-1_-J•. 'I,h us the theorem'· s1 tates that i' f w' e• • a• r,e n i- P) ' f i f i --. O = O has all real roots. such as C1, there exists a matrix C2 of di:Uension (n g ve) .a matr1~ C ) - p X nsuc + Theorem 1.30 The characteristic roots of a positive definite 1 that (C = C (C1 forms the first p rows of C and C the last n _ p matrix A are ¡Jositive; the characteristic roots of a positivo 2 2 semidefinite matrix are nonnegative. rows of C), where 9 is orthogonal. G LINEAR STATISTICAL l\WDELS MATHEMATICAL CONCEPTS 7 1.3 Determinants The trace of a matrix A, l'i'l1ich will be \vritten tr(A), is equal to the In this section a few of the important theorems on determinants snm of the diagonal elements of A; that is, tr(A) = 2ta, aii· ¡.,. ¡ will be given. It will be assumed that the student knows the definition • Theorem 1.45 tr(AB) = tr(BA). of a determinant and knows how to evaluate small ones. In linear Proof: By definition, tr(AB) is equal to 2,a¡;bii. By definition, hypothesis applications it is often necessary to..§.olve systems involving ij . 
a great ma~y equations. It might at times be necessary to evaluate tr(BA) is equal to¡ b,kaki· But it is clear that 2, a;;bii = 2, bikaki; large determinants. There m·e many methods of doing these tlúngs therefore, tr(AB) ~= tr(BA). ij ~ that are adaptable to automatic and semiautomatic computing machines. These methods will be discussed in detail later. It will • Theorem 1.46 tr(~BC) = tr(CAB) = tr(BCA); that is, the be assumed here that the student knows how to evaluate determinants trace of the product of matrices is invariant under any cyclic by the method of minors or by sorne other simple method. permutation of the matrices. Proof: By Theorem 1.45, tr[(AB)C] = tr[C(AB)]. + Theorem 1.35 The determinant of a diagonal rnatrix is equal to the product of the diagonal elements. • Theorem 1.47 tr(I) = n, where I is an n x n identity matrix. + Theorem 1.36 If A and B are n x n matrices, then IABI = • Theorem 1.48 If C is an orthogonal matrix, tr(C' AC) = tr(A). IBA! = IAI IBI. Proof: ByTheorem 1.46, tr(C'AC) = tr(CC'A) = tr(IA) = tr(A). + Theorem 1.37 If A is singular, IAI = O. It is sometimes advantageous to break a matrix up into submatrices. • Theorem 1.38 If C is an orthogonal matrix, then ICI = +1 or 'fhis is callcd partitioning a matrix into submatrices, and a matrix can ICI = -1. be partitioned in many ways. For example, A might be partitioned + Th~orem 1.39 If C is an orthogonal matrix, then IC' ACI = l~I­ into submatrices as follows: A= (Au Aj2 + Theorem 1.40 The determinant of a positive definite ~atrix is positive. A21 ~9 + Theorem 1.41 The determinant of a triangular matrix is equal + Thtoe othree mpr o1.d4u2c t o1fn t-h11e =d iaIg/IoDnIa,l iefl !eDmI e-n:Ft so. . awnhTde hrAee 2Ap2 r isois dm um2c txX A nnB,2 , Aoanf1 1dt w iwso hm em1r eax tmr inc1 e1+,s A cma122n i=bs emm m 1 aanxdd en ns21,y mA+b2 1no l2ii sc= am lln2y . xe vne1n, + Theorem 1.43 If A is a square matrix such that if A and B are broken ~nto SQbmatrices. 
The multiplication proceeds as if the submatrices were single elements of the matrix. However, the. dimensions of the matrices and of the submatrices must be such that they will multiply. For example, if Bisan~ x p matrix such where Au and A22 are square matrices, and if A12 = Oo r A21 =O, that (Bn B Bj2 th_en IAI = IA11l IA22I· = + Theorem 1.44 If A1 and A2 are symmetric and A2 is positive defi B21 B nite and if A1 - A2 is positive semidefinite (or positive definite), wherc Bii is an ni X P; matrix, then the product AB exists; and the then IA11 ~ IA2I· corresponding submatrices will multiply, since A¡ is of dimension 1 m; X n; and B;k is of dimension n1 x P1:· The. resulting matrix is as 1.4 Miscellaneous Theorems on Matrices follows: In this section we shall discuss sorne miscellaneous theorems con cerning matrices, which we shall use in later chapters.
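The determinant and trace identities above (Theorems 1.36, 1.38, 1.39, 1.45, and 1.48) are easy to confirm numerically; the matrices below are my own illustrative choices, with a plane rotation standing in for a generic orthogonal matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [5.0, 2.0]])

# Theorem 1.45: tr(AB) = tr(BA).
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# Theorem 1.36: |AB| = |BA| = |A||B|.
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(B @ A))
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# An orthogonal matrix (a rotation) for Theorems 1.38, 1.39, 1.48.
t = 0.7
C = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
assert np.isclose(abs(np.linalg.det(C)), 1.0)                     # Th. 1.38
assert np.isclose(np.linalg.det(C.T @ A @ C), np.linalg.det(A))   # Th. 1.39
assert np.isclose(np.trace(C.T @ A @ C), np.trace(A))             # Th. 1.48
```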

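Block multiplication as described in the partitioning passage can also be demonstrated directly. In this sketch (partition sizes chosen arbitrarily for illustration), A's column blocks are conformable with B's row blocks, and multiplying the blocks as if they were single elements reproduces AB:

```python
import numpy as np

# A 3x3 matrix A and a 3x2 matrix B, partitioned conformably:
# A's column-block sizes (2 and 1) match B's row-block sizes.
A = np.arange(9.0).reshape(3, 3)
B = np.arange(6.0).reshape(3, 2)

A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :1], B[:2, 1:]
B21, B22 = B[2:, :1], B[2:, 1:]

# Multiply as if the submatrices were single elements of the matrix.
top = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bot = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
blocked = np.vstack([top, bot])

assert np.allclose(blocked, A @ B)
```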
