MULTIVARIATE ANALYSIS

K. V. MARDIA, Department of Statistics, University of Leeds, Leeds, U.K.
J. T. KENT, Department of Statistics, University of Leeds, Leeds, U.K.
J. M. BIBBY, Faculty of Mathematics, The Open University, Milton Keynes, U.K.

This is a volume in PROBABILITY AND MATHEMATICAL STATISTICS: A Series of Monographs and Textbooks. Editors: Z. W. Birnbaum and E. Lukacs. A complete list of titles in this series is available from the publisher upon request.

ACADEMIC PRESS, Harcourt Brace & Company, Publishers: London, San Diego, New York, Boston, Sydney, Tokyo, Toronto.
Academic Press Limited, 24/28 Oval Road, London NW1. United States edition published by Academic Press, Inc., San Diego, CA 92101.

Copyright © 1979 Academic Press Limited. Tenth printing 1995. All rights reserved. No part of this book may be reproduced in any form by photostat, microfilm, or any other means, without written permission from the publishers.

To Neela Mardia

British Library Cataloguing in Publication Data
Mardia, Kantilal Varichand
Multivariate analysis. - (Probability and mathematical statistics).
1. Multivariate analysis
I. Title  II. Kent, J. T.  III. Bibby, John  IV. Series
519.5'3  QA278  79-40922
ISBN 0-12-471250-9
ISBN 0-12-471252-5 Pbk

Printed in Great Britain by T J Press (Padstow) Ltd, Padstow, Cornwall

PREFACE

"Everything is related with every other thing, and this relation involves the emergence of a relational quality. The qualities cannot be known a priori, though a good number of them can be deduced from certain fundamental characteristics."
    - Jaina philosophy (The Jaina Philosophy of Non-Absolutism by S. Mookerjee, quoted by Mahalanobis, 1957)

Multivariate Analysis deals with observations on more than one variable where there is some inherent interdependence between the variables. With several texts already available in this area, one may very well enquire of the authors as to the need for yet another book. Most of the available books fall into two categories, either theoretical or data analytic. The present book not only combines the two approaches but it also emphasizes modern developments. The choice of material for the book has been guided by the need to give suitable matter for the beginner as well as illustrating some deeper aspects of the subject for the research worker. Practical examples are kept to the forefront and, wherever feasible, each technique is motivated by such an example.

The book is aimed at final year undergraduates and postgraduate students in Mathematics/Statistics, with sections suitable for practitioners and research workers. The book assumes a basic knowledge of Mathematical Statistics at undergraduate level. An elementary course on Linear Algebra is also assumed. In particular, we assume an exposure to Matrix Algebra to the level required to read Appendix A.

Broadly speaking, Chapters 1-6 and Chapter 12 can be described as containing direct extensions of univariate ideas and techniques. The remaining chapters concentrate on specifically multivariate problems which have no meaningful analogues in the univariate case. Chapter 1 is primarily concerned with giving exploratory analyses for multivariate data and briefly introduces some of the important techniques, tools, and diagrammatic representations. Chapter 2 introduces various distributions together with some fundamental results, whereas Chapter 3 concentrates exclusively on normal distribution theory. Chapters 4-6 deal with problems in inference.
Chapter 7 gives an overview of Econometrics, whilst Principal Component Analysis, Factor Analysis, Canonical Correlation Analysis, and Discriminant Analysis are discussed from both theoretical and practical points of view in Chapters 8-11. Chapter 12 is on Multivariate Analysis of Variance, which can be better understood in terms of the techniques of previous chapters. The later chapters look into the presently developing techniques of Cluster Analysis, Multidimensional Scaling, and Directional Data.

Various new methods of presentation are utilized in the book. For instance, the data matrix is emphasized throughout, a density-free approach is given for normal theory, the union intersection principle is used in testing as well as the likelihood ratio principle, and graphical methods are used in explanation. In view of the computer packages generally available, most of the numerical work is taken for granted and therefore, except for a few particular cases, emphasis is not placed on numerical calculations. The style of presentation is generally kept descriptive except where rigour is found to be necessary for theoretical results, which are then put in the form of theorems. If any details of the proof of a theorem are felt tedious but simple, they are relegated to the exercises.

Each chapter concludes with a set of exercises. Solving these will not only enable the reader to understand the material better but will also serve to complement the chapter itself. In general, the questions have in-built answers, but, where desirable, hints for the solution of theoretical problems are provided. Some of the numerical exercises are designed to be run on a computer, but as the main aim is on interpretation, the answers are provided. We found NAG routines and GLIM most useful, but nowadays any computer centre will have some suitable statistics and matrix algebra routines.

Several important topics not usually found in multivariate texts are discussed in detail. Examples of such material include the complete chapters on Econometrics, Cluster Analysis, Multidimensional Scaling, and Directional Data. Further material is also included in parts of other chapters: methods of graphical presentation, measures of multivariate skewness and kurtosis, the singular multinormal distribution, various non-normal distributions and families of distributions, a density-free approach to normal distribution theory, Bayesian and robust estimators, a recent solution to the Fisher-Behrens problem, a test of multinormality, a non-parametric test, discarding of variables in regression, principal component analysis and discriminant analysis, correspondence analysis, allometry, the jack-knifing method in discrimination, canonical analysis of qualitative and quantitative variables, and a test of dimensionality in MANOVA. It is hoped that coverage of these developments will be helpful for students as well as research workers.

There are three appendices, A, B, and C, which respectively provide a sufficient background of matrix algebra, a summary of univariate statistics, and some tables of critical values. The aim of Appendix A on Matrix Algebra is not only to provide a summary of results, but also to give sufficient guidance to master these for students having little previous knowledge. Equations from Appendix A are referred to as (A.x.x) to distinguish them from (1.x.x), etc. Appendix A also includes a summary of results in n-dimensional geometry which are used liberally in the book. Appendix B gives a summary of important univariate distributions.

The reference list is by no means exhaustive. Only directly relevant articles are quoted, and for a fuller bibliography we refer the reader to Anderson,
Das Gupta, and Styan (1972) and Subrahmaniam and Subrahmaniam (1973). The reference list also serves as an author index. A subject index is provided.

The material in the book can be used in several different ways. For example, a one-semester elementary course of 40 lectures could cover the following topics: Appendix A; Chapter 1 (Sections 1.1-1.7); Chapter 2 (Sections 2.1-2.5); Chapter 3 (Sections 3.4.1, 3.5, 3.6.1, assuming results from previous sections, Definitions 3.7.1, 3.7.2); Chapter 4 (Section 4.2.2); Chapter 5 (Sections 5.1, 5.2.1a, 5.2.1b, 5.2.2a, 5.2.2b, 5.3.2b, 5.5); Chapter 8 (Sections 8.1, 8.2.1, 8.2.2, 8.2.5, 8.2.6, 8.4.3, 8.7); Chapter 9 (Sections 9.1-9.3, 9.4 (without details), 9.5, 9.6, 9.8); Chapter 10 (Sections 10.1, 10.2); Chapter 11 (Sections 11.1, 11.2.1-11.2.3, 11.3.1, 11.6.1). Further material which can be introduced is Chapter 12 (Sections 12.1-12.3, 12.6); Chapter 13 (Sections 13.1, 13.3.1); Chapter 14 (Sections 14.1, 14.2). This material has been covered in 40 lectures spread over two terms in different British universities. Alternatively, a one-semester course with more emphasis on foundation rather than applications could be based on Appendix A and Chapters 1-5. Two-semester courses could include all the chapters, excluding Chapters 7 and 15 on Econometrics and Directional Data, as well as the sections with asterisks. Mathematically orientated students may like to proceed to Chapter 2, omitting the data analytic ideas of Chapter 1.

There are various other topics which have not been touched upon, partly because of lack of space as well as our own preferences, such as Control Theory, Multivariate Time Series, Latent Variable Models, Path Analysis, Growth Curves, Portfolio Analysis, and various Multivariate Designs.

In addition to various research papers, we have been influenced by particular texts in this area, especially Anderson (1958), Kendall (1975), Kshirsagar (1972), Morrison (1976), Press (1972), and Rao (1973). All these are recommended to the reader.

The authors would be most grateful to readers who draw their attention to any errors or obscurities in the book, or suggest other improvements.

January 1979
Kanti Mardia
John Kent
John Bibby

ACKNOWLEDGEMENTS

First of all we wish to express our gratitude to pioneers in this field. In particular, we should mention M. S. Bartlett, R. A. Fisher, H. Hotelling, D. G. Kendall, M. G. Kendall, P. C. Mahalanobis, C. R. Rao, S. N. Roy, W. S. Torgerson, and S. S. Wilks.

We are grateful to authors and editors who have generously granted us permission to reproduce figures and tables.

We are also grateful to many colleagues for their valuable help and comments, in particular Martin Beale, Christopher Bingham, Lesley Butler, Richard Cormack, David Cox, Ian Curry, Peter Fisk, Allan Gordon, John Gower, Peter Harris, Chunni Khatri, Conrad Leser, Eric Okell, Ross Renner, David Salmond, Cyril Smith, and Peter Zemroch.
We are also indebted to Joyce Snell for making various comments on an earlier draft of the book which have led to considerable improvement.

We should also express our gratitude to Rob Edwards for his help in various facets of the book: for calculations, for proof-reading, for diagrams, etc.

Some of the questions are taken from examination papers in British universities, and we are grateful to various unnamed colleagues. Since the original sources of questions are difficult to trace, we apologize to any colleague who recognizes a question of his own.

The authors would like to thank their wives, Pavan Mardia, Susan Kent, and Zorina Bibby.

Finally our thanks go to Barbara Forsyth and Margaret Richardson for typing a difficult manuscript with great skill.

KVM
JTK
JMB

CONTENTS

Preface  vii
Acknowledgements  x

Chapter 1 - Introduction  1
1.1 Objects and variables  1
1.2 Some multivariate problems and techniques  2
1.3 The data matrix  8
1.4 Summary statistics  9
1.5 Linear combinations  13
1.6 Geometrical ideas  16
1.7 Graphical representations  17
*1.8 Measures of multivariate skewness and kurtosis  20
Exercises and complements  22

Chapter 2 - Basic Properties of Random Vectors  26
2.1 Cumulative distribution functions and probability density functions  26
2.2 Population moments  28
2.3 Characteristic functions  33
2.4 Transformations  35
2.5 The multinormal distribution  36
2.6 Some multivariate generalizations of univariate distributions  43
2.7 Families of distributions  45
2.8 Random samples  49
2.9 Limit theorems  51
Exercises and complements  53

Chapter 3 - Normal Distribution Theory  59
3.1 Characterization and properties  59
3.2 Linear forms  62
3.3 Transformations of normal data matrices  64
3.4 The Wishart distribution  66
3.5 The Hotelling T² distribution  73
3.6 Mahalanobis distance  76
3.7 Statistics based on the Wishart distribution  80
3.8 Other distributions related to the multinormal  85
Exercises and complements  86

Chapter 4 - Estimation  96
4.1 Likelihood and sufficiency  96
4.2 Maximum likelihood estimation  102
4.3 Other techniques and concepts  109
Exercises and complements  113

Chapter 5 - Hypothesis Testing  120
5.1 Introduction  120
5.2 The techniques introduced  123
5.3 The techniques further illustrated  131
*5.4 The Behrens-Fisher problem  142
5.5 Simultaneous confidence intervals  144
5.6 Multivariate hypothesis testing: some general points  147
*5.7 Non-normal data  148
5.8 A non-parametric test for the bivariate two-sample problem  149
Exercises and complements  151

Chapter 6 - Multivariate Regression Analysis  157
6.1 Introduction  157
6.2 Maximum likelihood estimation  158
6.3 The general linear hypothesis  161
6.4 Design matrices of degenerate rank  164
6.5 Multiple correlation  167
6.6 Least squares estimation  171
6.7 Discarding of variables  175
Exercises and complements  180

Chapter 7 - Econometrics  185
7.1 Introduction  185
7.2 Instrumental variables and two-stage least squares  186
7.3 Simultaneous equation systems  191
7.4 Single-equation estimators  199
7.5 System estimators  203
7.6 Comparison of estimators  208
Exercises and complements  208

Chapter 8 - Principal Component Analysis  213
8.1 Introduction  213
8.2 Definition and properties of principal components  214
8.3 Sampling properties of principal components  229
8.4 Testing hypotheses about principal components  233
8.5 Correspondence analysis  237
8.6 Allometry - the measurement of size and shape  239
8.7 Discarding of variables  242
8.8 Principal component analysis in regression  244
Exercises and complements  246

Chapter 9 - Factor Analysis  255
9.1 Introduction  255
9.2 The factor model  256
9.3 Principal factor analysis  261
9.4 Maximum likelihood factor analysis  263
9.5 Goodness of fit test  267
9.6 Rotation of factors  268
9.7 Factor scores  273
9.8 Relationships between factor analysis and principal component analysis  275
9.9 Analysis of covariance structures  276
Exercises and complements  276

Chapter 10 - Canonical Correlation Analysis  281
10.1 Introduction  281
10.2 Mathematical development  282
10.3 Qualitative data and dummy variables  290
10.4 Qualitative and quantitative data  293
Exercises and complements  295

Chapter 11 - Discriminant Analysis  300
11.1 Introduction  300
11.2 Discrimination when the populations are known  301
11.3 Discrimination under estimation  309
11.4 Is discrimination worthwhile?  318
11.5 Fisher's linear discriminant function  318
11.6 Probabilities of misclassification  320
11.7 Discarding of variables  322
11.8 When does correlation improve discrimination?  324
Exercises and complements  325

Chapter 12 - Multivariate Analysis of Variance  333
12.1 Introduction  333
12.2 Formulation of multivariate one-way classification  333
12.3 The likelihood ratio principle  334
12.4 Testing fixed contrasts  337
12.5 Canonical variables and a test of dimensionality  338
12.6 The union intersection approach  348
12.7 Two-way classification  350
Exercises and complements  356

Chapter 13 - Cluster Analysis  360
13.1 Introduction  360
13.2 A probabilistic formulation  361
13.3 Hierarchical methods  369
13.4 Distances and similarities  375
13.5 Other methods and comparative approach
Exercises and complements

Chapter 14 - Multidimensional Scaling  394
14.1 Introduction  394
14.2 Classical solution  397
14.3 Duality between principal coordinate analysis and principal component analysis  404
14.4 Optimal properties of the classical solution and goodness of fit  406
14.5 Seriation  409
14.6 Non-metric methods  413
14.7 Goodness of fit measure; Procrustes rotation  416
*14.8 Multi-sample problem and canonical variates  419
Exercises and complements  420

Chapter 15 - Directional Data  424
15.1 Introduction  424
15.2 Descriptive measures  426
15.3 Basic distributions  428
15.4 Distribution theory  435
15.5 Maximum likelihood estimators for the von Mises-Fisher distribution  437
15.6 Test of uniformity; the Rayleigh test  439
15.7 Some other problems  441
Exercises and complements  446

Appendix A - Matrix Algebra  452
A.1 Introduction  452
A.2 Matrix operations  455
A.3 Further particular matrices and types of matrix  460
A.4 Vector spaces, rank, and linear equations  462
A.5 Linear transformations  465
A.6 Eigenvalues and eigenvectors  466
A.7 Quadratic forms and definiteness  474
*A.8 Generalized inverse  476
A.9 Matrix differentiation and maximization problems  478
A.10 Geometrical ideas  481

Appendix B - Univariate Statistics  486
B.1 Introduction  486
B.2 Normal distribution  486
B.3 Chi-squared distribution  487
B.4 F and beta variables  487
B.5 t distribution  488

Appendix C - Tables  489
Table C.1 Upper percentage points of the χ²(ν) distribution  490
Table C.2 Upper percentage points of the t(ν) distribution  491
Table C.3 Upper percentage points of the F(ν1, ν2) distribution  492
Table C.4 Upper percentage points θ of θ(p, ν1, ν2), the largest eigenvalue of |B - θ(W + B)| = 0, for p = 2  494

References  497
List of Main Notations and Abbreviations  508
Subject Index  510
Author Index  519

1
Introduction

1.1 Objects and Variables

Multivariate analysis deals with data containing observations on two or more variables, each measured on a set of objects. For example, we may have the set of examination marks achieved by certain students, or the cork deposit in various directions of a set of trees, or flower measurements for different species of iris (see Tables 1.2.1, 1.4.1, and 1.2.2, respectively). Each of these data sets has a set of "variables" (the examination marks, trunk thicknesses, and flower measurements) and a set of "objects" (the students, trees, and flowers). In general, if there are n objects, O1, ..., On, and p variables, x1, ..., xp, the data contains np pieces of information. These may be conveniently arranged using an (n × p) "data matrix", in which each row corresponds to an object and each column corresponds to a variable. For instance, three variables on five "objects" (students) are shown as a (5 × 3) data matrix in Table 1.1.1.

Table 1.1.1 Data matrix with five students as objects, where x1 = age in years at entry to university, x2 = marks out of 100 in an examination at the end of the first year, and x3 = sex

Objects    x1       x2    x3†
1          18.45    70    1
2          18.41    65    0
3          18.39    71    0
4          18.70    72    0
5          18.34    94    1

† 1 indicates male; 0 indicates female.

Note that the variables need not all be of the same type: in Table 1.1.1, x1 is a "continuous" variable, x2 is a discrete variable, and x3 is a dichotomous variable. Note also that attribute, characteristic, description, item, measurement, and response are synonyms for "variable", whereas individual, observation, plot, reading, and unit can be used in place of "object".
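As a minimal computational sketch of how such a data matrix is handled in practice (Python with NumPy is assumed here purely as a convenient stand-in for the matrix routines mentioned in the Preface; it is an illustration, not part of the book's own treatment), the two numerical variables of Table 1.1.1 can be stored as a (5 × 2) data matrix and summarized by the sample mean vector and covariance matrix, the basic summary statistics of Section 1.4.

```python
import numpy as np

# The two numerical variables of Table 1.1.1 arranged as a (5 x 2) data
# matrix X: rows are objects (students), columns are variables
# (x1 = age at entry, x2 = first-year examination mark).
X = np.array([
    [18.45, 70],
    [18.41, 65],
    [18.39, 71],
    [18.70, 72],
    [18.34, 94],
])

n, p = X.shape          # n = 5 objects, p = 2 variables

x_bar = X.mean(axis=0)  # sample mean vector: one mean per column (variable)

# Sample covariance matrix with divisor n, i.e. the average of the outer
# products (x_i - x_bar)(x_i - x_bar)'; use bias=False for the n - 1 divisor.
S = np.cov(X, rowvar=False, bias=True)

print("mean vector:\n", x_bar)
print("covariance matrix:\n", S)
```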
1.2 Some Multivariate Problems and Techniques

We may now illustrate various categories of multivariate technique.

Table 1.2.1 Marks in open-book and closed-book examinations out of 100† (only the first five students' rows are reproduced here)

Mechanics (C)   Vectors (C)   Algebra (O)   Analysis (O)   Statistics (O)
     77             82            67            67             81
     63             78            80            70             81
     75             73            71            66             81
     55             72            63            70             68
     63             63            65            70             63

† O indicates open-book, C indicates closed-book.

1.2.1 Generalizations of univariate techniques

Most univariate questions are capable of at least one multivariate generalization. For instance, using Table 1.2.1 we may ask, as an example, "What is the appropriate underlying parent distribution of examination marks on various papers of a set of students?", "What are the summary statistics?", "Are the differences between average marks on different papers significant?", etc. These problems are direct generalizations of univariate problems and their motivation is easy to grasp. See for example Chapters 2-6 and 12.

1.2.2 Dependence and regression

Referring to Table 1.2.1, we may enquire as to the degree of dependence between performance on different papers taken by the same students. It may be useful, for counselling or other purposes, to have some idea of how final degree marks ("dependent" variables) are affected by previous examination results or by other variables such as age or sex ("explanatory" variables). This presents the so-called regression problem, which is examined in Chapter 6.

1.2.3 Linear combinations

Given examination marks on different topics (as in Table 1.2.1), the question arises of how to combine or average these marks in a suitable way. A straightforward method would use the simple arithmetic mean, but this procedure may not always be suitable. For instance, if the marks on some papers vary more than others, we may wish to weight them differently. This leads us to search for a linear combination (weighted sum) which is "optimal" in some sense. If all the examination papers fall in one group then principal component analysis and factor analysis are two techniques which can help to answer such questions (see Chapters 8 and 9). In some situations the papers may fall into more than one group; for instance, in Table 1.2.1 some examinations were "open-book" while others were "closed-book". In such situations we may wish to investigate the use of linear combinations within each group separately. This leads to the method known as canonical correlation analysis, which is discussed in Chapter 10.

The idea of taking linear combinations is an important one in multivariate analysis, and we shall return to it in Section 1.5.
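To make the idea of an "optimal" linear combination concrete, the sketch below (NumPy again assumed; it is only an illustration, not the treatment developed in Chapter 8) finds the weight vector a, normalized so that a'a = 1, which maximizes the sample variance of the weighted sum y = Xa for the first five rows of Table 1.2.1. This weight vector is the first principal component: the eigenvector of the sample covariance matrix belonging to its largest eigenvalue.

```python
import numpy as np

def first_principal_component(X):
    """Return the weights a (with a'a = 1) maximizing the sample variance of
    the linear combination y = X a, together with the scores y."""
    S = np.cov(X, rowvar=False)           # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)  # eigh: eigenvalues in ascending order
    a = eigvecs[:, -1]                    # eigenvector of the largest eigenvalue
    return a, X @ a

# First five rows of Table 1.2.1
# (Mechanics, Vectors, Algebra, Analysis, Statistics).
marks = np.array([
    [77, 82, 67, 67, 81],
    [63, 78, 80, 70, 81],
    [75, 73, 71, 66, 81],
    [55, 72, 63, 70, 68],
    [63, 63, 65, 70, 63],
], dtype=float)

weights, scores = first_principal_component(marks)
print("weights (first principal component):", weights)
print("component scores for the five students:", scores)
```

Each student's score is the weighted sum of his or her five marks, with weights chosen by the data rather than fixed in advance as in the simple arithmetic mean.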
1.2.4 Assignment and dissection

Table 1.2.2 gives three (50 × 4) data matrices. In each matrix the "objects" are 50 irises of species Iris setosa, Iris versicolour, and Iris virginica, respectively. The "variables" are

x1 = sepal length,   x2 = sepal width,
x3 = petal length,   x4 = petal width.

If a new iris of unknown species has measurements x1 = 5.1, x2 = 3.2, x3 = 2.7, x4 = 0.7, we may ask to which species it belongs. This presents the problem of discriminant analysis, which is discussed in Chapter 11. However, if we were presented with the 150 observations of Table 1.2.2 in an unclassified manner (say, before the three species were established) then the aim could have been to dissect the population into homogeneous groups. This problem is handled by cluster analysis (see Chapter 13).

1.2.5 Building configurations

In some cases the data consists not of an (n × p) data matrix, but of ½n(n - 1) "distances" between all pairs of points. To get an intuitive feel for the structure of such data, a configuration can be constructed of n points in a Euclidean space of low dimension (e.g. p = 2 or 3). Hopefully the distances between the n points of the configuration will closely match the original distances. The problems of building and interpreting such configurations are studied in Chapter 14, on multidimensional scaling; a bare-bones numerical sketch of the classical construction is given at the end of this section.

1.2.6 Directional data

There are various problems which arise when the variables are directional, that is, the multivariate observations are constrained to lie on a hypersphere. For a discussion of these problems see Chapter 15.
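A tiny sketch of why directional observations need special treatment (hypothetical angles, NumPy assumed; the descriptive measures themselves are developed in Section 15.2): each observation is represented as a unit vector on the circle, and the resultant of these vectors gives the mean direction and the mean resultant length.

```python
import numpy as np

# Hypothetical directional observations in degrees: each one is a point on
# the unit circle rather than an ordinary real-valued measurement.
angles_deg = np.array([10.0, 30.0, 350.0, 20.0, 340.0])
theta = np.deg2rad(angles_deg)

# Represent each observation as the unit vector (cos(theta), sin(theta)).
U = np.column_stack([np.cos(theta), np.sin(theta)])

# The resultant of the unit vectors gives the two basic summaries:
# the mean direction (direction of the resultant) and the mean resultant
# length R_bar in [0, 1], which measures how concentrated the directions are.
resultant = U.sum(axis=0)
R_bar = np.linalg.norm(resultant) / len(theta)
mean_direction = np.rad2deg(np.arctan2(resultant[1], resultant[0])) % 360

print("mean direction (degrees):", mean_direction)
print("mean resultant length   :", R_bar)
```

For these five angles the ordinary arithmetic mean is 150°, far from every observation, whereas the mean direction is about 6° with a mean resultant length close to 1, reflecting the tight clustering of the directions around 0°.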

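Returning to the configuration-building problem of Section 1.2.5: the classical solution (Section 14.2) constructs coordinates from a matrix of pairwise distances by double-centring the squared distances and taking the leading eigenvectors. The sketch below is a bare-bones version of that construction, not the book's own algorithmic treatment; NumPy is assumed, and the distance matrix is generated from four hypothetical planar points so that the recovered configuration reproduces the original distances up to rotation, reflection, and translation.

```python
import numpy as np

def classical_mds(D, k=2):
    """Given an (n x n) matrix of pairwise Euclidean distances D, return an
    (n x k) configuration of points whose inter-point distances approximate D
    (the classical, or principal coordinates, solution)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n         # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                 # doubly centred squared distances
    eigvals, eigvecs = np.linalg.eigh(B)        # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:k]         # indices of the k largest eigenvalues
    scale = np.sqrt(np.maximum(eigvals[idx], 0.0))  # guard against tiny negative values
    return eigvecs[:, idx] * scale              # eigenvectors scaled by sqrt(eigenvalues)

# Four hypothetical points in the plane, used only to generate a distance matrix.
points = np.array([[0.0, 0.0], [2.0, 0.0], [5.0, 1.0], [6.0, 3.0]])
D = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)

config = classical_mds(D, k=2)
recovered = np.linalg.norm(config[:, None] - config[None, :], axis=-1)
print(np.allclose(recovered, D))                # True: the distances are reproduced
```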