
Linear Models: Least Squares and Alternatives PDF

439 Pages·1997·1.734 MB·English

Preview Linear Models: Least Squares and Alternatives

Linear Models: Least Squares and Alternatives, Second Edition
C. Radhakrishna Rao · Helge Toutenburg
Springer

Preface to the First Edition

The book is based on several years of experience of both authors in teaching linear models at various levels. It gives an up-to-date account of the theory and applications of linear models. The book can be used as a text for courses in statistics at the graduate level and as an accompanying text for courses in other areas. Some of the highlights in this book are as follows.

A relatively extensive chapter on matrix theory (Appendix A) provides the necessary tools for proving theorems discussed in the text and offers a selection of classical and modern algebraic results that are useful in research work in econometrics, engineering, and optimization theory. The matrix theory of the last ten years has produced a series of fundamental results about the definiteness of matrices, especially for the differences of matrices, which enable superiority comparisons of two biased estimates to be made for the first time.

We have attempted to provide a unified theory of inference from linear models with minimal assumptions. Besides the usual least-squares theory, alternative methods of estimation and testing based on convex loss functions and general estimating equations are discussed. Special emphasis is given to sensitivity analysis and model selection.

A special chapter is devoted to the analysis of categorical data based on logit, loglinear, and logistic regression models.

The material covered, theoretical discussion, and a variety of practical applications will be useful not only to students but also to researchers and consultants in statistics.

We would like to thank our colleagues Dr. G. Trenkler and Dr. V. K. Srivastava for their valuable advice during the preparation of the book.
We wish to acknowledge our appreciation of the generous help received from Andrea Schöpp, Andreas Fieger, and Christian Kastner for preparing a fair copy. Finally, we would like to thank Dr. Martin Gilchrist of Springer-Verlag for his cooperation in drafting and finalizing the book.

We request that readers bring to our attention any errors they may find in the book and also give suggestions for adding new material and/or improving the presentation of the existing material.

University Park, PA    C. Radhakrishna Rao
München, Germany    Helge Toutenburg
July 1995

Preface to the Second Edition

The first edition of this book has found wide interest in the readership. A first reprint appeared in 1997 and a special reprint for the People's Republic of China appeared in 1998. Based on this, the authors followed the invitation of John Kimmel of Springer-Verlag to prepare a second edition, which includes additional material such as simultaneous confidence intervals for linear functions, neural networks, restricted regression and selection problems (Chapter 3); mixed effect models, regression-like equations in econometrics, simultaneous prediction of actual and average values, simultaneous estimation of parameters in different linear models by empirical Bayes solutions (Chapter 4); the method of the Kalman Filter (Chapter 6); and regression diagnostics for removing an observation with animating graphics (Chapter 7).

Chapter 8, "Analysis of Incomplete Data Sets", is completely rewritten, including recent terminology and updated results such as regression diagnostics to identify non-MCAR processes.

Chapter 10, "Models for Categorical Response Variables", is also completely rewritten to present the theory in a more unified way, including GEE methods for correlated response.

At the end of the chapters we have given complements and exercises. We have added a separate chapter (Appendix C) that is devoted to the software available for the models covered in this book.
We would like to thank our colleagues Dr. V. K. Srivastava (Lucknow, India) and Dr. Ch. Heumann (München, Germany) for their valuable advice during the preparation of the second edition. We thank Nina Lieske for her help in preparing a fair copy. We would like to thank John Kimmel of Springer-Verlag for his effective cooperation. Finally, we wish to acknowledge the immense work done by Andreas Fieger (München, Germany) with respect to the numerical solutions of the examples included, to the technical management of the copy, and especially to the reorganization and updating of Chapter 8 (including some of his own research results). Appendix C on software was also written by him.

We request that readers bring to our attention any suggestions that would help to improve the presentation.

University Park, PA    C. Radhakrishna Rao
München, Germany    Helge Toutenburg
May 1999

Contents

Preface to the First Edition
Preface to the Second Edition
1 Introduction
2 Linear Models
  2.1 Regression Models in Econometrics
  2.2 Econometric Models
  2.3 The Reduced Form
  2.4 The Multivariate Regression Model
  2.5 The Classical Multivariate Linear Regression Model
  2.6 The Generalized Linear Regression Model
  2.7 Exercises
3 The Linear Regression Model
  3.1 The Linear Model
  3.2 The Principle of Ordinary Least Squares (OLS)
  3.3 Geometric Properties of OLS
  3.4 Best Linear Unbiased Estimation
    3.4.1 Basic Theorems
    3.4.2 Linear Estimators
    3.4.3 Mean Dispersion Error
  3.5 Estimation (Prediction) of the Error Term ε and σ²
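The contents above center on ordinary least squares (Sections 3.2–3.5). As a minimal sketch of the standard OLS estimator the book develops, β̂ = (X′X)⁻¹X′y, with the unbiased residual-based estimate of σ², the following Python/NumPy example can illustrate the idea; the synthetic data and coefficient values here are illustrative assumptions, not taken from the book:

```python
import numpy as np

# Synthetic illustration data (not from the book): intercept plus one regressor.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_beta = np.array([2.0, 0.5])          # hypothetical coefficients
y = X @ true_beta + 0.1 * rng.normal(size=n)

# OLS: solve the normal equations X'X beta = X'y. lstsq is numerically
# preferable to forming (X'X)^{-1} explicitly.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Unbiased estimate of sigma^2: residual sum of squares / (n - p).
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])
```

With the noise standard deviation set to 0.1 above, σ̂² should land near 0.01 and β̂ near the generating coefficients.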
