
Linear Models: Least Squares and Alternatives

360 pages · 1995 · 10.16 MB · English


Springer Series in Statistics

Advisors: P. Diggle, S. Fienberg, K. Krickeberg, I. Olkin, N. Wermuth

Springer Science+Business Media, LLC

Springer Series in Statistics

Andersen/Borgan/Gill/Keiding: Statistical Models Based on Counting Processes.
Anderson: Continuous-Time Markov Chains: An Applications-Oriented Approach.
Andrews/Herzberg: Data: A Collection of Problems from Many Fields for the Student and Research Worker.
Anscombe: Computing in Statistical Science through APL.
Berger: Statistical Decision Theory and Bayesian Analysis, 2nd edition.
Bolfarine/Zacks: Prediction Theory for Finite Populations.
Brémaud: Point Processes and Queues: Martingale Dynamics.
Brockwell/Davis: Time Series: Theory and Methods, 2nd edition.
Choi: ARMA Model Identification.
Daley/Vere-Jones: An Introduction to the Theory of Point Processes.
Dzhaparidze: Parameter Estimation and Hypothesis Testing in Spectral Analysis of Stationary Time Series.
Fahrmeir/Tutz: Multivariate Statistical Modelling Based on Generalized Linear Models.
Farrell: Multivariate Calculation.
Federer: Statistical Design and Analysis for Intercropping Experiments.
Fienberg/Hoaglin/Kruskal/Tanur (Eds.): A Statistical Model: Frederick Mosteller's Contributions to Statistics, Science and Public Policy.
Fisher/Sen: The Collected Works of Wassily Hoeffding.
Good: Permutation Tests: A Practical Guide to Resampling Methods for Testing Hypotheses.
Goodman/Kruskal: Measures of Association for Cross Classifications.
Grandell: Aspects of Risk Theory.
Hall: The Bootstrap and Edgeworth Expansion.
Härdle: Smoothing Techniques: With Implementation in S.
Hartigan: Bayes Theory.
Heyer: Theory of Statistical Experiments.
Jolliffe: Principal Component Analysis.
Kotz/Johnson (Eds.): Breakthroughs in Statistics Volume I.
Kotz/Johnson (Eds.): Breakthroughs in Statistics Volume II.
Kres: Statistical Tables for Multivariate Analysis.
Leadbetter/Lindgren/Rootzén: Extremes and Related Properties of Random Sequences and Processes.
Le Cam: Asymptotic Methods in Statistical Decision Theory.
Le Cam/Yang: Asymptotics in Statistics: Some Basic Concepts.
Manoukian: Modern Concepts and Theorems of Mathematical Statistics.
Manton/Singer/Suzman (Eds.): Forecasting the Health of Elderly Populations.
Miller, Jr.: Simultaneous Statistical Inference, 2nd edition.
Mosteller/Wallace: Applied Bayesian and Classical Inference: The Case of The Federalist Papers.
(continued after index)

C. Radhakrishna Rao · Helge Toutenburg

Linear Models: Least Squares and Alternatives

With 33 Illustrations

Springer Science+Business Media, LLC

C. Radhakrishna Rao
The Pennsylvania State University
Department of Statistics
University Park, PA 16802
USA

Helge Toutenburg
Institut für Statistik
Universität München
80799 München
Germany

Library of Congress Cataloging-in-Publication Data

Rao, C. Radhakrishna (Calyampudi Radhakrishna), 1920-
  Linear models: least squares and alternatives / C. Radhakrishna Rao, Helge Toutenburg.
    p. cm. - (Springer series in statistics)
  Includes bibliographical references (p. - ) and index.
  ISBN 978-1-4899-0026-5 (hardcover : alk. paper)
  1. Linear models (Statistics) I. Toutenburg, Helge. II. Title. III. Series.
  QA279.R3615 1995
  519.5'36—dc20  95-23947

© Springer Science+Business Media New York 1995
Originally published by Springer-Verlag New York, Inc. in 1995
Softcover reprint of the hardcover 1st edition 1995

All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.

The use of general descriptive names, trade names, trademarks, etc., in this publication, even if the former are not especially identified, is not to be taken as a sign that such names, as understood by the Trade Marks and Merchandise Marks Act, may accordingly be used freely by anyone.

Production managed by Laura Carlson; manufacturing supervised by Jeffrey Taub.
Photocomposed pages prepared from the authors' LaTeX files.

9 8 7 6 5 4 3 2 1

ISBN 978-1-4899-0026-5
ISBN 978-1-4899-0024-1 (eBook)
DOI 10.1007/978-1-4899-0024-1

Preface

The book is based on both authors' several years of experience in teaching linear models at various levels. It gives an up-to-date account of the theory and applications of linear models. The book can be used as a text for courses in statistics at the graduate level and as an accompanying text for courses in other areas.

Some of the highlights in this book are as follows. A relatively extensive chapter on matrix theory (Appendix A) provides the necessary tools for proving theorems discussed in the text and offers a selection of classical and modern algebraic results that are useful in research work in econometrics, engineering, and optimization theory. The matrix theory of the last ten years has produced a series of fundamental results about the definiteness of matrices, especially for the differences of matrices, which enable superiority comparisons of two biased estimates to be made for the first time.

We have attempted to provide a unified theory of inference from linear models with minimal assumptions. Besides the usual least-squares theory, alternative methods of estimation and testing based on convex loss functions and general estimating equations are discussed. Special emphasis is given to sensitivity analysis and model selection. A special chapter is devoted to the analysis of categorical data based on logit, loglinear, and logistic regression models.

The material covered, theoretical discussion, and its practical applications will be useful not only to students but also to researchers and consultants in statistics.

We would like to thank our colleagues Dr. G. Trenkler and Dr. V. K. Srivastava for their valuable advice during the preparation of the book. We wish to acknowledge our appreciation of the generous help received from Andrea Schopp, Andreas Fieger, and Christian Kastner for preparing a fair copy. Finally, we would like to thank Dr. Martin Gilchrist of Springer-Verlag for his cooperation in drafting and finalizing the book.

We request that readers bring to our attention any errors they may find in the book and also give suggestions for adding new material and/or improving the presentation of the existing material.

C. Radhakrishna Rao, University Park, Pennsylvania, USA
Helge Toutenburg, Munich, Germany
July 1995
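The two ideas the preface leans on most, ordinary least squares and mean dispersion error (MDE) comparisons, each fit in one line; the following is a minimal sketch in notation standard for this subject (the symbols are assumptions of this sketch, not quoted from the preview):

\[
y = X\beta + \epsilon, \quad E(\epsilon) = 0, \quad E(\epsilon\epsilon') = \sigma^2 I:
\qquad b = (X'X)^{-1}X'y \quad \text{(OLS estimator)},
\]
\[
M(\hat\beta) = E(\hat\beta - \beta)(\hat\beta - \beta)';
\qquad \hat\beta_1 \ \text{is MDE-superior to} \ \hat\beta_2
\ \text{if} \ M(\hat\beta_2) - M(\hat\beta_1) \ \text{is nonnegative definite}.
\]

The "fundamental results about the definiteness of matrices" that the preface cites are what make the difference M(β̂2) - M(β̂1) checkable when both estimators are biased, the setting of Chapter 5 below.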
Contents

Preface  v

1 Introduction  1

2 Linear Models  3
  2.1 Regression Models in Econometrics  3
  2.2 Econometric Models  6
  2.3 The Reduced Form  10
  2.4 The Multivariate Regression Model  12
  2.5 The Classical Multivariate Linear Regression Model  15
  2.6 The Generalized Linear Regression Model  16

3 The Linear Regression Model  19
  3.1 The Linear Model  19
  3.2 The Principle of Ordinary Least Squares (OLS)  20
  3.3 Geometric Properties of OLS  21
  3.4 Best Linear Unbiased Estimation  23
    3.4.1 Basic Theorems  23
    3.4.2 Linear Estimators  28
    3.4.3 Mean Dispersion Error  29
  3.5 Estimation (Prediction) of the Error Term ε and σ²  31
  3.6 Classical Regression under Normal Errors  32
  3.7 Testing Linear Hypotheses  34
  3.8 Analysis of Variance and Goodness of Fit  41
    3.8.1 Bivariate Regression  41
    3.8.2 Multiple Regression  46
    3.8.3 A Complex Example  50
    3.8.4 Graphical Presentation  54
  3.9 The Canonical Form  60
  3.10 Methods for Dealing with Multicollinearity  61
    3.10.1 Principal Components Regression  62
    3.10.2 Ridge Estimation  63
    3.10.3 Shrinkage Estimates  67
    3.10.4 Partial Least Squares  67
  3.11 Projection Pursuit Regression  71
  3.12 Total Least Squares  72
  3.13 Minimax Estimation  75
    3.13.1 Inequality Restrictions  75
    3.13.2 The Minimax Principle  78
  3.14 Censored Regression  83
    3.14.1 Introduction  83
    3.14.2 LAD Estimators and Asymptotic Normality  84
    3.14.3 Tests of Linear Hypotheses  85

4 The Generalized Linear Regression Model  89
  4.1 Optimal Linear Estimation of β  89
  4.2 The Aitken Estimator  96
  4.3 Misspecification of the Dispersion Matrix  98
  4.4 Heteroscedasticity and Autoregression  101

5 Exact and Stochastic Linear Restrictions  111
  5.1 Use of Prior Information  111
  5.2 The Restricted Least-Squares Estimator  112
  5.3 Stepwise Inclusion of Exact Linear Restrictions  115
  5.4 Biased Linear Restrictions and MDE Comparison with the OLSE  120
  5.5 MDE Matrix Comparisons of Two Biased Estimators  123
  5.6 MDE Matrix Comparison of Two Linear Biased Estimators  129
  5.7 MDE Comparison of Two (Biased) Restricted Estimators  130
    5.7.1 Special Case: Stepwise Biased Restrictions  134
  5.8 Stochastic Linear Restrictions  138
    5.8.1 Mixed Estimator  138
    5.8.2 Assumptions about the Dispersion Matrix  140
    5.8.3 Biased Stochastic Restrictions  143
  5.9 Weakened Linear Restrictions  147
    5.9.1 Weakly (R, r)-Unbiasedness  147
    5.9.2 Optimal Weakly (R, r)-Unbiased Estimators  148
    5.9.3 Feasible Estimators - Optimal Substitution of β in β̂1(β, A)  152
    5.9.4 RLSE Instead of the Mixed Estimator  153

6 Prediction Problems in the Generalized Regression Model  155
  6.1 Introduction  155
  6.2 Some Simple Linear Models  155
  6.3 The Prediction Model  158
  6.4 Optimal Heterogeneous Prediction  159
  6.5 Optimal Homogeneous Prediction  161
  6.6 MDE Matrix Comparisons between Optimal and Classical Predictors  165
    6.6.1 Comparison of Classical and Optimal Prediction with Respect to the y*-Superiority  167
    6.6.2 Comparison of Classical and Optimal Predictors with Respect to the X*β-Superiority  170
  6.7 Prediction Regions  172

7 Sensitivity Analysis  177
  7.1 Introduction  177
  7.2 Prediction Matrix  177
  7.3 The Effect of a Single Observation on the Estimation of Parameters  183
    7.3.1 Measures Based on Residuals  184
    7.3.2 Algebraic Consequences of Omitting an Observation  185
    7.3.3 Detection of Outliers  186
  7.4 Diagnostic Plots for Testing the Model Assumptions  191
  7.5 Measures Based on the Confidence Ellipsoid  192
  7.6 Partial Regression Plots  199

8 Analysis of Incomplete Data Sets  203
  8.1 Statistical Analysis with Missing Data  203
  8.2 Missing Data in the Response  209
    8.2.1 Least-Squares Analysis for Complete Data  209
    8.2.2 Least-Squares Analysis for Filled-up Data  210
    8.2.3 Analysis of Covariance - Bartlett's Method  211
  8.3 Missing Values in the X-Matrix  212
    8.3.1 Missing Values and Loss in Efficiency  214
    8.3.2 Standard Methods for Incomplete X-Matrices  216
  8.4 Maximum Likelihood Estimates of Missing Values  219
  8.5 Weighted Mixed Regression  221
    8.5.1 Minimizing the MDEP  223
    8.5.2 The Two-Stage WMRE  226

9 Robust Regression  229
  9.1 Introduction  229
  9.2 Least Absolute Deviation Estimators - Univariate Case  230
