Applied Nonparametric Regression

Wolfgang Härdle
Humboldt-Universität zu Berlin
Wirtschaftswissenschaftliche Fakultät
Institut für Statistik und Ökonometrie
Spandauer Str. 1
D-10178 Berlin

1994

For Renate, Nora, Viola, Adrian

Contents

Part I   Regression smoothing   1

1  Introduction   3
   1.1   Motivation   7
   1.2   Scope of this book   14

2  Basic idea of smoothing   17
   2.1   The stochastic nature of the observations   26
   2.2   Hurdles for the smoothing process   27

3  Smoothing techniques   31
   3.1   Kernel Smoothing   32
   3.2   Complements   49
   3.3   Proof of Proposition   49
   3.4   k-nearest neighbor estimates   52
   3.5   Orthogonal series estimators   61
   3.6   Spline smoothing   70
   3.7   Complements   77
   3.8   An overview of various smoothers   78
   3.9   Recursive techniques   78
   3.10  The regressogram   80
   3.11  A comparison of kernel, k-NN and spline smoothers   87

Part II   The kernel method   111

4  How close is the smooth to the true curve?   113
   4.1   The speed at which the smooth curve converges   116
   4.2   Pointwise confidence intervals   125
   4.3   Variability bands for functions   139
   4.4   Behavior at the boundary   159
   4.5   The accuracy as a function of the kernel   162
   4.6   Bias reduction techniques   172

5  Choosing the smoothing parameter   179
   5.1   Cross-validation, penalizing functions and the plug-in method   180
   5.2   Which selector should be used?   200
   5.3   Local adaptation of the smoothing parameter   214
   5.4   Comparing bandwidths between laboratories (canonical kernels)   223

6  Data sets with outliers   229
   6.1   Resistant smoothing techniques   231
   6.2   Complements   241

7  Nonparametric regression techniques for time series   245
   7.1   Introduction   245
   7.2   Nonparametric time series analysis   247
   7.3   Smoothing with dependent errors   263
   7.4   Conditional heteroscedastic autoregressive nonlinear models   267

8  Looking for special features and qualitative smoothing   281
   8.1   Monotonic and unimodal smoothing   282
   8.2   Estimation of Zeros and Extrema   291

9  Incorporating parametric components   299
   9.1   Partial linear models   302
   9.2   Shape-invariant modeling   306
   9.3   Comparing nonparametric and parametric curves   313

Part III   Smoothing in high dimensions   325

10  Investigating multiple regression by additive models   327
   10.1  Regression trees   329
   10.2  Projection pursuit regression   337
   10.3  Alternating conditional expectations   341
   10.4  Average derivative estimation   348
   10.5  Generalized additive models   354

A  XploRe   365
   A.1   Using XploRe   365
   A.2   Quantlet Examples   373
   A.3   Getting Help   378
   A.4   Basic XploRe Syntax   381

B  Tables   387

Bibliography   391
Index   407

List of Figures

1.1   Potatoes versus net income   5
1.2   Potatoes versus net income   7
1.3   Human height growth versus age   9
1.4   Net income densities over time   10
1.5   Net income densities over time   11
1.6   Temperature response function for Georgia   12
1.7   Nonparametric flow probability for the St. Mary's river   13
1.8   Side impact data   14
2.1   Food versus net income   19
2.2   Food versus net income   20
2.3   Height versus age   21
2.4   Potatoes versus net income   23
2.5   Potatoes versus net income   25
3.1   The Epanechnikov kernel   34
3.2   The effective kernel weights   35
3.3   Local parabolic fits   41
3.4   First and second derivatives of kernel smoothers   44
3.5   Title   56
3.6   Title   66
3.7   The effective weight function   67
3.8   Amount of sugar in sugar beet as a function of temperature   68
3.9   A spline smooth of the motorcycle data set   72
3.10  Spline smooth with cubic polynomial fit   95
3.11  The effective spline kernel   96
3.12  Equivalent kernel function for the temperature   97
3.13  Equivalent kernel function   98
3.14  Huber's approximation to the effective weight function   99
3.15  A regressogram smooth of the motorcycle data   100
3.16  Running median and a k-NN smooth   101
3.17  A kernel smooth applied to a sawtooth function   102
3.18  The split linear fit applied to a sawtooth function   103
3.19  Empirical regression   104
3.20  A simulated data set   105
3.21  A kernel smooth of the simulated data set   106
