
Biometry for Forestry and Environmental Data with Examples in R (PDF)

425 pages · 2020 · 10.136 MB

Preview: Biometry for Forestry and Environmental Data with Examples in R

Biometry for Forestry and Environmental Data

Chapman & Hall/CRC Applied Environmental Statistics Series

Series Editors:
Douglas Nychka, Colorado School of Mines
Alexandra Schmidt, Universidade Federal do Rio de Janeiro
Richard L. Smith, University of North Carolina
Lance A. Waller, Emory University

Recently Published Titles:
Environmental Statistics with S-PLUS, Steven P. Millard and Nagaraj K. Neerchal
Statistical Tools for Environmental Quality Measurement, Douglas E. Splitstone and Michael E. Ginevan
Sampling Strategies for Natural Resources and the Environment, Timothy G. Gregoire and Harry T. Valentine
Sampling Techniques for Forest Inventories, Daniel Mandallaz
Statistics for Environmental Science and Management, Second Edition, Bryan F.J. Manly
Statistical Geoinformatics for Human Environment Interface, Wayne L. Myers and Ganapati P. Patil
Introduction to Hierarchical Bayesian Modeling for Ecological Data, Eric Parent and Etienne Rivot
Handbook of Spatial Point-Pattern Analysis in Ecology, Thorsten Wiegand and Kirk A. Moloney
Introduction to Ecological Sampling, Bryan F.J. Manly and Jorge A. Navarro Alberto
Future Sustainable Ecosystems: Complexity, Risk, and Uncertainty, Nathaniel K. Newlands
Environmental and Ecological Statistics with R, Second Edition, Song S. Qian
Statistical Methods for Field and Laboratory Studies in Behavioral Ecology, Scott Pardo and Michael Pardo
Biometry for Forestry and Environmental Data with Examples in R, Lauri Mehtätalo and Juha Lappi

For more information about this series, please visit:
https://www.crcpress.com/Chapman--HallCRC-Applied-Environmental-Statistics/book-series/CRCAPPENVSTA

Biometry for Forestry and Environmental Data with Examples in R
Lauri Mehtätalo
Juha Lappi

First edition published 2020 by CRC Press, 6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742, and by CRC Press, 2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN.

© 2020 Taylor & Francis Group, LLC. CRC Press is an imprint of Taylor & Francis Group, LLC.

Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, access www.copyright.com or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. For works that are not available on CCC please contact mpkbookspermissions@tandf.co.uk

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data: [Insert LoC Data here when available]

ISBN: 9781498711487 (hbk)
ISBN: 9780429173462 (ebk)

Typeset in CMR by Nova Techset Private Limited, Bengaluru & Chennai, India

Contents

Preface

1 Introduction

2 Random Variables
  2.1 Introduction to random variables
    2.1.1 Sources of randomness
    2.1.2 Univariate and multivariate random variables
  2.2 Univariate random variables
    2.2.1 Sample space and support set
    2.2.2 Distribution function and density
    2.2.3 Transformations of random variables
    2.2.4 Expected value
    2.2.5 Variance and standard deviation
    2.2.6 Moments
  2.3 Multivariate random variables
    2.3.1 Joint, conditional and marginal distributions
    2.3.2 Bivariate transformations
    2.3.3 Independence and dependence
    2.3.4 Covariance and correlation
  2.4 Calculation rules for expected value, variance and covariance
  2.5 Conditional and marginal expected value and variance
  2.6 Multivariate normal distribution
  2.7 Quantile function
  2.8 Weighted and truncated distributions
  2.9 Compound distributions
  2.10 Sums of independent random variables
  2.11 Transformations of a standard normal variable
    2.11.1 χ²-distribution
    2.11.2 Student's t-distribution
    2.11.3 F-distribution
  2.12 Random functions: stochastic and spatial processes

3 Statistical Modeling, Estimation and Prediction
  3.1 Concepts
  3.2 The role of assumptions in modeling
  3.3 Scales of measurement
  3.4 Estimation using observed data
    3.4.1 Comparing alternative estimators
    3.4.2 Estimating expected value of any distribution
      Box: Least Squares estimation of µ
    3.4.3 Estimating variance and covariance of any distribution
    3.4.4 Estimating any parameter of a specific distribution
    3.4.5 Bayesian estimation
  3.5 Prediction
    3.5.1 Comparing alternative predictors
    3.5.2 Linear prediction
  3.6 Model comparison and confidence intervals
    3.6.1 Testing of hypothesis
    3.6.2 Multiple tests
      Box: Publication bias
    3.6.3 Confidence intervals
    3.6.4 Information theoretic model comparison

4 Linear Model
  4.1 Model formulation
    4.1.1 Model for an individual observation
    4.1.2 The model for all data
  4.2 The systematic part
    4.2.1 Identifiability and multicollinearity
    4.2.2 Formulating the systematic part
    4.2.3 Interpretation of regression coefficients
    4.2.4 Categorical predictors
    4.2.5 Examples
  4.3 The random part
    4.3.1 Modeling inconstant error variance
    4.3.2 Modeling dependence using an autocorrelation function
    4.3.3 Variance-covariance structures in the model for all data
    4.3.4 Example
  4.4 Estimation
    4.4.1 Ordinary least squares (OLS)
      Box: Stein's paradox
      Box: Ridge regression and LASSO
    4.4.2 Generalized least squares (GLS)
    4.4.3 Maximum likelihood (ML)
      Box: BLUE or minimum RMSE?
    4.4.4 Restricted maximum likelihood (REML)
      Box: Warnings about model comparison using REML
  4.5 Does the model fit?
    4.5.1 Model residuals
    4.5.2 Graphical evaluation of the fit
    4.5.3 Numerical criteria for model fit
  4.6 Inference
    4.6.1 Hypothesis tests of nested models
    4.6.2 Wald's F- and χ² tests of β
    4.6.3 Normality of residual errors in testing
    4.6.4 Marginal and sequential testing procedures
    4.6.5 The effect of estimation error in V
    4.6.6 Examples
    4.6.7 t-tests of β
    4.6.8 Likelihood ratio - a general large-sample test procedure for nested models
    4.6.9 Comparing non-nested models
    4.6.10 Confidence intervals for model parameters
  4.7 Prediction
    4.7.1 Prediction of an uncorrelated observation
    4.7.2 Prediction of a correlated observation

5 Linear Mixed-Effects Models
  Box: Pseudo-replication
  5.1 Model formulation
    5.1.1 The variance component model
    5.1.2 Mixed-effects model with a random intercept
    5.1.3 Multiple random effects
    5.1.4 Random effects for categorical predictors
  5.2 Matrix formulation of the model
    5.2.1 Matrix formulation for single group
    5.2.2 Relaxing the assumptions on residual variance
    5.2.3 Matrix formulation for all data
  5.3 Estimation of model parameters
    5.3.1 Estimating variance components with ANOVA methods
    5.3.2 Maximum likelihood
      Box: Negative within-group correlation
      Box: Swamy's GLS estimator
    5.3.3 Restricted maximum likelihood
    5.3.4 More about the estimation of R
    5.3.5 Bayesian estimation
  5.4 Prediction at group level
    5.4.1 Prediction of random effects
      Box: Kalman filter
    5.4.2 Group-level prediction
      Box: Henderson's mixed model equations
    5.4.3 A closer look at the variance component model
    5.4.4 Fixed or random group effect?
  5.5 Evaluation of model fit
    Box: Plotting standard errors
    5.5.1 The systematic part
    5.5.2 The random part
    5.5.3 Coefficient of determination for mixed-effect models
  5.6 Tests and confidence intervals
    5.6.1 Hypothesis tests
    5.6.2 Confidence intervals
  5.7 Reporting a fitted mixed-effects model

6 More about Linear Mixed-Effects Models
  6.1 Nested grouping structure
    6.1.1 A nested two-level model
    6.1.2 Matrix formulation
    6.1.3 More than two nested levels
  6.2 Crossed grouping structure
  6.3 Between-group and within-group effects
  6.4 Population-averaged prediction
  6.5 Prediction of correlated observation using a mixed model
  6.6 Applying the mixed model BLUP in a new group without re-estimating the model

7 Nonlinear (Mixed-Effects) Models
  7.1 Nonlinear model and nonlinear relationship
  7.2 Nonlinear fixed-effects model
    7.2.1 Model formulation
      Box: Taylor approximation
    7.2.2 Explaining variability in φ through second-order predictors
      Box: Finney's second-level parameters
    7.2.3 Parameter estimation when var(e) = σ²I
    7.2.4 Initial guesses of parameters
    7.2.5 Inference and model diagnostics
    7.2.6 Parameter estimation and inference when var(e) = σ²V
  7.3 Nonlinear mixed-effects models
    7.3.1 Model formulation
    7.3.2 Parameter estimation and inference
    7.3.3 Convergence problems
    7.3.4 Multiple levels of grouping
    7.3.5 Prediction of random effects and y-values
    7.3.6 Principal components of random parameters

8 Generalized Linear (Mixed-Effects) Models
  Box: Transformations or GLMM?
  8.1 Common parent distributions
    8.1.1 The normal distribution
    8.1.2 Bernoulli and binomial distribution
    8.1.3 Poisson distribution
    8.1.4 Link and mean functions
  8.2 Generalized linear model
    8.2.1 Formulation of the LM as a GLM
    8.2.2 Models for binary data
    8.2.3 Models for count data
    8.2.4 GLM for the exponential family
    8.2.5 Estimation
    8.2.6 Evaluating the fit of a GLM
    8.2.7 Over- and underdispersion
    8.2.8 Inference on GLM
    8.2.9 Zero-inflation
    8.2.10 Exposure variables in count data
  8.3 Generalized linear mixed-effects models
    8.3.1 Model formulation
    8.3.2 Estimation
    8.3.3 Inference

9 Multivariate (Mixed-Effects) Models
  9.1 Why model systems?
  9.2 Seemingly unrelated regression models
    9.2.1 Model formulation
    9.2.2 Estimation
    9.2.3 Inference
    9.2.4 Prediction
  9.3 Simultaneous equations
  9.4 Multivariate mixed-effects models
    9.4.1 Model formulation
    9.4.2 Estimation and inference
    9.4.3 Prediction
  9.5 Multivariate nonlinear and generalized linear (mixed) models

10 Additional Topics on Regression
  10.1 Random regressors
    10.1.1 Connection between OLS linear regression and BLP
      Box: Regression through the origin
    10.1.2 Random regressors in a linear model
  10.2 Modeling nonlinear responses using transformations in y
    10.2.1 Logarithmic regression
    10.2.2 Correcting back transformation bias using the Taylor series
    10.2.3 Correcting back transformation bias using the two-point distribution
  10.3 Predicting the mean of a nonlinear function over the distribution of x
  10.4 Modeling nonlinear responses using transformations in x
    10.4.1 Regression splines
      Box: Interpolating splines
    10.4.2 Smoothing splines
    10.4.3 Multivariate splines
    10.4.4 Second-order response surface
  10.5 Generalized additive models
  10.6 Modeling continuous proportions
  10.7 Effect size

11 Modeling Tree Size
  Box: Kernel smoothing
  11.1 A model for tree size in a single forest stand
  11.2 Fitting an assumed model to tree diameter data
    11.2.1 Maximum likelihood
    11.2.2 The method of moments
    11.2.3 Other methods
  11.3 Model forms
    11.3.1 Distribution models for tree size
    11.3.2 Model comparison criteria
  11.4 Mathematics of size distributions
    11.4.1 Transformation of tree size
    11.4.2 Basal-area weighted distribution
    11.4.3 Weighted distributions
    11.4.4 Scaling the size distribution
    11.4.5 Utilizing the arithmetic relationships of stand variables
  11.5 Modeling tree size in a population of stands
    11.5.1 The statistical model
    11.5.2 PPM and PRM approaches
    11.5.3 Improving the prediction using sample information
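To give a flavor of the R examples promised by the title, here is a minimal sketch, not taken from the book, of the kind of analysis covered in Chapter 5 (Linear Mixed-Effects Models): a random-intercept model fitted with the nlme package to R's built-in Loblolly pine growth data. The dataset and grouping variable are illustrative choices, not the book's own examples.

    # Minimal sketch (illustrative, not from the book): a random-intercept
    # linear mixed-effects model of the kind treated in Chapter 5.
    library(nlme)

    # Model tree height as a function of age, with a random intercept for
    # each seed source (grouping factor Seed in the built-in Loblolly data).
    fm <- lme(height ~ age, random = ~ 1 | Seed, data = Loblolly)

    summary(fm)   # fixed effects, variance components, residual std. dev.
    ranef(fm)     # predicted (BLUP) random intercepts for each seed source

The same model could also be fitted with lme4::lmer(height ~ age + (1 | Seed), data = Loblolly); which packages the book itself uses is not shown in this preview.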
