Springer Texts in Statistics
Advisors: George Casella, Stephen Fienberg, Ingram Olkin

Springer Texts in Statistics

Alfred: Elements of Statistics for the Life and Social Sciences
Berger: An Introduction to Probability and Stochastic Processes
Bilodeau and Brenner: Theory of Multivariate Statistics
Blom: Probability and Statistics: Theory and Applications
Brockwell and Davis: Introduction to Time Series and Forecasting, Second Edition
Carmona: Statistical Analysis of Financial Data in S-Plus
Chow and Teicher: Probability Theory: Independence, Interchangeability, Martingales, Third Edition
Christensen: Advanced Linear Modeling: Multivariate, Time Series, and Spatial Data; Nonparametric Regression and Response Surface Maximization, Second Edition
Christensen: Log-Linear Models and Logistic Regression, Second Edition
Christensen: Plane Answers to Complex Questions: The Theory of Linear Models, Third Edition
Creighton: A First Course in Probability Models and Statistical Inference
Davis: Statistical Methods for the Analysis of Repeated Measurements
Dean and Voss: Design and Analysis of Experiments
du Toit, Steyn, and Stumpf: Graphical Exploratory Data Analysis
Durrett: Essentials of Stochastic Processes
Edwards: Introduction to Graphical Modelling, Second Edition
Finkelstein and Levin: Statistics for Lawyers
Flury: A First Course in Multivariate Statistics
Ghosh, Delampady and Samanta: An Introduction to Bayesian Analysis: Theory and Methods
Gut: Probability: A Graduate Course
Heiberger and Holland: Statistical Analysis and Data Display: An Intermediate Course with Examples in S-PLUS, R, and SAS
Jobson: Applied Multivariate Data Analysis, Volume I: Regression and Experimental Design
Jobson: Applied Multivariate Data Analysis, Volume II: Categorical and Multivariate Methods
Kalbfleisch: Probability and Statistical Inference, Volume I: Probability, Second Edition
Kalbfleisch: Probability and Statistical Inference, Volume II: Statistical Inference, Second Edition
Karr: Probability
Keyfitz: Applied Mathematical Demography, Second Edition
Kiefer: Introduction to Statistical Inference
Kokoska and Nevison: Statistical Tables and Formulae
Kulkarni: Modeling, Analysis, Design, and Control of Stochastic Systems
Lange: Applied Probability
Lange: Optimization
Lehmann: Elements of Large-Sample Theory
(continued after index)

Jayanta K. Ghosh, Mohan Delampady, Tapas Samanta

An Introduction to Bayesian Analysis: Theory and Methods

With 13 Illustrations

Springer

Jayanta K. Ghosh
Department of Statistics, Purdue University, 150 N. University Street, West Lafayette, IN 47907-2067, USA
and Indian Statistical Institute, 203 B.T. Road, Kolkata 700108, India
[email protected], [email protected]

Mohan Delampady
Indian Statistical Institute, 8th Mile, Mysore Road, R.V. College Post, Bangalore 560059, India
[email protected]

Tapas Samanta
Indian Statistical Institute, 203 B.T. Road, Kolkata 700108, India
[email protected]

Editorial Board
George Casella, Department of Statistics, University of Florida, Gainesville, FL 32611-8545, USA
Stephen Fienberg, Department of Statistics, Carnegie Mellon University, Pittsburgh, PA 15213-3890, USA
Ingram Olkin, Department of Statistics, Stanford University, Stanford, CA 94305, USA

Library of Congress Control Number: 2006922766
ISBN-10: 0-387-40084-2
e-ISBN: 0-387-35433-6
ISBN-13: 978-0387-40084-6

Printed on acid-free paper.

© 2006 Springer Science+Business Media, LLC
All rights reserved.
This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed in the United States of America. (MVY)

9 8 7 6 5 4 3 2 1

springer.com

To Ira, Shobha, and Shampa

Preface

Though there are many recent additions to graduate-level introductory books on Bayesian analysis, none has quite our blend of theory, methods, and applications. We believe a beginning graduate student taking a Bayesian course, or just trying to find out what it means to be a Bayesian, ought to have some familiarity with all three aspects. More specialization can come later. Each of us has taught a course like this at the Indian Statistical Institute or Purdue. In fact, at least partly, the book grew out of those courses. We would also like to refer to the review (Ghosh and Samanta (2002b)) that first made us think of writing a book.

The book contains somewhat more material than can be covered in a single semester. We have done this intentionally, so that an instructor has some choice as to what to cover as well as which of the three aspects to emphasize. Such a choice is essential for the instructor. The topics include several results or methods that have not appeared in a graduate text before. In fact, the book can also be used as a second course in Bayesian analysis if the instructor supplies more details.

Chapter 1 provides a quick review of classical statistical inference. Some knowledge of this is assumed when we compare different paradigms. Following this, an introduction to Bayesian inference is given in Chapter 2, emphasizing the need for the Bayesian approach to statistics. Objective priors and objective Bayesian analysis are also introduced here. We use the terms objective and nonsubjective interchangeably. After briefly reviewing an axiomatic development of utility and prior, a detailed discussion of Bayesian robustness is provided in Chapter 3. Chapter 4 is mainly on convergence of posterior quantities and large sample approximations. In Chapter 5, we discuss Bayesian inference for problems with low-dimensional parameters, specifically objective priors and objective Bayesian analysis for such problems. This covers a whole range of possibilities including uniform priors, Jeffreys' prior, other invariant objective priors, and reference priors. After this, in Chapter 6 we discuss some aspects of testing and model selection, treating these two problems as equivalent. This mostly involves Bayes factors and bounds on these computed over large classes of priors. Comparison with the classical P-value is also made whenever appropriate. The Bayesian P-value and nonsubjective Bayes factors such as the intrinsic and fractional Bayes factors are also introduced. Chapter 7 is on Bayesian computations.
Analytic approximation and the E-M algorithm are covered here, but most of the emphasis is on Markov chain based Monte Carlo methods, including the M-H algorithm and the Gibbs sampler, which are currently the most popular techniques. Following this, in Chapter 8 we cover the Bayesian approach to some standard problems in statistics. The next chapter covers more complex problems, namely, hierarchical Bayesian (HB) point and interval estimation in high-dimensional problems and parametric empirical Bayes (PEB) methods. Superiority of HB and PEB methods to classical methods and advantages of HB methods over PEB methods are discussed in detail. Akaike information criterion (AIC), Bayes information criterion (BIC), and other generalized Bayesian model selection criteria, high-dimensional testing problems, microarrays, and multiple comparisons are also covered here. The last chapter consists of three major methodological applications along with the required methodology.

We have marked those sections that are either very technical or very specialized. These may be omitted at first reading, and they need not be part of a standard one-semester course. Several problems have been provided at the end of each chapter. More problems and other material will be placed at http://www.isical.ac.in/~tapas/book

Many people have helped: our mentors, both friends and critics, from whom we have learnt, our family, our students at ISI and Purdue, and the anonymous referees of the book. Special mention must be made of Arijit Chakrabarti for Sections 9.7 and 9.8, Sudipto Banerjee for Section 10.1, Partha P. Majumder for Appendix D, and Kajal Dihidar and Avranil Sarkar for help in several computations. We alone are responsible for our philosophical views, however tentatively held, as well as for the presentation. Thanks to John Kimmel, whose encouragement, support, and advice were invaluable.

Jayanta K. Ghosh, Indian Statistical Institute and Purdue University
Mohan Delampady, Indian Statistical Institute
Tapas Samanta, Indian Statistical Institute

February 2006

Contents

1 Statistical Preliminaries
1.1 Common Models
1.1.1 Exponential Families
1.1.2 Location-Scale Families
1.1.3 Regular Family
1.2 Likelihood Function
1.3 Sufficient Statistics and Ancillary Statistics
1.4 Three Basic Problems of Inference in Classical Statistics
1.4.1 Point Estimates
1.4.2 Testing Hypotheses
1.4.3 Interval Estimation
1.5 Inference as a Statistical Decision Problem
1.6 The Changing Face of Classical Inference
1.7 Exercises

2 Bayesian Inference and Decision Theory
2.1 Subjective and Frequentist Probability
2.2 Bayesian Inference
2.3 Advantages of Being a Bayesian
2.4 Paradoxes in Classical Statistics
2.5 Elements of Bayesian Decision Theory
2.6 Improper Priors
2.7 Common Problems of Bayesian Inference
2.7.1 Point Estimates
2.7.2 Testing
2.7.3 Credible Intervals
2.7.4 Testing of a Sharp Null Hypothesis Through Credible Intervals
2.8 Prediction of a Future Observation
2.9 Examples of Cox and Welch Revisited
2.10 Elimination of Nuisance Parameters
2.11 A High-dimensional Example
2.12 Exchangeability
2.13 Normative and Descriptive Aspects of Bayesian Analysis, Elicitation of Probability
2.14 Objective Priors and Objective Bayesian Analysis
2.15 Other Paradigms
2.16 Remarks
2.17 Exercises

3 Utility, Prior, and Bayesian Robustness
3.1 Utility, Prior, and Rational Preference
3.2 Utility and Loss
3.3 Rationality Axioms Leading to the Bayesian Approach
3.4 Coherence
3.5 Bayesian Analysis with Subjective Prior
3.6 Robustness and Sensitivity
3.7 Classes of Priors
3.7.1 Conjugate Class
3.7.2 Neighborhood Class
3.7.3 Density Ratio Class
3.8 Posterior Robustness: Measures and Techniques
3.8.1 Global Measures of Sensitivity
3.8.2 Belief Functions
3.8.3 Interactive Robust Bayesian Analysis
3.8.4 Other Global Measures
3.8.5 Local Measures of Sensitivity
3.9 Inherently Robust Procedures
3.10 Loss Robustness
3.11 Model Robustness
3.12 Exercises

4 Large Sample Methods
4.1 Limit of Posterior Distribution
4.1.1 Consistency of Posterior Distribution
4.1.2 Asymptotic Normality of Posterior Distribution
4.2 Asymptotic Expansion of Posterior Distribution
4.2.1 Determination of Sample Size in Testing
4.3 Laplace Approximation
4.3.1 Laplace's Method
4.3.2 Tierney-Kadane-Kass Refinements
4.4 Exercises