
Advancements in Bayesian Methods and Implementations

322 pages · 2022 · 6.77 MB · English


Handbook of Statistics, Volume 47
Advancements in Bayesian Methods and Implementation

Handbook of Statistics
Series Editors:
C.R. Rao, AIMSCS, University of Hyderabad Campus, Hyderabad, India
Arni S.R. Srinivasa Rao, Medical College of Georgia, Augusta University, United States

Edited by:
Arni S.R. Srinivasa Rao, Medical College of Georgia, Augusta, Georgia, United States
G. Alastair Young, Department of Mathematics, Imperial College London, London, United Kingdom
C.R. Rao, AIMSCS, University of Hyderabad Campus, Hyderabad, India

Academic Press is an imprint of Elsevier
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States
525 B Street, Suite 1650, San Diego, CA 92101, United States
The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom
125 London Wall, London, EC2Y 5AS, United Kingdom

Copyright © 2022 Elsevier B.V. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher's permissions policies, and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency can be found at our website: www.elsevier.com/permissions.

This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.

Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.

To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence, or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

ISBN: 978-0-323-95268-2
ISSN: 0169-7161

For information on all Academic Press publications visit our website at https://www.elsevier.com/books-and-journals

Publisher: Zoe Kruze
Acquisitions Editor: Mariana Kuhl
Developmental Editor: Naiza Ermin Mendoza
Production Project Manager: Abdulla Sait
Cover Designer: Vicky Pearson
Typeset by STRAIVE, India

Contents

Contributors
Preface

1. Direct Gibbs posterior inference on risk minimizers: Construction, concentration, and calibration
Ryan Martin and Nicholas Syring
  1. Introduction
  2. Gibbs posterior distributions
     2.1 Problem setup
     2.2 Definition
     2.3 FAQs
     2.4 Illustrations
  3. Asymptotic theory
     3.1 Objectives and general strategies
     3.2 Consistency
     3.3 Concentration rates
     3.4 Distributional approximations
  4. Learning rate selection
  5. Numerical examples
     5.1 Quantile regression
     5.2 Classification
     5.3 Nonlinear regression
  6. Further details
     6.1 Things we did not discuss
     6.2 Open problems
  7. Conclusion
  Acknowledgments
  Appendix
     A.1 Proofs
  References

2. Bayesian selective inference
Daniel García Rasines and G. Alastair Young
  1. Introduction
  2. Bayes and selection
     2.1 Fixed and random parameters
  3. Noninformative priors for selective inference
     3.1 Noninformative priors for exponential families
  4. Discussion
  References

3. Dependent Bayesian multiple hypothesis testing
Noirrit Kiran Chandra and Sourabh Bhattacharya
  1. Introduction
  2. Bayesian multiple hypothesis testing
     2.1 Preliminaries and setup
     2.2 The decision problem
  3. Dependent multiple testing
     3.1 New error-based criterion
     3.2 Choice of G1, …, Gm
  4. Simulation study
     4.1 The postulated Bayesian model
     4.2 Comparison criteria
     4.3 Comparison of the results
  5. Discussion
  References

4. A new look at Bayesian uncertainty
Stephen G. Walker
  1. Introduction
  2. Missing data
  3. Parametric martingale sequences
     3.1 Langevin posterior
  4. Nonparametric martingale distributions
  5. Illustrations
     5.1 Parametric case
     5.2 Nonparametric case
  6. Mathematical theory
  7. Discussion
  Acknowledgments
  References

5. 50 shades of Bayesian testing of hypotheses
Christian P. Robert
  1. Introduction
  2. Bayesian hypothesis testing
  3. Improper priors united against hypothesis testing
  4. The Jeffreys–Lindley paradox
  5. Posterior predictive p-values
  6. A modest proposal
  7. Conclusion
  Acknowledgments
  References

6. Inference approach to ground states of quantum systems
Angelo Plastino and A.R. Plastino
  1. Introduction
  2. The Jaynes' maximum entropy methodology: Brief resume
  3. The quantum maximum entropy approach
     3.1 Preliminaries
  4. Properties of SQ that make our approximate maximum entropy approach wave functions reasonable ones
     4.1 SQ is a true Shannon's ignorance function
     4.2 Subject to the known quantities bk, the maximum value of SQ is unique
     4.3 The entropy SQ obeys an H-theorem
     4.4 Our SQ-ground-state wave functions respect the virial theorem
     4.5 The SQ-ground-state wave functions respect hypervirial theorems
     4.6 Saturation
     4.7 Speculation
  5. Coulomb potential
     5.1 Harmonic oscillator
     5.2 Morse potential
     5.3 Ground state of the quartic oscillator
     5.4 A possible MEM extension
  6. Noncommuting observables
  7. Other entropic or information measures
  8. Conclusions
  References

7. MCMC for GLMMs
Vivekananda Roy
  1. Introduction
  2. Likelihood function for GLMMs
  3. Conditional simulation for GLMMs
     3.1 MALA for GLMMs
     3.2 HMC for GLMMs
     3.3 Data augmentation for GLMMs
  4. MCMC for Bayesian GLMMs
     4.1 MALA and HMC for Bayesian GLMMs
     4.2 Data augmentation for Bayesian GLMMs
  5. A numerical example
  6. Discussion
  References

8. Sparsity-aware Bayesian inference and its applications
Geethu Joseph, Saurabh Khanna, Chandra R. Murthy, Ranjitha Prasad, and Sai Subramanyam Thoota
  1. Introduction
     1.1 Quick summary of existing methods for sparse signal recovery
     1.2 Bayesian approaches: Motivation and related literature
  2. The hierarchical Bayesian framework
     2.1 Gaussian scale mixtures and sparse Bayesian learning
     2.2 SBL framework
     2.3 Case study: Wireless channel estimation and SBL
  3. Joint-sparse signal recovery
     3.1 The MSBL algorithm
     3.2 Expectation maximization in MSBL
     3.3 An interesting interpretation of the MSBL cost function
     3.4 A covariance-matching framework for sparse support recovery using MMVs
     3.5 Examples of covariance-matching algorithms for sparse support recovery
  4. Exploiting intervector correlation
     4.1 Intervector correlation: The Kalman SBL algorithm
     4.2 Online sparse vector recovery
     4.3 Case study (continued): Wireless channel estimation and the KSBL algorithm
  5. Intravector correlations: The nested SBL algorithm
     5.1 Nested SBL (B ≠ IB)
  6. Quantized sparse signal recovery
  7. Other extensions
     7.1 Decentralized SBL
     7.2 Dictionary learning
     7.3 Relationship with robust principal component analysis and sparse + low-rank decomposition
     7.4 Deep unfolded SBL
  8. Discussion and future outlook
  References

9. Mathematical theory of Bayesian statistics where all models are wrong
Sumio Watanabe
  1. Introduction
  2. Mathematical theory of Bayesian statistics
     2.1 DGP, model, and prior
     2.2 Generalization loss and free energy
     2.3 Regular theory
     2.4 Singular theory
     2.5 Phase transitions
  3. Applications to statistics and machine learning
     3.1 Model evaluation
     3.2 Prior evaluation
     3.3 Not i.i.d. cases
  4. Conclusion
  References

10. Geometry in sampling methods: A review on manifold MCMC and particle-based variational inference methods
Chang Liu and Jun Zhu
  1. Geometry consideration in sampling: Why bother?
  2. Manifold and related concepts
     2.1 Manifold
     2.2 Tangent vector and vector field
     2.3 Cotangent vector and differential form
     2.4 Riemannian manifold
     2.5 Measure
     2.6 Divergence and Laplacian
     2.7 Manifold embedding
  3. Markov chain Monte Carlo on Riemannian manifolds
     3.1 Technical description of general MCMC dynamics
     3.2 Riemannian MCMC in coordinate space
     3.3 Riemannian MCMC in embedded space
  4. Particle-based variational inference methods
     4.1 Stein variational gradient descent
     4.2 The Wasserstein space
     4.3 Geometric view of particle-based variational inference methods
     4.4 Geometric view of MCMC dynamics and relation to ParVI methods
     4.5 Variants and techniques inspired by the geometric view
  5. Conclusion
  Acknowledgments
  References

Index
