
Discrete Probability Models and Methods: Probability on Graphs and Trees, Markov Chains and Random Fields, Entropy and Coding PDF

561 pages · 2017 · 2.76 MB · English

Preview Discrete Probability Models and Methods: Probability on Graphs and Trees, Markov Chains and Random Fields, Entropy and Coding

Probability Theory and Stochastic Modelling, Volume 78

Pierre Brémaud
Discrete Probability Models and Methods: Probability on Graphs and Trees, Markov Chains and Random Fields, Entropy and Coding

Editors-in-chief: Søren Asmussen, Aarhus, Denmark; Peter W. Glynn, Stanford, CA, USA; Yves Le Jan, Orsay, France

Advisory Board: Martin Hairer, Coventry, UK; Peter Jagers, Gothenburg, Sweden; Ioannis Karatzas, New York, NY, USA; Frank P. Kelly, Cambridge, UK; Andreas E. Kyprianou, Bath, UK; Bernt Øksendal, Oslo, Norway; George Papanicolaou, Stanford, CA, USA; Etienne Pardoux, Marseille, France; Edwin Perkins, Vancouver, BC, Canada; Halil Mete Soner, Zürich, Switzerland

The Probability Theory and Stochastic Modelling series is a merger and continuation of Springer's two well-established series Stochastic Modelling and Applied Probability and Probability and Its Applications. It publishes research monographs that make a significant contribution to probability theory or to an applications domain in which advanced probability methods are fundamental. Books in this series are expected to follow rigorous mathematical standards, while also displaying the expository quality necessary to make them useful and accessible to advanced students as well as researchers. The series covers all aspects of modern probability theory, including:

- Gaussian processes
- Markov processes
- Random fields, point processes and random sets
- Random matrices
- Statistical mechanics and random media
- Stochastic analysis

as well as applications that include (but are not restricted to):

- Branching processes and other models of population growth
- Communications and processing networks
- Computational methods in probability and stochastic processes, including simulation
- Genetics and other stochastic models in biology and the life sciences
- Information theory, signal processing, and image synthesis
- Mathematical economics and finance
- Statistical methods (e.g. empirical processes, MCMC)
- Statistics for stochastic processes
- Stochastic control
- Stochastic models in operations research and stochastic optimization
- Stochastic models in the physical sciences

More information about this series at http://www.springer.com/series/13205

Pierre Brémaud
École Polytechnique Fédérale de Lausanne (EPFL)
Lausanne, Switzerland

ISSN 2199-3130 / ISSN 2199-3149 (electronic)
Probability Theory and Stochastic Modelling
ISBN 978-3-319-43475-9 / ISBN 978-3-319-43476-6 (eBook)
DOI 10.1007/978-3-319-43476-6
Library of Congress Control Number: 2016962040
Mathematics Subject Classification (2010): 60J10, 68Q87, 68W20, 68W40, 05C80, 05C81, 60G60, 60G42, 60C05, 60K05, 60J80, 60K15

© Springer International Publishing Switzerland 2017

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Printed on acid-free paper. This Springer imprint is published by Springer Nature. The registered company is Springer International Publishing AG; registered company address: Gewerbestrasse 11, 6330 Cham, Switzerland.

Pour Marion

Contents

Introduction

1 Events and Probability
  1.1 Events
    1.1.1 The Sample Space
    1.1.2 The Language of Probabilists
    1.1.3 The Sigma-field of Events
  1.2 Probability
    1.2.1 The Axioms
    1.2.2 The Borel–Cantelli Lemma
  1.3 Independence and Conditioning
    1.3.1 Independent Events
    1.3.2 Conditional Probability
    1.3.3 The Bayes Calculus
    1.3.4 Conditional Independence
  1.4 Exercises

2 Random Variables
  2.1 Probability Distribution and Expectation
    2.1.1 Random Variables and their Distributions
    2.1.2 Independent Random Variables
    2.1.3 Expectation
    2.1.4 Famous Distributions
  2.2 Generating Functions
    2.2.1 Definition and Properties
    2.2.2 Random Sums
    2.2.3 Counting with Generating Functions
  2.3 Conditional Expectation
    2.3.1 Conditioning with Respect to an Event
    2.3.2 Conditioning with Respect to a Random Variable
    2.3.3 Basic Properties of Conditional Expectation
  2.4 Exercises

3 Bounds and Inequalities
  3.1 The Three Basic Inequalities
    3.1.1 Markov's Inequality
    3.1.2 Jensen's Inequality
    3.1.3 Schwarz's Inequality
  3.2 Frequently Used Bounds
    3.2.1 The Union Bound
    3.2.2 The Chernoff Bounds
    3.2.3 The First- and Second-moment Bounds
  3.3 Exercises

4 Almost Sure Convergence
  4.1 Conditions for Almost Sure Convergence
    4.1.1 A Sufficient Condition
    4.1.2 A Criterion
    4.1.3 Convergence under the Expectation Sign
  4.2 Kolmogorov's Strong Law of Large Numbers
    4.2.1 The Square-integrable Case
    4.2.2 The General Case
  4.3 Exercises

5 The Probabilistic Method
  5.1 Proving Existence
    5.1.1 The Counting Argument
    5.1.2 The Expectation Argument
    5.1.3 Lovász's Local Lemma
  5.2 Random Algorithms
    5.2.1 Las Vegas Algorithms
    5.2.2 Monte Carlo Algorithms
  5.3 Exercises

6 Markov Chain Models
  6.1 The Transition Matrix
    6.1.1 Distribution of a Markov Chain
    6.1.2 Sample Path Realization
    6.1.3 Communication and Period
  6.2 Stationary Distribution and Reversibility
    6.2.1 The Global Balance Equation
    6.2.2 Reversibility and Detailed Balance
  6.3 Finite State Space
    6.3.1 Perron–Frobenius
    6.3.2 The Limit Distribution
    6.3.3 Spectral Densities
  6.4 Exercises

7 Recurrence of Markov Chains
  7.1 Recurrent and Transient States
    7.1.1 The Strong Markov Property
    7.1.2 The Potential Matrix Criterion of Recurrence
  7.2 Positive Recurrence
    7.2.1 The Stationary Distribution Criterion
    7.2.2 The Ergodic Theorem
  7.3 The Lyapunov Function Method
    7.3.1 Foster's Condition of Positive Recurrence
    7.3.2 Queueing Applications
  7.4 Fundamental Matrix
    7.4.1 Definition
    7.4.2 Travel Times
    7.4.3 Hitting Times Formula
  7.5 Exercises

8 Random Walks on Graphs
  8.1 Pure Random Walks
    8.1.1 The Symmetric Random Walks on ℤ and ℤ³
    8.1.2 Pure Random Walk on a Graph
    8.1.3 Spanning Trees and Cover Times
  8.2 Symmetric Walks on a Graph
    8.2.1 Reversible Chains as Symmetric Walks
    8.2.2 The Electrical Network Analogy
  8.3 Effective Resistance and Escape Probability
    8.3.1 Computation of the Effective Resistance
    8.3.2 Thompson's and Rayleigh's Principles
    8.3.3 Infinite Networks
  8.4 Exercises

9 Markov Fields on Graphs
  9.1 Gibbs–Markov Equivalence
    9.1.1 Local Characteristics
    9.1.2 Gibbs Distributions
    9.1.3 Specific Models
  9.2 Phase Transition in the Ising Model
    9.2.1 Experimental Results
    9.2.2 Peierls' Argument
  9.3 Correlation in Random Fields
    9.3.1 Increasing Events
    9.3.2 Holley's Inequality
    9.3.3 The Potts and Fortuin–Kasteleyn Models
  9.4 Exercises

10 Random Graphs
  10.1 Branching Trees
    10.1.1 Extinction and Survival
    10.1.2 Tail Distributions
  10.2 The Erdős–Rényi Graph
    10.2.1 Asymptotically Almost Sure Properties
    10.2.2 The Evolution of Connectivity
    10.2.3 The Giant Component
  10.3 Percolation
    10.3.1 The Basic Model
    10.3.2 The Percolation Threshold
  10.4 Exercises

11 Coding Trees
  11.1 Entropy
    11.1.1 The Gibbs Inequality
    11.1.2 Typical Sequences
    11.1.3 Uniquely Decipherable Codes
  11.2 Three Statistics-Dependent Codes
    11.2.1 The Huffman Code
    11.2.2 The Shannon–Fano–Elias Code
    11.2.3 The Tunstall Code
  11.3 Discrete Distributions and Fair Coins
    11.3.1 Representation of Discrete Distributions by Trees
    11.3.2 The Knuth–Yao Tree Algorithm
    11.3.3 Extraction Functions
  11.4 Exercises

12 Shannon's Capacity Theorem
  12.1 More Information-theoretic Quantities
    12.1.1 Conditional Entropy
    12.1.2 Mutual Information
    12.1.3 Capacity of Noisy Channels
  12.2 Shannon's Capacity Theorem
    12.2.1 Rate versus Accuracy
    12.2.2 The Random Coding Argument
    12.2.3 Proof of the Converse
    12.2.4 Feedback Does Not Improve Capacity
  12.3 Exercises

13 The Method of Types
  13.1 Divergence and Types
    13.1.1 Divergence
    13.1.2 Empirical Averages
  13.2 Sanov's Theorem
    13.2.1 A Theorem on Large Deviations
    13.2.2 Computation of the Rate of Convergence
    13.2.3 The Maximum Entropy Principle
  13.3 Exercises

14 Universal Source Coding
  14.1 Type Encoding
    14.1.1 A First Example
    14.1.2 Source Coding via Typical Sequences
  14.2 The Lempel–Ziv Algorithm
    14.2.1 Description
    14.2.2 Parsings
    14.2.3 Optimality of the Lempel–Ziv Algorithm
    14.2.4 Lempel–Ziv Measures Entropy
  14.3 Exercises

15 Asymptotic Behaviour of Markov Chains
  15.1 Limit Distribution
    15.1.1 Countable State Space
    15.1.2 Absorption
    15.1.3 Variance of Ergodic Estimates
  15.2 Non-homogeneous Markov Chains
    15.2.1 Dobrushin's Ergodic Coefficient
    15.2.2 Ergodicity of Non-homogeneous Markov Chains
    15.2.3 Bounded Variation Extensions
  15.3 Exercises
