computer science/statistics

INTRODUCTION TO STATISTICAL RELATIONAL LEARNING
EDITED BY LISE GETOOR AND BEN TASKAR

Handling inherent uncertainty and exploiting compositional structure are fundamental to understanding and designing large-scale systems. Statistical relational learning builds on ideas from probability theory and statistics to address uncertainty while incorporating tools from logic, databases, and programming languages to represent structure. In Introduction to Statistical Relational Learning, leading researchers in this emerging area of machine learning describe current formalisms, models, and algorithms that enable effective and robust reasoning about richly structured systems and data.

The early chapters provide tutorials for material used in later chapters, offering introductions to representation, inference, and learning in graphical models and logic. The book then describes object-oriented approaches, including probabilistic relational models, relational Markov networks, and probabilistic entity-relationship models, as well as logic-based formalisms, including Bayesian logic programs, Markov logic, and stochastic logic programs. Later chapters discuss such topics as probabilistic models with unknown objects, relational dependency networks, reinforcement learning in relational domains, and information extraction.

By presenting a variety of approaches, the book highlights commonalities and clarifies important differences among proposed approaches and, along the way, identifies important representational and algorithmic issues. Numerous applications are provided throughout.

Lise Getoor is Assistant Professor in the Department of Computer Science at the University of Maryland. Ben Taskar is Assistant Professor in the Computer and Information Science Department at the University of Pennsylvania.

Adaptive Computation and Machine Learning series

Of related interest

GAUSSIAN PROCESSES FOR MACHINE LEARNING
Carl Edward Rasmussen and Christopher K. I. Williams

Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics.

SEMI-SUPERVISED LEARNING
edited by Olivier Chapelle, Bernhard Schölkopf, and Alexander Zien

In the field of machine learning, semi-supervised learning (SSL) occupies the middle ground between supervised learning (in which all training examples are labeled) and unsupervised learning (in which no label data are given). Interest in SSL has increased in recent years, particularly because of application domains in which unlabeled data are plentiful, such as images, text, and bioinformatics. This first comprehensive overview of SSL presents state-of-the-art algorithms, a taxonomy of the field, selected applications, benchmark experiments, and perspectives on ongoing and future research.

The MIT Press
Massachusetts Institute of Technology
Cambridge, Massachusetts 02142
http://mitpress.mit.edu

ISBN 978-0-262-07288-5 / 0-262-07288-2

Cover design based on an animation by MAXAM, http://www.maxamania.com

Introduction to Statistical Relational Learning

Adaptive Computation and Machine Learning
Thomas Dietterich, Editor
Christopher M. Bishop, David Heckerman, Michael I. Jordan, and Michael Kearns, Associate Editors

Bioinformatics: The Machine Learning Approach, Pierre Baldi and Søren Brunak, 1998
Reinforcement Learning: An Introduction, Richard S. Sutton and Andrew G. Barto, 1998
Graphical Models for Machine Learning and Digital Communication, Brendan J. Frey, 1998
Learning in Graphical Models, Michael I. Jordan, ed., 1998
Causation, Prediction, and Search, 2nd Edition, Peter Spirtes, Clark Glymour, and Richard Scheines, 2001
Principles of Data Mining, David Hand, Heikki Mannila, and Padhraic Smyth, 2001
Bioinformatics: The Machine Learning Approach, 2nd Edition, Pierre Baldi and Søren Brunak, 2001
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, Bernhard Schölkopf and Alexander J. Smola, 2001
Learning Kernel Classifiers: Theory and Algorithms, Ralf Herbrich, 2001
Introduction to Machine Learning, Ethem Alpaydin, 2004
Gaussian Processes for Machine Learning, Carl Edward Rasmussen and Christopher K. I. Williams, 2005
Semi-Supervised Learning, Olivier Chapelle, Bernhard Schölkopf, and Alexander Zien, eds., 2006
The Minimum Description Length Principle, Peter D. Grünwald, 2007
Introduction to Statistical Relational Learning, Lise Getoor and Ben Taskar, eds., 2007

Introduction to Statistical Relational Learning
edited by Lise Getoor and Ben Taskar

The MIT Press
Cambridge, Massachusetts
London, England

© 2007 Massachusetts Institute of Technology

All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.

Typeset by the authors using LaTeX 2ε
Printed and bound in the United States of America

Library of Congress Cataloging-in-Publication Data

Introduction to statistical relational learning / edited by Lise Getoor, Ben Taskar.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-262-07288-5 (hardcover : alk. paper)
1. Relational databases. 2. Machine learning--Statistical methods. 3. Computer algorithms. I. Getoor, Lise. II. Taskar, Ben.
QA76.9.D3 I68 2007
006.3'1--dc22
2007000951

10 9 8 7 6 5 4 3 2 1

Contents

Series Foreword xi
Preface xiii

1 Introduction 1
  Lise Getoor, Ben Taskar
  1.1 Overview 1
  1.2 Brief History of Relational Learning 2
  1.3 Emerging Trends 3
  1.4 Statistical Relational Learning 3
  1.5 Chapter Map 5
  1.6 Outlook 8

2 Graphical Models in a Nutshell 13
  Daphne Koller, Nir Friedman, Lise Getoor, Ben Taskar
  2.1 Introduction 13
  2.2 Representation 14
  2.3 Inference 22
  2.4 Learning 42
  2.5 Conclusion 54

3 Inductive Logic Programming in a Nutshell 57
  Sašo Džeroski
  3.1 Introduction 57
  3.2 Logic Programming 58
  3.3 Inductive Logic Programming: Settings and Approaches 64
  3.4 Relational Classification Rules 71
  3.5 Relational Decision Trees 75
  3.6 Relational Association Rules 80
  3.7 Relational Distance-Based Methods 84
  3.8 Recent Trends in ILP and RDM 89

4 An Introduction to Conditional Random Fields for Relational Learning 93
  Charles Sutton, Andrew McCallum
  4.1 Introduction 93
  4.2 Graphical Models 94
  4.3 Linear-Chain Conditional Random Fields 100
  4.4 CRFs in General 108
  4.5 Skip-Chain CRFs 116
  4.6 Conclusion 122

5 Probabilistic Relational Models 129
  Lise Getoor, Nir Friedman, Daphne Koller, Avi Pfeffer, Ben Taskar
  5.1 Introduction 129
  5.2 PRM Representation 130
  5.3 The Difference between PRMs and Bayesian Networks 140
  5.4 PRMs with Structural Uncertainty 141
  5.5 Probabilistic Model of Link Structure 141
  5.6 PRMs with Class Hierarchies 151
  5.7 Inference in PRMs 159
  5.8 Learning 161
  5.9 Conclusion 173

6 Relational Markov Networks 175
  Ben Taskar, Pieter Abbeel, Ming-Fai Wong, Daphne Koller
  6.1 Introduction 175
  6.2 Relational Classification and Link Prediction 177
  6.3 Graph Structure and Subgraph Templates 178
  6.4 Undirected Models for Classification 180
  6.5 Learning the Models 184
  6.6 Experimental Results 187
  6.7 Discussion and Conclusions 197

7 Probabilistic Entity-Relationship Models, PRMs, and Plate Models 201
  David Heckerman, Chris Meek, Daphne Koller
  7.1 Introduction 201
  7.2 Background: Graphical Models 202
  7.3 The Basic Ideas 204
  7.4 Probabilistic Entity-Relationship Models 210
  7.5 Plate Models 226
  7.6 Probabilistic Relational Models 228
  7.7 Technical Details 229
  7.8 Extensions and Future Work 233

8 Relational Dependency Networks 239
  Jennifer Neville, David Jensen
  8.1 Introduction 239
  8.2 Dependency Networks 242
  8.3 Relational Dependency Networks 243
  8.4 Experiments 252
  8.5 Related Work 262
  8.6 Discussion and Future Work 264

9 Logic-based Formalisms for Statistical Relational Learning 269
  James Cussens
  9.1 Introduction 269
  9.2 Representation 271
  9.3 Inference 278
  9.4 Learning 281
  9.5 Conclusion 287

10 Bayesian Logic Programming: Theory and Tool 291
  Kristian Kersting, Luc De Raedt
  10.1 Introduction 291
  10.2 On Bayesian Networks and Logic Programs 293
  10.3 Bayesian Logic Programs 296
  10.4 Extensions of the Basic Framework 304
  10.5 Learning Bayesian Logic Programs 311
  10.6 Balios -- The Engine for Bayesian Logic Programs 315
  10.7 Related Work 315
  10.8 Conclusions 318

11 Stochastic Logic Programs: A Tutorial 323
  Stephen Muggleton, Niels Pahlavi
  11.1 Introduction 323
  11.2 Mixing Deterministic and Probabilistic Choice 324
  11.3 Stochastic Grammars 330
  11.4 Stochastic Logic Programs 333
  11.5 Learning Techniques 335
  11.6 Conclusion 337

12 Markov Logic: A Unifying Framework for Statistical Relational Learning 339
  Pedro Domingos, Matthew Richardson
  12.1 The Need for a Unifying Framework 339
  12.2 Markov Networks 341
  12.3 First-Order Logic 342
  12.4 Markov Logic 344
  12.5 SRL Approaches 350
  12.6 SRL Tasks 354
  12.7 Inference 356
  12.8 Learning 358
  12.9 Experiments 360
  12.10 Conclusion 367

13 BLOG: Probabilistic Models with Unknown Objects 373
  Brian Milch, Bhaskara Marthi, Stuart Russell, David Sontag, Daniel L. Ong, Andrey Kolobov
  13.1 Introduction 373
  13.2 Examples 375
  13.3 Syntax and Semantics: Possible Worlds 378
  13.4 Syntax and Semantics: Probabilities 383
  13.5 Evidence and Queries 388
  13.6 Inference 388
  13.7 Related Work 393
  13.8 Conclusions and Future Work 394

14 The Design and Implementation of IBAL: A General-Purpose Probabilistic Language 399
  Avi Pfeffer
  14.1 Introduction 399
  14.2 The IBAL Language 401
  14.3 Examples 407
  14.4 Semantics 411
  14.5 Desiderata for Inference 415
  14.6 Related Approaches 416
  14.7 Inference 419
  14.8 Lessons Learned and Conclusion 429

15 Lifted First-Order Probabilistic Inference 433
  Rodrigo de Salvo Braz, Eyal Amir, Dan Roth
  15.1 Introduction 433
  15.2 Language, Semantics, and Inference Problem 435
  15.3 The First-Order Variable Elimination (FOVE) Algorithm 437
  15.4 An Experiment 444
  15.5 Auxiliary Operations 446
  15.6 Applicability of Lifted Inference 448
  15.7 Future Directions 449
  15.8 Conclusion 449

16 Feature Generation and Selection in Multi-Relational Statistical Learning 453
  Alexandrin Popescul, Lyle H. Ungar
  16.1 Introduction 453
  16.2 Detailed Methodology 458
  16.3 Experimental Evaluation 463
  16.4 Related Work and Discussion 471
  16.5 Conclusion 472

17 Learning a New View of a Database: With an Application in Mammography 477
  Jesse Davis, Elizabeth Burnside, Inês Dutra, David Page, Raghu Ramakrishnan, Jude Shavlik, Vítor Santos Costa
  17.1 Introduction 477
  17.2 View Learning for Mammography 478
  17.3 Naive View Learning Framework 482
  17.4 Initial Experiments 483
  17.5 Integrated View Learning Framework 490
  17.6 Further Experiments and Results 491
  17.7 Related Work 493
  17.8 Conclusions and Future Work 494

18 Reinforcement Learning in Relational Domains: A Policy-Language Approach 499
  Alan Fern, SungWook Yoon, Robert Givan
  18.1 Introduction 499
  18.2 Problem Setup 502
  18.3 Approximate Policy Iteration with a Policy Language Bias 503
  18.4 API for Relational Planning 507
  18.5 Bootstrapping 516
  18.6 Relational Planning Experiments 520
  18.7 Related Work 527
  18.8 Summary and Future Work 530

19 Statistical Relational Learning for Natural Language Information Extraction 535
  Razvan C. Bunescu, Raymond J. Mooney
  19.1 Introduction 535
  19.2 Background on Natural Language Processing 536
  19.3 Information Extraction 537
  19.4 Collective Information Extraction with RMNs 538
  19.5 Future Research on SRL for NLP 549
  19.6 Conclusions 550

20 Global Inference for Entity and Relation Identification via a Linear Programming Formulation 553
  Dan Roth, Wen-tau Yih
  20.1 Introduction 553
  20.2 The Relational Inference Problem 556
  20.3 Integer Linear Programming Inference 560
  20.4 Solving Integer Linear Programming 562
  20.5 Experiments 563
  20.6 Comparison with Other Inference Methods 570
  20.7 Conclusion 576

Contributors 581
Index 587