Beyond the Worst-Case Analysis of Algorithms PDF

706 Pages·2021·12.67 MB·English
Preview Beyond the Worst-Case Analysis of Algorithms

Beyond the Worst-Case Analysis of Algorithms

There are no silver bullets in algorithm design, and no single algorithmic idea is powerful and flexible enough to solve every computational problem. Nor are there silver bullets in algorithm analysis, as the most enlightening method for analyzing an algorithm often depends on the problem and the application. However, typical algorithms courses rely almost entirely on a single analysis framework, that of worst-case analysis, wherein an algorithm is assessed by its worst performance on any input of a given size.

The purpose of this book is to popularize several alternatives to worst-case analysis and their most notable algorithmic applications, from clustering to linear programming to neural network training. Forty leading researchers have contributed introductions to different facets of this field, emphasizing the most important models and results, many of which can be taught in lectures to beginning graduate students in theoretical computer science and machine learning.

Tim Roughgarden is a professor of computer science at Columbia University. For his research, he has been awarded the ACM Grace Murray Hopper Award, the Presidential Early Career Award for Scientists and Engineers (PECASE), the Kalai Prize in Computer Science and Game Theory, the Social Choice and Welfare Prize, the Mathematical Programming Society's Tucker Prize, and the EATCS-SIGACT Gödel Prize. He was an invited speaker at the 2006 International Congress of Mathematicians, the Shapley Lecturer at the 2008 World Congress of the Game Theory Society, and a Guggenheim Fellow in 2017. His other books include Twenty Lectures on Algorithmic Game Theory (2016) and the Algorithms Illuminated book series (2017–2020).
Beyond the Worst-Case Analysis of Algorithms
Edited by Tim Roughgarden, Columbia University, New York

University Printing House, Cambridge CB2 8BS, United Kingdom
One Liberty Plaza, 20th Floor, New York, NY 10006, USA
477 Williamstown Road, Port Melbourne, VIC 3207, Australia
314–321, 3rd Floor, Plot 3, Splendor Forum, Jasola District Centre, New Delhi – 110025, India
79 Anson Road, #06–04/06, Singapore 079906

Cambridge University Press is part of the University of Cambridge. It furthers the University's mission by disseminating knowledge in the pursuit of education, learning, and research at the highest international levels of excellence.

www.cambridge.org
Information on this title: www.cambridge.org/9781108494311
DOI: 10.1017/9781108637435

© Cambridge University Press 2021

This publication is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published 2021
Printed in the United Kingdom by TJ Books Limited, Padstow, Cornwall

A catalogue record for this publication is available from the British Library.
ISBN 978-1-108-49431-1 Hardback

Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.
Contents

Preface
List of Contributors

1 Introduction (Tim Roughgarden)
1.1 The Worst-Case Analysis of Algorithms
1.2 Famous Failures and the Need for Alternatives
1.3 Example: Parameterized Bounds in Online Paging
1.4 Overview of the Book
1.5 Notes

PART ONE: REFINEMENTS OF WORST-CASE ANALYSIS

2 Parameterized Algorithms (Fedor V. Fomin, Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi)
2.1 Introduction
2.2 Randomization
2.3 Structural Parameterizations
2.4 Kernelization
2.5 Hardness and Optimality
2.6 Outlook: New Paradigms and Application Domains
2.7 The Big Picture
2.8 Notes

3 From Adaptive Analysis to Instance Optimality (Jérémy Barbay)
3.1 Case Study 1: Maxima Sets
3.2 Case Study 2: Instance-Optimal Aggregation Algorithms
3.3 Survey of Additional Results and Techniques
3.4 Discussion
3.5 Selected Open Problems
3.6 Key Takeaways
3.7 Notes

4 Resource Augmentation (Tim Roughgarden)
4.1 Online Paging Revisited
4.2 Discussion
4.3 Selfish Routing
4.4 Speed Scaling in Scheduling
4.5 Loosely Competitive Algorithms
4.6 Notes

PART TWO: DETERMINISTIC MODELS OF DATA

5 Perturbation Resilience (Konstantin Makarychev and Yury Makarychev)
5.1 Introduction
5.2 Combinatorial Optimization Problems
5.3 Designing Certified Algorithms
5.4 Examples of Certified Algorithms
5.5 Perturbation-Resilient Clustering Problems
5.6 Algorithm for 2-Perturbation-Resilient Instances
5.7 (3+ε)-Certified Local Search Algorithm for k-Medians
5.8 Notes

6 Approximation Stability and Proxy Objectives (Avrim Blum)
6.1 Introduction and Motivation
6.2 Definitions and Discussion
6.3 The k-Median Problem
6.4 k-Means, Min-Sum, and Other Clustering Objectives
6.5 Clustering Applications
6.6 Nash Equilibria
6.7 The Big Picture
6.8 Open Questions
6.9 Relaxations
6.10 Notes

7 Sparse Recovery (Eric Price)
7.1 Sparse Recovery
7.2 A Simple Insertion-Only Streaming Algorithm
7.3 Handling Deletions: Linear Sketching Algorithms
7.4 Uniform Algorithms
7.5 Lower Bound
7.6 Different Measurement Models
7.7 Matrix Recovery
7.8 Notes

PART THREE: SEMIRANDOM MODELS

8 Distributional Analysis (Tim Roughgarden)
8.1 Introduction
8.2 Average-Case Justifications of Classical Algorithms
8.3 Good-on-Average Algorithms for Euclidean Problems
8.4 Random Graphs and Planted Models
8.5 Robust Distributional Analysis
8.6 Notes

9 Introduction to Semirandom Models (Uriel Feige)
9.1 Introduction
9.2 Why Study Semirandom Models?
9.3 Some Representative Work
9.4 Open Problems

10 Semirandom Stochastic Block Models (Ankur Moitra)
10.1 Introduction
10.2 Recovery via Semidefinite Programming
10.3 Robustness Against a Monotone Adversary
10.4 Information Theoretic Limits of Exact Recovery
10.5 Partial Recovery and Belief Propagation
10.6 Random versus Semirandom Separations
10.7 Above Average-Case Analysis
10.8 Semirandom Mixture Models

11 Random-Order Models (Anupam Gupta and Sahil Singla)
11.1 Motivation: Picking a Large Element
11.2 The Secretary Problem
11.3 Multiple-Secretary and Other Maximization Problems
11.4 Minimization Problems
11.5 Related Models and Extensions
11.6 Notes

12 Self-Improving Algorithms (C. Seshadhri)
12.1 Introduction
12.2 Information Theory Basics
12.3 The Self-Improving Sorter
12.4 Self-Improving Algorithms for 2D Maxima
12.5 More Self-Improving Algorithms
12.6 Critique of the Self-Improving Model

PART FOUR: SMOOTHED ANALYSIS

13 Smoothed Analysis of Local Search (Bodo Manthey)
13.1 Introduction
13.2 Smoothed Analysis of the Running Time
13.3 Smoothed Analysis of the Approximation Ratio
13.4 Discussion and Open Problems
13.5 Notes

14 Smoothed Analysis of the Simplex Method (Daniel Dadush and Sophie Huiberts)
14.1 Introduction
14.2 The Shadow Vertex Simplex Method
14.3 The Successive Shortest Path Algorithm
14.4 LPs with Gaussian Constraints
14.5 Discussion
14.6 Notes

15 Smoothed Analysis of Pareto Curves in Multiobjective Optimization (Heiko Röglin)
15.1 Algorithms for Computing Pareto Curves
15.2 Number of Pareto-optimal Solutions
15.3 Smoothed Complexity of Binary Optimization Problems
15.4 Conclusions
15.5 Notes

PART FIVE: APPLICATIONS IN MACHINE LEARNING AND STATISTICS

16 Noise in Classification (Maria-Florina Balcan and Nika Haghtalab)
16.1 Introduction
16.2 Model
16.3 The Best Case and the Worst Case
16.4 Benefits of Assumptions on the Marginal Distribution
16.5 Benefits of Assumptions on the Noise
16.6 Final Remarks and Current Research Directions

17 Robust High-Dimensional Statistics (Ilias Diakonikolas and Daniel M. Kane)
17.1 Introduction
17.2 Robust Mean Estimation
17.3 Beyond Robust Mean Estimation
17.4 Notes
