
Self-Adjusting Computation PDF

299 Pages·2012·1.6 MB·English


Self-Adjusting Computation

Umut A. Acar
May 2005
CMU-CS-05-129

School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213

Thesis Committee:
Guy Blelloch, co-chair
Robert Harper, co-chair
Daniel Dominic Kaplan Sleator
Simon Peyton Jones, Microsoft Research, Cambridge, UK
Robert Endre Tarjan, Princeton University

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy

© 2005 Umut A. Acar

This research was sponsored in part by the National Science Foundation under grants CCR-0085982, CCR-0122581, and EIA-9706572, and the Department of Energy under contract no. DE-FG02-91ER40682. The views and conclusions contained in this document are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of any sponsoring institution, the U.S. government, or any other entity.

Keywords: Self-adjusting computation, dynamic algorithms, dynamic data structures, kinetic data structures, dynamic dependence graphs, memoization, change propagation, trace stability, functional programming, lambda calculus, type systems, operational semantics, modifiable references, selective memoization, sorting, convex hulls, parallel tree contraction, dynamic trees, rake-and-compress trees.

Abstract

This thesis investigates a model of computation, called self-adjusting computation, where computations adjust to any external change to their data (state) automatically. The external changes can change any data (e.g., the input) or decisions made during the computation. For example, a self-adjusting program can compute a property of a dynamically changing set of objects, or of a set of moving objects. This thesis presents algorithmic and programming-language techniques for devising, analyzing, and implementing self-adjusting programs. From the algorithmic perspective, we describe novel data structures for tracking the dependences in a computation and a change-propagation algorithm for adjusting computations to changes.
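To give a flavor of the idea, here is a minimal sketch in Python (the thesis's actual library is written in SML, and its data structures are far more refined): a hypothetical `Mod` class plays the role of a modifiable reference, recording which computations read it so that a write can propagate the change by re-running exactly those readers.

```python
# Toy sketch of modifiable references and change propagation.
# `Mod`, `read`, and `write` are illustrative names, not the thesis's API.

class Mod:
    """A modifiable reference: a cell whose readers are tracked."""
    def __init__(self, value):
        self.value = value
        self.readers = []          # closures to re-run when the value changes

    def read(self, use):
        """Apply `use` to the current value and record the dependence."""
        self.readers.append(use)
        use(self.value)

    def write(self, value):
        """Change the value and propagate: re-run every recorded reader."""
        if value != self.value:
            self.value = value
            for use in list(self.readers):
                use(self.value)

# Example: an output cell that always holds double the input.
inp = Mod(3)
out = Mod(None)
inp.read(lambda v: out.write(2 * v))
assert out.value == 6
inp.write(5)                       # change propagation updates `out`
assert out.value == 10
```

This sketch re-runs readers eagerly and in arbitrary order; the thesis's change-propagation algorithm instead uses dynamic dependence graphs, a virtual clock, and a priority queue to re-execute affected reads in the right order and at O(1) tracking overhead.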
We show that the overhead of our dependence-tracking techniques is O(1). To determine the effectiveness of change propagation, we present an analysis technique, called trace stability, and apply it to a number of applications. From the languages perspective, we describe language facilities for writing self-adjusting programs in a type-safe and correct manner. The techniques make writing self-adjusting programs nearly as easy as ordinary (non-self-adjusting) programs. A key property of the techniques is that they enable the programmer to control the cost of dependence tracking by applying it selectively. Using language techniques, we also formalize the change-propagation algorithm and prove that it is correct.

We demonstrate that our techniques are efficient both in theory and in practice by considering a number of applications. Our applications include a random-sampling algorithm on lists, the quicksort and mergesort algorithms, Graham's Scan and the quick-hull algorithm for planar convex hulls, and the tree-contraction algorithm. From the theoretical perspective, we apply trace stability to our applications and show complexity bounds that are within an expected constant factor of the best bounds achieved by special-purpose algorithms. From the practical perspective, we implement a general-purpose library for writing self-adjusting programs, and implement and evaluate self-adjusting versions of our applications. Our experiments show that our techniques dramatically simplify writing self-adjusting programs, and can yield very good performance even when compared to special-purpose algorithms, both in theory and in practice.

For my parents Dürdane and İsmail Acar

Acknowledgments

This thesis would not have been possible without the support and encouragement of my advisors Guy Blelloch and Robert Harper. Many of the results in this thesis have come out of long conversations and meetings with Bob and Guy. I thank you both very much.

I thank Simon Peyton Jones and Bob Tarjan for supporting my research endeavours and for giving me feedback on the thesis.
Simon's feedback was critical in tying the different parts of the thesis together. Bob's feedback helped simplify the first three parts of the thesis and started us thinking about some interesting questions.

My colleague Maverick Woo helped with the proofs in the third part of the thesis, especially on the stability of tree contraction. Maverick also uncovered a body of related work that we were not aware of.

I thank Renato Werneck (of Princeton) for giving us his code for the Link-Cut trees, and for many discussions about dynamic-trees data structures. I thank Jernej Barbic for his help with the graphics library for visualizing kinetic convex hulls.

I have had the chance to work with bright young researchers at CMU. Jorge Vittes (now at Stanford) did a lot of the hard work in implementing our libraries for kinetic data structures and for dynamic trees. Kanat Tangwongsan helped implement key parts of our SML library and helped with the experiments. I thank Jorge and Kanat for their enthusiasm and dedication.

Matthias Blume (of Toyota Technological Institute) helped with the implementation of the SML library. Matthias and John Reppy (of the University of Chicago) helped in figuring out the intricacies of the SML/NJ garbage collector.

Many colleagues and friends from CMU enriched my years in graduate school. I thank them for their friendship: Konstantin Andreev, Jernej Barbic, Paul Bennett, Mihai Budiu, Armand Debruge, Terry Derosia, Derek Dreyer, Jun Gao, Mor Harchol-Balter, Stavros Harizopoulos, Rose Hoberman, Laurie Hiyakumoto, Yiannis Koutis, Gary Miller, Aleks Nanevski, Alina Oprea, Florin Oprea, Lina Papadaki, Spiros Papadimitriou, Sanjay Rao, Daniel Spoonhower, Srinath Sridhar, Mike Vandeweghe, Virginia Vassilevska, Maverick Woo, Shuheng Zhou.

Outside CMU, I thank Yasemin Altun (of TTI), Görkem Çelik (of UBC), Thu Doan (of IBM Austin), Burak Erdogan (of UIUC), and Ram Mettu (of Dartmouth College) for their friendship and for the great times biking, rock-climbing, snowboarding, or just enjoying.
I had great fun hanging out with the RISD crowd; thanks to you Jenni Katajamäki, Celeste Mink, Quinn Shamlian, Amy Stein, and Jared Zimmerman.

Finally, I thank my family for their unwavering support. My parents Dürdane and İsmail, my brother Uğur, and my sister Aslı have always been there when I needed them.

Contents

1 Introduction 1
  1.1 Overview and Contributions of this Thesis 4
    1.1.1 Part I: Algorithms and Data Structures 4
    1.1.2 Part II: Trace Stability 5
    1.1.3 Part III: Applications 5
    1.1.4 Part IV: Language Techniques 5
    1.1.5 Part V: Implementation and Experiments 6

2 Related Work 9
  2.1 Algorithms Community 9
    2.1.1 Design and Analysis Techniques 10
    2.1.2 Limitations 10
    2.1.3 This Thesis 12
  2.2 Programming Languages Community 13
    2.2.1 Static Dependence Graphs 13
    2.2.2 Memoization 13
    2.2.3 Partial Evaluation 14
    2.2.4 This Thesis 14

I Algorithms and Data Structures 17

3 The Machine Model 21
  3.1 The Closure Machine 22
  3.2 The Normal Form 24
    3.2.1 The Correspondence between a Program and its Normal Form 26

4 Dynamic Dependence Graphs 29
  4.1 Dynamic Dependence Graphs 29
  4.2 Virtual Clock and the Order Maintenance Data Structure 30
  4.3 Constructing Dynamic Dependence Graphs 31
  4.4 Change Propagation 33
    4.4.1 Example Change Propagation 35

5 Memoized Dynamic Dependence Graphs 37
  5.1 Limitations of Dynamic Dependence Graphs 38
  5.2 Memoized Dynamic Dependence Graphs 39
  5.3 Constructing Memoized Dynamic Dependence Graphs 40
  5.4 Memoized Change Propagation 42
  5.5 Discussions 44

6 Data Structures and Analysis 45
  6.1 Data Structures 45
    6.1.1 The Virtual Clock 45
    6.1.2 Dynamic Dependence Graphs 45
    6.1.3 Memo Tables 46
    6.1.4 Priority Queues 46
  6.2 Analysis 46

II Trace Stability 51

7 Traces and Trace Distance 55
  7.1 Traces, Cognates, and Trace Distance 55
  7.2 Intrinsic (Minimum) Trace Distance 58
  7.3 Monotone Traces and Change Propagation 58
    7.3.1 Change Propagation for Monotone Traces 60

8 Trace Stability 65
  8.1 Trace Models, Input Changes, and Trace Stability 65
  8.2 Bounding the Priority Queue Overhead 67
    8.2.1 Dependence Width 67
    8.2.2 Read-Write Regular Computations 69
  8.3 Trace Stability Theorems 71

III Applications 73

9 List Algorithms 77


