Algorithms Lecture 1: Recursion [Fa'13]

    © Copyright 2013 Jeff Erickson. Released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License (http://creativecommons.org/licenses/by-nc-sa/3.0/). Free distribution is strongly encouraged; commercial distribution is expressly forbidden. See http://www.cs.uiuc.edu/~jeffe/teaching/algorithms/ for the most recent revision.

    The control of a large force is the same principle as the control of a few men:
    it is merely a question of dividing up their numbers.
        — Sun Zi, The Art of War (c. 400 C.E.), translated by Lionel Giles (1910)

    Our life is frittered away by detail.... Simplify, simplify.
        — Henry David Thoreau, Walden (1854)

    Nothing is particularly hard if you divide it into small jobs.
        — Henry Ford

    Do the hard jobs first. The easy jobs will take care of themselves.
        — Dale Carnegie

1 Recursion

1.1 Reductions

Reduction is the single most common technique used in designing algorithms. Reducing one problem X to another problem Y means writing an algorithm for X that uses an algorithm for Y as a black box or subroutine. Crucially, the correctness of the resulting algorithm cannot depend in any way on how the algorithm for Y works. The only thing we can assume is that the black box solves Y correctly. The inner workings of the black box are simply none of our business; they're somebody else's problem. It's often best to literally think of the black box as functioning by magic.

For example, the Huntington-Hill algorithm described in Lecture 0 reduces the problem of apportioning Congress to the problem of maintaining a priority queue that supports the operations INSERT and EXTRACTMAX. The abstract data type "priority queue" is a black box; the correctness of the apportionment algorithm does not depend on any specific priority queue data structure. Of course, the running time of the apportionment algorithm depends on the running time of the INSERT and EXTRACTMAX algorithms, but that's a separate issue from the correctness of the algorithm. The beauty of the reduction is that we can create a more efficient apportionment algorithm by simply swapping in a new priority queue data structure. Moreover, the designer of that data structure does not need to know or care that it will be used to apportion Congress.

When you design algorithms, you may not know exactly how the basic building blocks you use are implemented, or how your algorithms might be used as building blocks to solve even bigger problems. Even when you do know precisely how your components work, it is often extremely useful to pretend that you don't.

1.2 Simplify and Delegate

Recursion is a particularly powerful kind of reduction, which can be described loosely as follows:

  • If the given instance of the problem is small or simple enough, just solve it.
  • Otherwise, reduce the problem to one or more simpler instances of the same problem.

If the self-reference is confusing, it's helpful to imagine that someone else is going to solve the simpler problems, just as you would assume for other types of reductions. I like to call that someone else the Recursion Fairy. Your only task is to simplify the original problem, or to solve it directly when simplification is either unnecessary or impossible; the Recursion Fairy will magically take care of all the simpler subproblems for you, using Methods That Are None Of Your Business So Butt Out.¹ Mathematically sophisticated readers might recognize the Recursion Fairy by its more formal name, the Induction Hypothesis.

    ¹When I was a student, I used to attribute recursion to "elves" instead of the Recursion Fairy, referring to the Brothers Grimm story about an old shoemaker who leaves his work unfinished when he goes to bed, only to discover upon waking that elves ("Wichtelmänner") have finished everything overnight. Someone more entheogenically experienced than I might recognize them as Terence McKenna's "self-transforming machine elves".

There is one mild technical condition that must be satisfied in order for any recursive method to work correctly: There must be no infinite sequence of reductions to 'simpler' and 'simpler' subproblems. Eventually, the recursive reductions must stop with an elementary base case that can be solved by some other method; otherwise, the recursive algorithm will loop forever. This finiteness condition is almost always satisfied trivially, but we should always be wary of "obvious" recursive algorithms that actually recurse forever. (All too often, "obvious" is a synonym for "false".)
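As a tiny concrete illustration of this pattern, here is a minimal Python sketch (an example of the general idea only, not a problem discussed in these notes) of a recursive function with an explicit base case; the recursive call plays the role of the Recursion Fairy.

    def factorial(n):
        # Base case: an instance small enough to solve directly.
        if n == 0:
            return 1
        # Otherwise, reduce to a strictly smaller instance of the same problem
        # and delegate it to the Recursion Fairy (the recursive call).
        return n * factorial(n - 1)

If we deleted the base case, the "reductions" would never bottom out: factorial(0) would call factorial(-1), then factorial(-2), and so on, which is exactly the kind of "obvious" algorithm that recurses forever. (In practice, Python eventually gives up with a RecursionError.)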
1.3 Tower of Hanoi

The Tower of Hanoi puzzle was first published by the mathematician François Édouard Anatole Lucas in 1883, under the pseudonym 'N. Claus (de Siam)' (an anagram of 'Lucas d'Amiens'). The following year, Henri de Parville described the puzzle with the following remarkable story:²

    In the great temple at Benares beneath the dome which marks the centre of the world, rests a brass plate in which are fixed three diamond needles, each a cubit high and as thick as the body of a bee. On one of these needles, at the creation, God placed sixty-four discs of pure gold, the largest disc resting on the brass plate, and the others getting smaller and smaller up to the top one. This is the Tower of Bramah. Day and night unceasingly the priests transfer the discs from one diamond needle to another according to the fixed and immutable laws of Bramah, which require that the priest on duty must not move more than one disc at a time and that he must place this disc on a needle so that there is no smaller disc below it. When the sixty-four discs shall have been thus transferred from the needle on which at the creation God placed them to one of the other needles, tower, temple, and Brahmins alike will crumble into dust, and with a thunderclap the world will vanish.

    ²This English translation is from W. W. Rouse Ball and H. S. M. Coxeter's book Mathematical Recreations and Essays.

Of course, as good computer scientists, our first instinct on reading this story is to substitute the variable n for the hardwired constant 64. How can we move a tower of n disks from one needle to another, using a third needle as an occasional placeholder, without ever placing a disk on top of a smaller disk?

    [Figure: The Tower of Hanoi puzzle]

The trick to solving this puzzle is to think recursively. Instead of trying to solve the entire puzzle all at once, let's concentrate on moving just the largest disk. We can't move it at the beginning, because all the other disks are covering it; we have to move those n−1 disks to the third needle before we can move the nth disk. And then after we move the nth disk, we have to move those n−1 disks back on top of it. So now all we have to figure out is how to...

STOP!! That's it! We're done! We've successfully reduced the n-disk Tower of Hanoi problem to two instances of the (n−1)-disk Tower of Hanoi problem, which we can gleefully hand off to the Recursion Fairy (or, to carry the original story further, to the junior monks at the temple).

    [Figure: The Tower of Hanoi algorithm; ignore everything but the bottom disk]

Our recursive reduction does make one subtle but important assumption: There is a largest disk. In other words, our recursive algorithm works for any n ≥ 1, but it breaks down when n = 0. We must handle that base case directly. Fortunately, the monks at Benares, being good Buddhists, are quite adept at moving zero disks from one needle to another in no time at all.

    [Figure: The base case for the Tower of Hanoi algorithm. There is no spoon.]

While it's tempting to think about how all those smaller disks get moved—or more generally, what happens when the recursion is unrolled—it's not necessary. For even slightly more complicated algorithms, unrolling the recursion is far more confusing than illuminating.
Our only task is to reduce the problem to one or more simpler instances, or to solve the problem directly if such a reduction is impossible. Our algorithm is trivially correct when n = 0. For any n ≥ 1, the Recursion Fairy correctly moves (or more formally, the inductive hypothesis implies that our algorithm correctly moves) the top n−1 disks, so our algorithm is clearly correct.

Here's the recursive Hanoi algorithm in more typical pseudocode. This algorithm moves a stack of n disks from a source needle (src) to a destination needle (dst) using a third temporary needle (tmp) as a placeholder.

    HANOI(n, src, dst, tmp):
      if n > 0
        HANOI(n−1, src, tmp, dst)
        move disk n from src to dst
        HANOI(n−1, tmp, dst, src)

Let T(n) denote the number of moves required to transfer n disks—the running time of our algorithm. Our vacuous base case implies that T(0) = 0, and the more general recursive algorithm implies that T(n) = 2T(n−1) + 1 for any n ≥ 1. The annihilator method (or guessing and checking by induction) quickly gives us the closed-form solution T(n) = 2^n − 1. In particular, moving a tower of 64 disks requires 2^64 − 1 = 18,446,744,073,709,551,615 individual moves. Thus, even at the impressive rate of one move per second, the monks at Benares will be at work for approximately 585 billion years before tower, temple, and Brahmins alike will crumble into dust, and with a thunderclap the world will vanish.

1.4 Mergesort

Mergesort is one of the earliest algorithms proposed for sorting. According to Donald Knuth, it was proposed by John von Neumann as early as 1945.

  1. Divide the input array into two subarrays of roughly equal size.
  2. Recursively mergesort each of the subarrays.
  3. Merge the newly-sorted subarrays into a single sorted array.

    Input:    S O R T I N  G E X A M P L
    Divide:   S O R T I N  G E X A M P L
    Recurse:  I N O R S T  A E G L M P X
    Merge:    A E G I L M N O P R S T X

    [Figure: A mergesort example.]

The first step is completely trivial—we only need to compute the median array index—and we can delegate the second step to the Recursion Fairy. All the real work is done in the final step; the two sorted subarrays can be merged using a simple linear-time algorithm. Here's a complete description of the algorithm; to keep the recursive structure clear, we separate out the merge step as an independent subroutine.

    MERGESORT(A[1..n]):
      if n > 1
        m ← ⌊n/2⌋
        MERGESORT(A[1..m])
        MERGESORT(A[m+1..n])
        MERGE(A[1..n], m)

    MERGE(A[1..n], m):
      i ← 1; j ← m+1
      for k ← 1 to n
        if j > n
          B[k] ← A[i]; i ← i+1
        else if i > m
          B[k] ← A[j]; j ← j+1
        else if A[i] < A[j]
          B[k] ← A[i]; i ← i+1
        else
          B[k] ← A[j]; j ← j+1
      for k ← 1 to n
        A[k] ← B[k]

To prove that this algorithm is correct, we apply our old friend induction twice, first to the MERGE subroutine then to the top-level MERGESORT algorithm.

  • We prove MERGE is correct by induction on n−k+1, which is the total size of the two sorted subarrays A[i..m] and A[j..n] that remain to be merged into B[k..n] when the kth iteration of the main loop begins. There are five cases to consider. Yes, five.
    – If k > n, the algorithm correctly merges the two empty subarrays by doing absolutely nothing. (This is the base case of the inductive proof.)
    – If i ≤ m and j > n, the subarray A[j..n] is empty. Because both subarrays are sorted, the smallest element in the union of the two subarrays is A[i]. So the assignment B[k] ← A[i] is correct. The inductive hypothesis implies that the remaining subarrays A[i+1..m] and A[j..n] are correctly merged into B[k+1..n].
    – Similarly, if i > m and j ≤ n, the assignment B[k] ← A[j] is correct, and the Recursion Fairy correctly merges—sorry, I mean the inductive hypothesis implies that the MERGE algorithm correctly merges—the remaining subarrays A[i..m] and A[j+1..n] into B[k+1..n].
    – If i ≤ m and j ≤ n and A[i] < A[j], then the smallest remaining element is A[i]. So B[k] is assigned correctly, and the Recursion Fairy correctly merges the rest of the subarrays.
    – Finally, if i ≤ m and j ≤ n and A[i] ≥ A[j], then the smallest remaining element is A[j]. So B[k] is assigned correctly, and the Recursion Fairy correctly does the rest.

  • Now we prove MERGESORT correct by induction; there are two cases to consider. Yes, two.
    – If n ≤ 1, the algorithm correctly does nothing.
    – Otherwise, the Recursion Fairy correctly sorts—sorry, I mean the induction hypothesis implies that our algorithm correctly sorts—the two smaller subarrays A[1..m] and A[m+1..n], after which they are correctly MERGEd into a single sorted array (by the previous argument).
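Before analyzing the running time, here is a straightforward Python rendering of the two subroutines above, offered as a sketch rather than as part of the notes; it uses 0-based indexing and returns a new sorted list instead of copying through the auxiliary array B[1..n].

    def merge_sort(A):
        # Base case: arrays of size 0 or 1 are already sorted.
        if len(A) <= 1:
            return list(A)
        m = len(A) // 2
        left = merge_sort(A[:m])     # the Recursion Fairy sorts the left half...
        right = merge_sort(A[m:])    # ...and the right half
        # Merge the two sorted halves, mirroring the MERGE subroutine.
        B, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] < right[j]:
                B.append(left[i]); i += 1
            else:
                B.append(right[j]); j += 1
        return B + left[i:] + right[j:]

For example, merge_sort(list('SORTINGEXAMPL')) produces the letters A E G I L M N O P R S T X, matching the merge step in the example above.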
What's the running time? Because the MERGESORT algorithm is recursive, its running time will be expressed by a recurrence. MERGE clearly takes linear time, because it's a simple for-loop with constant work per iteration. We immediately obtain the following recurrence for MERGESORT:

    T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + O(n).

As in most divide-and-conquer recurrences, we can safely strip out the floors and ceilings using a domain transformation,³ giving us the simpler recurrence T(n) = 2T(n/2) + O(n). The "all levels equal" case of the recursion tree method now immediately implies the closed-form solution T(n) = O(n log n). (Recursion trees and domain transformations are described in detail in a separate note on solving recurrences.)

    ³See the course notes on solving recurrences for more details.

1.5 Quicksort

Quicksort is another recursive sorting algorithm, discovered by Tony Hoare in 1962. In this algorithm, the hard work is splitting the array into subsets so that merging the final result is trivial.

  1. Choose a pivot element from the array.
  2. Partition the array into three subarrays containing the elements smaller than the pivot, the pivot element itself, and the elements larger than the pivot.
  3. Recursively quicksort the first and last subarray.

    Input:           S O R T I N G E X A M P L
    Choose a pivot:  S O R T I N G E X A M P L    (here the pivot is the last element, L)
    Partition:       A G E I L N R O X S M P T
    Recurse:         A E G I L M N O P R S T X

    [Figure: A quicksort example.]

Here's a more detailed description of the algorithm. In the separate PARTITION subroutine, the input parameter p is the index of the pivot element in the unsorted array; the subroutine partitions the array and returns the new index of the pivot.

    QUICKSORT(A[1..n]):
      if (n > 1)
        Choose a pivot element A[p]
        r ← PARTITION(A, p)
        QUICKSORT(A[1..r−1])
        QUICKSORT(A[r+1..n])

    PARTITION(A[1..n], p):
      if (p ≠ n)
        swap A[p] ↔ A[n]
      i ← 0; j ← n
      while (i < j)
        repeat i ← i+1 until (i = j or A[i] ≥ A[n])
        repeat j ← j−1 until (i = j or A[j] ≤ A[n])
        if (i < j)
          swap A[i] ↔ A[j]
      if (i ≠ n)
        swap A[i] ↔ A[n]
      return i

Just like mergesort, proving QUICKSORT is correct requires two separate induction proofs: one to prove that PARTITION correctly partitions the array, and the other to prove that QUICKSORT correctly sorts assuming PARTITION is correct. I'll leave the gory details as an exercise for the reader.

The analysis is also similar to mergesort. PARTITION runs in O(n) time: j − i = n at the beginning, j − i = 0 at the end, and we do a constant amount of work each time we increment i or decrement j. For QUICKSORT, we get a recurrence that depends on r, the rank of the chosen pivot element:

    T(n) = T(r−1) + T(n−r) + O(n)

If we could choose the pivot to be the median element of the array A, we would have r = ⌈n/2⌉, the two subproblems would be as close to the same size as possible, the recurrence would become

    T(n) = T(⌈n/2⌉ − 1) + T(⌊n/2⌋) + O(n) ≤ 2T(n/2) + O(n),

and we'd have T(n) = O(n log n) by the recursion tree method.
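Here is a compact Python sketch of the same idea (again not from the notes). For brevity it builds new lists rather than partitioning in place with PARTITION, and it always picks the last element as the pivot, one of the simple pivot choices discussed below.

    def quicksort(A):
        # Base case: arrays of size 0 or 1 are already sorted.
        if len(A) <= 1:
            return list(A)
        pivot = A[-1]                                # a simple (and risky) pivot choice
        smaller = [x for x in A[:-1] if x < pivot]   # elements smaller than the pivot
        larger = [x for x in A[:-1] if x >= pivot]   # everything else (ties go right)
        return quicksort(smaller) + [pivot] + quicksort(larger)

As with the mergesort sketch, quicksort(list('SORTINGEXAMPL')) returns the thirteen letters in sorted order.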
In fact, as we will see later, we can locate the median element in an unsorted array in linear time. However, the algorithm is fairly complicated, and the hidden constant in the O(·) notation is large. In practice, programmers settle for something simple, like choosing the first or last element of the array. In this case, r can take any value between 1 and n, so we have

    T(n) = max_{1 ≤ r ≤ n} ( T(r−1) + T(n−r) + O(n) ).

In the worst case, the two subproblems are completely unbalanced—either r = 1 or r = n—and the recurrence becomes T(n) ≤ T(n−1) + O(n). The solution is T(n) = O(n²).

Another common heuristic is called "median of three"—choose three elements (usually at the beginning, middle, and end of the array), and take the median of those three elements as the pivot. Although this heuristic is somewhat more efficient in practice than just choosing one element, especially when the array is already (nearly) sorted, we can still have r = 2 or r = n−1 in the worst case. With the median-of-three heuristic, the recurrence becomes T(n) ≤ T(1) + T(n−2) + O(n), whose solution is still T(n) = O(n²).

Intuitively, the pivot element will 'usually' fall somewhere in the middle of the array, say between n/10 and 9n/10. This observation suggests that the average-case running time is O(n log n). Although this intuition is actually correct (at least under the right formal assumptions), we are still far from a proof that quicksort is usually efficient. We will formalize this intuition about average-case behavior in a later lecture.

1.6 The Pattern

Both mergesort and quicksort follow a general three-step pattern shared by all divide-and-conquer algorithms:

  1. Divide the given instance of the problem into several independent smaller instances.
  2. Delegate each smaller instance to the Recursion Fairy.
  3. Combine the solutions for the smaller instances into the final solution for the given instance.

If the size of any subproblem falls below some constant threshold, the recursion bottoms out. Hopefully, at that point, the problem is trivial, but if not, we switch to a different algorithm instead.

Proving a divide-and-conquer algorithm correct almost always requires induction. Analyzing the running time requires setting up and solving a recurrence, which usually (but unfortunately not always!) can be solved using recursion trees, perhaps after a simple domain transformation.

1.7 Median Selection

So how do we find the median element of an array in linear time? The following algorithm was discovered by Manuel Blum, Bob Floyd, Vaughan Pratt, Ron Rivest, and Bob Tarjan in the early 1970s. Their algorithm actually solves the more general problem of selecting the kth smallest element in an n-element array, given the array and the integer k as input, using a variant of an algorithm called either "quickselect" or "one-armed quicksort". The basic quickselect algorithm chooses a pivot element, partitions the array using the PARTITION subroutine from QUICKSORT, and then recursively searches only one of the two subarrays.

    QUICKSELECT(A[1..n], k):
      if n = 1
        return A[1]
      else
        Choose a pivot element A[p]
        r ← PARTITION(A[1..n], p)
        if k < r
          return QUICKSELECT(A[1..r−1], k)
        else if k > r
          return QUICKSELECT(A[r+1..n], k−r)
        else
          return A[r]

The worst-case running time of QUICKSELECT obeys a recurrence similar to the quicksort recurrence. We don't know the value of r or which subarray we'll recursively search, so we'll just assume the worst.

    T(n) ≤ max_{1 ≤ r ≤ n} ( max{T(r−1), T(n−r)} + O(n) )

We can simplify the recurrence by using ℓ to denote the length of the recursive subproblem:

    T(n) ≤ max_{0 ≤ ℓ ≤ n−1} T(ℓ) + O(n) ≤ T(n−1) + O(n)

As with quicksort, we get the solution T(n) = O(n²) when ℓ = n−1, which happens when the chosen pivot element is either the smallest element or largest element of the array.

On the other hand, we could avoid this quadratic behavior if we could somehow magically choose a good pivot, where ℓ ≤ αn for some constant α < 1. In this case, the recurrence would simplify to

    T(n) ≤ T(αn) + O(n).

This recurrence expands into a descending geometric series, which is dominated by its largest term, so T(n) = O(n).
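Here is a Python sketch of basic quickselect (not from the notes). It chooses the pivot at random, which is one simple possibility since the pseudocode leaves the choice open, and it splits the array into three lists instead of calling the in-place PARTITION subroutine.

    import random

    def quickselect(A, k):
        # Returns the kth smallest element of A (k is a 1-indexed rank).
        if len(A) == 1:
            return A[0]
        pivot = random.choice(A)                 # an arbitrary pivot choice
        smaller = [x for x in A if x < pivot]
        equal = [x for x in A if x == pivot]
        larger = [x for x in A if x > pivot]
        if k <= len(smaller):
            return quickselect(smaller, k)       # search only the left subarray
        elif k <= len(smaller) + len(equal):
            return pivot                         # the pivot itself has rank k
        else:
            return quickselect(larger, k - len(smaller) - len(equal))

The worst case is still quadratic, exactly as the recurrence above shows; the algorithm described next removes that worst case by guaranteeing a good pivot.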
The Blum-Floyd-Pratt-Rivest-Tarjan algorithm chooses a good pivot for one-armed quicksort by recursively computing the median of a carefully-selected subset of the input array.

    MOMSELECT(A[1..n], k):
      if n ≤ 25
        use brute force
      else
        m ← ⌈n/5⌉
        for i ← 1 to m
          M[i] ← MEDIANOFFIVE(A[5i−4 .. 5i])     ⟨⟨Brute force!⟩⟩
        mom ← MOMSELECT(M[1..m], ⌊m/2⌋)          ⟨⟨Recursion!⟩⟩
        r ← PARTITION(A[1..n], mom)
        if k < r
          return MOMSELECT(A[1..r−1], k)         ⟨⟨Recursion!⟩⟩
        else if k > r
          return MOMSELECT(A[r+1..n], k−r)       ⟨⟨Recursion!⟩⟩
        else
          return mom

The recursive structure of the algorithm requires a slightly larger base case. There's absolutely nothing special about the constant 25 in the pseudocode; for theoretical purposes, any other constant like 42 or 666 or 8765309 would work just as well.

If the input array is too large to handle by brute force, we divide it into ⌈n/5⌉ blocks, each containing exactly 5 elements, except possibly the last. (If the last block isn't full, just throw in a few ∞s.) We find the median of each block by brute force and collect those medians into a new array M[1..⌈n/5⌉]. Then we recursively compute the median of this new array. Finally we use the median of medians—hence 'mom'—as the pivot in one-armed quicksort.

The key insight is that neither of these two subarrays can be too large. The median of medians is larger than ⌈⌈n/5⌉/2⌉ − 1 ≈ n/10 block medians, and each of those medians is larger than two other elements in its block. Thus, mom is larger than at least 3n/10 elements in the input array, and symmetrically, mom is smaller than at least 3n/10 input elements. Thus, in the worst case, the final recursive call searches an array of size 7n/10.

We can visualize the algorithm's behavior by drawing the input array as a 5 × ⌈n/5⌉ grid, in which each column represents five consecutive elements. For purposes of illustration, imagine that we sort every column from top to bottom, and then we sort the columns by their middle element. (Let me emphasize that the algorithm does not actually do this!) In this arrangement, the median-of-medians is the element closest to the center of the grid.

The left half of the first three rows of the grid contains 3n/10 elements, each of which is smaller than the median-of-medians. If the element we're looking for is larger than the median-of-medians, our algorithm will throw away everything smaller than the median-of-medians, including those 3n/10 elements, before recursing. Thus, the input to the recursive subproblem contains at most 7n/10 elements. A symmetric argument applies when our target element is smaller than the median-of-medians.

    [Figures: Visualizing the median of medians; Discarding approximately 3/10 of the array]

We conclude that the worst-case running time of the algorithm obeys the following recurrence:

    T(n) ≤ O(n) + T(n/5) + T(7n/10).

The recursion tree method implies the solution T(n) = O(n).

Finer analysis reveals that the constant hidden by the O() is quite large, even if we count only comparisons; this is not a practical algorithm for small inputs. (In particular, mergesort uses fewer comparisons in the worst case when n < 4,000,000.) Selecting the median of 5 elements requires at most 6 comparisons, so we need at most 6n/5 comparisons to set up the recursive subproblem. We need another n−1 comparisons to partition the array after the recursive call returns. So a more accurate recurrence for the total number of comparisons is

    T(n) ≤ 11n/5 + T(n/5) + T(7n/10).

The recursion tree method implies the upper bound

    T(n) ≤ (11n/5) · Σ_{i≥0} (9/10)^i = (11n/5) · 10 = 22n.
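Here is a Python sketch in the same style (not from the notes) that mirrors MOMSELECT; it differs from the quickselect sketch above only in how the pivot is chosen. The handling of a short final block is simplified slightly, so treat it as an illustration rather than a faithful implementation.

    def mom_select(A, k):
        # Returns the kth smallest element of A (k is a 1-indexed rank).
        if len(A) <= 25:
            return sorted(A)[k - 1]                  # brute force on small inputs
        # Median of each block of (at most) five elements, by brute force.
        medians = [sorted(A[i:i + 5])[len(A[i:i + 5]) // 2]
                   for i in range(0, len(A), 5)]
        # The median of medians, computed recursively, becomes the pivot.
        mom = mom_select(medians, len(medians) // 2)
        smaller = [x for x in A if x < mom]
        equal = [x for x in A if x == mom]
        larger = [x for x in A if x > mom]
        if k <= len(smaller):
            return mom_select(smaller, k)
        elif k <= len(smaller) + len(equal):
            return mom
        else:
            return mom_select(larger, k - len(smaller) - len(equal))

The recursive pivot computation is what guarantees that smaller and larger each contain at most roughly 7n/10 elements, which is the inequality behind the recurrence above.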
1.8 Multiplication

Adding two n-digit numbers takes O(n) time by the standard iterative 'ripple-carry' algorithm, using a lookup table for each one-digit addition. Similarly, multiplying an n-digit number by a one-digit number takes O(n) time, using essentially the same algorithm. What about multiplying two n-digit numbers? In most of the world, grade school students (supposedly) learn to multiply by breaking the problem into n one-digit multiplications and n additions:

           31415962
         × 27182818
         ----------
          251327696
          31415962
        251327696
        62831924
      251327696
      31415962
    219911734
    62831924
    ---------------
    853974377340916

We could easily formalize this algorithm as a pair of nested for-loops. The algorithm runs in Θ(n²) time—altogether, there are Θ(n²) digits in the partial products, and for each digit, we spend constant time. The Egyptian/Russian peasant multiplication algorithm described in the first lecture also runs in Θ(n²) time.

Perhaps we can get a more efficient algorithm by exploiting the following identity:

    (10^m a + b)(10^m c + d) = 10^{2m} ac + 10^m (bc + ad) + bd

Here is a divide-and-conquer algorithm that computes the product of two n-digit numbers x and y, based on this formula. Each of the four sub-products e, f, g, h is computed recursively. The last line does not involve any multiplications, however; to multiply by a power of ten, we just shift the digits and fill in the right number of zeros.

    MULTIPLY(x, y, n):
      if n = 1
        return x · y
      else
        m ← ⌈n/2⌉
        a ← ⌊x/10^m⌋;  b ← x mod 10^m
        c ← ⌊y/10^m⌋;  d ← y mod 10^m
        e ← MULTIPLY(a, c, m)
        f ← MULTIPLY(b, d, m)
        g ← MULTIPLY(b, c, m)
        h ← MULTIPLY(a, d, m)
        return 10^{2m} e + 10^m (g + h) + f

You can easily prove by induction that this algorithm is correct. The running time for this algorithm is given by the recurrence

    T(n) = 4T(⌈n/2⌉) + Θ(n),    T(1) = 1,

which solves to T(n) = Θ(n²) by the recursion tree method (after a simple domain transformation). Hmm... I guess this didn't help after all.

In the mid-1950s, the famous Russian mathematician Andrey Kolmogorov conjectured that there is no algorithm to multiply two n-digit numbers in o(n²) time. However, in 1960, after Kolmogorov posed his conjecture at a seminar at Moscow University, Anatoliĭ Karatsuba, one of the students in the seminar, discovered a remarkable counterexample. According to Karatsuba himself,

    After the seminar I told Kolmogorov about the new algorithm and about the disproof of the n² conjecture. Kolmogorov was very agitated because this contradicted his very plausible conjecture. At the next meeting of the seminar, Kolmogorov himself told the participants about my method, and at that point the seminar was terminated.
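For reference, here is a Python sketch of the MULTIPLY algorithm above (not from the notes, and not Karatsuba's improvement: it still makes four recursive calls, so it still runs in Θ(n²) time, exactly as the recurrence shows).

    def multiply(x, y, n):
        # Multiply two non-negative integers with at most n decimal digits each.
        if n == 1:
            return x * y                   # one-digit multiplication: the base case
        m = (n + 1) // 2                   # m = ceil(n/2)
        a, b = divmod(x, 10 ** m)          # x = 10^m * a + b
        c, d = divmod(y, 10 ** m)          # y = 10^m * c + d
        e = multiply(a, c, m)
        f = multiply(b, d, m)
        g = multiply(b, c, m)
        h = multiply(a, d, m)
        return 10 ** (2 * m) * e + 10 ** m * (g + h) + f

For example, multiply(31415962, 27182818, 8) returns 853974377340916, the same product as the grade-school computation above.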