Under consideration for publication in Theory and Practice of Logic Programming
arXiv:cs/0201011v1 [cs.PL] 16 Jan 2002

A Backward Analysis for Constraint Logic Programs

ANDY KING
University of Kent at Canterbury, Canterbury, CT2 7NF, UK.
email: [email protected]

LUNJIN LU
Oakland University, Rochester, MI 48309, USA.
email: [email protected]

Abstract

One recurring problem in program development is that of understanding how to re-use code developed by a third party. In the context of (constraint) logic programming, part of this problem reduces to figuring out how to query a program. If the logic program does not come with any documentation, then the programmer is forced to either experiment with queries in an ad hoc fashion or trace the control-flow of the program (backward) to infer the modes in which a predicate must be called so as to avoid an instantiation error. This paper presents an abstract interpretation scheme that automates the latter technique. The analysis presented in this paper can infer moding properties which, if satisfied by the initial query, come with the guarantee that the program and query can never generate any moding or instantiation errors. Other applications of the analysis are discussed. The paper explains how abstract domains with certain computational properties (they condense) can be used to trace control-flow backward (right-to-left) to infer useful properties of initial queries. A correctness argument is presented and an implementation is reported.

1 Introduction

The myth of the lonely logic programmer writing a program in isolation is just that: a myth. Applications (and application components) are usually implemented and maintained by a team. One consequence of this is that a significant proportion of the program development effort is devoted to understanding code developed by another.
One advantage of (constraint) logic programs for software development is that their declarative nature makes them less opaque than, say, C++ programs. One disadvantage of logic programs over C++ programs, however, is that the signature (argument types) of a predicate does not completely specify how the predicate should be invoked. In particular, a call to a predicate from an unexpected context may generate an error if an argument of the call is insufficiently instantiated (even if the program and query are well-typed). This is because logic programs contain builtins, and calls to these builtins often impose moding requirements on the query. If the program is developed by another programmer, it may not be clear how to query a predicate so as to avoid an instantiation error. In these circumstances, the programmer will often resort to a trial-and-error tactic in their search for an initial call mode. This can be both frustrating and tedious and, of course, cannot guarantee coverage of all the program execution paths. This paper presents an analysis for inferring moding properties which, if satisfied by the initial query, ensure that the program does not generate instantiation errors. Of course, it does not mean that the inferred call has the form exactly intended by the original programmer – no analysis can do that – the analysis just recovers mode information. Nevertheless, this is a useful first step in understanding the code developed by another.

The problem of inferring initial queries which do not lead to instantiation errors is an instance of the more general problem of deducing how to call a program so that it conforms to some desired property, for example, calls to builtins do not error, the program terminates, or calls to builtins behave predictably. The backward analysis presented in this paper is designed to infer conditions on the query which, if satisfied, guarantee that resulting derivations satisfy a property such as one of those above.
Specifically, the analysis framework can be instantiated to solve the following analysis problems:

• Builtins and library functions can behave unpredictably when called with infinite rational trees. For example, the query ?- X = X + X, Y is X will not terminate in SICStus Prolog because the arithmetic operator expects its input to be a finite tree rather than an infinite rational tree. Moreover, the standard term ordering of Prolog does not lift to rational trees, so the builtin sort can behave unpredictably when sorting rational trees. These problems (and related problems with builtins) motivate the use of dependency analysis for tracking which terms are definitely finite (Bagnara et al., 2001). The basic idea is to describe the constraint x = f(x1,...,xn) by the Boolean function x ⇔ (x1 ∧ ... ∧ xn), which encodes that x is bound to a finite tree iff each xi is bound to a finite tree. Although not proposed in the context of backward analysis (Bagnara et al., 2001), the framework proposed in this paper can be instantiated with a finite-tree dependency domain to infer finiteness properties on the query which, if satisfied, guarantee that builtins are not called with problematic arguments.

• Termination inference is the problem of inferring initial modes for a query that, if satisfied, ensure that a logic program terminates. This problem generalises termination checking, which verifies program termination for a class of queries specified by a given mode. Termination inference dates back to (Mesnard, 1996) but it has been recently observed (Genaim & Codish, 2001) that the missing link between termination checking and termination inference is backward analysis. A termination inference analyser is reported in (Genaim & Codish, 2001) composed from two components: a standard termination checker (Codish & Taboch, 1999) and the backward analysis described in this paper.
The resulting analyser is similar to the cTI analyser of (Mesnard & Neumerkel, 2001) – the main difference is its design as two existing black-box components which, according to (Genaim & Codish, 2001), simplifies the formal justification and implementation.

• Mode analysis is useful for implementing ccp programs. In particular, (Debray et al., 1992) explains how various low-level optimisations, such as returning output values in registers, can be applied if goals can be scheduled left-to-right without suspension. If the guards of the predicates are re-interpreted as moding requirements, then the backward mode analysis can infer sufficient conditions for avoiding deadlock under left-to-right scheduling. The analysis presented in this paper thus has applications outside program development.

To summarise, the analysis presented in this paper can deduce properties of the call which, if satisfied, guarantee that resulting derivations fulfill some desired property. The analysis is unusual in that it applies lower approximation (see 2.4.1) as well as upper approximation (see 2.3.1); it is formulated in terms of a greatest fixpoint calculation (see 2.4) as well as a least fixpoint calculation (see 2.3); the analysis also imposes some unusual restrictions on the abstract domain (see 2.4.6).

1.1 Backward analysis

Backward analysis has been applied extensively in functional programming in, among other things, projection analysis (Wadler & Hughes, 1987), stream strictness analysis (Hall & Wise, 1989), inverse image analysis (Dyber, 1991), etc. By reasoning about the context of a function application, these analyses can identify opportunities for eager evaluation that are missed by (forward) strictness analysis as proposed by (Mycroft, 1981). Furthermore, backward reasoning on imperative programs dates back to the early days of static analysis (Cousot & Cousot, 1982).
By way of contrast, backward analysis has rarely been applied in logic programming. One notable exception is the demand analysis of (Debray, 1993). This analysis infers the degree of instantiation necessary for the guards of a concurrent constraint program (ccp) to reduce. It is a local analysis that does not consider the possible suspension of body calls. This analysis detects those (uni-modal) predicates which can be implemented with specialised suspension machinery. A more elaborate backward analysis for ccp is presented by (Falaschi et al., 2000). This demand analysis infers how much input is necessary for a procedure to generate a certain amount of output. This information is useful for adding synchronisation (ask) constraints to a procedure to delay execution and thereby increase grain size, and yet not introduce deadlock. (Section 7 provides a more extensive and reflective review of the related work.)

1.2 Contributions

Our work is quite different. As far as we are aware, it is unique in that it focuses on the backward analysis of (constraint) logic programs with left-to-right scheduling. Specifically, our work makes the following practical and theoretical contributions:

• it shows how to compute an initial mode of a predicate which is safe in that if a query is at least as instantiated as the inferred mode, the execution is guaranteed to be free from instantiation errors. The modes inferred are often disjunctive, sometimes surprising and, for the small predicates that we verified by hand, appear to be optimal.

• it specifies a practical algorithm for calculating initial modes that is straightforward to implement in that it reduces to two bottom-up fixpoint calculations. Furthermore, this backward analysis problem cannot be solved with any existing abstract interpretation machinery.
• to the best of our knowledge, it is the first time domains that are closed under Heyting completion (Giacobazzi & Scozzari, 1998), or equivalently are condensing (Marriott & Søndergaard, 1993), have been applied to backward analysis. Put another way, our work adds credence to the belief that condensation is an important property in the analysis of logic programs.

The final point requires some unpacking. Condensation was originally proposed in (Langen, 1991), though arguably the simplest statement of this property (Marriott & Søndergaard, 1993) is for downward closed domains such as Pos (Armstrong et al., 1998) and the Pos-like type dependency domains (Codish & Lagoon, 2000). Suppose that f : X → X is an abstract operation on a downward closed domain X equipped with an operation ∧ that mimics unification or constraint solving. X is condensing iff x ∧ f(y) = f(x ∧ y) for all x, y ∈ X. Hence, if X is condensing, x ∧ f(true) = f(x) where true represents the weakest abstract constraint. More exactly, if f(true) represents the result of the goal-independent analysis, and f(x) the result of the goal-dependent one with an initial constraint x, then the equivalence f(x) = x ∧ f(true) enables goal-dependent analysis to be performed in a goal-independent way without loss of precision. This, in turn, can simplify the implementation of an analyser (Armstrong et al., 1998). Because of this, domain refinement machinery has been devised to enrich a domain with new elements to obtain the desired condensing property (Giacobazzi & Scozzari, 1998). It turns out that it is always possible to systematically design a condensing domain for a given downward closed property by applying Heyting completion (Giacobazzi & Scozzari, 1998, Theorem 8.2). Conversely, under some reasonable hypotheses, all condensing domains can be reconstructed by Heyting completion (Giacobazzi & Scozzari, 1998, Theorem 8.3).
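The condensation equivalence f(x) = x ∧ f(true) can be illustrated concretely. The sketch below is our own toy model in Python, not part of the paper: Pos formulas over three variables are represented by their sets of models, conjunction becomes set intersection, and the quicksort success pattern x2 ⇔ (x1 ∧ x3) derived in section 2.3.3 plays the role of the goal-independent result f(true).

```python
from itertools import product

VARS = ("x1", "x2", "x3")

def models(fn):
    """The set of models of a Boolean function, each model given as the
    frozenset of variables assigned 1."""
    return frozenset(
        frozenset(v for v, b in zip(VARS, bits) if b)
        for bits in product((0, 1), repeat=len(VARS))
        if fn(*bits)
    )

# Goal-independent success pattern of qs/3: f(true) = x2 <-> (x1 /\ x3).
f_true = models(lambda x1, x2, x3: x2 == (x1 and x3))

# Initial constraint x = x1 /\ x3 (first and third arguments ground).
x = models(lambda x1, x2, x3: bool(x1 and x3))

# Condensation: the goal-dependent result x /\ f(true) is x1 /\ x2 /\ x3,
# i.e. calling qs with its first and third arguments ground grounds all three.
assert x & f_true == models(lambda x1, x2, x3: bool(x1 and x2 and x3))
```

On a condensing domain this conjunction of the initial constraint with the goal-independent result loses no precision, which is precisely why a single goal-independent analysis suffices.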
One consequence of this is that condensing domains come equipped with a pseudo-complement operator, and this turns out to be an operation that is important in backward analysis. To summarise, machinery has been developed to synthesise condensing domains, and condensing domains provide operations suitable for backward analysis.

1.3 Organisation of the paper

The rest of the paper is structured as follows. Section 2 introduces the key ideas of the paper in an informal way through a worked example. Section 3 introduces the necessary preliminaries for the formal sections that follow. Section 4 presents an operational semantics for constraint logic programs with assertions in which the set of program states is augmented by a special error state. Section 5 develops a semantics which computes those initial states that cannot lead to the error state. The semantics defines a framework for backward analysis and formally argues correctness. Section 6 describes an instantiation of the framework for mode analysis. Section 7 reviews the related work and section 8 concludes. Much of the formal machinery is borrowed directly from (Giacobazzi et al., 1995; Giacobazzi & Scozzari, 1998) and in particular the reader is referred to (Giacobazzi et al., 1995) for proofs of the semantic results stated in section 3 (albeit presented in a slightly different form). To aid continuity in the paper, the remaining proofs are relegated to appendix A.

2 Worked example

2.1 Basic components

This section informally presents an abstract interpretation scheme which infers how to query a given predicate so as to avoid run-time moding errors. In other words, the analysis deduces moding properties of the call that, if satisfied, guarantee that resulting derivations cannot encounter an instantiation error. To illustrate, consider the Quicksort program listed in the left column of figure 1. This is the first ingredient of the analysis: the input program.
The second ingredient is an abstract domain which, in this case, is Pos. Pos is the domain of positive Boolean functions, that is, the set of functions f : {0,1}^n → {0,1} such that f(1,...,1) = 1. Hence x∨y ∈ Pos since 1∨1 = 1, but ¬x ∉ Pos since ¬1 = 0. Pos is augmented with the bottom element 0, with 1 being the top element. The domain is ordered by entailment |= and, in this example, will be used to represent grounding dependencies.

Pos comes equipped with the logical operations: conjunction ∧, disjunction ∨, implication ⇒ (and thus bi-implication ⇔). Conjunction is used to conjoin the information from different body atoms, while disjunction is used to combine the information from different clauses. Conjunction and disjunction, in turn, enable two projection operators to be defined: ∃x(f) = f[x↦0] ∨ f[x↦1], and ∀x(f) = f′ if f′ ∈ Pos otherwise ∀x(f) = 0, where f′ = f[x↦0] ∧ f[x↦1]. Note that although f[x↦0] ∨ f[x↦1] ∈ Pos for all f ∈ Pos, it does not follow that f[x↦0] ∧ f[x↦1] ∈ Pos for all f ∈ Pos. Indeed, (x ⇐ y)[x↦0] ∧ (x ⇐ y)[x↦1] = ¬y. Both operators are used to project out the body variables that are not in the head of a clause. Specifically, these operators eliminate the variable x from the formula f. They are dual in the sense that ∀x(f) |= f |= ∃x(f). These are the basic components of the analysis.

2.2 Normalisation and abstraction

The analysis components are assembled in two steps. The first is a bottom-up analysis for success patterns, that is, a bottom-up analysis which infers the groundness dependencies which are known to be created by each predicate regardless of the calling pattern. This step is a least fixpoint (lfp) calculation. The second step is a bottom-up analysis for input modes (the objective of the analysis). This step is a greatest fixpoint (gfp) computation. To simplify both steps, the program is put into a form in which the arguments of head and body atoms are distinct variables. This gives the normalised program listed in the centre column of figure 1. This program is then abstracted by replacing each Herbrand constraint x = f(x1,...,xn) with a formula x ⇔ (x1 ∧ ... ∧ xn) that describes its grounding dependency. This gives the abstract program listed in the right column of figure 1.

    Raw:
        qs([], s, s).
        qs([m|xs], s, t) :-
            pt(xs, m, l, h), qs(l, s, [m|r]), qs(h, r, t).
        pt([], _, [], []).
        pt([x|xs], m, [x|l], h) :- m =< x, pt(xs, m, l, h).
        pt([x|xs], m, l, [x|h]) :- m > x, pt(xs, m, l, h).

    Normalised:
        qs(t1, s, t2) :- t1 = [], t2 = s.
        qs(t1, s, t)  :- t1 = [m|xs], t3 = [m|r],
                         pt(xs, m, l, h), qs(l, s, t3), qs(h, r, t).
        pt(t1, _, t2, t3) :- t1 = [], t2 = [], t3 = [].
        pt(t1, m, t2, h)  :- t1 = [x|xs], t2 = [x|l],
                             m =< x, pt(xs, m, l, h).
        pt(t1, m, l, t2)  :- t1 = [x|xs], t2 = [x|h],
                             m > x, pt(xs, m, l, h).

    Abstracted:
        qs(t1, s, t2) :- 1⋄g1.
        qs(t1, s, t)  :- 1⋄g2, pt(xs, m, l, h), qs(l, s, t3), qs(h, r, t).
        pt(t1, _, t2, t3) :- 1⋄g3.
        pt(t1, m, t2, h)  :- 1⋄g4, =<'(m, x), pt(xs, m, l, h).
        pt(t1, m, l, t2)  :- 1⋄g5, >'(m, x), pt(xs, m, l, h).
        =<'(m, x) :- g6⋄g6.
        >'(m, x)  :- g6⋄g6.

    Fig. 1. Quicksort: raw, normalised and abstracted

The formula 1 in the assertion represents true whereas the formulae gi that appear in the abstract program are as follows:

    g1 = t1 ∧ (t2 ⇔ s)
    g2 = (t1 ⇔ (m ∧ xs)) ∧ (t3 ⇔ (m ∧ r))
    g3 = t1 ∧ t2 ∧ t3
    g4 = (t1 ⇔ (x ∧ xs)) ∧ (t2 ⇔ (x ∧ l))
    g5 = (t1 ⇔ (x ∧ xs)) ∧ (t2 ⇔ (x ∧ h))
    g6 = m ∧ x

Builtins that occur in the source, such as the tests =< and >, are handled by augmenting the abstract program with fresh predicates, =<' and >', which express the grounding behaviour of the builtins. The ⋄ symbol separates an assertion (the required mode) from another Pos formula describing the grounding behaviour of a successful call to the builtin (the success mode). For example, the formula g6 left of ⋄ in the =<' clause asserts that the =< test will error if its first two arguments are not ground, whereas the g6 right of ⋄ describes the state that holds if the test succeeds. These formulae do not coincide for all builtins (see Table 1). For quicksort, the only non-trivial assertions arise from builtins. This would change if the programmer introduced assertions for verification (Puebla et al., 2000a).
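The two projection operators are easy to prototype. The following Python sketch is our own illustration, not the paper's implementation: formulas over {x, y} are represented by their model sets, and it reproduces the example above, namely ∃x(x ⇐ y) = 1 while ∀x(x ⇐ y) = 0, since f[x↦0] ∧ f[x↦1] = ¬y falls outside Pos.

```python
from itertools import product

VARS = ("x", "y")
ALL = frozenset(
    frozenset(v for v, b in zip(VARS, bits) if b)
    for bits in product((0, 1), repeat=len(VARS))
)
TOP = frozenset(VARS)  # the all-ones assignment

def is_pos(f):
    # A function is in Pos iff f(1,...,1) = 1; the empty set is the added bottom 0.
    return TOP in f or not f

def exists(v, f):
    # Existential projection: f[v->0] \/ f[v->1] (always stays inside Pos).
    return frozenset(m for m in ALL if (m - {v}) in f or (m | {v}) in f)

def forall(v, f):
    # Universal projection: f[v->0] /\ f[v->1], weakened to 0 outside Pos.
    g = frozenset(m for m in ALL if (m - {v}) in f and (m | {v}) in f)
    return g if is_pos(g) else frozenset()

# f = (x <= y), i.e. y implies x; its models are {}, {x} and {x, y}.
f = frozenset({frozenset(), frozenset({"x"}), frozenset({"x", "y"})})

assert exists("x", f) == ALL          # Ex(x <= y) = 1
assert forall("x", f) == frozenset()  # Ax(x <= y) = 0, since ~y is not in Pos
assert forall("x", f) <= f <= exists("x", f)  # duality: Ax(f) |= f |= Ex(f)
```

Entailment |= is simply subset inclusion on model sets, which is why the duality check is a chained subset comparison.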
2.3 Least fixpoint calculation

An iterative algorithm is used to compute the lfp and thereby characterise the success patterns of the program. A success pattern is a pair consisting of an atom with distinct variables for arguments paired with a Pos formula over those variables. Renaming and equality of formulae induce an equivalence between success patterns which is needed to detect the fixpoint. The patterns ⟨p(u,w,v), u ∧ (w ⇔ v)⟩ and ⟨p(x1,x2,x3), (x3 ⇔ x2) ∧ x1⟩, for example, are considered to be identical: both express the same inter-argument groundness dependencies. Each iteration produces a set of success patterns: at most one pair for each predicate in the program.

2.3.1 Upper approximation of success patterns

A success pattern records an inter-argument groundness dependency that describes the binding effects of executing a predicate. If ⟨p(~x), f⟩ correctly describes the predicate p, and g holds whenever f holds, then ⟨p(~x), g⟩ also correctly describes p. Success patterns can thus be approximated from above without compromising correctness.

Iteration is performed in a bottom-up fashion and commences with F_0 = ∅. F_{j+1} is computed from F_j by considering each clause p(~x) ← d⋄f, p1(~x1),...,pn(~xn) in turn. Initially F_{j+1} = ∅. The success pattern formulae f_i for the n body atoms are conjoined with f to obtain g = f ∧ f1 ∧ ... ∧ fn. Variables not present in p(~x), Y say, are then eliminated from g by computing g′ = ∃_Y(g) (weakening g) where ∃_{y1,...,yn}(g) = ∃_{y1}(...∃_{yn}(g)). Weakening g does not compromise correctness because success patterns can be safely approximated from above.

2.3.2 Weakening upper approximations

If F_{j+1} already contains a pattern of the form ⟨p(~x), g′′⟩, then this pattern is replaced with ⟨p(~x), g′ ∨ g′′⟩, otherwise F_{j+1} is revised to include ⟨p(~x), g′⟩. Thus the success patterns become progressively weaker on each iteration. Again, correctness is preserved because success patterns can be safely approximated from above.
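The conjoin-then-project step can be run directly on the recursive qs clause. The Python sketch below is our own illustration (the paper gives no implementation): it conjoins g2 with the body success patterns taken from F1, existentially projects onto the head variables t1, s, t, and recovers the qs entry of F2, namely s ⇔ (t1 ∧ t).

```python
from itertools import product

VARS = ("t1", "s", "t", "m", "xs", "l", "h", "r", "t3")
ALL = frozenset(
    frozenset(v for v, b in zip(VARS, bits) if b)
    for bits in product((0, 1), repeat=len(VARS))
)

def models(fn):
    return frozenset(m for m in ALL if fn(**{v: int(v in m) for v in VARS}))

def exists(v, f):
    # f[v->0] \/ f[v->1]
    return frozenset(m for m in ALL if (m - {v}) in f or (m | {v}) in f)

# g2 abstracts the constraints t1 = [m|xs] and t3 = [m|r].
g2 = models(lambda t1, s, t, m, xs, l, h, r, t3:
            t1 == (m and xs) and t3 == (m and r))

# Success patterns from F1, renamed to the body atoms' arguments:
# pt(xs,m,l,h) gives xs /\ l /\ h; qs(l,s,t3) gives l /\ (s <-> t3);
# qs(h,r,t) gives h /\ (r <-> t).
body = models(lambda t1, s, t, m, xs, l, h, r, t3:
              xs and l and h and s == t3 and r == t)

g = g2 & body
for v in ("m", "xs", "l", "h", "r", "t3"):  # project out body-only variables
    g = exists(v, g)

# The projection is exactly s <-> (t1 /\ t): the qs entry of F2.
assert g == models(lambda t1, s, t, m, xs, l, h, r, t3: s == (t1 and t))

# F1's qs entry, renamed to the head arguments, entails the new pattern:
# the join of 2.3.2 genuinely weakens it.
f1_qs = models(lambda t1, s, t, m, xs, l, h, r, t3: t1 and s == t)
assert f1_qs <= g
```

Brute-forcing all 512 assignments is obviously not how a real analyser represents Pos, but it makes each step of the iteration checkable by inspection.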
2.3.3 Least fixpoint calculation for Quicksort

For brevity, let ~u = ⟨x1,x2⟩, ~v = ⟨x1,x2,x3⟩ and ~w = ⟨x1,x2,x3,x4⟩. Then the lfp for the abstracted Quicksort program is obtained (and checked) in the following 3 iterations:

    F_1 = { ⟨qs(~v), x1 ∧ (x2 ⇔ x3)⟩, ⟨pt(~w), x1 ∧ x3 ∧ x4⟩,
            ⟨=<'(~u), x1 ∧ x2⟩, ⟨>'(~u), x1 ∧ x2⟩ }

    F_2 = { ⟨qs(~v), x2 ⇔ (x1 ∧ x3)⟩, ⟨pt(~w), x1 ∧ x3 ∧ x4⟩,
            ⟨=<'(~u), x1 ∧ x2⟩, ⟨>'(~u), x1 ∧ x2⟩ }

Finally, F_3 = F_2. The space of success patterns forms a complete lattice which ensures that a lfp (a most precise solution) exists. The iterative process will always terminate since the space is finite and hence the number of times each success pattern can be updated is also finite. Moreover, it will converge onto the lfp since iteration commences with the bottom element F_0 = ∅.

Observe that F_2, the lfp, faithfully describes the grounding behaviour of quicksort: a qs goal will ground its second argument if it is called with its first and third arguments already ground, and vice versa. Note that assertions are not considered in the lfp calculation.

2.4 Greatest fixpoint calculation

A bottom-up strategy is used to compute a gfp and thereby characterise the safe call patterns of the program. A safe call pattern describes queries that do not violate the assertions. A call pattern has the same form as a success pattern (so there is one call pattern per predicate rather than one per clause). One starts with assuming no call causes an error and then checks this assumption by reasoning backwards over all clauses. If an assertion is violated, the set of safe call patterns for the involved predicate is strengthened (made smaller), and the whole process is repeated until the assumptions turn out to be valid (the gfp is reached).
2.4.1 Lower approximation of safe call patterns

Iteration commences with D_0 = {⟨p(~x), 1⟩ | p ∈ Π} where Π is the set of predicate symbols occurring in the program. An iterative algorithm incrementally strengthens the call pattern formulae until they only describe queries which lead to computations that satisfy the assertions. Note that call patterns describe a subset (rather than a superset) of those queries which are safe. Call patterns are thus lower approximations, in contrast to success patterns which are upper approximations. Put another way, if ⟨p(~x), g⟩ correctly describes some safe call patterns of p, and g holds whenever f holds, then ⟨p(~x), f⟩ also correctly describes some safe call patterns of p. Call patterns can thus be approximated from below without compromising correctness (but not from above).

D_{k+1} is computed from D_k by considering each p(~x) ← d⋄f, p1(~x1),...,pn(~xn) in turn and calculating a formula that characterises its safe calling modes. Initially set D_{k+1} = D_k. A safe calling mode is calculated by propagating moding requirements right-to-left by repeated application of the logical operator ⇒. More exactly, let f_i denote the success pattern formula for p_i(~x_i) in the previously computed lfp and let d_i denote the call pattern formula for p_i(~x_i) in D_k. Set e_{n+1} = 1 and then compute e_i = d_i ∧ (f_i ⇒ e_{i+1}) for 1 ≤ i ≤ n. Each e_i describes a safe calling mode for the compound goal p_i(~x_i),...,pn(~xn).

2.4.2 Intuition and explanation

The intuition behind the symbolism is that d_i represents the demand that is already known for p_i(~x_i) not to error, whereas e_i is d_i possibly strengthened with extra demands so as to ensure that the sub-goal p_{i+1}(~x_{i+1}),...,pn(~xn) also does not error when executed immediately after p_i(~x_i). Put another way, anything larger than d_i may possibly cause an error when executing p_i(~x_i), and anything larger than e_i may possibly cause an error when executing p_i(~x_i),...,pn(~xn).
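The right-to-left propagation can be phrased as a small fold. The sketch below is ours, and the demand and success formulas in it are hypothetical, chosen only to exercise the recurrence: it computes e_i = d_i ∧ (f_i ⇒ e_{i+1}) backward over a two-atom body and shows how a disjunctive demand such as x ∨ y can arise.

```python
from itertools import product

VARS = ("x", "y")
ALL = frozenset(
    frozenset(v for v, b in zip(VARS, bits) if b)
    for bits in product((0, 1), repeat=len(VARS))
)

def models(fn):
    return frozenset(m for m in ALL if fn("x" in m, "y" in m))

def implies(f, g):
    # f => g holds on a model iff f fails on it or g holds on it.
    return (ALL - f) | g

def backward(body):
    """e_{n+1} = 1; e_i = d_i /\ (f_i => e_{i+1}), walking right to left."""
    e = ALL
    for d, f in reversed(body):
        e = d & implies(f, e)
    return e

one = ALL
# Hypothetical body: atom 1 has no demand and success pattern x <-> y;
# atom 2 demands x /\ y and (say) grounds nothing further.
body = [(one, models(lambda x, y: x == y)),    # (d1, f1)
        (models(lambda x, y: x and y), one)]   # (d2, f2)

e1 = backward(body)
# e2 = d2 = x /\ y, and e1 = (x <-> y) => (x /\ y) = x \/ y:
# a disjunctive safe calling mode, as promised in section 1.2.
assert e1 == models(lambda x, y: x or y)
```

The fold also makes the base case visible: since f ⇒ 1 = 1, the rightmost atom contributes exactly its own demand, e_n = d_n.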
The basic inductive step in the analysis is to compute an e_i which ensures that p_i(~x_i),...,pn(~xn) does not error, given d_i and e_{i+1} which respectively ensure that p_i(~x_i) and p_{i+1}(~x_{i+1}),...,pn(~xn) do not error. This step translates a demand after the call to p_i(~x_i) into a demand before the call to p_i(~x_i). The tactic is to set e_{n+1} = 1 and then compute e_i = d_i ∧ (f_i ⇒ e_{i+1}) for i ≤ n. This tactic is best explained by unfolding the definitions of e_n, then e_{n−1}, then e_{n−2}, and so on. This reverse ordering reflects the order in which the e_i are computed; the e_i are computed whilst walking backward across the clause. Any calling mode is safe for the empty goal and hence e_{n+1} = 1. Note that e_n = d_n ∧ (f_n ⇒ e_{n+1}) = d_n ∧ (¬f_n ∨ 1) = d_n. Hence e_n represents a safe calling mode for the goal pn(~xn).

Observe that e_i should not be larger than d_i, otherwise an error may occur while executing p_i(~x_i). Observe too that if p_i(~x_i),...,pn(~xn) is called with a mode described by d_i, then p_{i+1}(~x_{i+1}),...,pn(~xn) is called with a mode described by (d_i ∧ f_i) since f_i describes the success patterns of p_i(~x_i). The mode (d_i ∧ f_i) may satisfy the e_{i+1} demand. If it does not, then the minimal extra demand is added to (d_i ∧ f_i) so as to satisfy e_{i+1}. This minimal extra demand is ((d_i ∧ f_i) ⇒ e_{i+1}) – the weakest mode that, in conjunction with (d_i ∧ f_i), ensures that e_{i+1} holds. Put another way, ((d_i ∧ f_i) ⇒ e_{i+1}) = ∨{f ∈ Pos | (d_i ∧ f_i) ∧ f |= e_{i+1}}.

Combining the requirements to satisfy p_i(~x_i) and then p_{i+1}(~x_{i+1}),...,pn(~xn) gives e_i = d_i ∧ ((d_i ∧ f_i) ⇒ e_{i+1}), which reduces to e_i = d_i ∧ (f_i ⇒ e_{i+1}) and corresponds to the tactic used in the basic inductive step.

2.4.3 Pseudo-complement

This step of calculating the weakest mode that, when conjoined with d_i ∧ f_i, implies e_{i+1}, is the very heart of the analysis.
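The claimed equality ((d_i ∧ f_i) ⇒ e_{i+1}) = ∨{f ∈ Pos | (d_i ∧ f_i) ∧ f |= e_{i+1}} can be checked exhaustively for a small instance. The Python sketch below is our own check: it takes d_i ∧ f_i = (x ⇔ y) and e_{i+1} = x ∧ y, enumerates every Pos function over {x, y}, and confirms that ⇒ yields the weakest safe strengthening.

```python
from itertools import combinations, product

VARS = ("x", "y")
ALL = frozenset(
    frozenset(v for v, b in zip(VARS, bits) if b)
    for bits in product((0, 1), repeat=len(VARS))
)
TOP = frozenset(VARS)

def implies(f, g):
    return (ALL - f) | g

# Every Boolean function over {x, y} is a subset of ALL; Pos keeps those
# mapping the all-ones assignment to 1, plus the added bottom element 0.
booleans = [frozenset(s) for n in range(len(ALL) + 1)
            for s in combinations(sorted(ALL, key=sorted), n)]
pos = [f for f in booleans if TOP in f or not f]

a = frozenset({frozenset(), TOP})  # d_i /\ f_i = (x <-> y)
b = frozenset({TOP})               # e_{i+1}    = x /\ y

# The join of all Pos functions g with a /\ g |= b ...
weakest = frozenset().union(*(g for g in pos if (a & g) <= b))
# ... is exactly a => b, here x \/ y.
assert weakest == implies(a, b)
assert weakest == ALL - {frozenset()}  # i.e. the models of x \/ y
```

In other words, on Pos the pseudo-complement is ordinary Boolean implication, which is what makes the inductive step both precise and cheap.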
Setting e_i = 0 would trivially achieve safety, but e_i should be as weak as possible to maximise the class of safe queries inferred. For Pos, computing the weakest e_i reduces to applying the ⇒ operator, but more generally, this step amounts to applying the pseudo-complement operator. The pseudo-complement operator (if it exists for a given abstract domain) takes, as input, two abstractions and returns, as output, the weakest abstraction whose conjunction with the first input abstraction is at least as strong as the second input abstraction. If the domain does not possess a pseudo-complement, then there is not always a unique weakest abstraction (whose conjunction with one given abstraction is at least as strong as another given abstraction).

To see this, consider the domain Def (Armstrong et al., 1998) which does not possess a pseudo-complement. Def is the sub-class of Pos that is definite (Armstrong et al., 1998). This means that Def has the special property that each of its Boolean functions can be expressed as a (possibly empty) conjunction of propositional Horn clauses. As with Pos, Def is assumed to be augmented with the bottom element 0. Def can thus represent the grounding dependencies x∧y, x, x⇔y, y, x⇐y, x⇒y, 0 and 1, but not x∨y. Suppose that d_i ∧ f_i = (x ⇔ y) and e_{i+1} = (x ∧ y). Then conjoining x with d_i ∧ f_i would be at least as strong as e_{i+1} and, symmetrically, conjoining y with d_i ∧ f_i would be at least as strong as e_{i+1}. However, Def does not contain a Boolean function strictly weaker than both x and y, namely x∨y, whose conjunction with d_i ∧ f_i is at least as strong as e_{i+1}. Thus setting e_i = x or e_i = y would be safe, but setting e_i = (x∨y) is prohibited because x∨y falls outside Def. Moreover, setting e_i = 0 would lose an unacceptable degree of precision. A choice would thus have to be made between setting e_i = x and e_i = y in some arbitrary fashion, so there would be no clear tactic for maximising precision.
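This Def counterexample can also be checked mechanically. The Python sketch below is ours: it lists the eight Def functions over {x, y} as model sets, collects those whose conjunction with x ⇔ y entails x ∧ y, and confirms that there are two maximal candidates, x and y, whose join x ∨ y falls outside Def.

```python
E = frozenset()
X, Y = frozenset({"x"}), frozenset({"y"})
T = frozenset({"x", "y"})

# The eight Def functions over {x, y}, each given by its model set.
DEF = {
    "0": frozenset(), "x/\\y": frozenset({T}), "x<->y": frozenset({E, T}),
    "x": frozenset({X, T}), "y": frozenset({Y, T}),
    "x<-y": frozenset({E, X, T}), "x->y": frozenset({E, Y, T}),
    "1": frozenset({E, X, Y, T}),
}

a = DEF["x<->y"]  # d_i /\ f_i
b = DEF["x/\\y"]  # e_{i+1}

# Def candidates g with (d_i /\ f_i) /\ g |= e_{i+1}:
cand = {name: g for name, g in DEF.items() if (a & g) <= b}
maximal = {name for name, g in cand.items()
           if not any(g < h for h in cand.values())}

assert maximal == {"x", "y"}               # two incomparable weakest choices
assert DEF["x"] | DEF["y"] not in DEF.values()  # their join x \/ y is outside Def
```

The brute-force search thus reproduces the dilemma in the text: within Def there is no single weakest safe strengthening, only an arbitrary choice between x and y.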
Returning to the compound goal p_i(~x_i),...,pn(~xn), a call described by the mode d_i ∧ ((d_i ∧ f_i) ⇒ e_{i+1}) is thus sufficient to ensure that neither p_i(~x_i) nor the sub-goal p_{i+1}(~x_{i+1}),...,pn(~xn) errors. Since d_i ∧ ((d_i ∧ f_i) ⇒ e_{i+1}) = d_i ∧ (f_i ⇒ e_{i+1}) = e_i, it follows that p_i(~x_i),...,pn(~xn) will not error if its call is described by e_i. In particular, it follows that e_1 describes a safe calling mode for the body atoms of the clause p(~x) ← d⋄f, p1(~x1),...,pn(~xn).

The next step is to calculate g = d ∧ (f ⇒ e_1). The abstraction f describes the grounding behaviour of the Herbrand constraint added to the store prior to executing the body atoms. Thus (f ⇒ e_1) describes the weakest mode that, in conjunction with f, ensures that e_1 holds, and hence that the body atoms are called safely. Hence d ∧ (f ⇒ e_1) represents the weakest demand that both satisfies the body atoms and the assertion d. One subtlety, which relates to the abstraction process, is that d is required to be a lower approximation of the assertion whereas f is required to be an upper approximation of the constraint. Put another way, if the mode d describes the binding on the store, then the (concrete) assertion is satisfied, whereas if the (concrete) constraint is added to the store, then the store is described by the mode f. Table 1 details how to abstract various builtins for groundness for a declarative subset of ISO Prolog.

2.4.4 Strengthening lower approximations

Variables not present in p(~x), Y say, are then eliminated by g′ = ∀_Y(g) (strengthening g) where ∀_{y1,...,yn}(g) = ∀_{y1}(...∀_{yn}(g)). A safe calling mode for this particular clause is then given by g′. Eliminating variables from g by strengthening g is unusual and initially appears strange. Recall, however, that call patterns can be approximated from below without compromising correctness (but not from above).
In particular, the standard projection tactic of computing ∃_{y1,...,yn}(g) would result in an upper approximation of g that possibly describes a larger set of concrete call patterns, which would be incorrect. The direction of approximation thus dictates that eliminating the variables Y from g must strengthen g. Indeed, g holds whenever ∀_{yi}(g) holds and therefore g holds whenever ∀_{y1,...,yn}(g) holds, as required.

D_{k+1} will contain a call pattern ⟨p(~x), g′′⟩ and, assuming g′ ∧ g′′ ≠ g′′, this is updated with ⟨p(~x), g′ ∧ g′′⟩. Thus the call patterns become progressively stronger on each iteration. Correctness is preserved because call patterns can be safely approximated from below. The space of call patterns forms a complete lattice which ensures that a gfp exists. In fact, because call patterns are approximated from below, the gfp is the most precise solution, and therefore the desired solution. (This
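The soundness difference between the two projections can be seen on a tiny example. In the Python sketch below (ours; the formula is hypothetical), g = x ∨ y is a safe condition involving a local variable y: universal projection yields the safe strengthening x, whereas existential projection would yield 1 and wrongly declare every call safe.

```python
from itertools import product

VARS = ("x", "y")
ALL = frozenset(
    frozenset(v for v, b in zip(VARS, bits) if b)
    for bits in product((0, 1), repeat=len(VARS))
)
TOP = frozenset(VARS)

def exists(v, f):
    # Ev(f) = f[v->0] \/ f[v->1]: an upper approximation of f.
    return frozenset(m for m in ALL if (m - {v}) in f or (m | {v}) in f)

def forall(v, f):
    # Av(f) = f[v->0] /\ f[v->1], weakened to 0 if it falls outside Pos.
    g = frozenset(m for m in ALL if (m - {v}) in f and (m | {v}) in f)
    return g if (TOP in g or not g) else frozenset()

g = ALL - {frozenset()}  # g = x \/ y, with y local to the clause body

assert forall("y", g) == frozenset({frozenset({"x"}), TOP})  # Ay(g) = x
assert forall("y", g) <= g     # the result entails g, as soundness demands
assert exists("y", g) == ALL   # Ey(g) = 1 would over-approximate g
```

Strengthening to x says: whatever the local y turns out to be, a call with x ground stays safe, which is exactly the lower-approximation discipline the gfp requires.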
