Causal Commutative Arrows Revisited
Jeremy Yallop (University of Cambridge)
Hai (Paul) Liu (Intel Labs)
September 21, 2016

Normalization as an optimization technique?
- Plausible, because it preserves semantics.
- Effective, when conditions are met:
  - It has to terminate;
  - It gives a simpler program as a result;
  - It enables other optimizations.
- ...with a few catches:
  - Strong normalization can be too restrictive;
  - Sharing is hard to preserve;
  - Static or dynamic implementation?

Arrows
Arrows are a generalization of monads (Hughes 2000).

  class Arrow (arr :: ∗ → ∗ → ∗) where
    arr   :: (a → b) → arr a b
    (≫)   :: arr a b → arr b c → arr a c
    first :: arr a b → arr (a,c) (b,c)

  class Arrow arr ⇒ ArrowLoop arr where
    loop :: arr (a,c) (b,c) → arr a b

[Diagrams: (a) arr f, (b) f ≫ g, (c) first f, (d) second f, (e) f *** g, (f) loop f]
(A concrete instance of these classes is sketched after the slides below.)

Arrow and ArrowLoop laws

  arr id ≫ f                   ≡ f                                          (left identity)
  f ≫ arr id                   ≡ f                                          (right identity)
  (f ≫ g) ≫ h                  ≡ f ≫ (g ≫ h)                               (associativity)
  arr (g . f)                  ≡ arr f ≫ arr g                              (composition)
  first (arr f)                ≡ arr (f × id)                               (extension)
  first (f ≫ g)                ≡ first f ≫ first g                          (functor)
  first f ≫ arr (id × g)       ≡ arr (id × g) ≫ first f                     (exchange)
  first f ≫ arr fst            ≡ arr fst ≫ f                                (unit)
  first (first f) ≫ arr assoc  ≡ arr assoc ≫ first f                        (association)
  loop (first h ≫ f)           ≡ h ≫ loop f                                 (left tightening)
  loop (f ≫ first h)           ≡ loop f ≫ h                                 (right tightening)
  loop (f ≫ arr (id × k))      ≡ loop (arr (id × k) ≫ f)                    (sliding)
  loop (loop f)                ≡ loop (arr assoc⁻¹ . f . arr assoc)          (vanishing)
  second (loop f)              ≡ loop (arr assoc . second f . arr assoc⁻¹)   (superposing)
  loop (arr f)                 ≡ arr (trace f)                               (extension)

Normalizing arrows (a dataflow example)
[Figure: (a) the original dataflow network; (b) its normalized form.]
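To make the Arrow and ArrowLoop classes concrete, here is a minimal sketch of one standard instance: causal stream functions, written below as a type SF in plain ASCII Haskell (with (>>>) for ≫). The type and instances follow the usual definitions from the arrows literature rather than anything shown in this extract of the slides; the names SF, delay and runList are illustrative, not taken from the talk.

  import Prelude hiding (id, (.))
  import Control.Category (Category (..))
  import Control.Arrow

  -- A causal stream function: given one input sample, produce one output
  -- sample together with the continuation to use for the rest of the stream.
  newtype SF a b = SF { runSF :: a -> (b, SF a b) }

  instance Category SF where
    id = SF (\a -> (a, id))
    SF g . SF f = SF (\a -> let (b, f') = f a
                                (c, g') = g b
                            in (c, g' . f'))

  instance Arrow SF where
    arr f        = SF (\a -> (f a, arr f))
    first (SF f) = SF (\(a, c) -> let (b, f') = f a in ((b, c), first f'))

  instance ArrowLoop SF where
    -- The feedback wire is tied with a lazy, recursive binding.
    loop (SF f) = SF (\a -> let ((b, c), f') = f (a, c) in (b, loop f'))

  -- A unit delay: emit the seed first, thereafter emit the previous input.
  -- This is what keeps feedback loops causal.
  delay :: a -> SF a a
  delay i = SF (\a -> (i, delay a))

  -- Run a stream function over a finite list of samples.
  runList :: SF a b -> [a] -> [b]
  runList _      []       = []
  runList (SF f) (x : xs) = let (y, sf') = f x in y : runList sf' xs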
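The laws above can be exercised directly on this instance. As a small illustrative example (assuming the SF sketch above; this is not the dataflow example from the slides), the composition law fuses a chain of pure stages into a single arr, and delay-guarded feedback through loop gives a running sum.

  -- Composition law:  arr f ≫ arr g  ≡  arr (g . f)
  -- Both pipelines map each sample x to (x + 1) * 2.
  pipeline, pipelineFused :: SF Int Int
  pipeline      = arr (+ 1) >>> arr (* 2)
  pipelineFused = arr ((* 2) . (+ 1))

  -- A stateful example: a running sum, written as a loop whose feedback
  -- wire carries the sum so far, guarded by a unit delay.
  runningSum :: SF Int Int
  runningSum = loop (arr (\(x, s) -> let y = x + s in (y, y)) >>> second (delay 0))

  -- ghci> runList pipeline      [1, 2, 3]      -- [4, 6, 8]
  -- ghci> runList pipelineFused [1, 2, 3]      -- [4, 6, 8]
  -- ghci> runList runningSum    [1, 2, 3, 4]   -- [1, 3, 6, 10]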