On the survival probability of a random walk in random environment with killing

Stefan Junk
Technische Universität München, Department of Mathematics, M14 - Chair for Probability Theory, Boltzmannstr. 3, 85747 Garching bei München, Germany
E-mail address: [email protected]
URL: https://www-m14.ma.tum.de/personen/junk/

Abstract. We consider one-dimensional random walks in random environment where, every time the process stays at a location, it dies with a fixed probability. Under some mild assumptions it is easy to show that the survival probability goes to zero as time tends to infinity. In this paper we derive formulas for the rate at which this probability decays. It turns out that there are three distinct regimes, depending on the law of the environment.

2000 Mathematics Subject Classification: 60K37.
Key words and phrases: Random walk in random environment, survival.

1. Introduction

We use the following notation to describe random walks in random environments (RWREs) on Z in i.i.d. environments. Let P be a probability measure on

\[ \Omega := \{ (\omega_x)_{x\in\mathbb Z} = (\omega_x^+,\omega_x^0,\omega_x^-)_{x\in\mathbb Z} \;|\; \forall x:\ \omega_x^+,\omega_x^0,\omega_x^- \ge 0,\ \omega_x^+ + \omega_x^0 + \omega_x^- = 1 \}. \]

We interpret an element ω ∈ Ω as the transition probabilities for a random walk on Z: let (X_n)_{n∈N} be a Markov chain on Z with transition kernel

\[ P_\omega(X_{n+1}=z+1 \mid X_n=z) := \omega_z^+, \qquad P_\omega(X_{n+1}=z-1 \mid X_n=z) := \omega_z^-, \qquad P_\omega(X_{n+1}=z \mid X_n=z) := \omega_z^0. \]

We write P_ω^z := P_ω( · | X_0 = z), and omit the z if the random walk starts in zero. We are interested in the joint probability measure

\[ \mathbb P^z := P \times P_\omega^z \]

of environment and random walk. An important assumption will be that P is a product measure, meaning that {ω_x : x ∈ Z} is i.i.d. under P.

It is reasonable to demand that, P-almost surely, ω_0^+ > 0 and ω_0^- > 0, so that (with probability one) the process can move infinitely far in both directions. We make the even stronger assumption that (P-almost surely) there is a uniform lower bound for the probability of going to the left or right. This condition is usually called "uniform ellipticity" (UE):

\[ \exists\, \varepsilon_0 > 0 \ \text{ s.t. } \ P(\omega_0^+ \ge \varepsilon_0) = P(\omega_0^- \ge \varepsilon_0) = 1. \]  (UE)

An important quantity is

\[ \rho_i := \frac{\omega_i^-}{\omega_i^+} \quad \text{for } i\in\mathbb Z. \]  (1.1)

An overview of results for this kind of random walk in random environment can be found for example in Zeitouni (2004).

We now introduce the model we are interested in. It is originally motivated by two papers on statistical mechanics (Giacometti et al. (1994) and Giacometti and Murthy (1996)), where a higher-dimensional version was used to describe polymers folding in a solution with random variations in the local density. From this physical motivation it is desirable that the RWRE stays in parts of the environment where the probability of staying at the same location (which can be interpreted as a local density of the solution) is small. We model this by choosing some r ∈ (0,1), and whenever the process stays in place, it dies with probability r.

Remark 1.1. The results in this paper are for measures P on the environment where the survival probability is dominated by events depending only on P, so that the parameter r does not appear in the result. See also the remarks in Section 3.5.

Formally, consider a probability space as above where we additionally have a sequence (ξ_n)_n of i.i.d. Bernoulli random variables with success probability r, independent of environment and random walk. Then we can define the extinction time τ by

\[ \tau := \inf\{ n \ge 1 : X_n = X_{n-1},\ \xi_n = 1 \}. \]  (1.2)

If we assume that there is a positive probability for extinction, that is,

\[ P(\omega_0^0 > 0) > 0, \]

then it is easy to show that lim_{n→∞} ℙ(τ > n) = 0. We are interested in the asymptotic behavior of ℙ(τ > n).
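To make the model concrete, here is a minimal Python sketch (ours, purely illustrative; it plays no role in the analysis below). It samples an i.i.d. uniformly elliptic environment, runs the lazy walk, kills it with probability r on every lazy step, and estimates ℙ(τ > n) by Monte Carlo; resampling the environment in every trial corresponds to the annealed measure ℙ = P × P_ω.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_environment(size, eps0=0.1):
    # i.i.d. sites (omega^+, omega^0, omega^-) with omega^+, omega^- >= eps0 (uniform ellipticity);
    # the remaining mass 1 - omega^+ - omega^- is the staying probability omega^0.
    plus = rng.uniform(eps0, 0.5, size)
    minus = rng.uniform(eps0, 0.5, size)
    return plus, minus

def estimate_survival(n, r=0.5, trials=2000, size=2001):
    # Monte Carlo estimate of the annealed survival probability P(tau > n).
    survived = 0
    for _ in range(trials):
        plus, minus = sample_environment(size)
        x, alive = size // 2, True          # start in the middle; size > 2n keeps the walk inside
        for _ in range(n):
            u = rng.random()
            if u < plus[x]:
                x += 1
            elif u < plus[x] + minus[x]:
                x -= 1
            elif rng.random() < r:          # lazy step: the walk dies with probability r
                alive = False
                break
        survived += alive
    return survived / trials

print(estimate_survival(n=200))
```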
2. Preliminaries

One can easily give a lower bound for the survival probability by considering a set of environments in which the survival probability is large, and such that the probability of such a favorable environment is not too small. In this section we introduce the notion of a valley, which will play the role of such an environment.

For some fixed ω ∈ Ω we define the potential function as follows:

\[ V:\mathbb R\to\mathbb R, \qquad x \mapsto \begin{cases} \sum_{i=0}^{\lfloor x\rfloor} \ln\rho_i & \text{if } \lfloor x\rfloor > 0,\\ 0 & \text{if } \lfloor x\rfloor = 0,\\ -\sum_{i=\lfloor x\rfloor}^{-1} \ln\rho_i & \text{if } \lfloor x\rfloor < 0. \end{cases} \]  (2.1)

For some interval [a,c] and an environment ω we define the following quantities:

\[ H_+(a,c) := \max_{b\in[a,c]} \Big( \max_{x\in[b,c]} V(x) - \min_{x\in[a,b]} V(x) \Big), \]
\[ H_-(a,c) := \max_{b\in[a,c]} \Big( \max_{x\in[a,b]} V(x) - \min_{x\in[b,c]} V(x) \Big), \]
\[ H(a,c) := \min\{ H_+(a,c), H_-(a,c) \}. \]

See also Figure 2.1 for an illustration. When no confusion can occur, we simply write H_+, H_- and H.

Figure 2.1. [Sketch of the potential V(x) over an interval [a,c], with an intermediate point b and the heights H_+ and H_- indicated.] We denote by H_- the maximal difference V(x) - V(y) in the potential between any two points x < y in [a,b]. The same holds for H_+ with x < y replaced by x > y. Starting from the point of minimal potential, the random walk has to overcome a potential difference of at least H_- ∧ H_+ = H to leave the valley.

We denote the first hitting time of the boundary by

\[ U = U_{a,c} := \inf\{ n\ge 0 : X_n = a \ \vee\ X_n = c \}. \]

Then we have the following two lemmas, which give upper and lower bounds on the probability of leaving a valley.

Lemma 2.1. Let ω ∈ Ω be an environment such that

\[ \forall x\in\mathbb Z:\quad \omega_x^0 = 0,\ \ \omega_x^+ \ge \varepsilon_0,\ \ \omega_x^- \ge \varepsilon_0, \]

where ε_0 is the constant from (UE). Then there exist constants γ_1 > 0 and γ_2 = γ_2(ε_0) > 0 such that for c - a ≥ γ_2(ε_0) and for all n ≥ 1 we have:

\[ \max_{x\in(a,c)} P_\omega^x\left( \frac{U_{a,c}}{\gamma_1 (c-a)^4 e^{H(a,c)}} > n \right) \le e^{-n}. \]  (2.2)

Lemma 2.2. Let ω be as in Lemma 2.1, and assume we find some b ∈ (a,c) such that

\[ V(c) = \max_{x\in[b,c]} V(x), \qquad V(a) = \max_{x\in[a,b]} V(x). \]  (2.3)

Then there are γ_3, γ_4 > 0 such that for c - a ≥ γ_4 and for all n ≥ 1 we have

\[ \min_{x\in(a,c)} P_\omega^x\left( \gamma_3 \ln(2(c-a))\, \frac{U_{a,c}}{e^{H(a,c)}} > n \right) \ge \frac{1}{2(c-a)}\, e^{-n}. \]  (2.4)

This version of Lemmas 2.1 and 2.2 is taken from Gantert et al. (2009, p. 23), whereas the proofs can be found in Gantert et al. (2010, p. 15). Note that the point b of minimal potential appears only in the assumption, but not in the conclusion of Lemma 2.2. But if we restrict ourselves to a random walk starting from b, we can get rid of the assumption that the environment attains maximal potential at the edges:

Corollary 2.3. Under the assumptions of Lemma 2.1, let b be the point with minimal potential in (a,c). Then for all c - a ≥ γ_4 and all n ≥ 1 we have

\[ P_\omega^b\left( \gamma_3 \ln(2(c-a))\, \frac{U_{a,c}}{e^{H(a,c)}} > n \right) \ge \frac{1}{2(c-a)}\, e^{-n}. \]  (2.5)

Here we can use the same constants γ_3, γ_4 as in Lemma 2.2.

Proof of the corollary: Choose ā ∈ [a,b] and c̄ ∈ [b,c] such that V(ā) = max_{x∈[a,b]} V(x) and V(c̄) = max_{x∈[b,c]} V(x). Then [ā,c̄] satisfies (2.3), and we have H(a,c) = H(ā,c̄). Using Lemma 2.2 we get

\[ P_\omega^b\left( \gamma_3 \ln(2(c-a))\, \frac{U_{a,c}}{e^{H(a,c)}} > n \right) \ge P_\omega^b\left( \gamma_3 \ln(2(\bar c-\bar a))\, \frac{U_{\bar a,\bar c}}{e^{H(\bar a,\bar c)}} > n \right) \ge \frac{1}{2(\bar c-\bar a)}\, e^{-n} \ge \frac{1}{2(c-a)}\, e^{-n}. \qquad \square \]
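For illustration, the following small Python sketch (ours, not used in any argument) evaluates H_+(a,c), H_-(a,c) and H(a,c) directly from the increments ln ρ_i on an interval; since only differences of the potential enter, the boundary convention in (2.1) plays no role here.

```python
import numpy as np

def valley_depth(log_rho):
    # log_rho[i] = ln(rho_i) for the sites of an interval [a, c], listed from left to right.
    V = np.concatenate(([0.0], np.cumsum(log_rho)))      # potential at the grid points of [a, c]
    prefix_min = np.minimum.accumulate(V)                # min over [a, b] as b runs through the grid
    prefix_max = np.maximum.accumulate(V)
    suffix_min = np.minimum.accumulate(V[::-1])[::-1]    # min over [b, c]
    suffix_max = np.maximum.accumulate(V[::-1])[::-1]
    H_plus = np.max(suffix_max - prefix_min)             # max_b ( max_[b,c] V - min_[a,b] V )
    H_minus = np.max(prefix_max - suffix_min)            # max_b ( max_[a,b] V - min_[b,c] V )
    return min(H_plus, H_minus)

# Example: potential decreasing for 5 steps, then increasing for 5 steps -- a valley of depth 5.
print(valley_depth(np.array([-1.0] * 5 + [1.0] * 5)))    # 5.0
```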
Let b_1, b_2, h > 0, x ∈ Z and k ∈ N. We define the event that there is a valley of depth h ln n in the interval I_n := [x - b_1 ln n, x + b_2 ln n] around the location x by

\[ S_n^k(x,b_1,b_2,h) := \Big\{ \forall i\in I_n : \omega_i^0 \le \tfrac1k \Big\} \cap \Big\{ V(x) = \min_{y\in I_n} V(y),\ \ V(x-b_1\ln n) - V(x) \ge h\ln n,\ \ V(x+b_2\ln n) - V(x) \ge h\ln n \Big\}. \]  (2.6)

Note that since we extended the definition of V to R, this event depends only on the locations in [x + ⌊-b_1 ln n⌋, x + ⌊b_2 ln n⌋] ∩ Z.

In the definition of S_n^k, the left event ensures that, while the random walk stays inside I_n, the probability of dying is not too large. The right event implies for ω ∈ S_n^k(x,b_1,b_2,h) that

\[ H(x - b_1\ln n,\ x + b_2\ln n) \ge h\ln n, \]

so that we can use Corollary 2.3 to bound the probability of leaving I_n.

For k = ∞, we denote

\[ S_n^\infty(x,b_1,b_2,h) := \bigcap_{k\in\mathbb N} S_n^k(x,b_1,b_2,h). \]

Equivalently, S_n^∞(x,b_1,b_2,h) can be defined as in (2.6), with the left event replaced by {∀i ∈ I_n : ω_i^0 = 0}.

3. Results

3.1. The polynomial case. We first cover the case where we can create a valley such that ω_·^0 is zero on the inside of the valley. Consequently, the random walk survives as long as we can ensure that it does not leave the valley.

Lemma 3.1. Assume

\[ p := \min\big\{ P(\ln\rho_0 > 0,\ \omega_0^0 = 0),\ P(\ln\rho_0 < 0,\ \omega_0^0 = 0) \big\} > 0. \]  (3.1)

Then for all b_1, b_2, h > 0 and k ∈ N ∪ {∞}, we have

\[ \lim_{n\to\infty} \frac{\ln P(S_n^k(x,b_1,b_2,h))}{-\ln n} = C_k(b_1,b_2,h), \]

where

\[ C_k(b_1,b_2,h) = -(b_1+b_2)\ln P\Big(\omega_0^0\le\tfrac1k\Big) + \sup_{t>0}\Big\{ ht - b_2 \ln E\Big(\rho_0^t \,\Big|\, \omega_0^0\le\tfrac1k\Big) \Big\} + \sup_{t>0}\Big\{ ht - b_1 \ln E\Big(\rho_0^{-t} \,\Big|\, \omega_0^0\le\tfrac1k\Big) \Big\}. \]  (3.2)

Furthermore, we set

\[ D_k(h) := \inf\big\{ C_k(b_1,b_2,h) : b_1, b_2 > 0 \big\}. \]

Then

\[ \lim_{k\to\infty} D_k(h) = D_\infty(h). \]

The main result of this paper is that in this case ℙ(τ > n) decays at a polynomial rate.

Theorem 3.2. Assumption (3.1) implies

\[ \lim_{n\to\infty} \frac{\ln \mathbb P(\tau > n)}{-\ln n} = D_\infty(1) \in (0,\infty). \]
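To illustrate the variational formula (3.2), the following sketch (ours; a crude grid search that is not used anywhere in the proofs) evaluates C_k numerically for a finitely supported conditional law of ln ρ_0. Note that the suprema in (3.2) are infinite when b_1 or b_2 is too small relative to h; the grid search then merely returns a large finite value.

```python
import numpy as np

def C_k(b1, b2, h, p_safe, log_rho_vals, probs, t_max=50.0, t_steps=20000):
    # Approximates formula (3.2) on a finite t-grid:
    #   C_k = -(b1+b2) ln P(omega^0 <= 1/k)
    #         + sup_t { h t - b2 ln E[rho^ t | omega^0 <= 1/k] }
    #         + sup_t { h t - b1 ln E[rho^-t | omega^0 <= 1/k] },
    # where (log_rho_vals, probs) is the conditional law of ln(rho_0) given {omega^0 <= 1/k}
    # and p_safe = P(omega^0 <= 1/k).
    t = np.linspace(1e-6, t_max, t_steps)
    log_mgf_plus = np.log(np.exp(np.outer(t, log_rho_vals)) @ probs)    # ln E[rho^t]
    log_mgf_minus = np.log(np.exp(np.outer(-t, log_rho_vals)) @ probs)  # ln E[rho^-t]
    sup_right = np.max(h * t - b2 * log_mgf_plus)
    sup_left = np.max(h * t - b1 * log_mgf_minus)
    return -(b1 + b2) * np.log(p_safe) + sup_right + sup_left

# Example: given a safe site, ln(rho_0) = +/- ln 2 with probability 1/2 each, and P(safe) = 0.3.
vals, probs = np.array([np.log(2.0), -np.log(2.0)]), np.array([0.5, 0.5])
print(C_k(b1=2.0, b2=2.0, h=1.0, p_safe=0.3, log_rho_vals=vals, probs=probs))
```

With these parameters both suprema are finite, since h = 1 is smaller than b_1 ln 2 and b_2 ln 2.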
3.2. Survival inside of a valley. In this section we cover the case where (3.1) is violated, meaning that there are no valleys with ω_·^0 = 0 on the inside. It may however happen that there are valleys consisting of locations that are not too dangerous, in the sense that ω_·^0 decays with n. We denote such an event by

\[ T_n(x,b_1,b_2,h) := S_n^n(x,b_1,b_2,h) \]  (3.3)
\[ = \Big\{ \forall i\in I_n : \omega_i^0 \le \tfrac1n \Big\} \cap \Big\{ V(x) = \min_{y\in I_n} V(y),\ \ V(x-b_1\ln n)-V(x) \ge h\ln n,\ \ V(x+b_2\ln n)-V(x) \ge h\ln n \Big\}. \]

To decide when T_n has positive probability we look at

\[ p_n^+ := P\Big( \ln\rho_0 > 0,\ \omega_0^0 \le \tfrac1n \Big), \qquad p_n^- := P\Big( \ln\rho_0 < 0,\ \omega_0^0 \le \tfrac1n \Big), \qquad p_n^0 := P\Big( \ln\rho_0 = 0,\ \omega_0^0 \le \tfrac1n \Big). \]

By monotone convergence we see that (3.1) being violated implies

\[ \lim_{n\to\infty} \min\{ p_n^+, p_n^- \} = 0. \]  (3.4)

For this section assume

\[ \forall n : \ \min\{ p_n^+, p_n^- \} > 0. \]  (3.5)

Then T_n has positive probability for all n. In order to compute P(T_n) we need some regularity in the way ln ρ_· behaves when conditioning on {ω_·^0 ≤ 1/n}. We assume that the following limits exist as weak limits of probability distributions:

\[ P_n^+(\cdot) := P\Big( \ln\rho_0 \in \cdot \,\Big|\, \omega_0^0 \le \tfrac1n,\ \ln\rho_0 > 0 \Big) \xrightarrow[n\to\infty]{w} P^+(\cdot) \]  (3.6a)
\[ P_n^-(\cdot) := P\Big( -\ln\rho_0 \in \cdot \,\Big|\, \omega_0^0 \le \tfrac1n,\ \ln\rho_0 < 0 \Big) \xrightarrow[n\to\infty]{w} P^-(\cdot) \]  (3.6b)
\[ Q_n^+(\cdot) := P\Big( \ln\rho_0 \in \cdot \,\Big|\, \omega_0^0 \le \tfrac1n,\ \ln\rho_0 \ge 0 \Big) \xrightarrow[n\to\infty]{w} Q^+(\cdot) \]  (3.6c)
\[ Q_n^-(\cdot) := P\Big( -\ln\rho_0 \in \cdot \,\Big|\, \omega_0^0 \le \tfrac1n,\ \ln\rho_0 \le 0 \Big) \xrightarrow[n\to\infty]{w} Q^-(\cdot) \]  (3.6d)

Here \xrightarrow{w} denotes weak convergence, and P^+, P^-, Q^+ and Q^- are the limiting measures, having support in [0,∞). We make the following assumption to ensure that the first two limits are non-degenerate, meaning that (0,∞) has positive probability:

\[ \exists\, t>0 \ \text{ such that } \ P^+([t,\infty)) > 0 \ \text{ and } \ P^-([t,\infty)) > 0. \]  (3.7)

Note that Q^+ or Q^- are allowed to be degenerate, by which we mean equal to the Dirac measure in zero. Also note that in the previous case we had

\[ P^+(\cdot) = P(\ln\rho_0 \in \cdot \mid \omega_0^0 = 0,\ \ln\rho_0 > 0) \]

and (3.1) implied (3.7). We need to look at the limiting distribution of ln ρ_· a little closer. Define

\[ \varepsilon_n^+ := \sup\Big\{ \varepsilon : P\Big(\omega_0^0 \le \tfrac1n,\ \ln\rho_0 > \varepsilon\Big) > 0 \Big\} = \sup\Big\{ \varepsilon : P\Big(\ln\rho_0 > \varepsilon \,\Big|\, \omega_0^0 \le \tfrac1n,\ \ln\rho_0 > 0\Big) > 0 \Big\} \le \ln\frac{1-\varepsilon_0}{\varepsilon_0}. \]

Here the last inequality is due to (UE). The sequence (ε_n^+)_n is decreasing and bounded from below by zero, therefore the limit

\[ \varepsilon^+ := \lim_{n\to\infty} \varepsilon_n^+ \]

exists. We have P_n^+([ε_m^+, ∞)) = 0 for all n ≥ m. Therefore P^+([ε_m^+, ∞)) = 0 for any m, and (3.7) implies ε^+ > 0. Equivalently, ε^+ is the essential supremum of a random variable having distribution P^+. We define ε^- ∈ (0,∞) similarly as the essential supremum of a random variable with distribution P^-. In the same way the essential infima can be controlled:

\[ \delta_n^+ := \sup\big\{ t \ge 0 : Q_n^+([0,t]) = 0 \big\} \ge 0. \]  (3.8)

This is an increasing sequence bounded by ε^+, and we set

\[ \delta^+ := \lim_{n\to\infty} \delta_n^+ \in [0, \varepsilon^+]. \]

Note that δ^+ and δ^- correspond to the essential infima of random variables having distribution Q^+ and Q^-. Both δ^+ and δ^- may be zero, for example if (3.4) holds together with P(ω_0^0 = 0, ω_0^+ = ω_0^- = 1/2) > 0. These quantities will now play a role in the value of the exponent.

3.3. The intermediate case.

Lemma 3.3. In addition to assumptions (3.4)-(3.7), assume that the following limits exist in [0,∞]:

\[ a^+ := \lim_{n\to\infty} \frac{\ln p_n^-}{\ln p_n^+} \qquad\text{and}\qquad a_0^+ := \lim_{n\to\infty} \frac{\ln p_n^0}{\ln p_n^+}, \]

and set

\[ a^- := \lim_{n\to\infty} \frac{\ln p_n^+}{\ln p_n^-} = (a^+)^{-1} \qquad\text{as well as}\qquad a_0^- := \lim_{n\to\infty} \frac{\ln p_n^0}{\ln p_n^-} = a^- a_0^+. \]

We then get

\[ \lim_{n\to\infty} \frac{\ln P(T_n(x,b_1,b_2,h))}{\ln n \,\ln(\min\{p_n^+, p_n^-\})} = C(b_1,b_2,h), \]  (3.9)

where

\[ C(b_1,b_2,h) = \min\{1,a^-\}\, C^+(b_2,h) + \min\{1,a^+\}\, C^-(b_1,h) \]  (3.10)

and

\[ C^-(b_1,h) = \frac{h + \delta^+ b_1 + \min\{1, a^-, a_0^-\}\,(-h + b_1\varepsilon^-)}{\varepsilon^- + \delta^+}, \qquad C^+(b_2,h) = \frac{h + \delta^- b_2 + \min\{1, a^+, a_0^+\}\,(-h + b_2\varepsilon^+)}{\varepsilon^+ + \delta^-}. \]

This result applies uniformly for all b_1 ∈ [h/ε^-, k], b_2 ∈ [h/ε^+, k], and furthermore we have

\[ \inf\Big\{ C^-(b_1,h) : b_1 \in \Big[\frac{h}{\varepsilon^-}, k\Big] \Big\} = \frac{h}{\varepsilon^-}, \qquad \inf\Big\{ C^+(b_2,h) : b_2 \in \Big[\frac{h}{\varepsilon^+}, k\Big] \Big\} = \frac{h}{\varepsilon^+}. \]

In the lemma, we set 1/0 := ∞, 1/∞ := 0 and ln 0 := -∞. Because of this last convention, p_n^0 = 0 for some n implies a_0^+ = a_0^- = ∞.

The intuition behind a_0^+ is that this quantity measures how much P^+ and Q^+ differ: if a_0^+ > 1, then P^+ = Q^+ holds, while a_0^+ < 1 implies Q^+({0}) = 1. Similarly to the quantity D_k(h) in Lemma 3.1, we write

\[ D := \frac{\min\{1,a^-\}}{\varepsilon^+} + \frac{\min\{1,a^+\}}{\varepsilon^-}, \]

so that

\[ \inf\Big\{ C(b_1,b_2,h) : b_1 \ge \frac{h}{\varepsilon^-},\ b_2 \ge \frac{h}{\varepsilon^+} \Big\} = hD. \]
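As a quick consistency check of these formulas, consider the symmetric situation ε^+ = ε^- =: ε̄, δ^+ = δ^- =: δ̄ ∈ [0, ε̄], a^+ = a^- = 1 and a_0^± ≥ 1 (for instance p_n^0 = 0 for all n, so that a_0^± = ∞). Then min{1, a^±} = min{1, a^±, a_0^±} = 1 and

\[ C^-(b_1,h) = \frac{h + \bar\delta b_1 - h + b_1\bar\varepsilon}{\bar\varepsilon + \bar\delta} = b_1, \qquad C^+(b_2,h) = b_2, \]

so C(b_1,b_2,h) = b_1 + b_2 and D = 2/ε̄; taking b_1 = b_2 = h/ε̄ recovers inf C = 2h/ε̄ = hD, in agreement with the display above. The construction in Remark 3.8 below produces exactly this situation, with ε̄ = δ̄ = ln(1+ε).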
Then we have the following result.

Theorem 3.4. We work under the assumptions of Lemma 3.3. Assume that for some κ > 0

\[ \lim_{n\to\infty} \frac{\ln\min\{p_n^+, p_n^-\}}{-\ln^\kappa n} = c \in (0,\infty); \]  (3.11)

then

\[ \frac{1}{1+\kappa}\Big(\frac{\kappa}{1+\kappa}\Big)^{\kappa} D \ \le\ \liminf_{n\to\infty} \frac{\ln\mathbb P(\tau>n)}{-c\ln^{\kappa+1} n} \ \le\ \limsup_{n\to\infty} \frac{\ln\mathbb P(\tau>n)}{-c\ln^{\kappa+1} n} \ \le\ D. \]  (3.12)

3.4. The stretched-exponential case. We have obtained results for the case where min{p_n^+, p_n^-} is of constant order or of the order e^{-c ln^κ n} for some κ, c > 0. Now we give a weaker result for the remaining case.

Theorem 3.5. Assume that the assumptions of Lemma 3.3 hold, but instead of (3.11) we have, for some κ > 0,

\[ \lim_{n\to\infty} \frac{\ln(\min\{p_n^+, p_n^-\})}{-n^\kappa} = c \in (0,\infty). \]  (3.13)

Then we have

\[ \frac{\kappa}{1+5\kappa} \ \le\ \liminf_{n\to\infty} \frac{\ln(-\ln\mathbb P(\tau>n))}{\ln n} \ \le\ \limsup_{n\to\infty} \frac{\ln(-\ln\mathbb P(\tau>n))}{\ln n} \ \le\ \kappa. \]  (3.14)

Let us briefly consider why the concept of valleys is not well suited for handling the case of Theorem 3.5. Consider two measures P and P̃ on the environment such that both satisfy

\[ \min\{p_n^+, p_n^-\} \sim c\, e^{-n^\kappa}, \]

but

\[ P\Big(\omega_0^0 = 0,\ \omega_0^+ = \omega_0^- = \tfrac12\Big) =: \gamma > 0 \]  (3.15)

and

\[ \tilde P\Big(\omega_0^0 = 0,\ \omega_0^+ = \omega_0^- = \tfrac12\Big) = 0. \]  (3.15')

Denote the RWREs having environment measure P and P̃ by ℙ and ℙ̃, respectively. For the interval I_n := [-n^{1/3}, n^{1/3}] we define the event

\[ S := \Big\{ \forall x\in I_n : \omega_x^0 = 0,\ \omega_x^+ = \omega_x^- = \tfrac12 \Big\} \]

that the RWRE inside the interval I_n is a simple random walk. We have P(S) ∼ exp(2 n^{1/3} ln γ). We can use the following result for simple random walks starting in zero:

Theorem 3.6. Let U_n be the first time that the random walk hits the boundary of an interval of length l(n) around zero. Moreover assume

\[ \lim_{n\to\infty} l(n) = \infty \qquad\text{and}\qquad \lim_{n\to\infty} \frac{l^2(n)}{n} = 0. \]

Then for c > 0

\[ \lim_{n\to\infty} \frac{l^2(n)}{n} \ln P(U_n \ge cn) = -\frac{c\pi^2}{8}. \]

The proof can be found in Spitzer (1976), p. 237. Using this theorem with l(n) = n^{1/3}, we have, for every ε > 0 and all n large enough,

\[ \mathbb P(\tau > n) \ \ge\ P(S)\, \inf_{\omega\in S} P_\omega(U_n \ge n) \ \ge\ \exp\Big\{ \Big( -\frac{\pi^2}{8} + 2\ln\gamma - \varepsilon \Big) n^{1/3} \Big\}. \]

On the other hand there is no corresponding lower bound for ℙ̃. Now for κ > 1/3 this bound is better than the one obtained by surviving inside a valley, since the cost of creating a valley is of the order e^{-c n^\kappa \ln n}. However, the probability that a valley occurs does not depend on whether (3.15) or (3.15') holds.
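The asymptotics in Theorem 3.6 can be checked numerically. The following Python sketch (ours, purely illustrative) computes the tail P(U > n) exactly for a simple random walk by dynamic programming, reading l(n) as the half-width L of the interval (-L, L), which is the convention used in the application above (I_n = [-n^{1/3}, n^{1/3}] with l(n) = n^{1/3}):

```python
import numpy as np

def srw_exit_tail(L, n):
    # Exact P(U > n) for a simple random walk started at 0, where U is the first
    # hitting time of -L or +L, computed by propagating the sub-probability vector
    # over the positions -L, ..., L with absorption at the two endpoints.
    p = np.zeros(2 * L + 1)
    p[L] = 1.0                               # start at the origin
    for _ in range(n):
        new = np.zeros_like(p)
        new[1:-1] = 0.5 * (p[:-2] + p[2:])   # interior step; mass reaching -L or +L is discarded
        p = new
    return p.sum()                           # mass not yet absorbed after n steps

L, n = 20, 5000
print((L**2 / n) * np.log(srw_exit_tail(L, n)), -np.pi**2 / 8)
```

For L = 20 and n = 5000 the two printed numbers already agree to within a few percent.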
3.5. Remarks.

Remark 3.7. In the definition of the model, we introduced the quantity r as the probability that the random walk dies once it stays at the same location. One notes that r does not appear in the constants in (3.2) and (3.10). That is because we have obtained the results under conditions (3.1) and (3.5), which imply that there are valleys where on the inside the probability of survival is 1 in the first case, and some positive constant depending on r in the second case. In the proofs we show that the survival probability is dominated by the probability that such a valley is formed, which depends only on the environment and not on r.

Remark 3.8. An interesting question is whether, for any sequence (q_n)_{n∈N}, we can find a probability measure for which min{p_n^+, p_n^-} decays exactly as (q_n)_n. Indeed, we can easily construct a suitable measure. It is enough to describe the distribution of ω_0 because P is a product measure. For this, let (q_n)_n be any real sequence in [0,1] decreasing to zero. Define

\[ \Pi_n^+ := \Big( \frac{1}{2+\varepsilon}\Big(1-\frac1n\Big),\ \frac1n,\ \frac{1+\varepsilon}{2+\varepsilon}\Big(1-\frac1n\Big) \Big), \qquad \Pi_n^- := \Big( \frac{1+\varepsilon}{2+\varepsilon}\Big(1-\frac1n\Big),\ \frac1n,\ \frac{1}{2+\varepsilon}\Big(1-\frac1n\Big) \Big). \]

Then for any n

\[ \omega_0 = \Pi_n^+ \implies \rho_0 = 1+\varepsilon, \qquad \omega_0 = \Pi_n^- \implies \rho_0 = (1+\varepsilon)^{-1}. \]

Now we define P as the discrete probability measure taking values in the set {Π_n^± : n ∈ N}, with

\[ P(\omega_0 = \Pi_n^+) = P(\omega_0 = \Pi_n^-) = \frac{q_n - q_{n+1}}{2c} \in [0,1], \qquad n\in\mathbb N. \]

Here c := q_0 is the normalizing constant such that P is a probability measure. Now, independently of n,

\[ P\Big( \ln\rho_0 \in\cdot \,\Big|\, \omega_0^0 \le \tfrac1n,\ \rho_0 \ne 1 \Big) = \tfrac12\, \delta_{-\ln(1+\varepsilon)}(\cdot) + \tfrac12\, \delta_{\ln(1+\varepsilon)}(\cdot), \]

where δ_x is the Dirac distribution in x. Conditions (3.6a), (3.6b) are satisfied with P^+ = P^- = δ_{ln(1+ε)}, so that P^+([t,∞)) = P^-([t,∞)) = 1 for all t ≤ ln(1+ε). Moreover we get ε^+ = ε^- = δ^+ = δ^- = ln(1+ε) and a^+ = a^- = 1. We summarize this in the following corollary:

Corollary 3.9. For every sequence (q_n)_n in [0,1] decreasing to zero there is a probability measure P such that

\[ \min\{p_n^+, p_n^-\} = q_n \qquad \forall n\in\mathbb N. \]

In particular, for every κ > 0 we find a probability measure P and constants c_1, c_2 > 0 such that for all n large enough

\[ e^{-c_1 \ln^{1+\kappa} n} \ \le\ \mathbb P(\tau > n) \ \le\ e^{-c_2 \ln^{1+\kappa} n}. \]

4. Proofs - the polynomial case

The proofs in this section modify the proof of Theorem 1.3 in Gantert et al. (2009, p. 7). We use Cramér's theorem from large deviations (see for example Section 2.2.1 in Dembo and Zeitouni (2010), in particular Corollary 2.2.19):

Theorem 4.1. Let (X_n)_{n≥1} be i.i.d. random variables taking values in R with law Q such that the moment generating function Λ(t) := E_Q(e^{tX_1}) is finite for some t > 0. Then for all x > E(X_1) we have

\[ \lim_{k\to\infty} \frac{-\ln Q\big( \tfrac1k \sum_{i=1}^k X_i \ge x \big)}{k} = \Lambda^*(x), \]  (4.1)

where Λ* is the Legendre transform of Q:

\[ \Lambda^*(x) = \sup_{t\ge 0}\{ tx - \ln\Lambda(t) \}. \]

Proof of Lemma 3.1: For ease of notation we will omit the integer parts, that is, we treat b_1 ln n, b_2 ln n and h ln n as integers. We fix k and say that a location x is safe if ω_x^0 ≤ 1/k. Consider the following events:

\[ A_n^+ := \{ \omega_i \text{ is safe } \forall i\in[x,\ x+b_2\ln n] \}, \]  (4.2a)
\[ A_n^- := \{ \omega_i \text{ is safe } \forall i\in[x-b_1\ln n,\ x] \}, \]  (4.2b)
\[ B_n^+ := \Big\{ \sum_{i=x}^{x+b_2\ln n} \ln\rho_i \ge h\ln n \Big\}, \]  (4.2c)
\[ B_n^- := \Big\{ -\sum_{i=x-b_1\ln n}^{x} \ln\rho_i \ge h\ln n \Big\}, \]  (4.2d)
\[ C_n^+ := \{ V(x) = \min\{ V(y) : y\in[x,\ x+b_2\ln n] \} \}, \]  (4.2e)
\[ C_n^- := \{ V(x) = \min\{ V(y) : y\in[x-b_1\ln n,\ x] \} \}. \]  (4.2f)

Now S_n^k decomposes as

\[ S_n^k(x,b_1,b_2,h) = A_n^+ \cap B_n^+ \cap C_n^+ \cap A_n^- \cap B_n^- \cap C_n^-. \]

Conditioned on A_n^+, the distribution of ln ρ_i for i = 0, ..., b_2 ln n is

\[ Q(\cdot) := P\Big( \ln\rho_0 \in\cdot \,\Big|\, \omega_0^0 \le \tfrac1k \Big). \]
