ON THE SPEED OF A ONE-DIMENSIONAL RANDOM WALK IN A RANDOM ENVIRONMENT PERTURBED BY COOKIES OF STRENGTH ONE

ELISABETH BAUERNSCHUBERT

arXiv:1501.04013v1 [math.PR] 16 Jan 2015

Abstract. We consider a random walk in an i.i.d. random environment on Z that is perturbed by cookies of strength 1. The number of cookies per site is assumed to be i.i.d. Results on the speed of the random walk are obtained. Our main tool is the correspondence in certain cases between the random walk and a branching process in a random environment with migration.

Date: January 16, 2015.
Key words and phrases. Excited random walk in a random environment; Cookies of strength 1; Speed; Law of large numbers; Branching process in a random environment with migration.

1. Introduction

In [5] and [6] the author studied a left-transient (respectively recurrent) one-dimensional random walk in a random environment that is perturbed by cookies of maximal strength and established criteria for transience and recurrence. In the current article, we study the speed of this random walk.

We recall the model from [5, 6] and explain it in a few words. Choose a sequence (p_x)_{x∈Z}, with p_x ∈ (0,1) for all x ∈ Z, at random and put on every integer x ∈ Z a random number M_x of cookies (M_x ∈ N_0). Now, start a nearest-neighbor random walk at 0: if the walker encounters a cookie on his current position x, he consumes it and is excited to jump to x+1 a.s. If there is no cookie, he goes to x+1 with probability p_x and to x−1 with probability 1−p_x. For illustrations of the model see [5, Fig. 1] or [6, Figure 1].

For a precise definition, denote by Ω := ([0,1]^N)^Z the space of so-called environments. Let (Ω', F') be a suitable measurable space with probability measures P_{z,ω} for ω ∈ Ω and z ∈ Z, and let (S_n)_{n≥0} be a process on Ω' such that for all ω = ((ω(x,i))_{i≥1})_{x∈Z} and all z ∈ Z

P_{z,ω}[S_0 = z] = 1,
P_{z,ω}[S_{n+1} = S_n + 1 | (S_m)_{1≤m≤n}] = ω(S_n, #{m ≤ n : S_m = S_n}),
P_{z,ω}[S_{n+1} = S_n − 1 | (S_m)_{1≤m≤n}] = 1 − ω(S_n, #{m ≤ n : S_m = S_n})

is satisfied. The so-called excited random walk (ERW for short) (S_n)_{n≥0} is a nearest-neighbor random walk under P_{z,ω} that starts in z and whose transition probability upon the ith visit to site y ∈ Z is given by ω(y,i). In the usual notion, ω(y,i) is also said to be the strength of the ith cookie on site y.

The elements ω ∈ Ω are themselves chosen at random according to some probability measure P on Ω. Averaging the so-called quenched measure P_{x,ω} over the environments ω yields the annealed or averaged measure P_x[·] := E[P_{x,ω}[·]] on the product space Ω × Ω'. By E, E_{x,ω} and E_x we denote the respective expectation operators.

The discussion of excited random walks started with [7], where a simple symmetric random walk (in Z^d, d ≥ 1) is disturbed by one cookie at each site. The model, which is also known as cookie random walk, has been generalized in various ways, e.g. in the one-dimensional case among others by Zerner [22], Basdevant and Singh [4] and Kosygina and Zerner [16]. For a recent survey on ERWs see [17].

In our setting, we consider cookies of strength 1. For each x ∈ Z, the number of cookies of maximal strength at site x is defined by

M_x := sup{i ≥ 1 : ω(x,j) = 1 for all 1 ≤ j ≤ i}

with the convention sup ∅ = 0.
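For illustration, the cookie dynamics just described can be put into a minimal simulation sketch. The environment law below is purely hypothetical (p_x uniform on (0.1, 0.9), M_x uniform on {0,...,3}); the walker consumes a strength-1 cookie whenever one is left at its current site, and otherwise performs an ordinary RWRE step.

    import random

    def simulate_erwre(n_steps=100_000, seed=1):
        rng = random.Random(seed)
        env = {}                                   # site x -> [p_x, remaining cookies at x]
        def site(x):
            if x not in env:
                # hypothetical i.i.d. environment law, for illustration only
                env[x] = [rng.uniform(0.1, 0.9), rng.randrange(0, 4)]
            return env[x]
        S = 0
        for _ in range(n_steps):
            p_x, cookies = site(S)
            if cookies > 0:                        # cookie of strength 1: forced jump to x+1
                env[S][1] -= 1
                S += 1
            else:                                  # no cookie left: plain RWRE step
                S += 1 if rng.random() < p_x else -1
        return S / n_steps                         # empirical S_n / n for this realization

    print(simulate_erwre())

The empirical ratio S_n/n returned here is exactly the quantity whose almost sure limit, the speed ν, is studied below.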
Throughout the paper, all or parts of the following assumptions on P will be needed, compare also to [5, 6]:

Assumptions A. There is P-a.s. a sequence (p_x)_{x∈Z} ∈ (0,1)^Z such that the following holds.

A.1 ω(x,i) = p_x for all x ∈ Z and for all i > M_x.
A.2 (p_x, M_x)_{x∈Z} is i.i.d. and {p_x, M_x; x ∈ Z} is independent under P.
A.3 E[|log ρ_0|^2] < ∞ and E[ρ_0^2] < ∞, where ρ_x := (1−p_x) p_x^{−1} for x ∈ Z.
A.4 P[ρ_0 = 1] < 1.
A.5 P[M_0 = 0] > 0.

If Assumption A holds with P[M_x = 0] = 1, the process belongs to the class of random walks in random environments (RWRE for short). For an overview and results on RWREs we refer the reader to [21] and references therein. In the most studied ERW model, a simple symmetric random walk is perturbed by cookies; commonly, the number of cookies per site is bounded, but the cookies may have strength between 0 and 1, see e.g. [4, 16, 17]. In order to emphasize that the underlying process in our model is an RWRE, we call our model described above excited random walk in a random environment (ERWRE for short). This model has already been introduced by the author in [5, 6].

Assumption A.4 excludes the simple symmetric random walk as underlying dynamic. By Assumption A.5 we avoid the trivial case where the random walker encounters at least one cookie on every integer P-a.s.

Under Assumption A.1, ω is given P-a.s. by a sequence (p,M) := (p_x, M_x)_{x∈Z}. For clarity and convenience let us therefore write P_{x,(p,M)} for the quenched measure instead of P_{x,ω}, and just P_{(p,M)} if x = 0.

For a random walk in an i.i.d. random environment, Solomon proved in [19, Theorem (1.7)] the following recurrence and transience criteria. Suppose that Assumptions A.1 and A.2 hold with P[M_x = 0] = 1 and that E[log ρ_0] is well defined. Then, P_0-a.s., lim_{n→∞} S_n = +∞ if E[log ρ_0] < 0, lim_{n→∞} S_n = −∞ if E[log ρ_0] > 0, and (S_n)_{n≥0} is recurrent if E[log ρ_0] = 0.

Theorem 1.1 in [5] and Theorem 1 in [6] provide transience and recurrence criteria for an ERWRE with underlying left-transient or recurrent RWRE. In the present version we dropped the restriction P[M_x = ∞] = 0 by allowing P[M_x = ∞] ∈ [0,1). The notation (·)_+ abbreviates max(0,·).

Theorem 1.1 ([5]). Let Assumption A hold and assume that E[log ρ_0] > 0.
(i) If E[(log M_0)_+] < ∞, then lim_{n→∞} S_n = −∞ P_0-a.s.
(ii) If E[(log M_0)_+] = ∞ and if limsup_{t→∞} (t · P[log M_0 > t]) < E[log ρ_0], then S_n = 0 infinitely often P_0-a.s.
(iii) If liminf_{t→∞} (t · P[log M_0 > t]) > E[log ρ_0], then lim_{n→∞} S_n = +∞ P_0-a.s.

Similar criteria in the case where the RWRE is recurrent are provided in [6].

In the current work we study how "fast" the random walks in Theorem 1.1 and in [6] go to infinity when they are transient (to the left or to the right). Therefore note that by [17, Theorem 4.1], (S_n)_{n≥0} satisfies in the setting of Theorem 1.1 a strong law of large numbers, i.e. there exists a non-random ν ∈ [−1,1] such that

lim_{n→∞} S_n / n = ν   P_0-a.s.

This limit ν is called speed or velocity of the random walk, see also [17, Sections 4 and 5].

If the underlying dynamic in the ERW model is the simple symmetric random walk and if the number of cookies (here with strength strictly in between 0 and 1) is bounded — i.e. there is a deterministic K ∈ N such that ω(x,i) = 1/2 for all i > K and x ∈ Z a.s. — then results on the speed can be found in [4, Theorem 1.1] and [16, Theorem 2]. The key parameter in this case turned out to be the average total drift per site

δ̄ := E[ Σ_{i≥1} (2ω(0,i) − 1) ].

Under some weak ellipticity assumptions, it has been obtained that ν = 0 if δ̄ ∈ [−2,2], ν < 0 if δ̄ < −2 and ν > 0 if δ̄ > 2, [4, 16].
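As a quick illustration of this drift parameter (the numerical choices are hypothetical and not taken from the cited works): for a bounded-cookie ERW on top of a simple symmetric walk with K = 3 cookies of strength 3/4 per site, δ̄ is computed as follows.

    # Hypothetical example: three cookies of strength 3/4 per site, omega(x,i) = 1/2 for i > 3.
    cookie_strengths = [0.75, 0.75, 0.75]
    delta_bar = sum(2 * w - 1 for w in cookie_strengths)   # terms with omega = 1/2 contribute 0
    print(delta_bar)   # 1.5, which lies in [-2, 2], so nu = 0 by the results of [4, 16]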
The speed for an RWRE is given in [19, Theorem (1.16)], see also [21, Theorem 2.1.9 + Remark]. Let us briefly recall Solomon's result.

Theorem 1.2 ([19]). Let Assumptions A.1 and A.2 hold with P[M_x = 0] = 1 (RWRE) and let E[log ρ_0] be well defined. Then, P_0-a.s.,
(i) ν = (1 − E[ρ_0]) / (1 + E[ρ_0]) > 0 if E[ρ_0] < 1,
(ii) ν = −(1 − E[ρ_0^{−1}]) / (1 + E[ρ_0^{−1}]) < 0 if E[ρ_0^{−1}] < 1, and
(iii) ν = 0 if E[ρ_0]^{−1} ≤ 1 ≤ E[ρ_0^{−1}].

For the ERWRE model studied in [5, 6] we will show the following results on the speed in the present paper.

Theorem 1.3. Let Assumption A hold and suppose the underlying RWRE is left-transient or recurrent, i.e. E[log ρ_0] ≥ 0.
(i) If E[M_0] < ∞ and E[ρ_0^{−1}] < 1, then the ERWRE goes to −∞ with negative speed: ν = lim_{n→∞} S_n / n < 0 P_0-a.s.
(ii) If E[M_0] = ∞ or E[ρ_0^{−1}] ≥ 1, and if additionally E[ρ_0] > P[M_0 < ∞]^{−1}, then ν = 0.
(iii) If E[ρ_0] < P[M_0 < ∞]^{−1}, then ν > 0.

Note that (i)-(iii) in Theorem 1.3 cover all cases except for E[ρ_0] = P[M_0 < ∞]^{−1}.

Remark 1.4. By Jensen's inequality and Assumption A.4, E[log ρ_0] ≥ 0 implies that E[ρ_0] > 1. Hence, if M_0 is finite P-a.s. and if the underlying RWRE is left-transient or recurrent, there is no chance for the random walker to go to infinity with positive speed in the setting of Theorem 1.3. This is only possible if there are "enough" infinite cookie stacks. On the other hand, if the RWRE tends to −∞ with negative speed (i.e. E[ρ_0^{−1}] < 1), the cookies may slow it down without changing its transience behavior. According to Theorems 1.1 and 1.3 this occurs if E[M_0] = ∞, but E[(log M_0)_+] < ∞.

Further questions concern an ERWRE where the underlying dynamic is transient to the right with zero speed. Can cookies accelerate this RWRE? How many cookies of maximal strength can be placed without increasing the speed and what is the influence of the distribution of ρ_0? Answers — but not yet in a complete version — are given in the next theorem.

Theorem 1.5. Let Assumption A hold and suppose that the underlying RWRE is right-transient with zero speed, i.e. E[log ρ_0] < 0 and E[ρ_0] ≥ 1.
(i) Assume that γ E[ρ_0] P[M_0 < ∞] > 1, where γ = E[ρ_0^β] with β such that E[ρ_0^β log ρ_0] = 0. Then ν = lim_{n→∞} S_n / n = 0, P_0-a.s.
(ii) If E[ρ_0] ≥ P[M_0 = 0]^{−1}, then ν = 0.
(iii) If E[ρ_0] < P[M_0 < ∞]^{−1}, then ν > 0.

Remark 1.6. Let us remark that β in Theorem 1.5(i) exists, is unique and 0 < β < 1. Moreover γ < 1. To see this use the moment generating function g(t) := E[ρ_0^t], t ∈ R, and recall its properties e.g. from [8]. By Assumption A.3, g(t) is finite on [0,2]. Furthermore note that the derivative is g'(t) = E[ρ_0^t log ρ_0] and, under the assumptions of Theorem 1.5, g(0) = 1, 1 ≤ g(1) < ∞ and −∞ < g'(0) < 0.

Note that it is still open if it is possible to obtain positive speed if the underlying RWRE is transient to the right with zero speed and M_0 is finite P-a.s.
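To illustrate Remark 1.6, the following sketch computes β and γ numerically for a hypothetical two-point law of ρ_0 (ρ_0 = 4 with probability 0.3 and ρ_0 = 1/4 with probability 0.7, so that E[log ρ_0] < 0 and E[ρ_0] ≥ 1 as in Theorem 1.5); β is found by bisection on g'(t) = E[ρ_0^t log ρ_0], which changes sign on (0,1) since g is strictly convex.

    import math

    values, probs = [4.0, 0.25], [0.3, 0.7]      # hypothetical law of rho_0

    def g(t):                                    # g(t) = E[rho_0^t]
        return sum(pr * v ** t for v, pr in zip(values, probs))

    def g_prime(t):                              # g'(t) = E[rho_0^t * log(rho_0)]
        return sum(pr * v ** t * math.log(v) for v, pr in zip(values, probs))

    lo, hi = 0.0, 1.0                            # g'(0) < 0 < g'(1) in this example
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if g_prime(mid) < 0 else (lo, mid)
    beta = (lo + hi) / 2
    gamma = g(beta)
    print(beta, gamma)                           # about 0.31 and 0.92: 0 < beta < 1, gamma < 1

With these values one can check directly whether the condition γ E[ρ_0] P[M_0 < ∞] > 1 of Theorem 1.5(i) holds for a given P[M_0 < ∞].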
To prove Theorems 1.3 and 1.5 we use three tools. In the situation when the ERWRE goes to −∞, it is not hard to prove non-zero or zero speed. Basically one uses the formulation of the speed known from e.g. [22, Theorem 13] or [16, Section 7]. One method to study the speed when the ERWRE goes to infinity is based on a well-known regeneration or renewal structure of these random walks and a relation to certain branching processes, see e.g. [14] in the case of RWREs and [16, Section 6], [17, Sections 4 and 5] and references therein in the case of ERWs. The right-transient ERWRE will be related to a specific branching process in a random environment with migration (BPMRE for short). Its velocity is then positive or non-positive according to whether the expected total size of the BPMRE until its first time of extinction is finite or infinite. Therefore, this work also contains a result on BPMRE. As a third tool — especially used when we deal with infinite cookie stacks and in order to obtain positive speed — we simply apply results about RWREs and exploit the monotonicity of ν with respect to the cookie environment, see [17, Proposition 4.2].

The article is organized as follows. The next section is devoted to the study of a specific branching process in a random environment with emigration. It is slightly different to the BPMRE that corresponds to the ERWRE in the case of transience to the right, but is later used to prove that the expected total size of the latter branching process up to its first time of dying out is infinite. In Section 3 the correspondence between (S_n)_{n≥0} and a BPMRE is given. Section 4 finally contains the proofs of Theorems 1.3 and 1.5.

2. Branching process in a random environment with migration

In this section we introduce a branching process in a random environment with emigration. It will be similar to the BPMRE that is related to the ERWRE in Section 3. The first process has the advantage of being easier to handle. For convenience and in view of its application to (S_n)_{n≥0}, let us define the branching process on Ω'. Therefore, we assume without loss of generality that there is a family {ξ_i^{(n)}; i,n ∈ N} of independent random variables on Ω' such that, for P-a.e. (p,M),

P_{(p,M)}[ξ_i^{(n)} = k] = (1−p_n)^k · p_n   for k ∈ N_0,

i.e. for all i ∈ N, ξ_i^{(n)} is geometrically distributed with parameter p_n. Let us define now Z_0 := 1 and for n ≥ 1 recursively

Z_n := ( Σ_{i=1}^{Z_{n−1}} ξ_i^{(n)} − M_n )_+.

This process belongs to the class of branching processes in an i.i.d. random environment with migration. In our setting there is no immigration and the number of emigrants is unbounded. Furthermore, in the above definition, the number of emigrants is immediately subtracted from the population size. Note that 0 is an absorbing state for (Z_n)_{n≥0}.

Given an environment (p,M), the expected number of offspring in generation n per individual is

E_{(p,M)}[ξ_1^{(n)}] = (1−p_n)/p_n = ρ_n

and its variance is

Var_{(p,M)}[ξ_1^{(n)}] = (1−p_n)/p_n^2.
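A minimal simulation sketch of the process just defined may help fix ideas; the environment and emigration laws below are hypothetical and chosen subcritical so that single runs terminate quickly. Each generation draws p_n and M_n, sums Z_{n−1} geometric offspring and removes the emigrants immediately.

    import math, random

    def geometric(rng, p):
        # P[k] = (1 - p)^k * p for k = 0, 1, 2, ...
        return math.floor(math.log(1.0 - rng.random()) / math.log(1.0 - p))

    def total_size_Z(rng, max_gen=1_000, cap=10**6):
        # one realization of (Z_n) with Z_0 = 1, absorbed at 0; returns sum_j Z_j,
        # truncated at max_gen generations or once a generation exceeds cap
        Z, total = 1, 1
        for _ in range(max_gen):
            if Z == 0 or Z > cap:
                break
            p_n = rng.uniform(0.4, 0.9)      # hypothetical environment law
            M_n = rng.randrange(0, 3)        # hypothetical number of emigrants
            Z = max(sum(geometric(rng, p_n) for _ in range(Z)) - M_n, 0)
            total += Z
        return total

    rng = random.Random(3)
    print(sum(total_size_Z(rng) for _ in range(2_000)) / 2_000)   # crude estimate of E[sum_j Z_j]

Whether this expected total size is finite or infinite is exactly what Proposition 2.1 below addresses; a truncated Monte Carlo estimate like this one can of course only hint at divergence.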
The literature on branching processes in general is extensive, see for instance [3, 13, 20]. If there is no migration in any generation, i.e. M_n = 0 for all n ∈ N, then (Z_n)_{n≥0} belongs to the class of branching processes in random environments (BPRE for short), see for instance [18, 3] or [9] and references therein. The branching process combining the concept of reproduction according to a random environment with the phenomenon of migration — and here especially unbounded emigration — does not seem to be broadly discussed. To the best of our knowledge, the results given in Proposition 2.1 are not yet covered by the literature. Therefore, we will prove Proposition 2.1, which is required for our study of ERWREs, directly in this section.

We use the usual classification of BPREs, see e.g. [3] or [9], and call (Z_n)_{n≥0} subcritical, critical or supercritical according to whether E[log ρ_1] < 0, = 0 or > 0. The BPRE dies out a.s. in the subcritical and critical regime, whereas — under a certain integrability condition — the supercritical BPRE may explode, see [18, 3]. Note that the process (Z_n)_{n≥0} is heuristically similar to some random difference equation: Z_n should be more or less (ρ_n · Z_{n−1} − M_n)_+. This similarity helps to study the BPRE with emigration in the proof of the following proposition. For some heuristic to Proposition 2.1 we refer the reader to Remarks 2.2 and 2.3 below.

Proposition 2.1. Let Assumption A hold and assume E[ρ_1] > P[M_1 < ∞]^{−1}. The total population size of (Z_n)_{n≥0} has infinite mean, i.e. E_0[Σ_{j≥0} Z_j] = ∞, if one of the following conditions holds.
(i) (Z_n)_{n≥0} is supercritical or critical (E[log ρ_1] ≥ 0).
(ii) (Z_n)_{n≥0} is subcritical (E[log ρ_1] < 0) and γ E[ρ_1] P[M_1 < ∞] > 1, where γ = E[ρ_1^β] with β such that E[ρ_1^β log ρ_1] = 0.

Remark 2.2. Let us give a short heuristic for Proposition 2.1 in the supercritical setting. Consider a sequence that grows exponentially until some "catastrophe" happens that causes extinction. Precisely, for some a > 1 let X̃_0 := 1 and recursively X̃_n := a X̃_{n−1} if M_n < a X̃_{n−1}, and X̃_n := 0 otherwise, for n ∈ N. Now, calculations show that for every m ∈ N

E[ Σ_{j≥0} X̃_j ] ≥ E[X̃_m | T_0^{X̃} > m] · P[T_0^{X̃} > m] = a^m · Π_{k=1}^{m} P[M_1 < a^k],

where T_0^{X̃} := inf{n ≥ 1 : X̃_n = 0}. Thus, the expected sum of the X̃_j is infinite if a · P[M_1 < ∞] > 1.

Remark 2.3. Note that Proposition 2.1 covers supercritical, critical and some subcritical BPREs with emigration. The heuristics to the proposition for supercritical BPREs with emigration were given in Remark 2.2. That the result should also hold for critical BPREs with emigration and specific subcritical BPREs with emigration is motivated by recent work on BPREs, see e.g. [9, 2, 1] and references therein. There, it was shown that critical and so-called weakly subcritical BPREs behave in a supercritical manner when conditioned on survival. Thus, one can hope that the price to pay for survival is negligible compared to the growth of a supercritical BPRE with emigration.
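The lower bound in Remark 2.2 is easy to evaluate numerically. The sketch below uses an illustrative, hypothetical law for M_1 (P[M_1 = ∞] = 0.3 and, on {M_1 < ∞}, M_1 geometric with mean 1) and a = 2, so that a · P[M_1 < ∞] = 1.4 > 1 and the bound a^m · Π_{k≤m} P[M_1 < a^k] diverges.

    import math

    a = 2.0
    p_inf = 0.3                                       # hypothetical P[M_1 = infinity]

    def prob_M_less_than(t):                          # P[M_1 < t] under the hypothetical law
        return (1.0 - p_inf) * (1.0 - 0.5 ** math.ceil(t))

    bound = 1.0
    for m in range(1, 41):
        bound *= a * prob_M_less_than(a ** m)         # running value of a^m * prod_{k<=m} P[M_1 < a^k]
    print(bound)                                      # grows roughly like 1.4^m, hence diverges

Choosing p_inf ≥ 0.5 instead would give a · P[M_1 < ∞] ≤ 1 for a = 2, and the same product would no longer diverge, matching the dichotomy behind Remark 2.2.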
Proof of Proposition 2.1. The key idea of the proof is to couple (Z_n)_{n≥0} and a process (X_n)_{n≥0} that is similar to a random difference equation. More precisely, let X_0 := 1 and recursively

X_n := (ρ_n X_{n−1} − M_n)_+

for n ∈ N. Note the analogy to the idea in the proofs of Theorem 2.2 in [5] and Theorem 4 in [6]. If the sequence (M_n)_{n≥1} is neglected, the growth of (X_n)_{n≥0} is determined by its "associated random walk". The same random walk basically describes the behavior of the BPRE without migration, see for instance [9]. Therefore, let us define U_i := log ρ_i for i ∈ N and Y_i := U_1 + ... + U_i for i ∈ N. Since E[ρ_1] = E[exp U_1] > 1 by assumption, we can find for every 0 < δ < 1 some κ > 0 and 0 < κ̃ ≤ 1 such that

(1) P[U_1 > κ]^{κ̃} > 1 − δ.

Set ǫ := κ · κ̃ and for m ∈ N

(2) A(m) := {for all 1 ≤ j ≤ m : Y_j ≥ ǫ · j}.

We control the probability of A(m) from below by

P[A(m)] ≥ P[∀ 1 ≤ i ≤ ⌈κ̃m⌉ : U_i ≥ κ, ∀ ⌈κ̃m⌉ < j ≤ m : Y_j − Y_{⌈κ̃m⌉} ≥ 0]
         ≥ P[U_1 > κ]^{κ̃m+1} · P[∀ 1 ≤ j ≤ (1−κ̃)m : Y_j ≥ 0]
(3)      ≥ P[U_1 > κ]^{κ̃m+1} · P[∀ 1 ≤ j ≤ m : Y_j ≥ 0].

In order to control the emigration define for m and k ∈ N, with m ≥ k,

B(m,k) := {M_1 = ... = M_k = 0, ∀ k < n ≤ m : M_n < n}.

The events A(m) and B(m,k) are independent under P and

(4) P[B(m,k)] = P[M_1 = 0]^k · Π_{j=k+1}^{m} P[M_1 < j].

On A(m) ∩ B(m,k) with m ≥ k, it is obtained that for all 1 ≤ j ≤ k

X_j = exp(Y_j) ≥ exp(ǫj)

and by induction for all k < j ≤ m

X_j ≥ exp(Y_j) (1 − Σ_{n=k+1}^{j} n exp(−Y_n)) ≥ exp(Y_j) (1 − Σ_{n=k+1}^{∞} n exp(−ǫn)).

Thus, there exists 0 < c_1 < 1 such that for sufficiently large k and all m ≥ k, on A(m) ∩ B(m,k),

(5) X_j ≥ c_1 exp(Y_j) ≥ c_1 exp(ǫj) > 0 for all 1 ≤ j ≤ m.

Fix k ∈ N such that (5) holds. For every m ≥ k,

E_0[ Σ_{j≥0} Z_j ] ≥ E_0[ Σ_{j≥0} Z_j , A(m), B(m,k), ∀ j ≤ m : Z_j ≥ X_j / j ]
                  ≥ (1/m) E_0[ X_m , A(m), B(m,k), ∀ j ≤ m : Z_j ≥ X_j / j ]
(6)               ≥ (c_1/m) E_0[ exp(Y_m) , A(m), B(m,k), ∀ j ≤ m : Z_j ≥ X_j / j ].

For the moment let us have a closer look at P_{(p,M)}[∀ j ≤ m : Z_j ≥ X_j / j] on A(m) ∩ B(m,k). We can write

P_{(p,M)}[∀ j ≤ m : Z_j ≥ X_j / j]
(7) = P_{(p,M)}[Z_1 ≥ X_1] · Π_{j=2}^{m} P_{(p,M)}[ Z_j ≥ X_j / j | Z_{j−1} ≥ X_{j−1}/(j−1), ..., Z_1 ≥ X_1 ]

and obtain, with (5), on A(m) ∩ B(m,k) for 2 ≤ j ≤ m

P_{(p,M)}[ Z_j ≥ X_j / j | Z_{j−1} ≥ X_{j−1}/(j−1), ..., Z_1 ≥ X_1 ]
  = Σ_{n ≥ c_1 exp(Y_{j−1})/(j−1)} P_{(p,M)}[ Σ_{i=1}^{n} ξ_i^{(j)} − M_j ≥ (ρ_j X_{j−1} − M_j)/j | Z_{j−1} = n ≥ X_{j−1}/(j−1), ..., Z_1 ≥ X_1 ]
      · P_{(p,M)}[ Z_{j−1} = n | Z_{j−1} ≥ X_{j−1}/(j−1), ..., Z_1 ≥ X_1 ]
  ≥ Σ_{n ≥ c_1 exp(Y_{j−1})/(j−1)} P_{(p,M)}[ Σ_{i=1}^{n} ξ_i^{(j)} ≥ ρ_j (j−1) n / j + (1 − 1/j) M_j ]
(8)   · P_{(p,M)}[ Z_{j−1} = n | Z_{j−1} ≥ X_{j−1}/(j−1), ..., Z_1 ≥ X_1 ].

Further calculations yield, for (p,M) satisfying A(m) ∩ B(m,k), for n ≥ c_1 exp(Y_{j−1})/(j−1) and 2 ≤ j ≤ m,

P_{(p,M)}[ Σ_{i=1}^{n} ξ_i^{(j)} ≥ ρ_j (j−1) n / j + (1 − 1/j) M_j ] ≥ P_{(p,M)}[ Σ_{i=1}^{n} ξ_i^{(j)} ≥ ρ_j (j−1) n / j + j ]
(9) = P_{(p,M)}[ n ρ_j − Σ_{i=1}^{n} ξ_i^{(j)} ≤ n ρ_j / j − j ].

Note that here n ρ_j ≥ c_1 exp(Y_j)/(j−1) ≥ c_1 exp(ǫj)/(j−1) by (5). Choose j_0 ∈ N such that c_1 exp(ǫj) j^{−3} ≥ 2 for all j ≥ j_0. As in [5, p. 643] we now apply Chebyshev's inequality. For sufficiently large m, we get for all j_0 < j ≤ m

P_{(p,M)}[ n ρ_j − Σ_{i=1}^{n} ξ_i^{(j)} ≤ n ρ_j / j − j ] ≥ P_{(p,M)}[ | n ρ_j − Σ_{i=1}^{n} ξ_i^{(j)} | ≤ n ρ_j / j − j ]
(10) ≥ 1 − n Var_{(p,M)}(ξ_1^{(j)}) / (n ρ_j / j − j)^2 = 1 − n Var_{(p,M)}(ξ_1^{(j)}) j^2 / ((1 − j^2/(n ρ_j))^2 (n ρ_j)^2) ≥ 1 − 4 j^2 / (n ρ_j p_j).

On A(m) we have for all n ≥ c_1 exp(Y_{j−1})/(j−1), on the one hand, n ρ_j p_j ≥ (c_1/(j−1)) exp(ǫj) p_j and, on the other hand, n ρ_j p_j = n (1 − p_j) ≥ (c_1/(j−1)) exp(ǫ(j−1)) (1 − p_j). Thus, n ρ_j p_j ≥ (1/2) · (c_1/(j−1)) exp(ǫ(j−1)).

This gives together with (8), (9) and (10) for all j_0 ≤ j ≤ m

(11) P_{(p,M)}[ Z_j ≥ X_j / j | Z_{j−1} ≥ X_{j−1}/(j−1), ..., Z_1 ≥ X_1 ] ≥ 1 − c_2 j^3 e^{−ǫj}

for some c_2 > 0. Hence, there is some j_1 ≥ j_0 and some constant c_3 > 0 such that a similar calculation as in (7) yields, together with (11), for all large m

P_{(p,M)}[ ∀ j ≤ m : Z_j ≥ X_j / j ] ≥ P_{(p,M)}[ ∀ j ≤ j_1 : Z_j ≥ X_j / j ] · Π_{i≥j_1} (1 − c_2 i^3 e^{−ǫi})
  ≥ c_3 P_{(p,M)}[ ∀ j ≤ j_1 : Z_j ≥ exp(Y_j)/j ].

The last inequality holds since X_i ≤ exp(Y_i) for all i ∈ N by definition. Recall (6), (2), (3) and the independence of (p,M). We obtain for sufficiently large m (such that ⌈κ̃m⌉ ≥ j_1) and some constant c_4 > 0 that

E_0[ Σ_{j≥0} Z_j ] ≥ (c_4/m) E[ exp(Y_m) P_{(p,M)}[∀ j ≤ j_1 : Z_j ≥ exp(Y_j)/j] , A(m), B(m,k) ]
  ≥ (c_4/m) E[ exp(Y_{j_1}) P_{(p,M)}[∀ j ≤ j_1 : Z_j ≥ exp(Y_j)/j] , ∀ i ≤ j_1 : U_i ≥ κ, B(m,k) ]
(12)  · E[ Π_{i=j_1+1}^{m} ρ_i , ∀ j_1 < i ≤ ⌈κ̃m⌉ : U_i ≥ κ, ∀ ⌈κ̃m⌉ < j ≤ m : Y_j − Y_{⌈κ̃m⌉} ≥ 0 ].

Since exp(Y_{j_1}) P_{(p,M)}[∀ j ≤ j_1 : Z_j ≥ exp(Y_j) j^{−1}] and {∀ i ≤ j_1 : U_i ≥ κ} only depend on (p_i, M_i)_{1≤i≤j_1}, we obtain, by independence of (p,M), for some constant c_5 > 0,

E[ exp(Y_{j_1}) P_{(p,M)}[∀ j ≤ j_1 : Z_j ≥ exp(Y_j)/j] , ∀ i ≤ j_1 : U_i ≥ κ, B(m,k) ] = c_5 P[B(m,k)]

for all large m.
The FKG inequality, see for instance [12, Theorem (2.4), p. 34], gives

E[ Π_{i=j_1+1}^{m} ρ_i , ∀ j_1 < i ≤ ⌈κ̃m⌉ : U_i ≥ κ, ∀ ⌈κ̃m⌉ < j ≤ m : Y_j − Y_{⌈κ̃m⌉} ≥ 0 ]
  ≥ E[ Π_{i=j_1+1}^{m} ρ_i ] · P[ ∀ j_1 < i ≤ ⌈κ̃m⌉ : U_i ≥ κ, ∀ ⌈κ̃m⌉ < j ≤ m : Y_j − Y_{⌈κ̃m⌉} ≥ 0 ].

This inequality can be applied here, since (p_j)_{j∈N} is a sequence of [0,1]-valued i.i.d. random variables, E[ρ_1^2] < ∞, and Π_{i=j_1+1}^{m} ρ_i and 1_{{∀ i ≤ ⌈κ̃m⌉ : U_i ≥ κ, ∀ ⌈κ̃m⌉ < j ≤ m : Y_j − Y_{⌈κ̃m⌉} ≥ 0}} are both monotonically decreasing functions in (p_j)_{j∈N} with respect to the usual partial order on [0,1]^N. Hence, together with (12), we have for some constant c_6 > 0 that

(13) E_0[ Σ_{j≥0} Z_j ] ≥ (c_6/m) P[B(m,k)] E[ρ_1]^m (P[U_1 > κ]^{κ̃})^m P[∀ 1 ≤ j ≤ m : Y_j ≥ 0].

Thus, E_0[Σ_{j≥0} Z_j] is infinite if the right-hand side of (13) goes to infinity for m → ∞. Recall from (4) that P[B(m,k)] = P[M_1 = 0]^k Π_{j=k+1}^{m} P[M_1 < j] and remark that P[M_1 < j] → P[M_1 < ∞] as j → ∞. Furthermore it is known in the case E[log ρ_1] ≥ 0 that P[∀ 1 ≤ j ≤ m : Y_j ≥ 0] eventually exceeds 1/√m up to some multiplicative constant, see for instance [11, XII.7]. Thus the proposition follows immediately for supercritical or critical BPREs with emigration when we choose δ in (1) so small that (1−δ) P[M_1 < ∞] E[ρ_1] > 1.

Let (Z_n)_{n≥0} be a subcritical BPRE with emigration and E[ρ_1] > 1. Due to (13) the behavior of P[∀ 1 ≤ j ≤ m : Y_j ≥ 0], as m goes to infinity, is of interest. If the distribution of U_1 is non-lattice, P[∀ 1 ≤ j ≤ m : Y_j ≥ 0] is of order m^{−3/2} γ^m. (Recall that γ = E[exp(βU_1)] < 1 with β such that E[U_1 exp(βU_1)] = 0.) For references see for instance [10, Theorem II] or [1, Theorem 1.1 and Corollary 1.2]. Thus, E_0[Σ_{j≥0} Z_j] = ∞ if γ E[ρ_1] P[M_1 < ∞] > 1 and the proposition is proven. For the lattice case, some monotonicity argument can be used. □

3. Connection between random walks and branching processes

We turn now to the correspondence between ERWREs and certain BPMREs. Recall from the introduction that an RWRE is perturbed by cookies of maximal strength and that our aim is to study the speed of this ERWRE. In the current section we suppose that, additionally to Assumption A, the drift induced by the cookies wins, i.e. that

(14) P_0[ lim_{n→∞} S_n = +∞ ] = 1.

Criteria for transience to the right are given in Theorem 1.1 and in Theorem 1 of [6] in the case of a left-transient or recurrent underlying RWRE. If the RWRE is right-transient, then monotonicity with respect to the environment implies (14), see [22, Lemma 15] (the condition ω(x,i) ≥ 1/2 for all x ∈ Z and i ∈ N in [22] is not necessary for the proof of Lemma 15). Due to Theorem 4.1 in [17] the speed of the ERWRE exists on {S_n → ∞}. The question is if there is a phase transition between zero and positive speed.

A well-known tool to study the speed of a one-dimensional ERW is the so-called regeneration or renewal structure, see [16, Section 6] or [17, Section 4] and references therein. According to Lemma 4.5 in [17] there are P_0-a.s. infinitely many random times j on the event {S_n → ∞} with S_m < S_j for all m < j and S_k ≥ S_j for all k ≥ j. The increasing enumeration of these renewal times is denoted by (τ_k)_{k∈N}. By [17, Lemma 4.5] and (14), we have that (S_n)_{0≤n≤τ_1} and (S_n − S_{τ_k})_{τ_k≤n≤τ_{k+1}}, k ≥ 1, are independent under P_0, that (S_n − S_{τ_k})_{τ_k≤n≤τ_{k+1}}, k ≥ 1, have the same distribution under P_0, and that E_0[S_{τ_2} − S_{τ_1}] < ∞. Theorem 4.6 in [17] gives, P_0-a.s.,

ν = lim_{n→∞} S_n / n = E_0[S_{τ_2} − S_{τ_1}] / E_0[τ_2 − τ_1].

Thus,

(15) ν = 0 iff E_0[τ_2 − τ_1] = ∞.
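The renewal structure lends itself to a quick numerical illustration. The sketch below (all distributional choices hypothetical) simulates an ERWRE in the regime of Theorem 1.3(iii): constant p_x = 0.45, so E[ρ_0] = 11/9, and an infinite cookie stack at each site independently with probability 1/2, so P[M_0 < ∞]^{−1} = 2. It then locates approximate renewal times inside the simulated horizon (strict running maxima that the path never falls below afterwards; times near the end of the horizon are necessarily only approximate) and compares the ratio estimate suggested by the formula above with the crude estimate S_n/n.

    import random

    def simulate_path(n_steps=200_000, seed=7):
        rng = random.Random(seed)
        stack = {}                                  # site -> True if it carries an infinite cookie stack
        def has_stack(x):
            if x not in stack:
                stack[x] = rng.random() < 0.5       # hypothetical P[M_x = infinity] = 1/2
            return stack[x]
        S, path = 0, [0]
        for _ in range(n_steps):
            if has_stack(S):
                S += 1                              # a cookie of strength 1 at every visit
            else:
                S += 1 if rng.random() < 0.45 else -1
            path.append(S)
        return path

    path = simulate_path()
    n = len(path) - 1

    suffix_min = path[:]                            # suffix_min[j] = min(path[j:])
    for i in range(n - 1, -1, -1):
        suffix_min[i] = min(suffix_min[i], suffix_min[i + 1])
    renewals, running_max = [], float("-inf")
    for j, s in enumerate(path):
        if s > running_max and s <= suffix_min[j]:  # strict record never undershot afterwards
            renewals.append(j)
        running_max = max(running_max, s)

    if len(renewals) > 2:
        t1, tk = renewals[1], renewals[-1]
        print((path[tk] - path[t1]) / (tk - t1), path[-1] / n)

Both printed numbers should be close to each other and strictly positive, consistent with Theorem 1.3(iii) for this choice of parameters.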
The key to study the distribution of τ_2 − τ_1 relies on the discussion of a branching process with migration in random environment that corresponds to the ERWRE. Compare this method to the one used for RWRE in [14] and for ERW in [4, Section 2], [16, Section 6] and [15, Section 2], see also [17, Section 5] and references therein. For details concerning the connection we refer the reader to the specific sections in [16, 15, 17].

Let us consider the so-called backward branching process of the ERWRE. Therefore, recall that (S_n)_{n≥0} is transient to the right by (14) and thus τ_1 < τ_2 < ∞ P_0-a.s. As in [16, Section 6], denote by

D_k := #{n ∈ N : τ_1 < n < τ_2, S_n = S_{τ_2} − k and S_{n+1} = S_{τ_2} − k − 1},   k ∈ N_0,

the number of downcrossings from S_{τ_2} − k to S_{τ_2} − k − 1 between times τ_1 and τ_2. The number of upcrossings in this time interval is S_{τ_2} − S_{τ_1} + Σ_{k≥0} D_k. Hence

(16) τ_2 − τ_1 = S_{τ_2} − S_{τ_1} + 2 Σ_{k≥0} D_k,

and thus E_0[τ_2 − τ_1] = ∞ if and only if E_0[Σ_{k≥0} D_k] = ∞.

It can be shown like in the proof of [16, Lemma 12] that (D_k)_{k≥0} is distributed, under P_0, like a BPMRE (W_k)_{k≥0} defined by W_0 = 0 and

W_k := 1_{{k ≤ T_0^W}} Σ_{i=1}^{W_{k−1}+1−M_k} ξ_i^{(k)},

where ξ_i^{(j)}, i,j ∈ N_0, are random variables on Ω' that are independent under P_{(p,M)}, and P_{(p,M)}[ξ_i^{(j)} = n] = (1−p_j)^n p_j for n ∈ N_0. The random variable T_0^W := inf{k ≥ 1 : W_k = 0} denotes the first time of extinction of (W_k)_{k≥0}.

The correspondence now yields by (16)

E_0[τ_2 − τ_1] = ∞  iff  E_0[ Σ_{k≥0} W_k ] = E_0[ Σ_{k=1}^{T_0^W − 1} W_k ] = ∞

and therefore by (15)

(17) ν = 0  iff  E_0[ Σ_{k=1}^{T_0^W − 1} W_k ] = ∞.

4. On the speed of the random walk, proofs

At first we show that (S_n)_{n≥0} satisfies a strong law of large numbers.

Theorem 4.1. Let Assumption A hold. Then there exists a non-random ν ∈ [−1,1] such that lim_{n→∞} S_n / n = ν P_0-a.s.

Proof. If E[(log M_0)_+] < ∞ and E[log ρ_0] > 0, the ERWRE goes to −∞ P_0-a.s. by Theorem 1.1(i). Then (S_n)_{n≥0} satisfies a strong law of large numbers according to [17, Theorem 4.1].

If E[(log M_0)_+] = ∞ and E[log ρ_0] > 0, [5, Proposition 4.1] and monotonicity with respect to the environment — see [22, Lemma 15], which also holds for Ω = ([0,1]^N)^Z — imply

P_0[ sup_{n≥0} S_n = ∞ ] = 1.

The same holds if the underlying RWRE is recurrent or right-transient, i.e. E[log ρ_0] ≤ 0. Since P[M_0 = 0] > 0, a weak ellipticity condition as described in [17, p. 108] holds for the environment ω. Theorem 3.2 in [17] — weak ellipticity is sufficient for case (d) in the proof — yields P_0[|S_n| → ∞] ∈ {0,1}. Since P_0[liminf_{n→∞} S_n ∈ {±∞}] = 1 by [17, Lemma 2.2] we get

P_0[ inf_{n≥0} S_n = −∞ ] ∈ {0,1}.

Thus, (S_n)_{n≥0} satisfies a strong law of large numbers according to [17, Theorem 4.1]. □
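Returning to the backward branching process (W_k)_{k≥0} of Section 3, the criterion (17) can at least be probed numerically. The sketch below (environment and cookie-number laws again hypothetical, and the sum truncated, so it can only hint at finiteness or divergence) simulates (W_k) up to its first extinction time and averages Σ_{k<T_0^W} W_k over many runs.

    import math, random

    def geometric(rng, p):
        # P[k] = (1 - p)^k * p for k = 0, 1, 2, ...
        return math.floor(math.log(1.0 - rng.random()) / math.log(1.0 - p))

    def backward_total(rng, max_gen=1_000, cap=10**6):
        # W_0 = 0; W_k = sum of (W_{k-1} + 1 - M_k)_+ geometric(p_k) variables,
        # stopped at the first k >= 1 with W_k = 0; returns sum_{k < T_0^W} W_k (truncated).
        W, total = 0, 0
        for _ in range(max_gen):
            p_k = rng.uniform(0.4, 0.9)            # hypothetical law of p_k
            M_k = rng.randrange(0, 3)              # hypothetical law of the cookie numbers M_k
            W = sum(geometric(rng, p_k) for _ in range(max(W + 1 - M_k, 0)))
            if W == 0 or W > cap:
                break
            total += W
        return total

    rng = random.Random(5)
    print(sum(backward_total(rng) for _ in range(2_000)) / 2_000)

By (17), a finite expectation of this total corresponds to positive speed and an infinite one to zero speed; varying the laws of p_k and M_k shows how quickly the truncated averages blow up once the conditions of Proposition 2.1 are met.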
