Asymptotic behaviour of gossip processes and small world networks

A. D. Barbour∗ and G. Reinert†
Universität Zürich and National University of Singapore; University of Oxford

arXiv:1202.5895v2 [math.PR] 28 Feb 2012

Abstract

Both small world models of random networks with occasional long range connections and gossip processes with occasional long range transmission of information have similar characteristic behaviour. The long range elements appreciably reduce the effective distances, measured in space or in time, between pairs of typical points. In this paper, we show that their common behaviour can be interpreted as a product of the locally branching nature of the models. In particular, it is shown that both typical distances between points and the proportion of space that can be reached within a given distance or time can be approximated by formulae involving the limit random variable of the branching process.

Keywords: Small world graph, gossip process, branching process approximation.
MSC2010 Subject Classification: 92H30; 60K35; 60J85

1 Introduction

Moore & Newman [7] introduced a continuous analogue of the Watts & Strogatz [8] "small world" model. In this model, a random number of chords, with Poisson distribution $\mathrm{Po}(L\rho/2)$, are uniformly and independently superimposed as shortcuts on a circle of circumference $L$. Distance is measured as usual along the circumference, but chords are deemed to be of length zero, and interest centres on finding the statistics of shortest path distances between pairs of points. A closely related model, the "great circle model", was introduced somewhat earlier by Ball et al. [2], in the context of epidemics; here, distance between points translates into time taken for one infected person to infect another.
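The Moore & Newman construction is easy to simulate directly. The sketch below (our illustration, not code from the paper; all function names are ours) samples $\mathrm{Po}(L\rho/2)$ chords with independent uniform endpoints, treats chords as edges of length zero, and computes the shortest-path distance between two independently uniform points with Dijkstra's algorithm.

```python
import heapq
import math
import random

def sample_poisson(mean, rng):
    # Knuth's multiplication method; adequate for the moderate means used here
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def smallworld_distance(L, rho, rng):
    """Shortest-path distance between two uniform points P, P' on a circle
    of circumference L with Po(L*rho/2) zero-length chords superimposed."""
    m = sample_poisson(L * rho / 2, rng)
    pos = [rng.uniform(0, L), rng.uniform(0, L)]      # nodes 0, 1 are P, P'
    chords = []
    for _ in range(m):
        chords.append((len(pos), len(pos) + 1))       # indices of the next pair
        pos.append(rng.uniform(0, L))
        pos.append(rng.uniform(0, L))
    n = len(pos)
    adj = [[] for _ in range(n)]
    order = sorted(range(n), key=lambda i: pos[i])
    for k in range(n):                                # arcs between neighbours
        i, j = order[k], order[(k + 1) % n]
        w = (pos[j] - pos[i]) % L
        adj[i].append((j, w)); adj[j].append((i, w))
    for i, j in chords:                               # chords have length zero
        adj[i].append((j, 0.0)); adj[j].append((i, 0.0))
    dist = [math.inf] * n
    dist[0] = 0.0
    pq = [(0.0, 0)]
    while pq:                                         # Dijkstra from P
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist[1]
```

Since shortcuts can only shorten paths, the result never exceeds $L/2$, the worst case along the circumference alone; averaging over many samples gives the inter-point distance statistics studied in the paper.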
In Barbour & Reinert [4], assuming the expected number $L\rho/2$ of shortcuts to be large, we proved a distributional approximation for the distance between a randomly chosen pair of points $P$ and $P'$, and gave a bound on the order of the error, in terms of total variation distance. We also showed that analogous results could be proved in higher dimensions by much the same method, when the circle is replaced by a sphere or a torus. It turns out that the reduction in the typical distances between pairs of points that results from introducing shortcuts is still substantial, but less dramatic than in one dimension.

More recently, Chatterjee & Durrett [5] studied a model for the spread of gossip that is the continuous analogue of one of a number of models discussed in Aldous [1]. Here, information spreads locally from an individual to his neighbours on the two-dimensional torus, and also occasionally to other, randomly chosen members of the community. Thus a disc of informed individuals, centred on an initial informant, grows steadily in the torus; long range transmissions of information occur in a Poisson process, whose rate is proportional to the area (number) of informed individuals, and any such transmission contacts a randomly chosen point of the torus, initiating a new disc of informed individuals.

∗ Angewandte Mathematik, Universität Zürich, Winterthurerstrasse 190, CH-8057 Zürich; ADB was Visiting Research Professor at the National University of Singapore while part of this work was carried out. Work supported in part by Australian Research Council Grants Nos DP120102728 and DP120102398.
† Department of Statistics, University of Oxford, 1 South Parks Road, Oxford OX1 3TG, UK. GDR was supported in part by EPSRC and BBSRC through OCISB.
The distinction between this model and the corresponding two dimensional model in [4] is that, in the gossip model, the Poisson process runs at a rate proportional to the area of the currently informed region; in [4], where the Poisson number of shortcuts is considered to be fixed in advance, the Poisson process corresponds to a process of discovery of shortcuts, and its rate is thus proportional to the length of the boundary of the informed region.

Here, we consider the development of such a process on a smooth closed homogeneous Riemannian manifold $C$ of dimension $d$, such as a sphere or a torus, having large finite volume $|C| =: L$ with respect to its intrinsic metric. We then assume that, around each point $P$ of $C$, there is a collection of closed subsets $\mathcal K(P,s)$, $s \ge 0$, that are balls of radius $s$ with respect to a metric $d_C$ that makes $C$ a geodesic space, and with (intrinsic) volumes $v_s(\mathcal K) := |\mathcal K(P,s)| \sim s^d v(\mathcal K)$ as $s \to 0$, for some $v(\mathcal K) > 0$; $s$ is thought of as time, and $\{v(\mathcal K)\}^{1/d}$ as a (linear) speed of propagation. We shall in particular assume that
\[
\bigl| s^{-d} v_s(\mathcal K)/v(\mathcal K) - 1 \bigr| \;\le\; c_g \bigl( s \{v(\mathcal K)/L\}^{1/d} \bigr)^{\gamma_g}, \qquad s > 0, \tag{1.1}
\]
for some $\gamma_g > 0$. The metric $d_C$ need not be the same as the intrinsic metric; for instance, on the torus, we could consider rectangular as well as circular neighbourhoods. Two such subsets $\mathcal K(P,t)$ and $\mathcal K(Q,u)$ then intersect when $P \in \mathcal K(Q,t+u)$, or, equivalently, when $Q \in \mathcal K(P,t+u)$, so that the probability of intersection, if $P$ and $Q$ are chosen independently and uniformly distributed on $C$ (with respect to the intrinsic volume), is given by
\[
q_L(t,u) \;:=\; L^{-1} v_{t+u}(\mathcal K) \;=\; L^{-1} v(\mathcal K)(t+u)^d \bigl(1 + R_L(t,u)\bigr), \tag{1.2}
\]
where
\[
|R_L(t,u)| \;\le\; c_g \bigl( (t+u)\{v(\mathcal K)/L\}^{1/d} \bigr)^{\gamma_g}. \tag{1.3}
\]

The set $\mathcal K(P,s)$ denotes the set of points 'locally' contacted after time $s$ has elapsed following an initial 'long range' contact at $P$; these sets are thought of as 'islands' in $C$, and the complete set of contacts $Y_{P_0}(t)$ at time $t$ is the union of these sets growing from an initial point $P_0$, and from all long range contacts made before $t$. The rate at which long range contacts are made is proportional either to the area of the boundary of $Y(t)$ (small world processes) or to its volume (gossip processes), and we denote the constant of proportionality by $\rho$; long range contacts are made to independently and uniformly chosen points of $C$.

It is clear that, at least for a while, such a process can be closely approximated using a Markovian growth and branching process $X^*$ taking values in $\bigcup_{j\ge1} \mathcal C^j$, where $\mathcal C$ denotes the closed subsets of $C$. Long range contacts are made to independently and uniformly chosen points of $C$, and, at time $s$ after a contact at $P^*$, the set $\mathcal K(P^*,s)$ is one component of the state of the process. Thus the state of the branching process at time $t$ can be encoded in terms of the position $P^*_0 \in C$ of the initial individual and the number $n(t)$ of contacts that have taken place up to time $t$, together with the pairs $(\tau^*_1,P^*_1),\ldots,(\tau^*_{n(t)},P^*_{n(t)}) \in \mathbb R_+ \times C$ of times and positions of the contacts, giving
\[
X^*(t) = \bigl(J^*(0,t), J^*(1,t), \ldots, J^*(n(t),t)\bigr),
\]
where $J^*(j,t) := \mathcal K(P^*_j, t - \tau^*_j)$ if $\tau^*_j \le t$ and $J^*(j,t) = \emptyset$ otherwise; we take $\tau^*_0 = 0$. In 'gossip' models, new contacts are made at a rate proportional to the current volume, which, for the process $X^*$, is given by
\[
V^*(t) \;:=\; \sum_{j=0}^{n(t)} v_{t-\tau^*_j}(\mathcal K) \;\sim\; \sum_{j=0}^{n(t)} (t-\tau^*_j)^d\, v(\mathcal K);
\]
in 'small world' models, the rate is proportional to the derivative of the volume. If $Y_P(t) \subset C$ denotes the state of the actual process of interest, then it can initially be approximated by taking $P^*_0 = P$, and then forming the union
\[
Y^*_{P^*_0}(t) \;:=\; \bigcup_{j=0}^{n(t)} J^*(j,t). \tag{1.4}
\]

However, as soon as the union is not disjoint, the actual rate of contacts decreases, and the two processes diverge. To be more precise about the definition of $Y = Y_{P^*_0}$ in terms of $X^*$, we augment the encoding of $X^*$ by adding to each pair $(\tau^*_j,P^*_j)$ both the position $Q_j \in C$ and the index $I_j$ of the island from which the $j$-th contact was made, together with a mark $G_j$ which is 0 if the $j$-th island is 'real', and 1 if it is a 'ghost' island. $G_j$ is specified by setting $G_j = 1$ (island $j$ is a ghost) if $G_{I_j} = 1$, or if $P^*_j \in Y(\tau^*_{j-})$, or if $Q_j \notin \hat J_{I_j,j}$, where
\[
\hat J_{l,j} := \bigl\{ P \in C \colon d_C(P,P^*_l) < d_C(P,P^*_m) \text{ for all } m < j \text{ such that } G_m = 0 \bigr\}.
\]
To explain these restrictions, note that new branching process contacts (designated by $G_j = 0$) contribute to $Y$ only if they originate from a real island, and start at a point outside $Y$; multiple counting from overlapping islands is also to be prohibited. Thus
\[
Y(t) := \bigcup_{j \in \mathcal J_Y(t)} J^*(j,t), \tag{1.5}
\]
where $\mathcal J_Y(t) := \{ j \ge 0 \colon \tau^*_j \le t,\ G_j = 0 \}$, and its volume $V(t)$ and area $A(t)$ are in this way functions of $\{ (\tau^*_j,P^*_j,Q_j,I_j) \colon j \ge 0,\ \tau^*_j \le t \}$. In view of this construction, the two processes $Y$ and $Y^*$ are coupled so as to make the identification of $Y(t)$ and $Y^*(t)$ exact, until the union in (1.4) ceases to be disjoint.

In [4], the distribution of inter-point distances is determined by running two such branching processes from randomly chosen initial points, each for a length of time $t^*$ at which the mean number of overlaps in (1.4) is of order $O(1)$. At this time, conditionally on the contact times in the two branching processes, the number of permissible overlaps between the sets (1.4) (cases in which an island in one branching process is contained within an island in the other could not have arisen in the actual process) has approximately a Poisson distribution, and the distance between the initial points is greater than $2t^*$ if there are no permissible overlaps.
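The contact times of the branching approximation $X^*$ are straightforward to simulate. The sketch below (our illustration, with our own function names; the discretisation step `dt` is an arbitrary choice) generates the times $\tau^*_j$ for the gossip variant, in which the contact rate at time $t$ is $\rho \sum_j v(\mathcal K)(t - \tau^*_j)^d$, by Bernoulli thinning over small time steps.

```python
import random

def gossip_contact_times(rho, v, d, t_max, dt, rng):
    """Contact times tau_j of the branching process X* (gossip variant):
    new long-range contacts arrive at rate rho * sum_j v * (t - tau_j)^d,
    the total volume of the current islands.  Euler/thinning scheme:
    in each step of length dt, one new contact occurs with probability
    approximately rate * dt."""
    taus = [0.0]                      # initial island at P0, with tau_0 = 0
    t = dt
    while t < t_max:
        rate = rho * sum(v * (t - tau) ** d for tau in taus)
        if rng.random() < min(1.0, rate * dt):
            taus.append(t)            # new island, seeded at a uniform point
        t += dt
    return taus
```

For the small world variant one would instead use the derivative of the volume, i.e. the rate $\rho \sum_j d\,v(\mathcal K)(t - \tau^*_j)^{d-1}$; only the `rate` line changes.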
In this way, and by varying the choice of $t^*$ appropriately, the distribution of inter-point distances can be approximated, without ever having to go into the dependence structure that becomes important in the process $Y$ at times significantly larger than $t^*$. In contrast, Chatterjee & Durrett [5] go beyond the branching phase in the analysis of $Y$ in their two-dimensional gossip model, and are able to prove a conditional law of large numbers for the fraction of the torus contained in $Y$, given the outcome of the branching phase. They also establish the asymptotics of the first time at which $Y$ is the whole torus.

The main result of this paper is Theorem 3.2, which establishes a conditional law of large numbers, together with a rough error bound, for the time evolution of the covered fraction of $C$, in the setting of a quite general small world or gossip process. In the particular case of the torus, this extends the limit law proved by Chatterjee & Durrett [5], by providing an estimate of the approximation error that is uniform for all time. Our argument, as in [4], avoids any detailed analysis of $Y$ beyond the branching phase. We first note that the asymptotics of the mean fraction $\mathbb E V(t)/L$ of $C$ that is contained in $Y$, for $t$ in the relevant range (corresponding to $2t^*$ above), can be deduced for small world processes from Theorem 4.2 of [4], and we show that an analogous theorem holds also for gossip processes. The same argument also gives an asymptotic expression for $\mathbb E\{V(t)/L \mid \mathcal F_s\}$, the conditional expectation of the covered fraction at $t$, given the history $\mathcal F_s$ of the process up to time $s$. The law of large numbers is then proved by using an argument of much the same flavour, now involving three independent branching processes in their early phases, to derive an asymptotic formula for $\mathbb E\{[V(t)/L]^2 \mid \mathcal F_s\}$. This is shown to be asymptotically the same as $[\mathbb E\{V(t)/L \mid \mathcal F_s\}]^2$, for large enough $s$, from which it follows that the conditional variance of $V(t)/L$, given the information in $\mathcal F_s$, is small, and hence that the value of $V(t)/L$ is (almost) fixed. An analogous argument is used, for instance, in Ball, Sirl & Trapman [3], where they show that the proportion of susceptibles infected in an epidemic in a large population is close to a fixed value, provided that the epidemic is a large one. A by-product of our argument is to identify the solution $h$ to a particular integral equation, which appears in Aldous [1] and also plays a substantial part in the formula given by Chatterjee & Durrett [5], in terms of the Laplace transform of the branching process limit random variable $W$. The paper concludes by extending the results to finite subsets of homogeneous manifolds, such as rectangles in $\mathbb R^2$.

2 The branching phase

As in [4], we base our analysis of the coverage process on the pure growth Markov branching process $X^*$, which has neighbourhoods with centres independently and uniformly positioned in $C$. In this section, to describe the behaviour of such processes, we specialize to the case of 'flat' manifolds, such as tori, in which
\[
v_s(\mathcal K) = s^d v(\mathcal K), \qquad s \ge 0, \tag{2.1}
\]
so that the constant $c_g$ in (1.1) is zero. We later show that processes in which $c_g$ cannot be taken to be zero can be dealt with by bounding them between processes satisfying condition (2.1) that are close enough for our purposes.

We begin by defining $M_0(t) := \max\{ j \ge 0 \colon \tau^*_j \le t \}$ to be the number of contacts in the branching process up to time $t$, and
\[
M_l(t) = \sum_{j=0}^{M_0(t)} (t - \tau^*_j)^l \tag{2.2}
\]
to be the sum of the $l$'th powers of their 'radii'.
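Given the contact times, the sums $M_l(t)$ of (2.2) are immediate to compute; between contacts they satisfy the differential equations (2.5) below, since each radius grows at unit speed. A minimal helper (our notation; note that for $l = 0$ the sum counts all islands including the initial one, i.e. it equals $M_0(t) + 1$ in the paper's indexing):

```python
def M(l, t, taus):
    """sum over islands born by time t of (t - tau_j)^l, as in (2.2);
    taus is the list of contact times, with tau_0 = 0 for the initial point.
    For l = 0 this counts the islands alive at time t."""
    return sum((t - tau) ** l for tau in taus if tau <= t)
```

For example, with `taus = [0.0, 1.0, 2.0]` and `t = 3.0`, the radii are 3, 2 and 1, so `M(1, 3.0, taus)` is 6; and the difference quotient of `M(1, ...)` between contacts recovers the island count, reflecting $\frac{d}{dt} M_1(t) = M_0(t)$ for a.e. $t$.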
In the small world process,
\[
M_0(t) = M_0(0) + Z\Bigl( \rho v(\mathcal K)\, d \int_0^t M_{d-1}(u)\,du \Bigr) = M_0(0) + Z\bigl( \rho v(\mathcal K)[M_d(t) - M_d(0)] \bigr), \tag{2.3}
\]
where $Z$ is a unit rate Poisson process, and in the gossip process
\[
M_0(t) = M_0(0) + Z\Bigl( \rho v(\mathcal K) \int_0^t M_d(u)\,du \Bigr) = M_0(0) + Z\bigl( \rho (d+1)^{-1} v(\mathcal K)[M_{d+1}(t) - M_{d+1}(0)] \bigr); \tag{2.4}
\]
in both cases, the intensity $\rho$ may depend on $L$. The remaining evolution is governed by the differential equations
\[
\frac{d}{dt} M_1(t) = M_0(t) \ \text{for a.e. } t; \qquad \frac{d}{dt} M_i(t) = i M_{i-1}(t), \quad i \ge 2. \tag{2.5}
\]
These equations can be rewritten in clearer form by defining $H_i(t) := M_i(t)\lambda^i/i!$, for $\lambda$ to be suitably chosen, in which case (2.5) reduces to
\[
\frac{d}{dt} H_1(t) = \lambda H_0(t) \ \text{for a.e. } t; \qquad \frac{d}{dt} H_i(t) = \lambda H_{i-1}(t), \quad i \ge 2; \tag{2.6}
\]
for the small world process, we have
\[
H_0(t) = M_0(0) + Z\bigl( d!\,\rho v(\mathcal K)\lambda^{-d}[H_d(t) - H_d(0)] \bigr) = M_0(0) + Z\bigl( H_d(t) - H_d(0) \bigr), \tag{2.7}
\]
if $\lambda = \lambda_0 := (d!\,\rho v(\mathcal K))^{1/d}$, and, for the gossip process, we have
\[
H_0(t) = M_0(0) + Z\bigl( d!\,\rho v(\mathcal K)\lambda^{-d-1}[H_{d+1}(t) - H_{d+1}(0)] \bigr) = M_0(0) + Z\bigl( H_{d+1}(t) - H_{d+1}(0) \bigr), \tag{2.8}
\]
if $\lambda = \lambda_0 := (d!\,\rho v(\mathcal K))^{1/(d+1)}$. Note that, since $\rho$ may depend on $L$, so may $\lambda_0$.

Remark 2.1 The time-scaled process $\tilde H(u) := H(u/\lambda)$ actually satisfies
\[
\frac{d}{dt} \tilde H_i(t) = \tilde H_{i-1}(t), \quad i \ge 1; \qquad \tilde H_0(t) = M_0(0) + Z\bigl( \tilde H_{r(d)}(t) - \tilde H_{r(d)}(0) \bigr), \tag{2.9}
\]
where $r(d) = d$ for the small world and $d+1$ for the gossip process. Thus, apart from a time change, the processes are the same for all $\lambda_0$. Despite this, we retain $\lambda_0$ in the subsequent discussion, in order to emphasize the connection with the original process.

In either case, the equations for $H = (H_1, H_2, \ldots, H_r)^T$ are of the form
\[
\frac{dH}{dt} = \lambda_0 C_r (H + \hat h\, \varepsilon^r) = \lambda_0 C_r \bigl[ I + (\hat h/H_r)\, \varepsilon^r (\varepsilon^r)^T \bigr] H, \tag{2.10}
\]
where $r = r(d)$, $C_r$ is the $r$-dimensional cyclic permutation matrix, $\varepsilon^i$ denotes the $i$-th coordinate vector, and
\[
\hat h(t) := H_0(t) - H_r(t) = Z\bigl( H_r(t) - H_r(0) \bigr) - H_r(t) + M_0(0). \tag{2.11}
\]
Without the perturbation $\hat h$, $H$ would have asymptotically exponential growth at rate $\lambda_0$, and the ratios of its components would all tend to unity, since the dominant eigenvalue 1 of $C_r$ corresponds to the right eigenvector $\mathbf 1$. For the arguments to come, it will be important to show that, with high enough probability, the asymptotic effect of $\hat h$ is just to multiply $H$ by some random constant (a branching random variable $W$) which is not too big. Unless otherwise specified, we henceforth take $M_0(0) = 1$ and $M_l(0) = 0$ for all $l \ge 1$, so that we start with just one point $P_0$ at $t = 0$.

2.1 Growth bounds for the branching process

Using the maximum norm $\|\cdot\|$ for $r(d)$-vectors, it follows immediately from (2.10) that
\[
\frac{d}{dt} \|H(t)\| \le \lambda_0 u(t) \|H(t)\|,
\]
with $u(t) := 1 + \bigl( \hat h(t)/H_{r(d)}(t) \bigr)_+$, so that, by a Gronwall argument,
\[
\|H(t)\| \le \|H(t_0)\| \exp\Bigl\{ \lambda_0 \int_{t_0}^t u(v)\,dv \Bigr\}, \tag{2.12}
\]
for any $0 \le t_0 \le t$. Thus, in order to bound the growth of $H$, we shall need to control the quantity $\hat h(t)/H_{r(d)}(t)$, which is itself a function of the Poisson process $Z$. To do so, we begin with the following lemma, which controls the extreme fluctuations of $Z$.

Lemma 2.2 Let $Z$ be a unit rate Poisson process. Then we have the following bounds, uniformly in $t \ge 1$:
\[
(1)\quad \mathbb P\Bigl[ \sup_{u \ge t} u^{-1} Z(u) \ge 2 \Bigr] \le c_1 e^{-t/14};
\]
\[
(2)\quad \mathbb P\Bigl[ \sup_{u \ge t} u^{1/3} \bigl| u^{-1} Z(u) - 1 \bigr| \ge 4 \Bigr] \le c_2 e^{-t^{1/3}/5};
\]
\[
(3)\quad \mathbb P\Bigl[ \inf_{u \ge t} u^{-1} Z(u) \le 1/2 \Bigr] \le c_3 e^{-t/32}.
\]
Furthermore, for any $U \ge 1$ and $0 < \eta \le 1/3$ such that $U(2^\eta - 1) \ge 42 \log 2$,
\[
(4)\quad \mathbb P\Bigl[ \sup_{u \ge 0} (u \vee 1)^{-\frac12(1+\eta)} |Z(u) - u| \ge U \Bigr] \le c_4 e^{-U/28},
\]
for a constant $c_4$.

Proof: For any $t, \varepsilon > 0$, set $u_j := t(1+\varepsilon)^j$, $j \ge 0$. Then it is immediate that
\[
\sup_{u_j \le u \le u_{j+1}} u^{-1} Z(u) \le Z(u_{j+1})/u_j.
\]
Hence, and by the Chernoff inequalities ([6], Theorem 3.2),
\[
\mathbb P\Bigl[ \sup_{u_j \le u \le u_{j+1}} u^{-1} Z(u) \ge 1+2\varepsilon \Bigr] \le \mathbb P\bigl[ u_j^{-1} Z(u_j(1+\varepsilon)) \ge 1+2\varepsilon \bigr] \le \exp\{ -\varepsilon^2 u_j/(2+3\varepsilon) \} = \exp\{ -\varepsilon^2 u_{j-1}/(2+3\varepsilon) \} \exp\{ -\varepsilon^3 u_{j-1}/(2+3\varepsilon) \},
\]
and
\[
\mathbb P\Bigl[ \inf_{u_j \le u \le u_{j+1}} u^{-1} Z(u) \le 1-2\varepsilon \Bigr] \le \mathbb P\bigl[ \{(1+\varepsilon)u_j\}^{-1} Z(u_j) \le 1-2\varepsilon \bigr] \le \exp\{ -\varepsilon^2 u_j/2 \} = \exp\{ -\varepsilon^2 u_{j-1}/2 \} \exp\{ -\varepsilon^3 u_{j-1}/2 \}.
\]
Adding over $j \ge 0$, it thus follows that
\[
\mathbb P\Bigl[ \sup_{u \ge t} \bigl| u^{-1} Z(u) - 1 \bigr| \ge 2\varepsilon \Bigr] \le C(\varepsilon,t) \exp\{ -\varepsilon^2 t/(2+3\varepsilon) \}, \tag{2.13}
\]
with $C(\varepsilon,t) := 2/\{ 1 - e^{-\varepsilon^3 t/(2+3\varepsilon)} \}$. Taking $\varepsilon = 1/2$ gives the first inequality, with $c_1 := C(\tfrac12,1)$; taking $\varepsilon = 1/4$ gives the third, with $c_3 = C(\tfrac14,1)$. For the second, with $t \ge 1$, $\varepsilon = t^{-1/3}$ gives, in particular,
\[
\mathbb P\Bigl[ \sup_{t \le u \le 8t} u^{1/3} \bigl| u^{-1} Z(u) - 1 \bigr| \ge 4 \Bigr] \le C(1,1) \exp\{ -t^{1/3}/5 \},
\]
and thus
\[
\mathbb P\Bigl[ \sup_{u \ge t} u^{1/3} \bigl| u^{-1} Z(u) - 1 \bigr| \ge 4 \Bigr] \le C(1,1) \sum_{j \ge 0} \exp\{ -2^j t^{1/3}/5 \} \le c_2 \exp\{ -t^{1/3}/5 \},
\]
with $c_2 := C(1,1)/(1 - e^{-1/5})$. The fourth inequality is a little trickier. Taking $t \ge 1$ and $\varepsilon = U/\{ 2(2t)^{\frac12(1-\eta)} \}$, we have
\[
\mathbb P\Bigl[ \sup_{t \le u \le 2t} u^{\frac12(1-\eta)} \bigl| u^{-1} Z(u) - 1 \bigr| \ge U \Bigr] \le C(\varepsilon,t) \exp\{ -2^{-3+\eta} U^2 t^\eta/(2+3U/2) \} \le C(\varepsilon,t) \exp\{ -U t^\eta/28 \}.
\]
For this choice of $\varepsilon$, $\varepsilon^2 t$ increases with $t$, but $\varepsilon^3 t$ decreases; however, it is not difficult to show that
\[
C(\varepsilon,t) \le C'(\varepsilon,t) := 10e \Bigl( \frac15 \vee \frac1{\varepsilon^3 t} \Bigr),
\]
uniformly in $t, U \ge 1$. Set $q(t) := C'(\varepsilon,t) \exp\{ -U t^\eta/28 \}$. Then, in the sum $\sum_{j \ge 0} q(2^j)$, the ratios of successive terms are at most
\[
2^{(1-3\eta)/2} \exp\{ -U(2^\eta - 1)/28 \} \le \sqrt2\, \exp\{ -U(2^\eta - 1)/28 \} \le 1/2,
\]
by assumption, so that $\sum_{j \ge 0} q(2^j) \le 2q(1) \le 2^{11/2}\,10e\, \exp\{ -U/28 \}$. Since $\mathbb P[Z(1) > U - 1] \le c\, e^{-U/14}$ by an exponential moment inequality, with $c := x e^{x-1}$ and $x = e^{1/14}$, the proof of the fourth inequality is complete.

Based on this lemma, we can now prove growth bounds for the Markov branching process. Here, we allow for quite general initial conditions.
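The bounds of Lemma 2.2 are easy to sanity-check by simulation. Since $u \mapsto Z(u)/u$ increases only at arrival times of $Z$, the supremum of $Z(u)/u$ over $[t, T]$ is attained either at $u = t$ or at an arrival time, so it can be evaluated exactly from one simulated path. The sketch below (our code; the finite horizon `T` stands in for the supremum over all $u \ge t$, which is harmless here because $Z(u)/u \to 1$) estimates the probability in part (1).

```python
import random

def sup_ratio(t, T, rng):
    """sup_{t <= u <= T} Z(u)/u for one unit-rate Poisson path Z.
    On [s_k, s_{k+1}) we have Z(u) = k, so k/u is largest at the left
    endpoint; only u = t and arrival times in (t, T] need checking."""
    arrivals, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)     # i.i.d. exponential inter-arrival times
        if s > T:
            break
        arrivals.append(s)
    best = sum(1 for s in arrivals if s <= t) / t     # value at u = t
    for k, s in enumerate(arrivals, start=1):         # k = Z(s_k)
        if s > t:
            best = max(best, k / s)
    return best

def estimate_exceedance(t, T, n_paths, rng):
    """Monte Carlo estimate of P[sup_{t <= u <= T} Z(u)/u >= 2]."""
    return sum(sup_ratio(t, T, rng) >= 2 for _ in range(n_paths)) / n_paths
```

Already for moderate $t$ the exceedance probability is far below the crude bound $c_1 e^{-t/14}$; the event requires a Poisson variable to double its mean somewhere beyond $t$, which is exponentially unlikely.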
For ease of reference, for any $K \ge 1$ and $0 < \eta < 1$, we define the events
\[
A^{(1)}_{K,s} := \bigl\{ e^{-\lambda_0 s} \|H(s)\| \le K \bigr\}; \qquad A^{(2)}_{K,s} := \bigl\{ H_{r(d)}(s) \ge K \bigr\}; \tag{2.14}
\]
\[
A^{(3)}_{K,\eta,s} := \Bigl\{ \sup_{0 \le u \le H_{r(d)}(s)} (u \vee 1)^{-\frac12(1+\eta)} |Z(u) - u| \le K^{\frac12(1-\eta)} \Bigr\}; \tag{2.15}
\]
\[
A'(K,s) := \bigl\{ \exp\{ -\lambda_0 (t-s)(1+\varepsilon_K) \} \|H(t)\| \le \|H(s)\| \ \text{for all } t > s \bigr\}, \tag{2.16}
\]
where $\varepsilon_K := 5K^{-1/3}$, and we write
\[
A_{K,s} := A^{(1)}_{\theta K,s} \cap A^{(2)}_{K,s} \cap A^{(3)}_{K,\varepsilon_K,s} \in \mathcal F_s, \tag{2.17}
\]
where $\mathcal F_s$ denotes the history of $X^*$ up to time $s$, and $\theta := C_a e^{1/80}$, with $C_a$ as defined below.

Theorem 2.1 For any $K \ge 1$ and any $0 \le s < t$, we have
\[
(1)\quad \mathbb P\bigl[ \exp\{ -\lambda_0 (t-s)(1+\varepsilon_K) \} \|H(t)\| \le \max\{ C_a K, \|H(s)\| \} \ \text{for all } t > s \bigr] \ge 1 - c_a e^{-K^{1/3}/5},
\]
for suitable constants $C_a := 3\exp\{ (r(d)!)^{1/r(d)} \}$ and $c_a$. Furthermore,
\[
(2)\quad \mathbb P\bigl[ A'(K,s) \mid \mathcal F_s \cap A^{(2)}_{K,s} \bigr] \ge 1 - c_2 e^{-K^{1/3}/5}.
\]

Proof: For $K > 0$, define $\tau_{K,s} := \inf\{ t \ge s \colon H_{r(d)}(t) \ge K \}$, and suppose first that $\tau_{K,s} > s$. Then $H_{r(d)}(t) \le K$ if $s \le t \le \tau_{K,s}$, and thus, for all such $t$,
\[
H_0(t) \le 1 + Z(K) \le 3K,
\]
by Lemma 2.2(1), on a set $A_1(K)$ of probability at least $1 - c_1 \exp\{-K/14\}$. Hence, since from the definition of $H_i(t)$ and by Hölder's inequality, we have
\[
H_i(t) \le \frac{(r(d)!)^{i/r(d)}}{i!}\, H_{r(d)}(t)^{i/r(d)} H_0(t)^{1-i/r(d)}, \qquad 1 \le i < r(d), \tag{2.18}
\]
it follows that, on $A_1(K)$,
\[
\|H(t)\| \le 3 \exp\{ (r(d)!)^{1/r(d)} \} K \quad \text{for all } s \le t \le \tau_{K,s}. \tag{2.19}
\]
Now, from Lemma 2.2(2), it follows that
\[
u(t) = 1 + \bigl( \hat h(t)/H_{r(d)}(t) \bigr)_+ \le \max\bigl\{ 1, K^{-1} + Z(H_{r(d)}(t))/H_{r(d)}(t) \bigr\} \le 1 + 5K^{-1/3} = 1 + \varepsilon_K
\]
for all $t \ge \tau_{K,s}$, on an event $A_2(K)$ of probability at least $1 - c_2 e^{-K^{1/3}/5}$. By (2.12), this implies that, on $A_1(K) \cap A_2(K)$, for all $t \ge \tau_{K,s}$,
\[
\|H(t)\| \le \|H(\tau_{K,s})\| \exp\{ \lambda_0 (t-s)(1+\varepsilon_K) \}. \tag{2.20}
\]
If $\tau_{K,s} > s$, by (2.19), this in turn implies that, on $A_1(K) \cap A_2(K)$,
\[
\|H(t)\| \le 3 \exp\{ (r(d)!)^{1/r(d)} \} K \exp\{ \lambda_0 (t-s)(1+\varepsilon_K) \} \tag{2.21}
\]
for all $t \ge s$; if $\tau_{K,s} = s$, we simply have $\|H(t)\| \le \|H(s)\| \exp\{ \lambda_0 (t-s)(1+\varepsilon_K) \}$. This establishes Part 1.
For Part 2, if $H_{r(d)}(s) \ge K$, it follows as above that
\[
\mathbb P\bigl[ u(t) \le 1+\varepsilon_K \ \text{for all } t \ge s \mid \mathcal F_s \bigr] \ge 1 - c_2 e^{-K^{1/3}/5},
\]
and, if this is the case, then $\exp\{ -\lambda_0 (t-s)(1+\varepsilon_K) \} \|H(t)\| \le \|H(s)\|$ for all $t > s$ follows from (2.12).

Corollary 2.3 Given any $\varepsilon > 0$, there exists a random variable $H_\varepsilon$ such that
\[
(1)\quad |\hat h(t)| \le H_\varepsilon \exp\{ \tfrac12 \lambda_0 (1+\varepsilon) t \} \ \text{a.s. for all } t > 0.
\]
In addition, for any $K \ge 1$,
\[
(2)\quad \mathbb E\bigl\{ |\hat h(t)|\, I[A'(K,s)] \mid \mathcal F_s \cap A_{K,s} \bigr\} \le 2\bigl\{ (\theta K)^{1/2} + K \bigr\} \exp\{ \tfrac12 \lambda_0 (1+\varepsilon_K) t \}.
\]

Proof: Note that, from Theorem 2.1(1), given any $\varepsilon > 0$,
\[
H'_\varepsilon := \sup_{t > 0} e^{-\lambda_0 (1+\varepsilon) t} \|H(t)\| < \infty \ \text{a.s.} \tag{2.22}
\]
This in turn implies that
\[
|\hat h(t) - 1| = \bigl| Z(H_{r(d)}(t)) - H_{r(d)}(t) \bigr| \le \sup_{0 \le u \le H'_\varepsilon \exp\{ \lambda_0 (1+\varepsilon) t \}} |Z(u) - u| \quad \text{for all } t > 0,
\]
and Part 1 follows from the law of the iterated logarithm for the Poisson process. In similar fashion, from (2.16), we have
\[
\mathbb E\bigl\{ |\hat h(t) - \hat h(s)|\, I[A'(K,s)] \mid \mathcal F_s \cap A_{K,s} \bigr\} \le \mathbb E\Bigl\{ \sup_{0 \le u \le \|H(s)\| \exp\{ \lambda_0 (1+\varepsilon_K)(t-s) \}} |\tilde Z(u) - u| \;\Big|\; \mathcal F_s \cap A_{K,s} \Bigr\} \le 2 (\theta K)^{1/2} \exp\{ \tfrac12 \lambda_0 (1+\varepsilon_K) t \}, \tag{2.23}
\]
for $\tilde Z(u) := Z(u + H_{r(d)}(s)) - Z(H_{r(d)}(s))$. Since also, on $A_{K,s}$,
\[
|\hat h(s) - 1| \le H_{r(d)}(s)^{\frac12(1+\varepsilon_K)} K^{\frac12(1-\varepsilon_K)} \le K \exp\{ \tfrac12 \lambda_0 s (1+\varepsilon_K) \},
\]
the proof is completed.

Recalling (2.10), and writing $W_*(t) := e^{-\lambda_0 t} \mathbf 1^T H(t)$, we have
\[
\frac{dW_*}{dt} = \lambda_0 e^{-\lambda_0 t} \hat h(t), \tag{2.24}
\]
and, in view of Corollary 2.3(1), it follows that $W_*(\infty) := \lim_{t \to \infty} W_*(t)$ exists and is finite a.s. The process $W_*(\cdot)$, although directly motivated from the differential equations (2.10), is not the usual choice for defining such a limit: the branching process martingale is
\[
\tilde W(t) := e^{-\lambda_0 t} \sum_{j=0}^{r(d)-1} H_j(t) = W_*(t) + e^{-\lambda_0 t} \{ H_0(t) - H_{r(d)}(t) \} = W_*(t) + e^{-\lambda_0 t} \hat h(t),
\]
this last by (2.11). Thus $\lim_{t \to \infty} \tilde W(t) = W_*(\infty)$ also, because of Corollary 2.3(1). Note that, in similar fashion, (2.24) can also be written as
\[
\frac{dW_*}{dt} = \lambda_0 e^{-\lambda_0 t} \{ H_0(t) - H_{r(d)}(t) \}, \tag{2.25}
\]
from which, by partial integration, it follows that
\[
W_*(\infty) = W_*(0) + \int_0^\infty e^{-\lambda_0 t} \{ dH_0(t) - \lambda_0 H_{r(d)-1}(t)\,dt \},
\]
identifying $W_*(\infty)$ as $r(d)W$, where $W$ is the limiting random variable defined in [4], Theorem 4.1.
