


On Discrete Gibbs Measure Approximation to Runs

A. N. Kumar* and N. S. Upadhye**
Department of Mathematics, Indian Institute of Technology Madras, Chennai-600036, India.
*Email: [email protected]  **Email: [email protected]

arXiv:1701.03294v1 [math.PR] 12 Jan 2017

Abstract

In this paper, some erroneous results for a dependent setup arising from an independent sequence of Bernoulli trials are corrected. Next, a Stein operator for the discrete Gibbs measure is derived using the PGF approach. Also, an operator for the dependent setup is derived and shown to be a perturbation of the Stein operator for the discrete Gibbs measure. Finally, using the perturbation technique and the explicit form of distributions from the discrete Gibbs measure, new error bounds between the dependent setup and the Poisson, pseudo-binomial and negative binomial distributions are obtained by matching up to the first two moments.

Keywords: Runs, Markov chain embedding technique, waiting time, Poisson, pseudo-binomial and negative binomial distribution, perturbations, probability generating function, Stein operator, Stein's method.

MSC 2010 Subject Classifications: Primary: 62E17, 62E20; Secondary: 60C05, 60E05.

1 Introduction

Runs and patterns form an important topic in areas related to probability and statistics, such as reliability theory, meteorology and agriculture, statistical testing and quality control, among many others (see Balakrishnan and Koutras [5], Kumar and Upadhye [21] and Dafnis et al. [14]). The research in this topic was initiated with runs related to success/failure (see Philippou et al. [24] and Philippou and Makri [25]). A series of articles later followed in this area; see Aki [1], Aki et al. [2], Antzoulakos et al. [3], Antzoulakos and Chadjiconstantinidis [4], Balakrishnan and Koutras [5], Makri et al. [22] and references therein. Furthermore, Huang and Tsai [17] extended the pattern by considering runs of failures and successes together, which is known as $(k_1,k_2)$-runs or the modified distribution of order $k$. Recently, Dafnis et al. [14] studied three different types of $(k_1,k_2)$-runs, as described below.

Let $\eta_1,\eta_2,\ldots,\eta_n$ be a finite sequence of independent Bernoulli trials with success probability $P(\eta_i=1)=p$ and failure probability $P(\eta_i=0)=q=1-p$. Then, three types of dependent setups can be observed for any pair of positive integers $(k_1,k_2)$, excluding $(0,0)$, as follows:

(T1) at least $k_1$ consecutive failures followed by at least $k_2$ consecutive successes;
(T2) exactly $k_1$ consecutive failures followed by exactly $k_2$ consecutive successes;
(T3) at most (at least one) $k_1$ consecutive failures followed by at most (at least one) $k_2$ consecutive successes.

Let $B^n_{k_1,k_2}$, $M^n_{k_1,k_2}$ and $N^n_{k_1,k_2}$ be the number of occurrences of the events of the first, second and third kind, respectively, out of $n$ trials. The distribution of the event of the first type ($B^n_{k_1,k_2}$) was studied by Huang and Tsai [17] in 1991, where the probability generating function (PGF), recursive relations for the probability mass function (PMF), Poisson convergence and an extension of this distribution are given. Recently, approximation problems related to $B^n_{k_1,k_2}$ have been studied widely; for example, Poisson approximation to $B^n_{k_1,k_2}$ is given by Vellaisamy [29], binomial convoluted Poisson approximation to $B^n_{1,1}$ is studied by Upadhye et al. [28], and negative binomial approximation to the waiting time for $B^n_{k_1,k_2}$ and pseudo-binomial approximation to $B^n_{k_1,k_2}$ are given by Kumar and Upadhye [20, 21].
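To make the second type of event concrete, the following minimal sketch (in Python; all names and parameter values are ours, not from the paper) counts (T2)-occurrences in a simulated Bernoulli sequence. It uses the convention formalized in Section 3 below: an occurrence is registered at the trial $l$ at which a block of exactly $k_1$ failures, immediately followed by exactly $k_2$ successes, is closed by a failure, the block being preceded by a success or by the start of the sequence.

```python
import random

def count_T2(seq, k1, k2):
    """Count occurrences of exactly k1 failures followed by exactly k2 successes
    in a 0/1 sequence (0 = failure, 1 = success): an occurrence is closed by a
    failure and preceded by a success or by the start of the sequence."""
    n = len(seq)
    count = 0
    for l in range(k1 + k2 + 1, n + 1):          # l = position of the closing failure (1-indexed)
        fails = all(seq[i - 1] == 0 for i in range(l - k1 - k2, l - k2))
        succs = all(seq[i - 1] == 1 for i in range(l - k2, l))
        closed = seq[l - 1] == 0
        opened = (l - k1 - k2 - 1 == 0) or seq[l - k1 - k2 - 2] == 1
        if fails and succs and closed and opened:
            count += 1
    return count

# Monte Carlo estimate of E[M^n_{k1,k2}] for illustrative (arbitrary) parameters.
random.seed(1)
n, k1, k2, p = 20, 2, 1, 0.6
reps = 20000
est = sum(count_T2([1 if random.random() < p else 0 for _ in range(n)], k1, k2)
          for _ in range(reps)) / reps
print("estimated E[M^n] ~", round(est, 4))
```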
In this paper, we focus on the distribution of $M^n_{k_1,k_2}$. First, we rederive recursive relations in PGFs, PMFs and moments and obtain explicit forms of the PGF and PMF. We also rederive the PGF and recursive relations in PMFs and moments of the waiting time for $M^n_{k_1,k_2}$, which correct some erroneous results of Dafnis et al. [14] (see Remark 3.2). Next, we derive a Stein operator for $M^n_{k_1,k_2}$ as a perturbation of a discrete Gibbs measure (DGM). Further, we obtain new bounds for the approximation of $M^n_{k_1,k_2}$ by the Poisson, pseudo-binomial and negative binomial distributions by matching the parameters.

Stein's method (Stein [27]) is used to derive bounds between $M^n_{k_1,k_2}$ and the Poisson, pseudo-binomial and negative binomial distributions. Stein's method can be described in three steps. First, we obtain an operator (known as a Stein operator, denoted by $\mathcal{A}_X$ for a random variable $X$) which acts on a large class of functions $\mathcal{G}_X$ such that
\[
\mathbb{E}[\mathcal{A}_X g(X)]=0, \quad \text{for } g\in\mathcal{G}_X.
\]
There are several approaches to obtain Stein operators; here we use the PGF approach given by Upadhye et al. [28]. In the second step, we obtain the solution of the Stein equation
\[
\mathcal{A}_X g(m)=f(m)-\mathbb{E}f(X), \quad m\in\mathbb{Z} \text{ and } f\in\mathcal{G}.
\]
In the last step, replacing $m$ by a random variable $Y$ in the Stein equation and taking expectations and the supremum lead to
\[
d_{TV}(X,Y):=\sup_{f\in\mathcal{J}}|\mathbb{E}f(X)-\mathbb{E}f(Y)|=\sup_{f\in\mathcal{J}}\big|\mathbb{E}[\mathcal{A}_X g(Y)]\big|,
\]
where $\mathcal{J}=\{\mathbf{1}(S) \mid S \text{ measurable}\}$ and $\mathbf{1}(S)$ is the indicator function of the set $S$. For more details and applications, see Barbour [6], Barbour and Chen [8], Barbour et al. [7, 9, 10], Čekanavičius [11], Chen et al. [13], Eichelsbacher and Reinert [15], Nourdin and Peccati [23], Ley et al. [19], Reinert [26] and references therein.

This paper is organized as follows. In Section 2, we first explain the Markov chain embedding technique, which is useful to derive the double generating function of finite integer-valued random variables. Next, we discuss some known results for Poisson, pseudo-binomial and negative binomial approximations. In Section 3, we rederive recursive relations in PGFs, PMFs and moments for $M^n_{k_1,k_2}$ and obtain its PGF and PMF. Next, we rederive the PGF and recursive relations in PMFs and moments of the waiting time for $M^n_{k_1,k_2}$. In Section 4, we obtain recursive relations in PGFs and their derivatives, which are useful in deriving a Stein operator for $M^n_{k_1,k_2}$ via the PGF approach. In Section 5, we derive a Stein operator for $M^n_{k_1,k_2}$ as a perturbation of DGM. Finally, in Section 6, we obtain new error bounds in total variation distance between $M^n_{k_1,k_2}$ and the Poisson, pseudo-binomial and negative binomial distributions by matching the first and second moments.

2 Known Results

In this section, we first describe the Markov chain embedding technique, which is useful to obtain the double generating function for the distributions of runs and for the waiting time distributions of runs.

Let $Z_n$ be a non-negative finite integer-valued random variable obtained by observing a specific pattern in a sequence of Bernoulli trials. Then a Markov chain embedding technique (see Fu and Koutras [16], Koutras [18], Balakrishnan and Koutras [5] and Dafnis et al. [14]) can be used to obtain the exact distribution of $Z_n$. The technique is as follows.

Definition 2.1. The random variable $Z_n$ is called a Markov chain embeddable variable of binomial type (MVB) if there exists a Markov chain $\{Y_t : t\ge 0\}$ defined on a discrete space $\Omega$ such that

(i) $\Omega$ can be partitioned as $\Omega=\cup_{x\ge 0}C_x$, $C_x=\{c_{x,0},c_{x,1},\ldots,c_{x,m-1}\}$,
(ii) $P(Y_t\in C_y \mid Y_{t-1}\in C_x)=0$ for all $y\neq x,\,x+1$ and $t\ge 1$,
(iii) $P(Z_n=x)=P(Y_n\in C_x)$, for $n\ge 0$, $x\ge 0$.

Note that only two transitions, from $C_x$ to $C_x$ and to $C_{x+1}$, are possible, which yield the following $m\times m$ matrices:
\[
A_t(x)=\big(P(Y_t=c_{x,j}\mid Y_{t-1}=c_{x,i})\big)_{m\times m},\qquad
B_t(x)=\big(P(Y_t=c_{x+1,j}\mid Y_{t-1}=c_{x,i})\big)_{m\times m},\quad \text{for } i,j=0,1,\ldots,m-1.
\]
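As an illustration of how Definition 2.1 is used computationally, the sketch below propagates the level-wise probability vectors $f_t(x)=\big(P(Y_t=c_{x,0}),\ldots,P(Y_t=c_{x,m-1})\big)$ through time-homogeneous matrices $A$ and $B$ (the setting of Theorem 2.1 below) via the standard forward step $f_t(x)=f_{t-1}(x)A+f_{t-1}(x-1)B$, and reads off $P(Z_n=x)$ as the total mass of level $x$ at time $n$. The function and argument names are ours.

```python
def mvb_distribution(pi0, A, B, n, max_level):
    """Exact distribution of an MVB Z_n, given the within-level matrix A, the
    level-up matrix B (both as nested lists) and the initial vector pi0 on level 0.
    max_level should be at least the largest attainable value of Z_n."""
    m = len(pi0)
    # f[x] holds the probability vector over the m states of level x at the current time.
    f = [list(pi0)] + [[0.0] * m for _ in range(max_level)]
    for _ in range(n):
        new_f = [[0.0] * m for _ in range(max_level + 1)]
        for x in range(max_level + 1):
            for j in range(m):
                s = sum(f[x][i] * A[i][j] for i in range(m))           # stay on level x
                if x > 0:
                    s += sum(f[x - 1][i] * B[i][j] for i in range(m))  # move up from level x-1
                new_f[x][j] = s
        f = new_f
    return [sum(level) for level in f]   # [P(Z_n = 0), ..., P(Z_n = max_level)]
```

For the chain constructed for $M^n_{k_1,k_2}$ in Section 3.1, feeding in the matrices $A$ and $B$ displayed there should reproduce the exact PMF of $M^n_{k_1,k_2}$; Theorem 2.1 below packages the same computation as a double generating function.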
Let $\ell_n=\sup\{x : P(Z_n=x)>0\}$, $\tilde{1}=(1,1,\ldots,1)_{1\times m}$, and let the initial distribution of the Markov chain be denoted by
\[
\pi_x=\big(P(Y_0=c_{x,0}),P(Y_0=c_{x,1}),\ldots,P(Y_0=c_{x,m-1})\big),\quad 0\le x\le \ell_n.
\]
Then, from the convention that $P(Z_0=0)=1$, we have $\pi_0\tilde{1}'=1$ and $\pi_x\tilde{1}'=0$ for $1\le x\le \ell_n$.

Next, let $\phi_n(t)$ and $\Phi(t,z)$ denote the single and double generating functions of $Z_n$, i.e.,
\[
\phi_n(t)=\sum_{m=0}^{\ell_n}P(Z_n=m)t^m \quad \text{and} \quad \Phi(t,z)=\sum_{n=0}^{\infty}\phi_n(t)z^n.
\]
Also, let $M_r(t)$ and $M(t,z)$ be the single and double generating functions for $\rho_r$, the waiting time of the $r$-th occurrence of the pattern counted by $Z_n$, i.e.,
\[
M_r(t)=\sum_{m=0}^{\infty}P(\rho_r=m)t^m \quad \text{and} \quad M(t,z)=\sum_{r=0}^{\infty}M_r(t)z^r.
\]
Now, we have the following result for the double generating functions of $Z_n$ and $\rho_r$.

Theorem 2.1. Let $I$ be the $m\times m$ identity matrix, $A=A_t(x)$ and $B=B_t(x)$ for all $t\ge 1$ and $x\ge 0$. Then
\[
\Phi(t,z)=\pi_0\big(I-z(A+tB)\big)^{-1}\tilde{1}' \quad \text{and} \quad M(t,z)=1+tz\,\pi_0\big(I-t(A+zB)\big)^{-1}B\tilde{1}'.
\]
For more details, we refer the reader to Antzoulakos et al. [3], Balakrishnan and Koutras [5], Dafnis et al. [14], Fu and Koutras [16], Koutras [18] and references therein.

Next, we discuss some relevant results on Stein's method. Let $X_1$, $X_2$ and $X_3$ follow the Poisson (with parameter $\lambda$), pseudo-binomial (with parameters $\check{\alpha}$ and $\check{p}$) and negative binomial (with parameters $\hat{\alpha}$ and $\hat{p}$) distributions, respectively, with PMFs
\[
P(X_1=m)=\frac{e^{-\lambda}\lambda^m}{m!},\quad m=0,1,2,\ldots,
\]
\[
P(X_2=m)=\frac{1}{R}\binom{\check{\alpha}}{m}\check{p}^m\check{q}^{\check{\alpha}-m},\quad m=0,1,2,\ldots,\lfloor\check{\alpha}\rfloor,
\]
\[
P(X_3=m)=\binom{\hat{\alpha}+m-1}{m}\hat{p}^{\hat{\alpha}}\hat{q}^m,\quad m=0,1,\ldots,
\]
where $\check{\alpha},\hat{\alpha}>0$ and $0<\check{p},\hat{p}<1$ with $\check{q}=1-\check{p}$, $\hat{q}=1-\hat{p}$, $\lfloor\check{\alpha}\rfloor$ is the greatest integer function of $\check{\alpha}$, and $R=\sum_{m=0}^{\lfloor\check{\alpha}\rfloor}\binom{\check{\alpha}}{m}\check{p}^m\check{q}^{\check{\alpha}-m}$. Now, throughout this paper, let $\mathcal{G}$ be the set of all bounded functions and let
\[
\mathcal{G}_X=\{g : g\in\mathcal{G} \text{ such that } g(0)=0 \text{ and } g(x)=0 \text{ for } x\notin \mathrm{Supp}(X)\}
\]
be associated with the Stein operator $\mathcal{A}_X$, where $\mathrm{Supp}(X)$ denotes the support of the random variable $X$. From (4), (5) and (6) of Upadhye et al. [28], Stein operators for $X_1$, $X_2$ and $X_3$ are given by
\[
\mathcal{A}_{X_1}g(m)=\lambda g(m+1)-mg(m),\quad g\in\mathcal{G}_{X_1} \text{ and } m=0,1,2,\ldots,
\]
\[
\mathcal{A}_{X_2}g(m)=(\check{\alpha}-m)\check{p}\,g(m+1)-m\check{q}\,g(m),\quad g\in\mathcal{G}_{X_2} \text{ and } m=0,1,\ldots,\lfloor\check{\alpha}\rfloor,
\]
\[
\mathcal{A}_{X_3}g(m)=\hat{q}(\hat{\alpha}+m)g(m+1)-mg(m),\quad g\in\mathcal{G}_{X_3} \text{ and } m=0,1,2,\ldots,
\]
and the bounds for the solution of the Stein equation for the Poisson, pseudo-binomial and negative binomial distributions, respectively, are given by
\[
\|\Delta g\|\le\frac{2\|f\|}{\max(1,\lambda)},\qquad \|\Delta g\|\le\frac{2\|f\|}{\lfloor\check{\alpha}\rfloor\check{p}\check{q}} \qquad \text{and} \qquad \|\Delta g\|\le\frac{2\|f\|}{\hat{\alpha}\hat{q}}. \tag{1}
\]
For more details, we refer the reader to Čekanavičius and Roos [12], Upadhye et al. [28] and Vellaisamy et al. [30] and references therein.
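As a quick numerical illustration of the characterizing identity $\mathbb{E}[\mathcal{A}_X g(X)]=0$, the sketch below checks it for the Poisson and negative binomial operators above, using an arbitrary bounded test function $g$ with $g(0)=0$; the truncation points and parameter values are ours, chosen only so that the neglected tails are negligible.

```python
import math

def expect(pmf, h, upper):
    # truncated expectation E[h(X)]; upper is chosen so that the tail is negligible
    return sum(pmf(m) * h(m) for m in range(upper + 1))

# a bounded test function with g(0) = 0, as required for membership in G_X
g = lambda m: math.sin(m) / (1.0 + m) if m > 0 else 0.0

# Poisson(lam):  A_{X1} g(m) = lam*g(m+1) - m*g(m)
lam = 2.5
poisson = lambda m: math.exp(-lam) * lam ** m / math.factorial(m)
print(expect(poisson, lambda m: lam * g(m + 1) - m * g(m), 80))               # ~ 0

# Negative binomial(alpha, p):  A_{X3} g(m) = q*(alpha+m)*g(m+1) - m*g(m)
alpha, ph = 3.2, 0.55
qh = 1.0 - ph
negbin = lambda m: (math.gamma(alpha + m) / (math.gamma(alpha) * math.factorial(m))
                    * ph ** alpha * qh ** m)
print(expect(negbin, lambda m: qh * (alpha + m) * g(m + 1) - m * g(m), 120))  # ~ 0
```

Both printed values are zero up to floating-point and truncation error.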
3 $(k_1,k_2)$-runs and Related Distributions

In this section, we first formulate the distribution of $M^n_{k_1,k_2}$ mathematically. Next, we derive the recursive relations for PGFs, PMFs and moments, and obtain the PGF and PMF by solving the recursive relations. Finally, we derive the PGF of the waiting time distribution and obtain the recursive relations in PMFs and moments of the waiting time distribution.

Recall that $M^n_{k_1,k_2}$ counts occurrences of the event "exactly $k_1$ consecutive failures followed by exactly $k_2$ consecutive successes". Mathematically, it can be formulated via
\[
I_{k_1+k_2+1}=\left(\prod_{i=1}^{k_1}(1-\eta_i)\right)\left(\prod_{j=1}^{k_2}\eta_{j+k_1}\right)(1-\eta_{k_1+k_2+1})
\]
and
\[
I_l=\eta_{l-k_1-k_2-1}\left(\prod_{i=1}^{k_1}(1-\eta_{i+l-k_1-k_2-1})\right)\left(\prod_{j=1}^{k_2}\eta_{j+l-k_2-1}\right)(1-\eta_l),\quad k_1+k_2+2\le l\le n.
\]
Then
\[
M^n_{k_1,k_2}=\sum_{l=k_1+k_2+1}^{n}I_l. \tag{2}
\]
Next, we derive the distribution of $M^n_{k_1,k_2}$ and the waiting time for $M^n_{k_1,k_2}$.

3.1 Distribution of $M^n_{k_1,k_2}$

We use the Markov chain embedding technique, as described in Section 2, to obtain the distribution of $M^n_{k_1,k_2}$ (see Dafnis et al. [14]). Using (2), we can write $M^n_{k_1,k_2}$ as an MVB as follows. Define $\ell_n=\sup\{x : P(M^n_{k_1,k_2}=x)>0\}=\lfloor n/(k_1+k_2)\rfloor$ and $C_x=\{c_{x,0},c_{x,1},\ldots,c_{x,k_1+k_2+1}\}$, where $c_{x,i}=(x,i)$, $0\le i\le k_1+k_2+1$. Also, define a Markov chain $\{Y_t : t\ge 0\}$ on $\Omega=\cup_{x=0}^{\ell_n}C_x$ by $Y_t=(x,j)$ if the event counted by $M^n_{k_1,k_2}$ has occurred $x$ times in the first $t$ outcomes and

• $j=0$, if $\eta_t=1$ and there is no pattern with exactly $k_1$ failures before $\eta_t$; if there is a pattern with exactly $k_1$ failures, then there is no pattern of $k_2-1$ successes followed by exactly $k_1$ failures before $\eta_t$.
• $j=i$, $1\le i\le k_1$, if $\eta_t=\eta_{t-1}=\cdots=\eta_{t-i+1}=0$ and the $(t-i)$-th outcome is a success (if it exists).
• $j=k_1^{+}$, if there is a pattern of more than $k_1$ failures, i.e., there exists a positive integer $l\ge k_1+1$ such that $\eta_t=\eta_{t-1}=\cdots=\eta_{t-l+1}=0$.
• $j=k_1+i$, $1\le i\le k_2$, if $\eta_t=\eta_{t-1}=\cdots=\eta_{t-i+1}=1$, $\eta_{t-i}=\eta_{t-i-1}=\cdots=\eta_{t-i-(k_1-1)}=0$ and the $(t-i-k_1)$-th outcome is a success (if it exists).

Also, we say the pattern is complete if a failure occurs after exactly $k_1$ consecutive failures followed by exactly $k_2$ consecutive successes. Now, $M^n_{k_1,k_2}$ becomes an MVB with this setup, $\pi_0=(1,0,\ldots,0)_{1\times(k_1+k_2+2)}$, and, with rows and columns indexed by the states $(\cdot,0),(\cdot,1),(\cdot,2),\ldots,(\cdot,k_1-1),(\cdot,k_1),(\cdot,k_1^{+}),(\cdot,k_1+1),(\cdot,k_1+2),\ldots,(\cdot,k_1+k_2-1),(\cdot,k_1+k_2)$,
\[
A=\begin{pmatrix}
p & q & 0 & \cdots & 0 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\
p & 0 & q & \cdots & 0 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\
\vdots & & & & & & & & & & & \vdots\\
p & 0 & 0 & \cdots & 0 & q & 0 & 0 & 0 & \cdots & 0 & 0\\
0 & 0 & 0 & \cdots & 0 & 0 & q & p & 0 & \cdots & 0 & 0\\
p & 0 & 0 & \cdots & 0 & 0 & q & 0 & 0 & \cdots & 0 & 0\\
0 & q & 0 & \cdots & 0 & 0 & 0 & 0 & p & \cdots & 0 & 0\\
\vdots & & & & & & & & & & & \vdots\\
0 & q & 0 & \cdots & 0 & 0 & 0 & 0 & 0 & \cdots & 0 & p\\
p & 0 & 0 & \cdots & 0 & 0 & 0 & 0 & 0 & \cdots & 0 & 0
\end{pmatrix},
\]
where $A$ is a $(k_1+k_2+2)\times(k_1+k_2+2)$ matrix and $B$ is the matrix with all entries zero except the $(k_1+k_2+2,2)$ entry, which equals $q$. Therefore, from Theorem 2.1, the double generating function of $M^n_{k_1,k_2}$ is given by
\[
\Phi(t,z)=\sum_{n=0}^{\infty}\phi_n(t)z^n=\frac{1+(qz)^{k_1}(pz)^{k_2}(1-qz)(1-t)}{1-z+(qz)^{k_1}(pz)^{k_2}(1-t)(1-qz)(1-pz)}.
\]
Let $a(p):=q^{k_1}p^{k_2}$ and $k:=k_1+k_2$; then
\[
\Phi(t,z)=\frac{1+a(p)z^k(1-qz)(1-t)}{1-z+a(p)z^k(1-t)(1-qz)(1-pz)}. \tag{3}
\]
Now, using $\Phi(t,z)$, the following recursive relations follow.

Theorem 3.1. The PGF of $M^n_{k_1,k_2}$ satisfies the following recursive relation:
\[
\phi_n(t)=\begin{cases}
1, & n\le k,\\
1+qa(p)(t-1), & n=k+1,\\
\phi_{n-1}(t)+a(p)(t-1)\big[\phi_{n-k}(t)-\phi_{n-k-1}(t)+qp\,\phi_{n-k-2}(t)\big], & n\ge k+2.
\end{cases}
\]

Theorem 3.2. Let $p_{m,n}=P\big(M^n_{k_1,k_2}=m\big)$ be the PMF of $M^n_{k_1,k_2}$. Then $p_{m,n}$ satisfies the following recursive relation:
\[
p_{m,n}=\begin{cases}
0, & n\le k,\ m>0,\\
1, & n\le k,\ m=0,\\
qa(p), & n=k+1,\ m=1,\\
1-qa(p), & n=k+1,\ m=0,\\
p_{m,n-1}-a(p)\big[(p_{m,n-k}-p_{m-1,n-k})-(p_{m,n-k-1}-p_{m-1,n-k-1})+qp(p_{m,n-k-2}-p_{m-1,n-k-2})\big], & n\ge k+2,\ m\ge 0.
\end{cases}
\]

Theorem 3.3. Let $\mu_{n,j}=\mathbb{E}\big[(M^n_{k_1,k_2})^j\big]$ be the $j$-th moment of $M^n_{k_1,k_2}$. Then, for $j\ge 1$, $\mu_{n,j}$ satisfies the following recursive relation:
\[
\mu_{n,j}=\begin{cases}
0, & n\le k,\\
qa(p), & n=k+1,\\
\mu_{n-1,j}+a(p)\sum_{l=0}^{j-1}\binom{j}{l}\big(\mu_{n-k,l}-\mu_{n-k-1,l}+qp\,\mu_{n-k-2,l}\big), & n\ge k+2.
\end{cases}
\]

The proofs of Theorems 3.1, 3.2 and 3.3 follow using steps similar to the proofs of Theorems 3.1, 3.2 and 3.3 of Dafnis et al. [14].
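As a numerical sanity check on Theorem 3.2 (and hence on Theorem 3.1), the following sketch computes $p_{m,n}$ from the recursion and compares it with the exact probability obtained by enumerating all $2^n$ outcome sequences, reusing the pattern counter `count_T2` from the sketch in Section 1. The parameter values are arbitrary.

```python
from itertools import product

def pmf_recursive(n_max, k1, k2, p):
    """P[m][n] = P(M^n_{k1,k2} = m) via the recursion of Theorem 3.2."""
    q = 1.0 - p
    k, a = k1 + k2, q**k1 * p**k2
    P = [[0.0] * (n_max + 1) for _ in range(n_max + 2)]       # rows indexed by m, columns by n
    for n in range(min(k, n_max) + 1):
        P[0][n] = 1.0
    if k + 1 <= n_max:
        P[0][k + 1], P[1][k + 1] = 1.0 - q * a, q * a
    val = lambda m, n: P[m][n] if m >= 0 else 0.0             # p_{-1,n} = 0
    for n in range(k + 2, n_max + 1):
        for m in range(n_max + 1):
            P[m][n] = val(m, n - 1) - a * ((val(m, n - k) - val(m - 1, n - k))
                      - (val(m, n - k - 1) - val(m - 1, n - k - 1))
                      + q * p * (val(m, n - k - 2) - val(m - 1, n - k - 2)))
    return P

def pmf_enumerate(n, k1, k2, p, m):
    """P(M^n_{k1,k2} = m) by summing over all 2^n sequences (count_T2 from Section 1)."""
    q = 1.0 - p
    return sum(p**sum(s) * q**(n - sum(s))
               for s in product([0, 1], repeat=n) if count_T2(list(s), k1, k2) == m)

n, k1, k2, p = 9, 2, 1, 0.6
P = pmf_recursive(n, k1, k2, p)
for m in range(3):
    print(m, round(P[m][n], 6), round(pmf_enumerate(n, k1, k2, p, m), 6))   # columns agree
```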
Next, we solve the recursive relations and derive the PGF and PMF of $M^n_{k_1,k_2}$.

Theorem 3.4. The PGF of $M^n_{k_1,k_2}$ is given by
\[
\phi_n(t)=\psi_n(t)-a(p)(t-1)\big[\psi_{n-k}(t)-q\psi_{n-k-1}(t)\big],
\]
where
\[
\psi_n(t)=\sum_{l=0}^{\lfloor\frac{n}{k}\rfloor}\sum_{r=0}^{\lfloor\frac{n-lk}{k+1}\rfloor}\sum_{s=0}^{\lfloor\frac{n-lk-r(k+1)}{k+2}\rfloor}\binom{n-l(k-1)-rk-s(k+1)}{n-lk-r(k+1)-s(k+2),\,l,\,r,\,s}(-1)^r(qp)^s\big(a(p)(t-1)\big)^{l+r+s}
\]
and
\[
\binom{n-l(k-1)-rk-s(k+1)}{n-lk-r(k+1)-s(k+2),\,l,\,r,\,s}=\frac{\big(n-l(k-1)-rk-s(k+1)\big)!}{\big(n-lk-r(k+1)-s(k+2)\big)!\;l!\;r!\;s!}.
\]

Proof. For $\big|z+a(p)(t-1)z^k(1-z+qpz^2)\big|<1$, consider
\begin{align*}
\frac{1}{1-z-a(p)(t-1)z^k(1-z+qpz^2)}
&=\sum_{n=0}^{\infty}z^n\big(1+a(p)(t-1)z^{k-1}(1-z+qpz^2)\big)^n\\
&=\sum_{n=0}^{\infty}\sum_{l=0}^{n}\sum_{r=0}^{l}\sum_{s=0}^{r}\binom{n}{l}\binom{l}{r}\binom{r}{s}(-1)^{r-s}(qp)^s\big(a(p)(t-1)\big)^l z^{n+r+s+l(k-1)}\\
&=\sum_{n=0}^{\infty}\sum_{l=0}^{\infty}\sum_{r=0}^{\infty}\sum_{s=0}^{\infty}\binom{n+l+r+s}{n,\,l,\,r,\,s}(-1)^r(qp)^s\big(a(p)(t-1)\big)^{l+r+s}z^{n+lk+r(k+1)+s(k+2)}\\
&=\sum_{l=0}^{\infty}\sum_{r=0}^{\infty}\sum_{s=0}^{\infty}\sum_{n=s(k+2)}^{\infty}\binom{n+l+r-s(k+1)}{n-s(k+2),\,l,\,r,\,s}(-1)^r(qp)^s\big(a(p)(t-1)\big)^{l+r+s}z^{n+lk+r(k+1)}\\
&=\sum_{l=0}^{\infty}\sum_{r=0}^{\infty}\sum_{n=0}^{\infty}\sum_{s=0}^{\lfloor\frac{n}{k+2}\rfloor}\binom{n+l+r-s(k+1)}{n-s(k+2),\,l,\,r,\,s}(-1)^r(qp)^s\big(a(p)(t-1)\big)^{l+r+s}z^{n+lk+r(k+1)}.
\end{align*}
Using similar steps for $l$ and $r$, we have
\[
\frac{1}{1-z-a(p)(t-1)z^k(1-z+qpz^2)}
=\sum_{n=0}^{\infty}\left[\sum_{l=0}^{\lfloor\frac{n}{k}\rfloor}\sum_{r=0}^{\lfloor\frac{n-lk}{k+1}\rfloor}\sum_{s=0}^{\lfloor\frac{n-lk-r(k+1)}{k+2}\rfloor}\binom{n-l(k-1)-rk-s(k+1)}{n-lk-r(k+1)-s(k+2),\,l,\,r,\,s}(-1)^r(qp)^s\big(a(p)(t-1)\big)^{l+r+s}\right]z^n
=\sum_{n=0}^{\infty}\psi_n(t)z^n.
\]
Now, from (3), we get
\[
\sum_{n=0}^{\infty}\phi_n(t)z^n=\big[1+a(p)z^k(1-qz)(1-t)\big]\sum_{n=0}^{\infty}\psi_n(t)z^n.
\]
Comparing the coefficients of $z^n$, we get the required result.

Theorem 3.5. The PMF of $M^n_{k_1,k_2}$ is given by
\[
p_{m,n}=\tilde{p}_{m,n}+a(p)\big[(\tilde{p}_{m,n-k}-\tilde{p}_{m-1,n-k})-q(\tilde{p}_{m,n-k-1}-\tilde{p}_{m-1,n-k-1})\big],
\]
where
\[
\tilde{p}_{m,n}=\sum_{l=0}^{\lfloor\frac{n}{k}\rfloor}\sum_{u=0}^{\lfloor\frac{n-lk}{k+1}\rfloor}\sum_{v=0}^{\lfloor\frac{n-lk-u(k+1)}{k+2}\rfloor}\binom{n-l(k-1)-uk-v(k+1)}{n-lk-u(k+1)-v(k+2),\,l,\,u,\,v}\binom{l+u+v}{m}(-1)^{l-m-v}a(p)^{l+u+v}(qp)^v.
\]

Proof. Multiplying by $z_1^n z_2^m$ and taking the summation over $n$ and $m$ in the recursive relation of Theorem 3.2, we have
\[
\sum_{n=0}^{\infty}\sum_{m=0}^{\infty}p_{m,n}z_1^n z_2^m=\frac{1+a(p)(1-qz_1)(1-z_2)z_1^k}{1-z_1+a(p)(1-z_2)(1-z_1+qpz_1^2)z_1^k}. \tag{4}
\]
For $\big|z_1+a(p)z_1^k(z_2-1)(1-z_1+qpz_1^2)\big|<1$, consider
\begin{align*}
\frac{1}{1-z_1-a(p)z_1^k(z_2-1)(qpz_1^2-z_1+1)}
&=\sum_{n=0}^{\infty}z_1^n\big(1+a(p)z_1^{k-1}(z_2-1)(qpz_1^2-z_1+1)\big)^n\\
&=\sum_{n=0}^{\infty}\sum_{l=0}^{\infty}\binom{n+l}{l}a(p)^l(qpz_1^2-z_1+1)^l z_1^{n+lk}(z_2-1)^l\\
&=\sum_{m=0}^{\infty}\left(\sum_{n=0}^{\infty}\sum_{l=0}^{\infty}\binom{n+l+m}{l+m}\binom{l+m}{m}(-1)^l a(p)^{l+m}(qpz_1^2-z_1+1)^{l+m}z_1^{n+lk+mk}\right)z_2^m.
\end{align*}
Next, following steps similar to the proof of Theorem 3.4, we get
\[
\frac{1}{1-z_1-a(p)z_1^k(z_2-1)(qpz_1^2-z_1+1)}
=\sum_{m=0}^{\infty}\sum_{n=0}^{\infty}\left[\sum_{l=0}^{\lfloor\frac{n}{k}\rfloor}\sum_{u=0}^{\lfloor\frac{n-lk}{k+1}\rfloor}\sum_{v=0}^{\lfloor\frac{n-lk-u(k+1)}{k+2}\rfloor}\binom{n-l(k-1)-uk-v(k+1)}{n-lk-u(k+1)-v(k+2),\,l,\,u,\,v}\binom{l+u+v}{m}(-1)^{l-m-v}a(p)^{l+u+v}(qp)^v\right]z_1^n z_2^m
=\sum_{m=0}^{\infty}\sum_{n=0}^{\infty}\tilde{p}_{m,n}z_1^n z_2^m.
\]
Substituting in (4) and comparing the coefficients of $z_1^n z_2^m$, we get the required result.

Remark 3.1. Dafnis et al. [14] also considered the distribution of exactly $k_1$ consecutive failures followed by exactly $k_2$ consecutive successes out of $n$ trials, but without requiring a failure after this pattern, which is essential to ensure that the pattern is exact. The PGF and PMF of Dafnis et al. [14], namely $\psi_n(t)$ and $\tilde{p}_{m,n}$ respectively, appear to satisfy the defining conditions of a PGF and a PMF. However, the corresponding waiting time distribution does not satisfy the necessary condition of a PGF, as shown in Remark 3.2.

3.2 Waiting Time for $M^n_{k_1,k_2}$

Let $W_r$ be the waiting time for the $r$-th occurrence of the event counted by $M^n_{k_1,k_2}$. Then, using Theorem 2.1, the double generating function of $W_r$ can be easily calculated. Also, using the double generating function, we derive the PGF and recursive relations in PMFs and moments as follows.

Theorem 3.6. The PGF of $W_r$, for $r\ge 1$, is given by
\[
M_r(t)=\frac{qt}{1-pt}\left(\frac{a(p)t^k(1-qt)(1-pt)}{1-t+a(p)t^k(1-qt)(1-pt)}\right)^r.
\]
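Theorem 3.6 can be checked by simulation: the empirical PGF $\widehat{\mathbb{E}}[t^{W_1}]$ computed from simulated waiting times should agree with the closed form $M_1(t)$ at any fixed $t\in(0,1)$. The sketch below does this for illustrative parameters; the length cap on the simulated sequences is our assumption and introduces only a negligible truncation bias for the chosen values.

```python
import random

def M_r(t, r, k1, k2, p):
    """Closed-form PGF of W_r from Theorem 3.6."""
    q = 1.0 - p
    k, a = k1 + k2, q**k1 * p**k2
    frac = a * t**k * (1 - q * t) * (1 - p * t) / (1 - t + a * t**k * (1 - q * t) * (1 - p * t))
    return q * t / (1 - p * t) * frac**r

def simulate_W1(k1, k2, p, cap=10000):
    """First trial index at which the (T2) pattern completes (same convention as count_T2)."""
    seq = []
    for l in range(1, cap + 1):
        seq.append(1 if random.random() < p else 0)
        if l >= k1 + k2 + 1:
            fails = all(seq[i - 1] == 0 for i in range(l - k1 - k2, l - k2))
            succs = all(seq[i - 1] == 1 for i in range(l - k2, l))
            opened = (l - k1 - k2 - 1 == 0) or seq[l - k1 - k2 - 2] == 1
            if fails and succs and seq[l - 1] == 0 and opened:
                return l
    return cap   # truncation; rare for moderate p, k1, k2

random.seed(2)
k1, k2, p, t = 2, 1, 0.6, 0.9
samples = [simulate_W1(k1, k2, p) for _ in range(20000)]
print("empirical E[t^W1]:", sum(t**w for w in samples) / len(samples))
print("Theorem 3.6 M_1(t):", M_r(t, 1, k1, k2, p))
```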
Theorem 3.7. Let $f_r(m)=P(W_r=m)$ be the PMF of $W_r$. Then $f_r(m)$ satisfies the following recursive relation:
\[
f_r(m)=f_r(m-1)+a(p)\big[f_{r-1}(m-k)-f_r(m-k)-f_{r-1}(m-k-1)+f_r(m-k-1)+qp\,f_{r-1}(m-k-2)-qp\,f_r(m-k-2)\big],
\]
for $r\ge 2$ and $m\ge rk+1$, with initial conditions $f_0(m)=\delta_{m,0}$, $f_r(m)=0$ for $m\le rk$, and
\[
f_1(k+1)=qa(p),\qquad f_1(k+2)=qp\,a(p)
\]
and
\[
f_1(m)=f_1(m-1)-a(p)\big[f_1(m-k)-f_1(m-k-1)+qp\,f_1(m-k-2)\big],\quad m\ge k+2,
\]
where $\delta_{m,0}$ is the Kronecker delta function.

Theorem 3.8. Let $\tilde{\mu}_{r,j}=\mathbb{E}\big[(W_r)^j\big]$ be the $j$-th moment of $W_r$. Then $\tilde{\mu}_{r,j}$ satisfies the following recursive relation:
\[
\tilde{\mu}_{r,j}=\sum_{l=0}^{j}\binom{j}{l}\Big[\tilde{\mu}_{r,l}+a(p)\big(k^{j-l}-(k+1)^{j-l}+qp(k+2)^{j-l}\big)\big(\tilde{\mu}_{r-1,l}-\tilde{\mu}_{r,l}\big)\Big],\quad r\ge 2,\ j\ge 1,
\]
with initial condition $\tilde{\mu}_{0,j}=\delta_{j,0}$ and
\[
\tilde{\mu}_{1,j}=\sum_{l=0}^{j}\binom{j}{l}\tilde{\mu}_{1,l}\Big[1-a(p)\big(k^{j-l}-(k+1)^{j-l}+qp(k+2)^{j-l}\big)\Big]+qa(p)\big[(k+1)^j-q(k+2)^j\big].
\]
The proofs of Theorems 3.6, 3.7 and 3.8 follow from Theorems 4.4, 4.5 and 4.6 of Dafnis et al. [14].

Remark 3.2. In Theorem 4.4 of Dafnis et al. [14], the PGF of the $r$-th waiting time is given by
\[
H_r(z)=\left(\frac{(qz)^{k_1}(pz)^{k_2}(1-qz)(1-pz)}{1-z+(qz)^{k_1}(pz)^{k_2}(1-qz)(1-pz)}\right)^r(1-pz)^{-1}.
\]
A necessary condition for a PGF is $H_r(1)=1$, but here we have $H_r(1)=1/q\neq 1$. So $H_r(z)$ is not the PGF of the $r$-th waiting time distribution, and the recursive relations in Theorems 4.5 and 4.6 are also not correct. In a similar spirit, Theorems 3.4, 3.5 and 3.6 of Dafnis et al. [14] may not be correct, as both sets of results are derived using the same matrices $A$ and $B$. Hence, we have corrected some results of Dafnis et al. [14], namely Theorems 4.4, 4.5 and 4.6.
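The normalization failure pointed out in Remark 3.2 is easy to check numerically: both generating functions can be evaluated directly at $t=1$ (no limit is needed, since the denominators do not vanish there), giving $M_r(1)=1$ and $H_r(1)=1/q$. A small sketch with arbitrary parameters, reusing `M_r` from the sketch after Theorem 3.6:

```python
def H_r(z, r, k1, k2, p):
    """Generating function proposed in Theorem 4.4 of Dafnis et al. [14] (see Remark 3.2)."""
    q = 1.0 - p
    num = (q * z)**k1 * (p * z)**k2 * (1 - q * z) * (1 - p * z)
    return (num / (1 - z + num))**r / (1 - p * z)

k1, k2, p, r = 2, 3, 0.35, 4
print("M_r(1) =", M_r(1.0, r, k1, k2, p))   # 1.0, as a PGF requires
print("H_r(1) =", H_r(1.0, r, k1, k2, p))   # 1/q, violating the PGF condition
print("1/q    =", 1.0 / (1.0 - p))
```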
4 Recursive Relations

In this section, we derive recursive relations between the PGF of $M^n_{k_1,k_2}$ and its derivative, which are used in Section 5. Define
\[
a_1=1,\quad a_2=-1,\quad a_3=qp,\qquad d_1=d_3=n-k-2,\quad d_2=n-k-1,
\]
\[
b_1(n)=n+1,\quad b_3(n)=n \quad\text{and}\quad b_2(n)=\begin{cases}-q(k+2), & n=k+1,\\ n+1-q, & n\ge k+2.\end{cases}
\]

Lemma 4.1. The PGF of $M^n_{k_1,k_2}$ satisfies the following recursive relations:
\[
\text{(i)}\quad \phi_n'(t)=a(p)\big[(n-k)\phi_{n-k}(t)-(n-k-q)\phi_{n-k-1}(t)+qp(n-k-1)\phi_{n-k-2}(t)\big]-a(p)(t-1)\big[(k-1)\phi_{n-k}'(t)-k\phi_{n-k-1}'(t)+qp(k+1)\phi_{n-k-2}'(t)\big],
\]
\[
\text{(ii)}\quad \phi_n'(t)=a(p)\sum_{i=1}^{3}a_i\sum_{s=0}^{d_i}b_i(n-s)C_s(t)\phi_{n-k-s-i+1}(t),
\]
where
\[
C_s(t)=\sum_{l=0}^{\lfloor s/k\rfloor}\sum_{m=0}^{\lfloor\frac{s-lk}{k+1}\rfloor}\binom{s-l(k-1)-mk}{s-lk-m(k+1),\,l,\,m}\frac{(k+1)^{s-lk-m(k+1)}}{(k+2)^{s-l(k-1)-mk+1}}(-1)^m 2^l\big[a(p)(t-1)\big]^{l+m}.
\]

Proof. (i) From (3), the double PGF of $M^n_{k_1,k_2}$ is given by
\[
\Phi(t,z)=\sum_{n=0}^{\infty}\phi_n(t)z^n=\frac{1-a(p)(t-1)z^k(1-qz)}{1-z-a(p)(t-1)z^k(1-qz)(1-pz)},
\]
or, equivalently,
\[
\big[1-z-a(p)(t-1)z^k(1-qz)(1-pz)\big]\sum_{n=0}^{\infty}\phi_n(t)z^n=1-a(p)(t-1)z^k(1-qz). \tag{5}
\]
Differentiating (5) w.r.t. $t$ and $z$, respectively, we have
\[
\big[1-z-a(p)(t-1)z^k(1-z+qpz^2)\big]\sum_{n=0}^{\infty}\phi_n'(t)z^n-a(p)z^k\big[1-z+qpz^2\big]\sum_{n=0}^{\infty}\phi_n(t)z^n=-a(p)z^k(1-qz) \tag{6}
\]
and
\[
\big[1-z-a(p)(t-1)z^k(1-z+qpz^2)\big]\sum_{n=0}^{\infty}n\phi_n(t)z^n-\big[z+a(p)(t-1)z^k\big(k-(k+1)z+qp(k+2)z^2\big)\big]\sum_{n=0}^{\infty}\phi_n(t)z^n=-a(p)(t-1)z^k\big(k-q(k+1)z\big). \tag{7}
\]
Multiplying (6) by $\big(z+a(p)(t-1)z^k(k-(k+1)z+qp(k+2)z^2)\big)$ and (7) by $a(p)z^k(1-z+qpz^2)$, and subtracting, we get
\[
\big[z+a(p)(t-1)z^k\big(k-(k+1)z+qp(k+2)z^2\big)\big]\sum_{n=0}^{\infty}\phi_n'(t)z^n-a(p)z^k(1-z+qpz^2)\sum_{n=0}^{\infty}n\phi_n(t)z^n
=\frac{a(p)z^{k+1}(1-qz)\big(pa(p)(t-1)z^k(1-qz)-1\big)}{1-z-a(p)(t-1)z^k(1-qz)(1-pz)}. \tag{8}
\]
Now, adding (6) and (8) yields
\[
\big[1+a(p)(t-1)z^k\big((k-1)-kz+qp(k+1)z^2\big)\big]\sum_{n=0}^{\infty}\phi_n'(t)z^n-a(p)z^k(1-z+qpz^2)\sum_{n=0}^{\infty}(n+1)\phi_n(t)z^n
=-a(p)z^k(1-qz)\sum_{n=0}^{\infty}\phi_n(t)z^n. \tag{9}
\]
Comparing the coefficients of $z^n$, we get the required result.

(ii) Multiplying (6) by $(k+1)$ and adding it to (9), we have
\[
\big[(k+2)-(k+1)z-a(p)(t-1)z^k(2-z)\big]\sum_{n=0}^{\infty}\phi_n'(t)z^n=qa(p)(k+2)z^{k+1}
+a(p)\left[\sum_{n=k+2}^{\infty}(n+1)\phi_{n-k}(t)z^n-\sum_{n=k+2}^{\infty}(n+1-q)\phi_{n-k-1}(t)z^n+qp\sum_{n=k+2}^{\infty}n\phi_{n-k-2}(t)z^n\right]. \tag{10}
\]
Following steps similar to the proof of Theorem 3.4, it can easily be seen that
\[
\frac{1}{(k+2)-(k+1)z-a(p)(t-1)z^k(2-z)}
=\sum_{n=0}^{\infty}\left[\sum_{l=0}^{\lfloor n/k\rfloor}\sum_{m=0}^{\lfloor\frac{n-lk}{k+1}\rfloor}\binom{n-l(k-1)-mk}{n-lk-m(k+1),\,l,\,m}\frac{(k+1)^{n-lk-m(k+1)}}{(k+2)^{n-l(k-1)-mk+1}}(-1)^m 2^l\big[a(p)(t-1)\big]^{l+m}\right]z^n
=\sum_{n=0}^{\infty}C_n(t)z^n.
\]
Substituting in (10) and comparing the coefficients of $z^n$, we get
\begin{align*}
\phi_n'(t)&=a(p)\left[\sum_{s=0}^{n-k-2}(n-s+1)C_s(t)\phi_{n-k-s}(t)-\sum_{s=0}^{n-k-1}b_2(n-s)C_s(t)\phi_{n-k-s-1}(t)+qp\sum_{s=0}^{n-k-2}(n-s)C_s(t)\phi_{n-k-s-2}(t)\right]\\
&=a(p)\sum_{i=1}^{3}a_i\sum_{s=0}^{d_i}b_i(n-s)C_s(t)\phi_{n-k-s-i+1}(t).
\end{align*}
This proves (ii).

5 Discrete Gibbs Measures and Stein Operator

In this section, using the PGF approach, we derive a Stein operator for the DGM and a Stein operator for $M^n_{k_1,k_2}$ as a perturbation of the DGM. The DGM is a family of discrete distributions whose PMF is given by
\[
\gamma(m)=\frac{1}{\beta}\frac{e^{U(m)}w^m}{m!},\quad m=0,1,\ldots,N, \tag{11}
\]
where $w>0$ is fixed, $N\in\mathbb{N}_0\cup\{\infty\}$, $U:\mathbb{N}_0\to\mathbb{R}$ is a function and $\beta=\sum_{m=0}^{N}\frac{e^{U(m)}w^m}{m!}$. Also, this representation is not unique. For example, the Poisson distribution has representations with $w=\lambda$, $U(m)=-\lambda$, $\beta=1$ and with $w=\lambda$, $U(m)=0$, $\beta=e^{\lambda}$. The PGF of the DGM is given by
\[
G(t)=\sum_{m=0}^{N}\gamma(m)t^m.
\]
Therefore, a Stein operator for the DGM can be calculated using the PGF approach as follows:
\[
G'(t)=\sum_{m=0}^{N}m\gamma(m)t^{m-1}=\sum_{m=0}^{N-1}(m+1)\gamma(m+1)t^m=\sum_{m=0}^{N-1}(m+1)\frac{\gamma(m+1)}{\gamma(m)}\gamma(m)t^m=\sum_{m=0}^{N-1}we^{U(m+1)-U(m)}\gamma(m)t^m.
\]
Comparing the coefficients of $t^m$ in the third and last expressions, we have
\[
(m+1)\gamma(m+1)=we^{U(m+1)-U(m)}\gamma(m).
\]
Also, this expression can be computed directly from (11). Let $g\in\mathcal{G}_\gamma$; then
\[
\sum_{m=0}^{N}g(m+1)(m+1)\gamma(m+1)=\sum_{m=0}^{N}g(m+1)we^{U(m+1)-U(m)}\gamma(m).
\]
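The last display states that $\mathbb{E}[Xg(X)]=\mathbb{E}\big[we^{U(X+1)-U(X)}g(X+1)\big]$ for $X\sim\gamma$ and $g\in\mathcal{G}_\gamma$, i.e., that the operator $\mathcal{A}_\gamma g(m)=we^{U(m+1)-U(m)}g(m+1)-mg(m)$ satisfies $\mathbb{E}[\mathcal{A}_\gamma g(X)]=0$; under the representation $w=\lambda$, $U\equiv 0$ this reduces to the Poisson operator $\mathcal{A}_{X_1}$ of Section 2. The sketch below is a numerical check of this identity for a finite-support DGM; the choices of $U$, $w$, $N$ and $g$ are ours, purely for illustration.

```python
import math

def dgm_pmf(U, w, N):
    """Finite-support discrete Gibbs measure (11): gamma(m) proportional to exp(U(m)) w^m / m!."""
    weights = [math.exp(U(m)) * w**m / math.factorial(m) for m in range(N + 1)]
    beta = sum(weights)
    return [x / beta for x in weights]

# Illustrative (hypothetical) choices: a log-quadratic tilt U, and a bounded test
# function g with g(0) = 0 and g(m) = 0 outside the support {0, ..., N}.
N, w = 40, 3.0
U = lambda m: -0.05 * m**2
gamma = dgm_pmf(U, w, N)
g = lambda m: math.cos(m) / (1.0 + m) if 0 < m <= N else 0.0

# E[A_gamma g(X)] with A_gamma g(m) = w*exp(U(m+1)-U(m))*g(m+1) - m*g(m)
lhs = sum(gamma[m] * (w * math.exp(U(m + 1) - U(m)) * g(m + 1) - m * g(m))
          for m in range(N + 1))
print(lhs)   # zero up to floating-point error
```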
