Upper Bound on the Capacity of Gaussian Channels with Noisy Feedback

Chong Li and Nicola Elia
Department of Electrical and Computer Engineering, Iowa State University, Ames, IA, 50011
Email: {chongli, nelia}@iastate.edu

Abstract—We consider an additive Gaussian channel with additive Gaussian noise feedback. We first derive an upper bound on the n-block capacity (defined by Cover [1]). It is shown that this upper bound can be obtained by solving a convex optimization problem. With stationarity assumptions on the Gaussian noise processes, we characterize the limit of the n-block upper bound and prove that this limit is an upper bound on the noisy feedback (Shannon) capacity.

Index Terms—Capacity, Gaussian channels with noisy feedback, convex optimization, stationarity.

This work was supported by NSF under grant number ECS-0901846.

I. INTRODUCTION

We consider a time-varying additive Gaussian channel with time-varying additive Gaussian feedback; see Fig. 1. Here $M$ is a message index with $M \in \{1,2,3,\cdots,2^{nR}\}$. The additive Gaussian channel is modeled as
$$Y_i = X_i + W_i, \quad i = 1,2,\cdots$$
where the Gaussian noise $\{W_i\}_{i=0}^{\infty}$ satisfies $W^n \sim \mathcal{N}_n(0, K_{w,n})$ for all $n \in \mathbb{Z}^+$. Similarly, the additive Gaussian feedback is modeled as
$$Z_i = Y_i + V_i, \quad i = 1,2,\cdots$$
where the Gaussian noise $\{V_i\}_{i=0}^{\infty}$ satisfies $V^n \sim \mathcal{N}_n(0, K_{v,n})$ for all $n \in \mathbb{Z}^+$. The noises $V$ and $W$ are assumed to be independent. Notice that we have not assumed stationarity on $W$ and $V$. The channel input $X_i$ is generated based on $M$ and $Z^{i-1}$, satisfying
$$\frac{1}{n}\sum_{i=1}^{n} E X_i^2(M, Z^{i-1}) \le P.$$
Since the capacity of this noisy feedback Gaussian channel is difficult to characterize, we wish to find a tight upper bound on the capacity in this paper.

[Fig. 1. Gaussian channels with additive Gaussian noise feedback]

In retrospect, additive Gaussian channels have been studied since the birth of "Information Theory". When there is no feedback (i.e. $Z_i = 0$ for all $i$), the channel input $X_i$ is independent of the previous channel outputs. The n-block capacity is characterized in [1] as
$$C_n = \max_{\mathrm{tr}(K_{x,n}) \le nP} \frac{1}{2n}\log\frac{\det(K_{w,n}+K_{x,n})}{\det K_{w,n}}$$
where the maximum is taken over all positive semidefinite matrices $K_{x,n}$. Here, the n-block capacity can be thought of as the capacity in bits per transmission if the channel is to be used for the time block $\{1,2,\cdots,n\}$ [1]. If we assume stationarity of the process $\{W_i\}_{i=0}^{\infty}$, it is well known that the nonfeedback (Shannon) capacity is characterized by water-filling on the noise power spectrum. Specifically,
$$C = \frac{1}{4\pi}\int_{-\pi}^{\pi} \log\frac{\max\{S_w(e^{i\theta}), \lambda\}}{S_w(e^{i\theta})}\, d\theta$$
where $S_w(e^{i\theta})$ is the power spectral density of the stationary noise process $\{W_i\}_{i=0}^{\infty}$. The water level $\lambda$ should satisfy
$$\frac{1}{2\pi}\int_{-\pi}^{\pi} \max\{0, \lambda - S_w(e^{i\theta})\}\, d\theta = P.$$
Note that the initial idea of water-filling comes from Shannon [2]. When there is perfect feedback (i.e. $Z_i = Y_i$ for all $i$), the n-block feedback capacity is notably characterized in [1] as
$$C_{fb,n} = \max_{B_n, K_{s,n}} \frac{1}{2n}\log\frac{\det((I_n+B_n)K_{w,n}(I_n+B_n)^T + K_{s,n})}{\det K_{w,n}}$$
where the maximum is taken over all positive semidefinite matrices $K_{s,n}$ and all strictly lower triangular matrices $B_n$ satisfying
$$\mathrm{tr}(K_{s,n} + B_n K_{w,n} B_n^T) \le nP.$$
Similar to the nonfeedback case, if we assume stationarity of the process $\{W_i\}_{i=0}^{\infty}$, the perfect feedback (Shannon) capacity is characterized in [3] as
$$C_{fb} = \sup_{S_s, B} \frac{1}{4\pi}\int_{-\pi}^{\pi} \log\frac{S_s(e^{i\theta}) + |1+B(e^{i\theta})|^2 S_w(e^{i\theta})}{S_w(e^{i\theta})}\, d\theta \quad (1)$$
with power constraint
$$\frac{1}{2\pi}\int_{-\pi}^{\pi} S_s(e^{i\theta}) + |B(e^{i\theta})|^2 S_w(e^{i\theta})\, d\theta \le P. \quad (2)$$
Here $B(e^{i\theta})$ represents all possible strictly causal linear filters.
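The water-filling characterization above is straightforward to evaluate numerically: sample the spectrum on a uniform frequency grid and bisect on the water level $\lambda$. A minimal sketch follows; the function name and the example spectrum are illustrative choices of ours, not taken from [1] or [2].

```python
import numpy as np

def waterfill_capacity(S_w, P, tol=1e-9):
    """Bisect on the water level lambda so that
    (1/2pi) int max{0, lambda - S_w} dtheta = P, then return the
    nonfeedback capacity C = (1/4pi) int log2(max{S_w, lambda}/S_w) dtheta."""
    lo, hi = S_w.min(), S_w.max() + 2.0 * P   # bracket containing the water level
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        # Mean over a uniform grid on [-pi, pi) approximates (1/2pi) * integral.
        power = np.mean(np.maximum(0.0, lam - S_w))
        lo, hi = (lam, hi) if power < P else (lo, lam)
    lam = 0.5 * (lo + hi)
    C = 0.5 * np.mean(np.log2(np.maximum(S_w, lam) / S_w))
    return C, lam

# Example: spectrum of a first-order moving-average noise (as used in Section V).
theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
S_w = np.abs(1 + 0.1 * np.exp(1j * theta)) ** 2
print(waterfill_capacity(S_w, P=10.0))
```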
When there is an additive Gaussian noise feedback as shown in Fig. 1, no characterization of the capacity $C_{fb}^{noise}$ has been developed yet, to the present authors' knowledge. So far, only a few papers have addressed this problem or its variations. [4] and [5] apply the Cover-Pombra scheme to the noisy feedback case (colored Gaussian noise) and derive upper and lower bounds on its maximal achievable rate. Other works focus on the additive white Gaussian noise (AWGN) channel with AWGN feedback. For example, [6] derives upper and lower bounds on the reliability function and shows that noise in the feedback link renders noisy feedback communication fundamentally different from the perfect feedback case. [7], [8] and [9] propose specific coding/decoding schemes based on the notable Schalkwijk-Kailath scheme [10].

In this paper, we derive upper bounds on the n-block noisy feedback capacity and the noisy feedback (Shannon) capacity, respectively. It is shown that the problem of computing the derived upper bound on the n-block capacity can be transformed into a convex form, which can be solved efficiently by standard technical tools.

Notations: Uppercase and corresponding lowercase letters (e.g. $Y, Z, y, z$) denote random variables and realizations, respectively. $x^n$ represents the vector $[x_1, x_2, \cdots, x_n]^T$ and $x^0 = \emptyset$. $I_n$ represents an $n \times n$ identity matrix. $K_n > 0$ ($K_n \ge 0$) denotes that the $n \times n$ matrix $K_n$ is positive definite (semidefinite). $\log$ denotes the logarithm base 2 and $0\log 0 = 0$. The expectation operator over $X$ is presented as $E(X)$.

II. PRELIMINARIES

In this section, we review some definitions and lemmas in information theory.

Definition 1: [11] The mutual information $I(X;Y)$ between two random variables with joint density $f(x,y)$ is defined as
$$I(X;Y) = \int f(x,y)\log\frac{f(x,y)}{f(x)f(y)}\, dx\, dy.$$
Let $h(X)$ denote the differential entropy of a random variable $X$. Then it is clear that
$$I(X;Y) = h(Y) - h(Y|X).$$

We recall a useful lemma as follows.

Lemma 1: [11] Let the random vector $X \in \mathbb{R}^n$ have zero mean and covariance $K_{x,n} = EXX^T$ (i.e. $K_{ij} = EX_iX_j$, $1 \le i,j \le n$). Then
$$h(X) \le \frac{1}{2}\log(2\pi e)^n \det K_{x,n}$$
with equality if and only if $X \sim \mathcal{N}(0, K_{x,n})$.

Next, we present the definition of directed information given by Massey [12].

Definition 2: The directed information from a sequence $X^n$ to a sequence $Y^n$ is defined by
$$I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i | Y^{i-1}).$$
We would like to remark that Massey's definition of directed information implicitly restricts the time ordering of the random variables $(X^n, Y^n)$ as follows:
$$X_1, Y_1, X_2, Y_2, \cdots, X_n, Y_n. \quad (3)$$
We refer the interested readers to [13] for the definition of directed information for an arbitrary time ordering of random variables. We next define a channel code for communication channels with noisy feedback.

Definition 3: (Channel Code) A $(n, M, \epsilon_n)$ channel code over time horizon $n$ consists of an index set $\{1,2,3,\cdots,M\}$, an encoding function $e: \{1,2,\cdots,M\} \times \mathcal{Z}^{n-1} \to \mathcal{X}^n$, a decoding function $g: \mathcal{Y}^n \to \{1,2,\cdots,M\}$ and an error probability satisfying
$$\frac{1}{M}\sum_{w=1}^{M} p(w \ne g(y^n)\,|\,w) \le \epsilon_n$$
where $\lim_{n\to\infty}\epsilon_n = 0$.

We finally recall the Schur complement, which will play an important role in the paper.

Definition 4: [14] Consider an $n \times n$ symmetric matrix $X$ partitioned as
$$X = \begin{bmatrix} A & B \\ B^T & C \end{bmatrix}.$$
If $\det A \ne 0$, the matrix
$$S = C - B^T A^{-1} B$$
is called the Schur complement of $A$ in $X$.

We present some properties of the Schur complement as follows (a quick numerical check appears after the list).
1) $\det X = \det A \det S$.
2) $X > 0$ if and only if $A > 0$ and $S > 0$.
3) If $A > 0$, then $X \ge 0$ if and only if $S \ge 0$.
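These properties are easy to exercise numerically. The following minimal sketch is an illustrative check of ours, using a randomly generated positive definite matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
# Draw a random symmetric positive definite X and partition it as [[A, B], [B^T, C]].
G = rng.standard_normal((n + m, n + m))
X = G @ G.T + (n + m) * np.eye(n + m)
A, B, C = X[:n, :n], X[:n, n:], X[n:, n:]

S = C - B.T @ np.linalg.inv(A) @ B          # Schur complement of A in X

# Property 1: det X = det A * det S.
assert np.isclose(np.linalg.det(X), np.linalg.det(A) * np.linalg.det(S))
# Properties 2/3: positive (semi)definiteness, checked here via eigenvalues.
assert np.all(np.linalg.eigvalsh(A) > 0) and np.all(np.linalg.eigvalsh(S) > 0)
assert np.all(np.linalg.eigvalsh(X) > 0)
```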
III. AN UPPER BOUND ON THE N-BLOCK CAPACITY

In this section, we first derive an upper bound on the n-block noisy feedback capacity (Theorem 1). Without loss of generality, we characterize this upper bound as an optimization problem by adopting the Cover-Pombra scheme (Theorem 2). We then transform the optimization problem into a convex form (Corollary 1). The n-block noisy feedback capacity is defined as follows [1].

Definition 5:
$$C_{fb,n}^{noise} = \max_{\frac{1}{n}\mathrm{tr}(K_{X,n}) \le P} \frac{1}{n} I(M; Y^n). \quad (4)$$

We now define a new quantity and then prove that it is an upper bound on the above n-block capacity.

Definition 6:
$$\bar{C}_{fb,n}^{noise} = \max_{\frac{1}{n}\mathrm{tr}(K_{X,n}) \le P} \frac{1}{n} I(X^n \to Y^n | V^n). \quad (5)$$

Theorem 1: For a given power constraint $P$,
$$C_{fb,n}^{noise} \le \bar{C}_{fb,n}^{noise}. \quad (6)$$

Proof:
$$\begin{aligned}
I(M;Y^n) &= h(M) - h(M|Y^n)\\
&= h(M) - h(M|Y^n,V^n) - \big(h(M|Y^n) - h(M|Y^n,V^n)\big)\\
&\overset{(a)}{=} h(M|V^n) - h(M|Y^n,V^n) - I(M;V^n|Y^n)\\
&= I(M;Y^n|V^n) - I(M;V^n|Y^n)\\
&= h(Y^n|V^n) - h(Y^n|M,V^n) - I(M;V^n|Y^n)\\
&= \sum_{i=1}^{n} \big[h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},M,V^n)\big] - I(M;V^n|Y^n)\\
&\overset{(b)}{=} \sum_{i=1}^{n} \big[h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},M,V^n,X^i)\big] - I(M;V^n|Y^n)\\
&\overset{(c)}{=} \sum_{i=1}^{n} \big[h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},X^i,V^n)\big] - I(M;V^n|Y^n)\\
&= \sum_{i=1}^{n} I(X^i;Y_i|Y^{i-1},V^n) - I(M;V^n|Y^n)\\
&= I(X^n \to Y^n|V^n) - I(M;V^n|Y^n)
\end{aligned}$$
(a) follows from the fact that $M$ and $V^n$ are independent. (b) follows from the fact that $X^i$ can be determined by $M$ and the outputs of the feedback link (i.e. $Y^{i-1} + V^{i-1}$). (c) follows from the Markov chain $M - (Y^{i-1}, X^i, V^n) - Y_i$.

Since the conditional mutual information $I(M;V^n|Y^n) \ge 0$, we have
$$I(M;Y^n) \le I(X^n \to Y^n|V^n).$$
The proof is completed.

Next, we characterize the above upper bound. First of all, we consider a scheme with linear encoding of the feedback signal and Gaussian signaling of the message (Cover-Pombra scheme), shown in vector form in Fig. 2.

[Fig. 2. Gaussian channels with additive Gaussian noise feedback (Gaussian signalling and linear feedback)]

The channel input signal: $X^n = S^n + B_n(W^n + V^n)$
The channel output signal: $Y^n = S^n + B_n(W^n + V^n) + W^n$
The power constraint: $\mathrm{tr}(K_{s,n} + B_n(K_{w,n} + K_{v,n})B_n^T) \le nP$

where $S^n \sim \mathcal{N}(0, K_{s,n})$ is the message information vector and $B_n$ is an $n \times n$ strictly lower triangular linear encoding matrix. Note that the one-step delay in the feedback link is captured by the structure of the matrix $B_n$. The random variables $S^n, V^n, W^n$ are automatically assumed to be independent.
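One block of this scheme is simple to simulate. The sketch below is illustrative only, with hypothetical choices of $K_{s,n}$, $B_n$ and the noise covariances (an MA(1)-type $K_{w,n}$ anticipating Section V, and white feedback noise):

```python
import numpy as np

rng = np.random.default_rng(1)
n, P, a, sigma = 8, 10.0, 0.1, 0.5          # hypothetical block length and parameters

# MA(1)-style forward-noise covariance K_w and white feedback-noise covariance K_v.
K_w = (1 + a**2) * np.eye(n) + a * (np.eye(n, k=1) + np.eye(n, k=-1))
K_v = sigma * np.eye(n)

B = np.tril(0.1 * rng.standard_normal((n, n)), k=-1)  # strictly lower triangular
K_s = 0.5 * np.eye(n)                                 # message covariance (PSD)

# One block of the scheme: X^n = S^n + B_n (W^n + V^n), Y^n = X^n + W^n.
S = rng.multivariate_normal(np.zeros(n), K_s)
W = rng.multivariate_normal(np.zeros(n), K_w)
V = rng.multivariate_normal(np.zeros(n), K_v)
X = S + B @ (W + V)
Y = X + W

# Power constraint of the scheme: tr(K_s + B (K_w + K_v) B^T) <= nP.
print(np.trace(K_s + B @ (K_w + K_v) @ B.T) <= n * P)
```

The strict lower triangularity of `B` is what encodes the one-step feedback delay: row $i$ of $X^n$ only touches feedback-link outputs up to time $i-1$.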
In the following, we prove that $\bar{C}_{fb,n}^{noise}$ can be characterized by the above coding scheme without losing optimality.

Theorem 2: $\bar{C}_{fb,n}^{noise}$ can be obtained as the optimal objective value of the following optimization problem.
$$\begin{aligned}
\underset{B_n,\, K_{s,n}}{\text{maximize}}\quad & \frac{1}{2n}\log\frac{\det((I_n+B_n)K_{w,n}(I_n+B_n)^T + K_{s,n})}{\det K_{w,n}}\\
\text{subject to}\quad & \mathrm{tr}(K_{s,n} + B_n(K_{v,n}+K_{w,n})B_n^T) \le nP\\
& K_{s,n} \ge 0,\quad B_n \text{ strictly lower triangular}
\end{aligned} \quad (7)$$

Proof: First of all, we have
$$\begin{aligned}
I(X^n \to Y^n|V^n) &= \sum_{i=1}^{n} I(X^i;Y_i|Y^{i-1},V^n)\\
&= \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},X^i,V^n)\\
&\overset{(a)}{=} \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},X^i)\\
&= \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(X_i+W_i|Y^{i-1},X^i,W^{i-1})\\
&= \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(W_i|W^{i-1})\\
&= h(Y^n|V^n) - h(W^n)
\end{aligned}$$
where (a) follows from the Markov chain $V^n - (Y^{i-1}, X^i) - Y_i$.

We next show that maximizing $h(Y^n|V^n) - h(W^n)$ over the coding scheme shown in Fig. 2 does not lose optimality. Since we cannot affect the noise entropy (i.e. $h(W^n)$), we need to maximize $h(Y^n|V^n)$ over all possible channel inputs $\{X_i\}_{i=1}^n$. To begin with, we present two insightful observations on the channel inputs $X^n$:
1) $Y^n|V^n$ should be Gaussian for maximizing $h(Y^n|V^n)$. Since $W^n$ is Gaussian and $Y^n = X^n + W^n$, $X^n|V^n$ must be Gaussian.
2) $X^n$ depends on $W^n + V^n$ instead of $W^n$ and $V^n$ separately, since the channel outputs are fed back to the encoder without any encoding.
Therefore, the most general normal causal dependence of $X^n$ on $Y^n$ satisfying the above two observations is of the form $X^n = S^n + B_n(W^n + V^n)$. Then we have
$$\begin{aligned}
h(Y^n|V^n) - h(W^n) &= h(S^n + B_n(W^n+V^n) + W^n\,|\,V^n) - h(W^n)\\
&= h(S^n + (B_n+I_n)W^n\,|\,V^n) - h(W^n)\\
&= h(S^n + (B_n+I_n)W^n) - h(W^n)
\end{aligned}$$
By Lemma 1, the proof is complete.

Remark 1: The key idea of this theorem is to show
$$\bar{C}_{fb,n}^{noise} = \max_{\text{all coding schemes}} \frac{1}{n} I(X^n \to Y^n|V^n) = \max_{\text{Cover-Pombra scheme}} \frac{1}{n} I(X^n \to Y^n|V^n).$$
We would like to remark that the Cover-Pombra scheme may not be the optimal (capacity-achieving) coding scheme in the noisy feedback case. We adopt this coding scheme here because it nicely characterizes the proposed upper bound; it may not apply if we look at a different upper bound.

The following corollary shows that the optimization problem (7) can be transformed into a convex form. This result has been shown in [5]. For the reader's convenience, we give the proof again in the Appendix.

Corollary 1: $\bar{C}_{fb,n}^{noise}$ can be obtained as the optimal objective value of the following convex optimization problem.
$$\begin{aligned}
\underset{H_n,\, B_n}{\text{maximize}}\quad & \frac{1}{2n}\log\det\begin{bmatrix} K_{v,n}^{-1} & B_n^T \\ B_n & H_n \end{bmatrix} - \frac{1}{2n}\log\det(K_{v,n}^{-1}K_{w,n})\\
\text{subject to}\quad & \mathrm{tr}(H_n - K_{w,n}B_n^T - B_nK_{w,n} - K_{w,n}) \le nP\\
& \begin{bmatrix} H_n & I_n+B_n^T & B_n^T \\ I_n+B_n & K_{w,n}^{-1} & 0_n \\ B_n & 0_n & K_{v,n}^{-1} \end{bmatrix} \ge 0\\
& B_n \text{ strictly lower triangular}
\end{aligned}$$
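Corollary 1 is a determinant-maximization problem with linear matrix inequality constraints, so it can be handed to an off-the-shelf conic solver. The following sketch of ours uses cvxpy, a tool choice the paper does not prescribe; the covariances are the hypothetical ones from the previous snippet, and we explicitly symmetrize the block matrices so the modeling layer accepts them as symmetric:

```python
import cvxpy as cp
import numpy as np

n, P, a, sigma = 8, 10.0, 0.1, 0.5          # hypothetical problem data
K_w = (1 + a**2) * np.eye(n) + a * (np.eye(n, k=1) + np.eye(n, k=-1))
K_v = sigma * np.eye(n)
Kw_inv, Kv_inv = np.linalg.inv(K_w), np.linalg.inv(K_v)
Z, I = np.zeros((n, n)), np.eye(n)

H = cp.Variable((n, n), symmetric=True)
B = cp.Variable((n, n))

sym = lambda M: 0.5 * (M + M.T)             # symmetrize block expressions

M1 = cp.bmat([[Kv_inv, B.T], [B, H]])       # objective block matrix
M2 = cp.bmat([[H, I + B.T, B.T],
              [I + B, Kw_inv, Z],
              [B, Z, Kv_inv]])              # LMI equivalent to K_{s,n} >= 0

constraints = [cp.trace(H - K_w @ B.T - B @ K_w - K_w) <= n * P,
               sym(M2) >> 0]
# B_n strictly lower triangular (one-step feedback delay).
constraints += [B[i, j] == 0 for i in range(n) for j in range(n) if j >= i]

# The second objective term is constant, so maximize log det of M1 alone.
prob = cp.Problem(cp.Maximize(cp.log_det(sym(M1))), constraints)
prob.solve(solver=cp.SCS)

# Recover the bound in bits per transmission (cvxpy's log_det is in nats).
ub = (prob.value - np.linalg.slogdet(Kv_inv @ K_w)[1]) / (2 * n * np.log(2))
print(f"n-block upper bound (approx.): {ub:.4f} bits/transmission")
```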
IV. AN UPPER BOUND ON THE CAPACITY

As we have shown, the upper bound on the n-block capacity is numerically solvable due to its convex form (interior-point methods). As for the Shannon (infinite-block) capacity, however, there is still much work to be done. The problem is that formula (7) may not have a limit as $n \to \infty$, due to the time-varying nature of the noises $\{W_i\}$ and $\{V_i\}$. This is the main difficulty in developing an upper bound for the Shannon capacity. We handle this problem here by assuming stationarity of the noises $\{W_i\}$ and $\{V_i\}$. In this section, we first show that under the stationarity assumption the limit of formula (7) exists, and we develop its limit characterization. Then we prove that the limit characterization is an upper bound on the noisy feedback (Shannon) capacity.

Theorem 3: Assume that $\{W_i\}$ and $\{V_i\}$ are stationary processes. Then the limit of formula (7) exists and can be characterized as
$$\bar{C}_{fb}^{noise} = \sup_{S_s, B} \frac{1}{4\pi}\int_{-\pi}^{\pi} \log\frac{S_s(e^{i\theta}) + |1+B(e^{i\theta})|^2 S_w(e^{i\theta})}{S_w(e^{i\theta})}\, d\theta \quad (8)$$
with power constraint
$$\frac{1}{2\pi}\int_{-\pi}^{\pi} S_s(e^{i\theta}) + |B(e^{i\theta})|^2 \big(S_w(e^{i\theta}) + S_v(e^{i\theta})\big)\, d\theta \le P. \quad (9)$$
Here, $S_s(e^{i\theta})$, $S_w(e^{i\theta})$ and $S_v(e^{i\theta})$ are the power spectral densities of $\{S_i\}$, $\{W_i\}$ and $\{V_i\}$ respectively, and $B(e^{i\theta}) = \sum_{k=1}^{\infty} b_k e^{ik\theta}$ is a strictly causal linear filter.

The main idea of the proof is taken from [3]. We refer interested readers to the Appendix for the details. Next, we show that the above limit characterization is an upper bound on the noisy feedback (Shannon) capacity.

Theorem 4: Assume that $\{W_i\}$ and $\{V_i\}$ are stationary processes. Then $C_{fb}^{noise} \le \bar{C}_{fb}^{noise}$.

Proof:
$$\begin{aligned}
C_{fb}^{noise} &\le \limsup_{n\to\infty} \max_{\{X_i\}_{i=0}^{n}} \frac{1}{n} I(M;Y^n)\\
&\overset{(a)}{\le} \limsup_{n\to\infty} \max_{\{X_i\}_{i=0}^{n}} \frac{1}{n} I(X^n \to Y^n|V^n)\\
&\overset{(b)}{=} \limsup_{n\to\infty} \max_{B_n,K_{s,n}} \frac{1}{2n}\log\frac{\det((I_n+B_n)K_{w,n}(I_n+B_n)^T + K_{s,n})}{\det K_{w,n}}\\
&\overset{(c)}{=} \bar{C}_{fb}^{noise}
\end{aligned}$$
where (a) follows from Theorem 1, (b) follows from Theorem 2 and (c) follows from Theorem 3.

Remark 2: Compared with the perfect feedback capacity characterization (1) and (2), the feedback noise $S_v(e^{i\theta})$ only affects the power allocation. If the noise in the feedback link increases (i.e. $S_v(e^{i\theta})$ grows large in some sense), the feedback benefit in increasing the reliable transmission rate vanishes. That is, the noisy feedback system behaves like a nonfeedback system since, due to the power constraint, $B(e^{i\theta})$ approaches 0 as $S_v(e^{i\theta})$ grows.
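To make Remark 2 concrete, the objective (8) and the power (9) are cheap to evaluate on a frequency grid for any fixed filter. The sketch below is an illustration of ours, not from the paper: it fixes a hypothetical one-tap filter $B(e^{i\theta}) = b\,e^{i\theta}$, a flat message spectrum and white noises, and shows the power cost of the same filter inflating as the feedback-noise level grows (so, under a fixed power budget, $b$ would have to shrink toward 0). Note it only evaluates the expressions; attaining the supremum would require an outer optimization over $S_s$ and $B$.

```python
import numpy as np

theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
S_w = np.ones_like(theta)                  # flat (AWGN) forward-noise spectrum

def rate_and_power(b, S_s, S_v):
    """Evaluate objective (8) and power (9) for B(e^{i theta}) = b e^{i theta}."""
    B = b * np.exp(1j * theta)
    rate = 0.5 * np.mean(np.log2((S_s + np.abs(1 + B) ** 2 * S_w) / S_w))
    power = np.mean(S_s + np.abs(B) ** 2 * (S_w + S_v))
    return rate, power

for S_v in (0.0, 0.5, 2.0):                # growing feedback-noise level
    r, p = rate_and_power(b=0.3, S_s=9.0, S_v=S_v)
    print(f"S_v={S_v}: rate={r:.4f} bits, power used={p:.3f}")
```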
V. SIMULATION RESULTS

In this section, we show some simulation results to gain insight into the capacity of Gaussian channels with noisy feedback. The simulation results herein are taken from [5]. For the reader's convenience, we re-present some of them here and give a brief discussion; we refer the interested readers to [5] for more simulation results. We assume that the forward channel noise is created by a first-order moving average (1st-MV) Gaussian process. That is,
$$W_i = U_i + a U_{i-1}$$
where $U_i$ is a white Gaussian process with zero mean and unit variance. We also assume that the feedback link is corrupted by an additive white Gaussian noise with $K_{v,n} = \sigma I_n$ ($\sigma \ge 0$). Due to practical computation limits, we take coding block length $n = 30$ and power limit $P = 10$. We computed the upper bound on the n-block capacity derived in this paper and the lower bound (Theorem 2 in [5]) for averaging statistic $a = 0.1$ in the 1st-MV channel, as shown in Fig. 3. Generally, the plots show that the n-block capacity, which lies in the region between the upper and lower bounds, sharply decreases as $\sigma$ grows. When $\sigma$ grows large enough (e.g. $\sigma = 0.8$ in Fig. 3), the feedback rate-increasing enhancement almost shuts off and, thus, the feedback system behaves like a nonfeedback system. Based on this observation, we may claim that the n-block capacity of the Gaussian channel with noisy feedback is sensitive to the feedback noise.

[Fig. 3. The bounds on $C_{fb,n}^{noise}$ of the 1st-MV channel with $a = 0.1$: rate (approximately 1.725 to 1.775 bits) versus $\sigma$ (0 to 1.2), plotting the upper bound, the lower bound, the noiseless feedback capacity and the non-feedback capacity.]

VI. CONCLUSION

We have derived an upper bound on the n-block capacity of additive Gaussian channels with additive Gaussian noise feedback. As shown, this n-block upper bound can be obtained by solving a convex optimization problem. By assuming stationarity of the Gaussian noises, we have characterized the limit of the n-block upper bound, which is an upper bound on the noisy feedback (Shannon) capacity.

In [15], the authors showed that for strong converse finite-alphabet channels with noisy feedback, the capacity is characterized by
$$C_{FB}^{noise} = \sup_{X} \lim_{n\to\infty} \frac{1}{n} I(X^n \to Y^n|V^n).$$
Therefore, we conjecture that the upper bound characterized in this paper should be the true capacity. However, how to prove the achievability of this upper bound remains to be seen.

VII. APPENDIX

A. The proof of Corollary 1

Proof: Let $H_n = (I_n+B_n)K_{w,n}(I_n+B_n)^T + K_{s,n} + B_nK_{v,n}B_n^T$. We have
$$\frac{1}{2}\log\frac{\det((I_n+B_n)K_{w,n}(I_n+B_n)^T + K_{s,n})}{\det K_{w,n}} = \frac{1}{2}\log\frac{\det(H_n - B_nK_{v,n}B_n^T)}{\det K_{w,n}}.$$
We also have
$$\mathrm{tr}(K_{x,n}) \le nP \;\Leftrightarrow\; \mathrm{tr}(K_{s,n} + B_n(K_{v,n}+K_{w,n})B_n^T) \le nP \;\Leftrightarrow\; \mathrm{tr}(H_n - K_{w,n}B_n^T - B_nK_{w,n} - K_{w,n}) \le nP.$$
Next, we have the following equivalences by applying the Schur complement.
(1) $\det\begin{bmatrix} K_{v,n}^{-1} & B_n^T \\ B_n & H_n \end{bmatrix} = \det(H_n - B_nK_{v,n}B_n^T)\det K_{v,n}^{-1}$.
(2) $K_{s,n} \ge 0 \;\Leftrightarrow\; H_n - (I_n+B_n)K_{w,n}(I_n+B_n)^T - B_nK_{v,n}B_n^T \ge 0 \;\Leftrightarrow\; \begin{bmatrix} H_n & I_n+B_n^T & B_n^T \\ I_n+B_n & K_{w,n}^{-1} & 0_n \\ B_n & 0_n & K_{v,n}^{-1} \end{bmatrix} \ge 0$.
By making these simple replacements in the original formula, the proof is complete.
B. Proof of Theorem 3

Before showing the proof of the theorem, we need the following lemma.

Lemma 2: Consider the coding scheme shown in Fig. 2. Then
$$I(S^n;Y^n|V^n) = I(X^n \to Y^n|V^n).$$
Proof:
$$\begin{aligned}
I(S^n;Y^n|V^n) &= h(Y^n|V^n) - h(Y^n|S^n,V^n)\\
&= \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},S^n,V^n)\\
&\overset{(a)}{=} \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},S^n,V^n,X^i)\\
&\overset{(b)}{=} \sum_{i=1}^{n} h(Y_i|Y^{i-1},V^n) - h(Y_i|Y^{i-1},X^i,V^n)\\
&= \sum_{i=1}^{n} I(X^i;Y_i|Y^{i-1},V^n)\\
&= I(X^n \to Y^n|V^n)
\end{aligned}$$
(a) follows from the fact that $X^i$ can be determined by $S^i$ and the outputs of the feedback link (i.e. $Y^{i-1}+V^{i-1}$). (b) follows from the Markov chain $S^n - (Y^{i-1},X^i,V^n) - Y_i$.

Now, we are ready to give the proof of the theorem.

Proof: (sketch) Define $\tilde{C}_{fb}^{noise}$ as formula (8). By the Szegö-Kolmogorov-Krein theorem, we have
$$\tilde{C}_{fb}^{noise} = \sup_{\{X_i\}\ \text{stationary}} h(Y|V) - h(W)$$
where the supremum is taken over all stationary Gaussian processes $\{X_i\}_{i=0}^{\infty}$ of the form $X_i = S_i + \sum_{k=1}^{i} b_k(W_{i-k}+V_{i-k})$, where $\{S_i\}_{i=0}^{\infty}$ is stationary and independent of $(\{W_i\}_{i=0}^{\infty}, \{V_i\}_{i=0}^{\infty})$, such that $E[X_i^2] \le P$.

We first show that
$$\bar{C}_{fb,n}^{noise} \le \tilde{C}_{fb}^{noise} \quad (10)$$
for all $n$. Fix $n$ and assume $(K_{s,n}^*, B_n^*)$ achieves $\bar{C}_{fb,n}^{noise}$. Consider a block-wise white process $\{S_i\}$: the blocks $\{S_i\}_{i=kn+1}^{(k+1)n}$, $0 \le k < \infty$, are independent and identically distributed according to $\mathcal{N}(0, K_{s,n}^*)$. Then
$$\begin{aligned}
2n\bar{C}_{fb,n}^{noise} &= I(X_1^n \to Y_1^n|V_1^n) + I(X_{n+1}^{2n} \to Y_{n+1}^{2n}|V_{n+1}^{2n})\\
&\overset{(a)}{=} I(S_1^n;Y_1^n|V_1^n) + I(S_{n+1}^{2n};Y_{n+1}^{2n}|V_{n+1}^{2n})\\
&= h(S_1^n|V_1^n) + h(S_{n+1}^{2n}|V_{n+1}^{2n}) - h(S_1^n|Y_1^n,V_1^n) - h(S_{n+1}^{2n}|Y_{n+1}^{2n},V_{n+1}^{2n})\\
&= h(S_1^{2n}|V_1^{2n}) - h(S_1^n|Y_1^n,V_1^n) - h(S_{n+1}^{2n}|Y_{n+1}^{2n},V_{n+1}^{2n})\\
&\le h(S_1^{2n}|V_1^{2n}) - h(S_1^{2n}|Y_1^{2n},V_1^{2n})\\
&= I(S_1^{2n};Y_1^{2n}|V_1^{2n})\\
&\overset{(b)}{=} I(X_1^{2n} \to Y_1^{2n}|V_1^{2n})\\
&\overset{(c)}{=} h(Y_1^{2n}|V_1^{2n}) - h(W_1^{2n})
\end{aligned}$$
where (a) and (b) follow from Lemma 2, and (c) follows from the proof of Theorem 2. By repeating the same argument, we have
$$\bar{C}_{fb,n}^{noise} \le \frac{1}{kn}\big(h(Y_1^{kn}|V_1^{kn}) - h(W_1^{kn})\big)$$
for all $k$. Next, we use the same technical skill as [3] to show inequality (10). Define the time-shifted process $\{X_i(t)\}_{i=-\infty}^{\infty}$ where $X_i(t) = X_{i+t}$. Similarly define $\{Y_i(t)\}_{i=-\infty}^{\infty}$, $\{W_i(t)\}_{i=-\infty}^{\infty}$ and $\{V_i(t)\}_{i=-\infty}^{\infty}$. Introduce a random variable $T$, uniformly distributed over $\{1,2,3,\cdots,n\}$ and independent of everything else. Then it is easy to check that $\{X_i(T),Y_i(T),W_i(T),V_i(T)\}_{i=-\infty}^{\infty}$ is jointly stationary. Next, we define $\{\tilde{X}_i,\tilde{Y}_i,\tilde{W}_i,\tilde{V}_i\}_{i=-\infty}^{\infty}$ as a jointly Gaussian process with the same mean and autocorrelation as the stationary process $\{X_i(T),Y_i(T),W_i(T),V_i(T)\}_{i=-\infty}^{\infty}$. Thus,
$$\begin{aligned}
\bar{C}_{fb,n}^{noise} &\le \frac{1}{kn}\big(h(Y_1^{kn}(T)|V_1^{kn}(T),T) - h(W_1^{kn}(T)|T)\big)\\
&\overset{(a)}{=} \frac{1}{kn}\big(h(Y_1^{kn}(T)|V_1^{kn},T) - h(W_1^{kn})\big)\\
&\le \frac{1}{kn}\big(h(Y_1^{kn}(T)|V_1^{kn}) - h(W_1^{kn})\big)\\
&= \frac{1}{kn}\big(h(\tilde{Y}_1^{kn}|V_1^{kn}) - h(W_1^{kn})\big)
\end{aligned}$$
where (a) follows from the stationarity assumption on the noises $V$ and $W$. Taking $k \to \infty$, we obtain
$$\bar{C}_{fb,n}^{noise} \le h(\tilde{Y}|V) - h(W) \le \tilde{C}_{fb}^{noise}.$$
We now show the main idea of proving the other direction. Given $\epsilon > 0$, we let $\{\tilde{X}_i\}_{i=-\infty}^{\infty}$ achieve $\tilde{C}_{fb}^{noise} - \epsilon$. Define the corresponding channel outputs as $\{\tilde{Y}_i\}_{i=-\infty}^{\infty}$. Then,
$$\begin{aligned}
\liminf_{n\to\infty} \bar{C}_{fb,n}^{noise} &= \liminf_{n\to\infty} \max_{\{X_i\}_{i=1}^{n}} \frac{1}{n}\big(h(Y_1^n|V_1^n) - h(W_1^n)\big)\\
&\ge \liminf_{n\to\infty} \frac{1}{n}\big(h(\tilde{Y}_1^n|V_1^n) - h(W_1^n)\big)\\
&= \lim_{n\to\infty} \frac{1}{n}\big(h(\tilde{Y}_1^n|V_1^n) - h(W_1^n)\big)\\
&= h(\tilde{Y}|V) - h(W)\\
&= \tilde{C}_{fb}^{noise} - \epsilon
\end{aligned}$$
Taking $\epsilon \to 0$, we obtain
$$\liminf_{n\to\infty} \bar{C}_{fb,n}^{noise} \ge \tilde{C}_{fb}^{noise}.$$
The technical discussion on the power constraint is identical to that in [3], so we omit it here. Combined with inequality (10), we know that the limit of $\bar{C}_{fb,n}^{noise}$ exists and
$$\lim_{n\to\infty} \bar{C}_{fb,n}^{noise} = \tilde{C}_{fb}^{noise}.$$

REFERENCES

[1] T. M. Cover and S. Pombra, "Gaussian feedback capacity," IEEE Transactions on Information Theory, vol. 35, no. 1, pp. 37-43, 1989.
[2] C. E. Shannon, "Communication in the presence of noise," Proc. IRE, vol. 37, pp. 10-21, 1949.
[3] Y. H. Kim, "Feedback capacity of stationary Gaussian channels," IEEE Transactions on Information Theory, vol. 56, no. 1, pp. 57-85, 2010.
[4] Z. Chance and D. J. Love, "A noisy feedback encoding scheme for the Gaussian channel," IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 3482-3485, 2010.
[5] C. Li and N. Elia, "Bounds on the achievable rate of noisy feedback Gaussian channels under linear feedback coding scheme," IEEE International Symposium on Information Theory, pp. 71-75, 2011.
[6] Y. H. Kim, A. Lapidoth, and T. Weissman, "The Gaussian channel with noisy feedback," IEEE International Symposium on Information Theory, pp. 1416-1420, 2007.
[7] A. D. Wyner, "On digital communication over a discrete-time Gaussian channel with noisy feedback," The Bell System Technical Journal, pp. 3173-3186, Dec. 1969.
[8] N. C. Martins and T. Weissman, "Coding for additive white noise channels with feedback corrupted by quantization or bounded noise," IEEE Transactions on Information Theory, vol. 54, no. 9, pp. 4274-4282, Sep. 2008.
[9] Z. Chance and D. J. Love, "Concatenated coding for the AWGN channel with noisy feedback," Arxiv preprint arXiv:0909.0105, 2011.
[10] J. P. M. Schalkwijk and T. Kailath, "A coding scheme for additive noise channels with feedback I: No bandwidth constraint," IEEE Transactions on Information Theory, vol. IT-12, no. 2, pp. 172-182, 1966.
[11] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd edition. New York: Wiley, 2006.
[12] J. L. Massey, "Causality, feedback and directed information," Proc. of the IEEE Conf. on Decision and Control, 2002.
[13] S. Tatikonda and S. Mitter, "The capacity of channels with feedback," IEEE Transactions on Information Theory, vol. 55, no. 1, pp. 323-349, 2009.
[14] S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge: Cambridge University Press, 2004.
[15] C. Li and N. Elia, "The information flow and capacity of channels with noisy feedback," http://arxiv.org/PScache/arxiv/pdf/1108/1108.2815v1.pdf, 2011.
