On the Capacity of Symmetric Gaussian Interference Channels with Feedback

Lan V. Truong
Information Technology Specialization Department (ITS)
FPT University, Hanoi, Vietnam
E-mail: [email protected]

Hirosuke Yamamoto
Dept. of Complexity Science and Engineering
The University of Tokyo, Japan
E-mail: [email protected]

Abstract—In this paper, we propose a new coding scheme for symmetric Gaussian interference channels with feedback based on the ideas of time-varying coding schemes. The proposed scheme improves the Suh-Tse and Kramer inner bounds of the channel capacity for the cases of weak and not very strong interference. This improvement is more significant when the signal-to-noise ratio (SNR) is not very high. It is shown theoretically and numerically that our coding scheme can outperform the Kramer code. In addition, the generalized degrees-of-freedom of our proposed coding scheme equals that of the Suh-Tse scheme in the strong interference case. The numerical results show that our coding scheme attains better performance than the Suh-Tse coding scheme for all channel parameters. Furthermore, the simplicity of the encoding/decoding algorithms is another strong point of our proposed coding scheme compared with the Suh-Tse coding scheme. More importantly, our results show that an optimal coding scheme for the symmetric Gaussian interference channel with feedback can be achieved by using only marginal posterior distributions under a better cooperation strategy between transmitters.

Index Terms—Gaussian Interference Channel, Feedback, Posterior Matching, Iterated Function Systems.

Figure 1. Symmetric rate comparison at high SNR.
Figure 2. Symmetric rate comparison at low SNR.

I. INTRODUCTION

The capacity of the interference channel with feedback has remained unknown for many decades, although there has been some progress toward solving this problem. Kramer developed a feedback strategy and derived an outer bound for the Gaussian interference channel with feedback [6], [9]. However, the gap between the outer and inner bounds grows unboundedly as the signal-to-noise ratio (SNR) and the interference-to-noise ratio (INR) increase. Furthermore, Suh and Tse [1], [2] characterized the capacity region within 2 bits/s/Hz and the symmetric capacity within 1 bit/s/Hz for the two-user Gaussian interference channel with feedback. They also showed that feedback provides a multiplicative gain at high SNR. However, their coding scheme does not work well when the SNR is close to the INR: its symmetric rate falls below that of the Kramer code in this regime. In addition, it also performs worse than the Kramer code when α = log INR / log SNR is not very large and the SNR is low (cf. Figs. 1-2 of this paper or Fig. 14 in [2]). Later, the Suh-Tse coding scheme was extended to M-user Gaussian interference channels with feedback for M ≥ 3 [7].

In this paper, we propose a coding scheme which achieves better performance than the Suh-Tse code when α = log INR / log SNR is not very large (see Figs. 1-2 of this paper). In addition, our code attains a better symmetric rate than the Kramer code for all channel parameters, and therefore it overcomes the weak points of the Suh-Tse coding scheme and improves the Suh-Tse and Kramer inner bounds. For the strong interference case, our code achieves the same generalized degrees-of-freedom as the Suh-Tse coding scheme. Furthermore, numerical results show that our coding scheme performs at least as well as the Suh-Tse coding scheme for all channel parameters.

II. CHANNEL MODEL

We consider the Gaussian interference channel shown in Fig. 3, which has two senders and two receivers.
Sender 1 sends a source of information message points Θ_1, which is uniformly distributed in (0,1), to receiver 1. Sender 2 sends a source of information message points Θ_2, which is also uniformly distributed in (0,1), to receiver 2. We assume that Θ_1 is independent of Θ_2 and that each channel interferes with the other. Specifically, we assume that

Y_1 = X_1 + a X_2 + Z_1,
Y_2 = X_2 + a X_1 + Z_2,

where Z_1 ~ N(0, σ_1^2) and Z_2 ~ N(0, σ_2^2) are Gaussian noise random variables, and the input power constraints are P_1 and P_2, respectively. In this paper, we consider the symmetric interference channel such that P_1 = P_2 = P and σ_1^2 = σ_2^2 = 1. Hence, the signal-to-noise ratio and the interference-to-noise ratio capture the channel gains: SNR = P and INR = a^2 P. We also assume that the output symbols are causally fed back to the corresponding sender, so that the transmitted symbol X_n^(m) at time n can depend on both the message Θ_m and the previous channel output sequence Y^(n-1,m) := (Y_1^(m), Y_2^(m), ..., Y_{n-1}^(m)) for m ∈ {1,2}.

Figure 3. Gaussian interference channel with feedback.

A transmission scheme for the two-user Gaussian interference channel with feedback is a sequence of measurable functions {g_n^(m) : (0,1) × R^{n-1} → R}_{n=1}^∞, m ∈ {1,2}, such that the input to the channel generated by transmitter m is given by

X_n^(m) = g_n^(m)(Θ_m, Y^(n-1,m)).

A decoding rule for the two-user Gaussian interference channel with feedback is a sequence of measurable mappings {Δ_n^(m) : R^n → E}_{n=1}^∞, m ∈ {1,2}, where E is the set of all open intervals in (0,1) and Δ_n^(m)(y^(n,m)) refers to the decoded interval at receiver m. The error probability at time n associated with a transmission scheme and a decoding rule is defined as

p_n^(m)(e) := P(Θ_m ∉ Δ_n^(m)(Y^(n,m))),  m ∈ {1,2},

and the corresponding rate pair (R_n^(1), R_n^(2)) at time n is defined by

R_n^(m) := -(1/n) log |Δ_n^(m)(Y^(n,m))|.

We say that a transmission scheme together with a decoding rule achieves a rate pair (R_1, R_2) over a Gaussian interference channel if for m ∈ {1,2} we have

lim_{n→∞} P(R_n^(m) < R_m) = 0,  lim_{n→∞} p_n^(m)(e) = 0.   (1)

The rate pair is achieved within input power constraints P_1, P_2 if the following is satisfied:

limsup_{n→∞} (1/n) Σ_{k=1}^{n} E[X_k^(m)]^2 ≤ P_m,  m ∈ {1,2}.   (2)

The symmetric capacity is defined by C_sym := sup{R : (R, R) is achievable}.

An optimal fixed-rate decoding rule for the two-user Gaussian interference channel with feedback at rate pair (R_1, R_2) is the one that decodes a pair of fixed-length intervals {(J_1, J_2) : |J_m| = 2^{-nR_m} for m ∈ {1,2}} maximizing the posterior probabilities, i.e.,

Δ_n^(m)(y^(n,m)) = argmax_{J_m ∈ E : |J_m| = 2^{-nR_m}} P_{Θ_m|Y^n}(J_m | y^(n,m)).

It is easy to see that the optimal fixed-rate decoding rule for the Gaussian interference channel with feedback is the traditional MAP (MMSE) decoding rule.

An optimal variable-rate decoding rule with target error probabilities p_e^(m)(n) = δ_n^(m) is the one that decodes a pair of minimal-length intervals (J_1, J_2) whose accumulated marginal posterior probabilities exceed the corresponding targets, i.e.,

Δ_n^(m)(y^(n,m)) = argmin_{J_m ∈ E : P_{Θ_m|Y^n}(J_m | y^(n,m)) ≥ 1-δ_n^(m)} |J_m|.

Both decoding rules use the marginal posterior distribution of the message point, P_{Θ_m|Y^n}, which can be calculated online at the transmitters and the receivers. Refer to [6] for more details.

Lemma 1: Achievability in the sense of (1) and (2) implies achievability in the standard framework.

Proof: See the detailed proofs in [3], [4], and [5].

Notations: The cumulative distribution function (c.d.f.) of a random variable X is given by F_X(x) = P_X((-∞, x]), and its inverse c.d.f. is defined as F_X^{-1}(t) := inf{x : F_X(x) > t}. The composition of functions is defined by (f ∘ g)(x) = f(g(x)), and the sign function is defined as sgn(x) := 1 if x ≥ 0 and sgn(x) := -1 if x < 0. We also use (x)^+ := max(x, 0) and log^+(x) := max(log x, 0).
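To make the channel model above concrete, the following short Python sketch (not part of the original paper; the function and variable names are our own) simulates n independent uses of the symmetric channel with unit-variance noise and i.i.d. N(0, P) inputs, and checks the empirical power constraint (2).

    import numpy as np

    def simulate_channel(a, P, n, seed=0):
        """Simulate n uses of Y1 = X1 + a*X2 + Z1, Y2 = X2 + a*X1 + Z2
        with unit-variance noise and i.i.d. N(0, P) inputs (no feedback used here)."""
        rng = np.random.default_rng(seed)
        X1 = rng.normal(0.0, np.sqrt(P), n)
        X2 = rng.normal(0.0, np.sqrt(P), n)
        Y1 = X1 + a * X2 + rng.normal(0.0, 1.0, n)
        Y2 = X2 + a * X1 + rng.normal(0.0, 1.0, n)
        return X1, X2, Y1, Y2

    if __name__ == "__main__":
        a, P, n = 0.5, 10.0, 100000
        X1, X2, Y1, Y2 = simulate_channel(a, P, n)
        print("empirical (1/n) sum X1_k^2 =", np.mean(X1 ** 2))  # should be close to P
        print("SNR = P =", P, ", INR = a^2 P =", a * a * P)

With feedback, the inputs are of course no longer i.i.d.; the sketch after Section III illustrates how the proposed scheme generates them recursively from the fed-back outputs.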
III. A CODING SCHEME FOR GAUSSIAN INTERFERENCE CHANNELS WITH FEEDBACK

In this section, we propose a time-varying coding scheme for the symmetric Gaussian interference channel with feedback as follows.

A. Encoding

• Step 1: Transmitter m sends X_1^(m) = F_X^{-1}(Θ_m), where m ∈ {1,2} and X ~ N(0, P_1) for some P_1 > 0. We also set

ρ_1 := E[F_X^{-1}(Θ_1) F_X^{-1}(Θ_2)] / P_1 = 0.

• Step n+1 for n ≥ 1: Both transmitters estimate

ρ_{n+1} = (1/β_n^2) { ρ_n − 2 b_n sgn(ρ_n)(|ρ_n| + |a|) + b_n^2 sgn(ρ_n)[|ρ_n|(1 + |a|^2) + 2|a|] }.

Transmitter 1 sends X_{n+1}^(1) sgn(ρ_{n+1}), where

X_{n+1}^(1) := (1/β_n)(X_n^(1) − b_n sgn(ρ_n) Y_n^(1)).

Transmitter 2 sends X_{n+1}^(2) sgn(a), where

X_{n+1}^(2) := (1/β_n)(X_n^(2) − b_n sgn(a) Y_n^(2)).

Receiver 1 receives

Y_{n+1}^(1) = X_{n+1}^(1) sgn(ρ_{n+1}) + |a| X_{n+1}^(2) + Z_{n+1}^(1).

Receiver 2 receives

Y_{n+1}^(2) = sgn(a) X_{n+1}^(2) + a sgn(ρ_{n+1}) X_{n+1}^(1) + Z_{n+1}^(2).

Both receivers feed back their received signals to the corresponding transmitters.

Here, (β_n > 0, b_n) should be chosen to satisfy the following constraints:

limsup_{N→∞} (1/N) Σ_{n=1}^{N} E[X_n^(m)]^2 ≤ P,  ∀ m ∈ {1,2}.   (3)

B. Decoding

• At each time slot n, receiver m ∈ {1,2} selects a fixed interval J_1^(m) = (s_m, t_m) ⊂ R as the decoded interval with respect to X_{n+1}^(m).

• Then, it sets the decoded interval J_n^(m) = (T_n^(m)(s_m), T_n^(m)(t_m)) as the decoded interval with respect to X_1^(m), where

T_n^(m)(s) := w_1^(m) ∘ w_2^(m) ∘ ... ∘ w_n^(m)(s),  ∀ s ∈ R,

and

w_n^(1)(x) := β_n x + b_n sgn(ρ_n) Y_n^(1),
w_n^(2)(x) := β_n x + b_n sgn(a) Y_n^(2).

• Receiver m sets the decoded interval for the message Θ_m as follows:

Δ_n^(m)(Y^(n,m)) = F_{X_m}(J_n^(m)).

We call this coding strategy the Gaussian interference time-varying feedback coding strategy; its decoder is an optimal variable-rate decoding rule with doubly exponential decay of the targeted error probabilities (see the proof of Lemma 2 in this paper).
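The encoding and decoding steps above are simple enough to be written out directly. The following Python sketch (our own illustration, not taken from the paper; all names are ours) runs the scheme with constant parameters b_n = b and β_n = β, tracks the composed affine map T_n at receiver 1, and reports whether Θ_1 falls in the decoded interval together with the empirical rate.

    import numpy as np
    from statistics import NormalDist

    def sgn(x):                      # sign convention of the paper: sgn(0) = 1
        return 1.0 if x >= 0 else -1.0

    def run_time_varying_scheme(theta1, theta2, a, P1, b, beta, n_steps, delta, seed=0):
        """Sketch of Section III with constant parameters b_n = b, beta_n = beta.
        Returns (success of receiver 1, empirical rate in bits per channel use)."""
        rng = np.random.default_rng(seed)
        G = NormalDist(0.0, np.sqrt(P1))                 # X ~ N(0, P1)
        X1, X2 = G.inv_cdf(theta1), G.inv_cdf(theta2)    # Step 1: map message points
        rho = 0.0
        A, C = 1.0, 0.0   # receiver 1 tracks T_n = w_1 o ... o w_n as T_n(s) = A*s + C
        for _ in range(n_steps):
            S1, S2 = X1 * sgn(rho), X2 * sgn(a)          # transmitted signals
            Y1 = S1 + a * S2 + rng.normal()
            Y2 = S2 + a * S1 + rng.normal()
            A, C = A * beta, A * (b * sgn(rho) * Y1) + C # compose w_n(x) = beta*x + b*sgn(rho)*Y1
            rho_next = (rho - 2 * b * sgn(rho) * (abs(rho) + abs(a))
                        + b * b * sgn(rho) * (abs(rho) * (1 + a * a) + 2 * abs(a))) / beta ** 2
            X1 = (X1 - b * sgn(rho) * Y1) / beta         # encoder recursions (Step n+1)
            X2 = (X2 - b * sgn(a) * Y2) / beta
            rho = rho_next
        lo, hi = A * (-delta / 2) + C, A * (delta / 2) + C   # J_n = (T_n(-delta/2), T_n(delta/2))
        success = lo < G.inv_cdf(theta1) < hi            # theta1 in F_X(J_n) iff X_1^(1) in J_n
        width = G.pdf(0.5 * (lo + hi)) * (hi - lo)       # ~ length of the decoded message interval
        rate = -np.log2(max(width, 1e-300)) / n_steps
        return success, rate

    if __name__ == "__main__":
        # No-interference case a = 0, where b = P/(P+1) and beta = 1/sqrt(P+1)
        # solve the fixed-point equations of Section IV (cf. Remark 1).
        P = 10.0
        b, beta = P / (P + 1), 1 / np.sqrt(P + 1)
        print(run_time_varying_scheme(0.3, 0.7, a=0.0, P1=P, b=b, beta=beta,
                                      n_steps=30, delta=20 * np.sqrt(P)))

For a ≠ 0 the parameters (b, β) come from Theorem 1 below. Here the interval half-width is simply set to a few standard deviations of X; in the proof of Lemma 2 it grows with n.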
IV. A NEW ACHIEVABLE RATE REGION

Lemma 2: Under the condition that 0 < limsup_{n→∞} β_n < 1, the time-varying coding scheme for the symmetric Gaussian interference channel with feedback achieves the following symmetric rate:

R_sym = −limsup_{n→∞} log β_n  (bits/channel use).

Proof: We provide a sketch of the proof; the detailed one can be found in [3], [4], and [5].

• Define β := limsup_{n→∞} β_n. It is easy to see that R_sym = log β^{-1}.
• For any R < R_sym, we can find an ε > 0 such that R < log(β + ε)^{-1}.
• Choose an N_ε ∈ N such that sup_{n ≥ N_ε} β_n < β + ε.
• Using the law of iterated expectations (Fubini's theorem), we can show that

P(R_n^(m) < R) ≤ A_ε 2^{nR} (β + ε)^{n − N_ε} |J_1^(m)|.

• We can also show that

p_n^(m)(e) = O( exp( −|J_1^(m)|^2 / (8P) ) ).

• By choosing |J_1^(m)| = o(2^{n(log(β+ε)^{-1} − R)}) ≈ o(2^{n(R_sym − R)}) and t_m = −s_m = |J_1^(m)|/2, the aforementioned condition (1) is satisfied. Here, the symbols O(x) and o(x) are Landau symbols. Besides, choosing the sequences (b_n, β_n) according to rule (3) ensures that the input power constraints are also satisfied.

Theorem 1: The non-degraded symmetric Gaussian interference channel with feedback (a ≠ 0) can achieve the following symmetric rate:

R_sym (bits/channel use) = (1/2) max_{ρ ∈ [0, ρ_0], b ∈ {b_1*, b_2*}} log [ P / ( P + b^2[1 + P + |a|^2 P + 2|a|Pρ] − 2bP[1 + |a|ρ] ) ],

where

0 < ρ_0 := sqrt( (a^2 P^2 + P − sqrt(P[2 a^2 P^2 + P])) / (a^2 P^2) ) < 1,

and

b*_{1,2} = ( 2Pρ + |a|P + |a|Pρ^2 ± sqrt( P^2 |a|^2 ρ^4 − 2ρ^2(|a|^2 P^2 + P) + |a|^2 P^2 ) ) / ( 2|a|P + 2Pρ + 2|a|^2 Pρ + ρ + 2|a|Pρ^2 ).

Proof: From our transmission strategy, we have

X_{n+1}^(1) = (1/β_n)(X_n^(1) − b_n sgn(ρ_n) Y_n^(1)),   (4)
X_{n+1}^(2) = (1/β_n)(X_n^(2) − b_n sgn(a) Y_n^(2)).   (5)

Denote

P_n := E[X_n^(1)]^2 = E[X_n^(2)]^2,   ρ_n := E[X_n^(1) X_n^(2)] / P_n

(one can show by induction that E[X_n^(1)]^2 = E[X_n^(2)]^2 for all n). Therefore, it is easy to see that

E[X_n^(2) Y_n^(1)] = P_n (|ρ_n| + |a|),
E[X_n^(1) Y_n^(2)] = P_n sgn(a) sgn(ρ_n)(|ρ_n| + |a|),
E[Y_n^(1) Y_n^(2)] = P_n sgn(a) [ |ρ_n|(1 + |a|^2) + 2|a| ].

Note that

E[X_{n+1}^(1) X_{n+1}^(2)] = (1/β_n^2) { E[X_n^(1) X_n^(2)] − b_n sgn(ρ_n) E[X_n^(2) Y_n^(1)] − b_n sgn(a) E[X_n^(1) Y_n^(2)] + b_n^2 sgn(ρ_n) sgn(a) E[Y_n^(1) Y_n^(2)] }.

Finally, we obtain

P_{n+1} ρ_{n+1} = P_n sgn(ρ_n) (1/β_n^2) { |ρ_n| − 2 b_n (|ρ_n| + |a|) + b_n^2 [ |ρ_n|(1 + |a|^2) + 2|a| ] }.   (6)

Similarly, observe that

E[X_n^(1) Y_n^(1)] = P_n sgn(ρ_n) [1 + |a||ρ_n|],
E[X_n^(2) Y_n^(2)] = P_n sgn(a) [1 + |a||ρ_n|],
E[Y_n^(1)]^2 = E[Y_n^(2)]^2 = 1 + P_n + a^2 P_n + 2|a||ρ_n| P_n.

From relations (4) and (5) we have

P_{n+1} = (1/β_n^2) { P_n − 2 P_n b_n [1 + |a||ρ_n|] + b_n^2 [1 + P_n + a^2 P_n + 2|a||ρ_n| P_n] }.   (7)

From (6) and (7), if we can force P_n → P, b_n → b, β_n → β and |ρ_n| → ρ ∈ [0,1], we obtain the following equations with three unknowns (b, ρ, β):

P = (1/β^2) [ P − 2bP(1 + |a|ρ) + b^2(1 + P + a^2 P + 2|a|ρP) ],   (8a)
−ρ = (1/β^2) [ ρ − 2b(ρ + |a|) + b^2( ρ(1 + |a|^2) + 2|a| ) ].   (8b)

By eliminating β and considering ρ as a running variable, we obtain the following quadratic equation in b for each fixed choice of ρ:

b^2 [2|a|P + 2Pρ + 2|a|^2 Pρ + ρ + 2|a|Pρ^2] − 2b [2Pρ + |a|P + |a|Pρ^2] + 2Pρ = 0.   (9)

After some simple derivations, the discriminant of this quadratic equation is given by

Δ = P^2 |a|^2 ρ^4 − 2ρ^2(|a|^2 P^2 + P) + |a|^2 P^2 =: f(ρ).   (10)

Since f(0) = |a|^2 P^2 > 0 and f(1) = −2P < 0, there exists a minimum value ρ_0 ∈ (0,1) such that f(ρ_0) = 0. Specifically, ρ_0 is given by

0 < ρ_0 = sqrt( (|a|^2 P^2 + P − sqrt(P[2 a^2 P^2 + P])) / (a^2 P^2) ) < 1.

On the other hand, since the derivative of f(ρ) satisfies

f'(ρ) = 4 P^2 |a|^2 (ρ^3 − ρ) − 4Pρ ≤ 0

for all ρ ∈ [0,1), we have Δ = f(ρ) ≥ 0 for all ρ ∈ [0, ρ_0]. For all these values of ρ, we can easily show that (9) has two positive solutions b_1*, b_2* as given in the theorem statement. As a result, (8a) and (8b) have at least one solution (b, β) for each fixed ρ ∈ [0, ρ_0]. Therefore, if we force all the sequences (b_n, β_n, P_n, |ρ_n|) to converge to (b, β, P, ρ), the symmetric Gaussian interference channel with feedback can achieve the following rate by Lemma 2:

R_sym = ( −log( min_{ρ ∈ [0, ρ_0]} β ) )^+ = (1/2) max_{ρ ∈ [0, ρ_0], b ∈ {b_1*, b_2*}} log^+ [ P / ( P + b^2[1 + P + |a|^2 P + 2|a|Pρ] − 2bP[1 + |a|ρ] ) ].

Note that since Lemma 2 holds only for 0 < β < 1, the superscript + is necessary to deal with general cases.

To complete the proof, we show a procedure that forces |ρ_n| = ρ (with ρ_n = (−1)^n ρ) for any ρ ∈ [0, ρ_0], and b_n = b, P_n = P, β_n = β for any n ≥ 2. Indeed, from (6) and (7), we first force P_2 = P and ρ_2 = ρ by choosing (b_1, P_1, β_1) such that

Pρ = (P_1/β_1^2) · 2|a|(b_1^2 − b_1),   (11)
P = (1/β_1^2) { P_1 − 2 b_1 P_1 + b_1^2 (1 + P_1 + a^2 P_1) }.   (12)

This procedure is feasible because (11) and (12) have at least one solution, which is a triplet (b_1, P_1 > 0, β_1 > 0), for each ρ ∈ [0, ρ_0]. Indeed,

• For ρ = 0, we can choose b_1 = 0, P_1 = P, β_1 = 1.
• For ρ ≠ 0, from (11) and (12) we obtain the following quadratic equation in b_1:

b_1^2 [ (1 + P_1 + a^2 P_1)ρ − 2|a|P_1 ] − 2(ρ − |a|) P_1 b_1 + P_1 ρ = 0.   (13)

The discriminant of this quadratic equation can easily be shown to be equal to E(ρ) := a^2(1 − ρ^2) P_1^2 − P_1 ρ^2. For the case ρ ≠ |a|, we can choose P_1 such that this discriminant is equal to zero by setting P_1 = ρ^2 / (a^2(1 − ρ^2)). With this choice of P_1, we obtain

b_1 = (ρ − |a|) P_1 / [ (1 + P_1 + a^2 P_1)ρ − 2|a|P_1 ].

In order for (11) and (12) to have a solution β_1, we need to show that the above choices of P_1, b_1 satisfy b_1^2 − b_1 > 0. Clearly, for ρ < |a| this requirement is satisfied by noting that b_1 < 0, since ρ − |a| < 0 and

ρ > 2|a|ρ^2 / (a^2 + ρ^2) = 2|a|P_1 / (1 + P_1 + a^2 P_1).

For ρ > |a| (in which case |a| < 1, of course), observe that

P_1 = ρ^2 / (a^2(1 − ρ^2)) > ρ / (|a| − a^2 ρ).

It follows that (ρ − |a|) P_1 > (1 + P_1 + a^2 P_1)ρ − 2|a|P_1 > 0, or b_1 > 1, which also means that b_1^2 − b_1 > 0. For the case ρ = |a|, any choice of P_1 > 1/(1 − a^2) works, since then E(|a|) > 0; moreover, by Vieta's formulas the two solutions of the quadratic equation (13) in b_1 sum to zero and have a nonzero product, so at least one of them satisfies b_1 < 0 and hence b_1^2 − b_1 > 0.

Finally, we only need to set β_n = β and b_n = b, where (b, β) is a solution of (8a) and (8b) for the chosen ρ ∈ [0, ρ_0], for all n ≥ 2; then P_n = P and |ρ_n| = ρ for all n ≥ 2. Here b should be chosen to minimize β for each choice of ρ ∈ [0, ρ_0] in order to maximize the achievable symmetric rate of our coding scheme. Last but not least, we can show that our code has better performance than the Kramer code [6] (see Corollary 1), and therefore the superscript + can be dropped from the achievable symmetric rate formula.
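As a numerical sanity check on the fixed-point argument in the proof above, the following Python sketch (our own illustration; not from the paper) picks a value ρ ∈ (0, ρ_0), computes the two roots b_1*, b_2* of (9), recovers β from (8a), and verifies that (8b) holds and that one step of the recursions (6)-(7) maps (P_n, ρ_n) = (P, ρ) to (P, −ρ), i.e., |ρ_n| stays at ρ with alternating sign.

    import numpy as np

    def sgn(x):
        return 1.0 if x >= 0 else -1.0

    def b_roots(a, P, rho):
        """The two roots b1*, b2* of the quadratic (9) for fixed rho (requires a != 0)."""
        A = 2 * abs(a) * P + 2 * P * rho + 2 * a * a * P * rho + rho + 2 * abs(a) * P * rho ** 2
        B = 2 * P * rho + abs(a) * P + abs(a) * P * rho ** 2
        f = P * P * a * a * rho ** 4 - 2 * rho ** 2 * (a * a * P * P + P) + a * a * P * P  # (10)
        d = np.sqrt(max(f, 0.0))
        return (B + d) / A, (B - d) / A

    def beta_from_8a(a, P, rho, b):
        """beta from (8a): beta^2 P = P - 2bP(1+|a|rho) + b^2(1+P+a^2 P+2|a|rho P)."""
        return np.sqrt((P - 2 * b * P * (1 + abs(a) * rho)
                        + b * b * (1 + P + a * a * P + 2 * abs(a) * rho * P)) / P)

    def one_step(a, P_n, rho_n, b, beta):
        """One step of the recursions (7) and (6); returns (P_{n+1}, rho_{n+1})."""
        r = abs(rho_n)
        P_next = (P_n - 2 * P_n * b * (1 + abs(a) * r)
                  + b * b * (1 + P_n + a * a * P_n + 2 * abs(a) * r * P_n)) / beta ** 2
        Prho_next = P_n * sgn(rho_n) * (r - 2 * b * (r + abs(a))
                                        + b * b * (r * (1 + a * a) + 2 * abs(a))) / beta ** 2
        return P_next, Prho_next / P_next

    if __name__ == "__main__":
        a, P = 0.8, 20.0
        rho0 = np.sqrt((a * a * P * P + P - np.sqrt(P * (2 * a * a * P * P + P))) / (a * a * P * P))
        rho = 0.5 * rho0
        for b in b_roots(a, P, rho):
            beta = beta_from_8a(a, P, rho, b)
            res_8b = -rho * beta ** 2 - (rho - 2 * b * (rho + abs(a))
                                         + b * b * (rho * (1 + a * a) + 2 * abs(a)))
            P_next, rho_next = one_step(a, P, rho, b, beta)
            print(f"b={b:.4f} beta={beta:.4f} rate={-np.log2(beta):.4f} bits/use "
                  f"(8b) residual={res_8b:.1e} one step: P->{P_next:.4f}, rho->{rho_next:.4f}")

Theorem 1 then takes the better of the two roots and the best ρ ∈ [0, ρ_0]; a full sweep over ρ is sketched after Section V.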
Remark 1: For the degraded Gaussian interference channel with feedback (a = 0), (8a) and (8b) become

−ρ = (1/β^2) ρ (b − 1)^2,   P = (1/β^2) [ P(b − 1)^2 + b^2 ].   (14)

From (14), we must have ρ = 0, and the achievable symmetric rate becomes

R_sym = (1/2) max_{b ∈ R} log [ P / (P(b − 1)^2 + b^2) ] = (1/2) log(1 + P).

This result coincides with the well-known capacity of this channel with no interference.

Corollary 1: The proposed time-varying code outperforms the Kramer code for all channel parameters.

Proof: A variant of the Kramer code is constructed by choosing the triplet (b, β, ρ), which is a solution of (8a) and (8b), as follows:

b = P(1 + |a|ρ) / ( P(1 + a^2 + 2|a|ρ) + 1 ),   β = sqrt( (a^2 P(1 − ρ^2) + 1) / (P(1 + a^2 + 2|a|ρ) + 1) ),

where ρ is the unique solution in (0,1) of the equation

2|a|^3 P^2 ρ^4 + a^2 P ρ^3 − 4|a|P(a^2 P + 1) ρ^2 − (2 a^2 P + P + 2) ρ + 2|a|P(a^2 P + 1) = 0.   (15)

(See also [2], [6], [9].) Since this triplet corresponds to a single admissible choice of ρ within the optimization of Theorem 1, the Kramer code is therefore inferior (or at best equal) to the proposed code in this paper.
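The construction in the proof of Corollary 1 can also be checked numerically. The Python sketch below (our own illustration; names are ours) finds the root ρ of the quartic (15) in (0,1), forms the corresponding pair (b, β), verifies the residuals of (8a) and (8b), and reports the symmetric rate −log2(β) of this Kramer-code variant.

    import numpy as np

    def kramer_variant(a, P):
        """Triplet (b, beta, rho) of the Kramer-code variant used in Corollary 1 (a != 0)."""
        aa = abs(a)
        # Quartic (15): 2|a|^3 P^2 r^4 + a^2 P r^3 - 4|a|P(a^2 P+1) r^2 - (2a^2 P+P+2) r + 2|a|P(a^2 P+1) = 0
        coeffs = [2 * aa ** 3 * P ** 2, a * a * P, -4 * aa * P * (a * a * P + 1),
                  -(2 * a * a * P + P + 2), 2 * aa * P * (a * a * P + 1)]
        roots = np.roots(coeffs)
        rho = min(r.real for r in roots if abs(r.imag) < 1e-9 and 0 < r.real < 1)
        b = P * (1 + aa * rho) / (P * (1 + a * a + 2 * aa * rho) + 1)
        beta = np.sqrt((a * a * P * (1 - rho ** 2) + 1) / (P * (1 + a * a + 2 * aa * rho) + 1))
        return b, beta, rho

    if __name__ == "__main__":
        a, P = 0.8, 20.0
        b, beta, rho = kramer_variant(a, P)
        res_8a = beta ** 2 * P - (P - 2 * b * P * (1 + abs(a) * rho)
                                  + b * b * (1 + P + a * a * P + 2 * abs(a) * rho * P))
        res_8b = -beta ** 2 * rho - (rho - 2 * b * (rho + abs(a))
                                     + b * b * (rho * (1 + a * a) + 2 * abs(a)))
        print(f"rho={rho:.5f} b={b:.5f} beta={beta:.5f}")
        print(f"(8a) residual={res_8a:.1e}, (8b) residual={res_8b:.1e}")
        print(f"Kramer-variant symmetric rate = {-np.log2(beta):.4f} bits/channel use")

Comparing this rate with the maximum of Theorem 1 over all ρ ∈ [0, ρ_0] (see the sweep sketched after Section V) illustrates Corollary 1 numerically.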
Remark 2: The choice of b and β for the Kramer code maximizes the achievable rate (i.e., minimizes β) by using only (8a) for each fixed value of ρ. In order to also satisfy (8b), the choice of b, β in the above proof is applicable only to the single value of ρ that is the unique solution in (0,1) of (15). For other values of ρ, from (8a) and (8b), b and β are given by two different functions of ρ. Enlarging the set of possible choices of ρ (and hence of b, β) is what increases the achievable symmetric rate in this paper.

Corollary 2: For α = log INR / log SNR > 1, the generalized degrees of freedom of the proposed coding scheme is given by

d(α) := lim_{SNR,INR→∞} R_sym(SNR, INR) / log SNR = α/2.

(Here, the unit of R_sym is bits/s/Hz.) As a consequence, the proposed coding scheme has the same generalized degrees of freedom as the Suh-Tse coding scheme [1], [2] and also provides a multiplicative gain at high SNR. Note that the Kramer code achieves only d(α) = (1 + α)/4 for α ≥ 1 (cf. (48) in [2]).

Proof: Since α = log INR / log SNR > 1, we have |a|^2 P = P^α, or |a| = P^{(α−1)/2}. For P sufficiently large and ρ ≈ 1, observe that

b*_{1,2} ≈ P^{(α+1)/2}(1 + ρ^2) / (2 P^α ρ) ± P^{(α+1)/2}(1 − ρ^2) / (2 P^α ρ).

By choosing b = b_1* ≈ P^{(1−α)/2} ρ, we obtain

R_sym(ρ) ≈ log^+ [ P / ( P + b^2 P^α − 2bP(1 + P^{(α−1)/2} ρ) ) ] = −log( 1 − ρ^2 − 2ρ P^{(1−α)/2} )  (bits/s/Hz).   (16)

From Theorem 1, we also have

ρ_0^2 = 1 + P^{−α} − sqrt(2 P^{−α} + P^{−2α}) > 1 − sqrt(3) P^{−α/2},

and the ρ* that maximizes the achievable rate must lie in the interval [0, ρ_0]. This condition is satisfied by setting

ρ* = sqrt( 1 − P^{−γ} + P^{−(α−1)} ) − P^{−(α−1)/2}

for an arbitrary positive number γ < α/2, although this ρ* may not be optimal. Indeed, for P sufficiently large, we have ρ* ≈ 1 and

ρ* < sqrt(1 − P^{−γ}) + sqrt(P^{−(α−1)}) − P^{−(α−1)/2} = sqrt(1 − P^{−γ}).

Hence ρ*^2 < 1 − P^{−γ} < 1 − sqrt(3) P^{−α/2} < ρ_0^2. On the other hand, since 1 − ρ*^2 − 2ρ* P^{(1−α)/2} = P^{−γ}, we obtain

d(α) ≥ lim_{SNR,INR→∞} R_sym(ρ*) / log SNR = γ.

Since γ can take any value less than α/2, we have d(α) ≥ α/2. From the result of [2], it is known that d(α) ≤ α/2. Hence, we must have d(α) = α/2.

V. NUMERICAL EVALUATION

In order to evaluate R_sym in Theorem 1 numerically, we determined the optimal ρ by increasing it from zero in incremental steps of 10^{-5}. The numerical results are shown in Figs. 1-2 in Section I. We note that our proposed code achieves a symmetric rate better than or equal to those of the Suh-Tse and Kramer codes for all channel parameters (a, P). The improvement is more significant when the SNR is not too high.
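The evaluation just described can be reproduced with a few lines of code. The following Python sketch (our own illustration; the parameter values and the coarser default grid step are arbitrary) sweeps ρ over [0, ρ_0] and both roots b_1*, b_2* of (9), exactly as in Theorem 1, and also prints the ratio R_sym/log2(SNR) for growing SNR at a fixed α, which by Corollary 2 approaches α/2.

    import numpy as np

    def rsym_theorem1(a, P, step=1e-4):
        """Symmetric rate of Theorem 1 (bits/channel use), maximized over a rho-grid (a != 0)."""
        aa = abs(a)
        rho0 = np.sqrt((a * a * P * P + P - np.sqrt(P * (2 * a * a * P * P + P))) / (a * a * P * P))
        rho = np.arange(0.0, rho0, step)
        A = 2 * aa * P + 2 * P * rho + 2 * a * a * P * rho + rho + 2 * aa * P * rho ** 2
        B = 2 * P * rho + aa * P + aa * P * rho ** 2
        f = np.clip(P * P * a * a * rho ** 4 - 2 * rho ** 2 * (a * a * P * P + P) + a * a * P * P, 0.0, None)
        best = 0.0
        for b in ((B + np.sqrt(f)) / A, (B - np.sqrt(f)) / A):   # the two roots b1*, b2* of (9)
            denom = P + b * b * (1 + P + a * a * P + 2 * aa * P * rho) - 2 * b * P * (1 + aa * rho)
            best = max(best, 0.5 * float(np.max(np.log2(P / denom))))
        return best

    if __name__ == "__main__":
        print("R_sym(a=0.8, P=20) =", rsym_theorem1(0.8, 20.0), "bits/channel use")
        alpha = 1.5    # generalized degrees of freedom: the ratio below should tend to alpha/2
        for SNR in (1e2, 1e3, 1e4):
            a = SNR ** ((alpha - 1) / 2)
            print(f"SNR={SNR:.0e}: R_sym/log2(SNR) = {rsym_theorem1(a, SNR) / np.log2(SNR):.3f} "
                  f"(alpha/2 = {alpha / 2})")

The paper's evaluation uses a grid step of 10^{-5}; the sketch keeps a coarser default step only to stay lightweight.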
VI. CONCLUSION

The inner bound of the capacity is improved compared with the Suh-Tse and Kramer inner bounds by analyzing the performance of our proposed code as an optimized form of the Kramer code. Our results also show that an optimal coding scheme for the Gaussian interference channel with feedback can be achieved by using marginal posterior distributions.

ACKNOWLEDGMENT

This work was supported in part by JSPS KAKENHI Grant Number 25289111.

REFERENCES

[1] Changho Suh and David Tse, "Symmetric Feedback Capacity of the Gaussian Interference Channel to Within One Bit," in Proc. Int. Symp. Information Theory, Jun. 2009.
[2] Changho Suh and David Tse, "Feedback Capacity of the Gaussian Interference Channel to Within 2 Bits," IEEE Trans. Inf. Theory, vol. 57, no. 5, pp. 2667-2685, May 2011.
[3] Lan V. Truong, "Posterior Matching Scheme for Gaussian Multiple Access Channel with Feedback," [Online]. Available: http://arxiv.org/abs/1204.4249.
[4] Lan V. Truong, "Posterior Matching Scheme for Gaussian Multiple Access Channel with Feedback," in Proc. IEEE Information Theory Workshop, Nov. 2014.
[5] Lan V. Truong, "A Novel Time-Varying Coding Scheme for the Gaussian Broadcast Channel with Feedback," [Online]. Available: http://arxiv.org/abs/1404.2520.
[6] Gerhard Kramer, "Feedback Strategies for White Gaussian Interference Networks," IEEE Trans. Inf. Theory, vol. 48, pp. 1423-1438, Jan. 2002.
[7] Ravi Tandon, Soheil Mohajer, and H. Vincent Poor, "On the Symmetric Feedback Capacity of the K-User Cyclic Z-Interference Channel," IEEE Trans. Inf. Theory, vol. 59, no. 5, pp. 2713-2733, May 2013.
[8] Ravi Tandon, Soheil Mohajer, and H. Vincent Poor, "On the Feedback Capacity of the Fully Connected K-User Interference Channel," IEEE Trans. Inf. Theory, vol. 59, no. 5, pp. 2863-2881, May 2013.
[9] G. Kramer, "Correction to 'Feedback Strategies for White Gaussian Interference Networks', and a Capacity Theorem for Gaussian Interference Channels with Feedback," IEEE Trans. Inf. Theory, vol. 50, no. 6, pp. 1373-1374, Jun. 2004.
[10] O. Shayevitz and M. Feder, "Optimal Feedback Communication via Posterior Matching," IEEE Trans. Inf. Theory, vol. 57, no. 3, pp. 1186-1221, Mar. 2011.