New Outer Bounds on the Capacity Region of Gaussian Interference Channels

Xiaohu Shang, Department of EECS, Syracuse University, Email: [email protected]
Gerhard Kramer, Bell Labs, Alcatel-Lucent, Email: [email protected]
Biao Chen, Department of EECS, Syracuse University, Email: [email protected]

Abstract—Recent outer bounds on the capacity region of Gaussian interference channels are generalized to m-user channels with m > 2 and asymmetric powers and crosstalk coefficients. The bounds are again shown to give the sum-rate capacity for Gaussian interference channels with low powers and crosstalk coefficients. The capacity is achieved by using single-user detection at each receiver, i.e., treating the interference as noise incurs no loss in performance.

Index terms — capacity, Gaussian noise, interference.

I. INTRODUCTION

This paper extends the results of [1] to asymmetric Gaussian ICs. The paper further has a new theorem (Theorem 4) that is not in [2] or in other recent works (see Section V and Motahari and Khandani [3], and Annapureddy and Veeravalli [4]).

The interference channel (IC) models communication systems where transmitters communicate with their respective receivers while causing interference to all other receivers. For a two-user Gaussian IC, the channel output can be written in the standard form [5]
\[
Y_1 = X_1 + \sqrt{a}\,X_2 + Z_1,\qquad
Y_2 = \sqrt{b}\,X_1 + X_2 + Z_2,
\]
where $\sqrt{a}$ and $\sqrt{b}$ are channel coefficients, and $X_i$ and $Y_i$ are the transmit and receive signals. The channel input sequence $X_{i1},X_{i2},\dots,X_{in}$ of user $i$ is subject to the power constraint $\sum_{j=1}^{n}\mathrm{E}\big(X_{ij}^2\big)\le nP_i$, $i=1,2$. The transmitted signals $X_1$ and $X_2$ are statistically independent. The channel noises $Z_1$ and $Z_2$ are possibly correlated unit-variance Gaussian random variables, and $(Z_1,Z_2)$ is statistically independent of $(X_1,X_2)$. In the following, we denote this Gaussian IC as IC$(a,b,P_1,P_2)$.
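To make the channel model concrete, the short sketch below simulates one block of the standard-form two-user Gaussian IC with Gaussian codebook symbols. It is an illustrative sketch only: the block length and the function name are our own choices, and the gains and powers are those of Fig. 1, not assumptions about any particular code.

```python
import numpy as np

def simulate_standard_ic(a, b, P1, P2, n=100000, rho_z=0.0, seed=0):
    """Simulate one block of the standard-form two-user Gaussian IC:
    Y1 = X1 + sqrt(a) X2 + Z1,  Y2 = sqrt(b) X1 + X2 + Z2,
    with E[X_i^2] <= P_i and unit-variance (possibly correlated) noise."""
    rng = np.random.default_rng(seed)
    x1 = rng.normal(0.0, np.sqrt(P1), n)      # Gaussian codebook symbols, power P1
    x2 = rng.normal(0.0, np.sqrt(P2), n)      # independent of x1, power P2
    # Unit-variance noises with correlation rho_z; the capacity region depends
    # only on the marginals, so rho_z does not matter, but the model allows it.
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho_z], [rho_z, 1.0]], n)
    y1 = x1 + np.sqrt(a) * x2 + z[:, 0]
    y2 = np.sqrt(b) * x1 + x2 + z[:, 1]
    return y1, y2

if __name__ == "__main__":
    y1, y2 = simulate_standard_ic(a=0.04, b=0.09, P1=10.0, P2=20.0)
    # Empirical receive powers should be close to 1 + P1 + a*P2 and 1 + b*P1 + P2.
    print(np.var(y1), 1 + 10 + 0.04 * 20)
    print(np.var(y2), 1 + 0.09 * 10 + 20)
```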
The capacity region of an IC is defined as the closure of the set of rate pairs $(R_1,R_2)$ for which both receivers can decode their own messages with arbitrarily small positive error probability. The capacity region of a Gaussian IC is known only for three cases: (1) $a=0$, $b=0$; (2) $a\ge 1$, $b\ge 1$: see [6]–[8]; (3) $a=0$, $b\ge 1$, or $a\ge 1$, $b=0$: see [9]. For the second case both receivers can decode the messages of both transmitters. Thus this IC acts as two multiple-access channels (MACs), and the capacity region of the IC is the intersection of the capacity regions of the two MACs. However, when the interference is weak or moderate, the capacity region is still unknown. The best inner bound is obtained in [8] by using superposition coding and joint decoding. A simplified form of the Han-Kobayashi region was given by Chong-Motani-Garg-El Gamal [10], [11]. Various outer bounds have been developed in [12]–[16]. Kramer in [14] presented two outer bounds. The first is obtained by providing each receiver with just enough information to decode both messages. The second is obtained by reducing the IC to a degraded broadcast channel. Both bounds dominate the bounds by Sato [12] and Carleial [13]. The recent outer bounds by Etkin, Tse, and Wang in [15] are also based on genie-aided methods, and they show that Han and Kobayashi's inner bound is within one bit or a factor of two of the capacity region. This result can also be established by the methods of Telatar and Tse [16]. We remark that neither of the bounds of [14] and [15] implies the other. Numerical results show that the bounds of [14] are better at low SNR while those of [15] are better at high SNR. The bounds of [16] are not amenable to numerical evaluation since the optimal distributions of the auxiliary random variables are unknown.

In this paper, we present new outer bounds on the capacity region of Gaussian ICs that generalize results of [1]. The new bound is based on a genie-aided approach and an extremal inequality proposed by Liu and Viswanath [17]. Based on this outer bound, we obtain new sum-rate capacity results for ICs satisfying certain channel-coefficient and power-constraint conditions. We show that the sum-rate capacity can be achieved by treating the interference as noise when both the channel gains and the power constraints are weak. We say that such channels have noisy interference. For this class of interference, the simple single-user transmission and detection strategy is sum-rate optimal.

This paper is organized as follows. In Section II, we present an outer bound and the resulting sum-rate capacity for certain 2-user Gaussian ICs. An extension of the sum-rate capacity under noisy interference to m-user ICs is provided in Section III. Numerical examples are given in Section IV, and Section V concludes the paper.

II. A GENIE-AIDED OUTER BOUND

A. General outer bound

The following is a new outer bound on the capacity region of Gaussian ICs. Note that in contrast to [1] these bounds permit $P_1\ne P_2$ and $a\ne b$.

Theorem 1: If the rates $(R_1,R_2)$ are achievable for IC$(a,b,P_1,P_2)$ with $0<a<1$, $0<b<1$, they must satisfy the following constraints (1)–(3) for $\mu>0$, $\frac{1+bP_1}{b+bP_1}\le\eta_1\le\frac{1}{b}$ and $a\le\eta_2\le\frac{a+aP_2}{1+aP_2}$:
\[
R_1+\mu R_2 \le \min_{\substack{\rho_i\in[0,1]\\ (\sigma_1^2,\sigma_2^2)\in\Sigma}}
\Bigg\{
\frac12\log\Big(1+\frac{P_1^*}{\sigma_1^2}\Big)
-\frac12\log\big(aP_2^*+1-\rho_1^2\big)
+\frac12\log\Big(1+P_1+aP_2-\frac{(P_1+\rho_1\sigma_1)^2}{P_1+\sigma_1^2}\Big)
+\frac{\mu}{2}\log\Big(1+\frac{P_2^*}{\sigma_2^2}\Big)
-\frac{\mu}{2}\log\big(bP_1^*+1-\rho_2^2\big)
+\frac{\mu}{2}\log\Big(1+P_2+bP_1-\frac{(P_2+\rho_2\sigma_2)^2}{P_2+\sigma_2^2}\Big)
\Bigg\}
\tag{1}
\]
\[
R_1+\eta_1 R_2 \le
\frac12\log\Big(1+\frac{b\eta_1-1}{b-b\eta_1}\Big)
-\frac{\eta_1}{2}\log\Big(1+\frac{b\eta_1-1}{1-\eta_1}\Big)
+\frac{\eta_1}{2}\log\big(1+bP_1+P_2\big)
\tag{2}
\]
\[
R_1+\eta_2 R_2 \le
\frac12\log\big(1+P_1+aP_2\big)
-\frac12\log\Big(1+\frac{a-\eta_2}{\eta_2-1}\Big)
+\frac{\eta_2}{2}\log\Big(1+\frac{a-\eta_2}{a\eta_2-a}\Big),
\tag{3}
\]
where
\[
\Sigma=
\begin{cases}
\Big\{(\sigma_1^2,\sigma_2^2)\ \Big|\ \sigma_1^2>0,\ 0<\sigma_2^2\le\frac{1-\rho_1^2}{a}\Big\}, & \text{if }\mu\ge 1,\\[4pt]
\Big\{(\sigma_1^2,\sigma_2^2)\ \Big|\ 0<\sigma_1^2\le\frac{1-\rho_2^2}{b},\ \sigma_2^2>0\Big\}, & \text{if }\mu<1,
\end{cases}
\tag{4}
\]
and if $\mu\ge 1$ we define
\[
P_1^*=
\begin{cases}
P_1, & \sigma_1^2\le\frac{1}{\mu}\Big[(1-\mu)P_1+\frac{1-\rho_2^2}{b}\Big]^+,\\[4pt]
\frac{1-\rho_2^2-b\mu\sigma_1^2}{b\mu-b}, & \frac{1}{\mu}\Big[(1-\mu)P_1+\frac{1-\rho_2^2}{b}\Big]^+<\sigma_1^2\le\frac{1-\rho_2^2}{b\mu},\\[4pt]
0, & \sigma_1^2>\frac{1-\rho_2^2}{b\mu},
\end{cases}
\tag{5}
\]
\[
P_2^*=P_2,\qquad \sigma_2^2\le\frac{1-\rho_1^2}{a},
\tag{6}
\]
where $(x)^+\triangleq\max\{x,0\}$, and if $0<\mu<1$ we define
\[
P_1^*=P_1,\qquad \sigma_1^2\le\frac{1-\rho_2^2}{b},
\tag{7}
\]
\[
P_2^*=
\begin{cases}
P_2, & \sigma_2^2\le\Big[(\mu-1)P_2+\frac{\mu(1-\rho_1^2)}{a}\Big]^+,\\[4pt]
\frac{\mu(1-\rho_1^2)-a\sigma_2^2}{a-a\mu}, & \Big[(\mu-1)P_2+\frac{\mu(1-\rho_1^2)}{a}\Big]^+<\sigma_2^2\le\frac{\mu(1-\rho_1^2)}{a},\\[4pt]
0, & \sigma_2^2>\frac{\mu(1-\rho_1^2)}{a}.
\end{cases}
\tag{8}
\]

Proof: We give a sketch of the proof. A genie provides the two receivers with side information $X_1+N_1$ and $X_2+N_2$, respectively, where $N_i$ is Gaussian distributed with variance $\sigma_i^2$ and $\mathrm{E}(N_iZ_i)=\rho_i\sigma_i$, $i=1,2$. Starting from Fano's inequality, we have
\[
n(R_1+\mu R_2)-n\epsilon
\le I(X_1^n;Y_1^n)+\mu I(X_2^n;Y_2^n)
\le I(X_1^n;Y_1^n,X_1^n+N_1^n)+\mu I(X_2^n;Y_2^n,X_2^n+N_2^n)
=\Big[h(X_1^n+N_1^n)-\mu h\big(\sqrt{b}X_1^n+Z_2^n\,\big|\,N_2^n\big)-h(N_1^n)\Big]
+\Big[\mu h(X_2^n+N_2^n)-h\big(\sqrt{a}X_2^n+Z_1^n\,\big|\,N_1^n\big)-\mu h(N_2^n)\Big]
+h\big(Y_1^n\,\big|\,X_1^n+N_1^n\big)+\mu h\big(Y_2^n\,\big|\,X_2^n+N_2^n\big),
\tag{9}
\]
where $\epsilon\to 0$ as $n\to\infty$. For $h(Y_1^n|X_1^n+N_1^n)$, zero-mean Gaussian $X_1^n$ and $X_2^n$ with covariance matrices $P_1\mathbf{I}$ and $P_2\mathbf{I}$ are optimal, and we have
\[
h\big(Y_1^n\,\big|\,X_1^n+N_1^n\big)\le\frac{n}{2}\log\bigg[2\pi e\Big(1+aP_2+P_1-\frac{(P_1+\rho_1\sigma_1)^2}{P_1+\sigma_1^2}\Big)\bigg].
\tag{10}
\]
From the extremal inequality introduced in [17, Theorem 1, Corollary 4], we have
\[
h(X_1^n+N_1^n)-\mu h\big(\sqrt{b}X_1^n+Z_2^n\,\big|\,N_2^n\big)
\le\frac{n}{2}\log\big[2\pi e\big(P_1^*+\sigma_1^2\big)\big]-\frac{n\mu}{2}\log\big[2\pi e\big(bP_1^*+1-\rho_2^2\big)\big]
\tag{11}
\]
and
\[
\mu h(X_2^n+N_2^n)-h\big(\sqrt{a}X_2^n+Z_1^n\,\big|\,N_1^n\big)
\le\frac{n\mu}{2}\log\big[2\pi e\big(P_2^*+\sigma_2^2\big)\big]-\frac{n}{2}\log\big[2\pi e\big(aP_2^*+1-\rho_1^2\big)\big],
\tag{12}
\]
where equalities hold when $X_1^n$ and $X_2^n$ are both Gaussian with covariance matrices $P_1^*\mathbf{I}$ and $P_2^*\mathbf{I}$, respectively. From (9)–(12) we obtain the outer bound (1).

The outer bound in (2) (resp. (3)) is obtained by letting the genie provide side information $X_2$ to receiver one (resp. $X_1$ to receiver two), and applying the extremal inequality, i.e.,
\[
n(R_1+\eta_1R_2)-n\epsilon
\le I(X_1^n;Y_1^n,X_2^n)+\eta_1 I(X_2^n;Y_2^n)
= h(X_1^n+Z_1^n)-\eta_1 h\big(\sqrt{b}X_1^n+Z_2^n\big)-h(Z_1^n)+\eta_1 h(Y_2^n)
\le\frac{n}{2}\log\big(1+\tilde P_1\big)-\frac{n\eta_1}{2}\log\big(1+b\tilde P_1\big)+\frac{n\eta_1}{2}\log\big(1+bP_1+P_2\big),
\tag{13}
\]
where $\tilde P_1=\frac{b\eta_1-1}{b-b\eta_1}$ for $\frac{1+bP_1}{b+bP_1}\le\eta_1\le\frac{1}{b}$. This is the bound in (2). Similarly, we obtain the bound (3).
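Since the right-hand side of (1) is itself a minimization over $(\rho_1,\rho_2,\sigma_1^2,\sigma_2^2)$, it can be evaluated numerically. The sketch below is one illustrative way to do this for the sum-rate case $\mu=1$: it restricts the search to the subregion where (5) and (6) give $P_1^*=P_1$ and $P_2^*=P_2$, which is still a valid outer bound because we minimize over a smaller set, and grid-searches the four parameters. The grid sizes and helper name are our own choices, not from the paper.

```python
import numpy as np

def sum_rate_bound_mu1(a, b, P1, P2, n_rho=60, n_sigma=200):
    """Numerically evaluate the Theorem 1 bound (1) with mu = 1 on R1 + R2.

    The search is restricted to rho_i in (0,1), sigma1^2 <= (1-rho2^2)/b and
    sigma2^2 <= (1-rho1^2)/a, so that P1* = P1 and P2* = P2 in (5)-(6).
    Rates are in bits per channel use (log base 2).
    """
    log2 = np.log2
    best = np.inf
    rhos = np.linspace(1e-3, 1.0 - 1e-3, n_rho)
    for rho1 in rhos:
        for rho2 in rhos:
            s1 = np.linspace(1e-3, np.sqrt((1.0 - rho2 ** 2) / b), n_sigma)
            s2 = np.linspace(1e-3, np.sqrt((1.0 - rho1 ** 2) / a), n_sigma)
            # User-1 part of (1): the three terms involving sigma1 and rho1.
            t1 = (0.5 * log2(1.0 + P1 / s1 ** 2)
                  - 0.5 * log2(a * P2 + 1.0 - rho1 ** 2)
                  + 0.5 * log2(1.0 + P1 + a * P2
                               - (P1 + rho1 * s1) ** 2 / (P1 + s1 ** 2)))
            # User-2 part of (1): the three terms involving sigma2 and rho2.
            t2 = (0.5 * log2(1.0 + P2 / s2 ** 2)
                  - 0.5 * log2(b * P1 + 1.0 - rho2 ** 2)
                  + 0.5 * log2(1.0 + P2 + b * P1
                               - (P2 + rho2 * s2) ** 2 / (P2 + s2 ** 2)))
            best = min(best, t1.min() + t2.min())
    return best

if __name__ == "__main__":
    # Parameters of Fig. 1; condition (15) holds here, so this bound should
    # come close to the treat-interference-as-noise sum rate in (16).
    a, b, P1, P2 = 0.04, 0.09, 10.0, 20.0
    tin = 0.5 * np.log2(1 + P1 / (1 + a * P2)) + 0.5 * np.log2(1 + P2 / (1 + b * P1))
    print("outer bound on R1+R2:", sum_rate_bound_mu1(a, b, P1, P2))
    print("treating interference as noise:", tin)
```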
Remark 1: The bounds (1)–(3) are obtained by providing different genie-aided signals to the receivers. There is overlap of the ranges of $\mu$, $\eta_1$, and $\eta_2$, and none of the bounds uniformly dominates the other two bounds.

Remark 2: Equations (2) and (3) are outer bounds for the capacity region of a Z-IC, and a Z-IC is equivalent to a degraded IC [9]. For such channels, it can be shown that (2) and (3) are the same as the outer bounds in [18]. For $\eta_1=\frac{1+bP_1}{b+bP_1}$ and $\eta_2=\frac{a+aP_2}{1+aP_2}$, the bounds in (2) and (3) are tight for a Z-IC (or degraded IC) because $\tilde P_1=P_1$, $\tilde P_2=P_2$ in (13), and there is no power sharing between the transmitters. Consequently, $\frac{1+bP_1}{b+bP_1}$ and $\frac{a+aP_2}{1+aP_2}$ are the negative slopes of the tangent lines to the capacity region at the corner points.

Remark 3: The bounds in (2)–(3) turn out to be the same as the bounds in [14, Theorem 2]. This can be shown by rewriting the bounds in [14, Theorem 2] in the form of a weighted sum rate.

Remark 4: The bounds in [14, Theorem 2] are obtained by removing one of the interference links to reduce the IC to a Z-IC. In addition, the proof in [14] allowed the transmitters to share their power, which further reduces the Z-IC to a degraded broadcast channel. Then the capacity region of this degraded broadcast channel is an outer bound for the capacity region of the original IC. The bounds in (2) and (3) are also obtained by reducing the IC to a Z-IC. Although we do not explicitly allow the transmitters to share their power, it is interesting that these bounds are equivalent to the bounds in [14, Theorem 2] with power sharing. In fact, a careful examination of our derivation reveals that power sharing is implicitly assumed. For example, for the term $h(X_1^n+Z_1^n)-\eta_1 h\big(\sqrt{b}X_1^n+Z_2^n\big)$ of (13), user 1 uses power $\tilde P_1=\frac{b\eta_1-1}{b-b\eta_1}\le P_1$, while for the term $\eta_1 h(Y_2^n)$ user 1 uses all the power $P_1$. This is equivalent to letting user 1 use the power $\tilde P_1$ for both terms, and letting user 2 use a power that exceeds $P_2$.

Remark 5: Theorem 1 improves [15, Theorem 3]. Specifically, the bound in (2) is tighter than the first sum-rate bound of [15, Theorem 3]. Similarly, the bound in (3) is tighter than the second sum-rate bound of [15, Theorem 3]. The third sum-rate bound in [15, Theorem 3] is a special case of (1).

Remark 6: Our outer bound is not always tighter than that of [15] for all rate points. The reason is that in [15, last two equations of (39)], different genie-aided signals are provided to the same receiver. Our outer bound can also be improved in a similar and more general way by providing different genie-aided signals to the receivers. Specifically, the starting point of the bound can be modified to be
\[
n(R_1+\mu R_2)\le\sum_{i=1}^{k}\lambda_i I\big(X_1^n;Y_1^n,U_i\big)+\sum_{j=1}^{m}\mu_j I\big(X_2^n;Y_2^n,W_j\big)+n\epsilon,
\tag{14}
\]
where $\sum_{i=1}^{k}\lambda_i=1$, $\sum_{j=1}^{m}\mu_j=\mu$, $\lambda_i>0$, $\mu_j>0$.

B. Sum-rate capacity for noisy interference

The outer bound in Theorem 1 is in the form of an optimization problem. Four parameters $\rho_1,\rho_2,\sigma_1^2,\sigma_2^2$ need to be optimized for different choices of the weights $\mu,\eta_1,\eta_2$. When $\mu=1$, the bound (1) of Theorem 1 leads directly to the following sum-rate capacity result.

Theorem 2: For the IC$(a,b,P_1,P_2)$ satisfying
\[
\sqrt{a}\,(bP_1+1)+\sqrt{b}\,(aP_2+1)\le 1,
\tag{15}
\]
the sum-rate capacity is
\[
C=\frac12\log\Big(1+\frac{P_1}{1+aP_2}\Big)+\frac12\log\Big(1+\frac{P_2}{1+bP_1}\Big).
\tag{16}
\]

Proof: By choosing
\[
\sigma_1^2=\frac{1}{2b}\bigg\{b(aP_2+1)^2-a(bP_1+1)^2+1
\pm\sqrt{\big[b(aP_2+1)^2-a(bP_1+1)^2+1\big]^2-4b(aP_2+1)^2}\bigg\}
\]
\[
\sigma_2^2=\frac{1}{2a}\bigg\{a(bP_1+1)^2-b(aP_2+1)^2+1
\pm\sqrt{\big[a(bP_1+1)^2-b(aP_2+1)^2+1\big]^2-4a(bP_1+1)^2}\bigg\}
\]
\[
\rho_1=\sqrt{1-a\sigma_2^2}
\tag{17}
\]
\[
\rho_2=\sqrt{1-b\sigma_1^2},
\tag{18}
\]
the bound (1) with $\mu=1$ is
\[
R_1+R_2\le\frac12\log\Big(1+\frac{P_1}{1+aP_2}\Big)+\frac12\log\Big(1+\frac{P_2}{1+bP_1}\Big).
\tag{19}
\]
But one can achieve equality in (19) by treating the interference as noise at both receivers. In order that the choices of $\sigma_i^2$ and $\rho_i^2$ be feasible, (15) must be satisfied.
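A small helper makes Theorem 2 easy to apply: it tests the noisy-interference condition (15) and, when the condition holds, returns the sum-rate capacity (16), which is attained by treating interference as noise. The function names are ours; the formulas are exactly (15) and (16).

```python
import math

def has_noisy_interference(a, b, P1, P2):
    """Check the noisy-interference condition (15) of Theorem 2."""
    return math.sqrt(a) * (b * P1 + 1.0) + math.sqrt(b) * (a * P2 + 1.0) <= 1.0

def tin_sum_rate(a, b, P1, P2):
    """Sum rate of single-user detection (treating interference as noise),
    in bits per channel use; this equals the sum-rate capacity (16) whenever
    (15) holds."""
    return (0.5 * math.log2(1.0 + P1 / (1.0 + a * P2))
            + 0.5 * math.log2(1.0 + P2 / (1.0 + b * P1)))

if __name__ == "__main__":
    # The channel of Fig. 1: a = 0.04, b = 0.09, P1 = 10, P2 = 20.
    a, b, P1, P2 = 0.04, 0.09, 10.0, 20.0
    if has_noisy_interference(a, b, P1, P2):
        print("noisy interference; sum-rate capacity =", tin_sum_rate(a, b, P1, P2))
    else:
        print("condition (15) fails; (16) is only an achievable sum rate")
```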
Remark 7: Consider the bound (1) with $\mu=1$; we further let
\[
1-\rho_1^2\ge a\sigma_2^2,\qquad 1-\rho_2^2\ge b\sigma_1^2.
\tag{20}
\]
From (5) and (6) we have $P_1^*=P_1$, $P_2^*=P_2$. Thus,
\[
R_1\le\frac12\log\Big(1+\frac{P_1}{\sigma_1^2}\Big)-\frac12\log\big(aP_2+1-\rho_1^2\big)
+\frac12\log\Big(1+aP_2+P_1-\frac{(P_1+\rho_1\sigma_1)^2}{P_1+\sigma_1^2}\Big)
=\frac12\log\Bigg[\frac{P_1(1+aP_2)}{1+aP_2-\rho_1^2}\Big(\frac{1}{\sigma_1}-\frac{\rho_1}{1+aP_2}\Big)^2+1+\frac{P_1}{1+aP_2}\Bigg]
\triangleq f(\rho_1,\sigma_1).
\tag{21}
\]
Therefore, for any given $\rho_1$, when
\[
\rho_1\sigma_1=1+aP_2,
\tag{22}
\]
$f(\rho_1,\sigma_1)$ achieves its minimum, which is user 1's single-user detection rate. Similarly, we have $\rho_2\sigma_2=1+bP_1$. Since the constraint in (20) must be satisfied, we have
\[
\frac{1+aP_2}{\rho_1}\le\sqrt{\frac{1-\rho_2^2}{b}},\qquad
\frac{1+bP_1}{\rho_2}\le\sqrt{\frac{1-\rho_1^2}{a}}.
\tag{23}
\]
As long as there exist $\rho_i\in(0,1)$ such that (23) is satisfied, we can choose $\sigma_i$ to satisfy (22) and hence the bound in (1) is tight. By cancelling $\rho_1,\rho_2$, we obtain (15).

Remark 8: The most special choices of $\rho_1,\rho_2$ are in (17) and (18), since (11) and (12) with $\mu=1$ become
\[
h(X_1^n+N_1^n)-h\big(\sqrt{b}X_1^n+Z_2^n\,\big|\,N_2^n\big)=-n\log\sqrt{b}
\]
\[
h(X_2^n+N_2^n)-h\big(\sqrt{a}X_2^n+Z_1^n\,\big|\,N_1^n\big)=-n\log\sqrt{a}.
\]
Therefore, we do not need the extremal inequality [17] to prove Theorem 2.

Remark 9: The sum-rate capacity for a Z-IC with $a=0$, $0<b<1$ is a special case of Theorem 2 since (15) is satisfied.

Remark 10: Theorem 2 follows directly from Theorem 1 with $\mu=1$. It is remarkable that a genie-aided bound is tight if (15) is satisfied, since the genie provides extra signals to the receivers without increasing the rates. This situation is reminiscent of the recent capacity results for vector Gaussian broadcast channels (see [19]). Furthermore, the sum-rate capacity (16) is achieved by treating the interference as noise. We therefore refer to channels satisfying (15) as ICs with noisy interference.

Remark 11: For a symmetric IC where $a=b$, $P_1=P_2=P$, the constraint (15) implies that
\[
a\le\frac14,\qquad P\le\frac{\sqrt{a}-2a}{2a^2}.
\tag{24}
\]
Noisy interference is therefore weaker than weak interference as defined in [9] and [20], namely $a\le\frac{\sqrt{1+2P}-1}{2P}$ or
\[
a\le\frac12,\qquad P\le\frac{1-2a}{2a^2}.
\tag{25}
\]
Recall that [20] showed that for "weak interference" satisfying (25), treating interference as noise achieves a larger sum rate than time- or frequency-division multiplexing (TDM/FDM), and [9] claimed that in "weak interference" the largest known achievable sum rate is achieved by treating the interference as noise.

C. Capacity region corner point

The bounds (2) and (3) of Theorem 1 lead to the following sum-rate capacity result.

Theorem 3: For an IC$(a,b,P_1,P_2)$ with $a>1$, $0<b<1$, the sum-rate capacity is
\[
C=\frac12\log(1+P_1)+\frac12\log\Big(1+\frac{P_2}{1+bP_1}\Big)
\tag{26}
\]
when the following condition holds:
\[
(1-ab)P_1\le a-1.
\tag{27}
\]
A similar result follows by swapping $a$ and $b$, and $P_1$ and $P_2$.

This sum-rate capacity is achieved by a simple scheme: user 1 transmits at the maximum rate, and user 2 transmits at the rate at which both receivers can decode its message with single-user detection. Such a rate constraint was considered in [9, Theorem 1], which established a corner point of the capacity region. However, it was pointed out in [20] that the proof in [9] was flawed. Theorem 3 shows that the rate pair of [20] is in fact a corner point of the capacity region when $a>1$, $0<b<1$ and (27) is satisfied, and this rate pair achieves the sum-rate capacity.

The sum-rate capacity of the degraded IC ($ab=1$, $0<b<1$) is a special case of Theorem 3. Besides this example, there are two other kinds of ICs to which Theorem 3 applies. The first case is $ab>1$; in this case, $P_1$ can be any positive value. The second case is $ab<1$ and $P_1\le\frac{a-1}{1-ab}$. For both cases, the signals from user 2 can be decoded first at both receivers. The sum-rate capacity is therefore given by (26).
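As a quick numerical companion to Theorem 3, the sketch below checks condition (27) and evaluates the rate pair of the simple scheme described above. Under our reading of that scheme, user 2's rate is the smaller of its two single-user-detection rates, one per receiver, and (27) is precisely the regime in which receiver 2 is the bottleneck, so the pair sums to (26). The function names and the sample channel values are illustrative choices, not from the paper.

```python
import math

def theorem3_applies(a, b, P1):
    """Condition (27): (1 - a*b) * P1 <= a - 1, for a > 1 and 0 < b < 1."""
    return a > 1.0 and 0.0 < b < 1.0 and (1.0 - a * b) * P1 <= a - 1.0

def corner_point(a, b, P1, P2):
    """Rate pair of the simple scheme (bits per channel use): user 1 sends at
    its maximum rate; user 2 sends at the largest rate that both receivers
    can decode with single-user detection."""
    R1 = 0.5 * math.log2(1.0 + P1)
    R2_at_rx2 = 0.5 * math.log2(1.0 + P2 / (1.0 + b * P1))
    R2_at_rx1 = 0.5 * math.log2(1.0 + a * P2 / (1.0 + P1))
    return R1, min(R2_at_rx1, R2_at_rx2)

if __name__ == "__main__":
    a, b, P1, P2 = 3.0, 0.5, 2.0, 5.0   # example values, for illustration only
    if theorem3_applies(a, b, P1):
        R1, R2 = corner_point(a, b, P1, P2)
        # Under (27), receiver 2 limits user 2's rate, so R1 + R2 should match
        # the sum-rate capacity (26).
        C = 0.5 * math.log2(1.0 + P1) + 0.5 * math.log2(1.0 + P2 / (1.0 + b * P1))
        print(R1 + R2, C)
```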
III. SUM-RATE CAPACITY FOR m-USER IC WITH NOISY INTERFERENCE

For an $m$-user IC, the receive signal at user $i$ is defined as
\[
Y_i=X_i+\sum_{j=1,\,j\ne i}^{m}\sqrt{c_{ji}}\,X_j+Z_i,\qquad i=1,2,\dots,m,
\tag{28}
\]
where $c_{ji}$ is the channel gain from the $j$th transmitter to the $i$th receiver, $Z_i$ is unit-variance Gaussian noise, and the transmit signals have the block power constraints $\sum_{l=1}^{n}\mathrm{E}\big(X_{il}^2\big)\le nP_i$. We have the following sum-rate capacity result.

Theorem 4: For an $m$-user IC defined in (28), if there exist $\rho_i\in(0,1)$, $i=1,\dots,m$, such that the following conditions are satisfied,
\[
\sum_{j=1,\,j\ne i}^{m}c_{ji}\,\frac{(1+Q_j)^2}{\rho_j^2}\le 1-\rho_i^2
\tag{29}
\]
\[
\sum_{j=1,\,j\ne i}^{m}\frac{c_{ij}}{1+Q_j-\rho_j^2}\le\frac{1}{P_i+(1+Q_i)^2/\rho_i^2},
\tag{30}
\]
where $Q_i$ is the interference power at receiver $i$, defined as
\[
Q_i=\sum_{j=1,\,j\ne i}^{m}c_{ji}P_j,
\tag{31}
\]
the sum-rate capacity is
\[
C=\sum_{i=1}^{m}\frac12\log\Big(1+\frac{P_i}{1+Q_i}\Big).
\tag{32}
\]
Therefore, if there exist $\rho_1,\dots,\rho_m$ such that (29) and (30) are satisfied for all $i=1,\dots,m$, the sum-rate capacity of an $m$-user IC can be achieved by treating interference as noise. The proof is omitted due to the space limitation. It can be shown that Theorem 2 is a special case of Theorem 4.

Consider a uniformly symmetric $m$-user IC where $c_{ji}=c$ for all $i,j=1,\dots,m$, $i\ne j$, and $P_i=P$. The bounds (29) and (30) with $\rho_i=\rho$ for all $i$ reduce to
\[
c\le\frac{1}{4(m-1)},\qquad P\le\frac{\sqrt{(m-1)c}-2(m-1)c}{2(m-1)^2c^2}.
\tag{33}
\]
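The conditions of Theorem 4 are straightforward to test numerically for a given channel: compute the interference powers (31) and check (29)-(30) for a candidate correlation vector. The sketch below does this and, for a uniformly symmetric channel, also scans a grid of common $\rho$ values and prints the closed-form test (33) alongside it for comparison. The helper names, the grid, and the sample values are our own choices.

```python
import numpy as np

def interference_powers(c, P):
    """Q_i from (31); c[j, i] is the gain from transmitter j to receiver i
    (diagonal entries are ignored), and P is the vector of powers."""
    m = len(P)
    mask = 1.0 - np.eye(m)
    return (c * mask).T @ P          # Q_i = sum_{j != i} c_{ji} P_j

def noisy_interference_conditions(c, P, rho):
    """Check (29) and (30) for every receiver i, for a given rho in (0,1)^m."""
    m = len(P)
    Q = interference_powers(c, P)
    ok = True
    for i in range(m):
        js = [j for j in range(m) if j != i]
        lhs29 = sum(c[j, i] * (1 + Q[j]) ** 2 / rho[j] ** 2 for j in js)
        lhs30 = sum(c[i, j] / (1 + Q[j] - rho[j] ** 2) for j in js)
        ok &= (lhs29 <= 1 - rho[i] ** 2)
        ok &= (lhs30 <= 1.0 / (P[i] + (1 + Q[i]) ** 2 / rho[i] ** 2))
    return ok

def tin_sum_rate_m_user(c, P):
    """Treat-interference-as-noise sum rate (32), in bits per channel use."""
    Q = interference_powers(c, P)
    return float(np.sum(0.5 * np.log2(1 + P / (1 + Q))))

if __name__ == "__main__":
    # Uniformly symmetric example: m users, common gain c0 and power P0
    # (illustrative values only).
    m, c0, P0 = 3, 0.02, 5.0
    c = np.full((m, m), c0)
    P = np.full(m, P0)
    exists_rho = any(noisy_interference_conditions(c, P, np.full(m, r))
                     for r in np.linspace(0.01, 0.99, 199))
    closed_form = (c0 <= 1 / (4 * (m - 1))
                   and P0 <= (np.sqrt((m - 1) * c0) - 2 * (m - 1) * c0)
                             / (2 * (m - 1) ** 2 * c0 ** 2))
    print("grid search over common rho:", bool(exists_rho))
    print("closed-form condition (33): ", bool(closed_form))
    print("TIN sum rate (32):", tin_sum_rate_m_user(c, P))
```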
IV. NUMERICAL EXAMPLES

A comparison of the outer bounds for a Gaussian IC is given in Fig. 1. Some part of the outer bound from Theorem 1 overlaps with Kramer's outer bound due to (2) and (3). Since this IC has noisy interference, the proposed outer bound coincides with the inner bound at the sum-rate point.

[Fig. 1 here. Curves: ETW outer bound, Kramer outer bound, proposed outer bound, HK inner bound; axes: R1 and R2 in bits per channel use.]
Fig. 1. Inner and outer bounds for the capacity region of Gaussian ICs with a = 0.04, b = 0.09, P1 = 10, P2 = 20. The ETW bound is by Etkin, Tse and Wang in [15, Theorem 3]; the Kramer bound is from [14, Theorem 2]; the HK inner bound is based on [8] by Han and Kobayashi.

The lower and upper bounds for the sum-rate capacity of the symmetric IC are shown in Fig. 2 for a high power level. The upper bound is tight up to point A. The bound in [15, Theorem 3] approaches the bound in Theorem 1 when the power is large, but there is still a gap. Fig. 2 also provides a definitive answer to a question from [20, Fig. 2]: whether the sum-rate capacity of the symmetric Gaussian IC is a decreasing function of a, or whether there exists a bump like the lower bound when a varies from 0 to 1. In Fig. 2, our proposed upper bound and Sason's inner bound explicitly show that the sum capacity is not a monotone function of a (this result also follows from the bounds of [15]).

[Fig. 2 here. Curves: ETW upper bound, Kramer upper bound, upper bound in Theorem 1, Sason's lower bound; axes: sum rate in bits per channel use versus a in dB.]
Fig. 2. Lower and upper bounds for the sum-rate capacity of symmetric Gaussian ICs with a = b, P1 = P2 = 5000. The channel gain at point A is a = −26.99 dB. Sason's bound is an inner bound obtained from Han and Kobayashi's bound by a special time-sharing scheme [20, Table I].
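For the symmetric setting of Fig. 2, the boundary of the noisy-interference region can be computed directly from (15), equivalently (24): the sketch below bisects on $2\sqrt{a}(aP+1)=1$ for $P=5000$. This is an illustrative calculation; the resulting threshold lies within a small fraction of a dB of the channel gain reported at point A in the caption of Fig. 2.

```python
import math

def symmetric_noisy_threshold(P, tol=1e-12):
    """Largest symmetric gain a for which (15) holds with a = b, P1 = P2 = P,
    i.e., the root of 2*sqrt(a)*(a*P + 1) = 1 (cf. (24)). Found by bisection."""
    f = lambda a: 2.0 * math.sqrt(a) * (a * P + 1.0) - 1.0
    lo, hi = 0.0, 0.25            # f(0) = -1 < 0 and f(0.25) > 0 for any P > 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    a_star = symmetric_noisy_threshold(5000.0)
    print("threshold a* =", a_star, "=", 10.0 * math.log10(a_star), "dB")
```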
V. CONCLUSIONS, EXTENSIONS AND PARALLEL WORK

We derived an outer bound for the capacity region of 2-user Gaussian ICs by a genie-aided method. From this outer bound, the sum-rate capacities for ICs that satisfy (15) or (27) are obtained. The sum-rate capacity for m-user Gaussian ICs that satisfy (29) and (30) is also obtained.

Finally, we wish to acknowledge parallel work. After submitting our 2-user bounds and capacity results to the arXiv.org e-Print archive [2], two other teams of researchers - Motahari and Khandani from the University of Waterloo, and Annapureddy and Veeravalli from the University of Illinois at Urbana-Champaign - let us know that they derived the same 2-user sum-rate capacity results (Theorem 2).

ACKNOWLEDGEMENT

The work of X. Shang and B. Chen was supported in part by the NSF under Grants 0546491 and 0501534, by the AFOSR under Grant FA9550-06-1-0051, and by the AFRL under Agreement FA8750-05-2-0121. G. Kramer gratefully acknowledges the support of the Board of Trustees of the University of Illinois Subaward no. 04-217 under NSF Grant CCR-0325673 and the Army Research Office under ARO Grant W911NF-06-1-0182.

REFERENCES

[1] X. Shang, G. Kramer, and B. Chen, "Outer bound and noisy-interference sum-rate capacity for symmetric Gaussian interference channels," submitted to CISS 2008.
[2] X. Shang, G. Kramer, and B. Chen, "A new outer bound and the noisy-interference sum-rate capacity for Gaussian interference channels," submitted to the IEEE Trans. Inform. Theory. http://arxiv.org/abs/0712.1987, 2007.
[3] A. S. Motahari and A. K. Khandani, "Capacity bounds for the Gaussian interference channel," to be submitted to IEEE Trans. Inform. Theory.
[4] V. S. Annapureddy and V. Veeravalli, "Sum capacity of the Gaussian interference channel in the low interference regime," submitted to ITA 2008.
[5] A. B. Carleial, "Interference channels," IEEE Trans. Inform. Theory, vol. 24, pp. 60–70, Jan. 1978.
[6] A. B. Carleial, "A case where interference does not reduce capacity," IEEE Trans. Inform. Theory, vol. 21, pp. 569–570, Sep. 1975.
[7] H. Sato, "The capacity of the Gaussian interference channel under strong interference," IEEE Trans. Inform. Theory, vol. 27, pp. 786–788, Nov. 1981.
[8] T. S. Han and K. Kobayashi, "A new achievable rate region for the interference channel," IEEE Trans. Inform. Theory, vol. 27, pp. 49–60, Jan. 1981.
[9] M. H. M. Costa, "On the Gaussian interference channel," IEEE Trans. Inform. Theory, vol. 31, pp. 607–615, Sept. 1985.
[10] H. F. Chong, M. Motani, H. K. Garg, and H. E. Gamal, "On the Han-Kobayashi region for the interference channel," submitted to the IEEE Trans. Inform. Theory, 2006.
[11] G. Kramer, "Review of rate regions for interference channels," in International Zurich Seminar, Zurich, Feb. 22-24, 2006, pp. 162–165.
[12] H. Sato, "Two-user communication channels," IEEE Trans. Inform. Theory, vol. 23, pp. 295–304, May 1977.
[13] A. B. Carleial, "Outer bounds on the capacity of interference channels," IEEE Trans. Inform. Theory, vol. 29, pp. 602–606, July 1983.
[14] G. Kramer, "Outer bounds on the capacity of Gaussian interference channels," IEEE Trans. Inform. Theory, vol. 50, pp. 581–586, Mar. 2004.
[15] R. H. Etkin, D. N. C. Tse, and H. Wang, "Gaussian interference channel capacity to within one bit," submitted to the IEEE Trans. Inform. Theory, 2007.
[16] E. Telatar and D. Tse, "Bounds on the capacity region of a class of interference channels," in Proc. IEEE International Symposium on Information Theory 2007, Nice, France, Jun. 2007.
[17] T. Liu and P. Viswanath, "An extremal inequality motivated by multiterminal information-theoretic problems," IEEE Trans. Inform. Theory, vol. 53, no. 5, pp. 1839–1851, May 2007.
[18] H. Sato, "On degraded Gaussian two-user channels," IEEE Trans. Inform. Theory, vol. 24, pp. 634–640, Sept. 1978.
[19] H. Weingarten, Y. Steinberg, and S. Shamai (Shitz), "The capacity region of the Gaussian multiple-input multiple-output broadcast channel," IEEE Trans. Inform. Theory, vol. 52, no. 9, pp. 3936–3964, Sep. 2006.
[20] I. Sason, "On achievable rate regions for the Gaussian interference channels," IEEE Trans. Inform. Theory, vol. 50, pp. 1345–1356, June 2004.