Strong Secrecy for Cooperative Broadcast Channels

Ziv Goldfeld, Gerhard Kramer, Haim H. Permuter and Paul Cuff

arXiv:1601.01286v2 [cs.IT] 17 Aug 2016

Abstract

A broadcast channel (BC) where the decoders cooperate via a one-sided link is considered. One common and two private messages are transmitted, and the private message to the cooperative user should be kept secret from the cooperation-aided user. The secrecy level is measured in terms of strong secrecy, i.e., a vanishing information leakage. An inner bound on the capacity region is derived by using a channel-resolvability-based code that double-bins the codebook of the secret message, and by using a likelihood encoder to choose the transmitted codeword. The inner bound is shown to be tight for semi-deterministic and physically degraded BCs, and the results are compared to those of the corresponding BCs without a secrecy constraint. Blackwell and Gaussian BC examples illustrate the impact of secrecy on the rate regions. Unlike the case without secrecy, where sharing information about both private messages via the cooperative link is optimal, our protocol conveys parts of the common and non-confidential messages only. This restriction reduces the transmission rates more than the usual rate loss due to secrecy requirements. An example that shows this loss can be strict is also provided.

Index Terms

Broadcast channel, channel resolvability, conferencing, cooperation, likelihood encoder, physical-layer security, strong secrecy.

I. INTRODUCTION

User cooperation and security are two essential aspects of modern communication systems. Cooperation can increase transmission rates, whereas security requirements can limit these rates.
To shed light on the interaction between these two phenomena, we study broadcast channels (BCs) with one-sided decoder cooperation and one confidential message (Fig. 1). Cooperation is modeled as conferencing, i.e., information exchange via a rate-limited link that extends from one receiver (referred to as the cooperative receiver) to the other (the cooperation-aided receiver). The cooperative receiver possesses confidential information that should be kept secret from the other user.

The work of Z. Goldfeld and H. H. Permuter was supported by the Israel Science Foundation (grant no. 1012/14), an ERC starting grant and the Cyber Security Research Grant at Ben-Gurion University of the Negev. The work of G. Kramer was supported by an Alexander von Humboldt Professorship endowed by the German Federal Ministry of Education and Research. The work of P. Cuff was supported by the National Science Foundation (grants CCF-1350595 and CCF-1116013) and the Air Force Office of Scientific Research (grants FA9550-15-1-0180 and FA9550-12-1-0196). This paper was presented in part at the 2015 IEEE International Symposium on Information Theory, Hong Kong, and in part at the 2016 International Zurich Seminar on Communications, Zurich, Switzerland.

[Fig. 1: Cooperative BCs with one confidential message. An encoder maps $(M_0, M_1, M_2)$ to $X$; the channel $Q_{Y_1,Y_2|X}$ delivers $Y_1$ to Decoder 1, which outputs $(\hat{M}_0^{(1)}, \hat{M}_1)$, and $Y_2$ to Decoder 2, which outputs $(\hat{M}_0^{(2)}, \hat{M}_2)$; a rate-limited link carries $M_{12}$ from Decoder 1 to Decoder 2.]

Secret communication over noisy channels was modeled by Wyner, who introduced the degraded wiretap channel (WTC) and derived its secrecy-capacity [1]. Wyner's wiretap code relied on a capacity-based approach, i.e., the code is a union of subcodes that operate just below the capacity of the eavesdropper's channel. Csiszár and Körner [2] generalized Wyner's result to a general BC.
Multiuser settings with secrecy have since been extensively treated in the literature. Broadcast and interference channels with two confidential messages were studied in [3]–[7]. Gaussian multiple-input multiple-output (MIMO) BCs and WTCs were studied in [8]–[13], while [14]–[16] focus on BCs with an eavesdropper as an external entity from which all messages are kept secret.

The above papers consider the weak secrecy metric, i.e., a vanishing information leakage rate to the eavesdropper. Although the leakage rate vanishes asymptotically with the blocklength, the eavesdropper can decipher an increasing number of bits of the confidential message. This drawback was highlighted in [17]–[19] (see also [20]), which advocated using the (unnormalized) information leakage as a secrecy measure, referred to as strong secrecy. We consider strong secrecy by relying on work by Csiszár [20] and Hayashi [21] to relate the coding mechanism for secrecy to channel resolvability.

The problem of channel resolvability, closely related to the early work of Wyner [22], was formulated by Han and Verdú [23] in terms of total variation (TV). Recently, [24] advocated replacing the TV metric with unnormalized relative entropy. In [25], the coding mechanism for the resolvability problem was extended to various scenarios under the name soft-covering lemma. These extensions were used to design secure communication protocols for several source coding problems under different secrecy measures [26]–[29]. A resolvability-based wiretap code associates with each message a subcode that operates just above the resolvability of the eavesdropper's channel. Using such constructions, [30] extended the results of [2] to strong secrecy for continuous random variables and channels with memory. In [31] (see also [32, Remark 2.2]), resolvability-based codes were used to establish the strong secrecy-capacities of the discrete and memoryless (DM) WTC and the DM-BC with confidential messages by using a metric called effective secrecy.
Our inner bound on the strong secrecy-capacity region of the cooperative BC is based on a resolvability-based Marton code. Specifically, we consider a state-dependent channel over which an encoder with non-causal access to the state sequence aims to make the conditional probability mass function (PMF) of the channel output given the state a product PMF. The resolvability code coordinates the transmitted codeword with the state sequence by means of multicoding, i.e., by associating with every message a bin that contains enough codewords to ensure joint encoding (similar to a Gelfand-Pinsker codebook). Most encoders use joint typicality tests to determine the transmitted codeword. We adopt the likelihood encoder, recently proposed as a coding strategy for source coding problems [33], as our multicoding mechanism. Doing so significantly simplifies the distribution approximation analysis. We prove that the TV between the induced output PMF and the target product PMF approaches zero exponentially fast in the blocklength, which implies convergence in unnormalized relative entropy [34, Theorem 17.3.3].

Next, we construct a BC code in which the relation between the codewords corresponds to the relation between the channel states and the channel inputs in the resolvability problem. To this end we associate with every confidential message a subcode that adheres to the structure of the aforementioned resolvability code. Accordingly, the confidential message codebook is double-binned to allow joint encoding via the likelihood encoder (outer bin layer) and to preserve confidentiality (inner bin layer). The bin sizes are determined by the rate constraints for the resolvability problem, which ensures strong secrecy. The inner bound induced by this coding scheme is shown to be tight for semi-deterministic (SD) and physically-degraded (PD) BCs.

Our protocol uses the cooperation link to convey information about the non-confidential message and the common message.
Without secrecy constraints, the optimal scheme shares information on both private messages as well as the common message [35]. We show that the restricted protocol results in an additional rate loss on top of standard losses due to secrecy. To this end we compare the achievable regions induced by each cooperation strategy for a cooperative BC without secrecy. We show that the restricted protocol does not lose rate when the BC is deterministic or PD, but it is sub-optimal in general.

To the best of our knowledge, we present here the first resolvability-based Marton code. This is also a first demonstration of the likelihood encoder's usefulness in the context of secrecy for channel coding problems. From a broader perspective, our resolvability result is a tool for proving strong secrecy in settings with Marton coding. As a special case, we derive the secrecy-capacity region of the SD-BC (without cooperation) where the message of the deterministic user is confidential, a new result that has merit on its own. The structure of the obtained region provides insight into the effect of secrecy on the coding strategy for BCs. A comparison between the cooperative PD-BC with and without secrecy is also given. The results are visualized by considering a Blackwell BC (BBC) [36], [37] and a Gaussian BC. An explicit strong-secrecy-achieving coding strategy for an extreme point of the BBC region is given. Although the BBC's input is ternary, to maximize the transmission rate of the confidential message only a binary subset of the input's alphabet is used. As a result, a zero-capacity channel is induced to the other user, who, therefore, cannot decode any of the secret bits. Further, we show that in the BBC scenario, an improved subchannel (given by the identity mapping) to the legitimate receiver does not increase the strong secrecy-capacity region.
This paper is organized as follows. Section II provides preliminaries and restates some useful basic properties. In Section III we state a resolvability lemma. Section IV introduces the cooperative BC with one confidential message and gives an inner bound on its strong secrecy-capacity region. The secrecy-capacity regions for the SD and PD scenarios are then characterized. In Section V the effect of secrecy constraints on the optimal cooperation protocol is discussed. Section VI compares the capacity regions of SD- and PD-BCs with and without secrecy. Blackwell and Gaussian BCs visualise the results. Finally, proofs are provided in Section VII, while Section VIII summarizes the main achievements and insights of this work.

II. NOTATION AND PRELIMINARIES

We use the following notations. As customary, $\mathbb{N}$ is the set of natural numbers (which does not include 0), while $\mathbb{R}$ denotes the reals. We further define $\mathbb{R}_+ = \{x \in \mathbb{R} \mid x \geq 0\}$. Given two real numbers $a, b$, we denote by $[a:b]$ the set of integers $\{n \in \mathbb{N} \mid \lceil a \rceil \leq n \leq \lfloor b \rfloor\}$. Calligraphic letters denote discrete sets, e.g., $\mathcal{X}$, while the cardinality of a set $\mathcal{X}$ is denoted by $|\mathcal{X}|$. $\mathcal{X}^n$ stands for the $n$-fold Cartesian product of $\mathcal{X}$. An element of $\mathcal{X}^n$ is denoted by $x^n = (x_1, x_2, \ldots, x_n)$, and its substrings as $x_i^j = (x_i, x_{i+1}, \ldots, x_j)$; when $i = 1$, the subscript is omitted. Whenever the dimension $n$ is clear from the context, vectors (or sequences) are denoted by boldface letters, e.g., $\mathbf{x}$.

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, where $\Omega$ is the sample space, $\mathcal{F}$ is the $\sigma$-algebra and $\mathbb{P}$ is the probability measure. Random variables over $(\Omega, \mathcal{F}, \mathbb{P})$ are denoted by uppercase letters, e.g., $X$, with conventions for random vectors similar to those for deterministic sequences. Namely, $X_i^j$ represents the sequence of random variables $(X_i, X_{i+1}, \ldots, X_j)$, while $\mathbf{X}$ stands for $X^n$. The probability of an event $\mathcal{A} \in \mathcal{F}$ is denoted by $\mathbb{P}(\mathcal{A})$, while $\mathbb{P}(\mathcal{A}|\mathcal{B})$ denotes the conditional probability of $\mathcal{A}$ given $\mathcal{B}$. We use $\mathbb{1}_{\mathcal{A}}$ to denote the indicator function of $\mathcal{A}$.

The set of all probability mass functions (PMFs) on a finite set $\mathcal{X}$ is denoted by $\mathcal{P}(\mathcal{X})$. PMFs are denoted by the capital letter $P$, with a subscript that identifies the random variable and its possible conditioning. For example, for two random variables $X$ and $Y$ we use $P_X$, $P_{X,Y}$ and $P_{X|Y}$ to denote, respectively, the marginal PMF of $X$, the joint PMF of $(X,Y)$ and the conditional PMF of $X$ given $Y$. In particular, $P_{X|Y}$ represents the stochastic matrix whose entries are $P_{X|Y}(x|y) = \mathbb{P}(X = x \mid Y = y)$. We omit subscripts if the arguments of the PMF are lowercase versions of the random variables. The support of a PMF $P$ and the expectation of a random variable $X$ are denoted by $\mathrm{supp}(P)$ and $\mathbb{E}X$, respectively.

For a countable measurable space $(\Omega, \mathcal{F})$, a PMF $Q \in \mathcal{P}(\Omega)$ gives rise to a probability measure on $(\Omega, \mathcal{F})$, which we denote by $\mathbb{P}_Q$; accordingly, $\mathbb{P}_Q(\mathcal{A}) = \sum_{\omega \in \mathcal{A}} Q(\omega)$ for every $\mathcal{A} \in \mathcal{F}$. For a sequence of random variables $X^n$ we also use the following: If the entries of $X^n$ are drawn in an independent and identically distributed (i.i.d.) manner according to $P_X$, then for every $\mathbf{x} \in \mathcal{X}^n$ we have $P_{X^n}(\mathbf{x}) = \prod_{i=1}^n P_X(x_i)$ and we write $P_{X^n}(\mathbf{x}) = P_X^n(\mathbf{x})$. Similarly, if for every $(\mathbf{x}, \mathbf{y}) \in \mathcal{X}^n \times \mathcal{Y}^n$ we have $P_{Y^n|X^n}(\mathbf{y}|\mathbf{x}) = \prod_{i=1}^n P_{Y|X}(y_i|x_i)$, then we write $P_{Y^n|X^n}(\mathbf{y}|\mathbf{x}) = P_{Y|X}^n(\mathbf{y}|\mathbf{x})$. We often use $Q_X^n$ or $Q_{Y|X}^n$ when referring to an i.i.d. sequence of random variables. The conditional product PMF $Q_{Y|X}^n$ given a specific sequence $\mathbf{x} \in \mathcal{X}^n$ is denoted by $Q_{Y|X=\mathbf{x}}^n$.

The empirical PMF $\nu_{\mathbf{x}}$ of a sequence $\mathbf{x} \in \mathcal{X}^n$ is

$$\nu_{\mathbf{x}}(a) \triangleq \frac{N(a|\mathbf{x})}{n}, \qquad (1)$$

where $N(a|\mathbf{x}) = \sum_{i=1}^n \mathbb{1}_{\{x_i = a\}}$. We use $\mathcal{T}_\epsilon^n(P_X)$ to denote the set of letter-typical sequences of length $n$ with respect to the PMF $P_X \in \mathcal{P}(\mathcal{X})$ and the non-negative number $\epsilon$ [38, Ch. 3], [39], i.e., we have

$$\mathcal{T}_\epsilon^n(P_X) = \Big\{ \mathbf{x} \in \mathcal{X}^n : \big| \nu_{\mathbf{x}}(a) - P_X(a) \big| \leq \epsilon P_X(a), \ \forall a \in \mathcal{X} \Big\}. \qquad (2)$$

Definition 1 (Total Variation and Relative Entropy) Let $P, Q \in \mathcal{P}(\mathcal{X})$ be two PMFs on a countable sample space¹ $\mathcal{X}$.
The TV and the relative entropy between $P$ and $Q$ are

$$||P - Q||_{TV} = \frac{1}{2} \sum_{x \in \mathcal{X}} \big| P(x) - Q(x) \big| \qquad (3)$$

and

$$D(P||Q) = \sum_{x \in \mathrm{supp}(P)} P(x) \log \frac{P(x)}{Q(x)}, \qquad (4)$$

respectively.

We often make use of the relative entropy chain rule, which reads as follows: For two PMFs $P_{X,Y}, Q_{X,Y} \in \mathcal{P}(\mathcal{X} \times \mathcal{Y})$, we have

$$D\big(P_{X,Y} \,\big|\big|\, Q_{X,Y}\big) = D\big(P_X \,\big|\big|\, Q_X\big) + D\big(P_{Y|X} \,\big|\big|\, Q_{Y|X} \,\big|\, P_X\big), \qquad (5)$$

where $D\big(P_{Y|X} \,\big|\big|\, Q_{Y|X} \,\big|\, P_X\big) = \sum_{x \in \mathcal{X}} P_X(x) D\big(P_{Y|X=x} \,\big|\big|\, Q_{Y|X=x}\big)$.

Remark 1 Pinsker's inequality shows that relative entropy is larger than TV. A reverse inequality is sometimes valid. For example, if $P \ll Q$ (i.e., $P$ is absolutely continuous with respect to $Q$), and $Q$ is an i.i.d. discrete distribution of $n$ variables, then (see [25, Equation (29)])²

$$D(P||Q) \in O\left( \left[ n + \log \frac{1}{||P - Q||_{TV}} \right] ||P - Q||_{TV} \right). \qquad (6)$$

In particular, (6) implies that an exponential decay of the TV in $n$ produces an exponential decay of the informational divergence with the same exponent.

III. A CHANNEL RESOLVABILITY LEMMA FOR STRONG SECRECY

Consider a state-dependent discrete memoryless channel (DMC) over which an encoder with non-causal access to the i.i.d. state sequence transmits a codeword (Fig. 2). Each channel state is a pair $(S_0, S)$ of random variables drawn according to $Q_{S_0,S} \in \mathcal{P}(\mathcal{S}_0 \times \mathcal{S})$. The encoder superimposes its codebook on $S_0$ and then uses a likelihood encoder with respect to $S$ to choose the channel input sequence. The structure of a subcode that is superimposed on some $\mathbf{s}_0 \in \mathcal{S}_0^n$ is also illustrated in Fig. 2. The conditional PMF of the channel output, given the states, should approximate a conditional product distribution in terms of unnormalized relative entropy. A formal description of the setup is as follows.
¹Countable sample spaces are assumed throughout this work.
²$f(n) \in O\big(g(n)\big)$ means that $f(n) \leq k \cdot g(n)$, for some $k$ independent of $n$ and sufficiently large $n$.

[Fig. 2: Coding problem for approximating $P^{(\mathcal{B}_n)}_{V|S_0,S} \approx Q^n_{V|S_0,S}$ under a resolvability codebook that is superimposed on $\mathbf{s}_0 \in \mathcal{S}_0^n$: For each $\mathbf{s}_0 \in \mathcal{S}_0^n$, the codebook $\mathcal{B}_n(\mathbf{s}_0)$ contains $2^{n(\tilde{R}+R')}$ $u$-codewords drawn independently according to $Q^n_{U|S_0=\mathbf{s}_0}$. The codewords are partitioned into $2^{n\tilde{R}}$ bins, each associated with a certain $w \in [1:2^{n\tilde{R}}]$. To transmit $W = w$, the likelihood encoder from (7) is used to choose a $u$-codeword from the $w$th bin.]

Let $\mathcal{S}_0$, $\mathcal{S}$, $\mathcal{U}$ and $\mathcal{V}$ be finite sets. Fix any $Q_{S_0,S,U,V} \in \mathcal{P}(\mathcal{S}_0 \times \mathcal{S} \times \mathcal{U} \times \mathcal{V})$ and let $W$ be a random variable uniformly distributed over $\mathcal{W} = [1:2^{n\tilde{R}}]$ that is independent of $(S_0, S) \sim Q^n_{S_0,S}$.

Codebook: For every $\mathbf{s}_0 \in \mathcal{S}_0^n$, let $\mathrm{B}_n(\mathbf{s}_0) \triangleq \big\{ \mathbf{U}(\mathbf{s}_0, w, i) \big\}_{(w,i) \in \mathcal{W} \times \mathcal{I}}$, where $\mathcal{I} = [1:2^{nR'}]$, be a collection of³ $2^{n(\tilde{R}+R')}$ conditionally independent random vectors of length $n$, each distributed according to $Q^n_{U|S_0=\mathbf{s}_0}$. A realization of $\mathrm{B}_n(\mathbf{s}_0)$, for $\mathbf{s}_0 \in \mathcal{S}_0^n$, is denoted by $\mathcal{B}_n(\mathbf{s}_0) \triangleq \big\{ \mathbf{u}(\mathbf{s}_0, w, i, \mathcal{B}_n) \big\}_{(w,i) \in \mathcal{W} \times \mathcal{I}}$. Each codebook $\mathcal{B}_n(\mathbf{s}_0)$ can be thought of as comprising $2^{n\tilde{R}}$ bins, each associated with a different message $w \in \mathcal{W}$ and containing $2^{nR'}$ $u$-codewords. We also denote $\mathrm{B}_n \triangleq \big\{ \mathrm{B}_n(\mathbf{s}_0) \big\}_{\mathbf{s}_0 \in \mathcal{S}_0^n}$, which is referred to as the random resolvability codebook, and use $\mathcal{B}_n$ for its realization.

Encoding and Induced PMF: Consider the likelihood encoder described by the conditional PMF

$$\hat{P}^{(\mathcal{B}_n)}(i|w,\mathbf{s}_0,\mathbf{s}) = \frac{Q^n_{S|U,S_0}\big(\mathbf{s} \,\big|\, \mathbf{u}(\mathbf{s}_0,w,i,\mathcal{B}_n), \mathbf{s}_0\big)}{\sum_{i' \in \mathcal{I}} Q^n_{S|U,S_0}\big(\mathbf{s} \,\big|\, \mathbf{u}(\mathbf{s}_0,w,i',\mathcal{B}_n), \mathbf{s}_0\big)}. \qquad (7)$$

Upon observing $(w, \mathbf{s}_0, \mathbf{s})$, an index $i \in \mathcal{I}$ is drawn randomly according to (7).
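As a concrete toy illustration of the selection rule in (7), the following sketch implements a likelihood encoder over binary alphabets. The PMF $Q_{S|U,S_0}$, the blocklength, and the bin size below are hypothetical choices made only for illustration; this is not the construction used in the proofs.

```python
import random

# Hypothetical Q_{S|U,S0}: (u, s0) -> PMF of the binary state letter s.
Q_S_given_U_S0 = {
    (0, 0): [0.9, 0.1],
    (0, 1): [0.6, 0.4],
    (1, 0): [0.2, 0.8],
    (1, 1): [0.5, 0.5],
}

def likelihood_encoder(bin_w, s0, s, rng):
    """Sample an index i with probability proportional to
    Q^n_{S|U,S0}(s | u(s0, w, i), s0), mimicking (7)."""
    weights = []
    for u in bin_w:  # the u-codewords in the bin of message w
        lik = 1.0
        for u_t, s0_t, s_t in zip(u, s0, s):
            lik *= Q_S_given_U_S0[(u_t, s0_t)][s_t]
        weights.append(lik)
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, wt in enumerate(weights):
        acc += wt
        if r <= acc:
            return i
    return len(weights) - 1

rng = random.Random(0)
n = 8
s0 = [rng.randrange(2) for _ in range(n)]   # superposition state S0^n
s = [rng.randrange(2) for _ in range(n)]    # state S^n seen by the encoder
# One bin with 2^{nR'} = 4 codewords, drawn i.i.d. for simplicity
bin_w = [[rng.randrange(2) for _ in range(n)] for _ in range(4)]
i_star = likelihood_encoder(bin_w, s0, s, rng)
print(i_star)
```

Codewords whose letters are likely to have produced the observed state sequence receive proportionally larger weight, which is what coordinates the chosen codeword with the states.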
The codeword $\mathbf{u}(\mathbf{s}_0, w, i, \mathcal{B}_n)$ is passed through the DMC $Q^n_{V|U,S_0,S}$. For a fixed codebook $\mathcal{B}_n$, the induced joint distribution is

$$P^{(\mathcal{B}_n)}(\mathbf{s}_0,\mathbf{s},w,i,\mathbf{u},\mathbf{v}) = Q^n_{S_0,S}(\mathbf{s}_0,\mathbf{s})\, 2^{-n\tilde{R}}\, \hat{P}^{(\mathcal{B}_n)}(i|w,\mathbf{s}_0,\mathbf{s})\, \mathbb{1}_{\{\mathbf{u}(\mathbf{s}_0,w,i,\mathcal{B}_n)=\mathbf{u}\}}\, Q^n_{V|U,S_0,S}(\mathbf{v}|\mathbf{u},\mathbf{s}_0,\mathbf{s}). \qquad (8)$$

Lemma 1 (Sufficient Conditions for Approximation) For any $Q_{S_0,S,U,V} \in \mathcal{P}(\mathcal{S}_0 \times \mathcal{S} \times \mathcal{U} \times \mathcal{V})$, if $(\tilde{R}, R') \in \mathbb{R}_+^2$ satisfies

$$R' > I(U;S|S_0) \qquad (9a)$$
$$R' + \tilde{R} > I(U;S,V|S_0), \qquad (9b)$$

then

$$\mathbb{E}_{\mathrm{B}_n} D\Big( P^{(\mathrm{B}_n)}_{V|S_0,S} \,\Big|\Big|\, Q^n_{V|S_0,S} \,\Big|\, Q^n_{S_0,S} \Big) \xrightarrow[n \to \infty]{} 0. \qquad (10)$$

³To simplify notation, from here on out we assume that quantities of the form $2^{nR}$, where $n \in \mathbb{N}$ and $R \in \mathbb{R}_+$, are integers. Otherwise, simple modifications of some of the subsequent expressions using floor operations are needed.

The proof of Lemma 1 is given in Section VII-A and it shows that the TV decays exponentially fast with the blocklength $n$. By Remark 1 this implies an exponential decay of the desired relative entropy. Another useful property is that the chosen $u$-codeword is jointly letter-typical with $(S_0, S)$ with high probability.

Lemma 2 (Typical with High Probability) If $(\tilde{R}, R') \in \mathbb{R}_+^2$ satisfies (9), then for any $w \in \mathcal{W}$ and $\epsilon > 0$, we have

$$\mathbb{E}_{\mathrm{B}_n} \mathbb{P}_{P^{(\mathrm{B}_n)}}\Big( \big(\mathbf{S}_0, \mathbf{S}, \mathbf{U}(\mathbf{S}_0, w, I)\big) \notin \mathcal{T}_\epsilon^n(Q_{S_0,S,U}) \Big) \xrightarrow[n \to \infty]{} 0. \qquad (11)$$

The proof of Lemma 2 is given in Section VII-B.

IV. COOPERATIVE BROADCAST CHANNELS WITH ONE CONFIDENTIAL MESSAGE

A. Problem Definition

The cooperative DM-BC with one confidential message is illustrated in Fig. 1. The channel has one sender and two receivers. The sender chooses a triple $(m_0, m_1, m_2)$ of indices uniformly and independently from the set $[1:2^{nR_0}] \times [1:2^{nR_1}] \times [1:2^{nR_2}]$ and maps it to a sequence $\mathbf{x} \in \mathcal{X}^n$. The sequence $\mathbf{x}$ is transmitted over a BC with transition probability $Q_{Y_1,Y_2|X} : \mathcal{X} \to \mathcal{P}(\mathcal{Y}_1 \times \mathcal{Y}_2)$. If $Q_{Y_1,Y_2|X}$ factors as $\mathbb{1}_{\{Y_1=g(X)\}} Q_{Y_2|X}$, for some deterministic $g : \mathcal{X} \to \mathcal{Y}_1$, or as $Q_{Y_1|X} Q_{Y_2|Y_1}$, then we call the BC SD or PD, respectively.
Furthermore, a BC is said to be deterministic if $Q_{Y_1,Y_2|X} = \mathbb{1}_{\{Y_1=g(X)\} \cap \{Y_2=h(X)\}}$, for some deterministic functions $g$ as before and $h : \mathcal{X} \to \mathcal{Y}_2$. The output sequence $\mathbf{y}_j \in \mathcal{Y}_j^n$, where $j = 1, 2$, is received by decoder $j$. Decoder $j$ produces a pair of estimates $\big(\hat{m}_0^{(j)}, \hat{m}_j\big)$ of $(m_0, m_j)$. Furthermore, the message $m_1$ is to be kept secret from Decoder 2. There is a one-sided noiseless cooperation link of rate $R_{12}$ from Decoder 1 to Decoder 2. By conveying a message $m_{12} \in [1:2^{nR_{12}}]$ over this link, Decoder 1 can share with Decoder 2 information about $\mathbf{y}_1$, $\big(\hat{m}_0^{(1)}, \hat{m}_1\big)$, or both.

Definition 2 (Code) An $(n, R_{12}, R_0, R_1, R_2)$ code $c_n$ for the BC with cooperation and one confidential message has:

1) Four message sets $\mathcal{M}_{12} = [1:2^{nR_{12}}]$ and $\mathcal{M}_j = [1:2^{nR_j}]$, for $j = 0, 1, 2$.
2) A stochastic encoder $f : \mathcal{M}_0 \times \mathcal{M}_1 \times \mathcal{M}_2 \to \mathcal{P}(\mathcal{X}^n)$.
3) A decoder cooperation function $g_{12} : \mathcal{Y}_1^n \to \mathcal{M}_{12}$.
4) Two decoding functions $\phi_1 : \mathcal{Y}_1^n \to \mathcal{M}_0 \times \mathcal{M}_1$ and $\phi_2 : \mathcal{M}_{12} \times \mathcal{Y}_2^n \to \mathcal{M}_0 \times \mathcal{M}_2$.

The joint distribution induced by an $(n, R_{12}, R_0, R_1, R_2)$ code $c_n$ is:

$$P^{(c_n)}\Big(m_0, m_1, m_2, \mathbf{x}, \mathbf{y}_1, \mathbf{y}_2, m_{12}, \big(\hat{m}_0^{(1)}, \hat{m}_1\big), \big(\hat{m}_0^{(2)}, \hat{m}_2\big)\Big) = 2^{-n(R_0+R_1+R_2)} f(\mathbf{x}|m_0, m_1, m_2)\, Q^n_{Y_1,Y_2|X}(\mathbf{y}_1, \mathbf{y}_2|\mathbf{x})\, \mathbb{1}_{\{m_{12}=g_{12}(\mathbf{y}_1)\} \cap \{(\hat{m}_0^{(1)},\hat{m}_1)=\phi_1(\mathbf{y}_1)\} \cap \{(\hat{m}_0^{(2)},\hat{m}_2)=\phi_2(m_{12},\mathbf{y}_2)\}}. \qquad (12)$$

The performance of $c_n$ is evaluated in terms of its rate tuple $(R_{12}, R_0, R_1, R_2)$, the average decoding error probability and the strong secrecy metric.

Definition 3 (Error Probability) The average error probability for an $(n, R_{12}, R_0, R_1, R_2)$ code $c_n$ is

$$P_e(c_n) = \mathbb{P}_{P^{(c_n)}}\Big( \big(\hat{M}_0^{(1)}, \hat{M}_0^{(2)}, \hat{M}_1, \hat{M}_2\big) \neq \big(M_0, M_0, M_1, M_2\big) \Big),$$

where $\big(\hat{M}_0^{(1)}, \hat{M}_1\big) = \phi_1(\mathbf{Y}_1)$ and $\big(\hat{M}_0^{(2)}, \hat{M}_2\big) = \phi_2\big(g_{12}(\mathbf{Y}_1), \mathbf{Y}_2\big)$.
Definition 4 (Information Leakage) The information leakage at receiver 2 under an $(n, R_{12}, R_0, R_1, R_2)$ code $c_n$ is

$$L(c_n) = I(M_1; M_{12}, Y_2^n), \qquad (13)$$

where the mutual information is calculated with respect to the marginal PMF $P^{(c_n)}_{M_1,M_{12},Y_2^n}$ induced by (12).

Definition 5 (Achievability) A rate tuple $(R_{12}, R_0, R_1, R_2) \in \mathbb{R}_+^4$ is achievable if for any $\epsilon > 0$ there is an $(n, R_{12}, R_0, R_1, R_2)$ code $c_n$, such that for any $n$ sufficiently large

$$P_e(c_n) \leq \epsilon \qquad (14a)$$
$$L(c_n) \leq \epsilon. \qquad (14b)$$

The strong secrecy-capacity region $\mathcal{C}_S$ is the closure of the set of the achievable rates.

B. Strong Secrecy-Capacity Bounds and Results

We state an inner bound on the strong secrecy-capacity region $\mathcal{C}_S$ of a cooperative BC with one confidential message.

Theorem 1 (Inner Bound) Let $Q_{Y_1,Y_2|X}$ be a BC and let $\mathcal{R}_I$ be the closure of the union of rate tuples $(R_{12}, R_0, R_1, R_2) \in \mathbb{R}_+^4$ satisfying:

$$R_1 \leq I(U_1; Y_1 | U_0) - I(U_1; U_2, Y_2 | U_0) \qquad (15a)$$
$$R_0 + R_1 \leq I(U_0, U_1; Y_1) - I(U_1; U_2, Y_2 | U_0) \qquad (15b)$$
$$R_0 + R_2 \leq I(U_0, U_2; Y_2) + R_{12} \qquad (15c)$$
$$R_0 + R_1 + R_2 \leq I(U_0, U_1; Y_1) + I(U_2; Y_2 | U_0) - I(U_1; U_2, Y_2 | U_0) \qquad (15d)$$

where the union is over all PMFs $Q_{U_0,U_1,U_2,X}$, each inducing a joint distribution $Q_{U_0,U_1,U_2,X} Q_{Y_1,Y_2|X}$. Then the following inclusion holds:

$$\mathcal{R}_I \subseteq \mathcal{C}_S. \qquad (16)$$

Furthermore, $\mathcal{R}_I$ is convex and one may choose $|\mathcal{U}_0| \leq |\mathcal{X}| + 5$, $|\mathcal{U}_1| \leq |\mathcal{X}|$ and $|\mathcal{U}_2| \leq |\mathcal{X}|$.

The proof of Theorem 1 relies on a channel-resolvability-based Marton code and is given in Section VII-C. Two key ingredients allow us to keep $M_1$ secret while still utilizing the cooperation link to help Receiver 2. First, the cooperation strategy is modified compared to the case without secrecy that was studied in [35], where $M_{12}$ conveyed information about both private messages as well as the common message.
Here, the confidentiality of $M_1$ restricts the cooperation message from containing any information about $M_1$, and therefore, we use an $M_{12}$ that is a function of $(M_0, M_2)$ only. Since the protocol requires Receiver 1 to decode the information it shares with Receiver 2, this modified cooperation strategy results in a rate loss in $R_1$ when compared to [35]; the loss is expressed in the first mutual information term in (15a) being conditioned on $U_0$ rather than having $U_0$ next to $U_1$.

The second ingredient is associating with each $m_1 \in \mathcal{M}_1$ a resolvability-subcode that adheres to the construction for Lemmas 1 and 2 described in Section III. By doing so, the relations between the codewords in the Marton code correspond to those between the channel states and its input in the resolvability problem. Marton coding combines superposition coding and binning, hence the different roles the state sequences $S_0$ and $S$ play in our resolvability setup. Reliability is established with the help of Lemma 2, while invoking Lemma 1 ensures strong secrecy.

The inner bound from Theorem 1 is tight for SD- and PD-BCs, giving rise to the new strong secrecy-capacity results stated in Theorems 2 and 3.

Theorem 2 (SD-BC Secrecy-Capacity) The strong secrecy-capacity region $\mathcal{C}_S^{(SD)}$ of a cooperative SD-BC $\mathbb{1}_{\{Y_1=g(X)\}} Q_{Y_2|X}$ with one confidential message is the closure of the union of rate tuples $(R_{12}, R_0, R_1, R_2) \in \mathbb{R}_+^4$ satisfying:

$$R_1 \leq H(Y_1 | W, V, Y_2) \qquad (17a)$$
$$R_0 + R_1 \leq H(Y_1 | W, V, Y_2) + I(W; Y_1) \qquad (17b)$$
$$R_0 + R_2 \leq I(W, V; Y_2) + R_{12} \qquad (17c)$$
$$R_0 + R_1 + R_2 \leq H(Y_1 | W, V, Y_2) + I(V; Y_2 | W) + I(W; Y_1) \qquad (17d)$$

where the union is over all PMFs $Q_{W,V,Y_1,X}$ with $Y_1 = g(X)$, each inducing a joint distribution $Q_{W,V,Y_1,X} Q_{Y_2|X}$. Furthermore, $\mathcal{C}_S^{(SD)}$ is convex and one may choose $|\mathcal{W}| \leq |\mathcal{X}| + 3$ and $|\mathcal{V}| \leq |\mathcal{X}|$.

The direct part of Theorem 2 follows from Theorem 1 by setting $U_0 = W$, $U_1 = Y_1$ and $U_2 = V$. The converse is proven in Section VII-D.
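For intuition about the quantities in (17), the right-hand sides of the four bounds can be evaluated numerically for a toy SD-BC. The channel, the auxiliaries $(W, V)$ and the value of $R_{12}$ below are assumptions made purely for illustration, not choices from the paper: $W, V$ are independent uniform bits, $X = (W \oplus V, B)$ with a fresh uniform bit $B$, $Y_1 = g(X) = X$, and $Y_2$ is the first component of $X$ through a BSC.

```python
from itertools import product
from math import log2

def marginal(joint, coords):
    """Marginal PMF of the selected coordinate groups of a joint PMF dict."""
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[c] for c in coords)
        m[key] = m.get(key, 0.0) + p
    return m

def H(joint, coords):
    return -sum(p * log2(p) for p in marginal(joint, coords).values() if p > 0)

def Hc(joint, a, b):    # conditional entropy H(A | B)
    return H(joint, a + b) - H(joint, b)

def I(joint, a, b):     # mutual information I(A ; B)
    return H(joint, a) + H(joint, b) - H(joint, a + b)

def Ic(joint, a, b, c): # conditional mutual information I(A ; B | C)
    return Hc(joint, a, c) + Hc(joint, b, c) - Hc(joint, a + b, c)

# Coordinates: 0 = W, 1 = V, 2 = Y1, 3 = Y2.
# Y1 = 2*(w XOR v) + b encodes X; Y2 = (w XOR v) through a BSC(0.1).
eps = 0.1
joint = {}
for w, v, b, y2 in product(range(2), repeat=4):
    y1 = 2 * (w ^ v) + b
    p = 0.125 * ((1 - eps) if y2 == (w ^ v) else eps)
    joint[(w, v, y1, y2)] = joint.get((w, v, y1, y2), 0.0) + p

R12 = 0.5  # hypothetical cooperation rate
rhs_17a = Hc(joint, [2], [0, 1, 3])                          # H(Y1|W,V,Y2)
rhs_17b = rhs_17a + I(joint, [0], [2])                       # + I(W;Y1)
rhs_17c = I(joint, [0, 1], [3]) + R12                        # I(W,V;Y2) + R12
rhs_17d = rhs_17a + Ic(joint, [1], [3], [0]) + I(joint, [0], [2])
print(rhs_17a, rhs_17b, rhs_17c, rhs_17d)
```

For this choice, the secret rate bound (17a) equals exactly 1 bit: the fresh bit $B$ is visible in $Y_1$ but carries no information to receiver 2, which is the kind of residual randomness the scheme exploits.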
Theorem 3 (PD-BC Secrecy-Capacity) The strong secrecy-capacity region $\mathcal{C}_S^{(PD)}$ of a cooperative PD-BC $Q_{Y_1|X} Q_{Y_2|Y_1}$ with one confidential message is the closure of the union of rate tuples $(R_{12}, R_0, R_1, R_2) \in \mathbb{R}_+^4$ satisfying:

$$R_1 \leq I(X; Y_1 | W) - I(X; Y_2 | W) \qquad (18a)$$
$$R_0 + R_2 \leq I(W; Y_2) + R_{12} \qquad (18b)$$
$$R_0 + R_1 + R_2 \leq I(X; Y_1) - I(X; Y_2 | W) \qquad (18c)$$

where the union is over all PMFs $Q_{W,X}$, each inducing a joint distribution $Q_{W,X} Q_{Y_1|X} Q_{Y_2|Y_1}$. Furthermore, $\mathcal{C}_S^{(PD)}$ is convex and one may choose $|\mathcal{W}| \leq |\mathcal{X}| + 2$.

The achievability of $\mathcal{C}_S^{(PD)}$ is a consequence of Theorem 1 by taking $U_0 = W$, $U_1 = X$ and $U_2 = 0$. For the converse see Section VII-E.

Remark 2 (Converse) We use two distinct converse proofs for Theorems 2 and 3. In the converse of Theorem 2, the bound in (17d) does not involve $R_{12}$ since the auxiliary random variable $W_i$ contains $M_{12}$. With respect to this choice of $W_i$, showing that $W_i - X - (Y_1, Y_2)$ forms a Markov chain relies on the SD property of the channel. For the PD-BC, however, such an auxiliary is not feasible as it violates the Markov relation $W - X - Y_1 - Y_2$ induced by the channel. To circumvent this, in the converse of Theorem 3 we define $W_i$ without $M_{12}$ and use the structure of the channel to keep $R_{12}$ from appearing in (18c). Specifically, this argument relies on the relation $M_{12} = f_{12}(Y_1)$ and the fact that $Y_2$ is a degraded version of $Y_1$, implying that all three messages $(M_0, M_1, M_2)$ are reliably decodable from $Y_1$ only.

Remark 3 (Weak versus Strong Secrecy) The results of Theorems 1, 2 and 3 remain unchanged if the strong secrecy requirement (see (13) and (14b)) is replaced with the weak secrecy constraint. As weak secrecy refers to a vanishing normalized information leakage, to formally define the corresponding achievability, one should replace the left-hand side (LHS) of (14b) with $\frac{1}{n} L(c_n)$. To see that the results of the preceding theorems coincide under both metrics, first notice that strong secrecy implies weak secrecy (which validates the claim from Theorem 1).
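The gap between the two metrics can be seen with a hypothetical leakage sequence (an assumption for illustration, not a code from the paper): $L(c_n) = \sqrt{n}$ has a vanishing leakage rate $L(c_n)/n$, so it meets the weak constraint, yet the total leaked information grows without bound and the strong constraint fails.

```python
from math import sqrt

# Hypothetical leakage L(n) = sqrt(n): weakly secure, not strongly secure.
blocklengths = [10 ** k for k in (2, 4, 6)]
leakage = [sqrt(n) for n in blocklengths]            # L(n), grows unboundedly
leakage_rate = [sqrt(n) / n for n in blocklengths]   # L(n)/n, decays to 0
print(leakage)
print(leakage_rate)
```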
Furthermore, the converse proofs of Theorems 2 and 3 (given in Sections VII-D and VII-E, respectively) are readily reformulated under the weak secrecy metric by replacing $\epsilon$ with $n\epsilon$ in (66)-(67) and (79)-(80).

Remark 4 (Cardinality Bounds) The cardinality bounds on the auxiliary random variables in Theorems 1, 2 and 3 are established using the perturbation method [40] and the Eggleston-Fenchel-Carathéodory theorem [41, Theorem 18].

V. SUB-OPTIMAL COOPERATION WITHOUT SECRECY

The cooperation protocol for the BC with a secret $M_1$ uses the cooperative link to convey information that is a function of the non-confidential message and the common message. Without secrecy constraints, it was shown in [35] that the best cooperation strategy uses a public message that comprises parts of both private messages as well as the common message. To understand whether the restricted protocol reduces the transmission rates beyond standard losses due to secrecy (which are discussed in Section VI), we compare the achievable regions induced by each scheme for the cooperative BC without secrecy. The formal description of this BC instance (see [35]) closely follows the definitions from Section IV-A up to removing the security requirement (14b) from Definition 5
