A bound on the mutual information, and properties of entropy reduction, for quantum channels with inefficient measurements

Kurt Jacobs
Centre for Quantum Computer Technology, Centre for Quantum Dynamics, School of Science, Griffith University, Nathan 4111, Brisbane, Australia

arXiv:quant-ph/0412006v2, 24 Jan 2006

The Holevo bound is a bound on the mutual information for a given quantum encoding. In 1996 Schumacher, Westmoreland and Wootters [Phys. Rev. Lett. 76, 3452 (1996)] derived a bound which reduces to the Holevo bound for complete measurements, but which is tighter for incomplete measurements. The most general quantum operations may be both incomplete and inefficient. Here we show that the bound derived by SWW can be further extended to obtain one which is yet again tighter for inefficient measurements. This allows us in addition to obtain a generalization of a bound derived by Hall, and to show that the average reduction in the von Neumann entropy during a quantum operation is concave in the initial state, for all quantum operations. This is a quantum version of the concavity of the mutual information. We also show that both this average entropy reduction and the mutual information for pure-state ensembles are Schur-concave for unitarily covariant measurements; that is, for these measurements, information gain increases with initial uncertainty.

PACS numbers: 03.67.-a, 03.65.Ta, 89.70.+c, 02.50.Tt

I. INTRODUCTION

The celebrated Holevo bound, conjectured by Gordon [1] and Levitin [2] and proved by Holevo in 1973 [3], gives a bound on the information which may be transmitted from A to B (strictly, the mutual information, M, between A and B) when A encodes information in a quantum system using a set of states {ρ_i}, chosen with probabilities {P(i)}, and B makes a subsequent measurement upon the system. The Holevo bound is

  M(I:J) ≤ χ ≡ S(ρ) − Σ_i P(i) S(ρ_i),   (1)

where ρ = Σ_i P(i) ρ_i (which we will refer to as the ensemble state). We write the mutual information as M(I:J) to signify that it is the mutual information between the random variables I and J, whose values i and j label respectively the encoding used by A, and the outcome of the measurement made by B. More recent proofs of the Holevo bound may be found in Refs. [4, 5, 6]. The bound is achieved if and only if the encoding states, ρ_i, commute with each other, and the receiver, B, makes a von Neumann measurement in the basis in which they are diagonal. (A von Neumann measurement is one that projects the system onto one of a complete set of mutually orthogonal states. In this case the set of states is chosen to be the basis in which the coding states are diagonal.) With this choice of coding states and measurement the channel is classical, in that it can be implemented with a classical system. The Holevo bound takes into account that the sender may only be able to send mixed states, and this mixing reduces the amount of information that can be transmitted. However, if the receiver is not able to perform measurements which always project the system to a pure state (so-called complete measurements), then in general the information will be further reduced. In 1996 Schumacher, Westmoreland and Wootters showed that when the receiver's measurement is incomplete, it is possible to take this into account and derive a more stringent bound on the information. If the receiver's measurement is the POVM described by the operators {A_j} (with Σ_j A_j†A_j = 1), so that the measurement outcomes are labeled by the index j, then the SWW bound is [6]

  M(I:J) ≤ χ − Σ_j P(j) χ_j,   (2)

where P(j) is the probability of outcome j [7], and χ_j is the Holevo quantity for the ensemble that the system remains in (from the point of view of the receiver), given outcome j. This bound can be at least partially understood by noting that if the system still remains in some ensemble of possible states after the measurement, then future measurements can potentially extract further information about the encoding, and so the information obtained by the first measurement must therefore be less than the maximum possible by at least this amount. What the SWW bound tells us is that the bound on the information is reduced not only by the amount of information which could be further extracted after outcome j has been obtained, but by the Holevo bound on this information, χ_j.
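The bounds in Eqs. (1) and (2) are easy to probe numerically. The following is a minimal sketch (not part of the paper; the two-state qubit encoding and the basis measurement are illustrative assumptions). For a complete rank-1 projective measurement each χ_j vanishes, so the SWW bound coincides with the Holevo bound and checking M(I:J) ≤ χ suffices:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy S(rho) in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def shannon(p):
    p = np.asarray(p, float)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Encoding: two non-orthogonal pure qubit states with equal priors.
P = np.array([0.5, 0.5])
theta = np.pi / 8
kets = [np.array([1.0, 0.0]), np.array([np.cos(theta), np.sin(theta)])]
rhos = [np.outer(k, k) for k in kets]
rho = sum(p * r for p, r in zip(P, rhos))

# Holevo quantity, Eq. (1); S(rho_i) = 0 for pure encoding states.
chi = vn_entropy(rho) - sum(p * vn_entropy(r) for p, r in zip(P, rhos))

# Receiver: complete (rank-1) projective measurement in the z basis.
E = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
Pj_i = np.array([[np.trace(Ej @ ri).real for Ej in E] for ri in rhos])
Pj = P @ Pj_i

# Mutual information M(I:J) = H[J] - sum_i P(i) H[J|i].
M = shannon(Pj) - sum(P[i] * shannon(Pj_i[i]) for i in range(2))

assert 0.0 < M <= chi + 1e-9   # Holevo bound, Eq. (1)
```

For this example χ is roughly 0.23 bits while the basis measurement extracts only about 0.08 bits; the bound is loose here because the encoding states do not commute.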
If the initial state of the system is ρ_i, then after outcome j the final state of the system is given by ρ̃_{j|i} = A_j ρ_i A_j† / Tr[A_j†A_j ρ_i]. Thus the states which make up the final ensemble that remains after outcome j are {ρ̃_{j|i}}, and the probability of each state in the ensemble is P(i|j) = P(j|i)P(i)/P(j), with P(j|i) = Tr[A_j†A_j ρ_i]. The Holevo quantity for ensemble j is thus

  χ_j = S(ρ̃_j) − Σ_i P(i|j) S(ρ̃_{j|i}),   (3)

where ρ̃_j = A_j ρ A_j† / Tr[A_j†A_j ρ]. If at least one of the measurement operators A_j is higher than rank 1, then the measurement is incomplete. If the measurement is complete, then for each j all the final states ρ̃_{j|i} are identical, χ_j is zero, and the SWW bound reduces to the Holevo bound.

The most general kind of measurement can also be inefficient. A measurement is described as inefficient if the observer does not have full information regarding which of the outcomes actually occurred. The name inefficient comes from the fact that the need to consider such measurements first arose in the study of inefficient photo-detectors [8]. An inefficient measurement may be described by labeling the measurement operators with two indices, so that we have A_kj. The receiver has complete information about one of the indices, j, but no information about the other, k [9]. As a result the final state for each j (given the value of i) is now

  ρ′_{j|i} = Σ_k P(k|j) A_kj ρ_i A_kj† / Tr[A_kj†A_kj ρ_i].   (4)

Since inefficiency represents a loss of information, we wish to ask whether it is possible to take this into account and obtain a more stringent bound on the mutual information. If we merely apply the SWW bound to the measurement A_kj, then the bound involves the Holevo quantities of the ensembles that remain when both the values of k and j are known (the final ensembles that result from the efficient measurement). That is

  M(I:J) ≤ χ − Σ_kj P(k,j) χ_kj.   (5)

One therefore wishes to know whether it is possible to derive a bound which instead involves the Holevo quantities of the ensembles that remain after the inefficient measurement is made, that is, for the receiver who only has access to j.

In the first part of this paper we answer this question in the affirmative: for an inefficient measurement where the known outcomes are labeled by j, the bound given by Eq. (2) remains true, where now the χ_j are the Holevo quantities for the ensembles of states ρ′_{j|i} which result from the inefficient measurement.

In the second part of the paper, we consider the average reduction in the von Neumann entropy induced by a measurement:

  ⟨ΔS(ρ)⟩ ≡ S(ρ) − Σ_j P(j) S(ρ′_j).   (6)

Here ρ′_j is the state that results from outcome j, given that the initial state is ρ. Since the von Neumann entropy is a measure of how much we know about the state of the system, this is the difference between what we knew about the system state before we made the measurement, and what we know (on average) about the system state at the end of the measurement; it thus measures how much we learn about the final state of the system. Equivalently, it can be said to measure the degree of "state-reduction" which the measurement induces.

While it is the mutual information which is important for communication, the reduction in the von Neumann entropy is important for feedback control. Feedback control is the process of performing a sequence of measurements on a system, and applying unitary operations after each measurement in order to control the evolution of the system. Such a procedure is useful for controlling systems which are driven by noise. If the ability to perform unitary operations is unlimited, then the von Neumann entropy provides a measure of the level of control which can be achieved: if the system has maximal entropy then the unitary operations have no effect on the system state whatsoever; conversely, if the state is pure then the system can be controlled precisely, that is, any pure state can be prepared. Thus the entropy measures the extent to which a pure state, or pure evolution, can be obtained, and thus the level of predictability which can be achieved over the future behavior of the system [10]. The primary role of measurement in feedback control is therefore to reduce the entropy of the system. As such, the average reduction in von Neumann entropy provides a ranking of the effectiveness of different measurements for feedback control, other things being equal. Further details regarding quantum feedback control and von Neumann entropy can be found in reference [11].

The entropy reduction is also relevant to the transformation of pure-state entanglement, since the von Neumann entropy measures the entanglement of pure states. As a result this quantity gives the amount by which pure-state entanglement is broken by a local measurement.

We give two corollaries of the general information bound derived in the first part which involve ⟨ΔS(ρ)⟩. The first is a generalization of a bound derived by Hall [12, 13] to inefficient measurements. Hall's bound states that for efficient measurements the mutual information is bounded by ⟨ΔS(ρ)⟩. We show that for inefficient measurements this becomes

  M(I:J) ≤ ⟨ΔS(ρ)⟩ − Σ_i P(i) ⟨ΔS(ρ_i)⟩,   (7)

where ⟨ΔS(ρ_i)⟩ is the average entropy reduction which would have resulted if the initial state had been ρ_i, and as above ρ = Σ_i P(i) ρ_i.

The second is the fundamental property that, for all quantum operations, the average reduction in von Neumann entropy is concave in the initial state ρ. That is

  ⟨ΔS(ρ)⟩ ≥ Σ_i P(i) ⟨ΔS(ρ_i)⟩.   (8)

Finally, in the third part of this paper, we use the above result to show that for measurements which are uniform in their sensitivity across state-space (that is, measurements which are unitarily covariant), the amount which one learns about the final state always increases with the initial uncertainty, where this uncertainty is characterized by majorization. This is a quantum version of the much simpler classical result (which we also show) that the mutual information always increases with the initial uncertainty for classical measurements which are permutation symmetric. In addition we show that, for unitarily covariant measurements, the mutual information for pure-state ensembles also has this property. One can sum up these results by saying that the statement that information gain increases with initial uncertainty can fail to hold only if the measurement is asymmetric in its sensitivity.
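The inefficient-measurement state of Eq. (4) is straightforward to realize numerically. Below is a small sketch (illustrative, not from the paper): the operators A_kj model a projective qubit measurement read out through a noisy classical channel, with the detector reporting the true outcome with probability η (the construction described in footnote [9]); the state conditioned on the report j alone is the k-average appearing in Eq. (4).

```python
import numpy as np

# Hypothetical inefficient qubit measurement: the apparatus projects onto
# |0> or |1> (index k), but the observer only sees a noisy report j which
# equals k with probability eta.  Operators A_{kj} = sqrt(P(j|k)) Pi_k.
eta = 0.8
Pi = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
A = {(k, j): np.sqrt(eta if j == k else 1.0 - eta) * Pi[k]
     for k in range(2) for j in range(2)}

# Completeness: sum_{kj} A_{kj}^dag A_{kj} = 1.
comp = sum(a.conj().T @ a for a in A.values())
assert np.allclose(comp, np.eye(2))

rho_i = np.array([[0.7, 0.3], [0.3, 0.3]])   # an arbitrary initial state

def post_state(j, rho):
    """Final state given only the report j: the k-average of Eq. (4)."""
    num = sum(A[k, j] @ rho @ A[k, j].conj().T for k in range(2))
    pj = np.trace(num).real          # P(j|i) for this initial state
    return num / pj, pj

rho_0, p0 = post_state(0, rho_i)
rho_1, p1 = post_state(1, rho_i)
assert np.isclose(p0 + p1, 1.0)              # reports are exhaustive
assert np.isclose(np.trace(rho_0).real, 1.0)
```

Because both values of the hidden index k contribute for each report j, the state ρ′_{j|i} is generally mixed even when ρ_i is pure; this is the state-space signature of inefficiency.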
II. AN INFORMATION BOUND FOR GENERAL QUANTUM OPERATIONS

We now show that the bound proved by SWW can be generalized to obtain a more stringent bound for channels in which the receiver's measurement is inefficient. To show this it turns out that we can use the same method employed by SWW, but with the addition of an extra quantum system which allows us to include the inefficiency of the measurement.

Theorem 1. For a quantum channel in which the encoding ensemble is ε = {P(i), ρ_i}, and the measurement performed by the receiver is described by operators A_kj (Σ_kj A_kj†A_kj = 1), where the measurement is in general inefficient so that the receiver knows j but not k, the mutual information, M(I:J), is bounded such that

  M(I:J) ≤ χ − Σ_j P(j) χ_j,   (9)

where P(j) is the overall probability for outcome j, χ = S(ρ) − Σ_i P(i) S(ρ_i) is the Holevo quantity for the initial ensemble, and

  χ_j = S(σ_j) − Σ_i P(i|j) S(σ_{j|i}),   (10)

is the Holevo quantity for the ensemble, ε_j, that remains (from the point of view of the receiver) once the measurement has been made, so that the receiver has learned the outcome j, but not the value of k. Here the receiver's overall final state is

  σ_j = Σ_k A_kj ρ A_kj† / P(j) = Σ_ik P(i,k|j) σ_{kj|i},   (11)

where P(i,k|j) is the probability for both i and outcome k given j, and σ_{kj|i} is the final state that results given the initial state ρ_i and both outcomes j and k. The remaining ensemble is ε_j = {P(i|j), σ_{j|i}}, where

  σ_{j|i} = Σ_k P(k|j,i) σ_{kj|i} = Σ_k A_kj ρ_i A_kj† / P(j|i),   (12)

and where P(k|j,i) is the probability for outcome k given j and the initial state ρ_i.
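Before turning to the proof, the bound in Eq. (9) can be checked directly. The sketch below is an illustration under assumed parameters, not part of the paper: it builds a random two-state qubit encoding, applies a noisy-readout measurement A_kj of the kind discussed above, computes M(I:J) from the reported outcome j alone, and evaluates the χ_j of Eqs. (10)-(12).

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def H(p):
    p = np.asarray(p, float)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)

def rand_state(d=2):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    r = G @ G.conj().T
    return r / np.trace(r).real

# Encoding ensemble {P(i), rho_i}.
P = np.array([0.3, 0.7])
rhos = [rand_state(), rand_state()]
rho = sum(p * r for p, r in zip(P, rhos))
chi = S(rho) - sum(p * S(r) for p, r in zip(P, rhos))

# Inefficient measurement A_{kj}: j is reported, k stays hidden.
eta = 0.7
Pi = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
A = {(k, j): np.sqrt(eta if j == k else 1.0 - eta) * Pi[k]
     for k in range(2) for j in range(2)}

def sigma_and_p(rho_in, j):
    """Eq. (12): the state given j alone, and its probability."""
    num = sum(A[k, j] @ rho_in @ A[k, j].conj().T for k in range(2))
    pj = np.trace(num).real
    return num / pj, pj

Pj_i = np.array([[sigma_and_p(r, j)[1] for j in range(2)] for r in rhos])
Pj = P @ Pj_i

# Mutual information computed from the reported outcome only.
M = H(Pj) - sum(P[i] * H(Pj_i[i]) for i in range(2))

# chi_j for the remaining ensembles, Eqs. (10)-(11).
chi_j = []
for j in range(2):
    sig_j, _ = sigma_and_p(rho, j)
    P_i_j = [P[i] * Pj_i[i, j] / Pj[j] for i in range(2)]
    chi_j.append(S(sig_j) - sum(P_i_j[i] * S(sigma_and_p(rhos[i], j)[0])
                                for i in range(2)))

bound = chi - sum(Pj[j] * chi_j[j] for j in range(2))
assert M <= bound + 1e-9        # Theorem 1, Eq. (9)
assert bound <= chi + 1e-9      # tighter than the bare Holevo bound
```

Since each χ_j is itself a nonnegative Holevo quantity, the corrected bound never exceeds χ, which is the sense in which Theorem 1 sharpens Eq. (1).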
Proof. We begin by collecting various key facts. The first is that any efficient measurement on a system Q, described by N = N_1 N_2 operators A_kj (j = 1,...,N_1 and k = 1,...,N_2), can be obtained by bringing up an auxiliary system A of dimension N, performing a unitary operation involving Q and A, and then making a von Neumann measurement on A [14, 15]. If the initial state of Q is ρ^(Q), then the final joint state of A and Q after the von Neumann measurement is

  σ^(AQ) = |kj⟩⟨kj|^(A) ⊗ A_kj ρ^(Q) A_kj† / P(k,j),   (13)

where |kj⟩ is the state of A selected by the von Neumann measurement. The second fact is that the state which results from discarding all information about the measurement outcomes k and j can be obtained by performing a unitary operation between A and another system E which perfectly correlates the states |kj⟩ of A with orthogonal states of E, and then tracing out E. The final key fact we require is a result proven by SWW [6], which is that the Holevo χ quantity is non-increasing under partial trace. That is, if we have two quantum systems A and B, and an ensemble of states ρ_i^(AB) with associated probabilities P_i, then

  χ^(A) = S(ρ^(A)) − Σ_i P_i S(ρ_i^(A)) ≤ S(ρ^(AB)) − Σ_i P_i S(ρ_i^(AB)) = χ^(AB),   (14)

where ρ_i^(A) = Tr_B[ρ_i^(AB)]. To prove this result SWW use strong subadditivity [16].

We now encode information in system Q using the ensemble ε, and consider the joint system which consists of the three systems Q, A, E and a fourth system M, with dimension N_1. We now start with A, E and M in pure states, so that the Holevo quantity for the joint system is χ^(QAEM) = χ^(Q). We then perform the required unitary operation between Q and A, and a unitary operation between A and E which perfectly correlates the states |kj⟩^(A) of A with orthogonal states of E. Unitary operations do not change the Holevo quantity. Then we trace over E, so that we are left with the state

  |ψ⟩⟨ψ|^(M) ⊗ Σ_jk P(k,j) |k,j⟩⟨k,j|^(A) ⊗ A_kj ρ^(Q) A_kj† / P(k,j).   (15)

After the two unitaries and the partial trace over E, the Holevo quantity for the remaining systems, which we will denote by χ′^(QAM), satisfies χ′^(QAM) ≤ χ^(QAEM) = χ^(Q). We now perform one more unitary operation, this time between M and A, so that we correlate the states of M, which we denote by |j⟩⟨j|^(M), with the second index of the states of A, giving

  Σ_j |j⟩⟨j|^(M) ⊗ Σ_k P(k,j) |k,j⟩⟨k,j|^(A) ⊗ σ_kj^(Q),   (16)

where σ_kj^(Q) = A_kj ρ^(Q) A_kj† / P(k,j) is the final state resulting from knowing both outcomes k and j, with no knowledge of the initial choice of i. Finally we trace out A, leaving us with the state

  σ^(QM) = Σ_j |j⟩⟨j|^(M) ⊗ Σ_k P(k,j) σ_kj^(Q).   (17)

After this final unitary, and the partial trace over A, the Holevo quantity for the remaining systems Q and M, which we will denote by χ″^(QM), satisfies χ″^(QM) ≤ χ′^(QAM) ≤ χ^(Q). We have gone through the above process using the initial state ρ, but we could just as easily have started with any of the initial states, ρ_i, in the ensemble, and we will denote the final states which we obtain using the initial state ρ_i as σ_i^(QM). Calculating χ″^(QM) we have

  χ″^(QM) = S(σ^(QM)) − Σ_i P(i) S(σ_i^(QM))
          = H[J] − Σ_i P(i) H[J|i] + Σ_j P(j) [ S(σ_j) − Σ_i P(i|j) S(σ_{j|i}) ]   (18)
          = M(J:I) + Σ_j P(j) χ_j ≤ χ^(Q).   (19)

Rearranging this expression gives the desired result.
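Two consequences of this bound, the concavity of ⟨ΔS(ρ)⟩ stated in Eq. (22) and the fact, noted around Eq. (21), that ⟨ΔS(ρ)⟩ can be negative when the measurement is inefficient, can be demonstrated with the same noisy-readout measurement; the construction below is again an illustrative assumption, not the paper's example.

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

# Noisy-readout qubit measurement: j reported, k hidden.
eta = 0.7
Pi = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
A = {(k, j): np.sqrt(eta if j == k else 1.0 - eta) * Pi[k]
     for k in range(2) for j in range(2)}

def dS(rho):
    """Average entropy reduction <Delta S(rho)>, Eq. (6), over reports j."""
    out = S(rho)
    for j in range(2):
        num = sum(A[k, j] @ rho @ A[k, j].conj().T for k in range(2))
        pj = np.trace(num).real
        out -= pj * S(num / pj)
    return out

rng = np.random.default_rng(7)

def rand_real_state():
    G = rng.normal(size=(2, 2))
    r = G @ G.T
    return r / np.trace(r)

# Concavity, Eq. (22), on random mixtures of qubit states.
for _ in range(50):
    r1, r2 = rand_real_state(), rand_real_state()
    lam = rng.uniform()
    mix = lam * r1 + (1 - lam) * r2
    assert dS(mix) >= lam * dS(r1) + (1 - lam) * dS(r2) - 1e-9

# For an inefficient measurement <Delta S> can be negative: a pure state
# not diagonal in the measurement basis gains entropy from the noisy report.
plus = 0.5 * np.ones((2, 2))     # |+><+|
assert dS(plus) < 0.0
```

For this measurement dS on the |+⟩ state is minus the binary entropy of η, while the maximally mixed state still has its entropy reduced, so no single sign holds for all inputs once the readout is inefficient.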
III. PROPERTIES OF ENTROPY REDUCTION

We now rewrite the above information bound using the fact that P(i|j)P(j) = P(j|i)P(i). The result is

  M(I:J) ≤ ⟨ΔS(ρ)⟩ − Σ_i P(i) ⟨ΔS(ρ_i)⟩,   (20)

where ρ = Σ_i P_i ρ_i. Ozawa has shown that for efficient measurements ⟨ΔS(ρ)⟩ is always positive [17] (for more recent proofs of this result see [18, 19]). For efficient measurements Eq. (20) is therefore in general stronger than, and gives immediately, Hall's bound [12, 13], which states that the mutual information is bounded by the reduction in the von Neumann entropy. The inequality in Eq. (20) is then a generalization of Hall's bound to inefficient measurements. Since the mutual information is always positive, but for inefficient measurements the reduction in the von Neumann entropy can be negative (that is, the entropy of the quantum state can increase as a result of the measurement), the relation

  M(I:J) ≤ ⟨ΔS(ρ)⟩   (21)

is not necessarily satisfied for such measurements. However, Eq. (20) tells us that if the entropy of the initial state, ρ, does increase, the average increase in the entropy for each of the coding states ρ_i is always more than this by at least the mutual information.

The second result that we obtain from Eq. (20) is that, because the mutual information is nonnegative, we have

  ⟨ΔS(ρ)⟩ ≥ Σ_i P(i) ⟨ΔS(ρ_i)⟩.   (22)

That is, the reduction in the von Neumann entropy is concave in the initial state. This parallels the fact that the mutual information is also concave in the initial state. The fact that this is true for inefficient measurements means that once we have made an efficient measurement, no matter what information we throw away regarding the final outcomes (i.e. which outcomes we average over), ⟨ΔS(ρ)⟩ is always greater than the average of the entropy reductions which would have been obtained through measurement in each of the coding states, when we throw away the same information regarding the measurement results.

IV. INFORMATION GATHERING AND STATE-SPACE SYMMETRY

In this section we show that measurements whose ability to extract information is uniform over the available state-space (that is, does not vary from point to point in the state-space) always extract more information (strictly, never extract less information) the less that is known before the measurement is made. Thus, in this sense, one may regard "the more you know, the less you get" as a fundamental property of measurement. We will show that this is true both for the information obtained regarding the final state (being ⟨ΔS(ρ)⟩), and the mutual information for a measurement on an ensemble of pure states. We will consider here efficient measurements only; no doubt inefficient measurements will also have this property, but only if the information which is thrown away is also uniform with respect to the state-space, and we do not wish to burden the treatment with this additional complication.

To proceed we must make precise the notion that the sensitivity of a measurement is uniform over state-space. This is captured by stating that such a measurement should be invariant under reversible transformations of the state-space. For classical measurements (which are simply quantum measurements in which all operators and density matrices commute [20]) this means that the set of measurement operators is invariant under all permutations of the classical states: we will refer to these as completely symmetric measurements. Note that in this classical case, this is equivalent to saying that the measurement distinguishes all states from all other states equally well. The quantum generalization of this is invariance under all unitary transformations. Such measurements are referred to as being unitarily covariant [21, 22].

We must also quantify what we mean by the observer's lack of knowledge, or uncertainty, before the measurement is made. This is captured by the simple and elegant concept of majorization [23, 24]. If two sets of probabilities p ≡ {P_i} and q ≡ {Q_i} satisfy the set of relations

  Σ_{i=1}^k P_i ≥ Σ_{i=1}^k Q_i, ∀k,   (23)

where it is understood that the elements of both sets have been placed in decreasing order (e.g., P_i ≥ P_{i+1}, ∀i), then p is said to majorize q, and this is written q ≺ p. While at first Eq. (23) looks a little complicated, a few moments' consideration reveals that it captures precisely what one means by uncertainty: if p majorizes q, then p is more sharply peaked than q, and consequently describes a state of knowledge containing less uncertainty. What is more, majorization implies an ordering with the Shannon entropy H[·]. That is, if p majorizes q, then H[p] ≤ H[q] [23, 24].
In a sense, majorization is a more basic notion of uncertainty than entropy in that it captures that concept alone; the Shannon entropy, on the other hand, characterizes the more specific notion of information. To characterize the uncertainty of a density matrix, we can apply majorization to the vector consisting of its eigenvalues. If ρ and σ are density matrices, then we will write σ ≺ ρ if ρ's eigenvalues majorize σ's. Various applications have been found for majorization in quantum information theory [18, 19, 25, 26, 27, 28].

We thus desire to show that for measurements with the specified symmetry, ⟨ΔS(σ)⟩ ≥ ⟨ΔS(ρ)⟩ whenever σ ≺ ρ (and similarly for the mutual information). Functions with this property (of which the von Neumann entropy, S(ρ), is one example) are referred to as being Schur-concave. To show that a function is Schur-concave, it is sufficient to show that it is concave and symmetric in its arguments [23, 24], which in our case are the eigenvalues of the density matrix ρ (if our functions did not depend only on the eigenvalues of ρ, then they could not be Schur-concave, since the majorization condition only involves these eigenvalues).

The desired result for classical completely symmetric measurements is now immediate. In the classical case the mutual information is the unique measure of information gain, and M(I:J) = ⟨ΔS(ρ)⟩. The mutual information is concave in the initial classical probability vector P = (P_1,...,P_n) (being the vector of the eigenvalues of ρ in our quantum formalism), as is indeed implied by the concavity of ⟨ΔS(ρ)⟩. Since all operators commute with the density matrix, ⟨ΔS(ρ)⟩ is only a function of the {P_i}. From the form of ⟨ΔS(ρ)⟩ we see that a permutation of the elements of P is equivalent to a permutation applied to the measurement operators, and since these are invariant under such an operation, ⟨ΔS(ρ)⟩, and thus M(I:J), is a symmetric function of its arguments. Thus M(I:J) is Schur-concave.

The Schur-concavity of ⟨ΔS(ρ)⟩ for unitarily covariant (UC) quantum measurements is just as immediate. Because of the unitary covariance of the measurement, we see from the form of ⟨ΔS(ρ)⟩ that it is invariant under a unitary transformation of ρ. As a result, it only depends upon the eigenvalues of ρ. Since the permutations are a subgroup of the unitaries, it is also a symmetric function of its arguments (the eigenvalues), and thus Schur-concave.

We wish finally to show that the mutual information is also Schur-concave in ρ for unitarily covariant measurements on ensembles of pure states. This requires a little more work. First we need to show that once we have fixed a set of encoding states, the mutual information is concave in the vector of the ensemble probabilities P(i). This is straightforward if we first note that the mutual information, because it is, in fact, symmetric between i and j, can be written in the reverse form

  M(I:J) = H[P(j)] − Σ_i P(i) H[P(j|i)].   (24)

Since, for a fixed measurement, the mutual information is a function of the ensemble probabilities, we will write it as M({P(i)}). Denoting the pure states in the encoding ensemble as ρ_i = |ψ_i⟩⟨ψ_i|, and writing the ensemble state as ρ = Σ_k P_k σ_k, where the σ_k are built from the encoding states so that σ_k = Σ_i P_{i|k} |ψ_i⟩⟨ψ_i|, then

  M({P(i)}) = H[Σ_k P(k)P(j|k)] − Σ_i Σ_k P(i|k)P(k) H[P(j|i)]
            ≥ Σ_k P(k) H[P(j|k)] − Σ_k P(k) Σ_i P(i|k) H[P(j|i)]
            = Σ_k P(k) M({P(i|k)}),   (25)

this being the desired concavity relation. The inequality in the second line is merely a result of the concavity of the Shannon entropy. Note that while we have written the measurement's outcomes explicitly as being discrete in the above derivation, the result also follows if they are a continuum (as in the case of UC measurements) by replacing the relevant sums with integrals.

Now we need to note some further points about UC measurements. A UC measurement may be generated by taking all unitary transformations of any single operator A, and dividing them by a common normalization factor. The resulting measurement operators are thus A_U ∝ U A U†, where U ranges over all unitaries. The normalization for the A_U comes from ∫ U A†A U† dμ(U) = Tr[A†A] I, where dμ(U) is the (unitarily invariant) Haar measure [22, 29] over the unitaries.

It is not hard to show that all UC measurements can be obtained by mixing different UC measurements, each generated by a different operator. (Mixing a set of measurements means assigning to each a probability, and then making one measurement from the set at random based on these probabilities [30].)
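The Haar-average that fixes the normalization of the A_U can be checked by Monte Carlo sampling of Haar-random unitaries. The sketch below is an illustration only; here the Haar measure is normalized to unit total mass, so the sampled orbit average converges to (Tr[A†A]/d) I, proportional to the identity as unitary invariance requires (the overall constant depends on the normalization convention adopted for dμ).

```python
import numpy as np

rng = np.random.default_rng(42)

def haar_unitary(d):
    """Haar-random unitary via QR of a Ginibre matrix (standard trick)."""
    G = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    Q, R = np.linalg.qr(G)
    # Fix the phases of R's diagonal to make the distribution Haar.
    return Q * (np.diag(R) / np.abs(np.diag(R)))

d = 2
X = np.array([[1.0, 0.3], [0.3, 0.5]])   # plays the role of A^dag A
avg = np.zeros((d, d), complex)
N = 20000
for _ in range(N):
    U = haar_unitary(d)
    avg += U @ X @ U.conj().T
avg /= N

# Unitary invariance of the Haar measure forces the orbit average to be
# proportional to the identity, with coefficient Tr[X]/d for unit-mass mu.
target = (np.trace(X).real / d) * np.eye(d)
assert np.max(np.abs(avg - target)) < 0.02
```

The same sampling loop gives a practical way to approximate UC-measurement outcome distributions, replacing the integral over U by an empirical average.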
Next, we need to show that for all UC measurements the mutual information depends only on the eigenvalues of the ensemble density matrix, and we state this as the following lemma.

Lemma 1. The mutual information for a UC measurement on a pure-state ensemble, ε = {P(i), |ψ_i⟩}, depends on the ensemble only through the eigenvalues of the density matrix ρ = Σ_i P(i) |ψ_i⟩⟨ψ_i|.

Proof. We first show this for UC measurements generated from a single operator. Writing the mutual information in the reverse form one has

  M(I:J) = H[P(U)] − Σ_i P_i H[P(U|i)],   (26)

where U is the continuum index for the measurement operators (and thus the measurement outcomes), which are A_U = U A U† for some appropriately normalized A. Naturally all this means is that P(U|i) is a function of U, where U ranges over all unitaries. Since the measurement is unitarily covariant, H[P(U|i)] is the same for all initial states |ψ_i⟩, and therefore the second term is the same for all initial ensembles. Thus M depends only on the first term H[P(U)] = H[Tr[U A†A U† ρ]], which depends only on ρ, and is invariant under all unitary transformations of ρ. Thus M depends only on the eigenvalues of ρ. Since the mutual information for a mixture of measurements is merely a function of the respective mutual informations for each measurement (in particular it is a linear combination of them), the result holds for all UC measurements.

Since M depends only on ρ, in establishing the Schur-concavity of M with respect to ρ, we need only consider one ensemble for each ρ. We therefore choose the eigen-ensemble {λ_i, |φ_i⟩}, where λ_i and |φ_i⟩ are the eigenvalues and eigenvectors of ρ respectively. We know that the mutual information is concave in the vector of initial ensemble probabilities, and for the ensemble we have chosen, the initial probabilities are the eigenvalues of ρ. As a result the mutual information is concave in the eigenvalues of ρ. Since M is invariant under unitary transformations, and since unitary transformations include permutations as a subgroup, it is also a symmetric function of the eigenvalues. Thus M is Schur-concave.

V. CONCLUSION

In using a quantum channel, if there are limitations on the completeness (or alternatively the strength, in the terminology of [19]) or efficiency of the measurements that the receiver can perform, then it is possible to give a bound on the mutual information which is stronger than the Holevo bound. Further, this bound has a very simple form in terms of the Holevo χ quantity, and the χ quantities of the ensembles, one of which remains after the measurement is made.

This bound also allows us to obtain a relationship between the mutual information and the average von Neumann entropy reduction induced by a measurement, and encompasses the fact that this von Neumann entropy reduction is concave in the initial state.

From the concavity of the mutual information and the von Neumann entropy reduction, it follows that these quantities are Schur-concave (the former naturally for pure-state ensembles) for completely symmetric classical measurements, and for unitarily covariant quantum measurements. Thus the possibility that either of these kinds of information gain decreases with increasing initial uncertainty is associated with the asymmetry of the measurement in question.

Acknowledgments

The author would like to thank Gerard Jungman, Howard Barnum, Howard Wiseman, Terry Rudolph and Michael Hall for helpful discussions. The author is also grateful both to Vlatko Vedral for hospitality during a visit to Imperial College, and Lucien Hardy for hospitality during a visit to the Perimeter Institute, where some of the initial stages of this work were carried out. This work was supported by the Australian Research Council and the State of Queensland. Note added: After submitting this manuscript, which was first posted as eprint quant-ph/0412006, it was brought to my attention that the work presented here overlaps with concurrent work by Barchielli and Lupieri (quant-ph/0409019 and quant-ph/0412116).
and quant-ph/0412116). [1] J.P.Gordon in Quantum Electronics and Coherent Light, Transm. (USSR)9, 177 (1973)]. Proceedings of the International School of Physics ‘En- [4] H.P. Yuen and M. Ozawa, Phys. Rev. Lett. 70, 363 rico Fermi’ XXX1, edited by P.A. Miles (Accademic (1993). Press, NewYork, 1964). [5] C.A. Fuchs and C.M. Caves, Phys. Rev. Lett. 73, 3047 [2] L.B. Levitin, in Proceedings of the All-Union Confer- (1994). ence on Information Complexity and Control in Quan- [6] B. Schumacher, M. Westmoreland and W.K. Wootters, tum Physics, (Mockva-Tashkent, Tashkent, 1969), Sec. Phys. Rev.Lett. 76, 3452 (1996). II (in Russian); L.B. Levitin, in Information Complexity [7] While it is an abuse of notation to denote the ensem- andControlinQuantumPhysics,editedbyA.Blaquieve, ble probabilities by P(i), and the (in general unrelated) S.DinerandG.Lochak (Springer,NewYork,1987), pp. outcome probabilities by P(j), we use it systematically 15-47. throughout, since we feel it keeps the notation simpler, [3] A.S.Holevo,Probl.PeredachiInf.9,3(1973)[Probl.Inf. and thusultimately clearer. 7 [8] H.Carmichael, An Open Systems Approach to Quantum Vol. 190 (Springer-Verlag, Berlin, 1983). Optics (Springer-Verlag, Berlin, 1993); H.M. Wiseman [16] E.H. Lieb, Ad. Math. 11, 267 (1973); E.H. Lieb and and G.J Milburn, Phys.Rev.A 47, 642 (1993). M.B. Ruskai, Phys. Rev. Lett. 30, 434 (1973); J. Math. [9] If the observer has only partial information about the Phys. 14, 1938 (1973); In addition, a much simpler outcome of a measurement, then if we label the out- proof of strong subadditivity has been obtained by Petz comesbyn(withassociatedmeasurementoperatorsBn), [Rep. on Math. Phys. 23, 57 (1986)] and is described in the most general situation is one in which the observer M.A. Nielsen and D. Petz, Eprint: quant-ph/0408130. knows instead the value of a second variable m, where [17] M. Ozawa, J. Math. Phys. 27, 759 (1986). mis related tonbyan arbitrary conditional probability [18] M.A. 
Nielsen, Phys. Rev.A 63, 022114 (2001). P(m|n). This general case is encompassed by the two- [19] C.A. Fuchs and K. Jacobs, Phys. Rev. A 63, 062305 index formulation we use in the text. To see this, one (2001). sets k = n, j = m and chooses Anm(≡ Akj) = αnmBn. [20] A discussion of this point may be found in K. Jacobs, Then by giving the observer complete knowledge of j, Quant. Information Processing 1, 73 (2002). andnoknowledgeofk,wereproducepreciselythegeneral [21] H.Barnum,Information-disturbance tradeoffinquantum case described above by choosing αnm so that |αnm|2 = measurement on the uniform ensemble and on the mutu- P(m|n). ally unbiased bases, Eprint: quant-ph/0205155. [10] ThevonNeumannentropyisnottheonlyquantitywhich [22] G. Cassinelli, E. De Vito, A. Toigo, Positive operator can be used to measure the achieved level of control. valued measures covariant with respect to an irreducible The von Neumann entropy specifically gives the mini- representation, Eprint: quant-ph/0302187 mum possible entropy of the results of a measurement [23] A.W.Marshall andI.Olkin,Inequalities: Theory of Ma- onthesystem.Itthereforemeasuresthemaximuminfor- jorization and Its Applications, (Academic Press, New mation(strictly,theminimuminformationdeficit)which York, 1979). the user who is performing the control has about of the [24] R. Bhatia, Matrix Inequalities, (Springer, Berlin, 1997). future behavior of the system under measurement. An [25] M. A.Nielsen, Phys.Rev.Lett. 83, 436 (1999). example of another measure of control is the maximum [26] D.JonathanandM.B.Plenio,Phys.Rev.Lett.83,1455 eigenvalue of the density matrix. Under the assumption (1999); 83, 3566 (1999); G. Vidal, Phys. Rev. Lett. 83, thatallunitaryoperationsareavailabletothecontroller, 1046 (2000). this measures the probability that the controlled system [27] M. A. Nielsen and J. Kempe, Phys. Rev. Lett. 86, 5184 will befound in the desired state. (2001). [11] A. Doherty, K. Jacobs and G. Jungman, Phys. Rev. 
A [28] A. Chefles, Phys. Rev.A 65, 052314 (2002). 63, 062306 (2001). [29] K.R.W Jones, Phys.Rev.A 50, 3682 (1994). [12] M.J.W. Hall, Phys.Rev.A 55, 100 (1997). [30] G. M. D’Ariano, P. Lo Presti, P. Perinotti, ‘Clas- [13] K.Jacobs, Phys.Rev.A 68, 054302(BR) (2003). sical randomness in quantum measurements’, Eprint: [14] B. Schumacher,Phys. Rev.A 54, 2614 (1996). quant-ph/0408115. [15] K. Kraus, States, Effects and Operations: Fundamental Notions of Quantum Theory, Lecture Notes in Physic