When does a quantum mechanical system depend on the initial conditions of the system or the environment?

Adrian Hutter1,* and Stephanie Wehner1
1Centre for Quantum Technologies, National University of Singapore, 2 Science Drive 3, 117543 Singapore
(Dated: December 8, 2011)
*Electronic address: [email protected]

Deriving rigorous bounds for the time scales that are needed for thermalization forms one of the most vexing problems when it comes to understanding statistical mechanics from the principles of quantum mechanics. One central aspect in obtaining such bounds is to determine how long a system retains memory of its initial conditions. By viewing this problem from a quantum information theory perspective, we are able to simplify part of this task in a very natural and easy way. We first show that for any interaction between the system and the environment, and almost all initial states of the system, the question of how long such memory lasts can be answered by studying the temporal evolution of just one special initial state. This special state thereby depends only on our knowledge of macroscopic parameters of the system. We provide a simple entropic inequality for this state that can be used to determine whether most states of the system have, or have not, become independent of their initial conditions after time t. Analyzing the rate of entropy change over time for a particular kind of interaction then allows us to place rigorous bounds on such time scales. We make a similar statement for almost all initial states of the environment, and finally provide a sufficient condition under which a system never thermalizes, but remains close to its initial state for all times.

We are all familiar with thermalization on a macroscopic level - simply consider what happens when you leave your cup of coffee untouched for a while. Yet, understanding this process from a microscopic level forms a challenging endeavour. How could we hope to justify thermalization from the rules of quantum mechanics?

To tackle this problem it is helpful to break it up into smaller, more manageable, components. As pointed out in [1], the straightforward-looking process of thermalization actually consists of four aspects which may be addressed independently. Roughly speaking, they deal with several different questions that we might ask about a system S after it is placed into contact with an environment (bath) E. The first of these is whether the system equilibrates. That is, does it eventually evolve towards some particular equilibrium state and remain close to it? Note that when we only ask about equilibration, we do not care what form this equilibrium state actually takes. In particular, it may depend on the initial state of the system and/or the environment and does not need to be a thermal state. A second question is thus whether this equilibrium state is indeed independent of the initial state of the system. Note that one may also think of this question as asking whether the system retains at least some amount of memory of its precise initial conditions in equilibrium. Similarly, the third question asks whether the equilibrium state depends on the precise details of the initial state of the environment, or only on its macroscopic parameters such as temperature. Finally, if we find that the equilibrium state of the system is indeed independent of such initial states, we may then ask whether it actually takes on the familiar Boltzmann form.

However, there is of course one more pressing question when it comes to thermalization: just how long does it take for a system to thermalize? Indeed, the problem of deriving rigorous bounds for the time scales which are needed for thermalization has recently been called the most important open problem in the project of justifying statistical physics from first principles of quantum mechanics [2, 3]. Only rather weak bounds are known on such time scales in general [4], alongside some analytical [5, 6] and numerical [7–9] results for specific systems. Note that we may pose the question of time-scales for each of the aspects above individually. For example, we could ask not only if a system loses memory of its initial state, but just how fast this memory loss occurs. This is the approach we will take here, where we indeed focus on the system's memory of the initial conditions, which plays a crucial role in understanding thermalization [8].
To study the question of time-scales it is thereby not enough to study long time averages [1]; ideally we want to make statements about the actual state of the system at a particular fixed time t. Instead of asking questions about the equilibrium state, we thus ask

• Independence of the initial state of the system. At time t, does the state of the system depend on the precise initial state of the system, or only on its macroscopic parameters?

• Independence of the initial state of the environment. At time t, does the state of the system depend on the precise initial state of the environment, or only on its macroscopic parameters?

Since independence of such initial conditions is a necessary condition for the system to be in a thermal state, analyzing said time-scales places a lower bound on the time that it takes to thermalize.

Before stating our results, let us first describe our setup more carefully. Consider a system S and an environment E described by Hilbert spaces H_S and H_E respectively. Macroscopic constraints imposed on the system or the environment take the form of subspaces H_ΩS ⊆ H_S and H_ΩE ⊆ H_E respectively. Before placing them into contact, the system and the environment are uncorrelated. That is, the initial state of H_S ⊗ H_E at time t = 0 takes the form |φ⟩_S ⊗ |ψ⟩_E, where to explain our result we will for simplicity assume that |φ⟩_S ∈ H_ΩS and |ψ⟩_E ∈ H_ΩE are pure states [38]. Interaction between the system and the environment is governed by the Hamiltonian H_SE.

FIG. 1: If for the state τ_SE(t) the entropy of the environment exceeds the entropy of the system at time t, then the system has lost memory of its initial state and is close to τ_S(t), for almost all possible initial states |φ⟩_S. Note that we make no statement about equilibration, merely that almost all states "follow" τ_S(t), which may or may not equilibrate.

We will also need to quantify the amount of entropy in the system and its environment. The relevant quantities for a single experiment (a.k.a. single shot) are the min- and max-entropies, well established in quantum information theory. For a single system these can easily be expressed in terms of the eigenvalues {λ_j}_j of the state ρ_S = Σ_j λ_j |j⟩⟨j| as H_min(S)_ρ = −log max_j λ_j and H_max(S)_ρ = 2 log Σ_j √λ_j. Both quantities enjoy nice operational interpretations in quantum information [10] as well as thermodynamics [11, 12].
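The single-system min- and max-entropies are simple functions of the spectrum of ρ_S. The following minimal numerical sketch (our own illustration, not part of the paper; the random-state generator is just a convenient assumption for the demo) computes them with numpy and checks the ordering H_min ≤ H ≤ H_max ≤ log d against the von Neumann entropy.

    import numpy as np

    def random_density_matrix(d, rng):
        # A generic full-rank mixed state: G G† normalized, with G complex Gaussian.
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        rho = g @ g.conj().T
        return rho / np.trace(rho).real

    def min_entropy(rho):
        # H_min(S)_rho = -log2 of the largest eigenvalue of rho.
        return -np.log2(np.linalg.eigvalsh(rho).max())

    def max_entropy(rho):
        # H_max(S)_rho = 2 log2 sum_j sqrt(lambda_j).
        lam = np.clip(np.linalg.eigvalsh(rho), 0.0, None)
        return 2 * np.log2(np.sqrt(lam).sum())

    def von_neumann(rho):
        lam = np.linalg.eigvalsh(rho)
        lam = lam[lam > 1e-12]
        return float(-(lam * np.log2(lam)).sum())

    rng = np.random.default_rng(0)
    rho = random_density_matrix(4, rng)
    hmin, h, hmax = min_entropy(rho), von_neumann(rho), max_entropy(rho)
    print(f"H_min = {hmin:.3f} <= H = {h:.3f} <= H_max = {hmax:.3f} <= log2 d = {np.log2(4):.0f}")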
We will also refer to smoothed versions of these quantities, H^ε_min and H^ε_max, which can be thought of as equal to the original quantity, except up to an error ε. We refer to the appendix for a detailed introduction. Both quantities converge to the von Neumann entropy H(S)_ρ = −Σ_j λ_j log λ_j in the asymptotic limit [13] of many experiments.

Finally, when we say that two quantum states ρ and σ are close, we mean that their trace distance ‖ρ − σ‖_1, with ‖A‖_1 = tr √(A†A), is very small. As the trace distance quantifies how well we can distinguish ρ from σ when given with equal probability [14], this says that there exists no physical process that can easily tell them apart.

I. RESULTS

We are now ready to state our results. We emphasize that whereas our explanations here are rather informal for the purpose of illustration, our results are fully rigorous; precise statements as well as technical details can be found in the appendix.

Independence of the initial state of the system. We first consider the role of the initial state of the system. Let us thus fix the state of the environment |ψ⟩_E. Before embarking on the study of time-scales, we first show our main result, namely that we can considerably simplify the problem for any interaction H_SE. In particular, we show that whether the system has become independent of its initial state |φ⟩_S ∈ H_ΩS after time t can, for almost all initial states, be decided by analyzing the temporal evolution U(t) = exp(−i H_SE t) of just one special state given by

τ_SE(t) = U(t) (π_ΩS ⊗ |ψ⟩⟨ψ|_E) U(t)† ,   (1)

where π_ΩS denotes the maximally mixed state on H_ΩS. More precisely, we prove that if for a particular time t we have that

H^ε_min(E)_τ ≳ H^ε_max(S)_τ ,   (2)

then the system will be independent of its initial state, for all but exponentially few initial states |φ⟩_S. "All but exponentially few" thereby means that the volume of states on H_ΩS that do not have this property is negligible, as we will outline in more detail in the methods section below. In this case, the state of the system will be close to τ_S(t) in trace distance, as illustrated in Figure 1. Note that this state only depends on the macroscopic constraints of the system, and not on any of the individual initial states that we might place the system in. However, if

H^ε_max(E)_τ ≲ H^ε_min(S)_τ ,   (3)

then for a substantial fraction of initial states |φ⟩_S the system does (still) depend on it. Indeed, this is the case at time t = 0, and it does of course depend on the details of H_SE whether (2) can ever be satisfied at a later point in time. Our condition is tight up to differences in min- and max-entropies, which vanish in the asymptotic limit where both quantities converge to the von Neumann entropy. This limit is relevant e.g. when considering quantum memories. Our result easily extends to the case where the initial state of the system is correlated with a reference system (i.e. it is not pure). We emphasize that our result is entirely different from [15], which makes statements about "almost all" states on S and E.

Note that for (2) to hold, we need the environment to be of sufficient size to "absorb" the entropy of the system.
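The criterion (2)/(3) only requires following the single state τ_SE(t). As a rough numerical illustration (a toy sketch of our own, not a model from the paper), the code below builds τ_SE(t) of Eq. (1) for a hypothetical random Hamiltonian coupling a qubit system to a two-qubit environment, taking H_ΩS = H_S, and tracks the von Neumann entropies of the S and E marginals as asymptotic stand-ins for the smooth entropies. At such tiny dimensions the rigorous one-shot criterion is not meaningful; the point is only how the relevant quantities are obtained.

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(1)
    dS, dE = 2, 4

    def von_neumann(rho):
        lam = np.linalg.eigvalsh(rho)
        lam = lam[lam > 1e-12]
        return float(-(lam * np.log2(lam)).sum())

    def marginals(rho_SE):
        r = rho_SE.reshape(dS, dE, dS, dE)
        return np.einsum('ijkj->ik', r), np.einsum('jijk->ik', r)   # (rho_S, rho_E)

    # Hypothetical interaction: a random Hermitian H_SE (purely illustrative).
    G = rng.normal(size=(dS * dE,) * 2) + 1j * rng.normal(size=(dS * dE,) * 2)
    H_SE = (G + G.conj().T) / 2

    psi_E = rng.normal(size=dE) + 1j * rng.normal(size=dE)
    psi_E /= np.linalg.norm(psi_E)
    tau0 = np.kron(np.eye(dS) / dS, np.outer(psi_E, psi_E.conj()))   # pi_ΩS ⊗ |psi><psi|_E

    for t in (0.0, 0.5, 1.0, 2.0):
        U = expm(-1j * H_SE * t)
        tau = U @ tau0 @ U.conj().T                                  # Eq. (1)
        rho_S, rho_E = marginals(tau)
        # Von Neumann proxy for (2)/(3): H(E) exceeding H(S) suggests loss of memory.
        print(f"t={t:3.1f}   H(S)={von_neumann(rho_S):.3f}   H(E)={von_neumann(rho_E):.3f}")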
It is easy to make this idea precise and show that for any |ψ⟩_E and any interaction Hamiltonian H_SE, if the system is sufficiently large (i.e., log d_ΩS ≳ 2 log d_E) then the state of the system always depends on its initial state, for almost all initial states |φ⟩_S.

We note that the criterion presented above is tight up to differences between smooth min- and max-entropies and becomes literally tight in an i.i.d. scenario where each of a large number n of systems S undergoes the same interaction with its local environment. In this limit, a tight criterion in terms of the von Neumann entropy can be given, as discussed in detail in the appendix. Physically, we interpret this limit as a quantum memory H_S^⊗n suffering the influence of noise.

Independence of the initial state of the environment. We proceed to show a similar statement about the role of the initial state of the environment. Let us thus now fix the state of the system |φ⟩_S. We show that whether at time t the system depends on the initial state of the environment can, for almost all initial states |ψ⟩_E ∈ H_ΩE, be decided by considering the state

τ_SE(t) = U(t) (|φ⟩⟨φ|_S ⊗ π_ΩE) U(t)† ,   (4)

where π_ΩE denotes the maximally mixed state on H_ΩE. In particular, we again obtain an entropic condition. If for a particular time t

H^ε_min(E)_τ ≳ H^ε_max(S)_τ ,   (5)

then the system is independent of the initial state of the environment, for all but exponentially few initial states |ψ⟩_E. In particular, the state of the system at such times is very close to τ_S(t), which only depends on macroscopic constraints of the environment. If on the other hand

H^ε_max(E)_τ ≲ H^ε_min(S)_τ ,   (6)

then a substantial fraction of initial states of the environment lead to different states of the system at time t. Our condition is again tight up to differences in min- and max-entropies of the state τ_SE(t), which vanish in the asymptotic limit.

As before, our result easily leads to a statement we are already familiar with. Namely, that if the environment is very large compared to the system (i.e. log d_ΩE ≳ 2 log d_S) then the state of the system will be the same for all but exponentially few initial states of the environment restricted to H_ΩE. Note that the condition "all but exponentially few" is not a mathematical artifact of our proof. For some examples, one can find very specific initial states of the environment that will lead to observable effects on the system even if the environment is large. A similar statement was shown before for almost all times [1]. Since our result holds for all times, it does in particular imply said result.

Time-scales. In physical situations we usually expect the environment to be dimension-wise much larger than the system. Following the above discussion, the criterion of independence of the initial state of the environment is therefore of no use to investigate the time-scales on which thermalization happens; hence we focus on independence of the system's own initial state. We can guarantee that the system still "remembers" its initial state as long as (3), with τ_SE(t) as defined in (1), is fulfilled. It is thus interesting to study how fast H^ε_min(S)_τ decreases from its initial value log d_ΩS and how fast H^ε_max(E)_τ increases from zero. The answer to this question of course depends on the specific model under consideration. In a physical model with only local interactions we expect, for example, that the entropies in S and E can be changed the faster, the larger the interaction surface between S and E is. While the rate of change of the von Neumann entropy has been studied intensively [16–18], no such results exist for the one-shot entropies H^ε_min and H^ε_max. As an illustration of our method, we derive a simple bound for all H_SE without taking locality constraints into account. Namely, we show that changes of H^ε_min(S)_τ and H^ε_max(E)_τ need a lot of time if either the initial state

τ_SE(0) = π_ΩS ⊗ |ψ⟩⟨ψ|_E   (7)

is close to commuting with the Hamiltonian H_SE, or if the interactive part of the Hamiltonian [39]

H_int := H_SE − H_S ⊗ I_E − I_S ⊗ H_E   (8)

is weak, as measured in the usual operator norm. Applying these findings to criterion (3), we find that S retains at least some memory about its initial state for times of order

O( max{ 1/(4‖H_int‖_∞), 1/‖[π_ΩS ⊗ |ψ⟩⟨ψ|_E, H_SE]‖_1 } ) .   (9)

A toy model shows that this simple bound is indeed tight up to some constant factor.
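The two quantities entering the bound (9) are straightforward to evaluate for a concrete model. Below is a minimal sketch (our own toy example with a hypothetical weak XX coupling of strength g, not a model from the paper) that computes ‖H_int‖_∞ and ‖[τ_SE(0), H_SE]‖_1 for two qubits and prints the resulting order-of-magnitude memory time.

    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.diag([1.0, -1.0])

    # Hypothetical toy model: local fields plus a weak XX coupling of strength g.
    g = 0.05
    H_int = g * np.kron(X, X)                                   # Eq. (8)
    H_SE = np.kron(Z, I2) + 0.7 * np.kron(I2, Z) + H_int

    # tau_SE(0) = pi_ΩS ⊗ |psi><psi|_E, taking H_ΩS = H_S and |psi>_E = |0>.
    psi_E = np.array([1.0, 0.0])
    tau0 = np.kron(I2 / 2, np.outer(psi_E, psi_E))              # Eq. (7)

    op_norm = np.linalg.norm(H_int, 2)                          # operator norm ||H_int||_inf
    comm = tau0 @ H_SE - H_SE @ tau0
    trace_norm = np.linalg.svd(comm, compute_uv=False).sum()    # ||[tau_SE(0), H_SE]||_1

    t_memory = max(1.0 / (4 * op_norm), 1.0 / trace_norm)       # order of magnitude, Eq. (9)
    print(f"||H_int||_inf = {op_norm:.3f}   ||[tau(0), H_SE]||_1 = {trace_norm:.3f}")
    print(f"memory retained at least for times of order ~ {t_memory:.1f}")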
Still, we expect to find better bounds in cases where we do impose locality assumptions on the Hamiltonian H_SE. In this case, results of the Lieb-Robinson type like [19] may be applied to bound the rates with which the min- and max-entropies can be changed. Our criterion hence opens the door to improved lower bounds on the thermalization times.

Recent work tackled the problem of thermalization time-scales from another angle: in [20] sufficient time-scales for equilibration were derived. In contrast, note that we are interested in necessary time-scales for thermalization.

Absence of thermalization. Finally, we consider the question whether it is at all possible for the system to forget about its initial conditions, even if the environment is large. In [1] it is shown that the temporal average of S will be independent of its initial state if the relevant energy eigenstates of H_SE are sufficiently entangled. Here, we prove a converse result. That is, we provide sufficient conditions under which a system never becomes independent of its initial state, for all but exponentially few initial states of the environment. Our result extends a recent result of [21], which we compare to ours in the appendix. The most important advantage of our result is that we can make statements about the time-evolved state of S (as opposed to statements about temporal averages) and do not require S to be small. Roughly, we show that if those eigenstates of the Hamiltonian which on S have most overlap with the initial state are not sufficiently entangled, the state of the system will remain close to its initial state for all times, for all but exponentially few initial states |ψ⟩_E of the environment.

Let us now explain more precisely what we mean by these conditions. Note that the energy eigenstates {|E_k⟩_SE}_{k=1,...,d_S d_E} form a basis of the product space H_S ⊗ H_E. Assume that we want to approximate this basis by a product basis {|i⟩_S ⊗ |j⟩_E}_{i=1,...,d_S, j=1,...,d_E}. That is, to each energy eigenstate |E_k⟩_SE we assign the element of the product basis |i⟩_S ⊗ |j⟩_E which best approximates it, and assume that this correspondence is one-to-one. Let I(i) denote the set of energy eigenstates which are assigned to a state of the form |i⟩_S ⊗ |j⟩_E with a fixed i and an arbitrary j. We introduce the quantity δ(i) to quantify how well the energy eigenstates in I(i) are approximated by an element of the product basis,

δ(i) := min_{|E_k⟩ ∈ I(i)} max_{j=1,...,d_E} |⟨E_k| (|i⟩_S |j⟩_E)| .   (10)

Let ρ_S(t) denote the state of the system at time t and assume that its initial state was ρ_S(0) = |i⟩⟨i|_S. Then at any time t the probability that ρ_S(t) is further away from its initial state than 4δ(i)√(1−δ(i)²) (in trace distance ‖·‖_1) is exponentially small. This radius is small if δ(i) is close to 1, that is, if the energy eigenstates which on S are most similar to |i⟩⟨i|_S are sufficiently close to product. The probability is computed over the choice of the initial state of the environment |ψ⟩_E.
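To make the quantity δ(i) of Eq. (10) concrete, the sketch below (our own toy example; the near-product Hamiltonian, the coupling strength, and the use of an optimal-assignment routine to realize the one-to-one correspondence are all our choices) diagonalizes a two-qubit H_SE, assigns each energy eigenstate to the computational product-basis element that best approximates it, and evaluates δ(i) together with the trace-distance radius 4δ(i)√(1−δ(i)²).

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    dS, dE = 2, 2
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.diag([1.0, -1.0])

    # Hypothetical near-product Hamiltonian: strong local fields, weak coupling eps.
    eps = 0.1
    H_SE = np.kron(Z, np.eye(dE)) + 0.5 * np.kron(np.eye(dS), Z) + eps * np.kron(X, X)

    _, eigvecs = np.linalg.eigh(H_SE)          # columns are the energy eigenstates |E_k>
    # Product basis |i>_S ⊗ |j>_E is the computational basis; column index m = i*dE + j.
    overlap = np.abs(eigvecs.T)                # overlap[k, m] = |<E_k| i, j>|

    # One-to-one assignment of eigenstates to product-basis elements (maximal total overlap).
    rows, cols = linear_sum_assignment(-overlap)
    assigned_i = cols // dE                    # which i each |E_k> got assigned to

    for i in range(dS):
        members = rows[assigned_i == i]                                       # the set I(i)
        delta = min(overlap[k, i * dE:(i + 1) * dE].max() for k in members)   # Eq. (10)
        radius = 4 * delta * np.sqrt(max(0.0, 1 - delta**2))
        print(f"i={i}:  delta(i) = {delta:.4f}   radius 4*delta*sqrt(1-delta^2) = {radius:.3f}")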
II. METHODS

Let us now explain the main conceptual idea in proving our results on initial state independence: our results on time scales and absence of thermalization then follow using involved, but relatively standard, technical methods as outlined in the appendix. For simplicity, let us thereby first consider the question whether the system depends on its initial state |φ⟩_S ∈ H_ΩS at time t. The key idea of our proof is to take an information theoretic standpoint and look at this problem from the perspective of an outside reference system R who prepares the system S in its initial state. That is, R and S are initially correlated which, in the simple case that S is pure, can be understood as R having prepared S in a definite state |φ⟩_S. Instead of asking whether the system still depends on its initial state at time t, we can now equivalently ask whether R still "knows" about the system state at time t, or whether R has become decoupled from S.

A central theorem in quantum information theory known as the decoupling theorem [22, 23] quantifies whether a particular process, i.e. a quantum channel T_{A→B} acting on the system A, can have this decoupling effect. Originally devised to demonstrate the existence of certain encoding schemes to preserve quantum information [24], we will employ it here in a different context. Using techniques from measure concentration and the decoupling theorem of [23] we obtain that for any ε ≥ 0, δ > 0

Pr_{|φ⟩_A} { ‖T_{A→B}(|φ⟩⟨φ|_A) − T_{A→B}(π_A)‖_1 ≥ 2^{−ζ/2} + 12ε + δ } ≤ 2 exp(−d_A δ²/16) ,   (11)

where the probability is taken over the Haar measure, π_A = I_A/d_A, and

ζ = H^ε_min(A'|B)_τ   (12)
  ≥ H^{ε/2}_min(A'B)_τ − H^{ε/2}_max(B)_τ − O(log(1/ε)) .   (13)

The state τ_{A'B} appearing in the last expression is the Choi-Jamiołkowski representation of the channel T_{A→B},

τ_{A'B} = (I_{A'} ⊗ T_{A→B}) |Ψ⟩⟨Ψ|_{A'A} ,   (14)

where |Ψ⟩_{A'A} is the maximally entangled state across A' and A, and I_{A'} denotes the identity channel. Clearly, to obtain a strong bound, we need ζ to be big for a small ε. Intuitively, ζ measures "how good the channel T_{A→B} is at decoupling", i.e. at destroying potential correlations a (non-pure) input state on A might have with some outside reference R. In particular, the bound on ζ tells us that for H^{ε/2}_min(A'B)_τ ≳ H^{ε/2}_max(B)_τ almost all states |φ⟩_A have become independent, i.e. their channel outputs are close to the fixed state T_{A→B}(π_A). A more general statement involving mixed input states can be found in the appendix.

In the appendix we also derive a statement converse to the above, which we briefly sketch here: we show that if H^ε_max(A'B)_τ ≲ H^ε_min(B)_τ, then there is no state ω_B such that most input states |φ⟩⟨φ|_A yield a channel output T_{A→B}(|φ⟩⟨φ|_A) which is close to it.

To see how we can apply this result to our situation, note that for the product initial state |φ⟩_S ⊗ |ψ⟩_E with |φ⟩_S ∈ H_ΩS ⊆ H_S and |ψ⟩_E ∈ H_ΩE ⊆ H_E we find the state of the system after time t to be

ρ_S(t) = tr_E[ U(t) (|φ⟩⟨φ|_S ⊗ |ψ⟩⟨ψ|_E) U(t)† ] .   (15)
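The decoupling statement (11) is easy to probe numerically. The sketch below (our own toy check, not the construction used in the proof) takes a hypothetical channel of exactly the form (15), T(ρ_A) = tr_E[U(ρ_A ⊗ |0⟩⟨0|_E)U†], with a single Haar-random unitary U and a comparatively large environment, samples Haar-random pure inputs |φ⟩_A, and reports the trace distance ‖T(|φ⟩⟨φ|) − T(π_A)‖_1 appearing on the left-hand side of (11).

    import numpy as np

    rng = np.random.default_rng(3)
    dA, dE = 4, 16

    def haar_unitary(d, rng):
        # QR decomposition of a complex Gaussian matrix, with the standard phase fix.
        z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
        q, r = np.linalg.qr(z)
        return q * (np.diag(r) / np.abs(np.diag(r)))

    def haar_state(d, rng):
        v = rng.normal(size=d) + 1j * rng.normal(size=d)
        return v / np.linalg.norm(v)

    def trace_norm(M):
        return np.linalg.svd(M, compute_uv=False).sum()

    U = haar_unitary(dA * dE, rng)
    e0 = np.zeros(dE); e0[0] = 1.0

    def channel(rho_A):
        # Toy T_{A->B}(rho_A) = tr_E[ U (rho_A ⊗ |0><0|_E) U† ]  with B = A.
        joint = U @ np.kron(rho_A, np.outer(e0, e0)) @ U.conj().T
        return np.einsum('ijkj->ik', joint.reshape(dA, dE, dA, dE))

    out_mixed = channel(np.eye(dA) / dA)        # T(pi_A)
    dists = [trace_norm(channel(np.outer(phi, phi.conj())) - out_mixed)
             for phi in (haar_state(dA, rng) for _ in range(200))]
    print(f"mean distance = {np.mean(dists):.3f}   max distance = {np.max(dists):.3f}")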
We can look at this either as a quantum channel T_{S→S} (taking |φ⟩⟨φ|_S as an input) or as a quantum channel T_{E→S} (taking |ψ⟩⟨ψ|_E as an input). The first channel captures the influence that the initial state of the system has on the system state at time t. The latter captures the influence of the initial state of the environment. Applying the above results to these channels and using some basic properties of the smooth entropies then yields our results about independence of the initial state in a straightforward manner. For example, for T_{S→S}, we apply our theorems with A = S and B = S. Note that a purification of τ_{S'S} is simply obtained by omitting the trace over E in the channel, i.e.,

τ_{S'SE}(t) = U_SE(t) (|Ψ⟩⟨Ψ|_{S'S} ⊗ |ψ⟩⟨ψ|_E) U_SE(t)† .   (16)

Tracing over S' then gives the special state τ_SE(t) from (1). Since τ_{S'SE} is pure, we have H_min(S'S) = H_min(E), yielding the claimed entropy conditions.
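A quick numerical sanity check of the last step (our own sketch, with an arbitrary random H_SE standing in for a real interaction): build the pure state τ_{S'SE}(t) of Eq. (16), trace out either E or S'S, and confirm that the two marginals give the same min-entropy.

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(4)
    dSp = dS = 2
    dE = 3

    # |Psi>_{S'S} maximally entangled, |psi>_E an arbitrary pure environment state.
    Psi = np.eye(dS).reshape(-1) / np.sqrt(dS)
    psi_E = rng.normal(size=dE) + 1j * rng.normal(size=dE)
    psi_E /= np.linalg.norm(psi_E)

    # Hypothetical random Hermitian H_SE; the joint unitary acts trivially on S'.
    G = rng.normal(size=(dS * dE,) * 2) + 1j * rng.normal(size=(dS * dE,) * 2)
    U = np.kron(np.eye(dSp), expm(-1j * (G + G.conj().T) / 2 * 1.3))

    state = U @ np.kron(Psi, psi_E)                       # |tau_{S'SE}(t)>, still pure
    rho = np.outer(state, state.conj()).reshape(dSp, dS, dE, dSp, dS, dE)
    rho_SpS = np.einsum('abcdec->abde', rho).reshape(dSp * dS, dSp * dS)   # trace out E
    rho_E = np.einsum('abcabf->cf', rho)                                   # trace out S'S

    h_min = lambda r: -np.log2(np.linalg.eigvalsh(r).max())
    print(f"H_min(S'S) = {h_min(rho_SpS):.6f}   H_min(E) = {h_min(rho_E):.6f}")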
III. DISCUSSION

We have shown that the problem of finding lower bounds on thermalization times can be simplified considerably: for almost all states it suffices to understand the temporal evolution of the special state τ_SE, or rather changes in entropy for this state. Whereas our general example bounds for entropy changes are rather weak, our result opens the door for more sophisticated methods to be applied, taking into account the locality of the Hamiltonian. The emergence of a special state τ_SE is indeed somewhat analogous to the setting of channel coding, where the maximally entangled state plays an important role in quantifying a channel's capacity to carry quantum information. Note, however, that we do not ask about how much quantum information could be conveyed by using any form of coding scheme. Furthermore, merely asking whether the state of the system depends on its initial state after some time, or, in more information theoretic terms, asking whether the output state of the channel depends on its input state, does (unlike in the classical world) not immediately answer the question whether this channel is useful for transmitting quantum information [40].

Furthermore, note that all our statements hold "for almost all initial states from the Haar measure", i.e., we make statements about the volume of states. Of course, from a given starting state it is in general not the case that all such states could be reached in a physical system, and hence one might question the relevance of our results. Note, however, that our approach applies to any set of unitaries which have such a decoupling effect. In [25] it is shown that random two-qubit interactions efficiently approximate the first and second moments of the Haar distribution, thereby constituting approximate 2-designs. This is all one needs for decoupling [26, 27]. It is an interesting open question what other sets of unitaries have this property.

Acknowledgments

We gratefully acknowledge Andrew Doherty, Jens Eisert, Christian Gogolin and Tony Short for interesting discussions and comments. This research was supported by the National Research Foundation and Ministry of Education, Singapore.

In this appendix, we provide more detailed explanations and technical details of our claims. We would like to emphasize that from a quantum information theory perspective our proof is appealing in its simplicity - contrary to what the length of this appendix may suggest. However, for convenience of the reader we provide background material.

The central idea behind our approach is to think of the time-evolution as a quantum channel. When considering what happens to the initial state of the system, we can think of the channel T_{S→S}(ρ_S) = tr_E[U(t)(ρ_S ⊗ |ψ⟩⟨ψ|_E)U(t)†], where |ψ⟩_E is the initial state of the environment. Similarly, when considering the initial state of the environment, we will think of the channel T_{E→S}(ρ_E) = tr_E[U(t)(|φ⟩⟨φ|_S ⊗ ρ_E)U(t)†] for fixed |φ⟩_S.

We first provide the necessary background material about smooth min- and max-entropies (Section A). These entropies allow us to state a criterion for which a channel maps almost all its inputs to the same output (Section B). We proceed to apply this criterion to the question of independence of the initial state of the environment (Section C), and independence of the initial state of the system (Section D). Finally, we provide sufficient conditions under which the system stays close to its initial state for all times (Section E).

Contents

I. Results
II. Methods
III. Discussion
Acknowledgments
A. Entropy measures and the time needed to change them
  1. Notation
  2. Min- and max-entropy
  3. Distance measures
  4. Smooth entropy measures
  5. Chain rules
  6. The times which are necessary for entropy changes
B. Our criterion
  1. Quantum channels
  2. Informal version
  3. Main tools
  4. Discussion
  5. In the i.i.d. scenario
C. Independence of the initial state of the environment
  1. Essentially tight version
  2. Simple version for large environments
D. Independence of the initial state of the system
  1. Main result
  2. Time-scales
    a. Achievability
  3. I.i.d. interactions
E. Absence of thermalization
  1. Previous results
  2. Preliminaries
  3. Theorem and proof
F. Technical lemmas
References

Appendix A: Entropy measures and the time needed to change them

1. Notation

The dimension of the Hilbert space H_A is denoted by d_A. We write π_A := I_A/d_A to denote the fully mixed state on system A and |Ψ⟩_{AA'} := (1/√d_A) Σ_{i=1}^{d_A} |i⟩_A ⊗ |i⟩_{A'} to denote the maximally entangled state between H_A and H_{A'}, a copy of H_A. We introduce the set of normalized density operators

S_=(H_A) := {ρ_A ∈ Herm(H_A) : ρ_A ≥ 0, tr ρ_A = 1}   (A1)

as well as the set of subnormalized density operators

S_≤(H_A) := {ρ_A ∈ Herm(H_A) : ρ_A ≥ 0, tr ρ_A ≤ 1} .   (A2)

For a Hermitian operator O we write λ_max(O) to denote its largest eigenvalue. For an arbitrary linear operator M, let ‖M‖_∞ := √(λ_max(M†M)) and ‖M‖_1 := tr √(M†M). With log we denote the binary logarithm.

2. Min- and max-entropy

For ρ_AB ∈ S_≤(H_AB) we introduce the min-entropy of A conditioned on B as

H_min(A|B)_ρ := sup_{σ_B ∈ S_=(H_B)} sup{ λ ∈ ℝ : 2^{−λ} I_A ⊗ σ_B ≥ ρ_AB }   (A3)

and the max-entropy of A conditioned on B as

H_max(A|B)_ρ := sup_{σ_B ∈ S_=(H_B)} log [F(ρ_AB, I_A ⊗ σ_B)]² .   (A4)
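Definition (A3) has a convenient reading for a fixed, full-rank σ_B: the inner supremum equals −log λ_max((I_A ⊗ σ_B)^{−1/2} ρ_AB (I_A ⊗ σ_B)^{−1/2}), and H_min(A|B) is then the optimum of this quantity over normalized σ_B. The sketch below (ours; the crude search over a handful of random candidates only yields a lower bound on the true supremum) evaluates this for the maximally entangled state, where the choice σ_B = π_B is known to be optimal and gives H_min(A|B) = −log d_A.

    import numpy as np

    rng = np.random.default_rng(7)
    dA = dB = 2

    def hmin_given_sigma(rho_AB, sigma_B):
        # Inner supremum of (A3) for fixed full-rank sigma_B:
        #   -log2 lambda_max( (I ⊗ sigma)^{-1/2} rho (I ⊗ sigma)^{-1/2} ).
        lam, V = np.linalg.eigh(sigma_B)
        inv_sqrt = (V / np.sqrt(lam)) @ V.conj().T
        K = np.kron(np.eye(dA), inv_sqrt)
        return -np.log2(np.linalg.eigvalsh(K @ rho_AB @ K.conj().T).max())

    # Maximally entangled state on AB.
    Psi = np.eye(dA).reshape(-1) / np.sqrt(dA)
    rho = np.outer(Psi, Psi)

    def rand_state(d):
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        s = g @ g.conj().T
        return s / np.trace(s).real

    # Candidates: the maximally mixed state (optimal here) plus random full-rank states.
    candidates = [np.eye(dB) / dB] + [rand_state(dB) for _ in range(200)]
    best = max(hmin_given_sigma(rho, s) for s in candidates)
    print(f"best candidate value = {best:.3f}   (exact H_min(A|B) = -log2 d_A = -1)")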
For a trivial system B, the entropies (A3) and (A4) simplify to H_min(A)_ρ = −log λ_max(ρ) and H_max(A)_ρ = 2 log tr √ρ. For ρ_AB ∈ S_=(H_AB), let H(A|B)_ρ denote the well-known von Neumann entropy of A conditioned on B. Then we have from [13, Lemma 2 and Lemma 20] that

−log d_min ≤ H_min(A|B)_ρ ≤ H(A|B)_ρ ≤ H_max(A|B)_ρ ≤ log d_A   (A5)

where d_min := min{d_A, d_B}.

An example of the operational significance of H_min(A|B)_ρ is that its negative quantifies the maximal number of fully entangled bits achievable from ρ_AB with local operations restricted to B. H_max(A|B)_ρ quantifies, for instance, how random A appears (when used to generate a key, for example) from the point of view of an adversary with access to B [10].

3. Distance measures

We call the metric ‖ρ − σ‖_1 induced by the 1-norm the trace distance of ρ, σ ∈ S_≤(H) and omit the usual factor 1/2 in the definition. This distance measure determines the maximal distinguishability of the states ρ and σ [14]. A notion of the similarity of two states is given by the fidelity, which generalizes the Hilbert space scalar product to mixed states. For ρ, σ ∈ S_≤(H) it is defined by

F(ρ, σ) := ‖√ρ √σ‖_1 .   (A6)

If one of the states is pure, say ρ = |ψ⟩⟨ψ|, we have

F(|ψ⟩⟨ψ|, σ) = √(⟨ψ|σ|ψ⟩) .   (A7)

The fidelity can only increase under CPTPMs (e.g. partial traces) [28], i.e.

F(T(ρ), T(σ)) ≥ F(ρ, σ) .   (A8)

Many important properties involving the fidelity can be derived from the following theorem [29].

Theorem A.1 (Uhlmann's theorem). For ρ, σ ∈ S_=(H) we have

F(ρ, σ) = max_{|ψ⟩, |φ⟩} |⟨ψ|φ⟩|   (A9)

where the maximum is over all purifications |ψ⟩ of ρ and |φ⟩ of σ. For a fixed purification |ψ⟩ it suffices to maximize over all |φ⟩.

The fidelity and the trace distance are essentially equivalent measures of the distance/similarity of two states ρ, σ ∈ S_=(H), as shown by the Fuchs-van de Graaf inequalities [30]

1 − F(ρ, σ) ≤ (1/2)‖ρ − σ‖_1 ≤ √(1 − F(ρ, σ)²) .   (A10)

By use of the fidelity we can define a distance measure satisfying many natural conditions. We first introduce the generalized fidelity for subnormalized states ρ, σ ∈ S_≤(H),

F̄(ρ, σ) := F(ρ, σ) + √((1 − tr ρ)(1 − tr σ)) ,   (A11)

which coincides with the usual fidelity if at least one of the states is normalized. This allows us to define the purified distance

P(ρ, σ) = √(1 − F̄(ρ, σ)²) .   (A12)

For subnormalized states ρ, σ ∈ S_≤(H) the purified distance satisfies the following properties [31]:

• It is a metric.
• It cannot increase under CPTPMs.
• It is invariant under extensions and purifications in the sense that for every extension (purification) ρ̄ of ρ we can find an extension (purification) σ̄ of σ such that P(ρ, σ) = P(ρ̄, σ̄).

We can find a statement similar to the Fuchs-van de Graaf inequalities for the purified distance and for subnormalized states.

Lemma A.2. For ρ, σ ∈ S_≤(H) we have

(1/2)‖ρ − σ‖_1 ≤ P(ρ, σ) ≤ √(2‖ρ − σ‖_1) .   (A13)

If ρ, σ ∈ S_=(H) we have

(1/2)‖ρ − σ‖_1 ≤ P(ρ, σ) ≤ √(‖ρ − σ‖_1) .   (A14)

Proof. Combining [31, Definition 1 and Lemma 6] we have

(1/2)‖ρ − σ‖_1 + (1/2)|tr ρ − tr σ| ≤ P(ρ, σ) ≤ √(‖ρ − σ‖_1 + |tr ρ − tr σ|) .   (A15)

The second statement then follows trivially, the first statement follows with the observation that

|tr ρ − tr σ| ≤ ‖ρ − σ‖_1 .   (A16)
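A small numerical sketch (ours) of the relations (A10) and (A14) for randomly chosen normalized states, using the conventions of this appendix (trace distance without the factor 1/2, and P(ρ, σ) = √(1 − F(ρ, σ)²) for normalized states):

    import numpy as np

    rng = np.random.default_rng(5)

    def rand_state(d, rng):
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        rho = g @ g.conj().T
        return rho / np.trace(rho).real

    def sqrtm_psd(rho):
        lam, V = np.linalg.eigh(rho)
        return (V * np.sqrt(np.clip(lam, 0, None))) @ V.conj().T

    def fidelity(rho, sigma):
        # F(rho, sigma) = || sqrt(rho) sqrt(sigma) ||_1,  Eq. (A6).
        M = sqrtm_psd(rho) @ sqrtm_psd(sigma)
        return np.linalg.svd(M, compute_uv=False).sum()

    def trace_dist(rho, sigma):
        # ||rho - sigma||_1 without the factor 1/2, as in this appendix.
        return np.linalg.svd(rho - sigma, compute_uv=False).sum()

    for _ in range(5):
        rho, sigma = rand_state(3, rng), rand_state(3, rng)
        F, D = fidelity(rho, sigma), trace_dist(rho, sigma)
        P = np.sqrt(max(0.0, 1 - F**2))          # purified distance, normalized states
        assert 1 - F <= D / 2 + 1e-9 and D / 2 <= np.sqrt(1 - F**2) + 1e-9   # (A10)
        assert D / 2 <= P + 1e-9 and P <= np.sqrt(D) + 1e-9                  # (A14)
    print("checked (A10) and (A14) on random pairs of states")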
For ρ ∈ S (H) and ε ≥ 0 ≤ with trρ≥ε2 we define an ε-ball in S (H) around ρ as ≤ Bε(ρ):={σ ∈S (H):P(ρ,σ)≤ε} . (A17) ≤ From the triangle inequality for P we find the following triangle inequality for the ε-balls: τ ∈Bε(ρ)∧σ ∈Bε(cid:48)(τ)⇒σ ∈Bε+ε(cid:48)(ρ) . (A18) For more details about the purified distance and ε-balls we refer to [31]. 9 4. Smooth entropy measures A problem with the conditional min- and max-entropies introduced above is that they are sensitive to small varia- tions of the state on which they are defined whereas the physical quantities we are bounding with them generally are not. Following an idea first introduced to quantum mechanics in [32] we will therefore use “smooth” versions of these entropy measures. [41] Roughly speaking, the smoothing means that states which are highly untypical do not have to be taken into account. For ε≥0 and ρ ∈S (H ) we define the ε-smooth min-entropy of A conditioned on B as AB ≤ AB Hε (A|B) := sup H (A|B) (A19) min ρ min ρˆ ρˆAB∈Bε(ρAB) and the ε-smooth max-entropy of A conditioned on B as Hε (A|B) := inf H (A|B) . (A20) max ρ max ρˆ ρˆAB∈Bε(ρAB) Since all Hilbert spaces in this appendix are finite dimensional, we can and will replace the suprema and infima by maxima and minima, respectively. In particular, we will make use of the fact that there is a state in the ε-ball which achieves the extremal value. Note that Hε (A|B) is monotously increasing and Hε (A|B) monotously decreasing min ρ max ρ in ε. The relevance of the smooth entropies is due to the fact that they can be given an operational meaning in one-shot scenarios, where ε usually plays the role of an error probability. On the other hand, the von Neumann entropy is mainly relevant in an i.i.d. scenario. Hε (A|B), for example, quantifies the work cost to erase system A conditioned max on a memory B, except with a certain probability [11]. The smooth min- and max-entropy are dual to each other in the sense that if ρ ∈S (H ) is pure we have ABC ≤ ABC [31] Hε (A|B) =−Hε (A|C) . (A21) min ρ max ρ Furthermore, Hε (A|B) is invariant under isometries acting on A or B, i.e. it does not depend on the Hilbert space min ρ used torepresentthe density operator locally. These two propertiesof the smooth entropy measures cruciallydepend on the choice of P as the relevant distance measure. The smooth entropy measures share natural properties with the usual von Neumann entropy like strong subadditivity [31, footnote 7] Hε (A|BC) ≤Hε (A|B) and min ρ min ρ Hε (A|BC) ≤Hε (A|B) . (A22) max ρ max ρ It canbe seenfrom the Schmidtdecomposition thatfor apure state |φ(cid:105)(cid:104)φ| the entropiesof the marginalson the AB A- and B-subsystem are identical. This observation generalizes to the case of the smooth entropy measures. Lemma A.3. Let |φ(cid:105)(cid:104)φ| ∈S (H ⊗H ) be a pure state. Then, AB = A B Hε (A) =Hε (B) and min φ min φ Hε (A) =Hε (B) . (A23) max φ max φ Proof. Sincetr |φ(cid:105)(cid:104)φ| andtr |φ(cid:105)(cid:104)φ| havethesameeigenvalues, thereisanisometrymappingonetotheother. A AB B AB The statement then follows directly from the invariance of the smooth entropy measures under isometries. While the smooth entropy measures coincide (for ε → 0) with the von Neumann entropy for density operators which are proportional to projectors, they are strictly more general. We discover the von Neumann entropy from the smooth entropy measures in an i.i.d. (independent and identically distributed) scenario. Theorem A.4 (Fully Quantum Asymptotic Equipartition Property). 
Theorem A.4 (Fully Quantum Asymptotic Equipartition Property). [13] Let ε > 0 and let ρ_AB ∈ S_=(H_A ⊗ H_B). Then,

lim_{ε→0} lim_{n→∞} (1/n) H^ε_min(A|B)_{ρ^⊗n} = H(A|B)_ρ   (A24)
lim_{ε→0} lim_{n→∞} (1/n) H^ε_max(A|B)_{ρ^⊗n} = H(A|B)_ρ .   (A25)

5. Chain rules

In order to deal with the introduced smooth entropy measures, chain rules are indispensable.

Lemma A.5. [23, Lemma A.6] Let ε > 0, ε', ε'' ≥ 0 and ρ_ABC ∈ S_=(H_ABC). Then,

H^{ε'}_min(A|BC)_ρ ≤ H^{ε+2ε'+ε''}_min(AB|C)_ρ − H^{ε''}_min(B|C)_ρ + log(2/ε²) .   (A26)

In the other direction (i.e. in order to lower-bound H^ε_min(A|B)_ρ), we will use two chain rules, neither of which is stronger than the other. Since we will not need it, we omit the conditioning system C.

Lemma A.6. For any ε ≥ 0, ρ_AB ∈ S_≤(H_A ⊗ H_B) we have

H^ε_min(A|B)_ρ ≥ H^ε_min(AB)_ρ − log d_B .   (A27)

Proof. Choose ρ̃_AB ∈ B^ε(ρ_AB) such that H_min(AB)_ρ̃ = H^ε_min(AB)_ρ. From [33, Lemma 3.1.10] [42] we have

H_min(A|B)_ρ̃ ≥ H_min(AB)_ρ̃ − log rank ρ̃_B .   (A28)

By definition H^ε_min(A|B)_ρ ≥ H_min(A|B)_ρ̃ and log rank ρ̃_B ≤ log d_B, and hence the assertion.

Lemma A.7. Let ε > 0 and ρ_AB ∈ S_=(H_A ⊗ H_B). Then,

H^ε_min(A|B)_ρ ≥ H^{ε/2}_min(AB)_ρ − H^{ε/2}_max(B)_ρ − 2 log(24/ε²) .   (A29)

Proof. We introduce the auxiliary quantity

H_R(A)_ρ := −sup{ λ ∈ ℝ : ρ_A ≥ 2^λ · ρ_A^0 }   (A30)

where ρ_A^0 denotes the projector onto supp(ρ_A). Since H_R(A)_ρ is the negative logarithm of the smallest non-zero eigenvalue of ρ_A it is obvious that H_R(A)_ρ ≥ log rank ρ_A. Using (A28) we find

H_min(A|B)_ρ ≥ H_min(AB)_ρ − log rank ρ_B ≥ H_min(AB)_ρ − H_R(B)_ρ .   (A31)

By the definition of the smooth min-entropy and (A31) we have

H^ε_min(A|B)_ρ ≥ max_{ρ̂_AB ∈ B^ε(ρ_AB)} { H_min(AB)_ρ̂ − H_R(B)_ρ̂ }
             ≥ max_{ω_AB ∈ B^{ε/2}(ρ_AB)} { max_{Π_B} [ H_min(AB)_{Π_B ω Π_B} − H_R(B)_{Π_B ω Π_B} ] } .   (A32)

The maximum max_{Π_B} ranges over all 0 ≤ Π_B ≤ I_B such that Π_B ω_AB Π_B ∈ B^{ε/2}(ω_AB) and hence, by use of the triangle inequality (A18), Π_B ω_AB Π_B ∈ B^ε(ρ_AB). Using the auxiliary Lemma F.1 we find

H^ε_min(A|B)_ρ ≥ max_{ω_AB ∈ B^{ε/2}(ρ_AB)} { H_min(AB)_ω − min_{Π_B} [ H_R(B)_{Π_B ω Π_B} ] } .   (A33)

As a next step we choose ω_AB = ω̃_AB ∈ B^{ε/2}(ρ_AB) such that H^{ε/2}_min(AB)_ρ = H_min(AB)_ω̃. Hence we get

H^ε_min(A|B)_ρ ≥ H^{ε/2}_min(AB)_ρ − min_{Π_B} [ H_R(B)_{Π_B ω̃ Π_B} ] ,   (A34)

where the minimum now ranges over all 0 ≤ Π_B ≤ I_B such that Π_B ω̃_AB Π_B ∈ B^{ε/2}(ω̃_AB). Using Lemma F.2 we can choose Π_B such that

H_R(B)_{Π_B ω̃ Π_B} ≤ H^{ε²/24}_max(B)_ω̃ − 2 log(ε²/24) .   (A35)

From this we finally obtain

H^ε_min(A|B)_ρ ≥ H^{ε/2}_min(AB)_ρ − H^{ε²/24}_max(B)_ω̃ + 2 log(ε²/24)
             ≥ H^{ε/2}_min(AB)_ρ − H^{ε/2}_max(B)_ρ + 2 log(ε²/24) .   (A36)
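To see why the smoothing in Theorem A.4 is essential, consider the unconditional qubit case ρ = diag(p, 1−p): without smoothing, the per-copy min- and max-entropies of ρ^⊗n are independent of n and bracket, but do not reach, the von Neumann entropy. A short sketch (ours, with the arbitrary choice p = 3/4):

    import numpy as np

    p = 0.75
    # Without smoothing:  H_min(rho^n) = -n log2 p   and
    #                     H_max(rho^n) = 2 n log2(sqrt(p) + sqrt(1-p)),
    # so the per-copy rates below do not depend on n; only the smoothing in
    # Theorem A.4 closes the gap to the von Neumann entropy H.
    rate_min = -np.log2(p)
    rate_max = 2 * np.log2(np.sqrt(p) + np.sqrt(1 - p))
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    print(f"H_min rate = {rate_min:.3f} < H = {h:.3f} < H_max rate = {rate_max:.3f}")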
