Statistical Complexity and Fisher-Shannon Information. Applications

Ricardo López-Ruiz, Jaime Sañudo, Elvira Romera, Xavier Calbet

– Book Chapter –

January 13, 2012

arXiv:1201.2291v2 [nlin.CD]

Contents

1 Statistical Complexity and Fisher-Shannon Information. Applications
  1.1 A Statistical Measure of Complexity. Some Applications
    1.1.1 Complexity in a phase transition: coupled map lattices
    1.1.2 Complexity versus time: the tetrahedral gas
  1.2 The Statistical Complexity in the Continuous Case
    1.2.1 Entropy/information and disequilibrium
    1.2.2 The continuous version Ĉ of the LMC complexity
    1.2.3 Properties of Ĉ
  1.3 Fisher-Shannon Information Product. Some Applications
    1.3.1 Fisher-Shannon information: definition and properties
    1.3.2 Fisher-Shannon product as an electronic correlation measure
    1.3.3 Fisher information for a single particle in a central potential
  1.4 Applications to Quantum Systems
    1.4.1 Formulas in position and momentum spaces
    1.4.2 The H-atom
    1.4.3 The quantum harmonic oscillator
    1.4.4 The square well
    1.4.5 The periodic table
    1.4.6 Magic numbers in nuclei
  References

Chapter 1
Statistical Complexity and Fisher-Shannon Information. Applications

Abstract  In this chapter, a statistical measure of complexity and the Fisher-Shannon information product are introduced and their properties are discussed. These measures are based on the interplay between the Shannon information, or a function of it, and the separation of the set of states accessible to a system from the equiprobability distribution, i.e. the disequilibrium or the Fisher information, respectively. Different applications in discrete and continuous systems are shown. Some of them are concerned with quantum systems, from prototypical systems such as the H-atom, the harmonic oscillator and the square well to others such as He-like ions, Hooke's atoms or just the periodic table. In all of them, these statistical indicators show an interesting behavior able to discern and highlight some conformational properties of those systems.

1.1 A Statistical Measure of Complexity. Some Applications

This century has been said to be the century of complexity [1]. Nowadays the question "what is complexity?" is circulating over the scientific crossroads of physics, biology, mathematics and computer science, although, under the present understanding of the world, answering it may not seem urgent. However, many different points of view have been developed in this respect, and hence a lot of different answers can be found in the literature. Here we explain one of these options in detail.

On the most basic grounds, an object, a procedure, or a system is said to be "complex" when it does not match patterns regarded as simple. This sounds rather like an oxymoron, but common knowledge tells us what is simple and what is complex: simplified systems or idealizations are always a starting point to solve scientific problems. The notion of "complexity" in physics [2, 3] starts by considering the perfect crystal and the isolated ideal gas as examples of simple models and therefore as systems with zero "complexity". Let us briefly recall their main characteristics with respect to "order", "information" and "equilibrium".
A perfect crystal is completely ordered and the atoms are arranged following stringent rules of symmetry. The probability distribution for the states accessible to the perfect crystal is centered around a prevailing state of perfect symmetry. A small piece of "information" is enough to describe the perfect crystal: the distances and the symmetries that define the elementary cell. The "information" stored in this system can be considered minimal. On the other hand, the isolated ideal gas is completely disordered. The system can be found in any of its accessible states with the same probability. All of them contribute in equal measure to the "information" stored in the ideal gas. It has therefore a maximum "information". These two simple systems are extrema in the scale of "order" and "information". It follows that the definition of "complexity" must not be made in terms of just "order" or "information".

It might seem reasonable to propose a measure of "complexity" by adopting some kind of distance from the equiprobable distribution of the accessible states of the system. Defined in this way, "disequilibrium" would give an idea of the probabilistic hierarchy of the system. "Disequilibrium" would be different from zero if there are privileged, or more probable, states among those accessible. But this would not work. Going back to the two examples we began with, it is readily seen that a perfect crystal is far from an equidistribution among the accessible states because one of them is totally prevailing, and so "disequilibrium" would be maximum. For the ideal gas, "disequilibrium" would be zero by construction. Therefore such a distance or "disequilibrium" (a measure of a probabilistic hierarchy) cannot be directly associated with "complexity".

In Figure 1.1 we sketch an intuitive qualitative behavior for "information" H and "disequilibrium" D for systems ranging from the perfect crystal to the ideal gas. This graph suggests that the product of these two quantities could be used as a measure of "complexity": C = H·D. The function C has indeed the features and asymptotic properties that one would expect intuitively: it vanishes for the perfect crystal and for the isolated ideal gas, and it is different from zero for the rest of the systems of particles. We will follow these guidelines to establish a quantitative measure of "complexity".

Before attempting any further progress, however, we must recall that "complexity" cannot be measured univocally, because it depends on the nature of the description (which always involves a reductionist process) and on the scale of observation. Let us take an example to illustrate this point. A computer chip can look very different at different scales. It is an entangled array of electronic elements at microscopic scale, but only an ordered set of pins attached to a black box at a macroscopic scale.

Fig. 1.1 Sketch of the intuitive notion of the magnitudes of "information" (H) and "disequilibrium" (D) for the physical systems and the behavior intuitively required for the magnitude "complexity". The quantity C = H·D is proposed to measure such a magnitude.

We shall now discuss a measure of "complexity" based on the statistical description of systems. Let us assume that the system has N accessible states {x_1, x_2, ..., x_N} when observed at a given scale. We will call this an N-system. Our understanding of the behavior of this system determines the corresponding probabilities {p_1, p_2, ..., p_N} (with the condition \sum_{i=1}^{N} p_i = 1) of each state (p_i > 0 for all i). Then the knowledge of the underlying physical laws at this scale is incorporated into a probability distribution for the accessible states.
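As a minimal numerical aside (ours, not part of the original chapter; the concrete numbers and variable names are purely illustrative), such an N-system can be represented by a normalized probability vector, and the two reference cases above can be written down directly:

```python
# Minimal sketch: an N-system at a given scale is just a probability vector
# with p_i > 0 and sum_i p_i = 1.  Numbers below are illustrative only.
import numpy as np

N = 4

# "Crystal-like" N-system: one prevailing state concentrates the probability.
p_crystal = np.array([0.97, 0.01, 0.01, 0.01])

# "Ideal-gas-like" N-system: equiprobability among all accessible states.
p_gas = np.full(N, 1.0 / N)

for name, p in (("crystal-like", p_crystal), ("gas-like", p_gas)):
    assert np.all(p > 0) and np.isclose(p.sum(), 1.0)
    print(name, p)
```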
It is possible to find a quantity measuring the amount of "information". Under the most elementary conditions of consistency, Shannon [4] determined the unique function H(p_1, p_2, ..., p_N) that accounts for the "information" stored in a system:

  H = -K \sum_{i=1}^{N} p_i \log p_i ,     (1.1)

where K is a positive constant. The quantity H is called information. The redefinition of information H as some type of monotone function of the Shannon entropy can also be useful in many contexts, as we shall show in the next sections. In the case of a crystal, a state x_c would be the most probable, p_c ∼ 1, and all others x_i would be very improbable, p_i ∼ 0, i ≠ c. Then H_c ∼ 0. On the other side, equiprobability characterizes an isolated ideal gas, p_i ∼ 1/N for all i, so H_g ∼ K log N, i.e., the maximum of information for an N-system. (Notice that if one assumes equiprobability and K = k ≡ Boltzmann constant, H is identified with the thermodynamic entropy, S = k log N.) Any other N-system will have an amount of information between those two extrema.

Let us propose a definition of disequilibrium D in an N-system [5]. The intuitive notion suggests that some kind of distance from an equiprobable distribution should be adopted. Two requirements are imposed on the magnitude of D: D > 0 in order to have a positive measure of "complexity", and D = 0 in the limit of equiprobability. The straightforward solution is to add the quadratic distances of each state to the equiprobability as follows:

  D = \sum_{i=1}^{N} \left( p_i - \frac{1}{N} \right)^2 .     (1.2)

According to this definition, a crystal has maximum disequilibrium (for the dominant state, p_c ∼ 1, and D_c → 1 for N → ∞), while the disequilibrium for an ideal gas vanishes (D_g ∼ 0) by construction. For any other system D will have a value between these two extrema.

We now introduce the definition of complexity C of an N-system [6, 7]. This is simply the interplay between the information stored in the system and its disequilibrium:

  C = H \cdot D = -\left( K \sum_{i=1}^{N} p_i \log p_i \right) \cdot \left( \sum_{i=1}^{N} \left( p_i - \frac{1}{N} \right)^2 \right) .     (1.3)

This definition fits the intuitive arguments. For a crystal, disequilibrium is large but the information stored is vanishingly small, so C ∼ 0. On the other hand, H is large for an ideal gas, but D is small, so C ∼ 0 as well. Any other system will have an intermediate behavior and therefore C > 0.

As was intuitively suggested, the definition of complexity (1.3) also depends on the scale. At each scale of observation a new set of accessible states appears with its corresponding probability distribution, so that complexity changes. Physical laws at each level of observation allow us to infer the probability distribution of the new set of accessible states, and therefore different values for H, D and C will be obtained. The passage to the case of a continuum of states, x, can be easily inferred. Thus we must deal with probability distributions with a continuum support, p(x), and normalization condition \int_{-\infty}^{+\infty} p(x)\, dx = 1. Disequilibrium has the limit D = \int_{-\infty}^{+\infty} p^2(x)\, dx, and the complexity could be defined by:

  C = H \cdot D = \left( -K \int_{-\infty}^{+\infty} p(x) \log p(x)\, dx \right) \cdot \left( \int_{-\infty}^{+\infty} p^2(x)\, dx \right) .     (1.4)

As we shall see, other choices for the continuous extension of C are also possible.

Fig. 1.2 In general, the dependence of complexity (C) on the normalized information (H) is not univocal: many distributions {p_i} can present the same value of H but different C. This is shown in the case N = 3.
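As a concrete check of definitions (1.1)-(1.3), the short Python sketch below (again ours, for illustration only; it assumes K = 1 and natural logarithms, and the function names are not from the text) evaluates H, D and C for the two extreme N-systems and recovers the limits discussed above: C vanishes both for the perfect crystal and for the ideal gas.

```python
# Illustrative sketch of Eqs. (1.1)-(1.3) for discrete N-systems; not code
# from the chapter.  We take K = 1 and natural logarithms.
import numpy as np

def shannon_H(p, K=1.0):
    """Information H = -K * sum_i p_i log p_i, Eq. (1.1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # convention: 0 log 0 = 0
    return -K * np.sum(p * np.log(p))

def disequilibrium_D(p):
    """Disequilibrium D = sum_i (p_i - 1/N)^2, Eq. (1.2)."""
    p = np.asarray(p, dtype=float)
    return np.sum((p - 1.0 / p.size) ** 2)

def complexity_C(p, K=1.0):
    """Statistical (LMC) complexity C = H * D, Eq. (1.3)."""
    return shannon_H(p, K) * disequilibrium_D(p)

N = 1000
p_crystal = np.zeros(N)
p_crystal[0] = 1.0                        # one totally prevailing state
p_gas = np.full(N, 1.0 / N)               # equiprobability

print(complexity_C(p_crystal))            # ~0: H = 0 although D ~ 1
print(complexity_C(p_gas))                # ~0: D = 0 although H = log N
```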
Direct simulations of the definition give the values of C for general N-systems. The set of all the possible distributions {p_1, p_2, ..., p_N} in which an N-system can be found is sampled. For the sake of simplicity, H is normalized to the interval [0, 1]. Thus H = -\sum_{i=1}^{N} p_i \log p_i / \log N. For each distribution {p_i} the normalized information H({p_i}) and the disequilibrium D({p_i}) (eq. 1.2) are calculated. In each case the normalized complexity C = H·D is obtained and the pair (H, C) is stored. These two magnitudes are plotted on a diagram (H, C(H)) in order to verify the qualitative behavior predicted in Figure 1.1. For N = 2 an analytical expression for the curve C(H) is obtained. If the probability of one state is p_1 = x, that of the second one is simply p_2 = 1 - x. The complexity of the system will be:

  C(x) = H(x) \cdot D(x) = -\frac{1}{\log 2} \left[ x \log \frac{x}{1-x} + \log(1-x) \right] \cdot \left[ 2 \left( x - \frac{1}{2} \right)^2 \right] .     (1.5)

Complexity vanishes for the two simplest 2-systems: the crystal (H = 0; p_1 = 1, p_2 = 0) and the ideal gas (H = 1; p_1 = 1/2, p_2 = 1/2). Let us notice that this curve is the simplest one that fulfills all the conditions discussed in the introduction. The largest complexity is reached for H ∼ 1/2 and its value is C(x ∼ 0.11) ∼ 0.151. For N > 2 the relationship between H and C is no longer univocal. Many different distributions {p_i} store the same information H but have different complexity C. Figure 1.2 displays such a behavior for N = 3. If we take the maximum complexity C_max(H) associated with each H, a curve similar to the one for a 2-system is recovered. Every 3-system will have a complexity below this line and above the line of C_min(H), and also above the minimum envelope complexity C_minenv. These lines will be found analytically in a later section. In Figure 1.3 the curves C_max(H) for the cases N = 3, ..., 10 are also shown. Let us observe the shift of the complexity-curve peak to smaller values of entropy for rising N. This agrees with the intuition that, for systems with a larger number of states, the largest complexity (the greater number of possibilities of "complexification") is reached at smaller entropies.

Fig. 1.3 Complexity (C = H·D) as a function of the normalized information (H) for a system with two accessible states (N = 2). Curves of maximum complexity (C_max) are also shown for the cases N = 3, ..., 10.
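The following Python sketch (ours, not the authors' simulation code; the Dirichlet sampling of the simplex, the scan resolution and all names are our own choices) scans the N = 2 curve (1.5) for its peak and samples the N = 3 probability simplex to show that distributions with essentially the same H can have quite different C, as in Figure 1.2.

```python
# Sketch: peak of the N = 2 curve (1.5) and the non-univocal (H, C) relation
# for N = 3.  Illustrative only, not taken from the chapter.
import numpy as np

def H_norm(p):
    """Shannon information normalized to [0, 1]: -sum_i p_i log p_i / log N."""
    p = np.asarray(p, dtype=float)
    q = p[p > 0]
    return -np.sum(q * np.log(q)) / np.log(p.size)

def D(p):
    """Disequilibrium, Eq. (1.2)."""
    p = np.asarray(p, dtype=float)
    return np.sum((p - 1.0 / p.size) ** 2)

# N = 2: C(x) = H(x) * 2 (x - 1/2)^2 with H normalized by log 2, Eq. (1.5).
x = np.linspace(1e-6, 1.0 - 1e-6, 200001)
H2 = -(x * np.log(x) + (1.0 - x) * np.log(1.0 - x)) / np.log(2.0)
C2 = H2 * 2.0 * (x - 0.5) ** 2
i = int(np.argmax(C2))
print(f"N = 2: peak C ~ {C2[i]:.3f} at x ~ {x[i]:.2f} (H ~ {H2[i]:.2f})")
# prints a peak of ~0.15 for H around 1/2, in line with the values quoted above

# N = 3: sample the probability simplex uniformly; distributions with (almost)
# the same H can have quite different C, so C(H) is a band, not a single curve.
rng = np.random.default_rng(0)
samples = rng.dirichlet(np.ones(3), size=50000)
HC = []
for p in samples:
    h = H_norm(p)
    HC.append((h, h * D(p)))
HC = np.array(HC)
band = HC[np.abs(HC[:, 0] - 0.5) < 0.01]          # samples with H ~ 0.5
if len(band):
    print(f"N = 3, H ~ 0.5: C spans [{band[:, 1].min():.3f}, {band[:, 1].max():.3f}]")
```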
