Stochastic optimization of a cold atom experiment using a genetic algorithm

W. Rohringer,¹ R. Bücker,¹ S. Manz,¹ T. Betz,¹ Ch. Koller,¹ M. Göbel,¹ A. Perrin,¹ J. Schmiedmayer,¹ and T. Schumm¹,*

¹Atominstitut der Österreichischen Universitäten, TU-Wien, Stadionallee 2, 1020 Vienna, Austria
*Electronic address: [email protected]

(Dated: January 15, 2009)
arXiv:0810.4474v2 [physics.atom-ph]

We employ an evolutionary algorithm to automatically optimize different stages of a cold atom experiment without human intervention. This approach closes the loop between computer-based experimental control systems and automatic real-time analysis and can be applied to a wide range of experimental situations. The genetic algorithm quickly and reliably converges to the best-performing parameter set, independent of the starting population. Especially in high-dimensional or interconnected parameter spaces, the automatic optimization outperforms a manual search.

In experimental research, scientists are often confronted with the task of finding ideal parameters to perform a measurement. Although in modern setups experimental parameters are handled by computer-based control systems, and the evaluation of the experimental output is likewise often performed by real-time computer systems, optimization is still essentially carried out by hand.

In this article we present the application of a genetic algorithm to automatically optimize control parameters in an atomic physics experiment. Such algorithms have been employed to tackle optimization problems in fields ranging from mathematics and algorithmics [1] through economics [2] and aerospace engineering [3] to game theory [4]. Our algorithm generates a set of random starting parameters, automatically performs the experiment, and recovers information about the performance of the parameter set from real-time analysis of the experimental result. This information is used to generate the next generation of parameters with better performance, until an optimum is reached (see Figure 1). The approach can be implemented in any experimental setup that requires optimization, provided that it is computer controlled and features real-time analysis.

FIG. 1: The feedback loop between control system, experiment and real-time analysis is closed by the genetic algorithm. [Diagram blocks: sequence control and parameter hardware (control system); imaging systems (experiment setup); event evaluation (analysis system).]
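This closed loop is straightforward to express in code. Below is a minimal Python sketch of the idea, not the authors' implementation: run_experiment, evaluate_image and the simplified breeding step are hypothetical stand-ins for the real-time control system, the image analysis, and the genetic operators described later in the text.

```python
import random

def run_experiment(params):
    """Hypothetical stand-in: would program the 35 s sequence with the
    given parameter set and return the acquired image."""
    return params  # dummy "image"

def evaluate_image(image):
    """Hypothetical stand-in: would extract the objective from the image,
    e.g. the total atom number N."""
    return -sum((v - 0.5) ** 2 for v in image.values())  # dummy score

def optimize(bounds, pop_size=10, n_generations=10):
    """Closed loop: propose parameter sets, run the experiment on each,
    score the results and breed the next generation."""
    population = [
        {k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        for _ in range(pop_size)
    ]
    best = None
    for _ in range(n_generations):
        scored = sorted(
            ((evaluate_image(run_experiment(p)), p) for p in population),
            key=lambda sp: sp[0], reverse=True)
        if best is None or scored[0][0] > best[0]:
            best = scored[0]
        # Simplified breeding: keep the better half, redraw the rest.
        # The paper's actual operators (linear ranking, stochastic
        # universal sampling, intermediary recombination, Muehlenbein
        # mutation) are sketched further below.
        population = [p for _, p in scored[:pop_size // 2]] + [
            {k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
            for _ in range(pop_size - pop_size // 2)
        ]
    return best

# Illustrative call with the three molasses parameters of Figure 3
# (ranges are made up for the example):
print(optimize({"detuning": (-80.0, -60.0),
                "up_down": (-2.0, 2.0),
                "bias": (0.0, 2.0)}))
```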
The experimental setup used in these investigations is a single-chamber atom chip setup for the creation and study of Bose-Einstein condensates (BEC) [5]. Pre-cooled 87Rb atoms are loaded into a magnetic trap created by current-carrying macroscopic copper structures. Evaporative cooling is applied, creating a sample of 10^6 atoms at microkelvin temperatures. These atoms are loaded onto an atom chip, where microscopic traps are generated by lithographically patterned wires [6, 7], and cooled to quantum degeneracy in a second phase of evaporative cooling. The wire structures on the atom chip allow a large variety of experiments, like matter wave interferometry [8] or the study of low-dimensional quantum gases [9]. After the experiment is performed, the atomic density distribution is recorded using standard absorption or fluorescence imaging [10], either in the atom chip trap or in free time-of-flight expansion. As the atomic sample is destroyed in the imaging process, the above procedure has to be repeated many times when parameters are changed and to accumulate statistics. The duration of a single experiment is 35 s; the experimental cycle is repeated continuously, and data taking can cover several days. Note that individual operations within the experimental sequence take place on timescales on the order of tens of microseconds, necessitating a relative temporal resolution at the 10^-6 level. A commercial real-time control system (ADwin Pro I), in connection with custom interface software, coordinates the 60 individual devices involved. They are governed by a total parameter space of 2300 values during a typical experimental sequence, a subspace of which is made accessible for each individual optimization task.

Each experimental cycle produces an image of the atom cloud, which yields a projected 2D measurement of the particle density distribution. Important physical quantities like the total number of atoms N, the peak density n_0, the position of the cloud, or the phase space density can be derived from a single picture. In terms of optimization theory, these quantities can be interpreted as objective functions f(j_i), where j_i denotes a single parameter set out of the overall parameter space. Optimizing the experimental setup corresponds to finding a parameter set j_i that maximizes (minimizes) a given objective function. Probing this function hence means performing the experimental sequence with a given parameter set and determining the desired measurement value. Since in our experiments usually only a few or even a single output quantity is considered, the problem is a prime example of a single-objective optimization problem. Such tasks can be tackled effectively by stochastic algorithms [11].

The simplest approach would be to evaluate the objective function on the entire parameter space. However, the number of necessary experimental runs grows rapidly with the dimension of the parameter space. With 10 evaluations per parameter, a scan over 1, 2, 3, or 4 parameters (i.e. 10, 100, 1000, or 10000 parameter sets) takes about 6 min, 1 h, 10 h, or 4 days, respectively.
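These durations follow directly from the 35 s cycle time quoted above; a few lines of Python reproduce the numbers:

```python
# Cost of an exhaustive grid scan: 10 values per parameter, 35 s per run.
CYCLE_TIME_S = 35
for n_params in range(1, 5):
    runs = 10 ** n_params
    hours = runs * CYCLE_TIME_S / 3600
    print(f"{n_params} parameter(s): {runs:>5} runs, {hours:6.1f} h")
# -> 0.1 h (~6 min), 1.0 h, 9.7 h and 97.2 h (~4 days), as quoted.
```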
An alternative approach is offered by deterministic algorithms such as hill climbing [12], where knowledge of the gradient of the objective function is used to generate improved parameter sets. They perform well for problems that behave approximately linearly in the objective function, and they guarantee convergence for unimodal problems, i.e. objective functions featuring only one extremum. For multimodal problems, however, they are prone to converging to local extrema.

Stochastic algorithms trade guaranteed convergence for shorter runtime in applications with a high number of (interacting) parameters [12]. Since they introduce a random element into the selection of points in parameter space, there is no formal way to prove that a found solution is really the global optimum. Within stochastic algorithms, genetic algorithms belong to the class of parallel optimization algorithms, which do not rest on the evaluation of individual states in parameter space but rather evaluate the performance of an ensemble of states (a generation). Compared with local algorithms, this approach is less dependent on the specific initial conditions of each optimization run. Among the different parallel methods, genetic algorithms are well explored, with numerous known implementational strategies [13, 14].

Figure 2 illustrates the working concept of our genetic algorithm. In a first step, a starting population is distributed randomly in a parameter sub-space spanned by 2-5 variables associated with a certain stage of the experiment. Sometimes population individuals near a known optimum or on the edges of parameter space are added to speed up convergence or to test the performance of the algorithm. The size of the initial population has to be chosen carefully: large populations are time consuming, as they increase the number of evaluations per generation as well as the number of generations. Too small starting populations have been found to lead to premature convergence, as the diversity of the individuals decreases quickly. An intermediate initial population size of 10-15 individuals has proven to perform well in various optimization tasks.

FIG. 2: Block diagram of the optimization process. [Steps: stochastically generate initial generation of parameter sets; perform experiment to evaluate generation individuals; rank individuals according to performance; assign fitness to individuals of one generation; choose parent individuals according to fitness; generate children generation by recombination; perform mutation on individuals of children generation; repeat.]

To evaluate the performance of a population, the experimental sequence is executed with the parameter settings represented by the individuals. The individuals are then sorted according to their performance with respect to the selected objective function, e.g. particle number or temperature.

In analogy to evolution terminology, the fitness F(j_i) of a population individual is a measure of the probability for this state to be selected for reproduction. We have chosen linear-ranking-based fitness assignment [15], which considers the relative performance of individuals but is independent of the achieved absolute value of the objective function. It distributes the individual fitnesses between 0 and 1 according to F(j_i) = (2/n) * (1 - (i-1)/(n-1)), where n is the size of the population and i = 1, ..., n is the performance rank of the individual (i = 1 being the best).
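As a concrete reading of the formula, the following short Python sketch transcribes this ranking scheme directly (function name and the assertion are ours, for illustration):

```python
def linear_ranking_fitness(n):
    """Linear ranking fitness [15] for ranks i = 1 (best) to n (worst):
    F(j_i) = (2/n) * (1 - (i - 1)/(n - 1)).
    Depends only on the rank, never on the absolute objective value;
    the n fitnesses sum to 1."""
    return [2.0 / n * (1.0 - (i - 1) / (n - 1)) for i in range(1, n + 1)]

# For a population of 10, the best individual gets fitness 0.2,
# the worst gets 0.0:
f = linear_ranking_fitness(10)
assert abs(f[0] - 0.2) < 1e-12 and f[-1] == 0.0
assert abs(sum(f) - 1.0) < 1e-12
```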
The selection process determines the parent individuals that will 'breed' the next generation. From several possible selection schemes we have chosen stochastic universal sampling [13], where parent individuals are chosen according to their fitness, but a stochastic element remains, so that less performing individuals can still contribute to the next generation and population diversity is kept up. The total number of parent states is adjusted to 2/3 of the total population, with 2 children per pair of parent states. Since the population size is chosen to be fixed throughout the optimization process, the free slots are filled up by the best states from the previous generation. This makes our algorithm 'elitist', assuring that the states showing the best performance are not lost during the optimization process, which speeds up convergence.

The next-generation individuals are created by intermediary recombination [16, 17]. The parameters of the two parent states are taken as the limiting edges of a hypercube in parameter space, and the two child states are distributed randomly within this volume. To avoid a too rapid shrinking of the children's parameter space, its volume is stretched by a factor of 1.5 [16, 17]. Both operators are sketched below.
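The following Python sketch shows one way to implement these two operators and the elitist refill; the function names and the dict representation of a parameter set are assumptions of this sketch, not taken from the paper.

```python
import random

def stochastic_universal_sampling(fitness, n_parents):
    """Place n_parents equally spaced pointers on the cumulative fitness
    wheel; a single random offset keeps selection stochastic while the
    expected selection counts match the fitnesses [13]."""
    step = sum(fitness) / n_parents
    start = random.uniform(0.0, step)
    chosen, cum, i = [], fitness[0], 0
    for j in range(n_parents):
        pointer = start + j * step
        while cum < pointer:
            i += 1
            cum += fitness[i]
        chosen.append(i)
    return chosen

def intermediary_recombination(p1, p2, stretch=1.5):
    """Draw a child uniformly from the hypercube spanned by the two
    parents, stretched about its centre by `stretch` so the accessible
    volume does not shrink too quickly [16, 17]."""
    child = {}
    for key in p1:
        centre = 0.5 * (p1[key] + p2[key])
        half = 0.5 * abs(p1[key] - p2[key]) * stretch
        child[key] = random.uniform(centre - half, centre + half)
    return child

def breed(ranked, fitness):
    """One breeding step: 2/3 of the population become parents, each pair
    begets two children, and elitism refills the free slots with the best
    states of the previous (rank-sorted) generation."""
    n = len(ranked)
    parents = [ranked[i] for i in
               stochastic_universal_sampling(fitness, 2 * n // 3)]
    random.shuffle(parents)
    children = [intermediary_recombination(a, b)
                for a, b in zip(parents[::2], parents[1::2])
                for _ in range(2)]
    return (children + ranked)[:n]
```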
Mutation is introduced to further impede convergence to local extrema. Its influence on the performance of genetic algorithms has been investigated in great detail, and several well performing schemes are known [17, 18]. For each state within one generation, there is a certain probability of random alteration, in our case 1%, with the additional constraint that only one of the state's variables is affected. The mutated values are obtained by a mechanism described in [19], which constructs j_i^mut = j_i + s_i * r * D_i * 2^(-u*k), where j_i^mut and j_i denote the mutated and the source state, respectively, s_i randomly chooses the sign of the mutation step, and r defines the mutation range as a fraction of the accessible parameter space D_i. The last factor sets the actual step distribution, characterized by the random number u, uniformly distributed in [0, 1], and the mutation precision k (Mühlenbein's mutation). With r = 0.1 and k = 10 we have adopted established values [16]. After the mutation step, the next child population is ready to be tested on the experiment.
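A direct transcription of this mutation rule into Python might look as follows; clamping the result to the accessible range is an assumption of the sketch, not stated in the paper.

```python
import random

def muehlenbein_mutation(state, bounds, rate=0.01, r=0.1, k=10):
    """Muehlenbein's mutation: with probability `rate`, alter exactly one
    variable of the state by j_mut = j + s * r * D * 2**(-u * k), where
    s is a random sign, D the width of the accessible range, u is uniform
    in [0, 1] and k is the mutation precision (r = 0.1, k = 10 as quoted).
    Small steps are exponentially favoured over large ones."""
    if random.random() >= rate:
        return state
    mutated = dict(state)
    key = random.choice(list(state))
    lo, hi = bounds[key]
    s = random.choice((-1.0, 1.0))
    u = random.random()
    mutated[key] += s * r * (hi - lo) * 2.0 ** (-u * k)
    # Assumption of this sketch: keep the value inside its allowed range.
    mutated[key] = min(max(mutated[key], lo), hi)
    return mutated
```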
For simplicity, the optimization cycle is repeated for a preset number of 10 generations or is stopped manually when the diversity in the population has dropped to the point that all states are equal. More evolved convergence criteria, like the distance of individuals in parameter space or the convergence of the objective function, can be implemented easily.

The described genetic algorithm has been used to optimize various stages of the experimental sequence [5], like the optical pumping into magnetically trappable states or the evaporative cooling ramps. Figure 3 shows an example of an optimization of critical experimental parameters during the phase of optical molasses, which effectuates additional cooling of the atoms coming from the magneto-optical trap before they are loaded into a magnetic wire trap. The three-dimensional parameter space is spanned by two magnetic fields (Up/Down and Bias) and the detuning of the cooling laser frequency with respect to the atomic transition. The two magnetic fields have to be optimized to compensate the earth's magnetic field and possible unknown stray fields; the laser frequency has to be optimized to achieve efficient cooling (high detuning) while keeping the atoms confined (low detuning). As shown in Figure 3, the genetic algorithm converges to optimum parameters within 7 generations with a starting population of 10 individuals. The overall runtime was 30 minutes, compared to the 386 h it would take to sample the entire parameter volume with the same resolution. Note that the optimum was found although the first two generations did not contain a single well performing individual. The algorithm rapidly and reliably finds optimum values for various optimization tasks and reproduces or outperforms manually optimized parameter sets. For too small starting populations, it has proven prone to premature convergence, leading to sub-optimal final parameters. Numerous characteristics of the algorithm, such as the size of the starting population, the number of reproducing individuals per generation, or the mutation rate and range, can be adapted, weighing optimality against runtime.

FIG. 3: Positions of the individuals in the three-dimensional parameter space (detuning in MHz, Up/Down field in G, Bias field in G) through seven generations (Gen 1 to Gen 7). Convergence is clearly observed. On the bottom right, the achieved mean atom number (in units of 10^5) for each generation is depicted. The error bars correspond to the standard deviation of the individuals' results within each generation.

In future work we will extend the approach to objective functions that cannot be evaluated in a single experimental run (image) but probe the temporal dynamics of the system or the scatter of measured parameters around the mean value, requiring several measurements to evaluate a single parameter set. This will allow applications of genetic algorithms beyond mere optimization tasks, like the reduction of excitations or the enhancement of squeezing or entanglement in dynamic matter wave interferometry [20]. Ultimately, we aim to develop "experimental optimal control", in analogy to optimal control theory, where the result of the optimization directly provides scientific insight not accessible by other means.

To conclude, we have implemented a robust and efficient stochastic genetic algorithm to automatically perform time-consuming optimization tasks in a complex cold atom experiment. In the given example, the algorithm converged to optimal settings within 30 min, as compared to the several hours typically required for a manual search. Our approach closes the loop between computer-based experimental control systems and real-time data analysis and can be implemented in many situations in experimental research.

This work was supported by the EU Integrated Project FET/QUIP "SCALA" and the Austrian Science Fund FWF Projects P20372 and P21080.

[1] C. M. Anderson-Cook, R. L. Haupt, and S. E. Haupt, Journal of the American Statistical Association 100, 1099 (2005).
[2] E. K. Burke and J. P. Newall, IEEE Transactions on Evolutionary Computation 3, 63 (1999).
[3] S. Obayashi, D. Sasaki, Y. Takeguchi, and N. Hirose, IEEE Transactions on Evolutionary Computation 4, 182 (2000).
[4] J. W. Weibull, Evolutionary Game Theory (MIT Press, 1995).
[5] S. Wildermuth, P. Krüger, C. Becker, M. Brajdic, S. Haupt, A. Kasper, R. Folman, and J. Schmiedmayer, Phys. Rev. A 69, 030901 (2004).
[6] S. Groth, P. Krüger, S. Wildermuth, R. Folman, T. Fernholz, J. Schmiedmayer, D. Mahalu, and I. Bar-Joseph, Applied Physics Letters 85, 2980 (2004).
[7] M. Trinker, S. Groth, S. Haslinger, S. Manz, T. Betz, S. Schneider, I. Bar-Joseph, T. Schumm, and J. Schmiedmayer, Applied Physics Letters 92, 254102 (2008).
[8] T. Schumm, S. Hofferberth, L. M. Andersson, S. Wildermuth, S. Groth, I. Bar-Joseph, J. Schmiedmayer, and P. Krüger, Nature Physics 1, 57 (2005).
[9] S. Hofferberth, I. Lesanovsky, B. Fischer, T. Schumm, and J. Schmiedmayer, Nature 449, 324 (2007).
[10] W. Ketterle, D. S. Durfee, and D. Stamper-Kurn, Proceedings of the International School of Physics "Enrico Fermi", p. 67 (1999).
[11] A. Assion, T. Baumert, M. Bergt, T. Brixner, B. Kiefer, V. Seyfried, M. Strehle, and G. Gerber, Science 282, 919 (1998).
[12] T. Weise, Global Optimization Algorithms (ebook, GNU Free Publication License Version 1.2, 2008).
[13] N. A. Barricelli, Acta Biotheoretica 16, 69 (1962).
[14] N. A. Barricelli, Acta Biotheoretica 16, 99 (1963).
[15] J. E. Baker, Proceedings of the Second International Conference on Genetic Algorithms, Mahwah, USA, p. 14 (1987).
[16] H. Pohlheim, Evolutionäre Algorithmen: Verfahren, Operatoren und Hinweise für die Praxis (Springer Verlag, 1999).
[17] F. Herrera, M. Lozano, and J. L. Verdegay, Artificial Intelligence Review 12, 265 (1998).
[18] R. L. Haupt, IEEE Antennas and Propagation Society International Symposium 2, 1034 (2000).
[19] H. Mühlenbein and D. Schlierkamp-Voosen, Analysis of Selection, Mutation and Recombination in Genetic Algorithms (Springer Verlag, London, UK, 1995).
[20] J. Esteve, C. Gross, A. Weller, S. Giovanazzi, and M. Oberthaler, Nature doi:10.1038 (2008).
