Ke-Lin Du
M.N.S. Swamy
Search and
Optimization by
Metaheuristics
Techniques
and Algorithms
Inspired by Nature
Ke-Lin Du
Xonlink Inc
Ningbo, Zhejiang, China
and
Department of Electrical and Computer Engineering
Concordia University
Montreal, QC, Canada

M.N.S. Swamy
Department of Electrical and Computer Engineering
Concordia University
Montreal, QC, Canada
ISBN 978-3-319-41191-0    ISBN 978-3-319-41192-7 (eBook)
DOI 10.1007/978-3-319-41192-7
Library of Congress Control Number: 2016943857
Mathematics Subject Classification (2010): 49-04, 68T20, 68W15
© Springer International Publishing Switzerland 2016
This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made.
Printed on acid-free paper
This book is published under the trade name Birkhäuser
The registered company is Springer International Publishing AG Switzerland
(www.birkhauser-science.com)
To My Friends Jiabin Lu and Biaobiao Zhang
Ke-Lin Du
and
To My Parents
M.N.S. Swamy
Preface
Optimization is a branch of applied mathematics and numerical analysis. Almost every problem in engineering, science, economics, and life can be formulated as an optimization or a search problem. While some of these problems are simple and can be solved by traditional optimization methods based on mathematical analysis, most are very hard to solve using analysis-based approaches. Fortunately, we can tackle these hard optimization problems by drawing inspiration from nature, since nature is a system of vast complexity that consistently generates near-optimum solutions.
Natural computing is concerned with computing inspired by nature, as well as with computations taking place in nature. Well-known examples of natural computing are evolutionary computation, neural computation, cellular automata, swarm intelligence, molecular computing, quantum computation, artificial immune systems, and membrane computing. Together, they constitute the discipline of computational intelligence.
Among all the nature-inspired computational paradigms, evolutionary computation is the most influential. It is a computational method for obtaining the best possible solutions in a huge solution space, based on Darwin's survival-of-the-fittest principle. Evolutionary algorithms are a class of effective global optimization techniques for many hard problems.
More and more biologically inspired methods have been proposed in the past two decades. The most prominent ones are particle swarm optimization, ant colony optimization, and the immune algorithm. These methods are widely used due to their particular features compared with evolutionary computation. All these biologically inspired methods are population-based: computation is performed by autonomous agents, and these agents exchange information through social behaviors. The memetic algorithm models how knowledge propagates among animals.
There are also many other nature-inspired metaheuristics for search and optimization. These include methods inspired by physical laws, chemical reactions, biological phenomena, social behaviors, and animal thinking.
Metaheuristics are a class of intelligent self-learning algorithms for finding near-optimum solutions to hard optimization problems, mimicking intelligent processes and behaviors observed in nature, sociology, thinking, and other disciplines. Metaheuristics may be nature-inspired, stochastic, or probabilistic algorithms. Metaheuristics-based search and optimization is widely used for fully automated decision-making and problem-solving.
In this book, we provide a comprehensive introduction to nature-inspired metaheuristic methods for search and optimization. While each metaheuristics-based method has its specific strengths for particular cases, according to the no free lunch theorem it has the same average performance as random search when considered over the entire set of search and optimization problems. Thus, any claim about the performance of an optimization method is actually based on benchmarking examples that are representative of some particular class of problems.
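To make the benchmarking point concrete, here is a minimal Python sketch (the book's own examples are in MATLAB; the sphere function, evaluation budget, and step size below are our illustrative choices, not the book's) comparing pure random search with a simple hill climber on one smooth, unimodal benchmark:

```python
import random

def sphere(x):
    """Sphere benchmark: f(x) = sum of x_i^2, global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def random_search(f, dim, evals, rng):
    # Sample uniformly in [-5, 5]^dim and keep the best value seen.
    best = float("inf")
    for _ in range(evals):
        x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        best = min(best, f(x))
    return best

def hill_climb(f, dim, evals, rng):
    # Start from a random point; accept a Gaussian move only if it improves.
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    fx = f(x)
    for _ in range(evals - 1):
        y = [xi + rng.gauss(0.0, 0.1) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return fx

dim, evals = 10, 2000
rs = random_search(sphere, dim, evals, random.Random(1))
hc = hill_climb(sphere, dim, evals, random.Random(1))
print(f"random search best: {rs:.3f}, hill climber best: {hc:.3f}")
```

On this benchmark the hill climber reaches a far lower value than random search under the same evaluation budget; on a deceptive or needle-in-a-haystack landscape the ranking can reverse, which is exactly why a performance claim must name the class of problems it was benchmarked on.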
This book is intended as an accessible introduction to metaheuristic optimization for a broad audience. It provides an understanding of some fundamental insights into metaheuristic optimization, and serves as a helpful starting point for those interested in more in-depth studies of metaheuristic optimization. The computational paradigms described in this book are general-purpose in nature. This book can be used as a textbook for advanced undergraduate and graduate students. All those interested in search and optimization can benefit from this book, and readers interested in a particular topic will benefit from the corresponding chapter.
A roadmap for navigating through the book is given as follows. Apart from the introductory Chapter 1, the contents of the book can be broadly divided into five categories and an appendix.
• Evolution-based approaches are covered in Chapters 3–8:
Chapter 3. Genetic Algorithms
Chapter 4. Genetic Programming
Chapter 5. Evolutionary Strategies
Chapter 6. Differential Evolution
Chapter 7. Estimation of Distribution Algorithms
Chapter 8. Topics in Evolutionary Algorithms
• Swarm intelligence-based approaches are covered in Chapters 9–15:
Chapter 9. Particle Swarm Optimization
Chapter 10. Artificial Immune Systems
Chapter 11. Ant Colony Optimization
Chapter 12. Bee Metaheuristics
Chapter 13. Bacterial Foraging Algorithm
Chapter 14. Harmony Search
Chapter 15. Swarm Intelligence
• Sciences-based approaches are covered in Chapters 2 and 16–18:
Chapter 2. Simulated Annealing
Chapter 16. Biomolecular Computing
Chapter 17. Quantum Computing
Chapter 18. Metaheuristics Based on Sciences
• Human-based approaches are covered in Chapters 19–21:
Chapter 19. Memetic Algorithms
Chapter 20. Tabu Search and Scatter Search
Chapter 21. Search Based on Human Behaviors
• General optimization problems are treated in Chapters 22–23:
Chapter 22. Dynamic, Multimodal, and Constrained Optimizations
Chapter 23. Multiobjective Optimization
• The appendix contains auxiliary benchmarks helpful for testing new and existing algorithms.
In this book, hundreds of different metaheuristic methods are introduced. However, due to space limitations, detailed descriptions are given only for a large number of the most popular metaheuristic methods. Some computational examples for representative metaheuristic methods are given. The MATLAB codes for these examples are available at the book website. We have also collected MATLAB codes for some other metaheuristics. These codes are general-purpose in nature: the reader needs only to run them with his or her own objective functions.
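As a sketch of what "plugging in your own objective function" looks like (shown in Python rather than the book's MATLAB; the `minimize` driver and the toy objective below are our own illustrations, not the book's actual code), the objective callable is the only problem-specific part:

```python
import random

def minimize(objective, dim, bounds, evals=1000, seed=0):
    """Generic random-search driver: accepts any objective callable."""
    rng = random.Random(seed)
    lo, hi = bounds
    best_x, best_f = None, float("inf")
    for _ in range(evals):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        fx = objective(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# The objective is the only part the user must supply.
def my_objective(x):
    # A toy quadratic with its minimum at (1, -2).
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

best_x, best_f = minimize(my_objective, dim=2, bounds=(-5.0, 5.0))
print(best_x, best_f)
```

The same separation — a general-purpose optimizer taking an arbitrary objective — is what makes such codes reusable across problems.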
For instructors, this book has been designed to serve as a textbook for courses on evolutionary algorithms or nature-inspired optimization. The book can be taught in 12 two-hour sessions. We recommend that Chapters 1–11, 19, 22, and 23 be taught. In order to acquire a mastery of these popular metaheuristic algorithms, some programming exercises using the benchmark functions given in the appendix should be assigned to the students. The MATLAB codes provided with the book are useful for learning the algorithms.
For readers, we suggest starting with Chapter 1, which covers basic concepts in optimization and metaheuristics. Once you have digested the basics, you can delve into one or more specific metaheuristic paradigms that interest you or that suit your specific problems. The MATLAB codes accompanying the book are very useful for learning these popular algorithms, and they can be used directly for solving your specific problems. The benchmark functions are also very useful to researchers for evaluating their own algorithms.
We would like to thank Limin Meng (Zhejiang University of Technology, China) and Yongyao Yang (SUPCON Group Inc, China) for their consistent help. We would also like to thank all the helpful and thoughtful staff at Xonlink Inc. Last but not least, we would like to acknowledge the assistance of Benjamin Levitt and the production team at Springer.
Ningbo, China Ke-Lin Du
Montreal, Canada M.N.S. Swamy
Contents
1 Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Computation Inspired by Nature . . . . . . . . . . . . . . . . . . . . . 1
1.2 Biological Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 Evolution Versus Learning. . . . . . . . . . . . . . . . . . . . . . . . . 5
1.4 Swarm Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.4.1 Group Behaviors. . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4.2 Foraging Theory. . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.5 Heuristics, Metaheuristics, and Hyper-Heuristics . . . . . . . . . . 9
1.6 Optimization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.6.1 Lagrange Multiplier Method. . . . . . . . . . . . . . . . . . 12
1.6.2 Direction-Based Search and Simplex Search. . . . . . . 13
1.6.3 Discrete Optimization Problems . . . . . . . . . . . . . . . 14
1.6.4 P, NP, NP-Hard, and NP-Complete. . . . . . . . . . . . . 16
1.6.5 Multiobjective Optimization Problem. . . . . . . . . . . . 17
1.6.6 Robust Optimization . . . . . . . . . . . . . . . . . . . . . . . 19
1.7 Performance Indicators. . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
1.8 No Free Lunch Theorem . . . . . . . . . . . . . . . . . . . . . . . . . . 22
1.9 Outline of the Book. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
2 Simulated Annealing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.2 Basic Simulated Annealing. . . . . . . . . . . . . . . . . . . . . . . . . 30
2.3 Variants of Simulated Annealing. . . . . . . . . . . . . . . . . . . . . 33
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
3 Genetic Algorithms. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
3.1 Introduction to Evolutionary Computation . . . . . . . . . . . . . . 37
3.1.1 Evolutionary Algorithms Versus Simulated
Annealing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.2 Terminologies of Evolutionary Computation. . . . . . . . . . . . . 39
3.3 Encoding/Decoding. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.4 Selection/Reproduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3.5 Crossover. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.6 Mutation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.7 Noncanonical Genetic Operators . . . . . . . . . . . . . . . . . . . . . 49
3.8 Exploitation Versus Exploration . . . . . . . . . . . . . . . . . . . . . 51
3.9 Two-Dimensional Genetic Algorithms . . . . . . . . . . . . . . . . . 55
3.10 Real-Coded Genetic Algorithms . . . . . . . . . . . . . . . . . . . . . 56
3.11 Genetic Algorithms for Sequence Optimization. . . . . . . . . . . 60
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
4 Genetic Programming. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
4.2 Syntax Trees. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
4.3 Causes of Bloat. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
4.4 Bloat Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
4.4.1 Limiting on Program Size . . . . . . . . . . . . . . . . . . . 77
4.4.2 Penalizing the Fitness of an Individual
with Large Size. . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.4.3 Designing Genetic Operators . . . . . . . . . . . . . . . . . 77
4.5 Gene Expression Programming. . . . . . . . . . . . . . . . . . . . . . 78
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
5 Evolutionary Strategies. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
5.2 Basic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
5.3 Evolutionary Gradient Search and Gradient Evolution . . . . . . 85
5.4 CMA Evolutionary Strategies. . . . . . . . . . . . . . . . . . . . . . . 88
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
6 Differential Evolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
6.2 DE Algorithm. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
6.3 Variants of DE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
6.4 Binary DE Algorithms. . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
6.5 Theoretical Analysis on DE . . . . . . . . . . . . . . . . . . . . . . . . 100
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
7 Estimation of Distribution Algorithms. . . . . . . . . . . . . . . . . . . . . 105
7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
7.2 EDA Flowchart. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
7.3 Population-Based Incremental Learning . . . . . . . . . . . . . . . . 108
7.4 Compact Genetic Algorithms . . . . . . . . . . . . . . . . . . . . . . . 110
7.5 Bayesian Optimization Algorithm . . . . . . . . . . . . . . . . . . . . 112
7.6 Convergence Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
7.7 Other EDAs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
7.7.1 Probabilistic Model Building GP. . . . . . . . . . . . . . . 115
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116