Foreign Science and Technology Book Overview

Title: Nature-Inspired Optimization Algorithms

Author: Xin-She Yang, School of Science and Technology, Middlesex University London, London.

ISBN: 9780124167438, 0124167438

Publication year: 2014

Publisher: Elsevier



Preface

Nature-inspired optimization algorithms have become increasingly popular in recent years, and most of these metaheuristic algorithms, such as particle swarm optimization and firefly algorithms, are based on swarm intelligence. Swarm-intelligence-based algorithms such as cuckoo search and the firefly algorithm have been found to be very efficient.
The literature has expanded significantly in the last 10 years, intensifying the need to review and summarize these optimization algorithms. Therefore, this book strives to introduce the latest developments regarding all major nature-inspired algorithms, including ant and bee algorithms, bat algorithms, cuckoo search, firefly algorithms, flower algorithms, genetic algorithms, differential evolution, harmony search, simulated annealing, particle swarm optimization, and others. We also discuss hybrid methods, multiobjective optimization, and the ways of dealing with constraints.
Organization of the book's contents follows a logical order so that we can introduce these algorithms for optimization in a natural way. As a result, we do not follow the order of historical developments. We group algorithms and analyze them in terms of common criteria and similarities to help readers gain better insight into these algorithms.
This book's emphasis is on the introduction of basic algorithms, analysis of key components of these algorithms, and some key steps in implementation. However, we do not focus too much on the exact implementation using programming languages, though we do provide some demo codes in the Appendices.
The diversity and popularity of nature-inspired algorithms do not mean that no problems need urgent attention. In fact, many important questions remain open. For example, there are significant gaps between theory and practice. On the one hand, nature-inspired optimization algorithms are very successful and can obtain optimal solutions in a reasonably practical time. On the other hand, mathematical analysis of key aspects of these algorithms, such as convergence and the balance between solution accuracy and computational effort, is lacking, as is the tuning and control of parameters.
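To make the idea of a swarm-intelligence-based metaheuristic concrete, the sketch below implements a minimal particle swarm optimization loop in Python. This is an illustrative sketch only, not the book's reference code (the book's demo programs are in Matlab, in Appendix B); the parameter values `w`, `c1`, `c2` and the test function are illustrative choices, not prescriptions from the text.

```python
import numpy as np

def pso(f, dim, n_particles=30, n_iter=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimization: minimize f over the box [lo, hi]^dim."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    # Random initial positions, zero initial velocities
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()                                # each particle's best-so-far position
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()          # global best position
    g_val = pbest_val.min()
    w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients (illustrative)
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                  # move and stay inside the box
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:                 # update global best if any particle improved on it
            g_val = pbest_val.min()
            g = pbest[np.argmin(pbest_val)].copy()
    return g, g_val

# Usage: minimize the 5-D sphere function, whose global minimum is 0 at the origin
best_x, best_val = pso(lambda z: float(np.sum(z**2)), dim=5)
```

Each particle is pulled toward its own best-known position and the swarm's best-known position, which is the cooperative mechanism that the preface refers to as swarm intelligence.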


Table of Contents

Preface xi

1 Introduction to Algorithms 1

1.1 What is an Algorithm? 1

1.2 Newton's Method 3

1.3 Optimization 5

      1.3.1 Gradient-Based Algorithms 6

      1.3.2 Hill Climbing with Random Restart 10

1.4 Search for Optimality 11

1.5 No-Free-Lunch Theorems 12

      1.5.1 NFL Theorems 12

      1.5.2 Choice of Algorithms 14

1.6 Nature-Inspired Metaheuristics 15

1.7 A Brief History of Metaheuristics 16

References 20

2 Analysis of Algorithms 23

2.1 Introduction 23

2.2 Analysis of Optimization Algorithms 24

      2.2.1 Algorithm as an Iterative Process 24

      2.2.2 An Ideal Algorithm? 25

      2.2.3 A Self-Organization System 26

      2.2.4 Exploration and Exploitation 26

      2.2.5 Evolutionary Operators 27

2.3 Nature-Inspired Algorithms 29

      2.3.1 Simulated Annealing 30

      2.3.2 Genetic Algorithms 30

      2.3.3 Differential Evolution 31

      2.3.4 Ant and Bee Algorithms 31

      2.3.5 Particle Swarm Optimization 32

      2.3.6 The Firefly Algorithm 33

      2.3.7 Cuckoo Search 33

      2.3.8 The Bat Algorithm 34

      2.3.9 Harmony Search 35

      2.3.10 The Flower Algorithm 36

      2.3.11 Other Algorithms 37

2.4 Parameter Tuning and Parameter Control 38

      2.4.1 Parameter Tuning 38

      2.4.2 Hyperoptimization 38

      2.4.3 Multiobjective View 39

      2.4.4 Parameter Control 40

2.5 Discussions 40

2.6 Summary 41

References 42

3 Random Walks and Optimization 45

3.1 Random Variables 45

3.2 Isotropic Random Walks 46

3.3 Lévy Distribution and Lévy Flights 49

3.4 Optimization as Markov Chains 51

      3.4.1 Markov Chain 51

      3.4.2 Optimization as a Markov Chain 53

3.5 Step Sizes and Search Efficiency 54

      3.5.1 Step Sizes, Stopping Criteria, and Efficiency 54

      3.5.2 Why Lévy Flights are More Efficient 56

3.6 Modality and Intermittent Search Strategy 56

3.7 Importance of Randomization 59

      3.7.1 Ways to Carry Out Random Walks 59

      3.7.2 Importance of Initialization 61

      3.7.3 Importance Sampling 61

      3.7.4 Low-Discrepancy Sequences 62

3.8 Eagle Strategy 63

      3.8.1 Basic Ideas of Eagle Strategy 63

      3.8.2 Why Eagle Strategy is So Efficient 64

References 65

4 Simulated Annealing 67

4.1 Annealing and Boltzmann Distribution 67

4.2 Parameters 68

4.3 SA Algorithm 69

4.4 Unconstrained Optimization 70

4.5 Basic Convergence Properties 71

4.6 SA Behavior in Practice 73

4.7 Stochastic Tunneling 74

References 75

5 Genetic Algorithms 77

5.1 Introduction 77

5.2 Genetic Algorithms 78

5.3 Role of Genetic Operators 79

5.4 Choice of Parameters 80

5.5 GA Variants 82

5.6 Schema Theorem 83

5.7 Convergence Analysis 84

References 86

6 Differential Evolution 89

6.1 Introduction 89

6.2 Differential Evolution 90

6.3 Variants 91

6.4 Choice of Parameters 93

6.5 Convergence Analysis 93

6.6 Implementation 95

References 96

7 Particle Swarm Optimization 99

7.1 Swarm Intelligence 99

7.2 PSO Algorithm 99

7.3 Accelerated PSO 101

7.4 Implementation 102

7.5 Convergence Analysis 104

      7.5.1 Dynamical System 105

      7.5.2 Markov Chain Approach 107

7.6 Binary PSO 108

References 109

8 Firefly Algorithms 111

8.1 The Firefly Algorithm 111

      8.1.1 Firefly Behavior 111

      8.1.2 Standard Firefly Algorithm 112

      8.1.3 Variations of Light Intensity and Attractiveness 113

      8.1.4 Controlling Randomization 114

8.2 Algorithm Analysis 115

      8.2.1 Scalings and Limiting Cases 115

      8.2.2 Attraction and Diffusion 116

      8.2.3 Special Cases of FA 117

8.3 Implementation 118

8.4 Variants of the Firefly Algorithm 118

      8.4.1 FA Variants 118

      8.4.2 How Can We Discretize FA? 120

8.5 Firefly Algorithms in Applications 122

8.6 Why the Firefly Algorithm is Efficient 123

References 124

9 Cuckoo Search 129

9.1 Cuckoo Breeding Behavior 129

9.2 Lévy Flights 129

9.3 Cuckoo Search 130

      9.3.1 Special Cases of Cuckoo Search 132

      9.3.2 How to Carry Out Lévy Flights 132

      9.3.3 Choice of Parameters 133

      9.3.4 Variants of Cuckoo Search 133

9.4 Why Cuckoo Search is so Efficient 135

9.5 Global Convergence: Brief Mathematical Analysis 135

9.6 Applications 137

References 138

10 Bat Algorithms 141

10.1 Echolocation of Bats 141

      10.1.1 Behavior of Microbats 141

      10.1.2 Acoustics of Echolocation 142

10.2 Bat Algorithms 142

      10.2.1 Movement of Virtual Bats 143

      10.2.2 Loudness and Pulse Emission 144

10.3 Implementation 145

10.4 Binary Bat Algorithms 147

10.5 Variants of the Bat Algorithm 148

10.6 Convergence Analysis 149

10.7 Why the Bat Algorithm is Efficient 150

10.8 Applications 150

      10.8.1 Continuous Optimization 151

      10.8.2 Combinatorial Optimization and Scheduling 151

      10.8.3 Inverse Problems and Parameter Estimation 151

      10.8.4 Classifications, Clustering, and Data Mining 151

      10.8.5 Image Processing 152

      10.8.6 Fuzzy Logic and Other Applications 152

References 153

11 Flower Pollination Algorithms 155

11.1 Introduction 155

11.2 Flower Pollination Algorithms 156

      11.2.1 Characteristics of Flower Pollination 156

      11.2.2 Flower Pollination Algorithms 157

11.3 Multi-Objective Flower Pollination Algorithms 159

11.4 Validation and Numerical Experiments 160

      11.4.1 Single-Objective Test Functions 160

      11.4.2 Multi-Objective Test Functions 162

      11.4.3 Analysis of Results and Comparison 164

11.5 Applications 165

      11.5.1 Single-Objective Design Benchmarks 165

      11.5.2 Multi-Objective Design Benchmarks 170

11.6 Further Research Topics 171

References 172

12 A Framework for Self-Tuning Algorithms 175

12.1 Introduction 175

12.2 Algorithm Analysis and Parameter Tuning 175

      12.2.1 A General Formula for Algorithms 176

      12.2.2 Type of Optimality 176

      12.2.3 Parameter Tuning 177

12.3 Framework for Self-Tuning Algorithms 178

      12.3.1 Hyperoptimization 178

      12.3.2 A Multi-Objective View 178

      12.3.3 Self-Tuning Framework 179

12.4 A Self-Tuning Firefly Algorithm 180

12.5 Some Remarks 181

References 182

13 How to Deal with Constraints 183

13.1 Introduction and Overview 183

13.2 Method of Lagrange Multipliers 184

13.3 KKT Conditions 186

13.4 Penalty Method 187

13.5 Equality with Tolerance 188

13.6 Feasibility Rules and Stochastic Ranking 189

13.7 Multi-objective Approach to Constraints 190

13.8 Spring Design 191

13.9 Cuckoo Search Implementation 191

References 196

14 Multi-Objective Optimization 197

14.1 Multi-Objective Optimization 197

14.2 Pareto Optimality 198

14.3 Weighted Sum Method 201

14.4 Utility Method 204

14.5 The ε-Constraint Method 206

14.6 Metaheuristic Approaches 209

14.7 NSGA-II 210

References 210

15 Other Algorithms and Hybrid Algorithms 213

15.1 Ant Algorithms 213

      15.1.1 Ant Behavior 213

      15.1.2 Ant Colony Optimization 214

      15.1.3 Virtual Ant Algorithms 215

15.2 Bee-Inspired Algorithms 216

      15.2.1 Honeybee Behavior 216

      15.2.2 Bee Algorithms 216

      15.2.3 Honeybee Algorithm 217

      15.2.4 Virtual Bee Algorithm 218

      15.2.5 Artificial Bee Colony Optimization 218

15.3 Harmony Search 219

      15.3.1 Harmonics and Frequencies 219

      15.3.2 Harmony Search 220

15.4 Hybrid Algorithms 222

      15.4.1 Other Algorithms 222

      15.4.2 Ways to Hybridize 223

15.5 Final Remarks 224

References 224

APPENDIX A Test Function Benchmarks for Global Optimization 227

References 245

APPENDIX B Matlab Programs 247

B.1 Simulated Annealing 247

B.2 Particle Swarm Optimization 248

B.3 Differential Evolution 250

B.4 Firefly Algorithm 252

B.5 Cuckoo Search 256

B.6 Bat Algorithm 259

B.7 Flower Pollination Algorithm 261


Holding library

National Science Library, Chinese Academy of Sciences