This book describes the optimization methods most commonly encountered in signal and image processing: artificial evolution and Parisian approach; wavelets and fractals; information criteria; training and quadratic programming; Bayesian formalism; probabilistic modeling; Markovian approach; hidden Markov models; and metaheuristics (genetic algorithms, ant colony algorithms, cross-entropy, particle swarm optimization, estimation of distribution algorithms, and artificial immune systems).
Author(s): Patrick Siarry
Edition: 1
Publisher: Wiley-ISTE
Year: 2009
Language: English
Pages: 385
Optimization in Signal and Image Processing......Page 5
Table of Contents......Page 7
Introduction......Page 15
1.1. Modeling at the source of image analysis and synthesis......Page 25
1.2. From image synthesis to analysis......Page 26
1.3. Scene geometric modeling and image synthesis......Page 27
1.4.1. The deterministic Hough transform......Page 28
1.4.2. Stochastic exploration of parameters: evolutionary Hough......Page 29
1.4.3. Examples of generalization......Page 31
1.5.1. Photometric modeling......Page 33
1.5.2. Motion modeling......Page 34
1.6. Conclusion......Page 36
1.8. Bibliography......Page 37
2.2. The Parisian approach for evolutionary algorithms......Page 39
2.3. Applying the Parisian approach to inverse IFS problems......Page 41
2.3.2. Retribution of individuals......Page 42
2.4. Results obtained on the inverse problems of IFS......Page 44
2.5. Conclusion on the usage of the Parisian approach for inverse IFS problems......Page 46
2.6.1. The principles......Page 47
2.6.2. Results on real images......Page 51
2.6.3. Application to robotics: fly-based robot planning......Page 54
2.6.4. Sensor fusion......Page 58
2.6.5. Artificial evolution and real time......Page 61
2.6.6. Conclusion about the fly algorithm......Page 63
2.7. Conclusion......Page 64
2.9. Bibliography......Page 65
3.1. Introduction......Page 69
3.2.1. Fractals and paradox......Page 70
3.2.2. Fractal sets and self-similarity......Page 71
3.2.3. Fractal dimension......Page 73
3.3.1. Regularity......Page 78
3.3.2. Multifractal spectrum......Page 82
3.4.2. A rough guide to the world of wavelets......Page 84
3.4.3. Wavelet Transform Modulus Maxima (WTMM) method......Page 87
3.4.4. Spectrum of singularities and wavelets......Page 90
3.4.5. WTMM and some didactic signals......Page 92
3.5.1. Fractal analysis of structures in images: applications in microbiology......Page 94
3.5.2. Using WTMM for the classification of textures – application in the field of medical imagery......Page 96
3.7. Bibliography......Page 100
4.1. Introduction and context......Page 103
4.2. Overview of the different criteria......Page 105
4.3. The case of auto-regressive (AR) models......Page 107
4.3.1. Origin, written form and performance of different criteria on simulated examples......Page 108
4.3.2. AR and the segmentation of images: a first approach......Page 111
4.3.3. Extension to 2D AR and application to the modeling of textures......Page 113
4.3.4. AR and the segmentation of images: second approach using 2D AR......Page 116
4.4. Applying the process to unsupervised clustering......Page 119
4.5.1. Theoretical aspects......Page 122
4.5.2. Two applications used for encoding images......Page 123
4.6.1. Estimation of the order of Markov models......Page 127
4.6.2. Data fusion......Page 128
4.8.1. Kullback (-Leibler) information......Page 130
4.9. Bibliography......Page 131
5.1. Introduction......Page 135
5.2.1. General framework......Page 136
5.2.2. Functional framework......Page 138
5.2.3. Cost and regularization......Page 139
5.2.4. The aims of realistic learning processes......Page 140
5.3.1. Primal and dual forms......Page 141
5.4. Methods and resolution......Page 143
5.4.2. Tools to be used......Page 144
5.4.4. Decomposition methods......Page 145
5.4.5. Solving quadratic problems......Page 147
5.4.6. Online and non-optimized methods......Page 150
5.4.7. Comparisons......Page 151
5.5.1. Comparison of empirical complexity......Page 152
5.5.2. Very large databases......Page 154
5.6. Conclusion......Page 156
5.7. Bibliography......Page 157
6.1. Continuum, a path toward oblivion......Page 161
6.2. The cross-entropy (CE) method......Page 162
6.2.1. Probability of rare events......Page 163
6.2.2. CE applied to optimization......Page 167
6.3. Examples of implementation of CE for surveillance......Page 170
6.3.1. Introducing the problem......Page 171
6.3.2. Optimizing the distribution of resources......Page 173
6.3.3. Allocating sensors to zones......Page 174
6.3.4. Implementation......Page 175
6.4.1. Definition of the problem......Page 177
6.4.2. Applying the CE......Page 180
6.4.3. Analyzing a simple example......Page 181
6.5. Optimal control under partial observation......Page 182
6.5.1. Decision-making in partially observed environments......Page 183
6.5.2. Implementing CE......Page 186
6.5.3. Example......Page 187
6.7. Bibliography......Page 190
7.1. Introduction......Page 193
7.2.1. Estimability measurement of the problem......Page 194
7.2.2. Framework for computing exterior products......Page 197
7.3. Application to the optimization of emissions (deterministic case)......Page 199
7.3.1. The case of a maneuvering target......Page 204
7.4. The case of a target with a Markov trajectory......Page 205
7.6. Appendix: monotone functional matrices......Page 213
7.7. Bibliography......Page 216
8.1. Introduction and application framework......Page 219
8.2. Detection, segmentation and classification......Page 220
8.3.1. Markov modeling......Page 223
8.3.2. Bayesian inference......Page 224
8.4. Segmentation using the causal-in-scale Markov model......Page 225
8.5. Segmentation into three classes......Page 227
8.6. The classification of objects......Page 230
8.7. The classification of seabeds......Page 236
8.8. Conclusion and perspectives......Page 238
8.9. Bibliography......Page 239
9.1. Introduction......Page 243
9.2.1. Definition......Page 244
9.2.2. The criteria used in programming hidden Markov models......Page 245
9.3.1. The different types of solution spaces used for the training of HMMs......Page 247
9.3.2. The metaheuristics used for the training of the HMMs......Page 249
9.4.1. Genetic algorithms......Page 250
9.4.2. The API algorithm......Page 252
9.4.3. Particle swarm optimization......Page 254
9.4.4. A behavioral comparison of the metaheuristics......Page 257
9.4.5. Parameter setting of the algorithms......Page 258
9.4.6. Comparing the algorithms’ performances......Page 261
9.6. Bibliography......Page 264
10.1. Introduction......Page 269
10.2. Relationship to existing works......Page 270
10.4.1. A priori energy......Page 272
10.4.2. Image energy......Page 273
10.5.1. Evolution strategies......Page 276
10.5.2. Clonal selection (CS)......Page 279
10.6.1. Preliminaries......Page 283
10.6.2. Evaluation on the CD3 sequence......Page 285
10.7. Conclusion......Page 289
10.8. Bibliography......Page 290
11.1. Introduction......Page 293
11.2.1. Difficult optimization......Page 294
11.2.2. Optimization algorithms......Page 296
11.3.1. Existing methods......Page 299
11.3.2. A possible optimization method for image registration......Page 301
11.4. Optimizing the image registration process......Page 303
11.4.1. The objective function......Page 304
11.4.2. The Nelder-Mead algorithm......Page 305
11.4.3. The hybrid continuous interacting ant colony (HCIAC)......Page 307
11.4.4. The continuous hybrid estimation of distribution algorithm......Page 309
11.5.1. Preliminary tests......Page 312
11.5.3. Typical cases......Page 315
11.5.4. Additional problems......Page 317
11.6. Analysis of the results......Page 319
11.9. Bibliography......Page 320
12.1. Introduction......Page 325
12.2. Brainstem Auditory Evoked Potentials (BAEPs)......Page 326
12.3. Processing BAEPs......Page 327
12.4. Genetic algorithms......Page 329
12.5. BAEP dynamics......Page 331
12.5.1. Validation of the simulated signal approach......Page 337
12.5.2. Validating the approach on real signals......Page 344
12.5.3. Acceleration of the GA’s convergence time......Page 345
12.6. The non-stationarity of the shape of the BAEPs......Page 348
12.8. Bibliography......Page 351
13.1. Introduction......Page 353
13.1.1. Finding good parameters for the processor......Page 354
13.1.2. Interacting with the patient......Page 355
13.2. Choosing an optimization algorithm......Page 357
13.3. Adapting an evolutionary algorithm to the interactive fitting of cochlear implants......Page 358
13.3.1. Population size and the number of children per generation......Page 359
13.3.4. Crossover......Page 360
13.4. Evaluation......Page 361
13.5.1. The first experiment with patient A......Page 363
13.5.2. Analyzing the results......Page 367
13.5.3. Second set of experiments: verifying the hypotheses......Page 369
13.5.4. Third set of experiments with other patients......Page 373
13.6. Medical issues which were raised during the experiments......Page 374
13.7. Algorithmic conclusions for patient A......Page 376
13.9. Bibliography......Page 378
List of Authors......Page 381
Index......Page 383