Feature Extraction: Foundations and Applications (Pattern Recognition)


Author(s): Isabelle Guyon
Publisher: Springer
Year: 2006

Language: English
Pages: 781

Contents......Page 7
An Introduction to Feature Extraction......Page 15
Part I Feature Extraction Fundamentals......Page 41
1 Learning Machines......Page 43
2 Assessment Methods......Page 79
3 Filter Methods......Page 103
4 Search Strategies......Page 133
5 Embedded Methods......Page 153
6 Information-Theoretic Methods......Page 183
7 Ensemble Learning......Page 203
8 Fuzzy Neural Networks......Page 223
Part II Feature Selection Challenge......Page 253
9 Design and Analysis of the NIPS2003 Challenge......Page 255
10 High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees......Page 283
11 Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems......Page 315
12 Combining SVMs with Various Feature Selection Strategies......Page 333
13 Feature Selection with Transductive Support Vector Machines......Page 343
14 Variable Selection using Correlation and Single Variable Classifier Methods: Applications......Page 361
15 Tree-Based Ensembles with Dynamic Soft Feature Selection......Page 377
16 Sparse, Flexible and Efficient Modeling using L1 Regularization......Page 393
17 Margin Based Feature Selection and Infogain with Standard Classifiers......Page 413
18 Bayesian Support Vector Machines for Feature Ranking and Selection......Page 421
19 Nonlinear Feature Selection with the Potential Support Vector Machine......Page 437
20 Combining a Filter Method with SVMs......Page 455
21 Feature Selection via Sensitivity Analysis with Direct Kernel PLS......Page 463
22 Information Gain, Correlation and Support Vector Machines......Page 479
23 Mining for Complex Models Comprising Feature Selection and Classification......Page 487
24 Combining Information-Based Supervised and Unsupervised Feature Selection......Page 505
25 An Enhanced Selective Naïve Bayes Method with Optimal Discretization......Page 515
26 An Input Variable Importance Definition based on Empirical Data Probability Distribution......Page 523
Part III New Perspectives in Feature Extraction......Page 531
27 Spectral Dimensionality Reduction......Page 533
28 Constructing Orthogonal Latent Features for Arbitrary Loss......Page 561
29 Large Margin Principles for Feature Selection......Page 591
30 Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study......Page 611
31 Sequence Motifs: Highly Predictive Features of Protein Function......Page 631
Appendix A Elementary Statistics......Page 653
Elementary Statistics......Page 655
References......Page 669
Appendix B Feature Selection Challenge Datasets......Page 671
Experimental design......Page 673
Arcene......Page 677
Gisette......Page 685
Dexter......Page 689
Dorothea......Page 693
Madelon......Page 697
Matlab code of the lambda method......Page 703
Matlab code used to generate Madelon......Page 705
Appendix C Feature Selection Challenge Fact Sheets......Page 711
10 High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees......Page 713
11 Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems......Page 715
12 Combining SVMs with Various Feature Selection Strategies......Page 717
13 Feature Selection with Transductive Support Vector Machines......Page 719
14 Variable Selection using Correlation and SVC Methods: Applications......Page 721
15 Tree-Based Ensembles with Dynamic Soft Feature Selection......Page 723
16 Sparse, Flexible and Efficient Modeling using L1 Regularization......Page 725
17 Margin Based Feature Selection and Infogain with Standard Classifiers......Page 727
18 Bayesian Support Vector Machines for Feature Ranking and Selection......Page 729
19 Nonlinear Feature Selection with the Potential Support Vector Machine......Page 731
20 Combining a Filter Method with SVMs......Page 733
21 Feature Selection via Sensitivity Analysis with Direct Kernel PLS......Page 735
22 Information Gain, Correlation and Support Vector Machines......Page 737
23 Mining for Complex Models Comprising Feature Selection and Classification......Page 739
24 Combining Information-Based Supervised and Unsupervised Feature Selection......Page 741
25 An Enhanced Selective Naïve Bayes Method with Optimal Discretization......Page 743
26 An Input Variable Importance Definition based on Empirical Data Probability Distribution......Page 745
Appendix D Feature Selection Challenge Results Tables......Page 747
Result Tables of the NIPS2003 Challenge......Page 749
Arcene......Page 751
Dexter......Page 755
Dorothea......Page 759
Gisette......Page 763
Madelon......Page 767
Overall results......Page 771
Index......Page 775