
| Tag | Ind | Content |
|---|---|---|
| 000 | | 00968pamuu2200277 a 4500 |
| 001 | | 000000699420 |
| 005 | | 20010328171022 |
| 008 | | 980729s1999 enka b 001 0 eng |
| 010 | | ▼a 98038717 |
| 015 | | ▼a GB99-07072 |
| 020 | | ▼a 185233004X (pbk. : alk. paper) |
| 040 | | ▼a DLC ▼c DLC ▼d UKM ▼d OHX ▼d 211009 |
| 049 | 1 | ▼l 121046820 ▼f 과학 |
| 050 | 0 0 | ▼a QA76.87 ▼b .C663 1999 |
| 072 | 7 | ▼a QA ▼2 lcco |
| 082 | 0 0 | ▼a 006.3/2 ▼2 21 |
| 090 | | ▼a 006.32 ▼b C731 |
| 245 | 0 0 | ▼a Combining artificial neural nets : ▼b ensemble and modular multi-net systems / ▼c Amanda J.C. Sharkey, ed. |
| 260 | | ▼a London ; ▼a New York : ▼b Springer, ▼c c1999. |
| 300 | | ▼a xv, 298 p. : ▼b ill. ; ▼c 24 cm. |
| 440 | 0 | ▼a Perspectives in neural computing |
| 504 | | ▼a Includes bibliographical references and index. |
| 650 | 0 | ▼a Neural networks (Computer science) |
| 650 | 4 | ▼a Neural networks (Computer science) |
| 700 | 1 | ▼a Sharkey, Amanda J. C., ▼d 1957- |
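Each row above is a MARC field: a three-digit tag, optional indicator digits, and subfields whose one-letter codes follow the ▼ mark (this catalog's on-screen form of the MARC subfield delimiter). As a small illustrative sketch only, assuming ▼ always introduces a one-letter code followed by a space, the displayed content can be split into (code, value) pairs; `parse_subfields` is a hypothetical helper written for this page, not part of any MARC library:

```python
def parse_subfields(content: str) -> list[tuple[str, str]]:
    """Split a catalog display string like '▼a DLC ▼c DLC ▼d UKM'
    into (subfield code, value) pairs.

    Assumes '▼' is the subfield delimiter as rendered by this catalog;
    real MARC data uses the 0x1F separator instead.
    """
    pairs = []
    for chunk in content.split("▼")[1:]:  # text before the first ▼ is ignored
        code, _, value = chunk.partition(" ")  # first token is the subfield code
        pairs.append((code, value.strip()))
    return pairs

print(parse_subfields("▼a DLC ▼c DLC ▼d UKM ▼d OHX ▼d 211009"))
# [('a', 'DLC'), ('c', 'DLC'), ('d', 'UKM'), ('d', 'OHX'), ('d', '211009')]
```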
Holdings Information
| No. | Location | Call Number | Accession No. | Status | Due Date | Reservation | Service |
|---|---|---|---|---|---|---|---|
| 1 | Science Library / Sci-Info (2nd floor stacks) | 006.32 C731 | 121046820 | Available for loan | | | |
Contents Information

Table of Contents
1. Multi-Net Systems
- 1.0.1 Different Forms of Multi-Net System
- 1.1 Ensembles
- 1.1.1 Why Create Ensembles?
- 1.1.2 Methods for Creating Ensemble Members
- 1.1.3 Methods for Combining Nets in Ensembles
- 1.1.4 Choosing a Method for Ensemble Creation and Combination
- 1.2 Modular Approaches
- 1.2.1 Why Create Modular Systems?
- 1.2.2 Methods for Creating Modular Components
- 1.2.3 Methods for Combining Modular Components
- 1.3 The Chapters in this Book
- 1.4 References

2. Combining Predictors
- 2.1 Combine and Conquer
- 2.2 Regression
- 2.2.1 Bias and Variance
- 2.2.2 Bagging - The Pseudo-Fairy Godmother
- 2.2.3 Results of Bagging
- 2.3 Classification
- 2.3.1 Bias and Spread
- 2.3.2 Examples
- 2.3.3 Bagging Classifiers
- 2.4 Remarks
- 2.4.1 Pruning
- 2.4.2 Randomising the Construction
- 2.4.3 Randomising the Outputs
- 2.5 Adaboost and Arcing
- 2.5.1 The Adaboost Algorithm
- 2.5.2 What Makes Adaboost Work?
- 2.6 Recent Research
- 2.6.1 Margins
- 2.6.2 Using Simple Classifiers
- 2.6.3 Instability is Needed
- 2.7 Coda
- 2.7.1 Heisenberg's Principle for Statistical Prediction
- 2.8 References

3. Boosting Using Neural Networks
- 3.1 Introduction
- 3.2 Bagging
- 3.2.1 Classification
- 3.2.2 Regression
- 3.2.3 Remarks
- 3.3 Boosting
- 3.3.1 Introduction
- 3.3.2 A First Implementation: Boost1
- 3.3.3 AdaBoost.M1
- 3.3.4 AdaBoost.M2
- 3.3.5 AdaBoost.R2
- 3.4 Other Ensemble Techniques
- 3.5 Neural Networks
- 3.5.1 Classification
- 3.5.2 Early Stopping
- 3.5.3 Regression
- 3.6 Trees
- 3.6.1 Training Classification Trees
- 3.6.2 Pruning Classification Trees
- 3.6.3 Training Regression Trees
- 3.6.4 Pruning Regression Trees
- 3.7 Trees vs. Neural Nets
- 3.8 Experiments
- 3.8.1 Experiments Using Boost1
- 3.8.2 Experiments Using AdaBoost
- 3.8.3 Experiments Using AdaBoost.R2
- 3.9 Conclusions
- 3.10 References

4. A Genetic Algorithm Approach for Creating Neural Network Ensembles
- 4.1 Introduction
- 4.2 Neural Network Ensembles
- 4.3 The ADDEMUP Algorithm
- 4.3.1 ADDEMUP's Top-Level Design
- 4.3.2 Creating and Crossing-Over KNNs
- 4.4 Experimental Study
- 4.4.1 Generalisation Ability of ADDEMUP
- 4.4.2 Lesion Study of ADDEMUP
- 4.5 Discussion and Future Work
- 4.6 Additional Related Work
- 4.7 Conclusions
- 4.8 References

5. Treating Harmful Collinearity in Neural Network Ensembles
- 5.1 Introduction
- 5.2 Overview of Optimal Linear Combinations (OLC) of Neural Networks
- 5.3 Effects of Collinearity on Combining Neural Networks
- 5.3.1 Collinearity in the Literature on Combining Estimators
- 5.3.2 Testing the Robustness of NN Ensembles
- 5.3.3 Collinearity, Correlation, and Ensemble Ambiguity
- 5.3.4 The Harmful Effects of Collinearity
- 5.4 Improving the Generalisation of NN Ensembles by Treating Harmful Collinearity
- 5.4.1 Two Algorithms for Selecting the Component NNs in the Ensemble
- 5.4.2 Modification to the Algorithms
- 5.5 Experimental Results
- 5.5.1 Problem I
- 5.5.2 Problem II
- 5.5.3 Discussion of the Experimental Results
- 5.6 Concluding Remarks
- 5.7 References

6. Linear and Order Statistics Combiners for Pattern Classification
- 6.1 Introduction
- 6.2 Class Boundary Analysis and Error Regions
- 6.3 Linear Combining
- 6.3.1 Linear Combining of Unbiased Classifiers
- 6.3.2 Linear Combining of Biased Classifiers
- 6.4 Order Statistics
- 6.4.1 Introduction
- 6.4.2 Background
- 6.4.3 Combining Unbiased Classifiers Through OS
- 6.4.4 Combining Biased Classifiers Through OS
- 6.5 Correlated Classifier Combining
- 6.5.1 Introduction
- 6.5.2 Combining Unbiased Correlated Classifiers
- 6.5.3 Combining Biased Correlated Classifiers
- 6.5.4 Discussion
- 6.6 Experimental Combining Results
- 6.6.1 Oceanic Data Set
- 6.6.2 Proben1 Benchmarks
- 6.7 Discussion
- 6.8 References

7. Variance Reduction via Noise and Bias Constraints
- 7.1 Introduction
- 7.2 Theoretical Considerations
- 7.3 The Bootstrap Ensemble with Noise Algorithm
- …
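Several of the chapter titles above name standard ensemble algorithms, most prominently bagging and boosting (Chapters 2 and 3). As a quick orientation, below is a minimal sketch of generic bootstrap aggregation (bagging) for regression. It illustrates only the textbook technique the titles refer to, not any code or experiment from the book; the cubic-polynomial base learner and the toy sine data are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_degree3(X, y):
    """Placeholder 'unstable' base learner: a cubic polynomial fit."""
    coeffs = np.polyfit(X, y, deg=3)
    return lambda X_new: np.polyval(coeffs, X_new)

def bagged_predict(X_train, y_train, X_test, fit=fit_degree3, n_members=25):
    """Bagging: train each member on a bootstrap resample of the
    training data, then aggregate by averaging the members' outputs."""
    preds = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X_train), size=len(X_train))  # sample with replacement
        member = fit(X_train[idx], y_train[idx])
        preds.append(member(X_test))
    return np.mean(preds, axis=0)

# Toy usage: noisy sine data, predictions at three test points
X = np.linspace(0.0, 3.0, 60)
y = np.sin(2 * X) + rng.normal(scale=0.3, size=X.shape)
print(bagged_predict(X, y, np.array([0.5, 1.5, 2.5])))
```

Averaging over bootstrap members mainly reduces the variance component of the error (the bias/variance decomposition of sections 2.2.1 and 2.3.1), which is why bagging pays off most with intentionally unstable base learners like the one sketched here.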
