Abstract
All learning algorithms attempt to improve the accuracy of a classification system. However, the effectiveness of such a system depends on the heuristic the learning paradigm uses to measure performance. This paper demonstrates that using Precision (P) to evaluate performance on unbalanced data sets can steer the learner towards sub-optimal solutions. We move on to present a novel performance heuristic, the 'Optimized Precision (OP)', to negate these detrimental effects. We also analyze the impact of these observations on the training performance of ensemble learners and Multi-Classifier Systems (MCS), and provide guidelines for the proper training of multi-classifier systems.
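To illustrate the pitfall the abstract describes, the sketch below is a minimal, self-contained example (not taken from the paper's text): on a heavily unbalanced set, a degenerate classifier that always predicts the majority class scores high accuracy yet detects no positives. It also computes Optimized Precision in the form commonly attributed to this paper, OP = Acc − |Sp − Se| / (Sp + Se), where Se is sensitivity and Sp is specificity; treat that formula as an assumption here and consult the paper itself for the authoritative definition.

```python
def confusion(y_true, y_pred):
    """Binary confusion counts: (tp, tn, fp, fn) for labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def accuracy_and_op(y_true, y_pred):
    """Return (accuracy, OP) with OP = Acc - |Sp - Se| / (Sp + Se).

    The OP formula is the commonly cited form, assumed here for illustration.
    """
    tp, tn, fp, fn = confusion(y_true, y_pred)
    acc = (tp + tn) / len(y_true)
    se = tp / (tp + fn) if (tp + fn) else 0.0  # sensitivity: recall on positives
    sp = tn / (tn + fp) if (tn + fp) else 0.0  # specificity: recall on negatives
    op = acc - abs(sp - se) / (sp + se) if (sp + se) else 0.0
    return acc, op

# 5 positives hidden among 95 negatives.
y_true = [1] * 5 + [0] * 95

# Degenerate classifier: always predicts the majority (negative) class.
acc, op = accuracy_and_op(y_true, [0] * 100)
print(acc, op)  # accuracy 0.95 looks strong, but OP drops to -0.05

# A classifier that actually balances the two classes:
# catches 4/5 positives and 90/95 negatives.
y_pred = [1, 1, 1, 1, 0] + [0] * 90 + [1] * 5
acc2, op2 = accuracy_and_op(y_true, y_pred)
print(acc2, op2)  # slightly lower accuracy (0.94), far higher OP
```

The point of the example: accuracy (and, analogously, Precision) rewards the degenerate majority-class predictor on skewed data, while the OP-style correction penalizes the gap between sensitivity and specificity, favoring the classifier that treats both classes fairly.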
Original language | English |
---|---|
Title of host publication | 2006 IEEE Congress on Evolutionary Computation, CEC 2006 |
Publisher | IEEE |
Pages | 2254-2261 |
Number of pages | 8 |
ISBN (Print) | 0780394879, 9780780394872 |
DOIs | |
Publication status | Published - 11 Sept 2006 |
Externally published | Yes |
Event | 2006 IEEE Congress on Evolutionary Computation, CEC 2006 - Vancouver, BC, Canada. Duration: 16 Jul 2006 → 21 Jul 2006 |
Conference
Conference | 2006 IEEE Congress on Evolutionary Computation, CEC 2006 |
---|---|
Country/Territory | Canada |
City | Vancouver, BC |
Period | 16/07/06 → 21/07/06 |
Keywords
- Computational intelligence
- Laboratories
- Performance analysis
- Guidelines
- Wikipedia
- Classification algorithms
- Competitive intelligence
- Measurement
- Multilayer perceptrons
- Mean square error methods
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Theoretical Computer Science