Optimized precision - A new measure for classifier performance evaluation

Romesh Ranawana, Vasile Palade

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding › peer-review

67 Citations (Scopus)

Abstract

All learning algorithms attempt to improve the accuracy of a classification system. However, the effectiveness of such a system depends on the heuristic used by the learning paradigm to measure performance. This paper demonstrates that using Precision (P) to evaluate performance on unbalanced data sets can lead the solution towards sub-optimal answers. We then present a novel performance heuristic, the 'Optimized Precision (OP)', to negate these detrimental effects. We also analyze the impact of these observations on the training performance of ensemble learners and Multi-Classifier Systems (MCS), and provide guidelines for the proper training of multi-classifier systems.
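The OP measure itself is defined in the paper; as a hedged illustration only, the sketch below implements the formulation most often cited from this work, OP = Accuracy - |Specificity - Sensitivity| / (Specificity + Sensitivity), which penalizes raw accuracy by the imbalance between the two class-wise rates. The function name and the synthetic confusion-matrix counts are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of the Optimized Precision (OP) measure as it is
# commonly cited from Ranawana & Palade (2006):
#   OP = Accuracy - |Specificity - Sensitivity| / (Specificity + Sensitivity)
# The confusion-matrix counts below are synthetic, chosen to mimic an
# unbalanced two-class problem; they are not results from the paper.

def optimized_precision(tp: int, tn: int, fp: int, fn: int) -> float:
    """Accuracy penalized by the sensitivity/specificity imbalance."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate (minority-class recall)
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy - abs(specificity - sensitivity) / (specificity + sensitivity)

# A classifier that labels almost everything as the majority (negative)
# class: high plain accuracy (0.945) on unbalanced data, but OP drops
# because minority-class recall is poor.
print(optimized_precision(tp=5, tn=940, fp=10, fn=45))   # ~0.13

# A classifier with the same overall accuracy (0.945) but balanced
# class-wise rates is rewarded by OP.
print(optimized_precision(tp=45, tn=900, fp=50, fn=5))   # ~0.92
```

Both classifiers score identically under plain accuracy; only the penalty term separates them, which is the behaviour the abstract argues a training heuristic for unbalanced data should have.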

Original language: English
Title of host publication: 2006 IEEE Congress on Evolutionary Computation, CEC 2006
Publisher: IEEE
Pages: 2254-2261
Number of pages: 8
ISBN (Print): 0780394879, 9780780394872
DOIs
Publication status: Published - 11 Sept 2006
Externally published: Yes
Event: 2006 IEEE Congress on Evolutionary Computation, CEC 2006 - Vancouver, BC, Canada
Duration: 16 Jul 2006 - 21 Jul 2006

Conference

Conference: 2006 IEEE Congress on Evolutionary Computation, CEC 2006
Country/Territory: Canada
City: Vancouver, BC
Period: 16/07/06 - 21/07/06

Keywords

  • Computational intelligence
  • Laboratories
  • Performance analysis
  • Guidelines
  • Classification algorithms
  • Competitive intelligence
  • Measurement
  • Multilayer perceptrons
  • Mean square error methods

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Theoretical Computer Science
