Considering Decision Cost During Learning of Feature Weights

  • This paper presents a new algorithm, called KNNcost, for learning feature weights in CBR systems used for classification. Unlike previously known algorithms, KNNcost takes into account the profit of a correct decision and the cost of a wrong one. The need for this algorithm is motivated by two real-world applications in which the cost and profit of decisions play a major role. We introduce a representation of the accuracy, cost, and profit of decisions and define the decision cost of a classification system. To compare accuracy optimization with cost optimization, we tested KNNacc against KNNcost. The first optimizes classification accuracy with a conjugate gradient algorithm; the second optimizes the decision cost of the CBR system, respecting the cost and profit of the classifications. We present experiments with these two algorithms on a real application to demonstrate the usefulness of our approach.

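The decision-cost measure described in the abstract can be illustrated with a minimal sketch: a feature-weighted k-nearest-neighbour classifier whose test-set decisions are scored by a cost/profit matrix, where off-diagonal entries are costs of wrong decisions and (negative) diagonal entries are profits of correct ones. This is an assumption-laden illustration, not the paper's actual KNNcost algorithm; all function names and the toy cost matrix are hypothetical.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, weights, k=3):
    # Majority vote among the k nearest neighbours under a
    # feature-weighted Euclidean distance (weights are assumed
    # per-feature, as in feature-weight learning for CBR).
    d = np.sqrt(((X_train - x) ** 2 * weights).sum(axis=1))
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

def decision_cost(X_train, y_train, X_test, y_test, weights,
                  cost_matrix, k=3):
    # Total decision cost: cost_matrix[true, predicted] summed over
    # the test set. Negative entries encode profits, so a
    # cost-optimizing learner would minimize this quantity rather
    # than the plain error rate.
    total = 0.0
    for x, y in zip(X_test, y_test):
        pred = weighted_knn_predict(X_train, y_train, x, weights, k)
        total += cost_matrix[y, pred]
    return total
```

An accuracy-optimizing learner (like KNNacc) would tune the weights to minimize misclassifications regardless of their price; a cost-optimizing learner (like KNNcost) would instead tune them to minimize this total, so that cheap errors may be tolerated to avoid expensive ones.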
Metadata
Author: Ralph Bergmann, Wolfgang Wilke
URN (permanent link): urn:nbn:de:hbz:386-kluedo-957
Document Type: Preprint
Language of publication: English
Year of Completion: 1996
Year of Publication: 1996
Publishing Institute: Technische Universität Kaiserslautern
Faculties / Organisational entities: Fachbereich Informatik
DDC-Classification: 004 Data processing; Computer science
