In many real binary classification problems, in addition to the positive and negative classes, we are also given examples of a third, neutral class, i.e., examples whose state is uncertain or intermediate between positive and negative. Although it is common practice to ignore the neutral class during learning, its appropriate use can improve classification accuracy. In this paper, to include neutral examples in the training stage, we adapt two variants of Tri-Class SVM (proposed by Angulo et al. in Neural Process Lett 23(1):89–101, 2006), a method designed to solve three-class problems with a single learning model. In analogy to classical SVM, we look for a hyperplane that maximizes the margin between positive and negative instances while lying as close to the neutral class as possible. Beyond the original paper by Angulo et al., we give a new interpretation of the model and show that it can easily be implemented in the primal. Our experiments demonstrate that the considered methods obtain better results on binary classification problems than classical SVM and semi-supervised SVM.
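The abstract suggests a primal objective that combines a standard soft-margin term for the positive and negative examples with a penalty pulling the hyperplane toward the neutral points. The sketch below only illustrates that idea and is not the formulation from the paper: the function name fit_linear_neutral_svm, the neutral-penalty weight D, and the plain absolute-value penalty on neutral scores are assumptions made for this example. It runs subgradient descent on a linear model with NumPy.

```python
import numpy as np

def fit_linear_neutral_svm(X_pos, X_neg, X_neu, C=1.0, D=0.1,
                           lr=0.01, epochs=200):
    """Illustrative sketch (not the paper's Tri-Class SVM formulation):
    subgradient descent on
        0.5*||w||^2 + C * sum hinge(y*(w.x+b)) + D * sum |w.x_neu + b|,
    i.e. a soft-margin term for positive/negative points plus a penalty
    that pulls the separating hyperplane toward the neutral class."""
    X = np.vstack([X_pos, X_neg])
    y = np.concatenate([np.ones(len(X_pos)), -np.ones(len(X_neg))])
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                          # active hinge terms
        neu_scores = X_neu @ w + b
        grad_w = (w
                  - C * (y[viol, None] * X[viol]).sum(axis=0)
                  + D * (np.sign(neu_scores)[:, None] * X_neu).sum(axis=0))
        grad_b = (-C * y[viol].sum()
                  + D * np.sign(neu_scores).sum())
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(w, b, X):
    # Binary decision, as in a classical SVM.
    return np.sign(X @ w + b)
```

A call such as w, b = fit_linear_neutral_svm(X_pos, X_neg, X_neu) followed by predict(w, b, X_test) gives the usual binary decision; kernelization and the two Tri-Class variants considered in the paper are beyond the scope of this sketch.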
dc.subject.en: classification
dc.subject.en: SVM
dc.subject.en: semi-supervised learning
dc.subject.en: cheminformatics
dc.description.volume: 22
dc.description.number: 2
dc.identifier.doi: 10.1007/s10044-017-0654-3
dc.identifier.eissn: 1433-755X
dc.title.journal: Pattern Analysis and Applications
dc.language.container: eng
dc.affiliation: Wydział Matematyki i Informatyki : Instytut Informatyki i Matematyki Komputerowej
dc.subtype: Article
dc.rights.original: CC-BY; other; final publisher version; at the time of publication; 0