Combining k-nearest neighbor and centroid neighbor classifier for fast and robust classification

2016
book section
conference proceedings
cris.lastimport.wos: 2024-04-09T19:34:56Z
dc.abstract.en: The k-NN classifier is one of the best-known and most widely used nonparametric classifiers. The k-NN rule is asymptotically optimal, meaning that its classification error converges to the Bayes error as the number of training samples approaches infinity. Many extensions of the traditional k-NN have been developed to improve classification accuracy. However, it is also a well-known fact that as the number of samples grows the method can become very inefficient, because the distances from the test sample to every sample in the training data set must be computed. In this paper, a simple method that addresses this issue is proposed. Combining the k-NN classifier with the centroid neighbor classifier improves the speed of the algorithm without changing the results of the original k-NN. In fact, using confusion matrices and excluding outliers makes the resulting algorithm much faster and more robust.
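To illustrate the general idea the abstract describes (not the paper's exact algorithm), the sketch below shows a centroid-pruned k-NN classifier in Python, assuming only NumPy: class centroids are computed once during training, a cheap centroid-distance pass narrows the candidate classes for each test sample, and the ordinary k-NN vote runs only over samples of those classes. The names CentroidPrunedKNN and n_candidate_classes are hypothetical, and unlike the method claimed in the abstract, this simplification is not guaranteed to reproduce the original k-NN results exactly.

# Hedged sketch of the centroid-pruning idea; not the authors' published algorithm.
import numpy as np

class CentroidPrunedKNN:
    def __init__(self, k=3, n_candidate_classes=2):
        self.k = k
        self.n_candidate_classes = n_candidate_classes

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        self.classes_ = np.unique(self.y)
        # One centroid per class: the mean of that class's training samples.
        self.centroids_ = np.vstack([self.X[self.y == c].mean(axis=0)
                                     for c in self.classes_])
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        preds = np.empty(len(X), dtype=self.y.dtype)
        for i, x in enumerate(X):
            # Cheap pass: distances to the few class centroids only.
            d_cent = np.linalg.norm(self.centroids_ - x, axis=1)
            keep = self.classes_[np.argsort(d_cent)[:self.n_candidate_classes]]
            # Expensive pass: ordinary k-NN vote, restricted to the candidate classes.
            mask = np.isin(self.y, keep)
            d = np.linalg.norm(self.X[mask] - x, axis=1)
            nn_labels = self.y[mask][np.argsort(d)[:self.k]]
            labels, counts = np.unique(nn_labels, return_counts=True)
            preds[i] = labels[np.argmax(counts)]
        return preds

With n_candidate_classes equal to the total number of classes the sketch reduces to plain k-NN; smaller values save distance computations at the cost of possibly altering some predictions, a trade-off the paper itself addresses differently, since the abstract states that its combination does not change the original k-NN results.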
dc.affiliation: Faculty of Physics, Astronomy and Applied Computer Science : Department of Games Technology
dc.conference: 11th International Conference on Hybrid Artificial Intelligent Systems, HAIS 2016
dc.conference.city: Seville
dc.conference.country: Spain
dc.conference.datefinish: 2016-04-20
dc.conference.datestart: 2016-04-18
dc.conference.indexscopus: true
dc.conference.indexwos: true
dc.contributor.author: Chmielnicki, Wiesław - 160876
dc.contributor.editor: Martínez-Álvarez, Francisco
dc.contributor.editor: Troncoso, Alicia
dc.contributor.editor: Quintián, Héctor
dc.contributor.editor: Corchado, Emilio
dc.date.accessioned: 2016-06-30T11:02:43Z
dc.date.available: 2016-06-30T11:02:43Z
dc.date.issued: 2016
dc.description.conftype: international
dc.description.physical: 536-548
dc.description.publication: 0,8
dc.description.series: Lecture Notes in Computer Science. Lecture Notes in Artificial Intelligence
dc.description.series: Lecture Notes in Computer Science
dc.description.seriesnumber: 9648
dc.identifier.doi: 10.1007/978-3-319-32034-2_45
dc.identifier.eisbn: 978-3-319-32034-2
dc.identifier.isbn: 978-3-319-32033-5
dc.identifier.serieseissn: 1611-3349
dc.identifier.seriesissn: 0302-9743
dc.identifier.uri: http://ruj.uj.edu.pl/xmlui/handle/item/28536
dc.language: eng
dc.language.container: eng
dc.pubinfo: [s.l.] : Springer International Publishing
dc.rights: Only the bibliographic description is added*
dc.rights.licence: no licence
dc.rights.uri: *
dc.subject.en: k-NN classifier
dc.subject.en: confusion matrix
dc.subject.en: statistical classifiers
dc.subject.en: supervised classification
dc.subject.en: multiclass classifiers
dc.subtype: ConferenceProceedings
dc.title: Combining k-nearest neighbor and centroid neighbor classifier for fast and robust classification
dc.title.container: Hybrid artificial intelligent systems : 11th International Conference, HAIS 2016, Seville, Spain, April 18-20, 2016 : proceedings
dc.type: BookSection
dspace.entity.type: Publication