dc.abstract.en | Most existing classification methods aim at minimization of empirical risk (measured with some simple point-based loss function) with added regularization. We propose to approach the classification problem by applying entropy measures as the model's objective function. We focus on quadratic Rényi's entropy and the related Cauchy-Schwarz Divergence, which leads to the construction of extreme entropy machines (EEM). The main contribution of this paper is a model based on information theoretic concepts which, on the one hand, offers a new, entropic perspective on known linear classifiers and, on the other, leads to a very robust method competitive with state-of-the-art non-information-theoretic ones (including Support Vector Machines and Extreme Learning Machines). Evaluation on numerous problems, spanning from small, simple ones from the UCI repository to large (hundreds of thousands of samples), extremely unbalanced (up to 100:1 class ratio) datasets, shows wide applicability of the EEM to real-life problems. Furthermore, it scales better than all considered competitive methods. | pl |
dc.affiliation | Wydział Matematyki i Informatyki : Instytut Informatyki i Matematyki Komputerowej | pl |
dc.contributor.author | Czarnecki, Wojciech - 115076 | pl |
dc.contributor.author | Tabor, Jacek - 132362 | pl |
dc.date.accessioned | 2017-04-26T10:41:11Z | |
dc.date.available | 2017-04-26T10:41:11Z | |
dc.date.issued | 2017 | pl |
dc.date.openaccess | 0 | |
dc.description.accesstime | at the time of publication | |
dc.description.number | 2 | pl |
dc.description.physical | 383-400 | pl |
dc.description.version | final publisher's version | |
dc.description.volume | 20 | pl |
dc.identifier.doi | 10.1007/s10044-015-0497-8 | pl |
dc.identifier.eissn | 1433-755X | pl |
dc.identifier.issn | 1433-7541 | pl |
dc.identifier.uri | http://ruj.uj.edu.pl/xmlui/handle/item/39735 | |
dc.language | eng | pl |
dc.language.container | eng | pl |
dc.rights | Licence granted. Attribution 3.0 Poland | * |
dc.rights.licence | CC-BY | |
dc.rights.uri | http://creativecommons.org/licenses/by/3.0/pl/legalcode | * |
dc.share.type | other | |
dc.subject.en | rapid learning | pl |
dc.subject.en | extreme learning machines | pl |
dc.subject.en | random projections | pl |
dc.subject.en | classification | pl |
dc.subject.en | entropy | pl |
dc.subtype | Article | pl |
dc.title | Extreme entropy machines : robust information theoretic classification | pl |
dc.title.journal | Pattern Analysis and Applications | pl |
dc.type | JournalArticle | pl |
dspace.entity.type | Publication |
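
The abstract above describes EEM as combining an ELM-style random projection with an entropy-based objective (quadratic Rényi's entropy / Cauchy-Schwarz Divergence) for binary classification. The snippet below is a minimal, hypothetical Python sketch of that general idea only: it assumes the class densities in the randomly projected space are modelled as Gaussians and that their separating direction is obtained from an LDA-like closed-form linear system. The class name EntropyLikeClassifier, all parameter names, and the closed-form rule are illustrative assumptions, not the authors' exact formulation or implementation.

# Illustrative sketch only (not the paper's EEM implementation):
# ELM-style random hidden layer, then Gaussian models of each class
# in the projected space, separated by an LDA-like closed-form direction.
import numpy as np

class EntropyLikeClassifier:
    def __init__(self, n_hidden=100, reg=1e-3, random_state=0):
        self.n_hidden = n_hidden      # size of the random feature map
        self.reg = reg                # covariance regularizer
        self.rng = np.random.default_rng(random_state)

    def _project(self, X):
        # Fixed random weights + tanh nonlinearity (ELM-style projection)
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._project(X)
        H_pos, H_neg = H[y == 1], H[y != 1]
        m_pos, m_neg = H_pos.mean(axis=0), H_neg.mean(axis=0)
        # Pooled, regularized covariance of the two Gaussian class models
        S = np.cov(H_pos.T) + np.cov(H_neg.T) + self.reg * np.eye(self.n_hidden)
        # LDA-like closed form: direction separating the two Gaussians
        self.beta = np.linalg.solve(S, m_pos - m_neg)
        self.threshold = 0.5 * (m_pos + m_neg) @ self.beta
        return self

    def predict(self, X):
        H = self._project(np.asarray(X, float))
        return np.where(H @ self.beta > self.threshold, 1, -1)

Usage would follow the usual fit/predict pattern with labels in {+1, -1}, e.g. clf = EntropyLikeClassifier().fit(X_train, y_train); y_pred = clf.predict(X_test).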