Hebbian continual representation learning

2023
book section
conference proceedings
dc.abstract.en: Continual Learning aims to bring machine learning into a more realistic scenario, where tasks are learned sequentially and the i.i.d. assumption does not hold. Although this setting is natural for biological systems, it proves very difficult for machine learning models such as artificial neural networks. To reduce this performance gap, we investigate whether biologically inspired Hebbian learning is useful for tackling continual challenges. In particular, we highlight a realistic and often overlooked unsupervised setting, where the learner has to build representations without any supervision. By combining sparse neural networks with the Hebbian learning principle, we build a simple yet effective alternative (HebbCL) to typical neural network models trained via gradient descent. Due to Hebbian learning, the network has easily interpretable weights, which may be essential in critical applications such as security or healthcare. We demonstrate the efficacy of HebbCL in an unsupervised learning setting applied to the MNIST and Omniglot datasets. We also adapt the algorithm to the supervised scenario and obtain promising results in class-incremental learning.
dc.affiliation: Faculty of Mathematics and Computer Science : Institute of Computer Science and Computational Mathematics
dc.affiliation: Doctoral School of Exact and Natural Sciences
dc.conference: 56th Annual Hawaii International Conference on System Sciences
dc.conference.city: Maui
dc.conference.country: Hawaii, United States
dc.conference.datefinish: 2023-01-06
dc.conference.datestart: 2023-01-03
dc.conference.shortcut: HICSS
dc.contributor.author: Morawiecki, Pawel
dc.contributor.author: Krutsylo, Andrii
dc.contributor.author: Wołczyk, Maciej - 247731
dc.contributor.author: Śmieja, Marek - 135996
dc.contributor.editor: Bui, Tung X.
dc.date.accession: 2023-01-23
dc.date.accessioned: 2023-02-20T15:54:42Z
dc.date.available: 2023-02-20T15:54:42Z
dc.date.issued: 2023
dc.date.openaccess: 1
dc.description.accesstime: after publication
dc.description.conftype: international
dc.description.physical: 1259-1268
dc.description.version: final publisher's version
dc.identifier.bookweblink: https://hdl.handle.net/10125/102785
dc.identifier.eissn: 2572-6862
dc.identifier.isbn: 978-0-9981331-6-4
dc.identifier.uri: https://ruj.uj.edu.pl/xmlui/handle/item/308035
dc.identifier.weblink: https://hdl.handle.net/10125/102785
dc.language: eng
dc.language.container: eng
dc.pubinfo: Honolulu, HI
dc.rights: I grant the license. Attribution - NonCommercial - NoDerivatives 4.0 International
dc.rights.licence: CC-BY-NC-ND
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/legalcode.pl
dc.share.type: open repository
dc.subject.en: interpretable neural network
dc.subject.en: continual learning
dc.subject.en: unsupervised learning
dc.subject.en: Hebbian learning
dc.subtype: ConferenceProceedings
dc.title: Hebbian continual representation learning
dc.title.container: Proceedings of the 56th Annual Hawaii International Conference on System Sciences, HICSS 2023, Hyatt Regency Maui, Hawaii, USA, 3-6 January 2023, Maui, Hawaii
dc.type: BookSection
dspace.entity.type: Publication