445 Hits in 4.0 sec

Hebbian Semi-Supervised Learning in a Sample Efficiency Setting [article]

Gabriele Lagani, Fabrizio Falchi, Claudio Gennaro, Giuseppe Amato
2021 arXiv   pre-print
We propose to address the issue of sample efficiency in Deep Convolutional Neural Networks (DCNNs) with a semi-supervised training strategy that combines Hebbian learning with gradient descent: all internal  ...  We performed experiments on various object recognition datasets, in different regimes of sample efficiency, comparing our semi-supervised (Hebbian for internal layers + SGD for the final fully connected layer) approach  ...  Section 4 defines our approach to sample efficiency based on semi-supervised Hebbian + SGD learning; Section 5 provides background on the Hebbian learning rule that we used in this work; Section 6 delves  ... 
arXiv:2103.09002v1 fatcat:rba5wzatynanrd5nuf73kevov4
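The hybrid scheme this abstract describes (Hebbian updates for internal layers, SGD only for the final classifier) can be illustrated with a minimal NumPy toy sketch. The layer sizes, random data, choice of Oja's rule as the Hebbian variant, and softmax output layer below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 20 features, 3 classes (random, for illustration only)
X = rng.normal(size=(100, 20))
y = rng.integers(0, 3, size=100)

# Hidden layer trained with a Hebbian rule (Oja's rule keeps weights bounded)
W_hid = rng.normal(scale=0.1, size=(20, 16))
eta_hebb = 0.01
for x in X:                       # unsupervised pass; labels are not used here
    h = x @ W_hid                 # linear activations
    W_hid += eta_hebb * (np.outer(x, h) - W_hid * (h ** 2))  # Oja update

# Final linear layer trained with SGD on the labeled samples only
W_out = np.zeros((16, 3))
eta_sgd = 0.1
for x, t in zip(X, y):
    h = x @ W_hid
    logits = h @ W_out
    p = np.exp(logits - logits.max())
    p /= p.sum()                              # softmax probabilities
    grad = np.outer(h, p - np.eye(3)[t])      # cross-entropy gradient
    W_out -= eta_sgd * grad

preds = np.argmax((X @ W_hid) @ W_out, axis=1)
```

In a sample-efficiency setting, the Hebbian pass could consume the full unlabeled pool while the SGD pass sees only the small labeled subset.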

Semi-supervised learning with Bayesian Confidence Propagation Neural Network [article]

Naresh Balaji Ravichandran, Anders Lansner, Pawel Herman
2021 arXiv   pre-print
In this work, we show how such representations can be leveraged in a semi-supervised setting by introducing and comparing different classifiers.  ...  We also evaluate and compare such networks with other popular semi-supervised classifiers.  ...  Semi-supervised learning methods aim to remedy this problem by learning effectively from few labelled samples [1].  ... 
arXiv:2106.15546v1 fatcat:g4txchd2jjaydkt4uanbn5zmfy

Learning Sparse, Distributed Representations using the Hebbian Principle [article]

Aseem Wadhwa, Upamanyu Madhow
2016 arXiv   pre-print
The "fire together, wire together" Hebbian model is a central principle for learning in neuroscience, but surprisingly, it has found limited applicability in modern machine learning.  ...  In this paper, we take a first step towards bridging this gap, by developing flavors of competitive Hebbian learning which produce sparse, distributed neural codes using online adaptation with minimal  ...  on labels, and our results motivate further investigation into Hebbian algorithms as building blocks for semi-supervised learning.  ... 
arXiv:1611.04228v1 fatcat:uoz4osftk5gbhcchrhjg4htbwq
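The competitive, "fire together, wire together" flavor of Hebbian learning this abstract points at can be sketched as a winner-take-all update in which only the most responsive unit moves toward the input, yielding a sparse (one-hot) code. The unit count, learning rate, and random data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))          # toy inputs
W = rng.normal(size=(8, 10))            # 8 competing units
W /= np.linalg.norm(W, axis=1, keepdims=True)

eta = 0.05
for x in X:
    sims = W @ x                        # each unit's response to the input
    k = np.argmax(sims)                 # winner-take-all competition
    W[k] += eta * (x - W[k])            # only the winner "wires together" with x
    W[k] /= np.linalg.norm(W[k])        # keep the winner's weights normalized

# Sparse, distributed code: the index of the winning unit per input
codes = np.argmax(X @ W.T, axis=1)
```

The resulting codes could then feed a downstream classifier trained on the few available labels, which is the semi-supervised direction the abstract motivates.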

A Single-Layer Semi-Supervised Feed Forward Neural Network Clustering Method

Roya Asadi, Sameem Abdul Kareem, Mitra Asadi, Shokoofeh Asadi
2015 Malaysian Journal of Computer Science  
The aim of this research is to develop and propose a single-layer semi-supervised feed forward neural network clustering method with one epoch training in order to solve the problems of low training speed  ...  A code book of non-random weights is learned through the input data directly.  ...  Fig. 2 shows a sample topology of an unsupervised neural network with competitive learning [17, 18]: The similarities between Hebbian learning and competitive learning include unsupervised learning  ... 
doi:10.22452/mjcs.vol28no3.2 fatcat:6v23vql5kzh67dz4kme2zwaw2y

Improving robustness against electrode shift of sEMG based hand gesture recognition using online semi-supervised learning

Qiu Xia Li, Patrick P. K. Chan, Dalin Zhou, Yinfeng Fang, Honghai Liu, Daniel S. Yeung
2016 2016 International Conference on Machine Learning and Cybernetics (ICMLC)  
In this paper, we apply online semi-supervised learning, in which a classifier is trained with a small amount of labeled samples and then updated online with unlabeled samples, to hand gesture recognition  ...  A well-known online semi-supervised learning algorithm, the online multi-channel semi-supervised growing neural gas (OSSMGNG) algorithm, is used in this preliminary study.  ...  A classifier is trained with a small amount of labeled samples and then is updated with unlabeled samples online in online semi-supervised learning.  ... 
doi:10.1109/icmlc.2016.7860925 dblp:conf/icmlc/LiCZFLY16 fatcat:bc7u23g5xzfkfc5k6e3gblkov4
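The train-on-labeled-then-update-online-with-unlabeled pattern described above can be sketched with a simple pseudo-labeling scheme. The nearest-class-mean classifier, the running-mean update, and the toy two-class data below are illustrative assumptions, not the OSSMGNG algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy 2-class data: a few labeled samples plus a stream of unlabeled ones
X_lab = rng.normal(size=(10, 4)) + np.repeat([[2.0], [-2.0]], 5, axis=0)
y_lab = np.array([0] * 5 + [1] * 5)
X_stream = rng.normal(size=(50, 4)) + np.where(rng.random((50, 1)) < 0.5, 2.0, -2.0)

# Nearest-class-mean classifier seeded on the small labeled set
means = np.stack([X_lab[y_lab == c].mean(axis=0) for c in (0, 1)])
counts = np.array([5.0, 5.0])

for x in X_stream:                                    # online phase, no labels
    c = np.argmin(np.linalg.norm(means - x, axis=1))  # pseudo-label by proximity
    counts[c] += 1
    means[c] += (x - means[c]) / counts[c]            # incremental mean update
```

Adapting the class prototypes from unlabeled data in this way is one simple route to the robustness-under-drift (e.g., electrode shift) goal the paper targets.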

A dynamic semisupervised feedforward neural network clustering

Roya Asadi, Sameem Abdul Kareem, Shokoofeh Asadi, Mitra Asadi
2016 Artificial intelligence for engineering design, analysis and manufacturing  
After learning, the model assigns a class label to the unlabeled data by considering a linear activation function and the exclusive threshold.  ...  Learning Repository; and superior F-measure results, between 98.14% and 100% accuracy, for the breast cancer data set from the University of Malaya Medical Center.  ...  Figure 17 shows a sample of this data set.  ... 
doi:10.1017/s0890060416000160 fatcat:hd4iuekcpfgiddgcdpur6uwvum

Comparative Study of Microarray Based Disease Prediction - A Survey

T. Sneka, K. Palanivel
2019 International Journal of Scientific Research in Computer Science Engineering and Information Technology  
Hence, it is used in analyzing samples that may be normal or affected, and also in diagnosing various gene-based diseases.  ...  This survey shows how the semi-supervised approach evolves and outperforms existing algorithms.  ...  • To determine how supervised, unsupervised, and semi-supervised methods contribute to the prediction of disease. • Also, how the semi-supervised method is evolving with improved accuracy.  ... 
doi:10.32628/cseit195435 fatcat:jpbopg3mmvczpm4q3bqcdk55pq

Truncated Variational EM for Semi-Supervised Neural Simpletrons [article]

Dennis Forster, Jörg Lücke
2017 arXiv   pre-print
To obtain efficiently trainable, large-scale, and well-performing generative networks for semi-supervised learning, we here combine two recent developments: a neural network reformulation of hierarchical  ...  Experiments on the MNIST data set herein allow for comparison to standard and state-of-the-art models in the semi-supervised setting.  ...  [6] still being better in semi-supervised settings when more labels were available).  ... 
arXiv:1702.01997v1 fatcat:jqavjndbozfdljoom5uy5crlem

Neuroscience-inspired online unsupervised learning algorithms [article]

Cengiz Pehlevan, Dmitri B. Chklovskii
2019 arXiv   pre-print
Motivated by this and by the biological implausibility of deep learning networks, we developed a family of biologically plausible artificial neural networks (NNs) for unsupervised learning.  ...  Although the currently popular deep learning networks achieve unprecedented performance on some tasks, the human brain still has a monopoly on general intelligence.  ...  This work was in part supported by a gift from the Intel Corporation.  ... 
arXiv:1908.01867v2 fatcat:6eih5oqttjgvnaycqs5uc2laje

Text Classification Techniques: A Literature Review

2018 Interdisciplinary Journal of Information, Knowledge, and Management  
These algorithms were divided based on the learning procedure used. Finally, the findings were plotted as a tree structure for visualizing the relationship between learning procedures and algorithms.  ...  Another interesting research opportunity lies in building intricate text data models with deep learning systems.  ...  matrix factorization, singular value decomposition, and also those mentioned in Figure 1. Semi-supervised learning (SSL) is a combination of supervised and unsupervised learning  ... 
doi:10.28945/4066 fatcat:6dio5bpajjf77lkrs7xdtciveu

Unsupervised Feature Learning With Winner-Takes-All Based STDP

Paul Ferré, Franck Mamalet, Simon J. Thorpe
2018 Frontiers in Computational Neuroscience  
We present a novel strategy for unsupervised feature learning in image applications inspired by the Spike-Timing-Dependent-Plasticity (STDP) biological learning rule.  ...  Next we introduce a binary STDP learning rule compatible with training on batches of images.  ...  These methods reach state of the art performances, either using top layer features as inputs for a classifier or within a semi-supervised learning framework.  ... 
doi:10.3389/fncom.2018.00024 pmid:29674961 pmcid:PMC5895733 fatcat:hxdcric4abg5jiqims53p23pwi
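The STDP principle behind this work, namely that a synapse strengthens when the presynaptic neuron fires just before the postsynaptic one and weakens in the reverse order, can be sketched with a simplified pair-based rule over binary spike trains. This is a generic illustration under assumed sizes and learning rates, not the paper's batch-compatible binary STDP rule or its WTA circuit.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pre, n_post, n_steps = 6, 3, 10
W = rng.uniform(0.3, 0.7, size=(n_pre, n_post))  # synaptic weights in [0, 1]
a_plus, a_minus = 0.05, 0.03                     # potentiation / depression rates

# Binary spike trains over a short window (1 = the neuron spiked at that step)
pre = rng.integers(0, 2, size=(n_steps, n_pre))
post = rng.integers(0, 2, size=(n_steps, n_post))

for t in range(1, n_steps):
    # LTP: pre-neuron spiked one step before a post spike ("pre before post")
    ltp = np.outer(pre[t - 1], post[t])
    # LTD: pre-neuron spiked one step after a post spike ("post before pre")
    ltd = np.outer(pre[t], post[t - 1])
    W = np.clip(W + a_plus * ltp - a_minus * ltd, 0.0, 1.0)
```

Clipping to [0, 1] stands in for the bounded, binary-friendly weights that make such rules amenable to batched training on images.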

On the Performance of Hierarchical Temporal Memory Predictions of Medical Streams in Real Time

Noha O. El-Ganainy, Ilangko Balasingham, Per Steinar Halvorsen, Leiv Arne Rosseland
2019 2019 13th International Symposium on Medical Information and Communication Technology (ISMICT)  
HTM is found advantageous as it provides efficient unsupervised predictions compared to the semi-supervised learning supported by LSTM in terms of the error measures.  ...  Applying machine learning on medical streams might lead to a breakthrough on emergency and critical care through online predictions.  ...  a semi-supervised process.  ... 
doi:10.1109/ismict.2019.8743902 dblp:conf/ismict/El-GanainyBHR19 fatcat:w2hga4ne7jb6vow3pahjvzko2e

Improvement of Heterogeneous Transfer Learning Efficiency by Using Hebbian Learning Principle

Arjun Magotra, Juntae Kim
2020 Applied Sciences  
In this article, we propose a way of improving transfer learning efficiency, in the case of a heterogeneous source and target, by using the Hebbian learning principle, called Hebbian transfer learning (HTL)  ...  We apply the Hebbian principle as synaptic plasticity in transfer learning for classification of images using a heterogeneous source-target dataset, and compare results with the standard transfer learning  ...  Various deep learning models have been successfully applied to unsupervised, semi-supervised, and supervised learning.  ... 
doi:10.3390/app10165631 fatcat:76hkgddlpbgk7jctrwai44vzvq

Top–Down Connections in Self-Organizing Hebbian Networks: Topographic Class Grouping

M Luciw, Juyang Weng
2010 IEEE Transactions on Autonomous Mental Development  
It is very general, describing a framework including unsupervised learning, supervised learning, semi-supervised learning, and communicative learning. B.  ...  Therefore, optimality is spatio-temporal, since we use the most efficient estimator at any time. 3) Optimal Hebbian Updating: We use the spatio-temporal optimality as a guide to set a neuron's learning  ... 
doi:10.1109/tamd.2010.2072150 fatcat:4rckp3y445bojipl5j3mfoqmti

Emergence of multimodal action representations from neural network self-organization

German I. Parisi, Jun Tani, Cornelius Weber, Stefan Wermter
2017 Cognitive Systems Research  
Associative links to bind unimodal representations are incrementally learned by a semi-supervised algorithm with bidirectional connectivity.  ...  We propose a hierarchical architecture with growing self-organizing neural networks for learning human actions from audiovisual inputs.  ...  Acknowledgment This research was partially supported by the DAAD German Academic Exchange Service (Kz:A/13/94748) and the DFG (Deutsche Forschungsgemeinschaft) for the project Cross-modal Learning TRR-  ... 
doi:10.1016/j.cogsys.2016.08.002 fatcat:uvsy7swgkvgypnudisbilpuz2i
Showing results 1–15 of 445 results