31,616 Hits in 2.2 sec

Low-Shot Learning with Imprinted Weights [article]

Hang Qi, Matthew Brown, David G. Lowe
2018 arXiv   pre-print
However, it differs in that only a single imprinted weight vector is learned for each novel category, rather than relying on a nearest-neighbor distance to training instances as typically used with embedding  ...  We describe how to add a similar capability to ConvNet classifiers by directly setting the final layer weights from novel training examples during low-shot learning.  ...  Transfer Learning with Imprinted Weights. We show that imprinting benefits transfer learning in general.  ... 
arXiv:1712.07136v2 fatcat:oghxdoqdd5dttfeup2xqwnbmku
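
As a minimal sketch of the imprinting step this abstract describes (illustrative code, not the authors' implementation): the final-layer weight for a novel class is set to the re-normalized mean of the L2-normalized embeddings of its few examples, so a cosine-similarity classifier can score the new class immediately. The 64-dimensional toy embeddings below are placeholders.

    import numpy as np

    def imprint_weight(embeddings):
        # Average the L2-normalized embeddings of a novel class and re-normalize;
        # the result is used directly as the new class's final-layer weight vector.
        normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        proto = normed.mean(axis=0)
        return proto / np.linalg.norm(proto)

    rng = np.random.default_rng(0)
    W = rng.normal(size=(5, 64))                    # weights of 5 existing classes
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    novel = rng.normal(size=(3, 64))                # 3 embeddings of a new class
    W = np.vstack([W, imprint_weight(novel)])       # classifier now has 6 classes
    query = novel[0] / np.linalg.norm(novel[0])
    print((W @ query).argmax())                     # expected: 5 (the imprinted class)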

Low-Shot Learning with Imprinted Weights

Hang Qi, Matthew Brown, David G. Lowe
2018 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition  
However, it differs in that only a single imprinted weight vector is learned for each novel category, rather than relying on a nearest-neighbor distance to training instances as typically used with embedding  ...  We describe how to add a similar capability to ConvNet classifiers by directly setting the final layer weights from novel training examples during low-shot learning.  ...  Transfer Learning with Imprinted Weights. We show that imprinting benefits transfer learning in general.  ... 
doi:10.1109/cvpr.2018.00610 dblp:conf/cvpr/QiBL18 fatcat:663tryoywjf75mgzfof3yaep5m

Adaptive Masked Proxies for Few-Shot Segmentation [article]

Mennatullah Siam, Boris Oreshkin, Martin Jagersand
2019 arXiv   pre-print
It utilizes multi-resolution average pooling on base embeddings masked with the label to act as a positive proxy for the new class, while fusing it with the previously learned class signatures.  ...  We propose a novel adaptive masked proxies method that constructs the final segmentation layer weights from few labelled samples.  ...  The study also compares weight imprinting coupled with back-propagation against backpropagation on randomly generated weights.  ... 
arXiv:1902.11123v5 fatcat:bxznbwvp5zdulfaioswkl4p7qe
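
The masked pooling step this abstract refers to can be sketched roughly as follows (shapes and names are assumptions; the paper's multi-resolution pooling and fusion with previously learned class signatures are omitted): the base feature map is averaged only over positions where the support label is 1, and the resulting vector serves as a proxy weight for the novel class.

    import numpy as np

    def masked_average_pooling(features, mask):
        # features: (C, H, W) base embeddings; mask: (H, W) binary support label.
        # Returns a (C,) proxy vector averaged over the masked positions only;
        # it can then be normalized and imprinted as a 1x1 filter for the new class.
        m = mask.astype(features.dtype)
        pooled = (features * m).reshape(features.shape[0], -1).sum(axis=1)
        return pooled / (m.sum() + 1e-8)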

Characterization of bovine (Bos taurus) imprinted genes from genomic to amino acid attributes by data mining approaches

Keyvan Karami, Saeed Zerehdaran, Ali Javadmanesh, Mohammad Mahdi Shariati, Hossein Fallahi, Sebastian D. Fugmann
2019 PLoS ONE  
To estimate classification accuracy, 10-fold cross-validation was applied to each combination of attribute weighting (feature selection) and machine learning algorithm.  ...  In this study, we have employed classification and clustering algorithms with attribute weighting to specify the unique attributes of both imprinted (monoallelic) and biallelically expressed genes.  ...  The deep learning model was trained with stochastic gradient descent using back-propagation.  ...
doi:10.1371/journal.pone.0217813 pmid:31170205 pmcid:PMC6553745 fatcat:vpi6pcohufbubaomaixbhugr7m
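
The evaluation protocol mentioned in the snippet (10-fold cross-validation for every combination of attribute weighting and learning algorithm) could look roughly like the scikit-learn sketch below; the selectors, learners, and synthetic data are placeholders, not the study's actual features or models.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=40, random_state=0)
    selectors = {"top10": SelectKBest(f_classif, k=10),
                 "top20": SelectKBest(f_classif, k=20)}
    learners = {"random_forest": RandomForestClassifier(random_state=0),
                "svm": SVC()}

    # 10-fold cross-validated accuracy for every selector/learner combination.
    for s_name, selector in selectors.items():
        for l_name, learner in learners.items():
            scores = cross_val_score(make_pipeline(selector, learner), X, y, cv=10)
            print(s_name, l_name, round(scores.mean(), 3))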

Personalizing Pre-trained Models [article]

Mina Khan, P Srivatsa, Advait Rane, Shriram Chenniappa, Asadali Hazariwala, Pattie Maes
2021 arXiv   pre-print
We developed a technique, called Multi-label Weight Imprinting (MWI), for multi-label, continual, and few-shot learning, and CLIPPER uses MWI with image representations from CLIP.  ...  We consider how upstream pre-trained models can be leveraged for downstream few-shot, multi-label, and continual learning tasks.  ...  We replace the softmax with sigmoid activations for each class in weight imprinting to enable multi-label learning.  ...
arXiv:2106.01499v1 fatcat:7wu5e533crf3padwftwtnx2f3y
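
The sigmoid-for-softmax substitution described in the last sentence can be illustrated with a small sketch (the scale and threshold values are assumptions, and the image embedding is simply taken as a given unit vector): each imprinted class receives an independent probability, so several labels can fire at once.

    import numpy as np

    def multilabel_imprinted_predict(W, embedding, scale=10.0, threshold=0.5):
        # W: (num_classes, d) L2-normalized imprinted weights;
        # embedding: (d,) L2-normalized image feature (e.g. from CLIP).
        logits = scale * (W @ embedding)        # scaled cosine similarities
        probs = 1.0 / (1.0 + np.exp(-logits))   # per-class sigmoid instead of softmax
        return probs >= threshold               # boolean multi-label prediction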

COVID-19 Detection from Chest X-Ray Images Using Deep Convolutional Neural Networks with Weights Imprinting Approach

Hala As'ad, Hilda Azmi, Pengcheng Xi, Ashkan Ebadi, Stéphane Tremblay, Alexander Wong
2021 Journal of Computational Vision and Imaging Systems  
To the best of the authors' knowledge, the proposed solution is one of the first to utilize an imprinted weights model with a weighted average ensemble for enhancing the model sensitivity to COVID-19.  ...  In this work, we propose the use of pre-trained deep Convolutional Neural Networks (CNN) and integrate them with a few-shot learning approach named imprinted weights.  ...  In order to deal with the imbalanced dataset, we apply the imprinted weights approach with fine-tuning.  ...
doi:10.15353/jcvis.v6i1.3546 fatcat:v2aldl5yinf4hk72vbj5dx6ye4
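
The weighted average ensemble mentioned above can be sketched as follows (the per-model weights would in practice be tuned, e.g. on a validation set; nothing here reproduces the paper's models or data):

    import numpy as np

    def weighted_average_ensemble(prob_list, weights):
        # prob_list: list of (N, num_classes) probability arrays, one per CNN.
        # weights: one non-negative weight per model; normalized to sum to 1.
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        stacked = np.stack(prob_list, axis=0)      # (num_models, N, num_classes)
        return np.tensordot(w, stacked, axes=1)    # weighted mean over models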

TransMatch: A Transfer-Learning Scheme for Semi-Supervised Few-Shot Learning [article]

Zhongjie Yu, Lin Chen, Zhongwei Cheng, Jiebo Luo
2020 arXiv   pre-print
Under the proposed framework, we develop a novel method for semi-supervised few-shot learning called TransMatch by instantiating the three components with Imprinting and MixMatch.  ...  updating the model with a semi-supervised learning method.  ...  Part II: Classifier Weight Imprinting. The weight imprinting method was proposed by [16], and has achieved impressive performance in the few-shot learning task as a representative of transfer-learning  ...
arXiv:1912.09033v2 fatcat:szsc2nnhzzfstb4zsvwonfarze
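
As a very loose illustration of the pipeline structure named here (imprint classifier weights from the labeled examples, then exploit unlabeled data), the sketch below uses a plain confidence-thresholded pseudo-label update in place of MixMatch; it is not the TransMatch algorithm, and the temperature, threshold, and learning rate are made-up values.

    import numpy as np

    def imprint(embeddings, labels, num_classes):
        # Per-class mean of L2-normalized embeddings, re-normalized.
        W = np.zeros((num_classes, embeddings.shape[1]))
        for c in range(num_classes):
            e = embeddings[labels == c]
            e = e / np.linalg.norm(e, axis=1, keepdims=True)
            m = e.mean(axis=0)
            W[c] = m / np.linalg.norm(m)
        return W

    def pseudo_label_refine(W, unlabeled, temperature=10.0, confidence=0.8, lr=0.1):
        # Toy semi-supervised step: pull class weights toward confidently
        # pseudo-labeled unlabeled embeddings (a stand-in for MixMatch training).
        U = unlabeled / np.linalg.norm(unlabeled, axis=1, keepdims=True)
        logits = temperature * (U @ W.T)
        probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        for u, p in zip(U, probs):
            c = p.argmax()
            if p[c] >= confidence:
                W[c] = (1 - lr) * W[c] + lr * u
                W[c] /= np.linalg.norm(W[c])
        return W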

XtarNet: Learning to Extract Task-Adaptive Representation for Incremental Few-Shot Learning [article]

Sung Whan Yoon, Do-Yeon Kim, Jun Seo, Jaekyun Moon
2020 arXiv   pre-print
The challenge gets greater when a novel task is given with only a few labeled examples, a problem known as incremental few-shot learning.  ...  The concept of TAR can also be used in conjunction with existing incremental few-shot learning methods; extensive simulation results in fact show that applying TAR enhances the known methods significantly  ...  When used with our XtarNet, the weight generator is learned along with other modules within XtarNet during meta-training.  ... 
arXiv:2003.08561v2 fatcat:4eup33cncfdivae37x55wvtk6q
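
The snippets only name the task-adaptive representation (TAR) concept, so the fragment below is merely a schematic of feature mixing under assumed shapes, not XtarNet's architecture: a pretrained backbone feature is blended with a task-conditioned feature using gates that a meta-trained module would produce per episode.

    import numpy as np

    def task_adaptive_feature(base_feat, task_feat, gate):
        # base_feat, task_feat: (d,) features from the pretrained backbone and a
        # task-conditioned extractor; gate: (d,) mixing coefficients in [0, 1]
        # (assumed to come from a meta-trained module, which is omitted here).
        return gate * base_feat + (1.0 - gate) * task_feat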

Exploiting the Short-term to Long-term Plasticity Transition in Memristive Nanodevice Learning Architectures [article]

Christopher H. Bennett, Selina La Barbera, Adrien F. Vincent, Fabien Alibart, Damien Querlioz
2016 arXiv   pre-print
Here, we explore the integration of electrochemical metallization cell (ECM) nanodevices with tunable filamentary switching in nanoscale learning systems.  ...  To overcome these limitations, a dual-crossbar learning system partly inspired by the extreme learning machine (ELM) approach is then introduced.  ...  Fig. 7. Classification rate as a function of hidden layer size M in different conditions: random weights on the first layer (Random Weights ELM); imprinting on the first layer, with no variability (Imprinting  ...
arXiv:1606.08366v1 fatcat:2bik4swcjrbrlmzcfrjtaakr7m
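
For context on the "Random Weights ELM" baseline named in the figure caption, a minimal extreme learning machine is just a fixed random first layer followed by a least-squares readout, as in the sketch below; the memristive (ECM) device behaviour and the imprinting variant are not modelled.

    import numpy as np

    def elm_train(X, Y, hidden_size, rng):
        # X: (N, d) inputs; Y: (N, k) one-hot targets. The first layer is fixed
        # random; only the linear readout is fit, here by least squares.
        W1 = rng.normal(size=(X.shape[1], hidden_size))
        H = np.tanh(X @ W1)
        W2 = np.linalg.lstsq(H, Y, rcond=None)[0]
        return W1, W2

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))
    Y = np.eye(2)[(X[:, 0] > 0).astype(int)]        # toy two-class targets
    W1, W2 = elm_train(X, Y, hidden_size=50, rng=rng)
    accuracy = ((np.tanh(X @ W1) @ W2).argmax(1) == Y.argmax(1)).mean()
    print(accuracy)                                  # training accuracy of the toy ELM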

FLAT: Few-Shot Learning via Autoencoding Transformation Regularizers [article]

Haohang Xu, Hongkai Xiong, Guojun Qi
2019 arXiv   pre-print
Most existing few-shot learning models attempt to address this challenge by either learning the meta-knowledge from multiple simulated tasks on the base categories, or resorting to data augmentation  ...  One of the most significant challenges facing a few-shot learning task is the generalizability of the (meta-)model from the base to the novel categories.  ...  better) performances. Weight imprinting directly imprints the prediction weights of novel categories with their mean representations on top of the neural networks pre-trained with labeled examples of  ...
arXiv:1912.12674v1 fatcat:tl5ri2xtczfinoccznfgvjuk2q

Speciation by perception

Anders Brodin, Fredrik Haas
2006 Animal Behaviour  
We then mimicked the sexual imprinting process by training artificial neural networks to separate their own type from the other two.  ...  The network learned pure phenotypes faster and better than the hybrid patterns, showing that already at the receptor level there may be signal reception properties that will make speciation under sympatric  ...  Acknowledgments. We thank Lars Hård and Mattias Ohlsson for helping us with the neural networks. A.B. was financed with a grant from the Swedish Research Council, VR.  ...
doi:10.1016/j.anbehav.2005.10.011 fatcat:cjdj5gcurbhkzgsnoekbvza4ky

Coevolution of learning and data-acquisition mechanisms: a model for cognitive evolution

A. Lotem, J. Y. Halpern
2012 Philosophical Transactions of the Royal Society of London. Biological Sciences  
Using basic examples from imprinting and associative learning, we argue that by coevolving to handle the natural distribution of data in the animal's environment, learning and data-acquisition mechanisms  ...  We conclude with some predictions and suggested directions for experimental and theoretical work.  ...  We start with basic examples from imprinting and associative learning, and continue with more challenging tasks of learning patterns in time and space (which is needed, for example, in language acquisition  ... 
doi:10.1098/rstb.2012.0213 pmid:22927567 pmcid:PMC3427549 fatcat:wikacvsgavgqbji7r53y7zcqhy

Object Recognition and Sensitive Periods: A Computational Analysis of Visual Imprinting

Randall C. O'Reilly, Mark H. Johnson
1994 Neural Computation  
Hysteresis, when combined with a simple Hebbian covariance learning mechanism, has been shown in this and earlier work (Földiák 1991; O'Reilly and McClelland 1992) to produce translation-invariant visual  ...  network model of translation-invariant object recognition that incorporates features of the neural circuitry of IMHV, and exhibits behavior qualitatively similar to a range of findings in the filial imprinting  ...  Note Added in Proof. Another neural network model on imprinting in the chick has recently come to our attention (Bateson & Horn, in press, Animal Behavior).  ...
doi:10.1162/neco.1994.6.3.357 fatcat:pn7b6mvh3zgsvghlh76jrvmjpi
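
A Hebbian covariance update of the kind referred to in the first sentence can be written in one line, as sketched below (the hysteresis dynamics and the IMHV-inspired circuitry of the model are not reproduced): the weight change is proportional to the product of pre- and postsynaptic deviations from their running means.

    import numpy as np

    def hebbian_covariance_update(w, x, y, x_mean, y_mean, lr=0.01):
        # w: (d,) weights of one unit; x: (d,) presynaptic activity; y: scalar
        # postsynaptic activity. Weights grow when pre and post are jointly above
        # (or jointly below) their running means, and shrink otherwise.
        return w + lr * (x - x_mean) * (y - y_mean)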

DocFace+: ID Document to Selfie Matching [article]

Yichun Shi, Anil K. Jain
2018 arXiv   pre-print
Next, a pair of sibling networks with partially shared parameters are trained to learn a unified face representation with domain-specific parameters.  ...  To overcome this shortcoming, we propose a method, called dynamic weight imprinting (DWI), to update the classifier weights, which allows faster convergence and more generalizable representations.  ...  We compare our dynamic weight imprinting method with static imprinting methods.  ... 
arXiv:1809.05620v2 fatcat:4ke7w4g4cjhypfqun4rqkxw7iu
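
The abstract does not spell out the dynamic weight imprinting (DWI) update, so the sketch below is only a guess at the general mechanism (refreshing a class's weight from the normalized embeddings of its samples in the current mini-batch, with an assumed momentum term), not the authors' exact rule.

    import numpy as np

    def dynamic_weight_imprint(W, feats, labels, momentum=0.9):
        # W: (num_classes, d) classifier weights; feats: (B, d) batch embeddings;
        # labels: (B,) class ids. Only classes present in the batch are updated.
        F = feats / np.linalg.norm(feats, axis=1, keepdims=True)
        for c in np.unique(labels):
            proto = F[labels == c].mean(axis=0)
            W[c] = momentum * W[c] + (1.0 - momentum) * proto
            W[c] /= np.linalg.norm(W[c])
        return W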

Page 52 of Psychological Abstracts Vol. 64, Issue 1 [page]

1980 Psychological Abstracts  
(Agnews State Hosp, Behavior Adjustment Program, San Jose, CA) Learning performance varies with brain weight in heterogeneous mouse lines.  ...  Partialing out differences in operant level and body weight had little effect on the magnitude of the correlations between brain weight and learning performance.  ...
Showing results 1 — 15 out of 31,616 results