7,689 Hits in 3.9 sec

Learning to Self-Train for Semi-Supervised Few-Shot Classification [article]

Xinzhe Li, Qianru Sun, Yaoyao Liu, Shibao Zheng, Qin Zhou, Tat-Seng Chua, Bernt Schiele
2019 arXiv   pre-print
To this end, we train the LST model through a large number of semi-supervised few-shot tasks.  ...  We evaluate our LST method on two ImageNet benchmarks for semi-supervised few-shot classification and achieve large improvements over the state-of-the-art method.  ...  Conclusions We propose a novel LST approach for semi-supervised few-shot classification.  ... 
arXiv:1906.00562v2 fatcat:frdizk7xmveo7knwi4oz5f7zku
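The self-training loop at the heart of LST — pseudo-label the unlabeled examples, keep the confident ones, re-train — is the generic recipe shared by several entries below. A minimal sketch with a toy 1-NN base learner (the `nn_predict`/`self_train` helpers and the distance-based confidence heuristic are illustrative, not taken from the paper):

```python
import math

def nn_predict(labeled, x):
    """1-NN prediction with a distance-based confidence score."""
    dists = sorted((math.dist(f, x), y) for f, y in labeled)
    d, y = dists[0]
    conf = 1.0 / (1.0 + d)  # closer nearest neighbor -> higher confidence
    return y, conf

def self_train(labeled, unlabeled, threshold=0.5, rounds=3):
    """Iteratively add high-confidence pseudo-labeled points to the labeled set."""
    labeled = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):
        keep = []
        for x in pool:
            y, conf = nn_predict(labeled, x)
            if conf >= threshold:
                labeled.append((x, y))  # accept the pseudo-label
            else:
                keep.append(x)          # defer to a later round
        pool = keep
    return labeled
```

LST additionally meta-learns how to weight the pseudo-labels; the fixed threshold above is the simplest stand-in for that step.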

Task-Adaptive Clustering for Semi-Supervised Few-Shot Classification [article]

Jun Seo, Sung Whan Yoon, Jaekyun Moon
2020 arXiv   pre-print
Few-shot learning aims to handle previously unseen tasks using only a small amount of new training data.  ...  In this work, we propose a few-shot learner that can work well under the semi-supervised setting where a large portion of training data is unlabeled.  ...  Task-Adaptive Clustering for Semi-Supervised Few-Shot Learning Problem Definition for Semi-Supervised Few-Shot Learning Episodic training, often employed in meta-training for few-shot learners, feeds  ... 
arXiv:2003.08221v1 fatcat:agdkvkl6ljc65lids52coeps44
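The episodic training this snippet refers to samples a small N-way K-shot classification task (a support set plus a query set) at every meta-training step. A hedged sketch of such an episode sampler (`sample_episode` is an illustrative helper, not the authors' code):

```python
import random

def sample_episode(data_by_class, n_way=5, k_shot=1, q_queries=2, rng=None):
    """Sample one N-way K-shot episode: a support set and a query set."""
    rng = rng or random.Random()
    classes = rng.sample(sorted(data_by_class), n_way)
    support, query = [], []
    for label in classes:
        examples = rng.sample(data_by_class[label], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query
```

In the semi-supervised variant, each episode additionally carries a pool of unlabeled examples drawn from the same (or distractor) classes.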

Revisiting Self-Training for Few-Shot Learning of Language Model [article]

Yiming Chen, Yan Zhang, Chen Zhang, Grandee Lee, Ran Cheng, Haizhou Li
2021 arXiv   pre-print
In this work, we revisit the self-training technique for language model fine-tuning and present a state-of-the-art prompt-based few-shot learner, SFLM.  ...  As unlabeled data carry rich task-relevant information, they are proven useful for few-shot learning of language model. The question is how to effectively make use of such data.  ...  Acknowledgements We would like to thank all the anonymous reviewers for their constructive comments. This work is partly supported by Human-Robot Interaction Phase 1 (Grant No. 19225  ... 
arXiv:2110.01256v1 fatcat:eey6tycymbglnccoxir4q3zyaq

Semi Supervised Learning For Few-shot Audio Classification By Episodic Triplet Mining [article]

Swapnil Bhosale, Rupayan Chakraborty, Sunil Kumar Kopparapu
2021 arXiv   pre-print
Few-shot learning aims to generalize to unseen classes that appear during testing but are unavailable during training.  ...  We incorporate episodic training for mining the semi-hard positive and semi-hard negative triplets to overcome overfitting.  ...  For the semi-supervised setup, we found that pre-training the model for a few initial episodes (50 episodes for both speaker recognition and audio event classification) in a completely supervised setup  ... 
arXiv:2102.08074v1 fatcat:zg5tzjuvgvbxnfdp2tao4k2ocy
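Semi-hard triplet mining, as named in the snippet above, selects triplets whose negative is farther from the anchor than the positive but still inside the margin. A small brute-force sketch (the `semi_hard_triplets` helper and margin value are illustrative, not the paper's implementation):

```python
import math

def semi_hard_triplets(embeddings, labels, margin=1.0):
    """Mine (anchor, positive, negative) index triplets where the negative
    is farther than the positive but still within the margin."""
    triplets = []
    n = len(embeddings)
    for a in range(n):
        for p in range(n):
            if p == a or labels[p] != labels[a]:
                continue
            d_ap = math.dist(embeddings[a], embeddings[p])
            for neg in range(n):
                if labels[neg] == labels[a]:
                    continue
                d_an = math.dist(embeddings[a], embeddings[neg])
                if d_ap < d_an < d_ap + margin:  # semi-hard condition
                    triplets.append((a, p, neg))
    return triplets
```

These triplets are neither trivially satisfied nor so hard that the loss collapses, which is why they stabilize metric-learning training.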

Few-shot Learning and Self Training for eNodeB Log Analysis for Service Level Assurance in LTE Networks

Shogo Aoki, Kohei Shiomoto, Chin Lam Eng
2020 IEEE Transactions on Network and Service Management  
To mitigate the cost and time issues, we propose a method based on few-shot learning that uses the Prototypical Networks algorithm to complement the eNodeB state analysis.  ...  However, supervised learning requires a large labeled dataset, which is costly in human labor and annotation time.  ...  ACKNOWLEDGMENT The authors would like to thank Atsushi Morohoshi and Sebastian Backstad for useful discussion.  ... 
doi:10.1109/tnsm.2020.3032156 fatcat:23app2tzpjdc5df3znhd2m6xqe
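The Prototypical Networks algorithm used here reduces classification to nearest-prototype matching: each class prototype is the mean of its support embeddings, and a query gets the label of the closest prototype. A minimal sketch (the `normal`/`fault` labels below are hypothetical, not from the paper):

```python
import math
from collections import defaultdict

def prototypes(support):
    """Class prototype = the mean of that class's support embeddings."""
    by_class = defaultdict(list)
    for x, y in support:
        by_class[y].append(x)
    return {y: tuple(sum(d) / len(xs) for d in zip(*xs))
            for y, xs in by_class.items()}

def classify(protos, x):
    """Assign a query to the nearest prototype (Euclidean distance)."""
    return min(protos, key=lambda y: math.dist(protos[y], x))
```

In the paper's setting the embeddings would come from a network trained on eNodeB log features; here they are plain 2-D points for illustration.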

Unsupervised Embedding Adaptation via Early-Stage Feature Reconstruction for Few-Shot Classification [article]

Dong Hoon Lee, Sae-Young Chung
2021 arXiv   pre-print
We propose unsupervised embedding adaptation for the downstream few-shot classification task.  ...  Based on findings that deep neural networks learn to generalize before memorizing, we develop Early-Stage Feature Reconstruction (ESFR) -- a novel adaptation scheme with feature reconstruction and dimensionality-driven  ...  These methods mostly originate from semi-supervised learning studies; approaches in semi-supervised learning still motivate few-shot classification research.  ... 
arXiv:2106.11486v1 fatcat:77c2zl5ujrb4ro37w7cg4n6zsa

Meta-Learning for Semi-Supervised Few-Shot Classification [article]

Mengye Ren, Eleni Triantafillou, Sachin Ravi, Jake Snell, Kevin Swersky, Joshua B. Tenenbaum, Hugo Larochelle, Richard S. Zemel
2018 arXiv   pre-print
Recent progress in few-shot classification has featured meta-learning, in which a parameterized model for a learning algorithm is defined and trained on episodes representing different classification problems  ...  In few-shot classification, we are interested in learning algorithms that train a classifier from only a handful of labeled examples.  ...  Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright annotation thereon.  ... 
arXiv:1803.00676v1 fatcat:xggntzs3orb6rewe2yhbilelfi
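Ren et al. extend Prototypical Networks so that unlabeled examples refine the class prototypes via soft k-means: each unlabeled point is softly assigned to every class by a softmax over its negative squared distances, and prototypes are recomputed as weighted means. A sketch of one refinement step on one-dimensional embeddings, treating each original prototype as a single support point (a simplification of the paper's update):

```python
import math

def refine_prototypes(protos, unlabeled, steps=1):
    """One soft k-means step: softly assign each unlabeled point to the
    classes, then recompute prototypes as weighted means."""
    labels = sorted(protos)
    for _ in range(steps):
        weights = []
        for x in unlabeled:
            logits = [-(x - protos[y]) ** 2 for y in labels]
            m = max(logits)
            exp = [math.exp(l - m) for l in logits]  # stable softmax
            z = sum(exp)
            weights.append([e / z for e in exp])
        protos = {
            y: (protos[y] + sum(w[i] * x for x, w in zip(unlabeled, weights)))
               / (1 + sum(w[i] for w in weights))
            for i, y in enumerate(labels)
        }
    return protos
```

The paper also studies variants that down-weight distractor classes; the plain version above omits that.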

Building One-Shot Semi-supervised (BOSS) Learning up to Fully Supervised Performance [article]

Leslie N. Smith, Adam Conovaloff
2021 arXiv   pre-print
We demonstrate for the first time the potential of building one-shot semi-supervised (BOSS) learning on Cifar-10 and SVHN up to test accuracies that are comparable to fully supervised learning.  ...  In addition, we demonstrate that class balancing methods substantially improve accuracy in semi-supervised learning, to levels that allow self-training to reach the level of fully supervised learning  ...  We also express our deep appreciation to Nicholas Carlini and David Berthelot for discussions of ReMixMatch and FixMatch that facilitated our use of their methods and code.  ... 
arXiv:2006.09363v2 fatcat:pv56rnhimrejrgtofxopz7mbne

Charting the Right Manifold: Manifold Mixup for Few-shot Learning [article]

Puneet Mangla, Mayank Singh, Abhishek Sinha, Nupur Kumari, Vineeth N Balasubramanian, Balaji Krishnamurthy
2020 arXiv   pre-print
This work investigates the role of learning a relevant feature manifold for few-shot tasks using self-supervision and regularization techniques.  ...  Few-shot learning algorithms aim to learn model parameters capable of adapting to unseen classes with the help of only a few labeled examples.  ...  of few-shot tasks. • We show that adding a self-supervision loss to the training procedure enables robust semantic feature learning that leads to a significant improvement in few-shot classification.  ... 
arXiv:1907.12087v4 fatcat:a5v3zqnvyva3zar62g6as44u3a

TransMatch: A Transfer-Learning Scheme for Semi-Supervised Few-Shot Learning [article]

Zhongjie Yu, Lin Chen, Zhongwei Cheng, Jiebo Luo
2020 arXiv   pre-print
In this paper, we propose a new transfer-learning framework for semi-supervised few-shot learning to fully utilize the auxiliary information from labeled base-class data and unlabeled novel-class data.  ...  Under the proposed framework, we develop a novel method for semi-supervised few-shot learning called TransMatch by instantiating the three components with Imprinting and MixMatch.  ...  Figure 1. An overview of the meta-learning-based semi-supervised few-shot classification framework.  ... 
arXiv:1912.09033v2 fatcat:szsc2nnhzzfstb4zsvwonfarze

Adaptive Self-training for Few-shot Neural Sequence Labeling [article]

Yaqing Wang, Subhabrata Mukherjee, Haoda Chu, Yuancheng Tu, Ming Wu, Jing Gao, Ahmed Hassan Awadallah
2020 arXiv   pre-print
Specifically, we develop self-training and meta-learning techniques for training neural sequence taggers with few labels.  ...  While self-training serves as an effective mechanism to learn from large amounts of unlabeled data, meta-learning helps in adaptive sample re-weighting to mitigate error propagation from noisy pseudo-labels  ...  for semi-supervised learning.  ... 
arXiv:2010.03680v2 fatcat:6ol7eutgjjholpoezv25j6hkh4

Self-training Improves Pre-training for Natural Language Understanding [article]

Jingfei Du, Edouard Grave, Beliz Gunel, Vishrav Chaudhary, Onur Celebi, Michael Auli, Ves Stoyanov, Alexis Conneau
2020 arXiv   pre-print
In this paper, we study self-training as another way to leverage unlabeled data through semi-supervised learning.  ...  Finally, we also show strong gains on knowledge-distillation and few-shot learning.  ...  Experiments show that SentAugment is effective for self-training, knowledge distillation and few-shot learning.  ... 
arXiv:2010.02194v1 fatcat:i4btr6525zb7pe3dd2bfjum3ra

Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data [article]

Ashraful Islam, Chun-Fu Chen, Rameswar Panda, Leonid Karlinsky, Rogerio Feris, Richard J. Radke
2021 arXiv   pre-print
few-shot learning task.  ...  The problem of cross-domain few-shot recognition with unlabeled target data is largely unaddressed in the literature. STARTUP was the first method that tackles this problem using self-training.  ...  MetaOptNet [16] uses a discriminatively trained linear predictor to learn representations for few-shot learning.  ... 
arXiv:2106.07807v3 fatcat:ukg2u2q5rjc6beeyunhrcitiaq

Embedding Propagation: Smoother Manifold for Few-Shot Classification [article]

Pau Rodríguez, Issam Laradji, Alexandre Drouin, Alexandre Lacoste
2020 arXiv   pre-print
Moreover, manifold smoothness is a key factor for semi-supervised learning and transductive learning algorithms.  ...  Few-shot classification is challenging because the data distribution of the training set can be widely different from that of the test set, as their classes are disjoint.  ...  • Achieves state-of-the-art few-shot classification results for the transductive and semi-supervised learning setups.  ... 
arXiv:2003.04151v2 fatcat:of7vvbj2ozhsfenfb5mcukbw34
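Embedding propagation smooths the feature manifold by mixing each embedding with information from its neighbors in a similarity graph before classification. A crude one-step stand-in, using similarity-weighted averaging rather than the paper's Laplacian-based operator (`alpha` and `sigma` are illustrative parameters):

```python
import math

def propagate(embeddings, alpha=0.5, sigma=1.0):
    """One smoothing step: mix each embedding with a similarity-weighted
    average over all points (self-similarity included)."""
    n = len(embeddings)
    out = []
    for i in range(n):
        # Gaussian similarity to every point, including i itself (weight 1)
        weights = [math.exp(-math.dist(embeddings[i], embeddings[j]) ** 2 / sigma)
                   for j in range(n)]
        z = sum(weights)
        avg = tuple(sum(w * e[d] for w, e in zip(weights, embeddings)) / z
                    for d in range(len(embeddings[i])))
        out.append(tuple((1 - alpha) * a + alpha * b
                         for a, b in zip(embeddings[i], avg)))
    return out
```

Nearby points are pulled together while distant points barely move, which is the smoothness property the snippet ties to semi-supervised and transductive performance.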

Informative Pseudo-Labeling for Graph Neural Networks with Few Labels [article]

Yayong Li, Jie Yin, Ling Chen
2022 arXiv   pre-print
Graph Neural Networks (GNNs) have achieved state-of-the-art results for semi-supervised node classification on graphs.  ...  It aims to augment the training set with pseudo-labeled unlabeled nodes with high confidence so as to re-train a supervised model in a self-training cycle.  ...  address semi-supervised node classification with few labels.  ... 
arXiv:2201.07951v1 fatcat:k5lb5bbnlfailn2va7u5w4beha
Showing results 1 — 15 out of 7,689 results