
Diversity Transfer Network for Few-Shot Learning

Mengting Chen, Yuxin Fang, Xinggang Wang, Heng Luo, Yifeng Geng, Xinyu Zhang, Chang Huang, Wenyu Liu, Bo Wang
2020 Proceedings of the AAAI Conference on Artificial Intelligence
Few-shot learning is a challenging task that aims at training a classifier for unseen classes with only a few training examples.  ...  To alleviate this problem, we propose a novel generative framework, Diversity Transfer Network (DTN), that learns to transfer latent diversities from known categories and composite them with support features  ...  Conclusion and Future Work In this work, we propose a novel generative model, Diversity Transfer Network (DTN), for few-shot image recognition.  ... 
doi:10.1609/aaai.v34i07.6628 fatcat:hsngwfqqavhalag734wslejhhe
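
A minimal sketch of the diversity-transfer idea described in the snippet above, assuming features are plain vectors and that intra-class diversity can be proxied by pairwise feature differences within a base class (the paper's actual generator network is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def diversity_offsets(base_features, n_offsets=64):
    """Sample intra-class variation from a known (base) class as pairwise
    feature differences; used here as a stand-in for 'latent diversity'."""
    idx_a = rng.integers(0, len(base_features), size=n_offsets)
    idx_b = rng.integers(0, len(base_features), size=n_offsets)
    return base_features[idx_a] - base_features[idx_b]

def augment_support(support_feature, offsets, scale=0.5):
    """Composite one novel-class support feature with transferred offsets
    to synthesize additional training features for that class."""
    return support_feature[None, :] + scale * offsets

# Toy usage: 200 base-class features and a single 64-d support feature.
base = rng.normal(size=(200, 64))
support = rng.normal(size=(64,))
synthetic = augment_support(support, diversity_offsets(base))
print(synthetic.shape)  # (64, 64): 64 synthetic features for the novel class
```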

Cross Modal Few-Shot Contextual Transfer for Heterogenous Image Classification

Zhikui Chen, Xu Zhang, Wei Huang, Jing Gao, Suhua Zhang
2021 Frontiers in Neurorobotics  
To alleviate the difficulty, we propose a cross-modal few-shot contextual transfer method that leverages the contextual information as a supplement and learns context awareness transfer in few-shot image  ...  However, when it comes to few-shot learning scenarios, due to the low diversity of several known training samples, they are prone to be dominated by specificity, thus leading to one-sided local features  ...  Deep Transfer Learning: In augmentation-based few-shot learning, augmentation is an intuitive way to alleviate the lack of training samples and data diversity.  ... 
doi:10.3389/fnbot.2021.654519 pmid:34108871 pmcid:PMC8180855 fatcat:ue5u75pc6nf7hgpkbj6zjry4ie

A Broader Study of Cross-Domain Few-Shot Learning [article]

Yunhui Guo, Noel C. Codella, Leonid Karlinsky, James V. Codella, John R. Smith, Kate Saenko, Tajana Rosing, Rogerio Feris
2020 arXiv   pre-print
Extensive experiments on the proposed benchmark are performed to evaluate state-of-the-art meta-learning approaches, transfer learning approaches, and newer methods for cross-domain few-shot learning.  ...  Recent progress on few-shot learning largely relies on annotated data for meta-learning: base classes sampled from the same domain as the novel classes.  ...  Another line of work for few-shot learning uses a broader variety of classifiers for transfer learning.  ... 
arXiv:1912.07200v3 fatcat:nn5ls5ww3belvkc46jnhjwamyu

Partial Is Better Than All: Revisiting Fine-tuning Strategy for Few-shot Learning [article]

Zhiqiang Shen and Zechun Liu and Jie Qin and Marios Savvides and Kwang-Ting Cheng
2021 arXiv   pre-print
to novel data, i.e., learning to transfer in the few-shot scenario) or meta-learning.  ...  The goal of few-shot learning is to learn a classifier that can recognize unseen classes from limited support data with labels.  ...  In this work, we conduct the evolutionary search in transfer learning for few-shot classification.  ... 
arXiv:2102.03983v1 fatcat:u37noahnpze4jm6ba4pcqihdyu
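
A generic PyTorch sketch of the partial fine-tuning strategy named in the title: freeze most of a pre-trained backbone and update only the last stage plus a new classifier head. The choice of torchvision's resnet18 and of layer4 as the unfrozen block is an illustrative assumption, not the layer subset found by the paper's evolutionary search.

```python
import torch.nn as nn
from torch.optim import SGD
from torchvision.models import resnet18

def build_partial_finetune_model(num_novel_classes: int):
    model = resnet18(weights="IMAGENET1K_V1")   # pre-trained backbone
    for p in model.parameters():                # freeze everything first
        p.requires_grad = False
    for p in model.layer4.parameters():         # unfreeze only the last stage
        p.requires_grad = True
    model.fc = nn.Linear(model.fc.in_features, num_novel_classes)  # new trainable head
    return model

model = build_partial_finetune_model(num_novel_classes=5)
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = SGD(trainable, lr=0.01, momentum=0.9, weight_decay=1e-4)
```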

Research Progress on Few-Shot Learning for Remote Sensing Image Interpretation

Xian Sun, Bing Wang, Zhirui Wang, Hao Li, Hengchao Li, Kun Fu
2021 IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing  
Index Terms: Deep generative model, few-shot learning, meta-learning, metric learning, remote sensing, transfer learning.  ...  This article gives a reference for scholars working on few-shot learning research in the remote sensing field.  ...  [63] used Siamese neural networks [64] as the feature extractor for few-shot learning.  ... 
doi:10.1109/jstars.2021.3052869 fatcat:ldos3sx6mvaapjkgsua73l7tve

Knowledge Guided Metric Learning for Few-Shot Text Classification [article]

Dianbo Sui, Yubo Chen, Binjie Mao, Delai Qiu, Kang Liu, Jun Zhao
2020 arXiv   pre-print
Inspired by human intelligence, we propose to introduce external knowledge into few-shot learning to imitate human knowledge.  ...  Through experiments, we demonstrate that our method outperforms the state-of-the-art few-shot text classification models.  ...  into few-shot learning. A novel parameter generator network based on external knowledge is proposed to generate diverse metrics for diverse tasks. The model yields promising results on the ARSC  ... 
arXiv:2004.01907v1 fatcat:zs3hfrvx3vhqtaj66tf4f7t46u
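
The contribution list mentions a parameter generator that turns external knowledge into task-specific metrics. A toy PyTorch sketch of that general idea, where the knowledge embedding, the diagonal (per-dimension) metric, and all dimensions are illustrative assumptions rather than the paper's architecture:

```python
import torch
import torch.nn as nn

class MetricGenerator(nn.Module):
    """Maps an external-knowledge embedding to a per-dimension weight
    vector, i.e. a diagonal metric used for prototype matching."""
    def __init__(self, knowledge_dim=32, feature_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(knowledge_dim, 128), nn.ReLU(),
            nn.Linear(128, feature_dim), nn.Softplus(),  # keep weights positive
        )

    def forward(self, knowledge_emb):
        return self.net(knowledge_emb)

def weighted_sq_distance(query, prototypes, weights):
    # query: (Q, D), prototypes: (C, D), weights: (D,) -> distances (Q, C)
    diff = query[:, None, :] - prototypes[None, :, :]
    return (weights * diff.pow(2)).sum(dim=-1)

gen = MetricGenerator()
knowledge = torch.randn(32)            # stand-in for a class-knowledge embedding
w = gen(knowledge)                     # task-specific metric parameters
queries, protos = torch.randn(10, 64), torch.randn(5, 64)
logits = -weighted_sq_distance(queries, protos, w)  # nearer prototype -> larger logit
```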

Diverse Few-Shot Text Classification with Multiple Metrics [article]

Mo Yu, Xiaoxiao Guo, Jinfeng Yi, Shiyu Chang, Saloni Potdar, Yu Cheng, Gerald Tesauro, Haoyu Wang, Bowen Zhou
2018 arXiv   pre-print
We study few-shot learning in natural language domains.  ...  seen few-shot task.  ...  Conclusion: We propose a few-shot learning approach for diverse tasks based on task clustering.  ... 
arXiv:1805.07513v1 fatcat:74owgnvspzeuhiljvjuppucv6u

Impact of base dataset design on few-shot image classification [article]

Othman Sbai, Camille Couprie, Mathieu Aubry
2020 arXiv   pre-print
In this paper, we systematically study the effect of variations in the training data by evaluating deep features trained on different image sets in a few-shot classification setting.  ...  We also show how the base dataset design can improve performance in few-shot classification more drastically than replacing a simple baseline by an advanced state of the art algorithm.  ...  We thank Maxime Oquab, Diane Bouchacourt and Alexei Efros for helpful discussions and feedback.  ... 
arXiv:2007.08872v1 fatcat:ogyqtqnjrzcjzi3zcucwh3vnva

Grad2Task: Improved Few-shot Text Classification Using Gradients for Task Representation [article]

Jixuan Wang, Kuan-Chieh Wang, Frank Rudzicz, Michael Brudno
2022 arXiv   pre-print
In this work, we propose a novel conditional neural process-based approach for few-shot text classification that learns to transfer from other diverse tasks with rich annotation.  ...  Experimental results show that our approach outperforms traditional fine-tuning, sequential transfer learning, and state-of-the-art meta-learning approaches on a collection of diverse few-shot tasks.  ...  Acknowledgments and Disclosure of Funding We would like to thank reviewers and ACs for constructive feedback and discussion.  ... 
arXiv:2201.11576v1 fatcat:jtkgzlzk2jfpdkvsvwtrubm4ve

Few-Shot Learning with Intra-Class Knowledge Transfer [article]

Vivek Roy, Yan Xu, Yu-Xiong Wang, Kris Kitani, Ruslan Salakhutdinov, Martial Hebert
2020 arXiv   pre-print
However, due to the limited number of the few-shot seeds, the generated samples usually have small diversity, making it difficult to train a discriminative classifier for the few-shot classes.  ...  Second, superclasses are clustered, and the statistical mean and feature variance of each superclass are used as transferable knowledge inherited by the children few-shot classes.  ...  However, instead of using only a few samples for data augmentation, we propose also to use the transferred knowledge from many-shot classes to generate more diverse samples for augmenting the sparse few-shot  ... 
arXiv:2008.09892v1 fatcat:tmmlddvj5vfsxcvbtihdxwmf4i
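
A minimal NumPy sketch of the statistic-transfer recipe described in the snippet: center synthetic features on the few-shot class while borrowing variance from its superclass. The Gaussian sampling and feature dimensions are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_from_superclass_stats(few_shot_features, superclass_features, n_samples=100):
    """Center synthetic features on the few-shot class mean, but borrow the
    per-dimension variance from the richer (many-shot) superclass."""
    mean = few_shot_features.mean(axis=0)      # estimated from the few labeled shots
    var = superclass_features.var(axis=0)      # transferred knowledge
    return rng.normal(loc=mean, scale=np.sqrt(var), size=(n_samples, mean.shape[0]))

# Toy usage: 5 shots of a novel class, 500 features from its superclass cluster.
few_shot = rng.normal(size=(5, 64))
superclass = rng.normal(size=(500, 64))
synthetic = generate_from_superclass_stats(few_shot, superclass)
print(synthetic.shape)  # (100, 64) synthetic features to augment the few-shot class
```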

The Curse of Zero Task Diversity: On the Failure of Transfer Learning to Outperform MAML and their Empirical Equivalence [article]

Brando Miranda, Yu-Xiong Wang, Sanmi Koyejo
2022 arXiv   pre-print
We hypothesize that the diversity coefficient of the few-shot learning benchmark is predictive of whether meta-learning solutions will succeed or not.  ...  Recently, it has been observed that a transfer learning solution might be all we need to solve many few-shot learning benchmarks -- thus raising important questions about when and how meta-learning algorithms  ...  The main difference between their work and ours is that they focus their analysis mainly on transfer learning, while we concentrated on meta-learning for few-shot learning.  ... 
arXiv:2112.13121v3 fatcat:bcagllszwbgwbh6oe3ciz3h4ui

[Re] Zero-Shot Knowledge Transfer via Adversarial Belief Matching

Alexandros Ferles, Alexander Nöu, Leonidas Valavanis
2020 Zenodo  
We reproduce the work in Zero-shot Knowledge Transfer via Adversarial Belief Matching, which describes a novel approach for knowledge transfer.  ...  We compare the results of the proposed method with a few-shot knowledge distillation attention transfer setting implemented and trained from scratch.  ...  Wide ResNet 16-1 few-shot training on SVHN with no assistance from a teacher network  ...  Few-Shot Knowledge Distillation with Attention Transfer (KD-AT)  ... 
doi:10.5281/zenodo.3818623 fatcat:5kpl7orh5zc2jkwfyq25haofle
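
The baseline named in the snippet, few-shot knowledge distillation with attention transfer (KD-AT), matches spatial attention maps between teacher and student activations. A short PyTorch sketch of the commonly used attention-transfer loss (the layer pairing and tensor shapes are illustrative, and this is not necessarily the reproduction's exact code):

```python
import torch
import torch.nn.functional as F

def attention_map(activations):
    # activations: (B, C, H, W) -> channel-summed squared activations, L2-normalized
    att = activations.pow(2).sum(dim=1).flatten(1)   # (B, H*W)
    return F.normalize(att, dim=1)

def attention_transfer_loss(student_acts, teacher_acts):
    """Mean squared difference between normalized attention maps,
    summed over matched teacher/student layers."""
    loss = 0.0
    for s, t in zip(student_acts, teacher_acts):
        loss = loss + (attention_map(s) - attention_map(t)).pow(2).mean()
    return loss

# Toy usage with one matched layer; channel counts may differ, spatial sizes must match.
s = [torch.randn(4, 16, 8, 8)]
t = [torch.randn(4, 64, 8, 8)]
print(attention_transfer_loss(s, t))
```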

ProtAugment: Unsupervised diverse short-texts paraphrasing for intent detection meta-learning [article]

Thomas Dopierre, Christophe Gravier, Wilfried Logerais
2021 arXiv   pre-print
Recent research considers few-shot intent detection as a meta-learning problem: the model is learning to learn from a consecutive set of small tasks named episodes.  ...  ProtAugment is a novel extension of Prototypical Networks that limits overfitting on the bias introduced by the few-shot classification objective at each episode.  ...  Acknowledgments We are thankful for the discussion we had with Michele Bevilacqua, Marco Maru, and Roberto Navigli from Sapienza university about diversity in Natural Language Generation.  ... 
arXiv:2105.12995v1 fatcat:yrverod7uzgldpq6vzqvw2as6y
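
ProtAugment is described as an extension of Prototypical Networks; the episode-level computation it builds on is the standard prototypical classifier (class prototype = mean support embedding, classification by distance). A PyTorch sketch of that core, with placeholder embeddings and without ProtAugment's paraphrase-based augmentation:

```python
import torch
import torch.nn.functional as F

def prototypical_logits(support_emb, support_labels, query_emb, n_classes):
    """support_emb: (N*K, D), query_emb: (Q, D).
    Each class prototype is the mean of its support embeddings; queries are
    scored by negative squared Euclidean distance to the prototypes."""
    prototypes = torch.stack([
        support_emb[support_labels == c].mean(dim=0) for c in range(n_classes)
    ])                                                  # (C, D)
    dists = torch.cdist(query_emb, prototypes).pow(2)   # (Q, C)
    return -dists                                       # softmax over this gives class probs

# Toy 5-way 2-shot episode with 64-d embeddings from some text encoder.
support = torch.randn(10, 64)
labels = torch.arange(5).repeat_interleave(2)
queries = torch.randn(7, 64)
logits = prototypical_logits(support, labels, queries, n_classes=5)
loss = F.cross_entropy(logits, torch.randint(0, 5, (7,)))
```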

Dataset Bias in Few-shot Image Recognition [article]

Shuqiang Jiang, Yaohui Zhu, Chenlong Liu, Xinhang Song, Xiangyang Li, Weiqing Min
2021 arXiv   pre-print
Second, we investigate performance differences on different datasets from the perspectives of dataset structures and different few-shot learning methods.  ...  We use these quantitative characteristics and four few-shot learning methods to analyze performance differences on five different datasets.  ...  In this case, a simple metric is enough for effective few-shot learning.  ... 
arXiv:2008.07960v3 fatcat:pzzsu5pdvnaqnlqnb2xabx7dki

Few-shot Image Classification: Just Use a Library of Pre-trained Feature Extractors and a Simple Classifier [article]

Arkabandhu Chowdhury, Mingchao Jiang, Swarat Chaudhuri, Chris Jermaine
2021 arXiv   pre-print
Recent papers have suggested that transfer learning can outperform sophisticated meta-learning methods for few-shot image classification.  ...  We show experimentally that a library of pre-trained feature extractors combined with a simple feed-forward network learned with an L2-regularizer can be an excellent option for solving cross-domain few-shot  ...  an entire library of deep CNNs for few-shot learning.  ... 
arXiv:2101.00562v3 fatcat:mhga4gxjmfg3zb5atw67wnakny
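
A generic sketch of the library-of-extractors recipe described above: concatenate features from several frozen pre-trained backbones and fit a simple L2-regularized classifier on the support set. The specific torchvision backbones and the scikit-learn logistic-regression head are assumptions for illustration (the paper trains a small feed-forward network with an L2 regularizer).

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18, densenet121
from sklearn.linear_model import LogisticRegression

def build_library():
    """A tiny 'library' of frozen, pre-trained feature extractors."""
    r = resnet18(weights="IMAGENET1K_V1"); r.fc = nn.Identity()
    d = densenet121(weights="IMAGENET1K_V1"); d.classifier = nn.Identity()
    for m in (r, d):
        m.eval()
    return [r, d]

@torch.no_grad()
def extract_concat_features(library, images):
    # images: (B, 3, 224, 224), already normalized for ImageNet models
    return torch.cat([m(images) for m in library], dim=1).numpy()

library = build_library()
support_images = torch.randn(10, 3, 224, 224)      # toy 5-way 2-shot support set
support_labels = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
feats = extract_concat_features(library, support_images)
clf = LogisticRegression(C=1.0, max_iter=1000).fit(feats, support_labels)  # L2 by default
```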
Showing results 1–15 of 31,293