A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is application/pdf.
Induction Networks for Few-Shot Text Classification
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
text classification. ...
In this paper, we propose a novel Induction Network to learn such a generalized class-wise representation, by innovatively leveraging the dynamic routing algorithm in meta-learning. ...
Acknowledgments: The authors would like to thank the organizers of EMNLP-IJCNLP 2019 and the reviewers for their helpful suggestions. This research work is sup- ...
doi:10.18653/v1/d19-1403
dblp:conf/emnlp/GengLLZJS19
fatcat:iscxeis7nndi7klho52wjjfjsy
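The abstract above names dynamic routing as the mechanism for inducing a class-wise representation from a support set. As a rough illustration (not the authors' implementation — the paper also applies a learned transformation to the support vectors, which this sketch omits, and all shapes and names here are illustrative), the routing loop can be written in a few lines of NumPy:

```python
import numpy as np

def squash(v, eps=1e-8):
    # Squashing non-linearity (Sabour et al., 2017): shrinks the vector's
    # norm into [0, 1) while preserving its direction.
    norm = np.linalg.norm(v)
    return (norm ** 2 / (1.0 + norm ** 2)) * (v / (norm + eps))

def induce_class_vector(support, iters=3):
    # support: (k, d) array of encoded support samples for one class.
    # Iteratively routes support vectors toward an induced class vector.
    k = support.shape[0]
    b = np.zeros(k)                                      # routing logits
    for _ in range(iters):
        d = np.exp(b) / np.exp(b).sum()                  # coupling weights (softmax)
        c = squash((d[:, None] * support).sum(axis=0))   # weighted sum, then squash
        b = b + support @ c                              # update logits by agreement
    return c
```

The agreement update (`support @ c`) raises the weight of support samples that align with the current class vector, so outliers in the support set contribute less after a few iterations.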
Induction Networks for Few-Shot Text Classification
[article]
2019
arXiv
pre-print
text classification. ...
Text classification tends to struggle when data is deficient or when it needs to adapt to unseen classes. ...
Acknowledgments: The authors would like to thank the organizers of EMNLP-IJCNLP 2019 and the reviewers for their helpful suggestions. ...
arXiv:1902.10482v2
fatcat:gyahkqiro5ajxaw7jywmy7ozbe
Dynamic Memory Induction Networks for Few-Shot Text Classification
[article]
2020
arXiv
pre-print
This paper proposes Dynamic Memory Induction Networks (DMIN) for few-shot text classification. ...
The model utilizes dynamic routing to provide more flexibility to memory-based few-shot learning in order to better adapt the support sets, which is a critical capacity of few-shot classification models ...
Acknowledgments: The authors would like to thank the organizers of ACL-2020 and the reviewers for their helpful suggestions. ...
arXiv:2005.05727v1
fatcat:rj2btd5jwzeg5ok2tncubypycu
Dynamic Memory Induction Networks for Few-Shot Text Classification
2020
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
unpublished
This paper proposes Dynamic Memory Induction Networks (DMIN) for few-shot text classification. ...
The model utilizes dynamic routing to provide more flexibility to memory-based few-shot learning in order to better adapt the support sets, which is a critical capacity of few-shot classification models ...
Acknowledgments: The authors would like to thank the organizers of ACL-2020 and the reviewers for their helpful suggestions. ...
doi:10.18653/v1/2020.acl-main.102
fatcat:cunw7ivmwvh6tmatxjypmai2za
MGIMN: Multi-Grained Interactive Matching Network for Few-shot Text Classification
[article]
2022
arXiv
pre-print
They also ignore the importance of capturing the inter-dependency between the query and the support set for few-shot text classification. ...
Text classification struggles to generalize to unseen classes with very few labeled text instances per class. ...
In this paper, we propose the Multi-Grained Interactive Matching Network, a backbone network for few-shot text classification. ...
arXiv:2204.04952v3
fatcat:g6zgletaxjgqdbt4g5gutxth24
When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification
[article]
2019
arXiv
pre-print
It can thus be further suggested that pretraining could be a promising solution for few-shot learning of many other NLP tasks. ...
Text classification tends to be difficult when data are deficient or when it is required to adapt to unseen classes. ...
Acknowledgments: We want to express gratitude to the anonymous reviewers for their hard work and kind comments. This work is funded by NSFC 91846204, National Key Research Program 2018YFB1402800, and ...
arXiv:1908.08788v2
fatcat:mfullotkx5ahbaoqcxn2zk66em
When Low Resource NLP Meets Unsupervised Language Model: Meta-Pretraining then Meta-Learning for Few-Shot Text Classification (Student Abstract)
2020
Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
It can thus be further suggested that pretraining could be a promising solution for few-shot learning of many other NLP tasks. ...
Text classification tends to be difficult when data are deficient or when it is required to adapt to unseen classes. ...
Acknowledgments: We want to express gratitude to the anonymous reviewers for their hard work and kind comments. This work is funded by NSFC 91846204, National Key Research Program 2018YFB1402800, and ...
doi:10.1609/aaai.v34i10.7158
fatcat:u75k5swow5gp5jhhs772orchfe
TNT: Text-Conditioned Network with Transductive Inference for Few-Shot Video Classification
[article]
2021
arXiv
pre-print
Recently, few-shot video classification has received increasing interest. ...
Specifically, we formulate a text-based task conditioner to adapt video features to the few-shot learning task. ...
Acknowledgements: This work was supported in part by the Millennium Institute for Foundational Research on Data (IMFD). ...
arXiv:2106.11173v2
fatcat:dgrm6xxutrharom74tcsylfmia
Few-shot Learning for Chinese Legal Controversial Issues Classification
2020
IEEE Access
Two few-shot learning algorithms, Relation Network and Induction Network, are proposed for our controversial issues problem. ...
INDEX TERMS Controversial issues, few-shot learning, text classification, power-law, BERT. ...
In the classification module, we use two few-shot learning algorithms, Relation Network and Induction Network. ...
doi:10.1109/access.2020.2988493
fatcat:cpcujzrcbrg6towarx6wy6i7mu
Few-shot Text Classification with Distributional Signatures
[article]
2020
arXiv
pre-print
In this paper, we explore meta-learning for few-shot text classification. ...
We demonstrate that our model consistently outperforms prototypical networks learned on lexical knowledge (Snell et al., 2017) in both few-shot text classification and relation classification by a significant ...
few-shot text classification. ...
arXiv:1908.06039v3
fatcat:bbddbkpop5gynaloacfxnuib3q
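This entry's abstract compares against prototypical networks (Snell et al., 2017), the standard metric-learning baseline in these papers. For context, the baseline reduces to two steps — average each class's support embeddings into a prototype, then classify queries by nearest prototype — sketched here in NumPy (shapes and names are illustrative, not from any of the cited codebases):

```python
import numpy as np

def build_prototypes(support, labels):
    # support: (n, d) embeddings of support samples; labels: (n,) class ids.
    # Each prototype is the mean embedding of its class's support samples.
    classes = np.unique(labels)
    protos = np.stack([support[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def classify(query, classes, protos):
    # Assign the query to the nearest prototype under squared Euclidean distance.
    dists = ((protos - query) ** 2).sum(axis=1)
    return classes[int(dists.argmin())]
```

The distributional-signatures paper keeps this classifier but changes what gets embedded, which is why the comparison above is against "prototypical networks learned on lexical knowledge."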
Knowledge Guided Metric Learning for Few-Shot Text Classification
[article]
2020
arXiv
pre-print
Through experiments, we demonstrate that our method outperforms the state-of-the-art few-shot text classification models. ...
The training of deep-learning-based text classification models relies heavily on a huge amount of annotation data, which is difficult to obtain. ...
dataset of few-shot text classification. ...
arXiv:2004.01907v1
fatcat:zs3hfrvx3vhqtaj66tf4f7t46u
Meta-Learning Adversarial Domain Adaptation Network for Few-Shot Text Classification
[article]
2021
arXiv
pre-print
Meta-learning has emerged as a trending technique to tackle few-shot text classification and achieved state-of-the-art performance. ...
for new classes. ...
We evaluate our model on four popular datasets for few-shot text classification. ...
arXiv:2107.12262v1
fatcat:ppuu3lgxgvcdbostoayyllr5ai
Low-resource Learning with Knowledge Graphs: A Comprehensive Survey
[article]
2021
arXiv
pre-print
appeared in training, and few-shot learning (FSL) where new classes for prediction have only a small number of labeled samples that are available. ...
), but also tasks for KG curation (e.g., inductive KG completion), and some typical evaluation resources for each task. ...
mapping functions for addressing few-shot text classification. ...
arXiv:2112.10006v3
fatcat:wkz6gjx4r5gvlhh673p3rqsmgi
Few-Shot Learning on Graphs: A Survey
[article]
2022
arXiv
pre-print
In light of this, few-shot learning on graphs (FSLG), which combines the strengths of graph representation learning and few-shot learning together, has been proposed to tackle the performance degradation ...
However, prevailing (semi-)supervised graph representation learning models for specific tasks often suffer from the label-sparsity issue, as data labeling is time- and resource-consuming. ...
Few-Shot Node Classification. ...
arXiv:2203.09308v1
fatcat:7tpke435jnevdhdverovyug4sa
On the Importance of Attention in Meta-Learning for Few-Shot Text Classification
[article]
2018
arXiv
pre-print
Based on the Model-Agnostic Meta-Learning framework (MAML), we introduce the Attentive Task-Agnostic Meta-Learning (ATAML) algorithm for text classification. ...
Current deep-learning-based text classification methods are limited in their ability to achieve fast learning and generalization when the data is scarce. ...
We also analyze the impact of architectural choices for representation learning and show the effectiveness of dilated convolutional networks for few-shot text classification. ...
arXiv:1806.00852v1
fatcat:g6jwlt5xxbfe7btti5ukkfqvbm
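ATAML builds on MAML, whose core idea — adapt to each task with inner gradient steps on the support set, then update the shared initialization from query-set loss — can be illustrated with a toy first-order sketch on 1-D linear regression (this is not the paper's model; the task, learning rates, and first-order simplification are all illustrative assumptions):

```python
import numpy as np

def inner_adapt(w, xs, ys, lr=0.01):
    # One gradient step on a task's support set.
    # Loss: mean squared error of the 1-D linear model y = w * x.
    grad = np.mean(2.0 * (w * xs - ys) * xs)
    return w - lr * grad

def maml_outer_step(w, tasks, inner_lr=0.01, outer_lr=0.05):
    # tasks: list of (support_x, support_y, query_x, query_y) arrays.
    # First-order MAML: evaluate the query-set gradient at the adapted
    # weight and apply it to the shared initialization w.
    outer_grad = 0.0
    for sx, sy, qx, qy in tasks:
        w_adapted = inner_adapt(w, sx, sy, inner_lr)
        outer_grad += np.mean(2.0 * (w_adapted * qx - qy) * qx)
    return w - outer_lr * outer_grad / len(tasks)
```

Full MAML differentiates through the inner step; the first-order variant above drops that second-derivative term, which is the usual simplification in practice.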
Showing results 1 — 15 out of 6,108 results