Dynamic Memory Induction Networks for Few-Shot Text Classification
[article]
2020
arXiv
pre-print
This paper proposes Dynamic Memory Induction Networks (DMIN) for few-shot text classification. ...
The model utilizes dynamic routing to provide more flexibility to memory-based few-shot learning in order to better adapt the support sets, which is a critical capacity of few-shot classification models ...
Acknowledgments The authors would like to thank the organizers of ACL-2020 and the reviewers for their helpful suggestions. ...
arXiv:2005.05727v1
fatcat:rj2btd5jwzeg5ok2tncubypycu
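The dynamic routing this abstract refers to can be sketched with a minimal NumPy illustration of capsule-style routing over a support set. This is a toy sketch under the usual Induction Networks formulation, not the paper's implementation; the function name `induce_class_vector`, the three routing iterations, and the squash nonlinearity are assumptions.

```python
import numpy as np

def squash(v, eps=1e-8):
    # Capsule-style nonlinearity: keeps direction, bounds the norm below 1.
    n = np.linalg.norm(v)
    return (n ** 2 / (1 + n ** 2)) * v / (n + eps)

def induce_class_vector(support, n_iters=3):
    # support: (k, d) encoded support samples for one class.
    k, d = support.shape
    b = np.zeros(k)                          # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum()      # coupling coefficients (softmax)
        s = (c[:, None] * support).sum(axis=0)
        v = squash(s)                        # induced class-level vector
        b = b + support @ v                  # update logits by agreement
    return v
```

Each routing iteration reweights the support samples by their agreement with the current class vector, which is what lets an induced prototype adapt to the support set instead of being a fixed mean.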
Dynamic Memory Induction Networks for Few-Shot Text Classification
2020
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
unpublished
This paper proposes Dynamic Memory Induction Networks (DMIN) for few-shot text classification. ...
The model utilizes dynamic routing to provide more flexibility to memory-based few-shot learning in order to better adapt the support sets, which is a critical capacity of few-shot classification models ...
Acknowledgments The authors would like to thank the organizers of ACL-2020 and the reviewers for their helpful suggestions. ...
doi:10.18653/v1/2020.acl-main.102
fatcat:cunw7ivmwvh6tmatxjypmai2za
Few-shot Learning for Chinese Legal Controversial Issues Classification
2020
IEEE Access
Two few-shot learning algorithms are proposed for our controversial issues problem, Relation Network and Induction Network, respectively. ...
INDEX TERMS Controversial issues, few-shot learning, text classification, power-law, BERT. ...
In the classification module, we use two few-shot learning algorithms, Relation Network and Induction Network. ...
doi:10.1109/access.2020.2988493
fatcat:cpcujzrcbrg6towarx6wy6i7mu
Dynamic Conditional Networks for Few-Shot Learning
[chapter]
2018
Lecture Notes in Computer Science
This paper proposes a novel Dynamic Conditional Convolutional Network (DCCN) to handle conditional few-shot learning, i.e., only a few training samples are available for each condition. ...
In this manner, a specific convolutional kernel can be dynamically obtained for each conditional input. ...
Overfitting issues, memory and time costs make learning such a regressor difficult in few-shot learning settings. ...
doi:10.1007/978-3-030-01267-0_2
fatcat:yuu4u55dnfhpbnzc2ea6gg26d4
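The "dynamically obtained" kernel idea above can be illustrated with a toy 1-D version: the conditional input supplies mixing weights over a small bank of basis kernels, and the combined kernel is applied as an ordinary convolution. This is a hypothetical NumPy sketch (the names, shapes, and 1-D setting are assumptions), not the DCCN implementation.

```python
import numpy as np

def dynamic_conv1d(x, basis, cond_weights):
    # Form a condition-specific kernel as a linear combination of basis
    # kernels, then apply it as a valid 1-D cross-correlation.
    kernel = cond_weights @ basis            # (ksize,) combined kernel
    ksize = kernel.shape[0]
    return np.array([x[i:i + ksize] @ kernel
                     for i in range(len(x) - ksize + 1)])
```

Because only the small weight vector depends on the condition, the number of per-condition parameters stays tiny, which is the usual motivation when each condition has only a few samples.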
Few-shot Text Classification with Distributional Signatures
[article]
2020
arXiv
pre-print
In this paper, we explore meta-learning for few-shot text classification. ...
We demonstrate that our model consistently outperforms prototypical networks learned on lexical knowledge (Snell et al., 2017) in both few-shot text classification and relation classification by a significant ...
few-shot text classification. ...
arXiv:1908.06039v3
fatcat:bbddbkpop5gynaloacfxnuib3q
Meta-Learning with Variational Semantic Memory for Word Sense Disambiguation
[article]
2021
arXiv
pre-print
This inspired recent research on few-shot WSD using meta-learning. ...
Aiming to further close this gap, we propose a model of semantic memory for WSD in a meta-learning setting. ...
For GloVe+GRU, the approximate training time per epoch is 20 minutes; for ELMo+MLP it is 80 minutes; and for BERT, it is 60 minutes. ...
arXiv:2106.02960v1
fatcat:r7dwmqgczbghlcknke2zgc7kv4
ConvProtoNet: Deep Prototype Induction towards Better Class Representation for Few-Shot Malware Classification
2020
Applied Sciences
We design a convolutional induction module to replace the insufficient prototype reduction in most few-shot models and generate more appropriate class-level malware prototypes for classification. ...
In this paper, we propose a new neural network structure called ConvProtoNet, which employs few-shot learning to address the problem of scarce malware samples while preventing overfitting. ...
Acknowledgments: We are grateful to VirusTotal (https://www.virustotal.com) for providing academic access and a malware scanning API. ...
doi:10.3390/app10082847
fatcat:evtxmhhyz5ht7dvfbgbpigdvyu
Online and Offline Interaction Model of International Chinese Education Based on Few-Shot Learning
2022
Computational Intelligence and Neuroscience
Few-shot learning is a method for acquiring learning ability in scenarios with only a small amount of sample data. ...
This paper aims to study an online and offline interaction model applied to international Chinese education and teaching based on the few-shot learning method. ...
In this study, a bidirectional long short-term memory network with a self-attention mechanism was used to form the encoder. ...
doi:10.1155/2022/8281670
fatcat:dsxngm5zhnelljm7prtx57lnw4
Low-resource Learning with Knowledge Graphs: A Comprehensive Survey
[article]
2021
arXiv
pre-print
appeared in training, and few-shot learning (FSL), where new classes for prediction have only a small number of labeled samples available. ...
), but also tasks for KG curation (e.g., inductive KG completion), and some typical evaluation resources for each task. ...
mapping functions for addressing few-shot text classification. ...
arXiv:2112.10006v3
fatcat:wkz6gjx4r5gvlhh673p3rqsmgi
Curriculum Meta-Learning for Few-shot Classification
[article]
2021
arXiv
pre-print
We propose an adaptation of the curriculum training framework, applicable to state-of-the-art meta learning techniques for few-shot classification. ...
Our experiments with the MAML algorithm on two few-shot image classification tasks show significant gains with the curriculum training framework. ...
few-shot classification. ...
arXiv:2112.02913v1
fatcat:utjbtcgvsfgm3o5wwhdkl22gja
Variational Prototype Replays for Continual Learning
[article]
2020
arXiv
pre-print
In this work, we consider few-shot continual learning in classification tasks, and we propose a novel method, Variational Prototype Replays, that efficiently consolidates and recalls previous knowledge ...
Furthermore, our method is more memory-efficient, since only class-representative prototypes (their means and variances) and one sample per class from previous tasks need to be stored. ...
Few-shot Continual Learning Protocols Humans can learn novel concepts given a few examples without sacrificing classification accuracy on initial tasks (Gidaris & Komodakis, 2018). ...
arXiv:1905.09447v3
fatcat:lcmcsfbulfg3va4arp5bh4f2im
Few-shot relation classification by context attention-based prototypical networks with BERT
2020
EURASIP Journal on Wireless Communications and Networking
Few-shot learning, which is widely used in image classification, is an effective method for overcoming data sparsity. In this paper, we apply few-shot learning to a relation classification task. ...
However, not all instances contribute equally to the relation prototype in a text-based few-shot learning scenario, which can cause the prototype deviation problem. ...
One of the main tasks of this paper is to generate a satisfactory prototype for a few-shot relation classification task in a text-based support set. ...
doi:10.1186/s13638-020-01720-6
fatcat:6h3oaen2dnbmnprnlzeepfu4za
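The "prototype deviation" problem above can be made concrete with a toy attention-weighted prototype: instead of the plain mean over support instances (Snell et al., 2017), each instance is weighted by its similarity to the query, so uninformative instances contribute less. A minimal NumPy sketch, assuming dot-product scoring; the function names are hypothetical and this is not the paper's BERT-based attention.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_prototype(support, query):
    # support: (k, d) instance encodings; query: (d,) query encoding.
    scores = support @ query        # similarity of each instance to the query
    weights = softmax(scores)       # attention over support instances
    return weights @ support        # attention-weighted class prototype
```

With a query that resembles one support instance much more than the others, the weighted prototype lands near that instance, whereas the plain mean is pulled toward outliers.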
Small Sample Learning in Big Data Era
[article]
2018
arXiv
pre-print
The first category of SSL approaches can be called "concept learning", which emphasizes learning new concepts from only few related observations. ...
triplet ranking networks for one-shot image classification with larger capacity in handling inter- and intra-class image variations. ...
was encoded with a meta-network trained to predict many-shot model parameters from few-shot model parameters. ...
arXiv:1808.04572v3
fatcat:lqqzzrmgfnfb3izctvdzgopuny
MCML: A Novel Memory-based Contrastive Meta-Learning Method for Few Shot Slot Tagging
[article]
2021
arXiv
pre-print
Meta-learning is widely used for the task of few-shot slot tagging. The performance of existing methods is, however, seriously affected by catastrophic forgetting. ...
the current label embedded in the few-shot episode with the historic ones stored in the memory, and an adaption-from-memory mechanism to determine the output label based on the contrast between the input ...
..., 2020) propose Dynamic Memory Induction Networks to solve the few-shot text classification problem. ...
arXiv:2108.11635v2
fatcat:km4sqqk3wzbnxivuf6l22ohwna
Learning where to learn: Gradient sparsity in meta and continual learning
[article]
2021
arXiv
pre-print
This selective sparsity results in better generalization and less interference in a range of few-shot and continual learning problems. ...
Our results shed light on an ongoing debate on whether meta-learning can discover adaptable features and suggest that learning by sparse gradient descent is a powerful inductive bias for meta-learning ...
We thank Charlotte Frenkel, Frederik Benzing, Angelika Steger and Laura Sainz for helpful discussions. ...
arXiv:2110.14402v1
fatcat:ioulqgd6mvaf5ivx2uq3wpmwau