5,718 Hits in 4.1 sec

Meta-Learning of Neural Architectures for Few-Shot Learning [article]

Thomas Elsken, Benedikt Staffler, Jan Hendrik Metzen, Frank Hutter
2021 arXiv   pre-print
Thus, few-shot learning is typically done with a fixed neural architecture. To improve upon this, we propose MetaNAS, the first method which fully integrates NAS with gradient-based meta-learning.  ...  We present encouraging results for MetaNAS with a combination of DARTS and REPTILE on few-shot classification benchmarks.  ...  Related Work Few-Shot Learning via Meta-Learning Few-Shot Learning refers to the problem of learning to solve a task (e.g., a classification problem) from just a few training examples.  ... 
arXiv:1911.11090v3 fatcat:q42t62fh6rfchlyxllilxrqha4

Auto-Meta: Automated Gradient Based Meta Learner Search [article]

Jaehong Kim, Sangyeul Lee, Sungwan Kim, Moonsu Cha, Jung Kwon Lee, Youngduck Choi, Yongseok Choi, Dong-Yeon Cho, Jiwon Kim
2018 arXiv   pre-print
To the best of our knowledge, this work is the first successful neural architecture search implementation in the context of meta learning.  ...  We adopt progressive neural architecture search (Liu et al., arXiv:1712.00559) to find optimal architectures for meta-learners.  ...  Few Shot Image Classification To evaluate the automated architecture search algorithm for gradient-based meta-learning, we applied our method to few-shot image classification problems.  ... 
arXiv:1806.06927v2 fatcat:uqn2tsabd5gsbm42l7p5c2hm54

MetAdapt: Meta-Learned Task-Adaptive Architecture for Few-Shot Classification [article]

Sivan Doveh, Eli Schwartz, Chao Xue, Rogerio Feris, Alex Bronstein, Raja Giryes, Leonid Karlinsky
2020 arXiv   pre-print
Few-Shot Learning (FSL) is a topic of rapidly growing interest.  ...  Another topic closely related to meta-learning, with a lot of interest in the community, is Neural Architecture Search (NAS): automatically finding an optimal architecture instead of engineering it manually.  ...  Notably, in all previous meta-learning methods, only the parameters of a (fixed) neural network are optimized through meta-learning in order to become adaptable (or partially adaptable) to novel few-shot  ... 
arXiv:1912.00412v3 fatcat:2zy6imvmfndrtfqzcvtm2zj6z4

Efficient Automatic Meta Optimization Search for Few-Shot Learning [article]

Xinyue Zheng, Peng Wang, Qigang Wang, Zhongchao Shi, Feiyu Xu
2019 arXiv   pre-print
NAS automatically generates and evaluates the meta-learner's architecture for few-shot learning problems, while the meta-learner uses a meta-learning algorithm to optimize its parameters based on the distribution  ...  We propose a universal framework to optimize the meta-learning process automatically by adopting the neural architecture search technique (NAS).  ...  Figure 4 shows the good architectures we discovered. Conclusion In this paper, we introduce an efficient automatic meta optimization search for few-shot learning problems.  ... 
arXiv:1909.03817v1 fatcat:qwc2fbcalzgqrp4mfu5es6je24

M-NAS: Meta Neural Architecture Search

Jiaxing Wang, Jiaxiang Wu, Haoli Bai, Jian Cheng
Experimental results demonstrate the superiority of M-NAS against a number of competitive baselines on both toy regression and few-shot classification problems.  ...  Since the optimal weights for different tasks and architectures vary widely, we resort to meta-learning and learn meta-weights that efficiently adapt to a new task on the corresponding architecture with  ...  Acknowledgment We thank Yin Zheng and Dongze Lian for helpful discussions.  ... 
doi:10.1609/aaai.v34i04.6084 fatcat:pt5j3iajczcu7go2p6kr6opgua


L. Bularca, Transilvania University of Brasov
2020 Bulletin of the Transilvania University of Brasov. Series I - Engineering Sciences  
This paper aims to present the impact of residual layers and inception modules on meta-learning in order to obtain an improvement of results.  ...  This algorithm respects the original goal of neural networks: to be able to learn and adapt over time. However, the work is only beginning and the best results have not yet been reached.  ...  This dataset is one of the most commonly used few-shot learning benchmarks. I follow the experimental protocol proposed in [1], which involves fast learning of N-way classification with 1 or 5 shots.  ... 
doi:10.31926/but.ens.2020. fatcat:ufxbqgadgrbo3do2nc42duwcsm

Few-Shot Learning with Graph Neural Networks [article]

Victor Garcia, Joan Bruna
2018 arXiv   pre-print
By assimilating generic message-passing inference algorithms with their neural-network counterparts, we define a graph neural network architecture that generalizes several of the recently proposed few-shot  ...  Besides providing improved numerical performance, our framework is easily extended to variants of few-shot learning, such as semi-supervised or active learning, demonstrating the ability of graph-based  ...  ACKNOWLEDGMENTS This work was partly supported by Samsung Electronics (Improving Deep Learning using Latent Structure).  ... 
arXiv:1711.04043v3 fatcat:3fd7knhjorhrncf4h3dwuwwoby

Meta Architecture Search [article]

Albert Shaw, Wei Wei, Weiyang Liu, Le Song, Bo Dai
2019 arXiv   pre-print
We propose the Bayesian Meta Architecture SEarch (BASE) framework which takes advantage of a Bayesian formulation of the architecture search problem to learn over an entire set of tasks simultaneously.  ...  In this paper, we make the first attempt to study Meta Architecture Search which aims at learning a task-agnostic representation that can be used to speed up the process of architecture search on a large  ...  Acknowledgments We would like to thank the anonymous reviewers for their comments and suggestions. Part of this work was done while Bo Dai and Albert Shaw were at Georgia Tech.  ... 
arXiv:1812.09584v2 fatcat:akrz5crdnfgener2drdrcc2244

Few-Shot Microscopy Image Cell Segmentation [article]

Youssef Dawoud, Julia Hornauer, Gustavo Carneiro, Vasileios Belagiannis
2020 arXiv   pre-print
Our experiments on five public databases show promising results from 1- to 10-shot meta-learning using standard segmentation neural network architectures.  ...  We pose this problem as meta-learning where the goal is to learn a generic and adaptable few-shot learning model from the available source domain data sets and cell segmentation tasks.  ...  G.C. acknowledges the support by the Alexander von Humboldt-Stiftung for the renewed research stay sponsorship.  ... 
arXiv:2007.01671v1 fatcat:p4ob756fgza6fhyergfgcidfjm

Meta-ticket: Finding optimal subnetworks for few-shot learning within randomly initialized neural networks [article]

Daiki Chijiwa, Shin'ya Yamaguchi, Atsutoshi Kumagai, Yasutoshi Ida
2022 arXiv   pre-print
Few-shot learning for neural networks (NNs) is an important problem that aims to train NNs from only a few examples.  ...  Then the following questions naturally arise: (1) Can we find sparse structures effective for few-shot learning by meta-learning? (2) What benefits will it bring in terms of meta-generalization?  ...  Another line of research is neural architecture search (NAS [60, 32]). In particular, NAS for few-shot learning [22, 31, 11, 54] is closely related to our work.  ... 
arXiv:2205.15619v1 fatcat:4eyhwckfaffkbhlxo5opf54zbe

[Re] Meta-learning with differentiable closed-form solvers

Arnout Devos, Sylvain Chatel, Matthias Grossglauser
2019 Zenodo  
Acknowledgements The authors would like to thank Martin Jaggi, Ruediger Urbanke, and the anonymous reviewers from the ICLR 2019 Reproducibility Challenge for feedback.  ...  A promising approach for few-shot learning is the field of meta-learning.  ...  In few-shot learning the learning scope is expanded from the classic setting of a single task with many shots to a variety of tasks with a few shots each.  ... 
doi:10.5281/zenodo.3160540 fatcat:ffgtkjtsbbfv7ht7jsguucewsy
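Several of the entries above refer to the episodic N-way, K-shot protocol, in which the learning scope is expanded from a single task with many shots to many tasks with a few shots each. A minimal sketch of how one such episode is sampled is shown below; `sample_episode` and the toy dataset are hypothetical names for illustration, not code from any of the listed papers.

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=5, seed=0):
    """Sample one N-way K-shot episode: a support set with K labelled
    examples per class and a disjoint query set for evaluation."""
    rng = random.Random(seed)
    # Pick N classes for this episode, then K + Q examples per class.
    classes = rng.sample(sorted(dataset), n_way)
    support, query = [], []
    for label in classes:
        examples = rng.sample(dataset[label], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

# Toy dataset: 20 classes with 30 examples each (placeholder strings).
data = {c: [f"img_{c}_{i}" for i in range(30)] for c in range(20)}
support, query = sample_episode(data, n_way=5, k_shot=1, q_queries=5)
print(len(support), len(query))  # 5 support examples, 25 queries
```

A meta-learner is trained on many such episodes drawn from a task distribution, adapting on the support set and being evaluated on the query set of each episode.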

Memory Augmented Matching Networks for Few-Shot Learnings

Kien Tran, Hiroshi Sato, Masao Kubo (Department of Computer Science, National Defense Academy of Japan)
2019 International Journal of Machine Learning and Computing  
In our research, we propose a metric learning method for few-shot learning tasks that takes advantage of NTMs and Matching Networks to improve few-shot learning accuracy on both Omniglot  ...  Although many successful efforts have been made in one-shot/few-shot learning tasks recently, learning from few data remains one of the toughest areas in machine learning.  ...  Meta Learning with Memory Augmented Neural Network for One-Shot Learning Task Recent works have suggested the Memory Augmented Neural Network (MANN) for one-shot learning tasks via a meta-learning approach  ... 
doi:10.18178/ijmlc.2019.9.6.867 fatcat:27wrqnpmorg6pmexvmmtoknllu

Meta-Learning in Neural Networks: A Survey [article]

Timothy Hospedales, Antreas Antoniou, Paul Micaelli, Amos Storkey
2020 arXiv   pre-print
We survey promising applications and successes of meta-learning such as few-shot learning and reinforcement learning.  ...  The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent years.  ...  such as few-shot learning [36] or neural architecture search [37].  ... 
arXiv:2004.05439v2 fatcat:3r23tsxxkfbgzamow5miglkrye

Meta-Transfer Learning through Hard Tasks [article]

Qianru Sun, Yaoyao Liu, Zhaozheng Chen, Tat-Seng Chua, Bernt Schiele
2019 arXiv   pre-print
In this paper, we propose a novel approach called meta-transfer learning (MTL) which learns to transfer the weights of a deep NN for few-shot learning tasks.  ...  Meta-learning has been proposed as a framework to address the challenging few-shot learning setting.  ...  It is also partially supported by German Research Foundation (DFG CRC 1223), and National Natural Science Foundation of China (61772359).  ... 
arXiv:1910.03648v1 fatcat:l2z7dowb5bclzgr2a3ofk3z2za

Meta-Learning in Neural Networks: A Survey

Timothy M Hospedales, Antreas Antoniou, Paul Micaelli, Amos J. Storkey
2021 IEEE Transactions on Pattern Analysis and Machine Intelligence  
We survey promising applications and successes of meta-learning including few-shot learning, reinforcement learning and architecture search.  ...  The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent years.  ...  Finally, one can also define NAS meta-objectives to train an architecture suitable for few-shot learning [243].  ... 
doi:10.1109/tpami.2021.3079209 pmid:33974543 fatcat:wkzeodki4fbcnjlcczn4mr6kry
Showing results 1–15 of 5,718