
Learning to Forget for Meta-Learning [article]

Sungyong Baik, Seokil Hong, Kyoung Mu Lee
2020 arXiv   pre-print
As the attenuation dynamically controls (or selectively forgets) the influence of prior knowledge for each layer on a given task, we name our method L2F (Learn to Forget).  ...  Model-agnostic meta-learning (MAML) tackles the problem by formulating prior knowledge as a common initialization across tasks, which is then used to quickly adapt to unseen tasks.  ...  One solution for a meta-learner would be to simply forget the part of the initialization that hinders adaptation to the task, minimizing its influence.  ...
arXiv:1906.05895v2 fatcat:r4izauljpbabjhdhdywc3nzsqu
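
The attenuation idea above can be illustrated with a short, hypothetical PyTorch sketch (assuming torch >= 2.0 for torch.func.functional_call): a tiny gate network maps a per-layer gradient statistic to a factor in (0, 1), the meta-learned initialization is scaled layer by layer by that factor, and one inner-loop step is taken from the attenuated weights. The gate, the gradient statistic, the reuse of the original gradient for the step, and the toy regression task are all illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

# Toy two-layer regressor, used only for illustration.
class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 40)
        self.fc2 = nn.Linear(40, 1)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

model = SmallNet()
gate = nn.Linear(1, 1)  # hypothetical per-layer attenuation generator

# One toy task: a handful of support points from y = 2x.
x_s = torch.randn(5, 1)
y_s = 2.0 * x_s

init_params = list(model.parameters())
task_loss = F.mse_loss(model(x_s), y_s)
grads = torch.autograd.grad(task_loss, init_params, create_graph=True)

# "Selectively forget": scale each layer of the initialization by a factor in
# (0, 1) conditioned on a statistic of that layer's task gradient, then take
# one inner-loop step from the attenuated weights (reusing the gradient
# computed at the original weights, a simplification).
gammas = [torch.sigmoid(gate(g.abs().mean().view(1))) for g in grads]
attenuated = [gamma * p for gamma, p in zip(gammas, init_params)]
adapted = [w - 0.01 * g for w, g in zip(attenuated, grads)]

# Evaluate the adapted weights on a query point via a functional forward pass.
names = [n for n, _ in model.named_parameters()]
query = torch.randn(3, 1)
pred = functional_call(model, dict(zip(names, adapted)), (query,))
print(pred.shape)
```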

Learning to Continually Learn [article]

Shawn Beaulieu, Lapo Frati, Thomas Miconi, Joel Lehman, Kenneth O. Stanley, Jeff Clune, Nick Cheney
2020 arXiv   pre-print
We instead advocate meta-learning a solution to catastrophic forgetting, allowing AI to learn to continually learn.  ...  Continual lifelong learning requires an agent or model to learn many sequentially ordered tasks, building on previous knowledge without catastrophically forgetting it.  ...  We are also appreciative of Blake Camp for catching typos in a draft and alerting us to a relevant paper, and to Louis Kirsch for suggesting the addition of a relevant citation.  ... 
arXiv:2002.09571v2 fatcat:hdboateo6bdfvmi7fske6tq7le

On Hard Episodes in Meta-Learning [article]

Samyadeep Basu, Amr Sharaf, Nicolo Fusi, Soheil Feizi
2021 arXiv   pre-print
We additionally investigate various properties of hard episodes and highlight their connection to catastrophic forgetting during meta-training.  ...  To address the issue of sub-par performance on hard episodes, we investigate and benchmark different meta-training strategies based on adversarial training and curriculum learning.  ...  To summarize, we find that forgetting occurs in meta-learning even when the tasks are drawn from a single task distribution.  ... 
arXiv:2110.11190v1 fatcat:5bqqt4ndjzhezmyj7wq54bxkdy
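
One simple way to operationalise the forgetting measurement mentioned above (an illustrative protocol, not necessarily the paper's) is to record each episode's accuracy right after it is used for meta-training, re-evaluate the same episodes at the end, and report the drop:

```python
# Illustrative bookkeeping only; episode ids and accuracies are made up.
def forgetting_scores(acc_after_episode, acc_at_end):
    """Both arguments map episode id -> accuracy measured on that episode."""
    return {ep: acc_after_episode[ep] - acc_at_end[ep]
            for ep in acc_after_episode}

scores = forgetting_scores({"ep0": 0.81, "ep1": 0.76, "ep2": 0.88},
                           {"ep0": 0.64, "ep1": 0.71, "ep2": 0.85})
most_forgotten_first = sorted(scores, key=scores.get, reverse=True)
print(scores)
print(most_forgotten_first)  # episodes ordered by how much they were forgotten
```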

Learning to Remember from a Multi-Task Teacher [article]

Yuwen Xiong, Mengye Ren, Raquel Urtasun
2020 arXiv   pre-print
Recent studies on catastrophic forgetting during sequential learning typically focus on fixing the accuracy of the predictions for a previously learned task.  ...  Towards this goal, we propose an experimental setup that measures the amount of representational forgetting, and develop a novel meta-learning algorithm to overcome this issue.  ...  Meta-learning is a general tool for us to learn a new learning algorithm with desired properties.  ... 
arXiv:1910.04650v2 fatcat:erj2nlpqajg47pt245z7ge3yhy
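
As a crude stand-in for a representational-forgetting measure (the paper's actual metric may differ), one can compare a network's features on a fixed batch of old-task inputs before and after it learns a new task, e.g. by mean cosine similarity:

```python
import torch
import torch.nn.functional as F

def representation_retention(feats_before, feats_after):
    """Mean cosine similarity between features of the same inputs before and
    after learning a new task; 1.0 means the old-task representation is
    unchanged, lower values indicate representational forgetting."""
    a = F.normalize(feats_before, dim=1)
    b = F.normalize(feats_after, dim=1)
    return (a * b).sum(dim=1).mean().item()

# Toy features standing in for encoder outputs on a fixed old-task batch.
feats_t0 = torch.randn(128, 64)
feats_t1 = feats_t0 + 0.1 * torch.randn(128, 64)
print(representation_retention(feats_t0, feats_t1))
```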

Meta-learnt priors slow down catastrophic forgetting in neural networks [article]

Giacomo Spigler
2020 arXiv   pre-print
Current training regimes for deep learning usually involve exposure to a single task / dataset at a time.  ...  Here we show that catastrophic forgetting can be mitigated in a meta-learning context, by exposing a neural network to multiple tasks in a sequential manner during training.  ...  We further present SeqFOMAML, a meta-learning algorithm designed to jointly optimize for fast adaptation to new tasks, as in traditional single-task meta-learning, and for performance on previously observed  ... 
arXiv:1909.04170v2 fatcat:lf7djggjwbhafcjkl6cglrde6e
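
A simplified sketch of the sequential first-order meta-update described above (not the authors' exact SeqFOMAML; task construction, losses, and step counts are illustrative): a clone of the initialization is adapted through a sequence of tasks, the query loss is then summed over every task in the sequence, and the resulting gradients are applied to the initialization in first-order fashion.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def sequential_fomaml_step(model, task_sequence, inner_lr=0.01, meta_lr=0.1,
                           inner_steps=5):
    """One first-order meta-update over a sequence of tasks.

    Each task is (x_support, y_support, x_query, y_query). A clone of the
    initialization is adapted task by task in order; after the whole
    sequence, the query loss is summed over every task in it, so the
    meta-gradient rewards both fast adaptation and retention.
    """
    learner = copy.deepcopy(model)
    opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
    queries = []
    for x_s, y_s, x_q, y_q in task_sequence:
        for _ in range(inner_steps):
            opt.zero_grad()
            F.mse_loss(learner(x_s), y_s).backward()
            opt.step()
        queries.append((x_q, y_q))
    opt.zero_grad()
    meta_loss = sum(F.mse_loss(learner(x_q), y_q) for x_q, y_q in queries)
    meta_loss.backward()
    # First-order update: apply the adapted model's gradients to the init.
    with torch.no_grad():
        for p, q in zip(model.parameters(), learner.parameters()):
            p -= meta_lr * q.grad
    return meta_loss.item()

model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

def linear_task(slope):
    x = torch.randn(10, 1)
    return (x, slope * x, x + 1.0, slope * (x + 1.0))

print(sequential_fomaml_step(model, [linear_task(1.5), linear_task(-2.0)]))
```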

Meta Continual Learning [article]

Risto Vuorio, Dong-Yeon Cho, Daejoong Kim, Jiwon Kim
2018 arXiv   pre-print
In this paper, we propose a learning-to-optimize algorithm for mitigating catastrophic forgetting.  ...  Using neural networks in practical settings would benefit from the ability of the networks to learn new tasks throughout their lifetimes without forgetting the previous tasks.  ...  We propose a meta-learning approach to continual learning where a model for adjusting per-parameter update steps is trained to mitigate forgetting of past tasks.  ...
arXiv:1806.06928v1 fatcat:gr2lrveltjc5dpt5qpdwihihsq
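
The per-parameter update-step idea can be sketched as an SGD step whose learning rate is a learned, sigmoid-gated value for every individual weight; in the full method those step sizes would themselves be meta-trained to protect weights important to earlier tasks. The gating form and toy model below are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(10, 2)
# One learnable step-size logit per weight; in the full method these would be
# meta-trained across tasks, here they are just initialised to zero.
step_logits = {name: nn.Parameter(torch.zeros_like(p))
               for name, p in model.named_parameters()}

def modulated_sgd_step(model, loss, base_lr=0.1):
    """SGD step where every parameter has its own learned step size, so a
    meta-learner can effectively freeze weights important to earlier tasks."""
    grads = torch.autograd.grad(loss, list(model.parameters()))
    with torch.no_grad():
        for (name, p), g in zip(model.named_parameters(), grads):
            per_param_lr = base_lr * torch.sigmoid(step_logits[name])
            p -= per_param_lr * g

x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
modulated_sgd_step(model, F.cross_entropy(model(x), y))
print(model.weight.norm().item())
```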

Manipulator Meta-Imitation Learning Algorithm with Memory Weight Integration

Mingjun Yin, Qingshan Zeng
2019 IOP Conference Series: Materials Science and Engineering  
In this paper, a memory weight integration term adapted to the meta-learning algorithm is proposed.  ...  By adjusting the plasticity of neurons, the manipulator can learn more effectively across multiple tasks and mitigate the forgetting problem of multi-task learning.  ...  Although the meta-imitation learning algorithm (MIL) can learn many kinds of tasks, catastrophic forgetting is still difficult to overcome as the number of tasks increases and their complexity  ...
doi:10.1088/1757-899x/569/5/052039 fatcat:prrcxixxczb3zdiwmfxwyfueuq

Meta-Learning for Natural Language Understanding under Continual Learning Framework [article]

Jiacheng Wang, Yong Fan, Duo Jiang, Shiqing Li
2020 arXiv   pre-print
In this paper, we implement the model-agnostic meta-learning (MAML) and Online-aware Meta-learning (OML) meta-objectives under the continual framework for NLU tasks.  ...  Methods have been developed to train a robust model to handle multiple tasks and gain a general representation of text.  ...  They developed a gradient-based meta-learning algorithm for quick adaptation to a continuously changing environment.  ...
arXiv:2011.01452v1 fatcat:rbpbzw7ouzc4tc36ucyr3xoi5m
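
A compressed sketch of an OML-style meta-objective as referenced above (sizes, the single meta-task, and the first-order shortcut are illustrative simplifications): the encoder stays fixed during an online inner loop in which only a copy of the prediction head adapts one example at a time, and an outer loss on held-out data from current and earlier tasks then updates both encoder and head.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(nn.Linear(20, 64), nn.ReLU())
head = nn.Linear(64, 5)
meta_opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()),
                            lr=1e-3)

def oml_meta_step(stream, heldout, inner_lr=0.01):
    """Inner loop: only a copy of the head adapts online, one example at a
    time, on the task stream. Outer loop: the loss on held-out data (current
    plus earlier tasks) updates the encoder, and the head via a first-order
    shortcut that reuses the adapted copy's gradients."""
    fast_head = copy.deepcopy(head)
    for x, y in stream:
        loss = F.cross_entropy(fast_head(encoder(x)), y)
        grads = torch.autograd.grad(loss, list(fast_head.parameters()))
        with torch.no_grad():
            for p, g in zip(fast_head.parameters(), grads):
                p -= inner_lr * g
    x_h, y_h = heldout
    meta_loss = F.cross_entropy(fast_head(encoder(x_h)), y_h)
    meta_opt.zero_grad()
    meta_loss.backward()
    with torch.no_grad():
        for p, q in zip(head.parameters(), fast_head.parameters()):
            p.grad = q.grad.clone()
    meta_opt.step()
    return meta_loss.item()

stream = [(torch.randn(1, 20), torch.randint(0, 5, (1,))) for _ in range(8)]
heldout = (torch.randn(16, 20), torch.randint(0, 5, (16,)))
print(oml_meta_step(stream, heldout))
```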

ARCADe: A Rapid Continual Anomaly Detector [article]

Ahmed Frikha, Denis Krompaß, Volker Tresp
2020 arXiv   pre-print
for training.  ...  Moreover, we propose A Rapid Continual Anomaly Detector (ARCADe), an approach to train neural networks to be robust against the major challenges of this new learning problem, namely catastrophic forgetting  ...  model initialization to be suitable for continual learning, i.e. to inhibit catastrophic forgetting, each meta-training and meta-testing task is built as a sequence of classification tasks [18], [10
arXiv:2008.04042v2 fatcat:ti5swtcwxzdcpo4y7j3ftlv74a
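
The "task as a sequence of classification tasks" construction can be sketched as follows; class counts, sequence length, and shot numbers are illustrative choices rather than the paper's settings.

```python
import random

def build_episode(dataset_by_class, seq_len=4, ways=2, shots=5):
    """Assemble one continual-learning episode as a sequence of small
    classification sub-tasks, each drawn from disjoint classes.

    dataset_by_class: dict mapping class id -> list of examples.
    Returns a list of sub-tasks, each a list of (example, label) pairs."""
    classes = random.sample(list(dataset_by_class), seq_len * ways)
    episode = []
    for t in range(seq_len):
        subtask_classes = classes[t * ways:(t + 1) * ways]
        subtask = [(x, label)
                   for label, c in enumerate(subtask_classes)
                   for x in random.sample(dataset_by_class[c], shots)]
        episode.append(subtask)
    return episode

toy_data = {c: [f"img_{c}_{i}" for i in range(20)] for c in range(12)}
episode = build_episode(toy_data)
print(len(episode), len(episode[0]))  # 4 sub-tasks, 2 ways x 5 shots each
```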

Addressing Catastrophic Forgetting in Few-Shot Problems [article]

Pauching Yap, Hippolyt Ritter, David Barber
2021 arXiv   pre-print
Our framework utilises Bayesian online learning and meta-learning along with Laplace approximation and variational inference to overcome catastrophic forgetting in few-shot classification problems.  ...  We demonstrate that the popular gradient-based model-agnostic meta-learning algorithm (MAML) indeed suffers from catastrophic forgetting and introduce a Bayesian online meta-learning framework that tackles  ...  Acknowledgements We would like to thank the reviewers for their constructive comments, and Peter Hayes for the useful initial discussions.  ... 
arXiv:2005.00146v3 fatcat:cmm5hijou5hahfyrlgfr4wx7zm
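
A heavily simplified stand-in for the Bayesian online ingredient described above: after a task, keep the current weights as a posterior mean together with a diagonal precision (squared gradients as a crude Laplace/Fisher approximation), and penalise the next task's loss for moving away from that posterior. This is an EWC-style sketch, not the paper's meta-learning objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def diagonal_laplace(model, x, y):
    """Posterior mean (current weights) and a diagonal precision estimate
    (squared gradients) after training on a task."""
    loss = F.cross_entropy(model(x), y)
    grads = torch.autograd.grad(loss, list(model.parameters()))
    mean = [p.detach().clone() for p in model.parameters()]
    precision = [g.detach() ** 2 for g in grads]
    return mean, precision

def penalised_loss(model, x, y, mean, precision, strength=10.0):
    """New-task loss plus a quadratic penalty anchoring the weights to the
    previous task's approximate posterior."""
    nll = F.cross_entropy(model(x), y)
    penalty = sum((prec * (p - mu) ** 2).sum()
                  for p, mu, prec in zip(model.parameters(), mean, precision))
    return nll + 0.5 * strength * penalty

model = nn.Linear(8, 3)
x1, y1 = torch.randn(64, 8), torch.randint(0, 3, (64,))
mean, precision = diagonal_laplace(model, x1, y1)      # posterior from task 1
x2, y2 = torch.randn(64, 8), torch.randint(0, 3, (64,))
loss = penalised_loss(model, x2, y2, mean, precision)  # task 2 objective
loss.backward()
print(loss.item())
```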

Homeostasis-Inspired Continual Learning: Learning to Control Structural Regularization

Joonyoung Kim, Hyowoon Seo, Wan Choi, Kyomin Jung
2021 IEEE Access  
To obtain effective and optimal IoRs for real-time continual learning circumstances, we propose a homeostasis-inspired meta learning architecture.  ...  Learning continually without forgetting might be one of the ultimate goals for building artificial intelligence (AI).  ...  Towards applying meta learning to continual learning, we must distinguish the tasks for meta training and testing.  ...
doi:10.1109/access.2021.3050176 fatcat:ua2ias3ftzh4heahomyu4tq6dy

Continual Adaptation of Visual Representations via Domain Randomization and Meta-learning [article]

Riccardo Volpi, Diane Larlus, Grégory Rogez
2021 arXiv   pre-print
In this context, we show that one way to learn models that are inherently more robust against forgetting is domain randomization - for vision tasks, randomizing the current domain's distribution with heavy  ...  Building on this result, we devise a meta-learning strategy where a regularizer explicitly penalizes any loss associated with transferring the model from the current domain to different "auxiliary" meta-domains  ...  [24] and Csurka [11] for detailed reviews on lifelong learning, meta-learning and domain adaptation. Lifelong learning.  ... 
arXiv:2012.04324v2 fatcat:mjq3imnthjd3tmjojli2xo7cbu
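
The meta-regularizer described above can be sketched (assuming torch.func, with the randomization and the single inner step as illustrative simplifications) as: take one differentiable gradient step on the current domain, then add the loss that the stepped model incurs on randomized "auxiliary" versions of the batch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

def randomize(x):
    """Stand-in domain randomization: global intensity jitter plus noise."""
    return x * (1.0 + 0.3 * torch.randn(1)) + 0.1 * torch.randn_like(x)

def meta_transfer_loss(model, x, y, inner_lr=0.1, n_aux=2, reg=1.0):
    """Current-domain loss plus a penalty on the loss the one-step-adapted
    model incurs on randomized auxiliary domains; the inner step is kept
    differentiable (MAML-style) so the penalty shapes the original weights."""
    current = F.cross_entropy(model(x), y)
    params = dict(model.named_parameters())
    grads = torch.autograd.grad(current, list(params.values()), create_graph=True)
    adapted = {n: p - inner_lr * g for (n, p), g in zip(params.items(), grads)}
    aux = sum(F.cross_entropy(functional_call(model, adapted, (randomize(x),)), y)
              for _ in range(n_aux))
    return current + reg * aux

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 10))
x, y = torch.randn(16, 3, 8, 8), torch.randint(0, 10, (16,))
loss = meta_transfer_loss(model, x, y)
loss.backward()
print(loss.item())
```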

Reducing catastrophic forgetting with learning on synthetic data [article]

Wojciech Masarczyk, Ivona Tautkute
2020 arXiv   pre-print
We propose a method to generate such data in a two-step optimisation process via meta-gradients.  ...  Catastrophic forgetting is a problem caused by neural networks' inability to learn data in sequence. After learning two tasks in sequence, performance on the first one drops significantly.  ...  The authors would like to thank Petr Hlubůček and GoodAI for publishing the code at https://github.com/GoodAI/GTN.  ...
arXiv:2004.14046v1 fatcat:x3bl77trmvf2peltwm6iv2xcqu
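
A compact sketch of the two-step, meta-gradient idea (sizes, the single inner step, and the toy "real" batch are assumptions, not the paper's setup): the inner step trains the model on a learnable synthetic batch, the outer loss is measured on real data, and its gradient is back-propagated into the synthetic inputs and soft labels.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

model = nn.Linear(16, 4)
syn_x = torch.randn(20, 16, requires_grad=True)   # learnable synthetic inputs
syn_y = torch.randn(20, 4, requires_grad=True)    # learnable soft labels
data_opt = torch.optim.Adam([syn_x, syn_y], lr=1e-2)

real_x, real_y = torch.randn(64, 16), torch.randint(0, 4, (64,))

for _ in range(100):
    params = dict(model.named_parameters())
    # Inner step: train (differentiably) on the synthetic batch.
    inner = F.cross_entropy(model(syn_x), syn_y.softmax(dim=1))
    grads = torch.autograd.grad(inner, list(params.values()), create_graph=True)
    adapted = {n: p - 0.1 * g for (n, p), g in zip(params.items(), grads)}
    # Outer step: evaluate the adapted model on real data and push the
    # meta-gradient into the synthetic batch. (The model itself would also be
    # trained or re-initialised in the full method; it is left fixed here.)
    outer = F.cross_entropy(functional_call(model, adapted, (real_x,)), real_y)
    data_opt.zero_grad()
    outer.backward()
    data_opt.step()

print(outer.item())
```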

BI-MAML: Balanced Incremental Approach for Meta Learning [article]

Yang Zheng, Jinlin Xiang, Kun Su, Eli Shlizerman
2020 arXiv   pre-print
We present a novel Balanced Incremental Model Agnostic Meta Learning system (BI-MAML) for learning multiple tasks.  ...  Our method implements a meta-update rule to incrementally adapt its model to new tasks without forgetting old tasks. Such a capability is not possible in current state-of-the-art MAML approaches.  ...  Meta learning is the general approach for specifying flexible and generic rules for a system to learn new tasks.  ...
arXiv:2006.07412v1 fatcat:h7kxhlbxrvdszcl2wqizlpumw4

Continual Learning with Deep Artificial Neurons [article]

Blake Camp, Jaya Krishna Mandivarapu, Rolando Estrada
2020 arXiv   pre-print
Here, we isolate continual learning as our meta-objective, and we show that a suitable neuronal phenotype can endow a single network with an innate ability to update its synapses with minimal forgetting  ...  We demonstrate that it is possible to meta-learn a single parameter vector, which we dub a neuronal phenotype, shared by all DANs in the network, which facilitates a meta-objective during deployment.  ...  Specifically, we sought to verify whether the meta-learning procedure was indeed endowing the DANs with an innate ability to assist in learning without forgetting.  ... 
arXiv:2011.07035v1 fatcat:zln4gdiuvra63hmjc3isut3awq
Showing results 1–15 of 39,678.