
Faster Meta Update Strategy for Noise-Robust Deep Learning [article]

Youjiang Xu, Linchao Zhu, Lu Jiang, Yi Yang
2021 arXiv   pre-print
In this paper, we introduce a novel Faster Meta Update Strategy (FaMUS) to replace the most expensive step in the meta gradient computation with a faster layer-wise approximation.  ...  It has been shown that deep neural networks are prone to overfitting on biased training data. Towards addressing this issue, meta-learning employs a meta model for correcting the training bias.  ...  Faster Meta Update Strategy In this section, we introduce a Faster Meta Update Strategy (FaMUS) to efficiently approximate the total meta gradients by a layer-wise meta gradient sampling procedure.  ... 
arXiv:2104.15092v1 fatcat:huvi3lkwrnc3bjqj3py67t4lrq
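The "most expensive step" this abstract refers to is the meta gradient: the gradient of a clean-set loss through a tentative model update. A brute-force finite-difference sketch (illustrative only; the scalar linear model and function names are assumptions, not the paper's implementation) shows why it is costly — every per-sample weight requires its own tentative update:

```python
import numpy as np

def loss(w, x, y, sw=None):
    # (optionally sample-weighted) squared error of a linear model y ~ w*x
    err = (w * x - y) ** 2
    return (err if sw is None else sw * err).mean()

def grad(w, x, y, sw=None):
    # gradient of the (weighted) squared error w.r.t. the scalar weight w
    g = 2 * (w * x - y) * x
    return (g if sw is None else sw * g).mean()

def meta_weight_grad(w, sw, xt, yt, xc, yc, lr=0.1, eps=1e-4):
    """Estimate d(clean loss)/d(sample weight) by finite differences:
    perturb each sample weight, redo the tentative model update, and
    measure the change in loss on a small trusted ("clean") set.
    This brute-force meta gradient is the kind of computation that
    methods like FaMUS replace with cheaper layer-wise approximations."""
    w_base = w - lr * grad(w, xt, yt, sw)          # unperturbed tentative update
    base_loss = loss(w_base, xc, yc)
    meta_g = np.zeros_like(sw)
    for i in range(len(sw)):
        sw_p = sw.copy()
        sw_p[i] += eps                              # perturb one sample weight
        w_plus = w - lr * grad(w, xt, yt, sw_p)     # perturbed tentative update
        meta_g[i] = (loss(w_plus, xc, yc) - base_loss) / eps
    return meta_g
```

With one correctly labeled and one mislabeled training point, the meta gradient is negative for the clean sample (up-weighting it helps on the clean set) and positive for the noisy one, so a meta update would down-weight the noisy label.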

Learning to Explore with Meta-Policy Gradient [article]

Tianbing Xu, Qiang Liu, Liang Zhao, Jian Peng
2018 arXiv   pre-print
The performance of off-policy learning, including deep Q-learning and deep deterministic policy gradient (DDPG), critically depends on the choice of the exploration policy.  ...  In this work, we develop a simple meta-policy gradient algorithm that allows us to adaptively learn the exploration policy in DDPG.  ...  Acknowledgement We appreciate Kliegl Markus for his insightful discussions and helpful comments.  ... 
arXiv:1803.05044v2 fatcat:gxgkv6uljfbzjmfpwgzk6itof4

Learning to Explore via Meta-Policy Gradient

Tianbing Xu, Qiang Liu, Liang Zhao, Jian Peng
2018 International Conference on Machine Learning  
The performance of off-policy learning, including deep Q-learning and deep deterministic policy gradient (DDPG), critically depends on the choice of the exploration strategy.  ...  In this work, we develop a simple meta-policy gradient algorithm that allows us to adaptively learn the exploration policy in DDPG.  ...  Great thanks to Kliegl Markus for his insightful discussions and comments.  ... 
dblp:conf/icml/XuLZP18 fatcat:2yvq5wq4kzckdcdmdaqfzwweju

Meta-tracker: Fast and Robust Online Adaptation for Visual Object Trackers [chapter]

Eunbyung Park, Alexander C. Berg
2018 Lecture Notes in Computer Science  
By enforcing a small number of update iterations during meta-learning, the resulting networks train significantly faster.  ...  The meta learning is driven by the goal of deep networks that can quickly be adapted to robustly model a particular target in future frames.  ...  Acknowledgements We thank the reviewers for their valuable feedback and acknowledge support from NSF 1452851, 1526367, 1446631.  ... 
doi:10.1007/978-3-030-01219-9_35 fatcat:deqzaf5kqbdwfdtdfyfnq3wdke

Learning from Noisy Labels with Deep Neural Networks: A Survey [article]

Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee
2022 arXiv   pre-print
As noisy labels severely degrade the generalization performance of deep neural networks, learning from noisy labels (robust training) is becoming an important task in modern deep learning applications.  ...  In this survey, we first describe the problem of learning with label noise from a supervised learning perspective.  ...  A high-level research overview of robust deep learning for noisy labels.  ... 
arXiv:2007.08199v7 fatcat:c5ztk4jfpfddrhqvf6phcy32de

Model-Agnostic Meta-Learning for EEG Motor Imagery Decoding in Brain-Computer-Interfacing [article]

Denghao Li, Pablo Ortega, Xiaoxi Wei, Aldo Faisal
2021 arXiv   pre-print
We apply here meta-learning to a simple Deep Learning BCI architecture and compare it to transfer learning on the same architecture.  ...  We introduce here the idea of Meta-Learning for training EEG BCI decoders. Meta-Learning is a way of training machine learning systems so that they learn to learn.  ...  We find that this first attempt at meta-learning in BCI yields a robust algorithm for the problem.  ... 
arXiv:2103.08664v1 fatcat:z3fby6ehiffilpuo3oi4fw7tre

Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [article]

Jun Shu, Qian Zhao, Zongben Xu, Deyu Meng
2020 arXiv   pre-print
To alleviate these issues, this study proposes a new meta-transition-learning strategy for the task.  ...  To discover intrinsic inter-class transition probabilities underlying data, learning with noise transition has become an important approach for robust deep learning on corrupted labels.  ...  Conclusion We have proposed a novel meta-learning method for adaptively extracting transition matrix to guarantee robust deep learning in the presence of noisy labels.  ... 
arXiv:2006.05697v2 fatcat:3uyffbuvqzhdhfusqttveinzk4

Deep Speaker Recognition: Process, Progress, and Challenges

Abu Quwsar Ohi, M. F. Mridha, Md. Abdul Hamid, Muhammad Mostafa Monowar
2021 IEEE Access  
Though speaker recognition systems were previously constructed using handcrafted statistical machine learning methods, they are currently shifting to state-of-the-art deep learning strategies.  ...  Further, deep learning being a fast-paced domain, a comprehensive survey of current deep speaker recognition technologies is lacking.  ...  Model-agnostic meta-learning (MAML) [94], [95] is a simpler approach to meta-learning in which the weights of the learner f_w(·) are updated based on the calculated loss  ... 
doi:10.1109/access.2021.3090109 fatcat:klq443bqmjh4vakpfvwqelujvi
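The MAML update mentioned in this snippet has a compact two-level structure: adapt the weights on each task's support set, then move the shared initialization by the gradient of the adapted weights' loss on the query set. A minimal first-order sketch for a scalar linear model (names and the toy loss are assumptions for illustration, not the survey's code):

```python
import numpy as np

def grad(w, x, y):
    # gradient of mean squared error of y ~ w*x w.r.t. the scalar weight w
    return (2 * (w * x - y) * x).mean()

def maml_step(w, tasks, inner_lr=0.1, meta_lr=0.01):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is ((x_support, y_support), (x_query, y_query)).
    First-order approximation: the meta gradient is evaluated at the
    adapted weights, ignoring second-order terms."""
    meta_grad = 0.0
    for (xs, ys), (xq, yq) in tasks:
        w_adapted = w - inner_lr * grad(w, xs, ys)  # inner-loop adaptation
        meta_grad += grad(w_adapted, xq, yq)        # query-set gradient
    return w - meta_lr * meta_grad / len(tasks)
```

Repeating `maml_step` drives the initialization toward a point from which each task is reachable in one inner-loop step, rather than toward any single task's optimum.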

Improved Action-Decision Network for Visual Tracking with Meta-learning

Detian Huang, Lingke Kong, Jianqing Zhu, Lixin Zheng
2019 IEEE Access  
Finally, an effective online adaptive update strategy is proposed to adapt to the appearance changes or deformation of the object during actual tracking.  ...  Specifically, meta-learning is utilized to pursue the most appropriate parameters for the network so that the parameters are closer to the optimal ones in the subsequent tracking process.  ...  based training stage, and utilizing meta-learning for adaptively updating the parameters of the network during actual tracking to enhance its robustness and real-time performance.  ... 
doi:10.1109/access.2019.2936551 fatcat:ot4ykjsenbektervnhcd57nnxq

A Nested Bi-level Optimization Framework for Robust Few Shot Learning

Krishnateja Killamsetty, Changbin Li, Chen Zhao, Feng Chen, Rishabh Iyer
2022 Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference  
In this work, we propose a novel robust meta-learning algorithm, NESTEDMAML, which learns to assign weights to training tasks or instances.  ...  meta-learning methods.  ...  Acknowledgments We thank the AAAI area chairs and anonymous reviewers for offering their constructive comments on this paper.  ... 
doi:10.1609/aaai.v36i7.20678 fatcat:k2n45avijferrh3j5b7zadna2e

Meta-SE: A meta-learning framework for few-shot speech enhancement

Weili Zhou, Mingliang Lu, Ruijie Ji
2021 IEEE Access  
In the meta-training stage, the meta-learner can learn a good initialization from meta-training tasks using the MAML update strategy.  ... 
doi:10.1109/access.2021.3066609 fatcat:kjdirvjkwbgmbayngkx7dreq5q

Online Meta-Learning for Model Update Aggregation in Federated Learning for Click-Through Rate Prediction [article]

Xianghang Liu, Bartłomiej Twardowski, Tri Kurniawan Wijaya
2022 arXiv   pre-print
To address these challenges, we propose a simple online meta-learning method to learn a strategy of aggregating the model updates, which adaptively weighs the importance of the clients based on their attributes  ...  In Federated Learning (FL) of click-through rate (CTR) prediction, users' data is not shared for privacy protection.  ...  To obtain faster learning progress and better final solutions, a good client weighting strategy for update aggregation should be (1) aware of the client heterogeneity: the importance weight of a client  ... 
arXiv:2209.00629v1 fatcat:f3e7uv2bc5ft5hehojthbqnj4a
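The aggregation strategy this abstract describes — adaptively weighing clients' model updates — can be reduced to a weighted average whose weights are produced from learnable per-client scores. A minimal sketch (the softmax-over-logits parameterization is an assumption for illustration, not necessarily the paper's method):

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def aggregate(client_updates, logits):
    """Weighted average of client model updates. The importance weights
    come from a softmax over learnable per-client logits, a stand-in for
    the attribute-based weighting that would be learned online by
    differentiating a server-side objective w.r.t. the logits."""
    w = softmax(logits)
    return sum(wi * u for wi, u in zip(w, client_updates))
```

With equal logits this reduces to plain federated averaging; raising a client's logit pulls the aggregate toward that client's update.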

Test-time Adaptation for Real Image Denoising via Meta-transfer Learning [article]

Agus Gunawan, Muhammad Adi Nugroho, Se Jin Park
2022 arXiv   pre-print
Meanwhile, we use meta-learning for fine-tuning (meta-transfer learning) the network as the second stage of our training to enable test-time adaptation on real noisy images.  ...  The learning strategy has two stages: the first stage pre-trains the network using meta-auxiliary learning to obtain a better meta-initialization.  ...  One of the early deep learning approaches to denoising is (Zhang et al., 2017a), which proposes DnCNN, a residual learning strategy for solving AWGN denoising tasks.  ... 
arXiv:2207.02066v1 fatcat:a4cf3uplznbo5decpaxante4zm

Diversity Transfer Network for Few-Shot Learning

Mengting Chen, Yuxin Fang, Xinggang Wang, Heng Luo, Yifeng Geng, Xinyu Zhang, Chang Huang, Wenyu Liu, Bo Wang
2020 Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference  
Few-shot learning is a challenging task that aims at training a classifier for unseen classes with only a few training examples.  ...  The results show that DTN, with single-stage training and faster convergence speed, obtains the state-of-the-art results among the feature generation based few-shot learning methods.  ...  Ablation studies show that compared with the training strategy used in TADAM, DTN trained by OAT obtains better and more robust results with a faster convergence speed.  ... 
doi:10.1609/aaai.v34i07.6628 fatcat:hsngwfqqavhalag734wslejhhe

Meta-Tracker: Fast and Robust Online Adaptation for Visual Object Trackers [article]

Eunbyung Park, Alexander C. Berg
2018 arXiv   pre-print
By enforcing a small number of update iterations during meta-learning, the resulting networks train significantly faster.  ...  Our core contribution is an offline meta-learning-based method to adjust the initial deep networks used in online adaptation-based tracking.  ...  It aimed to learn optimal update strategies based on how accurate a learner can classify test images with few training examples when the learner follows the strategies from the meta-learner.  ... 
arXiv:1801.03049v2 fatcat:jd3duvwenvek7bw5rku7455gyq