14,529 Hits in 3.0 sec

Cooperative Learning for Noisy Supervision [article]

Hao Wu, Jiangchao Yao, Ya Zhang, Yanfeng Wang
2021 arXiv   pre-print
In this paper, we propose the Cooperative Learning (CooL) framework for noisy supervision, which analytically explains the effects of leveraging dual or multiple networks.  ...  Learning with noisy labels has gained enormous interest in the robust deep learning area.  ...  To explore these limits and give a more general scope, we propose a Cooperative Learning (CooL) paradigm in which multiple classifiers work cooperatively for noisy supervision.  ... 
arXiv:2108.05092v1 fatcat:rjrbesoakjavhki5uwwszuxuqa
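The dual-network cooperation that CooL generalizes can be illustrated with a minimal small-loss exchange step in the style of co-teaching; the `small_loss_exchange` helper and the fixed keep ratio below are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

def small_loss_exchange(losses_a, losses_b, keep_ratio):
    """Each network passes its lowest-loss ("probably clean") samples
    to its peer for the next update, so the two classifiers supervise
    each other instead of memorizing the noisy labels alone."""
    k = int(len(losses_a) * keep_ratio)
    idx_for_b = np.argsort(losses_a)[:k]  # A's confident picks train B
    idx_for_a = np.argsort(losses_b)[:k]  # B's confident picks train A
    return idx_for_a, idx_for_b

# Toy per-sample losses from two networks on the same noisy batch.
losses_a = np.array([0.1, 2.0, 0.3, 1.5])
losses_b = np.array([1.8, 0.2, 0.4, 2.5])
idx_a, idx_b = small_loss_exchange(losses_a, losses_b, keep_ratio=0.5)
```

With more than two networks, the same selection step can be applied pairwise or by pooling the peers' low-loss votes, which is the direction the CooL abstract points toward.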

ISCL: Interdependent Self-Cooperative Learning for Unpaired Image Denoising [article]

Kanggeun Lee, Won-Ki Jeong
2021 arXiv   pre-print
With the advent of advances in self-supervised learning, paired clean-noisy data are no longer required in deep learning-based image denoising.  ...  In this paper, we propose a novel image denoising scheme, Interdependent Self-Cooperative Learning (ISCL), that leverages unpaired learning by combining cyclic adversarial learning with self-supervised  ...  Note that cooperative learning enables the training between the unpaired clean-noisy-based denoiser and the noise extractor based on self-supervision to boost the performance cooperatively.  ... 
arXiv:2102.09858v2 fatcat:op32yghjnfff3k2xtmgv2ecuxe

Cooperative Denoising for Distantly Supervised Relation Extraction

Kai Lei, Daoyuan Chen, Yaliang Li, Nan Du, Min Yang, Wei Fan, Ying Shen
2018 International Conference on Computational Linguistics  
involving their mutual learning via adaptive bi-directional knowledge distillation and dynamic ensemble with noise-varying instances.  ...  Meanwhile, the useful information expressed in knowledge graphs is still underutilized in state-of-the-art methods for distantly supervised relation extraction.  ...  Acknowledgements We thank anonymous reviewers for their helpful comments.  ... 
dblp:conf/coling/LeiCLDY0S18 fatcat:e67x6ulty5c47nx67zots42hxe
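The bi-directional knowledge distillation this snippet mentions reduces, per direction, to a divergence between the two modules' predictive distributions; the plain KL term below is a generic sketch under that assumption, not the paper's exact (adaptive) loss:

```python
import numpy as np

def kl_distill(p_teacher, p_student, eps=1e-12):
    """KL(p_teacher || p_student): one direction of mutual
    distillation; swapping the arguments gives the other direction."""
    p = np.asarray(p_teacher) + eps
    q = np.asarray(p_student) + eps
    return float(np.sum(p * np.log(p / q)))

p1 = np.array([0.7, 0.2, 0.1])  # module 1's relation distribution
p2 = np.array([0.5, 0.3, 0.2])  # module 2's relation distribution
bidirectional = kl_distill(p1, p2) + kl_distill(p2, p1)
```

The "adaptive" part of the paper's scheme would weight these two terms per instance; a fixed symmetric sum is the simplest starting point.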

Comparative Analytical Study Considering The Analogy of Learning Creativity Quantification versus Ant Colony Intelligence

Hassan M. H. Mustafa, Fadhel Ben Tourkia
2018 Advances in Social Sciences Research Journal  
Conclusively, the results presented herein for both swarm intelligence and neural network models seem promising for future, more elaborate, systematic, and innovative research in the evaluation of  ...  Both are simulated realistically for systematic investigational modeling of the creatures' creativity phenomenon observed in nature.  ...  Analogy between the Supervised ANN Model and Tandem Learning: referring to [29], the two Figures (11 & 12) present two distinct models, for either the supervised learning ANN or the ACS adopting tandem learning  ... 
doi:10.14738/assrj.53.4259 fatcat:wmw4mcgcnne2lpimlzoec3mro4

Generative Cooperative Learning for Unsupervised Video Anomaly Detection [article]

Muhammad Zaigham Zaheer, Arif Mahmood, Muhammad Haris Khan, Mattia Segu, Fisher Yu, Seung-Ik Lee
2022 arXiv   pre-print
To this end, we propose a novel unsupervised Generative Cooperative Learning (GCL) approach for video anomaly detection that exploits the low frequency of anomalies towards building a cross-supervision  ...  In essence, both networks get trained in a cooperative fashion, thereby allowing unsupervised learning.  ...  The proposed Generative Cooperative Learning (GCL) algorithm introduces cross-supervision for training a Generator G and a Discriminator D.  ... 
arXiv:2203.03962v1 fatcat:sq2nycz2ubhzlpn7biurdhb7xi
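The cross-supervision loop between G and D can be caricatured with quantile thresholds: G's worst reconstructions become pseudo-anomalies for training D, and D's least-anomalous scores mark pseudo-normal samples for training G. The `cross_supervision` helper and its fixed quantile are hypothetical simplifications of GCL:

```python
import numpy as np

def cross_supervision(recon_err, disc_score, q=0.8):
    """G -> D: samples the generator reconstructs worst are labeled
    pseudo-anomalous for the discriminator. D -> G: samples the
    discriminator scores least anomalous are labeled pseudo-normal,
    so the generator learns to reconstruct only those."""
    pseudo_anomaly = recon_err >= np.quantile(recon_err, q)
    pseudo_normal = disc_score <= np.quantile(disc_score, 1 - q)
    return pseudo_anomaly, pseudo_normal

recon_err = np.array([0.10, 0.20, 0.15, 0.90, 0.12])   # G's errors
disc_score = np.array([0.20, 0.10, 0.30, 0.95, 0.25])  # D's scores
pa, pn = cross_supervision(recon_err, disc_score, q=0.8)
```

Because anomalies are assumed rare, a high quantile keeps most of the data flowing to G as pseudo-normal supervision while D sees only the sparse outliers.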

Deep Learning Search by Social Image Re-ranking

Yogita Dhole
2019 International Journal for Research in Applied Science and Engineering Technology  
subspace by cooperatively investigating the weakly supervised tagging data, the visual structure, and the semantic structure.  ...  Different from previous work, the framework proposes a novel weakly supervised deep matrix factorization algorithm, which reveals the latent image representations and tag representations embedded in the latent  ...  The learned information is used for image tag refinement.  ... 
doi:10.22214/ijraset.2019.6382 fatcat:stx74zkhpjavbezjz7oeh2xkt4

Aligning Vector-spaces with Noisy Supervised Lexicon

Noa Yehezkel Lubin, Jacob Goldberger, Yoav Goldberg
2019 Proceedings of the 2019 Conference of the North  
We demonstrate that such noise substantially degrades the accuracy of the learned translation when using current methods. We propose a model that accounts for noisy pairs.  ...  The algorithm jointly learns the noise level in the lexicon, finds the set of noisy pairs, and learns the mapping between the spaces.  ...  We also thank Roee Aharoni for helpful discussions and suggestions.  ... 
doi:10.18653/v1/n19-1045 dblp:conf/naacl/LubinGG19 fatcat:sd7uzuwrrbgutkhr75tfvxrq4q
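A minimal sketch of the underlying idea, assuming an orthogonal-Procrustes mapping that is refit after discarding high-residual (presumed noisy) pairs; the paper's actual model learns the noise level jointly, so the `noise_aware_procrustes` helper and its fixed `keep_frac` are illustrative assumptions only:

```python
import numpy as np

def noise_aware_procrustes(X, Y, keep_frac=0.8, iters=3):
    """Fit an orthogonal map W minimizing ||X[idx] @ W - Y[idx]||,
    then shrink idx to the lowest-residual pairs, treating the
    discarded high-residual pairs as noisy lexicon entries."""
    idx = np.arange(len(X))
    for _ in range(iters):
        U, _, Vt = np.linalg.svd(X[idx].T @ Y[idx])  # Procrustes solution
        W = U @ Vt
        resid = np.linalg.norm(X @ W - Y, axis=1)
        idx = np.argsort(resid)[: int(len(X) * keep_frac)]
    return W, set(idx.tolist())

# Synthetic lexicon: 20 word-vector pairs, 2 deliberately corrupted.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 4))
R, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # ground-truth rotation
Y = X @ R
Y[0] += 50.0  # noisy pair
Y[1] -= 50.0  # noisy pair
W, kept = noise_aware_procrustes(X, Y)
```

On this toy data the refit converges to the true rotation once the two corrupted pairs fall out of the kept set, which is the effect the abstract attributes to modeling noisy pairs explicitly.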

Aligning Vector-spaces with Noisy Supervised Lexicons [article]

Noa Yehezkel Lubin, Jacob Goldberger, Yoav Goldberg
2019 arXiv   pre-print
We demonstrate that such noise substantially degrades the accuracy of the learned translation when using current methods. We propose a model that accounts for noisy pairs.  ...  The algorithm jointly learns the noise level in the lexicon, finds the set of noisy pairs, and learns the mapping between the spaces.  ...  We also thank Roee Aharoni for helpful discussions and suggestions.  ... 
arXiv:1903.10238v1 fatcat:efeepxrhxzchpdbr7raeqdl5ra

Relabel the Noise: Joint Extraction of Entities and Relations via Cooperative Multiagents [article]

Daoyuan Chen, Yaliang Li, Kai Lei, Ying Shen
2020 arXiv   pre-print
Distant supervision based methods for entity and relation extraction have gained increasing popularity because they require little human annotation effort.  ...  We propose a joint extraction approach that addresses this problem by re-labeling noisy instances with a group of cooperative multiagents.  ...  Curriculum Learning for Multiagents: learning from scratch is difficult for many RL agents.  ... 
arXiv:2004.09930v1 fatcat:cqqs24nqazdebmcf3qjdh6ems4

A Recursive Ensemble Learning Approach with Noisy Labels or Unlabeled Data

Yuchen Wang, Yang Yang, Yun-Xia Liu, Anil Anthony Bharath
2019 IEEE Access  
INDEX TERMS Noisy labels, pruning strategy, semi-supervised learning, ensemble learning, deep learning, neural networks.  ...  For many tasks, the successful application of deep learning relies on having large amounts of training data, labeled to a high standard.  ...  semi-supervised learning.  ... 
doi:10.1109/access.2019.2904403 fatcat:43s3pigfbvcbvdqe7sg66szkny

Improving Co-Training Algorithm with Active Learning

Xitao Zou, Jiang Xiong, Xianchun Zou
2016 ICIC Express Letters  
Co-training is a seminal semi-supervised learning algorithm.  ...  Firstly, to increase the number of labeled samples, we define an uncertainty-sampling strategy and then incorporate active learning into co-training.  ...  [12] put forward a semi-supervised learning algorithm combining the benefits of co-training and active learning.  ... 
doi:10.24507/icicel.10.11.2533 fatcat:3ta3zvjrdfdnlagvos2ofugsjm
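The uncertainty-sampling strategy this snippet describes can be sketched as least-confident selection, one common variant; the `least_confident` name and criterion are assumptions, not necessarily the authors' exact definition:

```python
import numpy as np

def least_confident(probs, n_query):
    """Select the n_query unlabeled samples whose most-likely class
    has the lowest predicted probability; these go to a human oracle
    for labeling instead of being pseudo-labeled by the peer view."""
    confidence = probs.max(axis=1)
    return np.argsort(confidence)[:n_query]

# Class-probability predictions from one co-training view.
probs = np.array([[0.95, 0.05],
                  [0.55, 0.45],   # most uncertain -> query its label
                  [0.80, 0.20],
                  [0.60, 0.40]])
query_idx = least_confident(probs, n_query=2)
```

In a co-training loop, confident predictions from one view are added as pseudo-labels for the other, while the samples selected here are routed to the oracle, which is how active learning enlarges the labeled pool.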

Weakly Supervised Image Classification Through Noise Regularization

Mengying Hu, Hu Han, Shiguang Shan, Xilin Chen
2019 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
In this work, we propose an effective approach for weakly supervised image classification utilizing massive noisy labeled data with only a small set of clean labels (e.g., 5%).  ...  clean labels and noisy labels, respectively, in a multi-task learning manner.  ...  Acknowledgment This research was supported in part by the National Key R&D Program of China (grant 2017YFA0700800), Natural Science Foundation of China (grants 61672496, 61650202, and 61772500), and External Cooperation  ... 
doi:10.1109/cvpr.2019.01178 dblp:conf/cvpr/HuHSC19 fatcat:ejn3k6samrcslhqfx7yn7rm7py

Competition and Multiple Cause Models

Peter Dayan, Richard S. Zemel
1995 Neural Computation  
and to two anonymous reviewers for their helpful comments.  ...  Acknowledgments We are very grateful to Virginia de Sa, Geoff Hinton, Terry Sejnowski, Paul Viola, and Chris Williams for helpful discussions, to Eric Saund for generously sharing unpublished results,  ...  A Competitive Activation Function For simplicity, we describe the model for the self-supervised learning case, but it applies more generally.  ... 
doi:10.1162/neco.1995.7.3.565 fatcat:e4dhxp6r55ftvm67qqobxwu5dy

Multi-task self-supervised learning for Robust Speech Recognition [article]

Mirco Ravanelli, Jianyuan Zhong, Santiago Pascual, Pawel Swietojanski, Joao Monteiro, Jan Trmal, Yoshua Bengio
2020 arXiv   pre-print
Finally, we refine the set of workers used in self-supervision to encourage better cooperation.  ...  This paper proposes PASE+, an improved version of PASE for robust speech recognition in noisy and reverberant environments.  ...  Fig. 1. The proposed PASE+ architecture for self-supervised learning. Non-stationary noises from the FreeSound and the DIRHA datasets.  ... 
arXiv:2001.09239v2 fatcat:4pjk4kmecjhm3kqakoajrk44om

Collaborative Unsupervised Domain Adaptation for Medical Image Diagnosis [article]

Yifan Zhang, Ying Wei, Peilin Zhao, Shuaicheng Niu, Qingyao Wu, Mingkui Tan, Junzhou Huang
2019 arXiv   pre-print
Deep learning based medical image diagnosis has shown great potential in clinical medicine.  ...  data or assume samples are equally transferable, we propose a novel Collaborative Unsupervised Domain Adaptation algorithm to conduct transferability-aware domain adaptation and conquer label noise in a cooperative  ...  In medical image diagnosis, however, such extensive supervision is often absent due to prohibitive costs of data labeling [11, 19], which impedes the successful application of deep learning.  ... 
arXiv:1911.07293v1 fatcat:5kmr6pxdkvfkrnvvlxk34gzilq