247 Hits in 6.2 sec

A Topological Approach for Semi-Supervised Learning [article]

Adrián Inés, César Domínguez, Jónathan Heras, Gadea Mata, Julio Rubio
2022 arXiv   pre-print
The results show that the semi-supervised methods developed in this work outperform both the results obtained with models trained on only manually labelled data and those obtained with classical semi-supervised  ...  This challenge can be tackled by means of semi-supervised learning methods that take advantage of both labelled and unlabelled data.  ...  In Self-Training methods, a model is trained on labelled data and then used to predict pseudo-labels for the unlabelled data.  ... 
arXiv:2205.09617v1 fatcat:3ugcz6z7g5e3tevdcp76mjolh4
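The Self-Training loop described in the snippet above (train on labelled data, pseudo-label the unlabelled data, retrain) can be made concrete with a minimal sketch. The synthetic data, logistic-regression model, and 0.95 confidence threshold below are illustrative assumptions, not details taken from the indexed paper:

```python
# Minimal self-training sketch: fit on the labelled subset, pseudo-label
# the most confident unlabelled points, and refit on the enlarged set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Two well-separated Gaussian blobs; only 10 points carry true labels.
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
labelled = np.concatenate([rng.choice(100, 5, replace=False),
                           100 + rng.choice(100, 5, replace=False)])
unlabelled = np.setdiff1d(np.arange(200), labelled)

model = LogisticRegression().fit(X[labelled], y[labelled])

for _ in range(3):  # a few self-training rounds
    proba = model.predict_proba(X[unlabelled])
    confident = proba.max(axis=1) > 0.95       # confidence threshold
    if not confident.any():
        break
    pseudo = proba[confident].argmax(axis=1)   # pseudo-labels
    X_train = np.vstack([X[labelled], X[unlabelled][confident]])
    y_train = np.concatenate([y[labelled], pseudo])
    model = LogisticRegression().fit(X_train, y_train)

accuracy = model.score(X, y)
```

On well-separated data like this, the retrained model typically recovers close to the fully supervised decision boundary from just 10 labels.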

Recent advances and clinical applications of deep learning in medical image analysis [article]

Xuxin Chen, Ximin Wang, Ke Zhang, Roy Zhang, Kar-Ming Fung, Theresa C. Thai, Kathleen Moore, Robert S. Mannel, Hong Liu, Bin Zheng, Yuchen Qiu
2021 arXiv   pre-print
In particular, we emphasize the latest progress and contributions of state-of-the-art unsupervised and semi-supervised deep learning in medical image analysis, which are summarized based on different application scenarios, including classification, segmentation, detection, and image registration.  ...  Semi-supervised pseudo labeling: Fan et al. (2020) presented a semi-supervised framework (Semi-InfNet) to tackle the lack of high-quality labeled data in COVID-19 lung infection segmentation from CT images  ... 
arXiv:2105.13381v2 fatcat:2k342a6rhjaavpoa2qoqxhg5rq

Survey on Implementations of Generative Adversarial Networks for Semi-Supervised Learning

Ali Reza Sajun, Imran Zualkernan
2022 Applied Sciences  
Previous work applying GANs to SSL is classified into pseudo-labeling/classification, encoder-based, TripleGAN-based, two-GAN, manifold regularization, and stacked discriminator approaches.  ...  Generative adversarial networks (GANs) represent one recent approach to semi-supervised learning (SSL). This paper presents a survey of methods using GANs for SSL.  ...  GANs for semi-supervised learning.  ... 
doi:10.3390/app12031718 fatcat:x4skf2zvvvfornkmhelaijiwwu

Self-Path: Self-supervision for Classification of Pathology Images with Limited Annotations [article]

Navid Alemi Koohbanani, Balagopal Unnikrishnan, Syed Ali Khurram, Pavitra Krishnaswamy, Nasir Rajpoot
2020 arXiv   pre-print
Further, we show that Self-Path improves domain adaptation for classification of histology image patches when there is no labeled data available for the target domain.  ...  We introduce novel domain-specific self-supervision tasks that leverage contextual, multi-resolution and semantic features in pathology images for semi-supervised learning and domain adaptation.  ...  Shaw et al. [18] also proposed to use pseudo-labels of unlabeled images for fine-tuning the model iteratively to improve performance for colorectal image classification.  ... 
arXiv:2008.05571v1 fatcat:tbgp42venreotlofe2fien4jhu

Enhancing Pseudo Label Quality for Semi-Supervised Domain-Generalized Medical Image Segmentation [article]

Huifeng Yao, Xiaowei Hu, Xiaomeng Li
2022 arXiv   pre-print
This paper presents a novel confidence-aware cross pseudo supervision algorithm for semi-supervised domain generalized medical image segmentation.  ...  The main goal is to enhance the pseudo label quality for unlabeled images from unknown distributions.  ...  Semi-supervised Semantic Segmentation Unlike the image classification task, manually labeling pixel-wise annotations for the segmentation task is expensive and time-consuming.  ... 
arXiv:2201.08657v2 fatcat:cehwtd6imrad3es4a27huu6hny

Domain Confusion with Self Ensembling for Unsupervised Adaptation [article]

Jiawei Wang, Zhaoshui He, Chengjian Feng, Zhouping Zhu, Qinzhuang Lin, Jun Lv, Shengli Xie
2020 arXiv   pre-print
A common approach for this problem is to transfer knowledge from a related labeled domain to a target one. There are two popular ways to achieve this goal: adversarial learning and self-training.  ...  Data collection and annotation are time-consuming in machine learning, especially for large-scale problems.  ...  Acknowledgements The authors are grateful to all reviewers for their very insightful comments and suggestions. This work was supported in part by National Natural Science  ... 
arXiv:1810.04472v3 fatcat:3mjgffzhibe2tnqfr3tcmio2aq

Semi-MCNN: A Semi-supervised Multi-CNN Ensemble Learning Method for Urban Land Cover Classification Using Sub-meter HRRS Images

Runyu Fan, Ruyi Feng, Lizhe Wang, Jining Yan, Xiaohan Zhang
2020 IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing  
Sub-meter high-resolution remote sensing (HRRS) image land cover classification could provide significant help for urban monitoring, management, and planning.  ...  Considering the lack of labelled samples, a semi-supervised learning strategy was adopted to leverage large amounts of unlabelled data.  ...  [15] proposed a semi-supervised learning-based pseudo-labelling and sample selection scheme to train transferable deep models for land-use classification with HRRS images.  ... 
doi:10.1109/jstars.2020.3019410 fatcat:jtzkx4uhojh5jh6nmnrmgim3ce

Poisoning Semi-supervised Federated Learning via Unlabeled Data: Attacks and Defenses [article]

Yi Liu, Xingliang Yuan, Ruihui Zhao, Cong Wang, Dusit Niyato, Yefeng Zheng
2022 arXiv   pre-print
In practice, these SSFL systems implement semi-supervised training by assigning a "guessed" label to the unlabeled data near the labeled data to convert the unsupervised problem into a fully supervised  ...  Extensive case studies have shown that our attacks are effective on different datasets and common semi-supervised learning methods.  ...  Note that λ sets the confidence threshold for the pseudo-labels, which affects the semi-supervised performance.  ... 
arXiv:2012.04432v2 fatcat:3wxbf2twhfcopenn2u3shyffoi

Improve conditional adversarial domain adaptation using self‐training

Zi Wang, Xiaoliang Sun, Ang Su, Gang Wang, Yang Li, Qifeng Yu
2021 IET Image Processing  
Domain adaptation for image classification is one of the most fundamental transfer learning tasks and a promising solution to overcome the annotation burden.  ...  Here, adversarial learning and self-training are unified in an objective function, where the neural network parameters and the pseudo-labels of target samples are jointly optimized.  ...  A complementary survey of self-labelled techniques was reported in [32] . Several studies have shown that entropy minimization is useful for semi-supervised learning and self-training [8, 33, 34] .  ... 
doi:10.1049/ipr2.12184 fatcat:exhsxnsxhzgpvkagotixezxale
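The entropy-minimization idea cited in the snippet above can be illustrated with a short sketch: the Shannon entropy of a model's predicted class distribution is near zero for confident predictions and near log(K) for uncertain ones, so adding it to the loss on unlabelled data pushes predictions toward confident class assignments. The logits below are illustrative assumptions, not values from the indexed paper:

```python
# Entropy-minimization sketch: mean Shannon entropy of softmax outputs,
# usable as an auxiliary loss term on unlabelled data.
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_loss(logits):
    """Mean Shannon entropy of the predicted class distributions."""
    p = softmax(logits)
    return float(-(p * np.log(p + 1e-12)).sum(axis=1).mean())

confident_logits = np.array([[8.0, 0.0], [0.0, 9.0]])  # near one-hot
uncertain_logits = np.array([[0.1, 0.0], [0.0, 0.2]])  # near uniform

low = entropy_loss(confident_logits)
high = entropy_loss(uncertain_logits)
```

Minimizing this term during training is what drives the decision boundary away from dense regions of unlabelled data.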

Countering Noisy Labels By Learning From Auxiliary Clean Labels [article]

Tsung Wei Tsai, Chongxuan Li, Jun Zhu
2019 arXiv   pre-print
In addition to the widely-studied synthetic noise in the NL literature, we also consider the pseudo labels in semi-supervised learning (Semi-SL) as a special case of NL.  ...  For both types of noise, we argue that the generalization performance of existing methods is highly coupled with the quality of noisy labels.  ...  simplified-NL and pseudo labels in Semi-SL. (2) We show that we can exploit additional training signals from all the provided images and strengthen the data cleansing mechanism with the self-supervised  ... 
arXiv:1905.13305v2 fatcat:iuxculw5ezdktdfbgj7ehmlcsi

MatchGAN: A Self-Supervised Semi-Supervised Conditional Generative Adversarial Network [article]

Jiaze Sun, Binod Bhattarai, Tae-Kyun Kim
2020 arXiv   pre-print
We present a novel self-supervised learning approach for conditional generative adversarial networks (GANs) under a semi-supervised setting.  ...  Unlike prior self-supervised approaches which often involve geometric augmentations on the image space such as predicting rotation angles, our pretext task leverages the label space.  ...  One of the popular approaches is to annotate unlabelled data with pseudo-labels [29] . Self-supervised approaches have also been explored in semi-supervised learning settings.  ... 
arXiv:2006.06614v2 fatcat:2e64halcebaqpij4bn5imnrhem

Low Budget Active Learning via Wasserstein Distance: An Integer Programming Approach [article]

Rafid Mahmood, Sanja Fidler, Marc T. Law
2022 arXiv   pre-print
This paper introduces an integer optimization problem for selecting a core set that minimizes the discrete Wasserstein distance from the unlabeled pool.  ...  Active learning is the process of training a model with limited labeled data by selecting a core subset of an unlabeled data pool to label.  ...  ACKNOWLEDGMENTS We would like to thank David Acuna, James Lucas and the anonymous reviewers for feedback on earlier versions of this paper, and Andre Cire and Jean-François Puget for early discussions  ... 
arXiv:2106.02968v3 fatcat:bgiskip4u5enxeqsql7sbq2cdu
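The core-set objective in the snippet above (select points whose empirical distribution is Wasserstein-close to the unlabelled pool) can be made concrete with a toy 1-D sketch. The indexed paper poses this as an integer program; the greedy loop below is only an illustrative stand-in, and the Gaussian pool and budget of 10 are assumptions:

```python
# Greedy sketch of core-set selection under a 1-D Wasserstein criterion:
# at each step, add the pool point that most reduces the Wasserstein
# distance between the selected set and the full unlabelled pool.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
pool = np.sort(rng.normal(size=200))  # 1-D "features" of unlabelled pool

budget = 10
selected = []
remaining = list(pool)

for _ in range(budget):
    best = min(remaining,
               key=lambda x: wasserstein_distance(selected + [x], pool))
    selected.append(best)
    remaining.remove(best)

coreset_gap = wasserstein_distance(selected, pool)
random_gap = wasserstein_distance(
    rng.choice(pool, size=budget, replace=False), pool)
```

The greedy choice tends to place the selected points at quantile-like positions of the pool, which is exactly the behaviour the distributional objective rewards over, say, uncertainty-based sampling.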

A Survey on Deep Semi-supervised Learning [article]

Xiangli Yang, Zixing Song, Irwin King, Zenglin Xu
2021 arXiv   pre-print
We first present a taxonomy for deep semi-supervised learning that categorizes existing methods, including deep generative methods, consistency regularization methods, graph-based methods, pseudo-labeling  ...  Deep semi-supervised learning is a fast-growing field with a range of practical applications.  ...  Self-training models: The self-training algorithm leverages the model's own confident predictions to produce the pseudo labels for unlabeled data.  ... 
arXiv:2103.00550v2 fatcat:lymncf5wavgkhaenbvqlyvhuaa
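Among the categories this survey's snippet lists, consistency regularization is easy to illustrate: a model's predictions on an unlabelled input and on a perturbed copy of it should agree, and their disagreement is used as an extra loss term. The linear "model" and Gaussian noise below stand in for a network and data augmentation; both are assumptions for illustration, not details from the survey:

```python
# Consistency-regularization sketch: penalize the squared difference
# between predictions on an input and on a weakly perturbed copy.
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(5, 3))  # toy linear classifier weights (3 classes)

def predict(x):
    """Softmax prediction of the toy linear model."""
    logits = x @ W
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

x = rng.normal(size=5)                 # an unlabelled example
x_aug = x + 0.01 * rng.normal(size=5)  # weak perturbation ("augmentation")

consistency_loss = float(((predict(x) - predict(x_aug)) ** 2).sum())
```

In real methods the perturbation is a semantic-preserving augmentation (crops, flips, dropout), so minimizing this term flattens the decision function around unlabelled points without needing their labels.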

Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer [article]

Jian Liang and Dapeng Hu and Yunbo Wang and Ran He and Jiashi Feng
2021 arXiv   pre-print
Furthermore, we propose a new labeling transfer strategy, which separates the target data into two splits based on the confidence of predictions (labeling information), and then employ semi-supervised  ...  Specifically, SHOT exploits both information maximization and self-supervised learning for the feature extraction module learning to ensure the target features are implicitly aligned with the features  ...  We term ŷt self-supervised pseudo labels since they are generated by the centroids obtained in an unsupervised manner.  ... 
arXiv:2012.07297v3 fatcat:sofsnofvpjegnhzumy277rjo3a
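The centroid-based pseudo labels mentioned in the snippet above can be sketched briefly: class centroids are computed as prediction-weighted means of the target features, and each sample is then relabelled by its nearest centroid. The synthetic features and soft predictions below are illustrative assumptions, not the indexed paper's data or exact procedure:

```python
# Sketch of centroid-based self-supervised pseudo labels: weighted class
# centroids from soft predictions, then nearest-centroid relabelling.
import numpy as np

rng = np.random.default_rng(3)

# Target features: two clusters; noisy 2-class soft predictions that
# only weakly prefer the right class for each cluster.
feats = np.vstack([rng.normal(-1, 0.2, (50, 4)),
                   rng.normal(1, 0.2, (50, 4))])
soft = np.clip(rng.normal([0.7, 0.3], 0.1, (100, 2)), 0, 1)
soft[50:] = soft[50:, ::-1].copy()     # second cluster leans to class 1
soft /= soft.sum(axis=1, keepdims=True)

# Class centroids: prediction-weighted means of the features, shape (2, 4).
centroids = (soft.T @ feats) / soft.sum(axis=0)[:, None]

# Pseudo-label = index of the nearest centroid (Euclidean distance here).
dists = np.linalg.norm(feats[:, None, :] - centroids[None], axis=2)
pseudo = dists.argmin(axis=1)
```

Even though each individual soft prediction is noisy, the centroids average that noise out, so the nearest-centroid pseudo labels recover the cluster structure cleanly.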

Better Pseudo-label: Joint Domain-aware Label and Dual-classifier for Semi-supervised Domain Generalization [article]

Ruiqi Wang, Lei Qi, Yinghuan Shi, Yang Gao
2021 arXiv   pre-print
 ...  to produce high-quality pseudo-labels.  ...  Concretely, to predict accurate pseudo-labels under domain shift, a domain-aware pseudo-labeling module is developed.  ...  CONCLUSION In this paper, we address the problem of semi-supervised domain generalization via producing better pseudo-labels for unlabeled data.  ... 
arXiv:2110.04820v1 fatcat:32fww64jlbhb7ai4fjtes4u5yy
Showing results 1 — 15 out of 247 results