11 Hits in 4.1 sec

EnAET: A Self-Trained framework for Semi-Supervised and Supervised Learning with Ensemble Transformations [article]

Xiao Wang, Daisuke Kihara, Jiebo Luo, Guo-Jun Qi
2021 arXiv   pre-print
In this study, we propose a new EnAET framework to further improve existing semi-supervised methods with self-supervised information.  ...  We are the first to explore the role of self-supervised representations in semi-supervised learning under a rich family of transformations.  ...
arXiv:1911.09265v2 fatcat:33nf6pnwl5ak3apvofvloetzdu
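
The snippet above describes EnAET only at a high level. As a rough illustration (my own sketch, not the authors' code), the consistency part of such an approach could look like the block below; the transformation choices, model interface, and averaging are assumptions, and EnAET additionally trains decoders that predict the transformation parameters, which is omitted here.

```python
import torch
import torch.nn.functional as F
import torchvision.transforms as T

# Illustrative "ensemble" of spatial and non-spatial transformations (assumed choices).
transform_ensemble = [
    T.RandomAffine(degrees=15, translate=(0.1, 0.1)),   # spatial
    T.RandomResizedCrop(32, scale=(0.8, 1.0)),          # spatial
    T.ColorJitter(brightness=0.4, contrast=0.4),        # non-spatial
]

def ensemble_consistency_loss(model, x_unlabeled):
    """KL consistency between predictions on the original batch and each transformed view."""
    with torch.no_grad():
        p_orig = F.softmax(model(x_unlabeled), dim=1)
    loss = 0.0
    for transform in transform_ensemble:
        x_t = torch.stack([transform(img) for img in x_unlabeled])
        log_p_t = F.log_softmax(model(x_t), dim=1)
        loss = loss + F.kl_div(log_p_t, p_orig, reduction="batchmean")
    return loss / len(transform_ensemble)
```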

Art Style Classification with Self-Trained Ensemble of AutoEncoding Transformations [article]

Akshay Joshi, Ankit Agrawal, Sushmita Nair
2020 arXiv   pre-print
To achieve this, we train the EnAET semi-supervised learning model (Wang et al., 2019) with limited annotated data samples and supplement it with self-supervised representations learned from an ensemble  ...  In this paper, we investigate the use of deep self-supervised learning methods to solve the problem of recognizing complex artistic styles with high intra-class and low inter-class variation.  ...  The framework to recognize rare and exotic art styles is summarized as follows: • To train a semi-supervised model, an ensemble of both spatial and non-spatial transformations from both labeled and unlabeled  ... 
arXiv:2012.03377v1 fatcat:z7ilugtdfzgx5ckbab4fslpz4e

Pseudo-Representation Labeling Semi-Supervised Learning [article]

Song-Bo Yang, Tian-li Yu
2020 arXiv   pre-print
In addition, our framework is integrated with self-supervised representation learning such that the classifier gains benefits from representation learning of both labeled and unlabeled data.  ...  In recent years, semi-supervised learning (SSL) has shown tremendous success in leveraging unlabeled data to improve the performance of deep learning models, which significantly reduces the demand for  ...  This framework can be divided into two parts. First, it trains a classifier on unlabeled data with self-supervised learning. Next, it trains on the labeled data with supervised learning.  ...
arXiv:2006.00429v1 fatcat:j33w4oglznbifj4apntcmkarqm
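
A minimal sketch of that two-part recipe, under assumptions of mine rather than the paper's implementation (a rotation-prediction pretext task, an encoder exposing an `out_dim` attribute, Adam optimizers); the paper's additional pseudo-representation labeling step is not shown.

```python
import torch
import torch.nn as nn

def pretrain_self_supervised(encoder, unlabeled_loader, epochs=10, lr=1e-3):
    """Part 1: self-supervised pretraining on unlabeled data (rotation prediction as pretext)."""
    head = nn.Linear(encoder.out_dim, 4)  # predicts rotation in {0, 90, 180, 270} degrees
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=lr)
    for _ in range(epochs):
        for x, _ in unlabeled_loader:  # labels, if present, are ignored here
            k = torch.randint(0, 4, (x.size(0),))
            x_rot = torch.stack([torch.rot90(img, int(r), dims=(1, 2)) for img, r in zip(x, k)])
            loss = nn.functional.cross_entropy(head(encoder(x_rot)), k)
            opt.zero_grad(); loss.backward(); opt.step()
    return encoder

def finetune_supervised(encoder, classifier, labeled_loader, epochs=10, lr=1e-3):
    """Part 2: supervised training on the labeled data, on top of the pretrained encoder."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=lr)
    for _ in range(epochs):
        for x, y in labeled_loader:
            loss = nn.functional.cross_entropy(classifier(encoder(x)), y)
            opt.zero_grad(); loss.backward(); opt.step()
    return encoder, classifier
```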

A Survey on Deep Semi-supervised Learning [article]

Xiangli Yang, Zixing Song, Irwin King, Zenglin Xu
2021 arXiv   pre-print
Deep semi-supervised learning is a fast-growing field with a range of practical applications.  ...  This paper provides a comprehensive survey on both fundamentals and recent advances in deep semi-supervised learning methods from perspectives of model design and unsupervised loss functions.  ...  The core part of this framework is that EnAET integrates an ensemble of spatial and nonspatial transformations to self-train a good feature representation [238] .  ... 
arXiv:2103.00550v2 fatcat:lymncf5wavgkhaenbvqlyvhuaa

A survey on Semi-, Self- and Unsupervised Learning for Image Classification [article]

Lars Schmarje, Monty Santarossa, Simon-Martin Schröder, Reinhard Koch
2021 arXiv   pre-print
The degree of supervision needed to achieve results comparable to the usage of all labels is decreasing, and therefore methods need to be extended to settings with a variable number of classes.  ...  Therefore, it is common to incorporate unlabeled data into the training process to reach equal results with fewer labels.  ...  The architecture and loss are either copied from π-model [49] or Mean Teacher [48]. EnAET [73] combines the self-supervised pretext task AutoEncoding Transformations [74] with MixMatch [46].  ...
arXiv:2002.08721v5 fatcat:gy7w6nofjfer3fngr5eqp5ybva

DoubleMatch: Improving Semi-Supervised Learning with Self-Supervision [article]

Erik Wallin, Lennart Svensson, Fredrik Kahl, Lars Hammarstrand
2022 arXiv   pre-print
Following the success of supervised learning, semi-supervised learning (SSL) is now becoming increasingly popular.  ...  We propose a new SSL algorithm, DoubleMatch, which combines the pseudo-labeling technique with a self-supervised loss, enabling the model to utilize all unlabeled data in the training process.  ...  The experiments were enabled by resources provided by the Swedish National Infrastructure for Computing (SNIC) at Chalmers Centre for Computational Science and Engineering (C3SE), and National Supercomputer  ... 
arXiv:2205.05575v1 fatcat:bwxulmw7qfc2hmofbff4p6jpo4
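
A hedged sketch of how such a combined objective might look; the confidence threshold, loss weights, and the cosine-similarity form of the self-supervised term are assumptions of mine, not the official DoubleMatch loss.

```python
import torch
import torch.nn.functional as F

def doublematch_style_loss(logits_weak, logits_strong, feats_weak, feats_strong,
                           threshold=0.95, w_pseudo=1.0, w_self=0.5):
    """Pseudo-label loss on confident unlabeled samples plus a self-supervised
    feature-alignment term that uses *all* unlabeled samples."""
    probs_weak = F.softmax(logits_weak.detach(), dim=1)
    conf, pseudo = probs_weak.max(dim=1)
    mask = (conf >= threshold).float()  # only confidently predicted samples contribute
    loss_pseudo = (F.cross_entropy(logits_strong, pseudo, reduction="none") * mask).mean()

    # Self-supervised term: align strong-view features with (detached) weak-view features.
    loss_self = 1.0 - F.cosine_similarity(feats_strong, feats_weak.detach(), dim=1).mean()
    return w_pseudo * loss_pseudo + w_self * loss_self
```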

A Survey on Semi-, Self- and Unsupervised Learning for Image Classification

Lars Schmarje, Monty Santarossa, Simon-Martin Schröder, Reinhard Koch
2021 IEEE Access  
The degree of supervision needed to achieve results comparable to the usage of all labels is decreasing, and therefore methods need to be extended to settings with a variable number of classes.  ...  Therefore, it is common to incorporate unlabeled data into the training process to reach equal results with fewer labels.  ...  For details about the training and learning strategies (including self-supervised learning) see subsection II-A. Each method belongs to one training strategy and uses several common ideas.  ...
doi:10.1109/access.2021.3084358 fatcat:aiznkxq47rdspha7xnpj5iwfxe

Credal Self-Supervised Learning [article]

Julian Lienen, Eyke Hüllermeier
2021 arXiv   pre-print
Self-training is an effective approach to semi-supervised learning.  ...  Thanks to this increased expressiveness, the learner is able to represent uncertainty and a lack of knowledge in a more flexible and more faithful manner.  ...  Moreover, the authors gratefully acknowledge the funding of this project by computing time provided by the Paderborn Center for Parallel Computing (PC²) and the research group of Prof. Dr.  ...
arXiv:2106.11853v2 fatcat:la5h77fjh5fzfmdb4wqg5pkusm

Meta Pseudo Labels [article]

Hieu Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
2021 arXiv   pre-print
We present Meta Pseudo Labels, a semi-supervised learning method that achieves a new state-of-the-art top-1 accuracy of 90.2% on ImageNet, which is 1.6% better than the existing state-of-the-art.  ...  Like Pseudo Labels, Meta Pseudo Labels has a teacher network to generate pseudo labels on unlabeled data to teach a student network.  ...  We also thank David Berthelot, Nicholas Carlini, Sylvain Gelly, Geoff Hinton, Mohammad Norouzi, and Colin Raffel for their comments on earlier drafts of the paper, and others in the Google Brain Team for  ... 
arXiv:2003.10580v4 fatcat:ntmbnko62ragfes44vhiql3yka
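
Structurally, the teacher-student loop described in the abstract can be sketched as below. This is a deliberately simplified approximation (a scalar improvement signal in place of the paper's meta-gradient derivation; all names are mine), intended only to show the flow of one training step.

```python
import torch
import torch.nn.functional as F

def mpl_style_step(teacher, student, opt_t, opt_s, x_lab, y_lab, x_unlab):
    # 1) Teacher produces hard pseudo labels for the unlabeled batch.
    with torch.no_grad():
        pseudo = teacher(x_unlab).argmax(dim=1)

    # 2) Student loss on labeled data *before* its update (for the feedback signal).
    with torch.no_grad():
        loss_before = F.cross_entropy(student(x_lab), y_lab)

    # 3) Student trains on the pseudo-labeled batch.
    loss_student = F.cross_entropy(student(x_unlab), pseudo)
    opt_s.zero_grad(); loss_student.backward(); opt_s.step()

    # 4) Student loss on labeled data *after* the update; the improvement is the feedback.
    with torch.no_grad():
        loss_after = F.cross_entropy(student(x_lab), y_lab)
    feedback = (loss_before - loss_after).item()  # > 0 if the pseudo labels helped

    # 5) Teacher update: its own supervised loss plus a feedback-weighted pseudo-label term.
    loss_teacher = F.cross_entropy(teacher(x_lab), y_lab) \
                   + feedback * F.cross_entropy(teacher(x_unlab), pseudo)
    opt_t.zero_grad(); loss_teacher.backward(); opt_t.step()
```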

Semi-Supervised Learning of Visual Features by Non-Parametrically Predicting View Assignments with Support Samples [article]

Mahmoud Assran, Mathilde Caron, Ishan Misra, Piotr Bojanowski, Armand Joulin, Nicolas Ballas, Michael Rabbat
2021 arXiv   pre-print
Despite the simplicity of the approach, PAWS outperforms other semi-supervised methods across architectures, setting a new state-of-the-art for a ResNet-50 on ImageNet trained with either 10% or 1% of  ...  By non-parametrically incorporating labeled samples in this way, PAWS extends the distance-metric loss used in self-supervised methods such as BYOL and SwAV to the semi-supervised setting.  ...  It has been demonstrated that self-supervised pre-training produces image representations that can be leveraged effectively for semi-supervised learning [1] .  ... 
arXiv:2104.13963v3 fatcat:xui3kwibwre3xnu5eorcqdfoky
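
The non-parametric view-assignment step described above can be sketched as follows. This is a simplification with assumed names and temperature; the actual PAWS implementation also sharpens the target assignments and balances classes in the support set.

```python
import torch
import torch.nn.functional as F

def paws_style_assignment(z_view, z_support, y_support_onehot, tau=0.1):
    """Soft pseudo-label for each unlabeled view: a similarity-weighted average
    over the one-hot labels of the labeled support samples."""
    z_view = F.normalize(z_view, dim=1)        # (B, d) view embeddings
    z_support = F.normalize(z_support, dim=1)  # (S, d) labeled support embeddings
    weights = F.softmax(z_view @ z_support.t() / tau, dim=1)  # (B, S)
    return weights @ y_support_onehot          # (B, C) soft class assignments

def paws_style_loss(p_anchor, p_target):
    """Cross-entropy between the assignments of two views of the same image (target detached)."""
    return -(p_target.detach() * torch.log(p_anchor + 1e-8)).sum(dim=1).mean()
```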

Semi-supervised associative classification using ant colony optimization algorithm

Hamid Hussain Awan, Waseem Shahzad
2021 PeerJ Computer Science  
Semi-supervised learning solves the problem of labeling the unlabeled instances through heuristics. Self-training is one of the most widely-used comprehensible approaches for labeling data.  ...  self-training classification accuracy by exploiting the association among attribute values (terms) and between a set of terms and class labels of the labeled instances.  ...  Wang et al. (2021) presented an ensemble framework named Ensemble of Auto-Encoding Transformation (EnAET) for self-training of images.  ...
doi:10.7717/peerj-cs.676 pmid:34604517 pmcid:PMC8444075 fatcat:qg6mlexw4zcqxlskozeenysvsi
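
For context, a generic confidence-thresholded self-training loop is sketched below (the standard recipe the abstract refers to, written here with scikit-learn and assumed hyperparameters); the paper's own method instead labels instances via association rules mined with ant colony optimization.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, rounds=5, threshold=0.9):
    """Iteratively add confidently pseudo-labeled unlabeled samples to the training set."""
    X_train, y_train = X_lab.copy(), y_lab.copy()
    remaining = X_unlab.copy()
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    for _ in range(rounds):
        if len(remaining) == 0:
            break
        probs = clf.predict_proba(remaining)
        confident = probs.max(axis=1) >= threshold
        if not confident.any():
            break
        pseudo = clf.classes_[probs[confident].argmax(axis=1)]
        X_train = np.vstack([X_train, remaining[confident]])
        y_train = np.concatenate([y_train, pseudo])
        remaining = remaining[~confident]
        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # retrain on enlarged set
    return clf
```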