
All Labels Are Not Created Equal: Enhancing Semi-supervision via Label Grouping and Co-training [article]

Islam Nassar, Samitha Herath, Ehsan Abbasnejad, Wray Buntine, Gholamreza Haffari
2021 arXiv   pre-print
Pseudo-labeling is a key component in semi-supervised learning (SSL). It relies on iteratively using the model to generate artificial labels for the unlabeled data to train against.  ...  We propose SemCo, a method which leverages label semantics and co-training to address this problem.  ...  Background We are interested in a K-way semi-supervised image classification problem, where we train a model using batches of both labelled and unlabelled examples.  ... 
arXiv:2104.05248v1 fatcat:f3tpq5vnarcd5m2za7jgiptowu
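The SemCo entry above builds on pseudo-labeling: the model's own confident predictions on unlabeled data are fed back in as artificial training targets. A minimal, generic sketch of one such round is given below (Python with scikit-learn; the classifier choice and the 0.95 confidence threshold are illustrative assumptions, not part of SemCo):

```python
# Minimal pseudo-labeling round (illustrative sketch, not the SemCo algorithm).
# The scikit-learn classifier and the 0.95 confidence threshold are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label_round(X_lab, y_lab, X_unlab, threshold=0.95):
    """Fit on labeled data, then adopt confident predictions as artificial labels."""
    model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
    probs = model.predict_proba(X_unlab)
    confident = probs.max(axis=1) >= threshold          # keep only trusted predictions
    y_pseudo = model.classes_[probs[confident].argmax(axis=1)]
    X_aug = np.vstack([X_lab, X_unlab[confident]])      # enlarged training set
    y_aug = np.concatenate([y_lab, y_pseudo])
    return model, X_aug, y_aug, X_unlab[~confident]
```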

Words are not Equal: Graded Weighting Model for building Composite Document Vectors [article]

Pranjal Singh, Amitabha Mukerjee
2015 arXiv   pre-print
Since these are language-free models that can be obtained in an unsupervised manner, they are also of interest for under-resourced languages such as Hindi, among many others.  ...  Some of these methods (particularly tf-idf) are seen to result in a significant improvement in performance over prior state of the art.  ...  All the datasets, including our self-created Hindi dataset, are described below. We experimented on two Hindi review datasets.  ... 
arXiv:1512.03549v1 fatcat:5moc47i4ejcw7omu2cml44gubm

Chapter Three Equalities [chapter]

2016 Everyday Women's and Gender Studies  
By focusing on key concepts and not star theorists, ordinary life and not iconic examples, this book provides a welcome and innovative introduction to the twisted and all-too-ever-present ways in which  ...  She is the co-author of Troubling Women's Studies: Pasts, Presents, and Possibilities (Sumach Press, 2005) and co-editor (with Catherine M.  ...  , and two on her car; her husband has three flags on his van. Equality does not mean total equality; it only means that some of us are more equal than others!  ... 
doi:10.4324/9781315643205-10 fatcat:ikiniciqyfcc5oaslgu2ouwsru

Uncertainty-Aware Deep Co-training for Semi-supervised Medical Image Segmentation [article]

Xu Zheng, Chong Fu, Haoyu Xie, Jialei Chen, Xingwei Wang, Chiu-Wing Sham
2021 arXiv   pre-print
Existing semi-supervised approaches enhance the ability to extract features from unlabeled data with prior knowledge obtained from limited labeled data.  ...  Simultaneously, in the backward process, we combine unsupervised and supervised losses to accelerate the convergence of the network by enhancing the gradient flow between different tasks.  ... 
arXiv:2111.11629v2 fatcat:glk5vhrijvexxamaifao62smue

DCPE co-training for classification

Jin Xu, Haibo He, Hong Man
2012 Neurocomputing  
Co-training is a well-known semi-supervised learning technique that trains two base learners on the data source and uses the most confident unlabeled data to augment the labeled data during learning.  ...  The comparative studies with supervised learning methods and semi-supervised learning methods also demonstrate the effectiveness of the proposed approach.  ...  Acknowledgment: This work was supported in part by the Defense Advanced Research Projects Agency (DARPA) under Grants FA8650-11-1-7148 and FA8650-11-1-7152.  ... 
doi:10.1016/j.neucom.2012.01.006 fatcat:pldazxyfmnbtrlux4yu4tf44uy
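The snippet above describes the co-training idea: two base learners are trained on the labeled data and the most confident predictions on the unlabeled pool are used to grow the labeled set. A generic sketch follows (Python/scikit-learn; the base learners, the two feature views X1/X2, and the per-round budget are assumptions, and this is not the DCPE variant itself):

```python
# Generic co-training sketch (illustrative; not the DCPE variant described above).
# The two base learners, the view split (X1/X2), and the per-round budget are assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

def co_training(X1_lab, X2_lab, y_lab, X1_un, X2_un, rounds=5, per_round=10):
    for r in range(rounds + 1):
        h1 = GaussianNB().fit(X1_lab, y_lab)              # learner on view 1
        h2 = DecisionTreeClassifier().fit(X2_lab, y_lab)  # learner on view 2
        if r == rounds or len(X1_un) == 0:
            break
        p1 = h1.predict_proba(X1_un).max(axis=1)
        p2 = h2.predict_proba(X2_un).max(axis=1)
        pick1 = np.argsort(-p1)[:per_round]               # h1's most confident points
        pick2 = np.argsort(-p2)[:per_round]               # h2's most confident points
        pick = np.unique(np.concatenate([pick1, pick2]))
        # label each picked point with whichever learner is more confident on it
        y_new = np.where(p1[pick] >= p2[pick],
                         h1.predict(X1_un[pick]), h2.predict(X2_un[pick]))
        X1_lab = np.vstack([X1_lab, X1_un[pick]])         # augment the labeled data
        X2_lab = np.vstack([X2_lab, X2_un[pick]])
        y_lab = np.concatenate([y_lab, y_new])
        keep = np.ones(len(X1_un), dtype=bool)
        keep[pick] = False
        X1_un, X2_un = X1_un[keep], X2_un[keep]
    return h1, h2
```

The two-view split is the classic co-training assumption; when only a single feature set is available, the views are often formed by splitting or resampling the features.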

Pseudo-Labeling Optimization Based Ensemble Semi-Supervised Soft Sensor in the Process Industry

Youwei Li, Huaiping Jin, Shoulong Dong, Biao Yang, Xiangguang Chen
2021 Sensors  
Furthermore, a set of diverse semi-supervised NCLELM models (SSNCLELM) are developed from different enlarged labeled sets, which are obtained by combining the labeled and pseudo-labeled training data.  ...  supervised and semi-supervised soft sensor methods.  ...  , and co-training.  ... 
doi:10.3390/s21248471 pmid:34960564 pmcid:PMC8708742 fatcat:exo4otim3nb7tatuixdvhgdlry

Semi-supervised learning using multiple clusterings with limited labeled data

Germain Forestier, Cédric Wemmert
2016 Information Sciences  
We review and formalize eight semi-supervised learning algorithms and introduce a new method that combines supervised and unsupervised learning in order to use both labeled and unlabeled data.  ...  In this context, semi-supervised approaches have been proposed to leverage both labeled and unlabeled data. In this paper, we focus on cases where the number of labeled samples is very limited.  ...  Unlike co-training, Assemble [16] can build a semi-supervised ensemble of any size and does not require the domain to have multiple views.  ... 
doi:10.1016/j.ins.2016.04.040 fatcat:d2cyzx33sreuhneh2t3lmid2je
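The entry above combines clustering with a small amount of labeled data. A generic cluster-then-label sketch is shown below (this is not one of the authors' eight formalized algorithms nor their new method; k-means, the number of clusters, and integer class labels are assumptions):

```python
# Generic cluster-then-label sketch: cluster labeled + unlabeled points together,
# then give each cluster the majority label of its labeled members.
# Assumes integer class labels; k is an illustrative hyperparameter.
import numpy as np
from sklearn.cluster import KMeans

def cluster_then_label(X_lab, y_lab, X_unlab, k=10, seed=0):
    X_all = np.vstack([X_lab, X_unlab])
    assign = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X_all)
    lab_assign, unlab_assign = assign[:len(X_lab)], assign[len(X_lab):]
    y_unlab = np.full(len(X_unlab), -1)            # -1 = cluster had no labeled member
    for c in range(k):
        members = y_lab[lab_assign == c]
        if len(members) > 0:
            values, counts = np.unique(members, return_counts=True)
            y_unlab[unlab_assign == c] = values[counts.argmax()]   # majority label
    return y_unlab
```

Using several clusterings (different k or different algorithms) and combining their label assignments is the natural extension when, as in the entry above, a single clustering is unreliable.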

A Weighted Voting Ensemble Self-Labeled Algorithm for the Detection of Lung Abnormalities from X-Rays

Ioannis Livieris, Andreas Kanavos, Vassilis Tampakas, Panagiotis Pintelas
2019 Algorithms  
Machine learning methods such as semi-supervised learning algorithms have been proposed as a new direction to address the shortage of available labeled data by exploiting the explicit classification  ...  Advances in digital chest radiography have enabled research and medical centers to accumulate large repositories of classified (labeled) images and, mostly, of unclassified (unlabeled) images from human  ...  Generally, self-labeled algorithms can be classified into two main groups: self-training and co-training.  ... 
doi:10.3390/a12030064 fatcat:2lrvdkbnafd6vcveoqbqoaxkta

Leveraging Unlabeled Data for Emotion Recognition With Enhanced Collaborative Semi-Supervised Learning

Zixing Zhang, Jing Han, Jun Deng, Xinzhou Xu, Fabien Ringeval, Bjorn Schuller
2018 IEEE Access  
Motivated by this concern, this article seeks to fully exploit unlabelled data, which are pervasively available in the real world and easy to collect, by means of novel Semi-Supervised Learning (SSL  ...  This strategy is supposed not only to improve the performance of the model for data annotation, and consequently enhance the trustworthiness of the automatically labelled data, but also to elevate the diversity  ...  Self-Training and Co-Training: As mentioned in Section I, self-training and co-training are the two widely used inductive SSL approaches for emotion recognition.  ... 
doi:10.1109/access.2018.2821192 fatcat:fie6so25qrgp3goe5nlls64xhq

An empirical study of self-training and data balancing techniques for splice site prediction

Ana Stanescu, Doina Caragea
2017 International Journal of Bioinformatics Research and Applications  
However, semi-supervised learning has not been studied much for problems with highly skewed class distributions, which are prevalent in bioinformatics.  ...  Our results show that under certain conditions semi-supervised learning algorithms are a better choice than purely supervised classification algorithms.  ...  In under-sampling (of the majority class), we keep all positive instances and randomly pick an equal number of negative instances in order to create a balanced labelled training set for learning the self-training  ... 
doi:10.1504/ijbra.2017.082055 fatcat:lnexxy66brcnhn3azt6o72fwiy
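The last fragment above describes the data-balancing step used before self-training: keep all positives and randomly draw an equal number of negatives. A small sketch of that under-sampling step (binary labels with 1 = positive splice site and 0 = negative are assumed for illustration):

```python
# Under-sampling of the majority (negative) class: keep every positive instance
# and draw an equal number of negatives at random. Assumes binary labels
# (1 = positive, 0 = negative) and that negatives outnumber positives.
import numpy as np

def undersample_majority(X, y, seed=0):
    rng = np.random.default_rng(seed)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    neg_sample = rng.choice(neg, size=len(pos), replace=False)  # match the positive count
    idx = rng.permutation(np.concatenate([pos, neg_sample]))    # shuffle the balanced set
    return X[idx], y[idx]
```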

Going Deeper into Semi-supervised Person Re-identification [article]

Olga Moskvyak, Frederic Maire, Feras Dayoub, Mahsa Baktashmotlagh
2021 arXiv   pre-print
To reduce the need for labeled data, we focus on a semi-supervised approach that requires only a subset of the training data to be labeled.  ...  We also propose a PartMixUp loss that improves the discriminative ability of learned part-based features for pseudo-labeling in semi-supervised settings.  ...  Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia.  ... 
arXiv:2107.11566v1 fatcat:kpbuly4pzrgvblouuifuq6eoem

Iterative, Deep Synthetic Aperture Sonar Image Segmentation [article]

Yung-Chen Sun, Isaac D. Gerg, Vishal Monga
2022 arXiv   pre-print
Finally, we also develop a semi-supervised (SS) extension of IDUS called IDSS and demonstrate experimentally that it can further enhance performance while outperforming supervised alternatives that exploit  ...  the same labeled training imagery.  ...  Section IV-F presents the semi-supervised training results of IDSS when only some pixel-level labels are present.  ... 
arXiv:2203.15082v1 fatcat:z64j3jdkabarpas5khh4xurz4q

Semi-supervised GANs to Infer Travel Modes in GPS Trajectories [article]

Ali Yazdizadeh and Zachary Patterson and Bilal Farooq
2021 arXiv   pre-print
Semi-supervised Generative Adversarial Networks (GANs) are developed in the context of travel mode inference with uni-dimensional smartphone trajectory data.  ...  The best semi-supervised GAN model led to a prediction accuracy of 83.4%, while the best CNN model achieved a prediction accuracy of 81.3%.  ...  Future work will explore better-performing models with more channels and/or improved architectures.  ... 
arXiv:1902.10768v2 fatcat:zt25bsihezerdiwaz2xo7h3tma

A new ensemble self-labeled semi-supervised algorithm

Ioannis E. Livieris
2019 Informatica (Ljubljana, Tiskana izd.)  
The reported numerical results illustrate the efficacy of the proposed algorithm, which outperforms classical semi-supervised algorithms in terms of classification accuracy, leading to more efficient and robust  ...  In this work, a new ensemble-based semi-supervised algorithm is proposed which is based on a maximum-probability voting scheme.  ...  In the literature, self-labeled methods are divided into self-training [41] and co-training [4].  ... 
doi:10.31449/inf.v43i2.2217 fatcat:skxctlhzwjhzrooprj7qxxnaxu
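The entry above is built around a maximum-probability voting scheme over self-labeled base learners. A generic soft-voting sketch is given below (the three base classifiers are assumptions, and this is plain supervised probability voting, not the full self-labeled ensemble of the paper):

```python
# Generic probability-voting sketch: each base classifier predicts class
# probabilities and the class with the highest summed probability wins.
# The choice of base classifiers is an assumption, not the paper's ensemble.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

def probability_vote(X_train, y_train, X_test):
    bases = [LogisticRegression(max_iter=1000), GaussianNB(), DecisionTreeClassifier()]
    fitted = [clf.fit(X_train, y_train) for clf in bases]
    total = sum(clf.predict_proba(X_test) for clf in fitted)   # soft vote across the ensemble
    return fitted[0].classes_[total.argmax(axis=1)]            # class with the largest total
```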

Applying Efficient Selection Techniques of Unlabelled Instances for Wrapper-based Semi-supervised Methods

Cephas A. S. Barreto, Arthur Costa Gorgonio, Joao C. Xavier-Junior, Anne Magaly De Paula Canuto
2022 IEEE Access  
Semi-supervised learning (SSL) is a machine learning approach that integrates supervised and unsupervised learning mechanisms.  ...  Then, the created model is used in a labelling process, where some unlabelled instances are labelled and, consequently, incorporated into the labelled set.  ...  Additionally, Self-training and Co-training are two well-known semi-supervised methods that belong to the wrapper-based SSL sub-class.  ... 
doi:10.1109/access.2022.3169498 fatcat:h7fd3rfuy5b2hjb3kfb36ydema
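The snippet above describes the wrapper-based labelling loop: a model trained on the labelled set labels some unlabelled instances, which are then incorporated into the labelled set before retraining. A sketch of that loop follows; it simply iterates the pseudo-labeling round sketched earlier for the SemCo entry (the base learner, confidence threshold and iteration cap are assumptions):

```python
# Wrapper-style self-training loop, as described in the snippet above
# (illustrative; the base learner, threshold and iteration cap are assumptions).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def self_training_wrapper(X_lab, y_lab, X_unlab, threshold=0.9, max_iter=10):
    model = RandomForestClassifier(random_state=0).fit(X_lab, y_lab)
    for _ in range(max_iter):
        if len(X_unlab) == 0:
            break
        probs = model.predict_proba(X_unlab)
        confident = probs.max(axis=1) >= threshold                 # instances the model trusts
        if not confident.any():
            break
        # incorporate the newly labelled instances into the labelled set and retrain
        X_lab = np.vstack([X_lab, X_unlab[confident]])
        y_lab = np.concatenate([y_lab, model.classes_[probs[confident].argmax(axis=1)]])
        X_unlab = X_unlab[~confident]
        model = RandomForestClassifier(random_state=0).fit(X_lab, y_lab)
    return model
```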