14,618 Hits in 7.9 sec

On the Minimal Supervision for Training Any Binary Classifier from Only Unlabeled Data [article]

Nan Lu, Gang Niu, Aditya Krishna Menon, Masashi Sugiyama
2019 arXiv   pre-print
These two facts answer a fundamental question---what the minimal supervision is for training any binary classifier from only U data.  ...  In this paper, we study training arbitrary (from linear to deep) binary classifiers from only unlabeled (U) data by empirical risk minimization (ERM).  ...  We thank all anonymous reviewers for their helpful and constructive comments on the clarity of two earlier versions of this manuscript.  ...
arXiv:1808.10585v4 fatcat:pofsmbwbdnelrmo2p56lmgyu3a
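
The minimal-supervision result above hinges on a risk-rewrite trick. As a hedged sketch in generic UU-learning notation (not necessarily the paper's exact formulation): given two U sets with densities p and p' whose known class priors \theta \neq \theta' differ, and a test positive prior \pi, the classification risk of g under a loss \ell can be expressed through U expectations alone:

R(g) = \frac{1}{\theta-\theta'}\Big(\mathbb{E}_{x\sim p}\big[\pi(1-\theta')\,\ell(g(x),+1)-(1-\pi)\theta'\,\ell(g(x),-1)\big] + \mathbb{E}_{x\sim p'}\big[(1-\pi)\theta\,\ell(g(x),-1)-\pi(1-\theta)\,\ell(g(x),+1)\big]\Big)

Replacing the two expectations with sample averages over the two U sets gives an unbiased estimator that any ERM learner, linear or deep, can minimize directly.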

Mutual Exclusivity Loss for Semi-Supervised Deep Learning [article]

Mehdi Sajjadi, Mehran Javanmardi, Tolga Tasdizen
2016 arXiv   pre-print
Semi-supervised learning is motivated by the observation that unlabeled data is cheap and can be used to improve the accuracy of classifiers.  ...  to lie on the low-density space between the manifolds corresponding to different classes of data.  ...  During training, the update for the model parameters consists of two parts. The first part is based on labeled data and the second part comes from unlabeled data.  ...
arXiv:1606.03141v1 fatcat:fs6t4kizkvg3hhnhn5glzcrmci
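
For intuition, a mutual-exclusivity penalty on a C-dimensional prediction vector f(x) \in [0,1]^C is commonly written as (a hedged reconstruction, not necessarily the paper's exact loss):

L_{ME}(x) = -\sum_{j=1}^{C} f_j(x) \prod_{k\neq j}\big(1-f_k(x)\big)

The loss attains its minimum exactly when one component is 1 and the rest are 0, so on unlabeled samples it pushes the decision boundary into low-density regions between class manifolds.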

A Survey On Semi-Supervised Learning Techniques

V. Jothi Prakash, Dr. L.M. Nithya
2014 International Journal of Computer Trends and Technology  
There has been a large spectrum of ideas on semi-supervised learning. In this paper we bring out some of the key approaches to semi-supervised learning.  ...  Semi-supervised learning is a learning paradigm which deals with the study of how computers and natural systems such as human beings acquire knowledge in the presence of both labeled and unlabeled data.  ...  One common approach is supervised learning. In supervised learning the training dataset comprises only labeled data.  ...
doi:10.14445/22312803/ijctt-v8p105 fatcat:6ai7xan6cngpjk2g2adntninoa

Spatially Adapted Manifold Learning for Classification of Hyperspectral Imagery with Insufficient Labeled Data

Wonkook Kim, Melba M. Crawford, Joydeep Ghosh
2008 IGARSS 2008 - 2008 IEEE International Geoscience and Remote Sensing Symposium  
A classifier derived from labeled samples acquired over an extended area may not perform well for a specific sub-region if the spectral signatures of classes vary across the image.  ...  This problem is addressed using semi-supervised learning and manifold learning, which both exploit the information provided by unlabeled samples in the image.  ...  In the case of the OAA strategy, any unlabeled samples can be used for training the N class classifiers since the output space of the classifier exhausts all the classes.  ...
doi:10.1109/igarss.2008.4778831 dblp:conf/igarss/KimCG08 fatcat:c4bazko4nfaphobylcs4tu2zbm

Generalized Product Quantization Network for Semi-supervised Image Retrieval [article]

Young Kyun Jang, Nam Ik Cho
2020 arXiv   pre-print
We design a novel metric learning strategy that preserves semantic similarity between labeled data, and employ an entropy regularization term to fully exploit the inherent potential of unlabeled data.  ...  Extensive experimental results demonstrate that GPQ yields state-of-the-art performance on large-scale real image benchmark datasets.  ...  L_{N-PQ} and L_{cls} train the network to minimize errors using the labeled data, while L_{SEM} trains the network to simultaneously maximize and minimize entropy using the unlabeled data.  ...
arXiv:2002.11281v3 fatcat:v27lks4kvffbrkviy7ubveuebi
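
The "simultaneously maximize and minimize entropy" phrasing matches a familiar entropy mini-max pattern: make each unlabeled sample's soft assignment confident while keeping the batch-average assignment spread out. A minimal NumPy sketch of that pattern (names are illustrative, not GPQ's actual code):

import numpy as np

def entropy(p, eps=1e-12):
    # Shannon entropy, row-wise for a matrix of probability vectors.
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def entropy_minimax_reg(probs_unlabeled):
    # probs_unlabeled: (N, C) soft assignments for unlabeled samples.
    per_sample = entropy(probs_unlabeled).mean()       # minimized: confident samples
    marginal = entropy(probs_unlabeled.mean(axis=0))   # maximized: balanced usage
    return per_sample - marginal                       # add to the supervised loss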

Regularized Boost for Semi-supervised Ranking [chapter]

Zhigao Miao, Juan Wang, Aimin Zhou, Ke Tang
2015 Proceedings in Adaptation, Learning and Optimization  
Semi-supervised inductive learning concerns how to learn a decision rule from a data set containing both labeled and unlabeled data.  ...  In this paper, we introduce a local smoothness regularizer to semi-supervised boosting algorithms based on the universal optimization framework of margin cost functionals.  ...  Conclusions: We have proposed a local smoothness regularizer for semi-supervised boosting and demonstrated its effectiveness on different types of data sets.  ...
doi:10.1007/978-3-319-13359-1_49 fatcat:4mafb7jwnzcbtoz4an5hrytkdq
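
A local smoothness regularizer of the kind the abstract describes typically has the graph-Laplacian form (schematic; W_{ij} is a similarity weight over labeled and unlabeled points):

\Omega(F) = \sum_{i,j} W_{ij}\,\big(F(x_i)-F(x_j)\big)^2

which is added with a trade-off weight \lambda to the margin cost functional \sum_{i\in L} c\big(y_i F(x_i)\big) that the boosting procedure minimizes stagewise.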

Rademacher Complexity Bounds for a Penalized Multi-class Semi-supervised Algorithm (Extended Abstract)

Yury Maximov, Massih-Reza Amini, Zaid Harchaoui
2018 Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence  
We propose Rademacher complexity bounds for multi-class classifiers trained with a two-step semi-supervised model.  ...  In the second step, a classifier is trained by minimizing a margin empirical loss over the labeled training set and a penalization term measuring the inability of the learner to predict the k predominant  ...  Acknowledgments This work has been partially supported by the THANATOS project funded by Appel à projets Grenoble Innovation Recherche. The work of YM at LANL was supported by funding from the U.S.  ...
doi:10.24963/ijcai.2018/800 dblp:conf/ijcai/MaximovAH18 fatcat:ey5qlhwnsfafrfyxqy447wu4o4
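
Schematically, the two-step model can be read as: first partition the unlabeled set U into clusters, then solve (notation illustrative, not the paper's):

\min_{h}\ \frac{1}{l}\sum_{i=1}^{l} \ell_\gamma\big(h(x_i),y_i\big) + \lambda\,\Pi(h;\mathcal{U})

where \ell_\gamma is a margin loss over the l labeled examples and \Pi penalizes the learner's failure to predict the k predominant (pseudo-)labels within each cluster of U.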

A high-performance semi-supervised learning method for text chunking

Rie Kubota Ando, Tong Zhang
2005 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics - ACL '05  
The idea is to find "what good classifiers are like" by learning from thousands of automatically generated auxiliary classification problems on unlabeled data.  ...  In machine learning, whether one can build a more accurate classifier by using unlabeled data (semi-supervised learning) is an important issue.  ...  Acknowledgments Part of the work was supported by ARDA under the NIMD program PNWD-SW-6059.  ... 
doi:10.3115/1219840.1219841 dblp:conf/acl/AndoZ05 fatcat:b222o7nponbn3dqifbdci5ng2i
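
The "thousands of auxiliary problems" idea can be sketched as follows: train cheap linear predictors for automatically generated auxiliary tasks on unlabeled text (e.g., predict a masked word from its context), then extract the subspace those predictors share. A simplified NumPy sketch of the structure-learning step (names are illustrative):

import numpy as np

def shared_structure(aux_weight_vectors, h=50):
    # Stack the m linear predictors learned on auxiliary problems into
    # a (d, m) matrix and keep the top-h left singular vectors as the
    # shared predictive subspace.
    W = np.stack(aux_weight_vectors, axis=1)
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    return U[:, :h]   # projection used to augment features

The target chunking classifier is then trained on the original features concatenated with their projection onto this shared subspace.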

SemiBoost: Boosting for Semi-Supervised Learning

P.K. Mallapragada, Rong Jin, A.K. Jain, Yi Liu
2009 IEEE Transactions on Pattern Analysis and Machine Intelligence  
Most previous studies have focused on designing special algorithms to effectively exploit the unlabeled data in conjunction with labeled data.  ...  The key advantages of the proposed semi-supervised learning approach are: (a) performance improvement of any supervised learning algorithm with a multitude of unlabeled data, (b) efficient computation  ...  ACKNOWLEDGEMENTS We thank the anonymous reviewers for their valuable comments. The research was partially supported by ONR grant no. N000140710225 and NSF grant no. IIS-0643494.  ... 
doi:10.1109/tpami.2008.235 pmid:19762927 fatcat:fa2ip5ylx5hotnkp2mlcv22b5y
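
A rough sketch of the boosting loop the abstract describes (schematic and heavily simplified; SemiBoost derives its confidences and ensemble weights from a bound on a pairwise-similarity objective):

import numpy as np

def semiboost_like(X_l, y_l, X_u, S, fit_weak, T=10, top_frac=0.1):
    # S: (n_u, n_l) similarity between unlabeled and labeled points.
    H = np.zeros(len(X_u))                       # ensemble scores on unlabeled data
    models = []
    for t in range(T):
        conf = S @ y_l + H                       # similarity-weighted label vote
        pseudo = np.where(conf >= 0, 1.0, -1.0)  # pseudo-labels in {-1, +1}
        top = np.argsort(-np.abs(conf))[: max(1, int(top_frac * len(X_u)))]
        clf = fit_weak(np.vstack([X_l, X_u[top]]),
                       np.concatenate([y_l, pseudo[top]]))
        H += clf.predict(X_u)                    # unit ensemble weight, for brevity
        models.append(clf)
    return models

Any supervised base learner with fit/predict semantics can play the role of fit_weak, which is the "improve any supervised learning algorithm" point made in the abstract.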

Generalized Product Quantization Network for Semi-Supervised Image Retrieval

Young Kyun Jang, Nam Ik Cho
2020 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
We design a novel metric learning strategy that preserves semantic similarity between labeled data, and employ an entropy regularization term to fully exploit the inherent potential of unlabeled data.  ...  Extensive experimental results demonstrate that GPQ yields state-of-the-art performance on large-scale real image benchmark datasets.  ...  L_{N-PQ} and L_{cls} train the network to minimize errors using the labeled data, while L_{SEM} trains the network to simultaneously maximize and minimize entropy using the unlabeled data.  ...
doi:10.1109/cvpr42600.2020.00348 dblp:conf/cvpr/JangC20 fatcat:c2sbj5vevngzfhfe5h6trnqrhe

Learning From Positive and Unlabeled Data: A Survey [article]

Jessa Bekker, Jesse Davis
2018 arXiv   pre-print
Learning from positive and unlabeled data or PU learning is the setting where a learner only has access to positive examples and unlabeled data.  ...  The assumption is that the unlabeled data can contain both positive and negative examples.  ...  It fits within the long-standing interest in developing learning algorithms that do not require fully supervised data, such as learning from positive-only or one-class data [46] and semi-supervised learning  ...
arXiv:1811.04820v1 fatcat:2qi5g4xke5fljpw2iz7nhudc5a
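
The setting this survey covers admits a standard unbiased risk estimator: with class prior \pi = p(y=+1), positive-class density p_+ and unlabeled (marginal) density p, the risk of a classifier g under loss \ell rewrites as

R(g) = \pi\,\mathbb{E}_{x\sim p_+}\big[\ell(g(x),+1)-\ell(g(x),-1)\big] + \mathbb{E}_{x\sim p}\big[\ell(g(x),-1)\big]

so it can be estimated from positive and unlabeled samples alone, provided the class prior \pi is known or estimated.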

Regularized multi-class semi-supervised boosting

Amir Saffari, Christian Leistner, Horst Bischof
2009 2009 IEEE Conference on Computer Vision and Pattern Recognition  
Many semi-supervised learning algorithms only deal with binary classification. Their extension to the multi-class problem is usually obtained by repeatedly solving a set of binary problems.  ...  In particular, we introduce a new multi-class margin-maximizing loss function for the unlabeled data and use the generalized expectation regularization for incorporating cluster priors into the model.  ...  labeled data and then they label unlabeled data for re-training of the other one.  ... 
doi:10.1109/cvpr.2009.5206715 dblp:conf/cvpr/SaffariLB09 fatcat:6l6pwznyanf4locu6he3vycgnu
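
Schematically, the objective combines a margin cost on labeled data, a margin-maximizing term on unlabeled data, and a prior-matching regularizer (illustrative form, not the paper's exact notation):

L(F) = \sum_{i\in L} \ell\big(m_{y_i}(x_i)\big) + \lambda_u \sum_{j\in U} \ell\big(\max_k m_k(x_j)\big) + \lambda_p\, D\Big(\hat{p}\ \Big\|\ \tfrac{1}{|U|}\sum_{j\in U} p_F(\cdot\mid x_j)\Big)

where m_k is the multi-class margin for class k, \hat{p} a cluster prior, and D a divergence; the last term plays the role of the generalized expectation regularization mentioned in the abstract.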

Regularized multi-class semi-supervised boosting

A. Saffari, C. Leistner, H. Bischof
2009 2009 IEEE Conference on Computer Vision and Pattern Recognition  
Many semi-supervised learning algorithms only deal with binary classification. Their extension to the multi-class problem is usually obtained by repeatedly solving a set of binary problems.  ...  In particular, we introduce a new multi-class margin-maximizing loss function for the unlabeled data and use the generalized expectation regularization for incorporating cluster priors into the model.  ...  labeled data and then they label unlabeled data for re-training of the other one.  ... 
doi:10.1109/cvprw.2009.5206715 fatcat:itmyu6go2jcxrnwu6isgllnjoq

An AdaBoost Algorithm for Multiclass Semi-supervised Learning

Jafar Tanha, Maarten van Someren, Hamideh Afsarmanesh
2012 2012 IEEE 12th International Conference on Data Mining  
We present an algorithm for multiclass semi-supervised learning, that is, learning from a limited amount of labeled data and plenty of unlabeled data.  ...  The algorithm is based on a novel multiclass loss function consisting of the margin cost on labeled data and two regularization terms on labeled and unlabeled data.  ...  For a multiclass semi-supervised learning problem, the only available algorithms, besides supervised learning on the labeled data only, are currently those based on pseudo-margin or the use of a binary method  ...
doi:10.1109/icdm.2012.119 dblp:conf/icdm/TanhaSA12 fatcat:hrwoirfupngq3jzaagohauvisy

Learning from Positive and Unlabeled Data with Augmented Classes [article]

Zhongnian Li, Liutao Yang, Zhongchen Ma, Tongfeng Sun, Xinzheng Xu, Daoqiang Zhang
2022 arXiv   pre-print
Positive Unlabeled (PU) learning aims to learn a binary classifier from only positive and unlabeled data, a setting that arises in many real-world scenarios.  ...  In this paper, we propose an unbiased risk estimator for PU learning with Augmented Classes (PUAC) by utilizing unlabeled data from the augmented-classes distribution, which can be easily collected in  ...  The goal of PU learning is to train a binary classifier by using only positive and unlabeled data without the assistance of negative labels, which can be very costly to obtain in some tasks.  ...
arXiv:2207.13274v1 fatcat:vvbguwycubdopeul2pu3rn22k4
Showing results 1 — 15 out of 14,618 results