370 Hits in 6.2 sec

A Robust AUC Maximization Framework with Simultaneous Outlier Detection and Feature Selection for Positive-Unlabeled Classification [article]

Ke Ren, Haichuan Yang, Yu Zhao, Mingshan Xue, Hongyu Miao, Shuai Huang, Ji Liu
2018 arXiv   pre-print
To address these three issues jointly in PU classification, we propose a robust learning framework that unifies AUC (area under the curve) maximization (a robust metric for biased labels), outlier detection (for excluding wrong labels), and feature selection  ...  "positive" samples together with a large volume of "unlabeled" samples that may contain both positive and negative samples.  ...
arXiv:1803.06604v1 fatcat:cpsmd2pgh5fntilfwyx4xoj43i
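
The AUC-maximization component described in the snippet above can be illustrated with a simple pairwise surrogate that ranks labeled positives above unlabeled samples. The NumPy sketch below shows only that component under an assumed squared-hinge surrogate; the function names, toy data, and step size are illustrative, and the paper's full framework additionally performs outlier detection and feature selection, which are not reproduced here.

    import numpy as np

    def pairwise_auc_loss(w, X_pos, X_unl):
        # Squared-hinge surrogate for AUC on PU data: penalize every
        # positive/unlabeled pair in which the unlabeled sample scores
        # within a unit margin of (or above) the labeled positive.
        margins = (X_pos @ w)[:, None] - (X_unl @ w)[None, :]
        return np.mean(np.maximum(0.0, 1.0 - margins) ** 2)

    def pairwise_auc_grad(w, X_pos, X_unl):
        # Analytic gradient of the surrogate above.
        margins = (X_pos @ w)[:, None] - (X_unl @ w)[None, :]
        slack = np.maximum(0.0, 1.0 - margins)             # (n_pos, n_unl)
        diff = X_pos[:, None, :] - X_unl[None, :, :]       # pairwise feature differences
        return -2.0 * np.mean(slack[:, :, None] * diff, axis=(0, 1))

    rng = np.random.default_rng(0)
    X_pos = rng.normal(1.0, 1.0, size=(20, 5))             # toy labeled positives
    X_unl = rng.normal(0.0, 1.0, size=(200, 5))            # toy unlabeled mixture
    w = np.zeros(5)
    for _ in range(200):                                   # plain gradient descent
        w -= 0.1 * pairwise_auc_grad(w, X_pos, X_unl)
    print(pairwise_auc_loss(w, X_pos, X_unl))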

Semi-Supervised Discriminative Classification Robust to Sample-Outliers and Feature-Noises

Ehsan Adeli, Kim-Han Thung, Le An, Guorong Wu, Feng Shi, Tao Wang, Dinggang Shen
2018 IEEE Transactions on Pattern Analysis and Machine Intelligence  
In this paper, we propose a semi-supervised robust discriminative classification method based on the least-squares formulation of linear discriminant analysis to detect sample-outliers and feature-noises simultaneously, using both labeled training and unlabeled testing data.  ...  In contrast, our method constructs the sample manifold using all labeled and unlabeled data to denoise the features and also selects the best features for classification, with a classification loss robust  ...
doi:10.1109/tpami.2018.2794470 pmid:29994560 pmcid:PMC6050136 fatcat:vcztnlvzgfatta2uzikafhn2ii
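
The least-squares formulation of LDA that the snippet above starts from is equivalent to a ridge regression onto class-indicator targets. The sketch below shows only that baseline; the paper's method additionally decomposes the data into sample-outlier and feature-noise components and adds a manifold regularizer over labeled and unlabeled data, none of which is reproduced here, and the regularization weight is an arbitrary placeholder.

    import numpy as np

    def ls_lda_fit(X, y, lam=1e-2):
        # Least-squares LDA baseline: regress one-hot class indicators on the
        # features with a small ridge term for numerical stability.
        classes = np.unique(y)
        Y = (y[:, None] == classes[None, :]).astype(float)
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])       # append bias column
        W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ Y)
        return W, classes

    def ls_lda_predict(W, classes, X):
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        return classes[np.argmax(Xb @ W, axis=1)]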

Hyperspectral Target Detection with an Auxiliary Generative Adversarial Network

Yanlong Gao, Yan Feng, Xumin Yu
2021 Remote Sensing  
Compared to the training sets in classification tasks, the training sets for the target detection of hyperspectral images may include only a few target spectra, which are limited and precious.  ...  However, the DNN framework usually requires a large number of samples.  ...  for target detection (STD) [7], and combined sparse and collaborative representation for target detection (CSCR) [29]) and a neural network-based approach (i.e., semi-supervised classification based  ...
doi:10.3390/rs13214454 fatcat:kuxgh34ecjf6fjd2zjgi455ixm

Active and Semi-supervised Data Domain Description [chapter]

Nico Görnitz, Marius Kloft, Ulf Brefeld
2009 Lecture Notes in Computer Science  
For instance, the support vector domain description (SVDD) learns a hypersphere enclosing the bulk of provided unlabeled data such that points lying outside of the ball are considered anomalous.  ...  In this paper, we rephrase data domain description as a semi-supervised learning task, that is, we propose a semi-supervised generalization of data domain description (SSSVDD) to process unlabeled and  ...
doi:10.1007/978-3-642-04180-8_44 fatcat:x6zxaik55nac5ntshfcd4hvkla
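
The SVDD mentioned above fits a minimum-volume hypersphere around the unlabeled data; with an RBF kernel it is equivalent to the one-class SVM, so scikit-learn's OneClassSVM can stand in for the purely unsupervised baseline (the paper's SSSVDD further incorporates labeled examples, which this sketch does not). The toy data and parameter values are illustrative.

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_train = rng.normal(0.0, 1.0, size=(500, 2))               # bulk of normal data
    X_test = np.vstack([rng.normal(0.0, 1.0, size=(10, 2)),
                        rng.normal(6.0, 0.5, size=(5, 2))])      # plus a few outliers

    # nu upper-bounds the fraction of training points allowed outside the boundary
    oc = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_train)
    print(oc.predict(X_test))   # +1 = inside the learned description, -1 = anomalous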

Anomaly Detection on Data Streams for Smart Agriculture

Juliet Chebet Moso, Stéphane Cormier, Cyril de Runz, Hacène Fouchal, John Mwangi Wandeto
2021 Agriculture  
This paper proposes an adaptation of an ensemble anomaly detector called enhanced locally selective combination in parallel outlier ensembles (ELSCP).  ...  Smart agriculture technologies are effective instruments for increasing farm sustainability and production.  ...  well for a wide range of classification tasks.  ... 
doi:10.3390/agriculture11111083 fatcat:zrjl5ptp35apfnpefikcpl4xq4

Deep Learning-Based Cancer Classification for Microarray Data: A Systematic Review

Nashat Alrefai, Othman Ibrahim
2021 Zenodo  
Deep neural networks are robust techniques that have recently been used extensively for building cancer classification models from different types of data.  ...  As a result, the CNN is considered the most common neural network architecture used in the medical field due to its robustness and high performance in cancer classification.  ...  The biggest advantage of deep learning is the simultaneous training of feature-related tasks such as selection, extraction, reduction, and classification.  ...
doi:10.5281/zenodo.6126510 fatcat:vmqa4zuoqrdsxdq7rsflgh362y

SimSearch: A Human-in-the-Loop Learning Framework for Fast Detection of Regions of Interest in Microscopy Images [article]

Ankit Gupta, Alan Sabirsh, Carolina Wahlby, Ida-Maria Sintorn
2022 bioRxiv   pre-print
Here we present SimSearch, a framework for quick and easy user-guided training of a deep neural model aimed at fast detection of ROIs in large-scale microscopy experiments.  ...  This is followed by feature extraction using a pre-trained deep-learning model, and interactive patch selection pruning, resulting in a smaller set of clean (user-approved) and a larger set of noisy (unapproved  ...  Supervised Refinement: For supervised contrastive training, the model is then trained with a batch size of 32 for 50 epochs with the Adam optimizer  ...
doi:10.1101/2022.04.05.487117 fatcat:rtm2endouneezjnul7euabtb4u

Self-Supervised Anomaly Detection: A Survey and Outlook [article]

Hadi Hojjati, Thi Kieu Khanh Ho, Narges Armanfard
2022 arXiv   pre-print
Over the past few years, anomaly detection, a subfield of machine learning that is mainly concerned with the detection of rare events, witnessed an immense improvement following the unprecedented growth  ...  Finally, we discuss a variety of new directions for improving the existing algorithms.  ...
arXiv:2205.05173v2 fatcat:es7dkinhvrf7bepowfbbnj4hz4

A literature review on one-class classification and its potential applications in big data

Naeem Seliya, Azadeh Abdollah Zadeh, Taghi M. Khoshgoftaar
2021 Journal of Big Data  
Commonly used techniques in OCC for outlier detection and for novelty detection, respectively, are discussed.  ...  , noisy data, feature selection, and data reduction.  ...
doi:10.1186/s40537-021-00514-x fatcat:iaqfshjii5butmn64yrecd5yxq

Semisupervised Multitask Learning

Qiuhua Liu, Xuejun Liao, Hui Li, J.R. Stack, L. Carin
2009 IEEE Transactions on Pattern Analysis and Machine Intelligence  
In addition, when performing many classification tasks one has simultaneous access to all unlabeled data that must be classified, and therefore there is an opportunity to place the classification of any one feature vector within the context of all unlabeled feature vectors; this is referred to as semi-supervised learning.  ...  For example, in airborne radar and electro-optic sensors, one may measure a large swath of terrain, and the unlabeled data $\{x_i\}_{i=n_L+1:n_L+n_U}$ may be defined simultaneously for this entire terrain  ...
doi:10.1109/tpami.2008.296 pmid:19372611 fatcat:6j5pfgkjmzbbdmwhsz4icyvjwq

Event Detection and Identification in Distribution Networks Based on Invertible Neural Networks and Pseudo Labels

Fan Yang, Zenan Ling, Yuhang Zhang, Xing He, Qian Ai, Robert C. Qiu
2022 Frontiers in Energy Research  
In this paper, a framework for event detection, localization, and classification is studied to extract event features from measurements in distribution networks.  ...  Finally, as the events in practical power grids are mostly recorded unlabeled, the pseudo-label (PL) based approach, which is superior in separating events under a low labeling rate, is used  ...  In contrast, semi-supervised approaches simultaneously utilize labeled and unlabeled data, and thus they can realize refined classification with only a limited number of labeled samples.  ...
doi:10.3389/fenrg.2022.858665 fatcat:pabzg5glpzezbikjm5wvq5vh2m
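
The pseudo-label (PL) approach referenced in the snippet can be illustrated with a generic self-training loop: a classifier fitted on the few labeled events labels the unlabeled measurements it is most confident about, and those are added to the training pool. The sketch below uses scikit-learn with an illustrative confidence threshold; it is not the paper's invertible-network pipeline.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def pseudo_label_self_training(X_lab, y_lab, X_unl, threshold=0.95, rounds=5):
        # Generic pseudo-labeling: repeatedly fold confidently classified
        # unlabeled samples back into the labeled pool and refit.
        X_l, y_l, X_u = X_lab.copy(), y_lab.copy(), X_unl.copy()
        clf = LogisticRegression(max_iter=1000).fit(X_l, y_l)
        for _ in range(rounds):
            if len(X_u) == 0:
                break
            proba = clf.predict_proba(X_u)
            confident = proba.max(axis=1) >= threshold
            if not confident.any():
                break
            X_l = np.vstack([X_l, X_u[confident]])
            y_l = np.concatenate([y_l, clf.classes_[proba[confident].argmax(axis=1)]])
            X_u = X_u[~confident]
            clf = LogisticRegression(max_iter=1000).fit(X_l, y_l)
        return clf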

Regularized maximum correntropy machine [article]

Jim Jing-Yan Wang, Yunji Wang, Bing-Yi Jing, Xin Gao
2015 arXiv   pre-print
In this paper we investigate the usage of the regularized correntropy framework for learning classifiers from noisy labels.  ...  The experiments on two challenging pattern classification tasks show that it significantly outperforms machines with traditional loss functions.  ...  However, in support vector classification, this regularization term is either obtained by a "maximal margin" regularization or obtained by a "maximal robustness" regularization for a certain type of feature  ...
arXiv:1501.04282v1 fatcat:tsxsmf5lmjbn7ci2hkyafnaxna
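
The correntropy-induced loss behind the machine described above is a Gaussian-weighted measure that saturates for large residuals, which is what gives robustness to wrongly labeled samples. Below is a minimal sketch of that loss with an illustrative kernel width; the paper's full method adds a regularization term and a half-quadratic optimization scheme that are not reproduced here.

    import numpy as np

    def correntropy_loss(residuals, sigma=1.0):
        # Correntropy-induced (Welsch) loss: bounded above by 1, so points
        # with grossly wrong (noisy) labels stop dominating the objective.
        return np.mean(1.0 - np.exp(-residuals ** 2 / (2.0 * sigma ** 2)))

    # a badly mislabeled point saturates instead of exploding
    print(correntropy_loss(np.array([6.0])))   # close to 1.0
    print(np.mean(np.array([6.0]) ** 2))       # squared loss keeps growing: 36.0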

A Unifying Review of Deep and Shallow Anomaly Detection [article]

Lukas Ruff, Jacob R. Kauffmann, Robert A. Vandermeulen, Grégoire Montavon, Wojciech Samek, Marius Kloft, Thomas G. Dietterich, Klaus-Robert Müller
2020 arXiv   pre-print
With the emergence of numerous such methods, including approaches based on generative models, one-class classification, and reconstruction, there is a growing need to bring methods of this field into a  ...  Finally, we outline critical open challenges and identify specific paths for future research in anomaly detection.  ...  Deep One-Class Classification: Selecting kernels and hand-crafting relevant features can be challenging and quickly become impractical for complex data.
arXiv:2009.11732v2 fatcat:4ppfpds3ivd3bk5xcdoxmzmlie
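
The "Deep One-Class Classification" fragment in the snippet refers to learning the feature map and the one-class model jointly rather than hand-crafting kernels and features; the best-known instance is Deep SVDD, which pulls network embeddings toward a fixed center. The PyTorch sketch below compresses that idea into a few lines; network size, center initialization, and hyperparameters are all illustrative.

    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        # Small bias-free encoder (Deep SVDD drops bias terms so the
        # trivial constant mapping is not a valid solution).
        def __init__(self, d_in, d_rep=8):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(d_in, 32, bias=False), nn.ReLU(),
                nn.Linear(32, d_rep, bias=False),
            )

        def forward(self, x):
            return self.net(x)

    def train_deep_svdd(X, epochs=100, lr=1e-3):
        enc = Encoder(X.shape[1])
        with torch.no_grad():
            c = enc(X).mean(dim=0)                 # fix the center from an initial pass
        opt = torch.optim.Adam(enc.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = ((enc(X) - c) ** 2).sum(dim=1).mean()   # mean squared distance to c
            loss.backward()
            opt.step()
        return enc, c

    X = torch.randn(256, 16)                       # toy "normal" data
    enc, c = train_deep_svdd(X)
    scores = ((enc(X) - c) ** 2).sum(dim=1)        # larger distance = more anomalous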

Deep Weakly-supervised Anomaly Detection [article]

Guansong Pang, Chunhua Shen, Huidong Jin, Anton van den Hengel
2020 arXiv   pre-print
Learning with the small labeled anomaly data enables anomaly-informed modeling, which helps identify anomalies of interest and address the notorious high false positives in unsupervised anomaly detection  ...  that a very small number (e.g., a few dozen) of labeled anomalies can often be made available with small/trivial cost in many real-world anomaly detection applications.  ...  , in which a large scalar value is assigned to instance pairs with two labeled anomalies, an intermediate value to pairs with one labeled anomaly and one unlabeled instance, and a small value to the  ...
arXiv:1910.13601v3 fatcat:uwvyxx6l7vckdjrsjnnvjfvy54
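
The last fragment above describes training on instance pairs whose regression target encodes how many labeled anomalies the pair contains. A small sketch of that pair-construction step is given below; the target values, sampling scheme, and function name are illustrative placeholders, and the scoring network that the paper trains on such pairs is not shown.

    import numpy as np

    def make_pairs(X_anom, X_unl, n_pairs=1000, targets=(8.0, 4.0, 0.0), seed=0):
        # Anomaly-anomaly pairs get a large target, anomaly-unlabeled pairs an
        # intermediate one, and unlabeled-unlabeled pairs a small one.
        rng = np.random.default_rng(seed)
        pairs, ys = [], []
        for _ in range(n_pairs):
            kind = rng.integers(3)
            if kind == 0:                                  # two labeled anomalies
                a, b = X_anom[rng.integers(len(X_anom), size=2)]
                y = targets[0]
            elif kind == 1:                                # one anomaly, one unlabeled
                a = X_anom[rng.integers(len(X_anom))]
                b = X_unl[rng.integers(len(X_unl))]
                y = targets[1]
            else:                                          # two unlabeled instances
                a, b = X_unl[rng.integers(len(X_unl), size=2)]
                y = targets[2]
            pairs.append(np.concatenate([a, b]))
            ys.append(y)
        return np.array(pairs), np.array(ys)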

Regularized maximum correntropy machine

Jim Jing-Yan Wang, Yunji Wang, Bing-Yi Jing, Xin Gao
2015 Neurocomputing  
In this paper we investigate the usage of the regularized correntropy framework for learning classifiers from noisy labels.  ...  The experiments on two challenging pattern classification tasks show that it significantly outperforms machines with traditional loss functions.  ...  However, in support vector classification, this regularization term is either obtained by a "maximal margin" regularization or obtained by a "maximal robustness" regularization for a certain type of feature  ...
doi:10.1016/j.neucom.2014.09.080 fatcat:c25l5dutjbajvkgqsrhuwvxqia
Showing results 1 — 15 out of 370 results