
Adaptive Hypergraph Learning for Unsupervised Feature Selection

Xiaofeng Zhu, Yonghua Zhu, Shichao Zhang, Rongyao Hu, Wei He
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
To address these issues, we propose a new UFS method to jointly learn the similarity matrix and conduct both subspace learning (via learning a dynamic hypergraph) and feature selection (via a sparsity constraint)  ...  Current unsupervised feature selection (UFS) methods learn the similarity matrix by using a simple graph which is learned from the original data and is independent of the process of feature selection  ...  Thus the variable S is used to simultaneously select the informative features (via the sparsity constraint) and conduct subspace learning (i.e., preserving the local structures via the first term of Eq  ... 
doi:10.24963/ijcai.2017/501 dblp:conf/ijcai/ZhuZZHH17 fatcat:xha7r2f3erbapairz3qyllgyki
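
A common building block behind formulations like the one sketched above is an l2,1-type row-sparsity penalty on a transformation matrix, combined with a graph term that preserves local structure. The following is a rough, hypothetical illustration of that generic recipe (spectral embedding of a kNN graph plus sparse regression), not the paper's joint hypergraph objective; all function names and parameters here are placeholders:

```python
# Sketch: rank features by the row norms of W from
#   min_W ||X W - Y||_F^2 + lam * ||W||_{2,1},
# where Y is a spectral embedding of a kNN similarity graph.
import numpy as np
from scipy.linalg import eigh

def knn_graph(X, k=5, sigma=1.0):
    # symmetric kNN affinity with a Gaussian kernel
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(S, 0)
    idx = np.argsort(-S, axis=1)[:, :k]
    A = np.zeros_like(S)
    rows = np.arange(X.shape[0])[:, None]
    A[rows, idx] = S[rows, idx]
    return np.maximum(A, A.T)

def spectral_embedding(A, dim=2):
    L = np.diag(A.sum(1)) - A                 # unnormalized graph Laplacian
    vals, vecs = eigh(L)
    return vecs[:, 1:dim + 1]                 # skip the trivial eigenvector

def l21_feature_scores(X, Y, lam=0.1, iters=50):
    # iteratively reweighted least squares for the l2,1 penalty
    d = X.shape[1]
    D = np.eye(d)
    for _ in range(iters):
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1) + 1e-8
        D = np.diag(1.0 / (2 * row_norms))
    return np.linalg.norm(W, axis=1)          # larger = more informative feature

X = np.random.RandomState(0).randn(60, 10)
scores = l21_feature_scores(X, spectral_embedding(knn_graph(X)))
print(np.argsort(-scores)[:3])                # indices of the top-3 features
```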

Joint Clustering and Feature Selection [chapter]

Liang Du, Yi-Dong Shen
2013 Lecture Notes in Computer Science  
Inspired by recent developments in discriminative clustering, we propose in this paper a novel unsupervised feature selection approach via Joint Clustering and Feature Selection (JCFS).  ...  Due to the absence of class labels, unsupervised feature selection is much more difficult than supervised feature selection.  ...  , joint Feature Selection and Subspace Learning (FSSL) [8] and Joint Embedding Learning and Sparse Regression (JELSR) [9].  ... 
doi:10.1007/978-3-642-38562-9_25 fatcat:q7xmswrd6rexjh7t2rfuilp3kq

Joint Multi-view Unsupervised Feature Selection and Graph Learning [article]

Si-Guo Fang, Dong Huang, Chang-Dong Wang, Yong Tang
2022 arXiv   pre-print
In light of this, this paper presents a joint multi-view unsupervised feature selection and graph learning (JMVFG) approach.  ...  First, they generally utilize either cluster structure or similarity structure to guide the feature selection, neglecting the possibility of a joint formulation with mutual benefits.  ...  In this paper, our focus lies at the intersection of multi-view unsupervised feature selection and multi-view clustering (especially via graph learning).  ... 
arXiv:2204.08247v1 fatcat:uegepff2ebcivnw36esrrz526q

Cluster Density Properties Define a Graph for Effective Pattern Feature Selection

Khadidja Henni, Neila Mezghani, Amar Mitiche
2020 IEEE Access  
The purpose of this study is to develop and investigate a new unsupervised feature selection method which uses the k-influence space concept and subspace learning to map features onto a weighted graph  ...  feature selection algorithms.  ...  Via Manifold Regularization (FS-Manifold) [14] and (iii) unsupervised methods, such as Unsupervised Graph-based Feature Selection (UGFS) and Unsupervised Discriminative Feature Selection (UDFS) [15  ... 
doi:10.1109/access.2020.2981265 fatcat:vmgfcfexxbavjny3ocsrbpzvrm

Deep Clustering via Joint Convolutional Autoencoder Embedding and Relative Entropy Minimization [article]

Kamran Ghasedi Dizaji, Amirhossein Herandi, Cheng Deng, Weidong Cai, Heng Huang
2017 arXiv   pre-print
In this paper, we propose a new clustering model, called DEeP Embedded RegularIzed ClusTering (DEPICT), which efficiently maps data into a discriminative embedding subspace and precisely predicts cluster  ...  In order to benefit from end-to-end optimization and eliminate the necessity for layer-wise pretraining, we introduce a joint learning framework to minimize the unified clustering and reconstruction loss  ...  assignment q_ik by clustering the embedding subspace features via simple algorithms like K-means or AC-PIC.  ... 
arXiv:1704.06327v3 fatcat:cz4wkdslizfyje3wkxhdabhy3e
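
The last fragment refers to initializing soft assignments q_ik by clustering the embedded features with K-means. A minimal sketch of that generic step (assuming a Student's-t kernel for the soft assignment, as in DEC-style models, which may differ from DEPICT's exact formulation):

```python
# Sketch: K-means on embedded features, then soft assignments q_ik
# proportional to (1 + ||z_i - mu_k||^2 / alpha)^(-(alpha + 1) / 2).
import numpy as np
from sklearn.cluster import KMeans

def soft_assignments(Z, centers, alpha=1.0):
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

Z = np.random.RandomState(0).randn(200, 16)          # stand-in for learned embeddings
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(Z)
Q = soft_assignments(Z, km.cluster_centers_)
print(Q.shape, Q.sum(axis=1)[:3])                    # (200, 5), each row sums to 1
```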

Unsupervised Feature Extraction for Reliable Hyperspectral Imagery Clustering via Dual Adaptive Graphs

Jinyong Chen, Qidi Wu, Kang Sun
2021 IEEE Access  
Without available label information in the clustering task, the clustering performance heavily depends on the reliability of the unsupervised features learned from the HSI.  ...  To address this problem, in this paper, a dual graph-based robust unsupervised feature extraction framework for HSI is proposed to realize reliable clustering.  ...  CONCLUSION In this paper, a robust unsupervised feature learning method via dual graphs is proposed for reliable clustering on HSIs.  ... 
doi:10.1109/access.2021.3071425 fatcat:hvxtbpth3bgpdb2y444edy77ea

Subspace Clustering for Action Recognition with Covariance Representations and Temporal Pruning [article]

Giancarlo Paoletti, Jacopo Cavazza, Cigdem Beyan, Alessio Del Bue
2020 arXiv   pre-print
To this end, we propose a novel subspace clustering method, which exploits the covariance matrix to enhance the action's discriminability and a timestamp pruning approach that allows us to better handle the  ...  Although state-of-the-art approaches designed for this application are all supervised, in this paper we pursue a more challenging direction: solving the problem with unsupervised learning.  ...  Temporal pruning via Sparse Subspace Clustering (temporalSSC) In addition to utilizing subspace clustering as an unsupervised learning method to perform action recognition, in this paper, we also exploit  ... 
arXiv:2006.11812v1 fatcat:d5pdt6dmlzehboxavtw4fmnz4e
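
The covariance representation mentioned here is, generically, a per-sequence SPD descriptor built from frame-wise features. A small illustrative sketch under assumed conventions (a frames-by-coordinates layout and a log-Euclidean vectorization), not necessarily the paper's exact pipeline:

```python
# Sketch: covariance descriptor of a skeleton sequence (T frames x D coords).
import numpy as np

def covariance_descriptor(seq, eps=1e-6):
    C = np.cov(seq, rowvar=False) + eps * np.eye(seq.shape[1])  # keep it SPD
    vals, vecs = np.linalg.eigh(C)
    L = vecs @ np.diag(np.log(vals)) @ vecs.T        # log-Euclidean map
    return L[np.triu_indices_from(L)]                # vectorized upper triangle

seq = np.random.RandomState(0).randn(120, 45)        # e.g. 120 frames, 15 joints x 3 coords
print(covariance_descriptor(seq).shape)              # (45 * 46 / 2,) = (1035,)
```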

Cross-Modal Learning via Pairwise Constraints [article]

Ran He, Man Zhang, Liang Wang, Ye Ji, Qiyue Yin
2014 arXiv   pre-print
For unsupervised learning, we propose a cross-modal subspace clustering method to learn a common structure for different modalities.  ...  gap between different modalities and improve the clustering/retrieval accuracy.  ...  LSR_M: The subspace clustering via least squares regression [16] is used to perform clustering on the concatenated features of all modalities.  ... 
arXiv:1411.7798v1 fatcat:pp77pnvwmvftnkrwql5gu4my34
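
The LSR_M baseline in this snippet refers to subspace clustering via least squares regression on concatenated features. A minimal sketch of that standard recipe (closed-form self-representation, symmetrized affinity, spectral clustering), using toy data and an assumed regularization weight:

```python
# Sketch: LSR subspace clustering. Columns of X are samples;
#   Z = argmin ||X - X Z||_F^2 + lam ||Z||_F^2  has a closed form.
import numpy as np
from sklearn.cluster import SpectralClustering

def lsr_affinity(X, lam=0.1):
    n = X.shape[1]
    Z = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ X)
    return 0.5 * (np.abs(Z) + np.abs(Z.T))           # symmetric affinity matrix

rng = np.random.RandomState(0)
# two toy subspaces: 40 samples each, living in different 2-D subspaces of R^10
B1, B2 = rng.randn(10, 2), rng.randn(10, 2)
X = np.hstack([B1 @ rng.randn(2, 40), B2 @ rng.randn(2, 40)])
labels = SpectralClustering(n_clusters=2, affinity='precomputed',
                            random_state=0).fit_predict(lsr_affinity(X))
print(labels)
```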

Self-Paced Cross-Modal Subspace Matching

Jian Liang, Zhihang Li, Dong Cao, Ran He, Jingdong Wang
2016 Proceedings of the 39th International ACM SIGIR conference on Research and Development in Information Retrieval - SIGIR '16  
Then, we formulate the unsupervised cross-modal matching problem as a non-convex joint feature learning and data grouping problem.  ...  This paper proposes a Self-Paced Cross-Modal Subspace Matching (SCSM) method for unsupervised multimodal data.  ...  A joint feature learning and data grouping formulation has been accordingly developed.  ... 
doi:10.1145/2911451.2911527 dblp:conf/sigir/LiangLCHW16 fatcat:ogq5jwnxtvhlbkbbzbm7rzzgoa

Graph Autoencoder-Based Unsupervised Feature Selection with Broad and Local Data Structure Preservation [article]

Siwei Feng, Marco F. Duarte
2018 arXiv   pre-print
To leverage a more sophisticated embedding, we propose an autoencoder-based unsupervised feature selection approach that leverages a single-layer autoencoder for a joint framework of feature selection  ...  These works first map data onto a low-dimensional subspace and then select features by posing a sparsity constraint on the transformation matrix.  ...  Subspace clustering guided unsupervised feature selection (SCUFS) [38] combines both self-representation and subspace clustering with unsupervised feature selection.  ... 
arXiv:1801.02251v2 fatcat:tgtkgizpnzf3xdvgibg3ep54xe
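
The single-layer-autoencoder idea in this snippet can be illustrated, very roughly, by training a small encoder/decoder for reconstruction and ranking input features by the column norms of the encoder weights. This sketch omits the paper's graph regularization and sparsity terms; dimensions and hyperparameters are arbitrary:

```python
# Sketch: single-layer autoencoder, then score feature j by the norm of the
# j-th column of the encoder weight (how strongly feature j feeds the code).
import torch

torch.manual_seed(0)
X = torch.randn(256, 20)                             # toy data: 256 samples, 20 features
enc = torch.nn.Linear(20, 8, bias=False)
dec = torch.nn.Linear(8, 20, bias=False)
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-2)

for _ in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(dec(torch.relu(enc(X))), X)
    loss.backward()
    opt.step()

scores = enc.weight.detach().norm(dim=0)             # shape (20,): one score per feature
print(scores.argsort(descending=True)[:5])           # indices of the top-5 features
```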

CRAFT: ClusteR-specific Assorted Feature selecTion [article]

Vikas K. Garg, Cynthia Rudin, Tommi Jaakkola
2015 arXiv   pre-print
We present a framework for clustering with cluster-specific feature selection.  ...  The framework, CRAFT, is derived from asymptotic log posterior formulations of nonparametric MAP-based clustering models.  ...  Cluster-specific unsupervised feature selection is even harder since separate, possibly overlapping, subspaces need to be inferred.  ... 
arXiv:1506.07609v1 fatcat:mkbjg3dcqvh5happctzzukik7q

Unsupervised Domain Adaptation via Structured Prediction Based Selective Pseudo-Labeling [article]

Qian Wang, Toby P. Breckon
2019 arXiv   pre-print
The idea of structured prediction is inspired by the fact that samples in the target domain are well clustered within the deep feature space so that unsupervised clustering analysis can be used to facilitate  ...  In this paper, we propose a novel selective pseudo-labeling strategy based on structured prediction.  ...  In this paper, we explore such structural information via unsupervised learning (i.e. K-means) and propose a novel UDA approach based on selective pseudo-labeling and structured prediction.  ... 
arXiv:1911.07982v1 fatcat:xnmx7nsu2zaavj5alnve3hsg3i
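
The selective pseudo-labeling idea (cluster the target features, then trust only confident assignments) can be sketched as follows; the selection rule used here, keeping the samples closest to their centroid, is an assumption rather than the paper's exact criterion:

```python
# Sketch: K-means pseudo-labels on target-domain features, keeping only the
# fraction of samples nearest to their assigned centroid.
import numpy as np
from sklearn.cluster import KMeans

def selective_pseudo_labels(feats, n_classes, keep_frac=0.5, seed=0):
    km = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit(feats)
    dist = np.linalg.norm(feats - km.cluster_centers_[km.labels_], axis=1)
    keep = dist <= np.quantile(dist, keep_frac)       # most confident samples only
    return km.labels_, keep

feats = np.random.RandomState(0).randn(300, 32)       # stand-in for deep target features
labels, keep = selective_pseudo_labels(feats, n_classes=10)
print(keep.sum(), "of", len(labels), "target samples pseudo-labeled")
```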

Discriminative codebook learning for Web image search

Xinmei Tian, Yijuan Lu
2013 Signal Processing  
The conventional codebook, generated via unsupervised clustering approaches, does not embed the labeling information of images and therefore has less discriminative ability.  ...  Although some research has been conducted to construct codebooks with the labeling information considered, very few attempts have been made to exploit manifold geometry of the local feature space to improve  ...  One is an unsupervised codebook generated via the K-means clustering algorithm (denoted KM) and the other is a supervised codebook generated via extremely randomized clustering forests [15] (denoted ERCF  ... 
doi:10.1016/j.sigpro.2012.04.018 fatcat:y5viyp4m55c2tgzs33lfto2xfi
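
The unsupervised "KM" codebook mentioned in this entry is the classic bag-of-visual-words construction: cluster local descriptors with K-means, then histogram the word assignments for each image. A toy sketch with random stand-ins for the local features:

```python
# Sketch: K-means codebook over local descriptors, then a normalized
# bag-of-visual-words histogram per image.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.RandomState(0)
descriptors = rng.randn(5000, 64)                     # pooled local features from a corpus
codebook = KMeans(n_clusters=100, n_init=4, random_state=0).fit(descriptors)

def bow_histogram(image_descriptors, codebook):
    words = codebook.predict(image_descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()                          # L1-normalized bag of words

print(bow_histogram(rng.randn(300, 64), codebook)[:10])
```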

Noise-Resistant Unsupervised Feature Selection via Multi-perspective Correlations

Hao Huang, Shinjae Yoo, Dantong Yu, Hong Qin
2014 2014 IEEE International Conference on Data Mining  
Unsupervised feature selection is an important issue for high-dimensional dataset analysis. However, popular methods are susceptible to noisy instances (observations) or noisy features.  ...  Our proposed approach, called Noise-Resistant Unsupervised Feature Selection (NRFS), is based on multi-perspective correlation that reflects the importance of a feature with respect to noise-resistant representative  ...  Among the four popular feature selection algorithms, NDFS has the most noticeable improvement after filtering out the noisy observations, since it performs a joint and iterative learning between cluster  ... 
doi:10.1109/icdm.2014.88 dblp:conf/icdm/HuangYYQ14a fatcat:hkx7wtr26ja3bbmve6quqheu7q

Approaches to working in high-dimensional data spaces: gene expression microarrays

Y Wang, D J Miller, R Clarke
2008 British Journal of Cancer  
We identify the unique challenges posed by high dimensionality to highlight methodological problems and discuss recent methods in predictive classification, unsupervised subclass discovery, and marker  ...  In unsupervised clustering in high dimensions, feature selection is likewise essential for discerning the underlying grouping tendency that may be 'buried' in a much lower-dimensional subspace, with many  ...  Rather than clustering samples using all genes, a practical alternative is to embed gene selection within unsupervised clustering: removal of noisy features improves clustering accuracy, which, in turn  ... 
doi:10.1038/sj.bjc.6604207 pmid:18283324 pmcid:PMC2275474 fatcat:fjgeokq5rjbyjh6rtlikkbmb4e
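
The "remove noisy features before clustering" point can be illustrated with the simplest possible filter, keeping only the most variable genes prior to clustering; the article itself discusses more principled embedded approaches, so this is only a toy baseline:

```python
# Sketch: variance-based gene filtering before clustering samples.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.RandomState(0)
expr = rng.randn(100, 5000)                           # 100 samples x 5,000 genes
top = np.argsort(expr.var(axis=0))[::-1][:200]        # keep the 200 most variable genes
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(expr[:, top])
print(np.bincount(labels))                            # cluster sizes
```
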
Showing results 1 — 15 out of 3,702