1,889 Hits in 5.2 sec

Theoretical developments for interpreting kernel spectral clustering from alternative viewpoints

Diego Peluffo-Ordóñez, Paul Rosero-Montalvo, Ana Umaquinga-Criollo, Luis Suárez-Zambrano, Hernan Domínguez-Limaico, Omar Oña-Rocha, Stefany Flores-Armas, Edgar Maya-Olalla
2017 Advances in Science, Technology and Engineering Systems  
In this work, we explore the relationship between kernel spectral clustering (KSC) and other well-known approaches, namely normalized cut clustering and kernel k-means.  ...  To perform an exploration process over complex structured data within unsupervised settings, the so-called kernel spectral clustering (KSC) is one of the most recommended and appealing  ...  Another study [25] explores the links of KSC with spectral dimensionality reduction from a kernel viewpoint.  ... 
doi:10.25046/aj0203208 fatcat:6usp5h23zrekpgxrvg5zuntt5u
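The normalized-cut connection highlighted in this entry can be illustrated with a minimal spectral clustering routine. This is a generic sketch, not the paper's KSC formulation: the block affinity matrix, the farthest-point k-means initialization, and all parameter values are illustrative choices.

```python
import numpy as np

def spectral_clustering(W, k, n_iter=50):
    """Minimal normalized spectral clustering: embed with the k smallest
    eigenvectors of the symmetric normalized Laplacian (a relaxation of
    normalized cut), then run k-means on the row-normalized embedding."""
    d = W.sum(axis=1)
    d_is = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(len(W)) - d_is[:, None] * W * d_is[None, :]
    _, vecs = np.linalg.eigh(L)              # eigenvalues ascending
    U = vecs[:, :k]
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    centers = [U[0]]                         # greedy farthest-point init
    for _ in range(k - 1):
        dists = np.min([((U - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(U[int(np.argmax(dists))])
    centers = np.array(centers)
    for _ in range(n_iter):                  # plain Lloyd iterations
        labels = np.argmin(((U[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels

# affinity matrix with two obvious blocks: {0,1,2} and {3,4,5}
W = np.zeros((6, 6))
W[:3, :3] = 1.0
W[3:, 3:] = 1.0
np.fill_diagonal(W, 0.0)
labels = spectral_clustering(W, 2)
```

The kernel k-means link the paper studies amounts to running the same procedure with `W` built from a kernel matrix.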

A Unified Semi-Supervised Dimensionality Reduction Framework for Manifold Learning [article]

Ratthachat Chatpatanasiri, Boonserm Kijsirikul
2009 arXiv   pre-print
We present a general framework of semi-supervised dimensionality reduction for manifold learning which naturally generalizes existing supervised and unsupervised learning frameworks that apply the spectral  ...  Furthermore, a new semi-supervised kernelization framework called the "KPCA trick" is proposed to handle non-linear problems.  ...  In this paper, we present a general semi-supervised dimensionality reduction framework which is able to employ information from both labeled and unlabeled examples.  ... 
arXiv:0804.0924v2 fatcat:jehljybwvfhf7obkufhpgqdg7y
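The generic tool behind the "KPCA trick" mentioned in this entry is plain kernel PCA. A minimal sketch follows; the paper's semi-supervised framework itself is not reproduced, and the RBF kernel, `gamma` value, and toy data are assumptions for illustration.

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Plain kernel PCA: RBF kernel, double-centered in feature space,
    training projections = top eigenvectors scaled by sqrt(eigenvalue)."""
    sq = ((X[:, None] - X[None]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# two tight groups of duplicated points: one component separates them
X = np.vstack([np.zeros((5, 2)), 3.0 * np.ones((5, 2))])
Z = kernel_pca(X, n_components=1, gamma=0.5)
```

The "trick" in the paper is that any linear dimensionality reduction method applied to such kernel-PCA coordinates becomes a nonlinear method for free.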

Spectral Clustering with Neighborhood Attribute Reduction Based on Information Entropy

Hongjie Jia, Shifei Ding, Heng Ma, Wanqiu Xing
2014 Journal of Computers  
Then we introduce this attribute reduction method to improve spectral clustering and propose the NRSR-SC algorithm.  ...  Its clustering accuracy is higher, and it is strongly robust to noise in high-dimensional data.  ...  In order to effectively deal with high-dimensional data, we combine attribute reduction with clustering analysis and present a novel spectral clustering algorithm based on neighborhood rough sets reduction  ... 
doi:10.4304/jcp.9.6.1316-1324 fatcat:e6bm3mpsqbhqvcpck2ygniupme

Semi-supervised Distance Metric Learning in High-Dimensional Spaces by Using Equivalence Constraints [chapter]

Hakan Cevikalp
2010 Communications in Computer and Information Science  
Experimental results on high-dimensional visual object classification problems show that the computed distance metric improves the performance of the subsequent classification and clustering algorithms  ...  The proposed method works in both the input and kernel-induced feature space, and the distance metric is found by a gradient descent procedure that involves an eigen-decomposition in each step.  ... 
doi:10.1007/978-3-642-11840-1_18 fatcat:cav7rn37ungcrjo7jtqcz4eaxa

Graph Convolutional Subspace Clustering: A Robust Subspace Clustering Framework for Hyperspectral Image [article]

Yaoming Cai, Zijia Zhang, Zhihua Cai, Xiaobo Liu, Xinwei Jiang, Qin Yan
2020 arXiv   pre-print
Subspace clustering has been proven to be powerful for exploiting the intrinsic relationship between data points.  ...  Based on this framework, we further propose two novel subspace clustering models using the Frobenius norm, namely Efficient GCSC (EGCSC) and Efficient Kernel GCSC (EKGCSC).  ...  ACKNOWLEDGMENT The authors would like to thank the anonymous reviewers for their constructive suggestions and criticisms. We would also like to thank Prof.  ... 
arXiv:2004.10476v1 fatcat:wnkylwbj2fgr3k3tvhyewiwnem
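The Frobenius-norm model family this entry builds on has a convenient closed form. The sketch below shows only the generic least-squares self-expressive step; the graph convolution that distinguishes GCSC is omitted, and the `lam` value and toy data are illustrative.

```python
import numpy as np

def self_expressive_coeffs(X, lam=0.1):
    """Least-squares self-expressive coding:
        min_C ||X - X C||_F^2 + lam ||C||_F^2
    with closed form C = (X^T X + lam I)^{-1} X^T X.
    X has shape (features, samples); C[i, j] says how much sample i
    helps reconstruct sample j."""
    G = X.T @ X
    n = G.shape[0]
    return np.linalg.solve(G + lam * np.eye(n), G)

# two orthogonal 1-D subspaces: points on the x-axis and on the y-axis
X = np.array([[1.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 3.0]])
C = self_expressive_coeffs(X, lam=0.01)
A = np.abs(C) + np.abs(C.T)   # symmetric affinity for spectral clustering
```

Points from the same subspace reconstruct each other, so `A` is block-diagonal and spectral clustering on it recovers the subspaces.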

p-Spectral Clustering Based on Neighborhood Attribute Granulation [chapter]

Shifei Ding, Hongjie Jia, Mingjing Du, Qiankun Hu
2016 IFIP Advances in Information and Communication Technology  
Data clustering aims to find the intrinsic links between objects and to describe the internal structures of data sets. p-Spectral clustering is based on the Cheeger cut criterion.  ...  It has good performance on many challenging data sets. But the original p-spectral clustering algorithm is not suitable for high-dimensional data.  ...  Conclusions: To improve the performance of p-spectral clustering on high-dimensional data, we modify the attribute reduction method based on neighborhood rough sets.  ... 
doi:10.1007/978-3-319-48390-0_6 fatcat:y4r4iwyih5fflod5ky6zch7dee

Iterative Discovery of Multiple Alternative Clustering Views

Donglin Niu, Jennifer G. Dy, and Michael I. Jordan
2014 IEEE Transactions on Pattern Analysis and Machine Intelligence  
We present a range of experiments that compare our approach to alternatives and explore the connections between simultaneous and iterative modes of discovery of multiple clusterings.  ...  The algorithm is based on an optimization procedure that incorporates terms for cluster quality and novelty relative to previously discovered clustering solutions.  ...  Spectral clustering can be presented from different points of view [25] ; here, we focus on the graph partitioning viewpoint.  ... 
doi:10.1109/tpami.2013.180 pmid:26353307 fatcat:pmkrmsa5cjep7h55ebrmzyg2yq

Graph signal processing for machine learning: A review and new perspectives [article]

Xiaowen Dong, Dorina Thanou, Laura Toni, Michael Bronstein, Pascal Frossard
2020 arXiv   pre-print
Furthermore, we provide new perspectives on future development of GSP techniques that may serve as a bridge between applied mathematics and signal processing on one side, and machine learning and network science on the other.  ...  Graph-based clustering and dimensionality reduction: Unsupervised learning is another major paradigm in machine learning, where clustering and dimensionality reduction are two main problems of interest.  ... 
arXiv:2007.16061v1 fatcat:76jhe3mhlnfkrkyjcyibmkth24
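The core GSP construction this review surveys is the graph Fourier transform: expanding a signal in the eigenbasis of the graph Laplacian, where small eigenvalues play the role of low frequencies. A minimal sketch on an unweighted path graph (an illustrative choice):

```python
import numpy as np

# Graph Fourier transform on a path graph: eigenvectors of the
# combinatorial Laplacian L = D - W form the frequency basis; smooth
# signals concentrate their energy in the low (small-eigenvalue) modes.
n = 8
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0   # path edges
L = np.diag(W.sum(axis=1)) - W
vals, vecs = np.linalg.eigh(L)        # vals ascending: graph frequencies

x_smooth = np.ones(n)                 # the smoothest possible signal
x_hat = vecs.T @ x_smooth             # graph Fourier coefficients
```

A constant signal has zero Laplacian quadratic form, so all of its spectral energy sits in the single zero-frequency coefficient.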

Robust non-linear dimensionality reduction using successive 1-dimensional Laplacian Eigenmaps

Samuel Gerber, Tolga Tasdizen, Ross Whitaker
2007 Proceedings of the 24th international conference on Machine learning - ICML '07  
Recent results in the literature show that spectral decomposition, as used for example by the Laplacian Eigenmaps algorithm, provides a powerful tool for non-linear dimensionality reduction and manifold  ...  Non-linear dimensionality reduction of noisy data is a challenging problem encountered in a variety of data analysis applications.  ...  We also thank Xavier Tricoche for the helpful discussions and the anonymous reviewers for many helpful comments.  ... 
doi:10.1145/1273496.1273532 dblp:conf/icml/GerberTW07 fatcat:zukwxpt5sbedpemryposbdq5ha
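For reference, the baseline this paper makes robust can be sketched in a few lines. This is plain Laplacian Eigenmaps, not the successive 1-D variant proposed there; the unweighted symmetrized kNN graph and the toy data are illustrative choices.

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=2, dim=1):
    """Plain Laplacian Eigenmaps: build a symmetrized kNN graph, then embed
    with the smallest nontrivial eigenvectors of L = D - W."""
    n = len(X)
    sq = ((X[:, None] - X[None]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(sq[i])[1:n_neighbors + 1]:
            W[i, j] = W[j, i] = 1.0       # unweighted, symmetrized edges
    L = np.diag(W.sum(axis=1)) - W
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]             # skip the constant eigenvector

# points along a noisy line; the 1-D embedding separates the two ends
X = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, -0.1], [3.0, 0.0], [4.0, 0.1]])
Y = laplacian_eigenmaps(X, n_neighbors=2, dim=1)[:, 0]
```

The paper's point is precisely that this spectral step degrades under noise, motivating the successive 1-D formulation.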

Clustering on multi-layer graphs via subspace analysis on Grassmann manifolds

Xiaowen Dong, Pascal Frossard, Pierre Vandergheynst, Nikolai Nefedov
2013 2013 IEEE Global Conference on Signal and Information Processing  
The resulting combination can then be viewed as a low dimensional representation of the original data which preserves the most important information from diverse relationships between entities.  ...  This information can naturally be modeled by a set of weighted and undirected graphs that form a global multi-layer graph, where the common vertex set represents the entities and the edges on different  ...  The proposed method is a dimensionality reduction algorithm for the original data; it leads to a summarization of the information contained in the multiple graph layers, which reveals the intrinsic relationships  ... 
doi:10.1109/globalsip.2013.6737060 dblp:conf/globalsip/DongFVN13 fatcat:crfgy54xazfjndsybg6psymdua

Clustering on Multi-Layer Graphs via Subspace Analysis on Grassmann Manifolds

Xiaowen Dong, Pascal Frossard, Pierre Vandergheynst, Nikolai Nefedov
2014 IEEE Transactions on Signal Processing  
The resulting combination can then be viewed as a low dimensional representation of the original data which preserves the most important information from diverse relationships between entities.  ...  This information can naturally be modeled by a set of weighted and undirected graphs that form a global multilayer graph, where the common vertex set represents the entities and the edges on different  ...  The proposed method is a dimensionality reduction algorithm for the original data; it leads to a summarization of the information contained in the multiple graph layers, which reveals the intrinsic relationships  ... 
doi:10.1109/tsp.2013.2295553 fatcat:4ugu5ogjxve53iesfgjtvt2kc4
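A naive baseline for the multi-layer combination studied in these two entries is to sum the layer Laplacians and embed jointly. This sketch is only that baseline, not the Grassmann-manifold merging the papers propose; the two-community toy layers are made up.

```python
import numpy as np

def multilayer_spectral_embedding(layers, dim):
    """Sum the combinatorial Laplacians of all graph layers and use the
    smallest nontrivial eigenvectors as a joint low-dimensional embedding."""
    n = layers[0].shape[0]
    L = np.zeros((n, n))
    for W in layers:
        L += np.diag(W.sum(axis=1)) - W
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]

# two layers that agree on the communities {0,1,2} and {3,4,5}
def block_graph(strong, weak, n=6):
    W = np.full((n, n), weak)
    W[:3, :3] = strong
    W[3:, 3:] = strong
    np.fill_diagonal(W, 0.0)
    return W

layers = [block_graph(1.0, 0.05), block_graph(0.8, 0.1)]
Y = multilayer_spectral_embedding(layers, dim=1)[:, 0]
```

When the layers agree, even this crude sum recovers the shared communities; the subspace analysis on Grassmann manifolds is designed for the harder case where layers carry complementary information.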

Scale-Dependent Signal Identification in Low-Dimensional Subspace: Motor Imagery Task Classification

Qingshan She, Haitao Gan, Yuliang Ma, Zhizeng Luo, Tom Potter, Yingchun Zhang
2016 Neural Plasticity  
Our method trains a Gaussian mixture model (GMM) of the composite data, which comprises the IMFs from both the original signal and noise, by employing kernel spectral regression to reduce the dimension  ...  The informative IMFs are then discriminated using a GMM clustering algorithm; the common spatial pattern (CSP) approach is exploited to extract the task-related features from the reconstructed signals,  ...  The authors would like to thank the providers of the BCI Competition IV Dataset I and BCI Competition III Dataset IVa, which were used to test the algorithms proposed in this study.  ... 
doi:10.1155/2016/7431012 pmid:27891256 pmcid:PMC5112353 fatcat:hmqd5sbg2zhoxgjgxtxblf5f2y

Using Kernel Methods in a Learning Machine Approach for Multispectral Data Classification. An Application in Agriculture [chapter]

Adrian Gonzalez, Jose Moreno, Graham Russell, Astrid Marquez
2009 Geoscience and Remote Sensing  
In particular, we focus on the practical applicability of learning machine methods to the task of inducing a relationship between the spectral response of farm land cover and its informational typology  ...  This chapter will cover the following phases: a) learning from instances in agriculture; b) feature extraction of both multispectral and attributive data; and c) kernel supervised classification.  ...  relationship could be induced between a pattern Φ(x) and a label y.  ... 
doi:10.5772/8307 fatcat:l3pb5jb4y5fsznupiealzwa72a
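A kernel supervised classifier of the kind this chapter surveys can be sketched with kernel ridge regression on the sign of the labels. The two-band "spectral response" values below are invented toy data, and the RBF kernel and parameter values are assumptions, not the chapter's setup.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    sq = ((A[:, None] - B[None]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-3):
    """Kernel ridge classifier: alpha = (K + lam I)^{-1} y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(Xtr, alpha, Xte, gamma=1.0):
    """Predicted class = sign of the kernel expansion at the test points."""
    return np.sign(rbf_kernel(Xte, Xtr, gamma) @ alpha)

# toy two-band responses for two land-cover classes
X = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.15],
              [0.80, 0.90], [0.90, 0.80], [0.85, 0.85]])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
alpha = kernel_ridge_fit(X, y, gamma=2.0)
pred = kernel_ridge_predict(X, alpha,
                            np.array([[0.12, 0.18], [0.88, 0.82]]), gamma=2.0)
```

The kernel plays the role of the induced feature map Φ(x) mentioned in the snippet: the classifier never forms Φ explicitly, only inner products.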

A Multigraph Representation for Improved Unsupervised/Semi-supervised Learning of Human Actions

Simon Jones, Ling Shao
2014 2014 IEEE Conference on Computer Vision and Pattern Recognition  
Finally, a spectral embedding is calculated on each graph, and the embeddings are scaled/aggregated into a single representation.  ...  We propose that multimedia such as images or videos consist of multiple separate components, and therefore more than one graph is required to fully capture the relationship between them.  ...  applied to information retrieval tasks with great success [3] ; and Laplacian Eigenmaps (LE) [1] , which are applied to dimensionality reduction.  ... 
doi:10.1109/cvpr.2014.110 dblp:conf/cvpr/JonesS14a fatcat:6qj5wkbpcbcrzpb2npf2dppday

Efficient Semidefinite Spectral Clustering via Lagrange Duality [article]

Yan Yan, Chunhua Shen, Hanzi Wang
2014 arXiv   pre-print
Experimental results on both UCI data sets and real-world image data sets demonstrate that 1) compared with the state-of-the-art spectral clustering methods, the proposed algorithm achieves better clustering  ...  We propose an efficient approach to semidefinite spectral clustering (SSC), which addresses the Frobenius normalization with the positive semidefinite (p.s.d.) constraint for spectral clustering.  ...  To effectively perform spectral clustering, the dimensionality reduction technique is used for preprocessing.  ... 
arXiv:1402.5497v1 fatcat:aum3ieoi7rbvbkq3rbowpwz6xq
Showing results 1 — 15 out of 1,889 results