Kernel k-means
2004
Proceedings of the 2004 ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '04
We show the generality of the weighted kernel k-means objective function, and derive the spectral clustering objective of normalized cut as a special case. ...
Kernel k-means and spectral clustering have both been used to identify clusters that are non-linearly separable in input space. ...
THE SPECTRAL CONNECTION At first glance, weighted kernel k-means and normalized cuts using spectral clustering appear to be quite different. ...
doi:10.1145/1014052.1014118
dblp:conf/kdd/DhillonGK04
fatcat:fr4t6zydqzhoxbavejilprg36u
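The kernel k-means procedure surveyed in these entries can be sketched minimally as follows. This is an illustrative implementation, not the paper's weighted variant; the RBF kernel, its bandwidth, the two-ring data set, and all parameter values are assumptions for the example:

```python
import numpy as np

def kernel_kmeans(K, k, n_iter=100, seed=0):
    """Lloyd-style kernel k-means on a precomputed n x n kernel matrix K.

    Distance of point i to the mean of cluster c in feature space:
    K_ii - 2 * mean_{j in c} K_ij + mean_{j,l in c} K_jl
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=n)  # random initial assignment
    for _ in range(n_iter):
        dist = np.empty((n, k))
        for c in range(k):
            idx = np.flatnonzero(labels == c)
            if idx.size == 0:
                dist[:, c] = np.inf  # empty cluster: never the argmin
                continue
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Two concentric rings: linearly inseparable in input space,
# but an RBF kernel makes them separable in feature space.
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.r_[np.ones(100), 3 * np.ones(100)]
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.standard_normal((200, 2))
K = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1) / 0.5)
labels = kernel_kmeans(K, 2)
```

Ordinary k-means is recovered when K is the linear kernel X @ X.T; the weighted objective of Dhillon et al. additionally carries a per-point weight into each mean.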
Soft Geodesic Kernel K-Means
2007
2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07
In this paper we present a kernel method for data clustering, where the soft k-means is carried out in a feature space, instead of input data space, leading to soft kernel k-means. ...
In addition, soft responsibilities as well as geodesic kernel, improve the clustering performance, compared to kernel k-means. ...
It was shown in [5] that kernel k-means is closely related to spectral clustering and normalized cut. ...
doi:10.1109/icassp.2007.366264
dblp:conf/icassp/KimSC07
fatcat:cu6zqb45wjegfjsc6jsbkpsy4m
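The soft assignment step this entry describes replaces hard cluster labels with responsibilities. A minimal sketch of that step, where the Gibbs/softmax weighting and the temperature beta are illustrative assumptions rather than the paper's exact formulation:

```python
import numpy as np

def soft_assign(dist, beta=5.0):
    """Turn squared feature-space distances (n x k) into soft
    responsibilities via a softmax over negative distances."""
    logits = -beta * dist
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

# Two points, two clusters: each point is much closer to one cluster.
dist = np.array([[0.1, 2.0],
                 [1.5, 0.2]])
R = soft_assign(dist)
```

Each row of R sums to one; as beta grows the responsibilities approach the hard assignments of ordinary kernel k-means.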
Hypergraph Modeling via Spectral Embedding Connection: Hypergraph Cut, Weighted Kernel k-means, and Heat Kernel
[article]
2022
arXiv
pre-print
We show a theoretical connection between our formulation and hypergraph cut in two ways, generalizing both weighted kernel k-means and the heat kernel, by which we justify our formulation. ...
For graph cut based spectral clustering, it is common to model real-valued data into graph by modeling pairwise similarities using kernel function. ...
As pointed out in (Hein et al. 2013), the clustering result of normalized cut tends to be biased towards large weights. This could reinforce existing negative biases in the clustering result. ...
arXiv:2203.09888v1
fatcat:duwoel2jknd3jcvnid6ii6sovq
Kernel matrix trimming for improved Kernel K-means clustering
2015
2015 IEEE International Conference on Image Processing (ICIP)
The Kernel k-Means algorithm for clustering extends the classic k-Means clustering algorithm. ...
This matrix is then referenced to calculate the distance between an element and a cluster center, as per classic k-Means. ...
Interestingly, it has been proven that Kernel k-Means, Spectral Clustering and Normalized Graph Cuts are, under the right conditions, mathematically equivalent tasks [5]. ...
doi:10.1109/icip.2015.7351209
dblp:conf/icip/TsapanosTNP15
fatcat:43q3f6ipcjc5hj4pvmktf36dmq
Multiple Kernel k-Means Clustering by Selecting Representative Kernels
[article]
2018
arXiv
pre-print
However, the performance of kernel k-means clustering largely depends on the choice of kernel function. ...
the framework of multiple k-means clustering. ...
kernel k-means method. ...
arXiv:1811.00264v1
fatcat:kajgxnnhyzhmtpibsogp2kb6qm
Secrets of GrabCut and Kernel K-Means
2015
2015 IEEE International Conference on Computer Vision (ICCV)
Unlike histogram or GMM fitting [39, 28], our approach is closely related to average association and normalized cut. ...
We propose an alternative approach to color clustering using kernel K-means energy with wellknown properties such as non-linear separation and scalability to higher-dimensional feature spaces. ...
[11, 17] first observed the equivalence between kernel k-means and popular spectral clustering criteria. ...
doi:10.1109/iccv.2015.182
dblp:conf/iccv/TangAMB15
fatcat:vj4ne2rmhjg6xpzaw3uwuo2gte
Principal Direction Divisive Partitioning with Kernels and k-Means Steering
[chapter]
2008
Survey of Text Mining II
It is also shown that KPDDP can provide results of superior quality compared to kernel k-means. ...
We propose, implement and evaluate several schemes that combine partitioning and hierarchical algorithms, specifically k-means and Principal Direction Divisive Partitioning (PDDP). ...
The Workshop was organized by Michael Berry and Malu Castellanos. We thank them and Murray Browne for inviting us to contribute to the current volume. ...
doi:10.1007/978-1-84800-046-9_3
fatcat:nqe4a6vq6bdvnb3wvkc3klpaea
Nearly Optimal Clustering Risk Bounds for Kernel K-Means
[article]
2020
arXiv
pre-print
In this paper, we study the statistical properties of kernel k-means and obtain a nearly optimal excess clustering risk bound, substantially improving the state-of-the-art bounds in the existing clustering ...
We further analyze the statistical effect of computational approximations of the Nyström kernel k-means, and prove that it achieves the same statistical accuracy as the exact kernel k-means considering ...
Guan, and B. Kulis. Kernel k-means: spectral clustering and normalized cuts. ...
arXiv:2003.03888v2
fatcat:n46agji7zrca3flkeuvyjqabay
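The Nyström approximation analyzed in this entry replaces the full n x n kernel matrix with a low-rank factorization built from a small set of sampled landmark columns. A minimal sketch, where the RBF kernel, the landmark count m, and all data are illustrative assumptions:

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """RBF kernel matrix between row sets X and Y."""
    d2 = ((X[:, None] - Y[None]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, m=50, gamma=1.0, seed=0):
    """Return Z (n x m) such that Z @ Z.T approximates the full kernel K."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]                      # m landmark points
    C = rbf(X, L, gamma)            # n x m cross-kernel
    W = rbf(L, L, gamma)            # m x m landmark kernel
    # Pseudo-inverse square root of W via its eigendecomposition.
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-10
    W_inv_sqrt = vecs[:, keep] @ np.diag(vals[keep] ** -0.5) @ vecs[:, keep].T
    return C @ W_inv_sqrt

X = np.random.default_rng(2).standard_normal((300, 5))
Z = nystrom_features(X, m=60, gamma=0.5)
K_approx = Z @ Z.T
K_exact = rbf(X, X, gamma=0.5)
err = np.abs(K_approx - K_exact).max()
```

Running ordinary k-means on the rows of Z then approximates kernel k-means on K at O(nm) rather than O(n^2) kernel cost, which is the computational regime the paper's risk analysis addresses.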
The Global Kernel k-Means Algorithm for Clustering in Feature Space
2009
IEEE Transactions on Neural Networks
In order to overcome the cluster initialization problem associated with this method, we propose the global kernel k-means algorithm, a deterministic and incremental approach to kernel-based clustering. ...
Kernel k-means is an extension of the standard k-means clustering algorithm that identifies nonlinearly separable clusters. ...
Fig. 4. Normalized cut values achieved by global kernel k-means and its variants, as well as by 50 restarts of kernel k-means: (a) 32 clusters; (b) 64 clusters; (c) 128 clusters. ...
doi:10.1109/tnn.2009.2019722
pmid:19493848
fatcat:binf4vdmbjgafaumx7jznvxnwm
A distributed framework for trimmed Kernel k-Means clustering
2015
Pattern Recognition
Our distributed approach follows the MapReduce programming model and consists of 3 stages: the kernel matrix computation, a novel matrix trimming method, and the Kernel k-Means clustering algorithm. ...
In this paper, we present such an approach to the Kernel k-Means clustering algorithm, in order to make its application to a large number of samples feasible and, thus, achieve high performance clustering ...
Interestingly, it has been proven that Kernel k-Means, Spectral Clustering and Normalized Graph Cuts are closely related tasks [11]. ...
doi:10.1016/j.patcog.2015.02.020
fatcat:xhxoq5447vc2jgruaawkyiazlm
Efficient High-Dimensional Kernel k-Means with Random Projection++
2021
Applied Sciences
We approximate the kernel matrix and distances in a lower-dimensional space R^d before the kernel k-means clustering, motivated by upper error bounds. ...
Using random projection, a method to speed up both kernel k-means and centroid initialization with k-means++ is proposed. ...
That connection leads to a weighted kernel k-means that monotonically decreases the normalized cut. ...
doi:10.3390/app11156963
fatcat:4bj474a26jeo3hrikxmiftzzt4
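The random projection step this entry describes maps high-dimensional points into R^d while approximately preserving pairwise distances (Johnson-Lindenstrauss style), after which k-means runs in the cheaper space. A minimal sketch with illustrative dimensions and a plain Gaussian projection matrix (the paper's exact construction may differ):

```python
import numpy as np

def random_project(X, d, seed=0):
    """Project rows of X from R^D to R^d with a scaled Gaussian matrix."""
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((X.shape[1], d)) / np.sqrt(d)
    return X @ R

X = np.random.default_rng(3).standard_normal((500, 1000))
Xp = random_project(X, d=50)

# Pairwise distances are preserved in expectation; check one pair.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Xp[0] - Xp[1])
```

With d chosen per the error bounds cited above, centroid initialization (e.g. k-means++) and the Lloyd iterations can both operate on Xp instead of X.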
Beyond the Nystrom Approximation: Speeding up Spectral Clustering using Uniform Sampling and Weighted Kernel k-means
2017
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
k-means (Dhillon et al. 2004) and use the resulting centers to compute a clustering for the remaining data points. ...
In this paper we present a framework for spectral clustering based on the following simple scheme: sample a subset of the input points, compute the clusters for the sampled subset using weighted kernel ...
The connection between spectral clustering and weighted kernel k-means was first introduced in [Dhillon et al., 2004] . ...
doi:10.24963/ijcai.2017/347
dblp:conf/ijcai/MohanM17
fatcat:zut2hsyshrdkfdviuzdeaig74y
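The scheme in this entry is: sample a subset, cluster it, then extend the result to all points via the learned centers. A minimal sketch of that sample-then-extend pattern, using plain Lloyd's k-means as a stand-in for the paper's weighted kernel k-means step; the blob data and all parameters are illustrative assumptions:

```python
import numpy as np

def sample_and_extend(X, k, m, n_iter=50, seed=0):
    """Cluster a random subset of m points with Lloyd's k-means,
    then assign every point in X to its nearest resulting center."""
    rng = np.random.default_rng(seed)
    sub = X[rng.choice(len(X), size=m, replace=False)]
    centers = sub[rng.choice(m, size=k, replace=False)]
    for _ in range(n_iter):
        d = ((sub[:, None] - centers[None]) ** 2).sum(-1)  # m x k
        lab = d.argmin(1)
        centers = np.array([sub[lab == c].mean(0) if (lab == c).any()
                            else centers[c] for c in range(k)])
    d_all = ((X[:, None] - centers[None]) ** 2).sum(-1)    # n x k
    return d_all.argmin(1), centers

rng = np.random.default_rng(4)
X = np.r_[rng.standard_normal((200, 2)),
          rng.standard_normal((200, 2)) + 6]  # two well-separated blobs
labels, centers = sample_and_extend(X, k=2, m=80)
```

The cost of the clustering step depends only on the sample size m, which is what makes the scheme competitive with Nyström-based spectral clustering.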
Local Network Community Detection with Continuous Optimization of Conductance and Weighted Kernel K-Means
[article]
2016
arXiv
pre-print
We investigate the relation of conductance with weighted kernel k-means for a single community, which leads to the introduction of a new objective function, σ-conductance. ...
We prove locality and give performance guarantees for EMc and PGDc for a class of dense and well separated communities centered around the seeds. ...
The Relation to Weighted Kernel K-Means Clustering Another view on conductance is by the connection to weighted kernel k-means clustering. ...
arXiv:1601.05775v2
fatcat:q7bcazup7rdehoreohbuvfwvwy
Kernel Cuts: MRF meets Kernel & Spectral Clustering
[article]
2016
arXiv
pre-print
While focusing on graph cut and move-making techniques, our new unary (linear) kernel and spectral bound formulations for common pairwise clustering criteria allow integrating them with any regularization ...
Markov Random Field (MRF) potentials, and standard pairwise clustering criteria like Normalized Cut (NC), average association (AA), etc. ...
We also thank Jianbo Shi (UPenn, USA) for his feedback and his excellent and highly appreciated spectral-relaxation optimization code for normalized cuts. ...
arXiv:1506.07439v6
fatcat:ltuxk3oflzfhlfvlfx3kfxjxee
Distributed, MapReduce-Based Nearest Neighbor and E-Ball Kernel k-Means
2015
2015 IEEE Symposium Series on Computational Intelligence
Kernel k-Means is a state-of-the-art clustering algorithm, which employs the kernel trick, in order to perform clustering on a higher dimensionality space, thus overcoming the limitations of classic k-Means ...
In this paper, we present a MapReduce-based distributed implementation of Nearest Neighbor and ε-ball Kernel k-Means. ...
Interestingly, it has been proven that Kernel k-Means, Spectral Clustering and Normalized Graph Cuts are closely related tasks [11]. ...
doi:10.1109/ssci.2015.81
dblp:conf/ssci/TsapanosTNP15
fatcat:djn6hiv2xnazjfngdrji26g7mm