54,986 Hits in 9.6 sec

Deep clustering: On the link between discriminative models and K-means [article]

Mohammed Jabi, Marco Pedersoli, Amar Mitiche, Ismail Ben Ayed
2019 arXiv   pre-print
On the surface, several recent discriminative models may seem unrelated to K-means.  ...  In the context of recent deep clustering studies, discriminative models dominate the literature and report the most competitive performances.  ...  In order to confirm the theoretical link in Proposition 2 between the discriminative model MI-ADM in (13) and the generative model SR-K-means in (18), we evaluated them on two handwriting  ... 
arXiv:1810.04246v2 fatcat:oxixjov3sbgmhh4tp7xvwwkjmi

Discovering New Intents with Deep Aligned Clustering [article]

Hanlei Zhang, Hua Xu, Ting-En Lin, Rui Lyu
2021 arXiv   pre-print
Firstly, we leverage a few labeled known intent samples as prior knowledge to pre-train the model. Then, we perform k-means to produce cluster assignments as pseudo-labels.  ...  Extensive experiments on two benchmark datasets show that our method is more robust and achieves substantial improvements over the state-of-the-art methods.  ...  This work is also supported by the seed fund of the Tsinghua University (Department of Computer Science and Technology)-Siemens Ltd., China Joint Research Center for Industrial Intelligence and Internet of Things  ... 
arXiv:2012.08987v7 fatcat:w444n65osjdldjds6qhkqresge
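
As a reading aid for the entry above (not the authors' code), here is a minimal sketch of the pseudo-labeling step the snippet describes: features from a pre-trained encoder are clustered with k-means, and the resulting cluster indices are reused as pseudo-labels for the next training round. The random features and the cluster count of 10 are placeholders.

```python
# Hypothetical sketch: k-means pseudo-labels from pre-trained features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 64))        # stand-in for encoder outputs

kmeans = KMeans(n_clusters=10, n_init=10, random_state=0)
pseudo_labels = kmeans.fit_predict(features)  # one pseudo-label per sample

# In the approach sketched in the abstract, these pseudo-labels would then
# supervise the next round of training (e.g. via a cross-entropy loss).
print(pseudo_labels[:10])
```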

Deep Goal-Oriented Clustering [article]

Yifeng Shi, Christopher M. Bender, Junier B. Oliva, Marc Niethammer
2020 arXiv   pre-print
To this end, we introduce Deep Goal-Oriented Clustering (DGC), a probabilistic framework that clusters the data by jointly using supervision via side-information and unsupervised modeling of the inherent  ...  One could reasonably expect that appropriately clustering the data would aid the downstream prediction task and, conversely, that a better prediction performance for the downstream task could potentially inform  ...  Research detailed in this work was supported by the National Science Foundation (NSF) under award numbers NSF EECS-1711776 and NSF EECS-1610762.  ... 
arXiv:2006.04259v3 fatcat:hoorwgiohvh7db4aigkuw2e6b4

Deep Lifetime Clustering [article]

S Chandra Mouli, Leonardo Teixeira, Jennifer Neville, Bruno Ribeiro
2019 arXiv   pre-print
We introduce a neural-network based lifetime clustering model that can find cluster assignments by directly maximizing the divergence between the empirical lifetime distributions of the clusters.  ...  The goal of lifetime clustering is to develop an inductive model that maps subjects into K clusters according to their underlying (unobserved) lifetime distribution.  ...  We compare the following lifetime clustering approaches: (a) SSC-Bair, a semi-supervised clustering method (Bair and Tibshirani, 2004) that performs k-means clustering on selected covariates that have  ... 
arXiv:1910.00547v2 fatcat:euggux4vgbfivmepehug4k7opy
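
The entry above maximizes a divergence between the empirical lifetime distributions of clusters. As a purely illustrative measurement (the paper defines its own divergence, which this does not reproduce), one can compare two clusters' lifetime samples with a two-sample Kolmogorov-Smirnov statistic; real lifetime data is typically censored, which this toy ignores.

```python
# Toy divergence between two clusters' empirical lifetime distributions.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
lifetimes_a = rng.exponential(scale=1.0, size=300)   # short-lived cluster
lifetimes_b = rng.exponential(scale=3.0, size=300)   # long-lived cluster

stat, _ = ks_2samp(lifetimes_a, lifetimes_b)
print("empirical divergence between the two lifetime distributions:", stat)
```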

Deep Clustering with a Dynamic Autoencoder: From Reconstruction towards Centroids Construction [article]

Nairouz Mrabah, Naimul Mefraz Khan, Riadh Ksantini, Zied Lachiri
2020 arXiv   pre-print
In this paper, we propose Dynamic Autoencoder (DynAE), a novel model for deep clustering that overcomes the clustering-reconstruction trade-off by gradually and smoothly eliminating the reconstruction objective  ...  Experimental evaluations on benchmark datasets show that our approach achieves state-of-the-art results compared to the most relevant deep clustering methods.  ...  We have also compared our model against AE+K-Means and AE+FINCH, where the latent samples are clustered using K-Means and FINCH, respectively, after training an autoencoder to reconstruct the data.  ... 
arXiv:1901.07752v5 fatcat:xhdct5kx4rhq7ghllwddoisbzq
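
The AE+K-Means baseline mentioned in the snippet above is a common two-stage recipe: train a plain autoencoder on reconstruction only, then run k-means on the latent codes. The sketch below follows that recipe with an illustrative architecture and full-batch training; the layer sizes and data are placeholders, not DynAE's setup.

```python
# Hedged sketch of an AE+K-Means baseline (two-stage: reconstruct, then cluster).
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

X = torch.randn(1000, 784)                  # stand-in for flattened images

encoder = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
decoder = nn.Sequential(nn.Linear(10, 256), nn.ReLU(), nn.Linear(256, 784))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

for epoch in range(20):                     # reconstruction-only training
    opt.zero_grad()
    z = encoder(X)
    loss = nn.functional.mse_loss(decoder(z), X)
    loss.backward()
    opt.step()

with torch.no_grad():
    latent = encoder(X).numpy()

clusters = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(latent)
print(clusters[:10])
```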

Semi-supervised deep embedded clustering

Yazhou Ren, Kangrong Hu, Xinyi Dai, Lili Pan, Steven C.H. Hoi, Zenglin Xu
2019 Neurocomputing  
Deep embedded clustering (DEC) is one of the state-of-the-art deep clustering methods. However, DEC does not make use of prior knowledge to guide the learning process.  ...  Extensive experiments on real benchmark data sets validate the effectiveness and robustness of the proposed method.  ...  simultaneously by integrating k-means and linear discriminant analysis (LDA) into a joint framework.  ... 
doi:10.1016/j.neucom.2018.10.016 fatcat:gnskmuuy55fitjukpooqs2gr5q

Deep Fusion Clustering Network [article]

Wenxuan Tu, Sihang Zhou, Xinwang Liu, Xifeng Guo, Zhiping Cai, En Zhu, Jieren Cheng
2020 arXiv   pre-print
Extensive experiments on six benchmark datasets have demonstrated that the proposed DFCN consistently outperforms the state-of-the-art deep clustering methods.  ...  To tackle the above issues, we propose a Deep Fusion Clustering Network (DFCN).  ...  Among them, K-means (Hartigan and Wong 1979) is a representative classic shallow clustering method.  ... 
arXiv:2012.09600v1 fatcat:xypxfi43u5c6fgpsxscpzmfdoe

Deep convolutional embedding for digitized painting clustering [article]

Giovanna Castellano, Gennaro Vessio
2020 arXiv   pre-print
The model is also capable of outperforming other state-of-the-art deep clustering approaches to the same problem.  ...  On the other hand, applying traditional clustering and feature reduction techniques to the high-dimensional pixel space can be ineffective.  ...  Gennaro Vessio acknowledges funding support from the Italian Ministry of Education, University and Research through the PON AIM 1852414 project.  ... 
arXiv:2003.08597v2 fatcat:mhxomrao65g5vmeuwx3ehp3lsu

Regression Based Clustering by Deep Adversarial Learning

Fei Tang, Dabin Zhang, Tie Cai, Qin Li
2020 IEEE Access  
Despite their great success, existing regression clustering methods based on shallow models are vulnerable because: (1) they often pay no attention to the combination between learning representations and  ...  By seamlessly combining with the stacked autoencoder, the proposed model integrates learning deep nonlinear latent representations and clustering in a unified framework.  ...  is the cluster assignment produced by the algorithm, and m(·) ranges over all possible one-to-one mappings between clusters and labels, and n denotes the number of samples.  ... 
doi:10.1109/access.2020.3014631 fatcat:zmws2m7o4fezhetj2pkbwwpgjy
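
The last snippet in the entry above refers to the usual unsupervised clustering accuracy: accuracy under the best one-to-one mapping m(·) between cluster ids and class labels. A standard way to compute it (a sketch, not necessarily the authors' implementation) is the Hungarian algorithm over the cluster-label co-occurrence matrix.

```python
# Clustering accuracy via the Hungarian algorithm (best one-to-one mapping).
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_accuracy(y_true, y_pred):
    """Accuracy maximized over all one-to-one cluster-to-label mappings m(.)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    k = int(max(y_true.max(), y_pred.max())) + 1
    cost = np.zeros((k, k), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1                         # co-occurrence counts
    rows, cols = linear_sum_assignment(-cost)   # negate to maximize matches
    return cost[rows, cols].sum() / len(y_true)

print(clustering_accuracy([0, 0, 1, 1, 2], [2, 2, 0, 0, 1]))  # -> 1.0
```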

Joint Unsupervised Learning of Deep Representations and Image Clusters [article]

Jianwei Yang, Devi Parikh, Dhruv Batra
2016 arXiv   pre-print
In this paper, we propose a recurrent framework for Joint Unsupervised LEarning (JULE) of deep representations and image clusters.  ...  Extensive experiments show that our method outperforms the state-of-the-art on image clustering across a variety of image datasets.  ...  As we can see, based on raw image data, all datasets have high ratios when K is smaller, and the ratios increase further when using our learned deep representations.  ... 
arXiv:1604.03628v3 fatcat:uzzpuxn7ynbc7bw3qi4gcucvzq

Deep Transductive Semi-supervised Maximum Margin Clustering [article]

Gang Chen
2015 arXiv   pre-print
The experimental results show that our model is significantly better than the state of the art on semi-supervised clustering.  ...  Thus, our model unifies transductive learning, feature learning, and maximum margin techniques in the semi-supervised clustering framework.  ...  Thus, we used those methods to learn the metric and calculate the distances between all instances, then we used kernel k-means [7] for clustering.  ... 
arXiv:1501.06237v1 fatcat:lgagevqxd5fhncpw225etxhcx4
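
The last snippet in the entry above pairs a learned metric with kernel k-means. The sketch below is a compact numpy version of kernel k-means on a precomputed kernel matrix; the RBF kernel and random data stand in for the learned metric and are not taken from the paper.

```python
# Kernel k-means on a precomputed kernel matrix (illustrative sketch).
import numpy as np

def kernel_kmeans(K, n_clusters, n_iter=50, seed=0):
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=n)
    for _ in range(n_iter):
        dist = np.zeros((n, n_clusters))
        for c in range(n_clusters):
            mask = labels == c
            size = max(mask.sum(), 1)
            # ||phi(x_i) - mu_c||^2 = K_ii - 2/|c| * sum_j K_ij + 1/|c|^2 * sum_jl K_jl
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / size
                          + K[np.ix_(mask, mask)].sum() / size ** 2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

X = np.random.default_rng(0).normal(size=(200, 5))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)                       # RBF kernel as a stand-in metric
print(np.bincount(kernel_kmeans(K, n_clusters=3)))
```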

Deep Conditional Gaussian Mixture Model for Constrained Clustering [article]

Laura Manduchi, Kieran Chin-Cheong, Holger Michel, Sven Wellmann, Julia E. Vogt
2022 arXiv   pre-print
Following recent advances in deep generative models, we propose a novel framework for constrained clustering that is intuitive, interpretable, and can be trained efficiently in the framework of stochastic  ...  We provide extensive experiments to demonstrate that DC-GMM shows superior clustering performance and robustness compared to state-of-the-art deep constrained clustering methods on a wide range of data  ...  We would like to thank Luca Corinzia (ETH Zurich) for the fruitful discussions that contributed to shaping this work.  ... 
arXiv:2106.06385v3 fatcat:pho4neagszau5aqi6tbotlzwkq

Discovering New Intents via Constrained Deep Adaptive Clustering with Cluster Refinement [article]

Ting-En Lin, Hua Xu, Hanlei Zhang
2019 arXiv   pre-print
Moreover, we refine the clusters by forcing the model to learn from the high-confidence assignments.  ...  In this paper, we propose constrained deep adaptive clustering with cluster refinement (CDAC+), an end-to-end clustering method that can naturally incorporate pairwise constraints as prior knowledge to  ...  For example, COP-KMeans (Wagstaff et al. 2001) uses must-link and cannot-link relations between samples as hard constraints.  ... 
arXiv:1911.08891v1 fatcat:uvaea6pvofd7rdjlfjxojf6ylq
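
The COP-KMeans citation in the entry above refers to enforcing must-link and cannot-link pairs as hard constraints during assignment. A minimal sketch of the violation check (hypothetical helper, not the CDAC+ code):

```python
# COP-KMeans-style hard-constraint check for a candidate cluster assignment.
def violates_constraints(point, cluster, assignment, must_link, cannot_link):
    """assignment maps already-placed points to cluster ids."""
    for a, b in must_link:
        other = b if a == point else a if b == point else None
        if other in assignment and assignment[other] != cluster:
            return True                      # must-link partner sits elsewhere
    for a, b in cannot_link:
        other = b if a == point else a if b == point else None
        if other in assignment and assignment[other] == cluster:
            return True                      # cannot-link partner already here
    return False

# Point 2 must link with point 0 (already in cluster 1), so cluster 0 is rejected.
print(violates_constraints(2, 0, {0: 1, 1: 0}, must_link=[(0, 2)], cannot_link=[]))
```

In COP-KMeans, a point whose nearest centroid fails this check is tried against the next-nearest centroid, and the algorithm aborts if no feasible cluster remains.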

Hierarchical clustering with deep Q-learning

Richárd Forster, Agnes Fülöp
2018 Acta Universitatis Sapientiae: Informatica  
The technique chosen for training is reinforcement learning, which allows the system to evolve based on interactions between the model and the underlying graph.  ...  Following up on our previous study on applying hierarchical clustering algorithms to high energy particle physics, this paper explores the possibilities of using deep learning to generate models capable  ...  A comparison is given on the modularity of the clustering produced by the standard Louvain method and by a modified version that uses the generated model to predict future communities.  ... 
doi:10.2478/ausi-2018-0006 fatcat:bltuutgltbdlrfvvfjoqsy625m
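
The last snippet in the entry above compares partitions by modularity. A toy illustration (assuming a recent networkx with louvain_communities, and using a greedy partition as a stand-in for the model-generated communities):

```python
# Compare the modularity of two partitions of the same graph.
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()

louvain_parts = community.louvain_communities(G, seed=0)      # standard Louvain
greedy_parts = community.greedy_modularity_communities(G)     # stand-in partition

print("Louvain modularity:", community.modularity(G, louvain_parts))
print("Alternative modularity:", community.modularity(G, greedy_parts))
```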

Semi-Supervised Deep Time-Delay Embedded Clustering for Stress Speech Analysis

Barlian Henryranu Prasetio, Hiroki Tamura, Koichi Tanno
2019 Electronics  
However, DTEC has not yet confirmed the compatibility between the output classes and the informational classes.  ...  called deep time-delay embedded clustering (DTEC).  ...  For more than two decades, K-means has been the most popular clustering method due to its speed in finding similarities between data points.  ... 
doi:10.3390/electronics8111263 fatcat:43nfk7mgbzc3xdeoltbr3elp4a
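
For reference (an illustrative sketch, not tied to DTEC), the two k-means steps the last snippet alludes to, written with vectorized numpy distances, which is what makes the method fast in practice:

```python
# Plain k-means: vectorized assignment step and centroid update step.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))
centroids = X[rng.choice(len(X), size=4, replace=False)]

for _ in range(20):
    # assignment: squared Euclidean distance from every point to every centroid
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    labels = d2.argmin(axis=1)
    # update: each centroid moves to the mean of its assigned points
    centroids = np.stack([X[labels == c].mean(axis=0) if np.any(labels == c)
                          else centroids[c] for c in range(4)])

print(np.bincount(labels, minlength=4))
```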
Showing results 1 — 15 out of 54,986 results