4,400 Hits in 5.6 sec

GRASPEL: Graph Spectral Learning at Scale [article]

Yongyu Wang, Zhiqiang Zhao, Zhuo Feng
2020 arXiv   pre-print
learning applications, such as spectral clustering (SC), and t-Distributed Stochastic Neighbor Embedding (t-SNE).  ...  By interleaving the latest high-performance nearly-linear time spectral methods for graph sparsification, coarsening and embedding, ultra-sparse yet spectrally-robust graphs can be learned by identifying  ...  The following naive scheme can be exploited for checking the spectral stability of each graph learning iteration: 1) In each iteration, we compute and record the several smallest eigenvalues of the latest  ... 
arXiv:1911.10373v3 fatcat:csaj7yuokrg7fbuhrgk53jvx5u
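The stability check described in this snippet — recording the smallest Laplacian eigenvalues at each graph-learning iteration and comparing them with the previous iteration's — can be sketched as follows. This is an illustrative toy, not GRASPEL's code; the function names and tolerance are assumptions.

```python
# Hypothetical sketch of the spectral-stability check quoted above:
# each iteration records the smallest Laplacian eigenvalues and the
# iteration is declared stable when they stop moving.
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

def smallest_eigs(adj, k=3):
    """Return the k smallest Laplacian eigenvalues (ascending)."""
    return np.linalg.eigvalsh(laplacian(adj))[:k]

def spectrally_stable(prev_eigs, curr_eigs, tol=1e-2):
    """Stable if the recorded eigenvalues barely move between iterations."""
    return np.max(np.abs(prev_eigs - curr_eigs)) < tol

# Toy 4-node path graph, perturbed slightly between "iterations".
a0 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
a1 = a0.copy(); a1[0, 1] = a1[1, 0] = 1.001
stable = spectrally_stable(smallest_eigs(a0), smallest_eigs(a1))
```

At scale one would use a sparse eigensolver (e.g. Lanczos on the shifted Laplacian) rather than the dense `eigvalsh` shown here.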

Similarity-Aware Spectral Sparsification by Edge Filtering [article]

Zhuo Feng
2018 arXiv   pre-print
Motivated by recent graph signal processing techniques, this paper proposes a similarity-aware spectral graph sparsification framework that leverages efficient spectral off-tree edge embedding and filtering  ...  An iterative graph densification scheme is introduced to facilitate efficient and effective filtering of off-tree edges for highly ill-conditioned problems.  ...  Estimating λmax via Power Iterations Since generalized power iterations converge at a geometric rate determined by the separation of the two largest generalized eigenvalues λmax = λ1 > λ2 , the error of  ... 
arXiv:1711.05135v3 fatcat:cxtdqiq5ljexlgpxdkskg5df2a
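The snippet above notes that (generalized) power iterations converge at a geometric rate set by the separation of the two largest eigenvalues. A minimal sketch of plain power iteration illustrates the idea; the paper's version operates on a matrix pencil, and this standalone example is an assumption-laden simplification.

```python
# Sketch (not the paper's code): ordinary power iteration estimating
# lambda_max; convergence rate is governed by |lambda2/lambda1|.
import numpy as np

def power_iteration(A, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)       # keep the iterate normalized
    return x @ A @ x                  # Rayleigh quotient ~ lambda_max

A = np.diag([4.0, 2.0, 1.0])          # lambda1 = 4, lambda2 = 2
lam_max = power_iteration(A)
```

Here the error shrinks like (2/4)^k per iteration, matching the geometric rate the abstract mentions.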

A Unified Deep Metric Representation for Mesh Saliency Detection and Non-rigid Shape Matching

Shanfeng Hu, Hubert Shum, Nauman Aslam, Frederick W.B. Li, Xiaohui Liang
2019 IEEE Transactions on Multimedia  
Index Terms-mesh saliency, non-rigid shape matching, metric learning, deep learning, recurrent neural network  ...  To parameterize the representation from a mesh, we also propose a deep recurrent neural network (RNN) for effectively integrating multi-scale shape features and a soft-thresholding operator for adaptively  ...  our soft-thresholding operator further improves the performance via adaptive metric sparsification.  ... 
doi:10.1109/tmm.2019.2952983 fatcat:mk2kwldvgnf3va2vobx7cjsqc4

Iterative Min Cut Clustering Based on Graph Cuts

Bowen Liu, Zhaoying Liu, Yujian Li, Ting Zhang, Zhilin Zhang
2021 Sensors  
The proposed method solves the min cut model by iteratively computing only one simple formula.  ...  Clustering nonlinearly separable datasets is always an important problem in unsupervised machine learning.  ...  The cost of computing eigenvectors and eigenvalues is high without a built-in tool. Iterative Min Cut Clustering In this section, we propose an iterative min cut clustering (IMC) method.  ... 
doi:10.3390/s21020474 pmid:33440849 fatcat:xj54vycnwzgbrhkwsxwlsibr24

Structured Graph Learning Via Laplacian Spectral Constraints [article]

Sandeep Kumar, Jiaxi Ying, José Vinícius de M. Cardoso, Daniel P. Palomar
2019 arXiv   pre-print
Then we introduce a unified graph learning framework, integrating the spectral properties of the Laplacian matrix with Gaussian graphical modeling, that is capable of learning structures  ...  In this paper, we first show that for a set of important graph families it is possible to convert structural constraints into eigenvalue constraints on the graph Laplacian matrix.  ...  The importance of the graph Laplacian has been well recognized as a tool for embedding, manifold learning, spectral sparsification, clustering and semi-supervised learning [16, 17, 18, 19, 20, 21, 22]  ... 
arXiv:1909.11594v1 fatcat:jczrvhteu5fldpukacj3tryoia

Stochastic Parallelizable Eigengap Dilation for Large Graph Clustering [article]

Elise van der Pol, Ian Gemp, Yoram Bachrach, Richard Everett
2022 arXiv   pre-print
The convergence of iterative singular value decomposition approaches depends on the eigengaps of the spectrum of the given matrix, i.e., the difference between consecutive eigenvalues.  ...  This is accomplished via polynomial approximations to matrix operations that favorably transform the spectrum of a matrix without changing its eigenvectors.  ...  The convergence rate of many iterative solvers depends on the normalized eigengap, i.e. the difference between consecutive eigenvalues relative to the spectral radius (Balcan et al., 2016; Gemp et al.  ... 
arXiv:2207.14589v1 fatcat:plxvtjpcrfdubojcl3344notk4
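The key fact the snippet relies on — applying a polynomial p to a symmetric matrix maps each eigenvalue λᵢ to p(λᵢ) while leaving the eigenvectors unchanged, so a well-chosen p can dilate small eigengaps — can be seen in a tiny example. The cubic below is a toy choice, not the paper's polynomial.

```python
# Illustrative eigengap dilation: p(A) with p(x) = x^3 keeps the
# eigenvectors of A and maps eigenvalues 1, 0.9, 0.1 to 1, 0.729, 0.001,
# widening the top gap from 0.1 to 0.271.
import numpy as np

A = np.diag([1.0, 0.9, 0.1])          # top eigengap = 1 - 0.9 = 0.1
P = A @ A @ A                          # p(A) with p(x) = x^3

evals_A = np.sort(np.linalg.eigvalsh(A))[::-1]
evals_P = np.sort(np.linalg.eigvalsh(P))[::-1]
gap_A = evals_A[0] - evals_A[1]
gap_P = evals_P[0] - evals_P[1]
```

Since iterative eigensolvers converge at a rate governed by the (normalized) eigengap, running them on p(A) instead of A can be markedly cheaper while recovering the same eigenvectors.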

GRASS: Graph Spectral Sparsification Leveraging Scalable Spectral Perturbation Analysis [article]

Zhuo Feng
2020 arXiv   pre-print
Spectral graph sparsification aims to find ultra-sparse subgraphs whose Laplacian matrix can well approximate the original Laplacian eigenvalues and eigenvectors.  ...  Motivated by recent graph signal processing techniques, this paper proposes a similarity-aware spectral graph sparsification framework that leverages efficient spectral off-tree edge embedding and filtering  ...  (generalized eigenvalue) embedding and filtering of off-tree edges by leveraging the recent spectral perturbation analysis framework [12] ; (c) iterative sparsifier improvement (graph densification)  ... 
arXiv:1911.04382v3 fatcat:rpwnqsymhfbq7m4ykfnqqowmne

Hashing with Non-Linear Manifold Learning

Yanzhen Liu, Xiao Bai, Cheng Yan, Jing Wang, Jun Zhou
2016 International Conference on Digital Image Computing: Techniques and Applications (DICTA)  
The retrieval performance is further boosted after iterative quantization process is added to the Diffusion Hashing and Spectral Hashing.  ...  The experiments show that our manifold learning method outperforms several alternative hashing methods.  ...  Laplace Eigenmaps (LE) [22] , Locally Linear Embedding (LLE) [23] , Hessian Locally Linear Embedding (HLLE) [24] and Diffusion Map (DFM) [25] , [26] are local preserving manifold learning methods  ... 
doi:10.1109/dicta.2016.7797046 dblp:conf/dicta/Liu0YWZ16 fatcat:ksvnur5ca5bd3p7ydhiidyeldy

Entrywise convergence of iterative methods for eigenproblems [article]

Vasileios Charisopoulos, Austin R. Benson, Anil Damle
2020 arXiv   pre-print
For large scale problems, the computation of these eigenvectors is typically performed via iterative schemes such as subspace iteration or Krylov methods.  ...  Several problems in machine learning, statistics, and other fields rely on computing eigenvectors.  ...  Broader impact Due to the pervasiveness of spectral methods in machine learning and data mining, our results may be embedded in applications having a wide range of ethical and societal consequences.  ... 
arXiv:2002.08491v2 fatcat:lan5ciy22vbehpfxuq2r5xvas4

Weakly supervised learning from scale invariant feature transform keypoints: an approach combining fast eigendecomposition, regularization, and diffusion on graphs

Youssef Chahir, Abderraouf Bouziane, Messaoud Mostefai, Adnan Al Alwani
2014 Journal of Electronic Imaging (JEI)  
We use iterative deflation to speed up the eigendecomposition of the underlying Laplacian matrix of the embedded graph.  ...  First, we propose a spectral graph embedding of the SIFT points for dimensionality reduction, which provides efficient keypoints transcription into a Euclidean manifold.  ...  A spectral embedding of this graph is performed to define a Euclidean reduced space. 17 To speed up the spectral graph embedding, we propose to use the power iteration algorithm combined with the deflation  ... 
doi:10.1117/1.jei.23.1.013009 fatcat:k6vplew62nbtnhytg5d5bhe4re
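The power-iteration-plus-deflation strategy this snippet describes can be sketched as follows: once the dominant eigenpair converges, the matrix is deflated (A ← A − λ v vᵀ) and the iteration repeats for the next pair. This is an illustrative reconstruction, not the authors' implementation.

```python
# Sketch of power iteration combined with deflation for the top-k
# eigenpairs of a symmetric matrix.
import numpy as np

def top_eigs(A, k=2, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    A = A.copy().astype(float)
    vals, vecs = [], []
    for _ in range(k):
        v = rng.standard_normal(A.shape[0])
        for _ in range(iters):
            v = A @ v
            v /= np.linalg.norm(v)
        lam = v @ A @ v               # Rayleigh quotient
        vals.append(lam); vecs.append(v)
        A -= lam * np.outer(v, v)     # deflation: remove the found pair
    return np.array(vals), np.array(vecs)

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 3 and 1
vals, _ = top_eigs(A)
```

Deflation keeps each subsequent power iteration cheap, which is the speed-up the paper exploits for the Laplacian of the embedded SIFT graph.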

Spectral Network Embedding: A Fast and Scalable Method via Sparsity [article]

Jie Zhang and Yan Wang and Jie Tang and Ming Ding
2018 arXiv   pre-print
Then we introduce a network propagation pattern via spectral analysis to incorporate local and global structure information into the embedding.  ...  In Progle, we first construct a sparse proximity matrix and train the network embedding efficiently via sparse matrix decomposition.  ...  MODEL FRAMEWORK In this section, we first leverage the sparse property of networks to learn sparse network embedding, then incorporate local and global network information into the embedding via spectral  ... 
arXiv:1806.02623v2 fatcat:gppkwgrerfeczlpgob2yj4gnga
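The embedding-by-factorization step this snippet sketches — decompose a proximity matrix and take the leading factors as node embeddings — can be illustrated on a tiny dense example. Progle itself works with sparse matrices at scale; the helper below is a hypothetical toy.

```python
# Toy embedding via truncated SVD of a proximity matrix: leading
# singular vectors, scaled by singular values, serve as node embeddings.
import numpy as np

def svd_embedding(M, dim):
    U, s, _ = np.linalg.svd(M)
    return U[:, :dim] * np.sqrt(s[:dim])   # dim-dimensional embeddings

# Proximity of a 4-node graph with two tight pairs (0,1) and (2,3).
M = np.array([[1.0, 0.9, 0.0, 0.0],
              [0.9, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.8],
              [0.0, 0.0, 0.8, 1.0]])
emb = svd_embedding(M, dim=2)
```

Nodes with high proximity end up close in the embedding space, which is the property downstream tasks (classification, link prediction) rely on.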

LODES: Local Density Meets Spectral Outlier Detection

Saket Sathe, Charu Aggarwal
2016 Proceedings of the 2016 SIAM International Conference on Data Mining  
In this paper, we show how to combine spectral techniques with local density-based methods in order to discover interesting outliers.  ...  Spectral methods, however, are particularly well suited to finding outliers when the data is distributed along manifolds of arbitrary shape.  ...  of the iterative spectral method in our approach and the local density-based spectral embedding used for outlier analysis.  ... 
doi:10.1137/1.9781611974348.20 dblp:conf/sdm/SatheA16 fatcat:xmzbdmpoubfgbjyor6jcqm62gu

Scalable Metric Learning for Co-Embedding [chapter]

Farzaneh Mirzazadeh, Martha White, András György, Dale Schuurmans
2015 Lecture Notes in Computer Science  
For training we provide a fast iterative algorithm that improves the scalability of existing metric learning approaches.  ...  We present a general formulation of metric learning for co-embedding, where the goal is to relate objects from different sets.  ...  To provide decision thresholds, two dummy items can also be embedded from each space, parameterized by u0 and v0 respectively.  ... 
doi:10.1007/978-3-319-23528-8_39 fatcat:jtynxxafkre5lmgfnd6yb7ln7a

Fast semi-supervised discriminant analysis for binary classification of large data sets

Joris Tavernier, Jaak Simm, Karl Meerbergen, Joerg Kurt Wegner, Hugo Ceulemans, Yves Moreau
2019 Pattern Recognition  
Spectral regression For the spectral regression (SR) framework, we need to solve the spectral SDA eigenvalue problem.  ...  Semi-supervised Orthogonal Discriminant Analysis via label propagation (SODA) operates in two stages [24]. First, estimates of the unknown labels are computed via label propagation.  ... 
doi:10.1016/j.patcog.2019.02.015 fatcat:po2watge25gzlj53gri7eny2fu
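The first stage the snippet mentions — estimating unknown labels via label propagation — has a standard iterative form that can be sketched briefly. The update rule, α, and normalization below are generic illustrative choices, not taken from the paper.

```python
# Minimal label-propagation sketch: iterate Y <- alpha*S@Y + (1-alpha)*Y0
# on a row-normalized similarity matrix S, then read off argmax labels.
import numpy as np

def propagate(S, Y0, alpha=0.9, iters=100):
    Y = Y0.copy()
    for _ in range(iters):
        Y = alpha * (S @ Y) + (1 - alpha) * Y0  # diffuse + clamp seeds
    return Y.argmax(axis=1)

# 4 points on a chain; nodes 0 and 3 are labeled (classes 0 and 1).
S = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
S /= S.sum(axis=1, keepdims=True)       # row-normalize similarities
Y0 = np.array([[1, 0], [0, 0], [0, 0], [0, 1]], float)
labels = propagate(S, Y0)
```

Each unlabeled point inherits the class of its nearest labeled seed along the graph, which provides the label estimates the second (discriminant-analysis) stage consumes.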

SpecNet2: Orthogonalization-free spectral embedding by neural networks [article]

Ziyu Chen, Yingzhou Li, Xiuyuan Cheng
2022 arXiv   pre-print
In many application scenarios, parametrizing the spectral embedding by a neural network that can be trained over batches of data samples gives a promising way to achieve automatic out-of-sample extension  ...  The current paper introduces a new neural network approach, named SpecNet2, to compute spectral embedding which optimizes an equivalent objective of the eigen-problem and removes the orthogonalization  ...  spectral embedding of data.  ... 
arXiv:2206.06644v1 fatcat:hiu3awd7qvcrvkwpraruaekz2m
Showing results 1 — 15 out of 4,400 results