Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering

Yoshua Bengio, Jean-François Paiement, Pascal Vincent, Olivier Delalleau, Nicolas Le Roux, Marie Ouimet
2003 Neural Information Processing Systems  
This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, Multi-Dimensional Scaling (for dimensionality reduction) as well as for Spectral Clustering  ...  Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for given training points, with no straightforward extension for out-of-sample examples  ...  The generalized embedding for Isomap and MDS is e_k(x) = √λ_k · y_k(x), whereas the one for spectral clustering, Laplacian eigenmaps and LLE is y_k(x).  ... 
dblp:conf/nips/BengioPVDRO03 fatcat:ewgfb2mxrvaqrkz2o4nf4ih7gq
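A minimal sketch of the Nyström-style out-of-sample extension this abstract describes. A plain Gaussian kernel stands in for the data-dependent kernels the paper constructs for each method, and the names (`gaussian_kernel`, `embed`, the toy data, `sigma`) are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))     # training points
K = gaussian_kernel(X, X)        # n x n Gram matrix

# Eigendecomposition of the training kernel: K v_k = lam_k v_k.
lam, V = np.linalg.eigh(K)
lam, V = lam[::-1], V[:, ::-1]   # sort eigenvalues descending
n, d = K.shape[0], 2             # keep a 2-D embedding

# Nystrom formula: y_k(x) = (sqrt(n) / lam_k) * sum_i V[i, k] * K(x, x_i)
def embed(Xnew):
    Kx = gaussian_kernel(Xnew, X)        # m x n cross-kernel
    return np.sqrt(n) * (Kx @ V[:, :d]) / lam[:d]

# Applied to a training point, the formula reproduces its training
# coordinates y_k(x_i) = sqrt(n) * V[i, k] up to floating-point error.
Y_train = np.sqrt(n) * V[:, :d]
assert np.allclose(embed(X), Y_train, atol=1e-8)
```

The Isomap/MDS variant mentioned in the snippet would then rescale each coordinate by √λ_k.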

Learning Eigenfunctions Links Spectral Embedding and Kernel PCA

Yoshua Bengio, Olivier Delalleau, Nicolas Le Roux, Jean-François Paiement, Pascal Vincent, Marie Ouimet
2004 Neural Computation  
to out-of-sample examples (the Nyström formula) for Multi-Dimensional Scaling, spectral clustering, Laplacian eigenmaps, Locally Linear Embedding (LLE) and Isomap.  ...  Experiments with LLE, Isomap, spectral clustering and MDS show that this out-of-sample embedding formula generalizes well, with a level of error comparable to the effect of small perturbations of the training  ...  Acknowledgments The authors would like to thank Léon Bottou, Christian Léger, Sam Roweis, Yann Le Cun, and Yves Grandvalet for helpful discussions, the anonymous reviewers for their comments, and the following  ... 
doi:10.1162/0899766041732396 pmid:15333211 fatcat:uzc4rzlorzfczdclqyeq7hohxy

Spectral Dimensionality Reduction [chapter]

Yoshua Bengio, Olivier Delalleau, Nicolas Le Roux, Jean-François Paiement, Pascal Vincent, Marie Ouimet
2006 Studies in Fuzziness and Soft Computing  
Acknowledgments The authors would like to thank Léon Bottou, Christian Léger, Sam Roweis, Yann Le Cun, and Yves Grandvalet for helpful discussions, and the following funding organizations: NSERC, MITACS, IRIS, and the Canada Research Chairs.  ...  is often the case for MDS and Isomap), a convergence theorem linking the Nyström formula to the eigenfunctions of G, as well as experiments on MDS, Isomap, LLE and spectral clustering / Laplacian eigenmaps  ... 
doi:10.1007/978-3-540-35488-8_28 fatcat:evyhhvxqdjgjvhp5k33isoqqwa

Biased Manifold Embedding: A Framework for Person-Independent Head Pose Estimation

Vineeth Nallure Balasubramanian, Jieping Ye, Sethuraman Panchanathan
2007 2007 IEEE Conference on Computer Vision and Pattern Recognition  
The proposed BME approach is formulated as an extensible framework, and validated with the Isomap, Locally Linear Embedding (LLE) and Laplacian Eigenmaps techniques.  ...  The results showed substantial reduction in the error of pose angle estimation, and robustness to variations in feature spaces, dimensionality of embedding and other parameters.  ...  out-of-sample data points after the training phase.  ... 
doi:10.1109/cvpr.2007.383280 dblp:conf/cvpr/BalasubramanianYP07 fatcat:p2ywm7g2brh2blijffyezdz5cq

Large-scale manifold learning

Ameet Talwalkar, Sanjiv Kumar, Henry Rowley
2008 2008 IEEE Conference on Computer Vision and Pattern Recognition  
Furthermore, approximate Isomap tends to perform better than Laplacian Eigenmaps on both clustering and classification with the labeled CMU-PIE dataset.  ...  Since most manifold learning techniques rely on spectral decomposition, we first analyze two approximate spectral decomposition techniques for large dense matrices (Nyström and Column-sampling), providing  ...  Acknowledgments We thank Sergey Ioffe for use of his face detection code, Michael Riley for his assistance with OpenFST.  ... 
doi:10.1109/cvpr.2008.4587670 dblp:conf/cvpr/TalwalkarKR08 fatcat:pmgtmeuobva5bkvxzspgwi4a5u
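A sketch of the Nyström low-rank approximation analyzed in this paper for large dense matrices: sample m landmark columns C of a PSD matrix K, take the landmark-landmark block W, and reconstruct K ≈ C W⁺ Cᵀ. The toy kernel, landmark count, and variable names are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
# A dense PSD similarity matrix we want to avoid eigendecomposing in full.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / 50.0)

# Nystrom: sample m landmark columns, then K ~= C @ pinv(W) @ C.T,
# where C = K[:, idx] and W is the landmark-landmark block.
m = 60
idx = rng.choice(K.shape[0], size=m, replace=False)
C = K[:, idx]
W = K[np.ix_(idx, idx)]
K_approx = C @ np.linalg.pinv(W) @ C.T

# Relative Frobenius error of the rank-m reconstruction.
err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
print(f"relative Frobenius error: {err:.2e}")
```

Column-sampling, the alternative the paper compares against, instead works from the SVD of the sampled columns C alone; both cost far less than the full O(n³) eigendecomposition.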

Large-Scale Manifold Learning [chapter]

Ameet Talwalkar, Sanjiv Kumar, Mehryar Mohri, Henry Rowley
2011 Manifold Learning Theory and Applications  
Furthermore, approximate Isomap tends to perform better than Laplacian Eigenmaps on both clustering and classification with the labeled CMU-PIE dataset.  ...  Since most manifold learning techniques rely on spectral decomposition, we first analyze two approximate spectral decomposition techniques for large dense matrices (Nyström and Column-sampling), providing  ...  Acknowledgments We thank Sergey Ioffe for use of his face detection code, Michael Riley for his assistance with OpenFST.  ... 
doi:10.1201/b11431-7 fatcat:t4v2ozja25c3ne7zcoyzlus4wy

Dimensionality reduction for classification of object weight from electromyography

Elnaz Lashgari, Uri Maoz, Luca Citi
2021 PLoS ONE  
What is more, optimal classification accuracy was achieved using a combination of Laplacian Eigenmaps (simple-minded) and k-Nearest Neighbors (88% F1 score for 3-way classification).  ...  the reach-and-grasp motions of the human hand.  ...  Nyström approximation supports out-of-sample extensions for spectral techniques such as ISOMAP, LLE, and Laplacian Eigenmaps.  ... 
doi:10.1371/journal.pone.0255926 pmid:34398924 pmcid:PMC8367006 fatcat:jkg7r63g5ffndnpwx4nzzzkj4y

Robust non-linear dimensionality reduction using successive 1-dimensional Laplacian Eigenmaps

Samuel Gerber, Tolga Tasdizen, Ross Whitaker
2007 Proceedings of the 24th international conference on Machine learning - ICML '07  
Recent results in the literature show that spectral decomposition, as used for example by the Laplacian Eigenmaps algorithm, provides a powerful tool for non-linear dimensionality reduction and manifold  ...  Experiments with artificial and real data illustrate the advantages of the proposed method over existing approaches.  ...  We also thank Xavier Tricoche for the helpful discussions and the anonymous reviewers for many helpful comments.  ... 
doi:10.1145/1273496.1273532 dblp:conf/icml/GerberTW07 fatcat:zukwxpt5sbedpemryposbdq5ha

Electromyography Classification during Reach-to-Grasp Motion using Manifold Learning [article]

Elnaz Lashgari, Uri Maoz
2020 bioRxiv   pre-print
What is more, optimal classification accuracy was achieved using a combination of Laplacian Eigenmaps (simple-minded) and k-Nearest Neighbors (88% for 3-way classification).  ...  They also demonstrate the usefulness of dimensionality reduction when classifying movement based on EMG signals and more generally the usefulness of EMG for movement classification.  ...  Further, spectral techniques such as ISOMAP, LLE, and Laplacian Eigenmaps support out-of-sample extensions via the Nyström approximation.  ... 
doi:10.1101/2020.07.16.207639 fatcat:rcbsg3rhvndyxibdwz3xgmoece

Out-of-Sample Embedding for Manifold Learning Applied to Face Recognition

F. Dornaika, B. Raduncanu
2013 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops  
Manifold learning techniques are affected by two critical aspects: (i) the design of the adjacency graphs, and (ii) the embedding of new test data-the out-of-sample problem.  ...  To evaluate the effectiveness of the proposed out-of-sample embedding, experiments are conducted using the k-nearest neighbor (KNN) and Kernel Support Vector Machines (KSVM) classifiers on four public  ...  In [2] , the authors cast MDS, ISOMAP, LLE, and LE in a common framework, in which these methods are seen as learning eigenfunctions of a kernel.  ... 
doi:10.1109/cvprw.2013.127 dblp:conf/cvpr/DornaikaR13 fatcat:7embzgl2wbcntc3nkedihio3ua

Unsupervised dimensionality reduction: Overview and recent advances

John A. Lee, Michel Verleysen
2010 The 2010 International Joint Conference on Neural Networks (IJCNN)  
This paper attempts to give a broad overview of the domain. Past developments are briefly introduced and pinned up on the time line of the last eleven decades.  ...  Next, the principles and techniques involved in the major methods are described. A taxonomy of the methods is suggested, taking into account various properties.  ...  For some nonlinear spectral methods, such as kernel PCA, Isomap, LLE, and Laplacian eigenmaps, the Nyström formula can be applied [4] , since they consist in applying classical metric MDS on a Gram matrix  ... 
doi:10.1109/ijcnn.2010.5596721 dblp:conf/ijcnn/LeeV10a fatcat:6cbvynny4ngmnoxfzjpfmy5g2e

Hyperspectral image segmentation using spatial-spectral graphs

David B. Gillis, Jeffrey H. Bowles, Sylvia S. Shen, Paul E. Lewis
2012 Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XVIII  
and spectral distances between nodes.  ...  The advantages of our approach are that, first, the graph structure naturally incorporates both the spatial and spectral information present in HSI; also, by using only spatial neighbors, the adjacency  ...  ACKNOWLEDGEMENTS This research was funded in part by the US Office of Naval Research and the US Department of Defense.  ... 
doi:10.1117/12.919743 fatcat:lblyrm2sfvbbndpcfmpfw42xoe

An online generalized eigenvalue version of Laplacian Eigenmaps for visual big data

Zeeshan Khawar Malik, Amir Hussain, Jonathan Wu
2016 Neurocomputing  
batch Isomap method and other manifold-based learning techniques.  ...  This paper presents a novel online version of the Laplacian eigenmap, termed generalized incremental Laplacian eigenmap (GENILE), one of the most popular manifold-based dimensionality reduction techniques  ...  The author in [23] has presented a generalized common framework for locally linear embedding (LLE), multidimensional scaling (MDS), Isomap and the Laplacian eigenmap by proposing a novel Nyström formula for  ... 
doi:10.1016/j.neucom.2014.12.119 fatcat:jydmcce32bhsxewawy3q74pyju

Laplacian-Based Dimensionality Reduction Including Spectral Clustering, Laplacian Eigenmap, Locality Preserving Projection, Graph Embedding, and Diffusion Map: Tutorial and Survey [article]

Benyamin Ghojogh, Ali Ghodsi, Fakhri Karray, Mark Crowley
2021 arXiv   pre-print
Different optimization variants of Laplacian eigenmap and its out-of-sample extension are explained.  ...  Then, we cover the cuts of graph and spectral clustering which applies clustering in a subspace of data.  ...  Acknowledgement Some parts of this tutorial paper (particularly some parts of spectral clustering and Laplacian eigenmap) have been covered by Prof.  ... 
arXiv:2106.02154v1 fatcat:s6rilq4fqzgwxmz346cbk27nqi
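A compact sketch of the Laplacian eigenmap this tutorial surveys: build a symmetric k-NN graph with 0/1 ("simple-minded") weights, form the graph Laplacian L = D − W, and embed with the eigenvectors of the smallest eigenvalues. The toy spiral, k, and the use of the unnormalized Laplacian (rather than the generalized problem L f = λ D f) are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 3 * np.pi, 80))
X = np.column_stack([t * np.cos(t), t * np.sin(t)])  # noise-free spiral

# Symmetric k-NN adjacency with 0/1 ("simple-minded") weights.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
k = 6
W = np.zeros_like(d2)
for i in range(len(X)):
    nn = np.argsort(d2[i])[1:k + 1]   # k nearest, skipping the point itself
    W[i, nn] = 1.0
W = np.maximum(W, W.T)                # symmetrize the graph

# Unnormalized graph Laplacian and its bottom eigenvectors.
L = np.diag(W.sum(1)) - W
vals, vecs = np.linalg.eigh(L)
Y = vecs[:, 1:3]                      # drop the constant null eigenvector

# L always has the constant vector in its null space,
# so the smallest eigenvalue is ~0.
assert abs(vals[0]) < 1e-8
```

The out-of-sample extensions covered in the tutorial then embed new points without recomputing this eigendecomposition, e.g. via the Nyström formula.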

A Survey of Manifold-Based Learning Methods [chapter]

Xiaoming Huo, Xuelei (Sherry) Ni, Andrew K. Smith
2008 Recent Advances in Data Mining of Enterprise Data: Algorithms and Applications  
The representative methods include locally linear embedding (LLE), ISOMAP, Laplacian eigenmaps, Hessian eigenmaps, local tangent space alignment (LTSA), and charting.  ...  We review the ideas, algorithms, and numerical performance of manifold-based machine learning and dimension reduction methods.  ...  Locally Linear Embedding (LLE) Locally linear embedding (LLE) and ISOMAP comprise a new generation of dimension reduction methods.  ... 
doi:10.1142/9789812779861_0015 fatcat:imart6ri7nce3nxhtvcqgez4ci