Information-Geometric Dimensionality Reduction
2011
IEEE Signal Processing Magazine
We consider the problem of dimensionality reduction and manifold learning when the domain of interest is a set of probability distributions instead of a set of Euclidean data vectors. ...
This article presents methods that are specifically designed for the low-dimensional embedding of information-geometric data, and we illustrate these methods for visualization in flow cytometry and demography ...
The authors also thank Christine Kim of the University of Michigan who collected the data used for Fig. 5 while a summer intern at AFRL. ...
doi:10.1109/msp.2010.939536
fatcat:math4f3xpjfbtg7jjty6rgjcye
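A minimal Python sketch of the general recipe this entry describes: embed a family of probability distributions by computing pairwise information divergences and feeding them to multidimensional scaling. The Hellinger distance and the classical-MDS routine below are illustrative choices for the sketch, not necessarily the ones used in the article, and the toy histograms are synthetic.

```python
# Sketch: embed a set of discrete probability distributions in 2-D by
# (1) computing pairwise Hellinger distances and (2) applying classical MDS.
# The Hellinger distance is one convenient information-geometric choice;
# the article may use other divergences (e.g. Fisher information, KL).
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def classical_mds(D, dim=2):
    """Classical (Torgerson) MDS from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]              # largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy data: each row is a histogram (e.g. a binned flow-cytometry marker).
rng = np.random.default_rng(0)
hists = rng.dirichlet(alpha=np.ones(20), size=30)

D = np.array([[hellinger(p, q) for q in hists] for p in hists])
embedding = classical_mds(D, dim=2)              # 30 x 2 coordinates
print(embedding.shape)
```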
Gaussian Process Subspace Regression for Model Reduction
[article]
2021
arXiv
pre-print
This method is extrinsic and intrinsic at the same time: with multivariate Gaussian distributions on the Euclidean space, it induces a joint probability model on the Grassmann manifold, the set of fixed-dimensional ...
For PROM, the GPS provides a probabilistic prediction at a new parameter point that retains the accuracy of local reduced models, at a computational complexity that does not depend on system dimension, ...
Zimmermann [48] reviewed interpolation methods on the Grassmann manifold and other matrix manifolds that arise in model reduction. ...
arXiv:2107.04668v1
fatcat:hnpvdwacn5b27mzmqluqfyzt7q
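The entry describes probabilistic prediction of reduced subspaces at new parameter points. As a loose, simplified stand-in (not the paper's GPS construction), one can regress vectorized local basis matrices with an ordinary Gaussian process and re-orthonormalize the prediction so it again represents a point on the Grassmann manifold; the RBF kernel, the surrogate bases, and the QR step below are assumptions made purely for illustration.

```python
# Simplified stand-in for subspace regression over a parameter domain:
# fit a GP to vectorized local reduced bases and re-orthonormalize the mean
# prediction with QR so the result is again an orthonormal basis
# (a representative of a point on the Grassmann manifold).
# This is NOT the GPS model of the paper, only an illustrative baseline.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

n, r = 50, 3                                   # state dimension, subspace dimension
params = np.linspace(0.0, 1.0, 8)[:, None]     # training parameter points

rng = np.random.default_rng(1)
bases = []
for mu in params.ravel():
    A = rng.standard_normal((n, r)) + mu       # surrogate for local reduced bases
    Q, _ = np.linalg.qr(A)
    bases.append(Q.ravel())                    # vectorize each n x r basis
Y = np.vstack(bases)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(params, Y)
mean = gp.predict(np.array([[0.45]])).reshape(n, r)
V_new, _ = np.linalg.qr(mean)                  # re-orthonormalized predicted basis
print(V_new.shape)
```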
Recent advances in directional statistics
2021
Test (Madrid)
curve estimation, methods for dimension reduction, classification and clustering, and the modelling of time series, spatial and spatio-temporal data. ...
There are, however, numerous contexts of considerable scientific interest in which the natural supports for the data under consideration are Riemannian manifolds like the unit circle, torus, sphere, and ...
Dimension reduction methods
Principal component analysis
General manifolds: Principal component analysis (PCA) for data on a Riemannian manifold M of dimension d, such as S^d or T^d, has received considerable ...
doi:10.1007/s11749-021-00759-x
fatcat:abrego4mybeqpfqnjsb3kysbu4
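Since this entry surveys PCA for data on manifolds such as S^d, here is a minimal sketch of one standard recipe (tangent-space PCA): estimate an intrinsic mean on the sphere, lift the points to the tangent space at that mean with the log map, and run ordinary PCA there. The fixed-point mean iteration and the synthetic data are assumptions for the sketch, not the survey's own algorithm.

```python
# Sketch of tangent-space PCA for points on the unit sphere S^2:
# (1) estimate the Frechet (intrinsic) mean, (2) lift the data to the
# tangent space at the mean via the log map, (3) run ordinary PCA there.
import numpy as np

def sphere_log(mu, x):
    """Log map at mu on the unit sphere (both unit vectors)."""
    c = np.clip(np.dot(mu, x), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    v = x - c * mu
    return theta * v / np.linalg.norm(v)

def sphere_exp(mu, v):
    """Exp map at mu on the unit sphere."""
    t = np.linalg.norm(v)
    if t < 1e-12:
        return mu
    return np.cos(t) * mu + np.sin(t) * v / t

def frechet_mean(X, iters=50):
    mu = X[0] / np.linalg.norm(X[0])
    for _ in range(iters):
        v = np.mean([sphere_log(mu, x) for x in X], axis=0)
        mu = sphere_exp(mu, v)
    return mu

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3)) + np.array([0.0, 0.0, 5.0])
X /= np.linalg.norm(X, axis=1, keepdims=True)    # noisy points near the north pole

mu = frechet_mean(X)
T = np.array([sphere_log(mu, x) for x in X])     # tangent-space coordinates
T -= T.mean(axis=0)
_, s, Vt = np.linalg.svd(T, full_matrices=False)
print("principal tangent directions:\n", Vt[:2]) # leading principal directions
```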
LAVENDER: latent axes discovery from multiple cytometry samples with non-parametric divergence estimation and multidimensional scaling reconstruction
[article]
2019
bioRxiv
pre-print
Computational cytometry methods are now frequently used in flow and mass cytometric data analyses. ...
Here, we devised a computational method termed LAVENDER (latent axes discovery from multiple cytometry samples with nonparametric divergence estimation and multidimensional scaling reconstruction). ...
James Cai, Texas A&M University, for his helpful comments on the manuscript and Dr. Maiko Narahara for her contribution in the initial phase of the study. ...
doi:10.1101/673434
fatcat:5ya3gww5vjay3phxji6cpmwffq
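The pipeline named in this entry is generic enough to sketch: estimate pairwise divergences between samples with a non-parametric (k-NN) estimator, then reconstruct a low-dimensional configuration of the samples with MDS. The particular estimator below (a symmetrized k-NN Kullback-Leibler estimator) and the toy Gaussian samples are illustrative assumptions, not necessarily LAVENDER's exact choices.

```python
# Sketch: k-NN divergence estimates between samples, then MDS reconstruction.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.manifold import MDS

def knn_kl(X, Y, k=5):
    """k-NN estimate of KL(P||Q) from samples X ~ P and Y ~ Q."""
    n, d = X.shape
    m = Y.shape[0]
    rho = cKDTree(X).query(X, k=k + 1)[0][:, -1]   # k-th NN within X (skip self)
    nu = cKDTree(Y).query(X, k=k)[0][:, -1]        # k-th NN of X points in Y
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(3)
samples = [rng.normal(loc=i * 0.3, size=(200, 4)) for i in range(6)]  # 6 toy samples

n_s = len(samples)
D = np.zeros((n_s, n_s))
for i in range(n_s):
    for j in range(i + 1, n_s):
        d_ij = 0.5 * (knn_kl(samples[i], samples[j]) + knn_kl(samples[j], samples[i]))
        D[i, j] = D[j, i] = max(d_ij, 0.0)          # clip small negative estimates

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
print(coords)
```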
Recent advances in directional statistics
[article]
2020
arXiv
pre-print
curve estimation, methods for dimension reduction, classification and clustering, and the modelling of time series, spatial and spatio-temporal data. ...
There are, however, numerous contexts of considerable scientific interest in which the natural supports for the data under consideration are Riemannian manifolds like the unit circle, torus, sphere and ...
their enthusiastic feedback on our original submission and helpful suggestions as to how it might be further improved. ...
arXiv:2005.06889v3
fatcat:wcsr6hn5tvbjlmvxnwcjwaolte
Learning Image Manifolds from Local Features
[chapter]
2011
Manifold Learning Theory and Applications
Chapter 8 discusses the Ricci flow for designing Riemannian metrics by prescribed curvatures on surfaces and 3-dimensional manifolds. ...
Spectral Embedding Methods for Manifold Learning: There are many techniques that can be used for either linear dimensionality reduction or linear manifold learning. ...
doi:10.1201/b11431-16
fatcat:hgcv2z3zprf4vlju2cg5wv7pqe
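The chapter snippet above points to spectral embedding methods for manifold learning; a compact illustration of one such method (a Laplacian-eigenmaps-style embedding built from a Gaussian affinity graph) follows. The bandwidth heuristic and the toy spiral data are assumptions made for the example.

```python
# Minimal Laplacian-eigenmaps-style spectral embedding:
# build a Gaussian affinity matrix, form the symmetric normalized graph
# Laplacian, and use its smallest non-trivial eigenvectors as coordinates.
import numpy as np

rng = np.random.default_rng(4)
t = rng.uniform(0, 3 * np.pi, 300)
X = np.column_stack([t * np.cos(t), t * np.sin(t)]) \
    + 0.05 * rng.standard_normal((300, 2))

sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
sigma = np.median(np.sqrt(sq))                  # heuristic bandwidth
W = np.exp(-sq / (2 * sigma ** 2))
np.fill_diagonal(W, 0.0)

d = W.sum(axis=1)
d_inv_sqrt = 1.0 / np.sqrt(d)
L_sym = np.eye(len(X)) - (d_inv_sqrt[:, None] * W) * d_inv_sqrt[None, :]
w, V = np.linalg.eigh(L_sym)
embedding = (d_inv_sqrt[:, None] * V)[:, 1:3]   # skip the trivial eigenvector
print(embedding.shape)
```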
Discretized Gradient Flow for Manifold Learning in the Space of Embeddings
[article]
2020
arXiv
pre-print
Many implementations of gradient descent rely on a discretized version, i.e., moving in the direction of the gradient for a set step size, recomputing the gradient, and continuing. ...
Manifold learning/dimensionality reduction, which seeks a low dimensional manifold that best represents data in a high dimensional Euclidean space, is an inherently infinite dimensional problem. ...
Based on Riemannian geometry estimates dating to the 1980s, it is reasonable to assume that we want to consider manifolds of a fixed dimension with an a priori lower bound on volume, an upper bound on diameter ...
arXiv:1901.09057v3
fatcat:bar6e3ojdva4zkraghr3eezvwa
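The abstract above spells out the discretized gradient step (move along the gradient for a fixed step size, recompute the gradient, and continue); a minimal fixed-step gradient descent loop in Python is given here for orientation, with a toy quadratic objective chosen purely for illustration.

```python
# Fixed-step (discretized) gradient descent: repeatedly move against the
# gradient by a constant step size and recompute the gradient.
import numpy as np

def objective(x):
    return 0.5 * np.sum(x ** 2)        # toy quadratic; minimum at the origin

def gradient(x):
    return x

x = np.array([3.0, -2.0])
step = 0.1
for _ in range(200):
    x = x - step * gradient(x)         # one discretized gradient-flow step
print(objective(x), x)
```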
3D face tracking and expression inference from a 2D sequence using manifold learning
2008
2008 IEEE Conference on Computer Vision and Pattern Recognition
We propose a person-dependent, manifold-based approach for modeling and tracking rigid and nonrigid 3D facial deformations from a monocular video sequence. ...
We do not represent all nonrigid facial deformations as a single complex manifold, but instead decompose them on a basis of eight 1D manifolds. ...
Suppose we have a set of samples in a high dimensional space V and these samples lie on a manifold M of much lower dimension. Our objective is to infer the geometric structure of this manifold. ...
doi:10.1109/cvpr.2008.4587805
dblp:conf/cvpr/LiaoM08
fatcat:excjjunfy5a3hjnbyteb3kp7dq
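For the generic setup stated in the last snippet (samples in a high-dimensional space V assumed to lie on a manifold M of much lower dimension, whose geometric structure is to be inferred), here is a small usage example with an off-the-shelf manifold learner. Isomap and the Swiss-roll toy data are illustrative choices only, not the decomposition into 1D manifolds used in this paper.

```python
# Usage example: infer a low-dimensional structure from high-dimensional
# samples with an off-the-shelf manifold learner (Isomap here).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)   # samples in R^3
Z = Isomap(n_neighbors=10, n_components=2).fit_transform(X)          # 2-D coordinates
print(Z.shape)
```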
Class Probability Estimation via Differential Geometric Regularization
[article]
2016
arXiv
pre-print
In experiments, we apply our regularization technique to standard loss functions for classification; our RBF-based implementation compares favorably to widely used regularization methods for both binary ...
The regularization term measures the volume of this submanifold, based on the intuition that overfitting produces rapid local oscillations and hence large volume of the estimator. ...
Since our gradient flow method is actually applied on the infinite dimensional manifold M, we have to understand both the topology and the Riemannian geometry of M. ...
arXiv:1503.01436v7
fatcat:fr2l2zo5rvaa7gfjbfo4uuypnm
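As a rough numerical illustration of the idea in this entry (penalizing the volume of the graph of the estimator, which grows when the estimator oscillates rapidly), the sketch below approximates, for a scalar estimator f(x, y) sampled on a grid, the graph-area integrand sqrt(1 + |grad f|^2) with finite differences. The grid, the two example estimators, and the discretization are assumptions, not the paper's construction.

```python
# Rough finite-difference approximation of a graph-volume regularizer for a
# scalar estimator f(x, y) on a grid: integrate sqrt(1 + |grad f|^2), which
# is larger for rapidly oscillating (overfitted) estimators.
import numpy as np

def graph_volume(f_values, h):
    """Approximate area of the graph of f sampled on a grid with spacing h."""
    fy, fx = np.gradient(f_values, h)              # finite-difference gradient
    return np.sum(np.sqrt(1.0 + fx ** 2 + fy ** 2)) * h * h

h = 0.05
xs = np.arange(0.0, 1.0, h)
Xg, Yg = np.meshgrid(xs, xs)

smooth = 0.5 + 0.4 * np.sin(2 * np.pi * Xg)                              # smooth estimator
wiggly = 0.5 + 0.4 * np.sin(20 * np.pi * Xg) * np.cos(20 * np.pi * Yg)   # oscillating one

print(graph_volume(smooth, h), graph_volume(wiggly, h))   # wiggly has larger volume
```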
Model order reduction assisted by deep neural networks (ROM-net)
2020
Advanced Modeling and Simulation in Engineering Sciences
The training examples are represented by points on a Grassmann manifold, on which distances are computed for clustering. ...
In this paper, we propose a general framework for projection-based model order reduction assisted by deep neural networks. ...
In this paper, principal component analysis (PCA) is applied for linear dimensionality reduction with 30 principal components, the dimension 30 being a compromise between large dimensions and low dimensions ...
doi:10.1186/s40323-020-00153-6
fatcat:dvqk757qmzcybcyssaptu7s7tu
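The entry states that training examples are represented as points on a Grassmann manifold, with distances on it driving the clustering. A small sketch of the standard geodesic Grassmann distance (the norm of the vector of principal angles, obtained from an SVD) followed by a generic hierarchical clustering step is given below; the toy subspaces and the average-linkage clustering are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: geodesic distance on the Grassmann manifold via principal angles,
# followed by hierarchical clustering of subspaces with that distance.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

def grassmann_dist(U, V):
    """Geodesic Grassmann distance between subspaces spanned by U and V
    (orthonormal columns): the 2-norm of the vector of principal angles."""
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    angles = np.arccos(np.clip(s, -1.0, 1.0))
    return np.linalg.norm(angles)

rng = np.random.default_rng(5)
subspaces = [np.linalg.qr(rng.standard_normal((40, 5)))[0] for _ in range(12)]

n = len(subspaces)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = grassmann_dist(subspaces[i], subspaces[j])

labels = fcluster(linkage(squareform(D), method="average"), t=3, criterion="maxclust")
print(labels)
```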
Constrained manifold learning for the characterization of pathological deviations from normality
2012
Medical Image Analysis
Dimensionality reduction and k-NN: An overview of methods for estimating the intrinsic dimensionality of a dataset was given in Camastra (2003), but there is no standard manner of performing this step ...
Principal geodesic analysis (Fletcher et al., 2004 ) is a generalization of PCA for data lying on a manifold, but is described in cases where the manifold structure is already known and independent of ...
Î and Î_0 are the PCA-based reconstructions of I and I_0 using the first M principal directions, and x*_m and x*_{0,m} correspond to the m-th component of x* and x*_0, respectively. Proof: We denote e* ...
doi:10.1016/j.media.2012.07.003
pmid:22906821
fatcat:j7lyaz36c5by7mu64ef4ttiwl4
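The last snippet refers to PCA-based reconstructions using the first M principal directions; a minimal, generic version of that reconstruction (center, project onto the leading M directions, map back) is sketched here. The toy data and the value of M are placeholder assumptions, not the quantities used in the paper.

```python
# Generic PCA reconstruction with the first M principal directions:
# center the data, project onto the leading M right-singular vectors,
# and map back to the original space.
import numpy as np

rng = np.random.default_rng(6)
X = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 10))  # toy data

M = 3
mean = X.mean(axis=0)
Xc = X - mean
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:M]                                   # first M principal directions
X_rec = (Xc @ W.T) @ W + mean                # PCA-based reconstruction
print(np.linalg.norm(X - X_rec) / np.linalg.norm(X))   # relative reconstruction error
```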
Single and Multiple Object Tracking Using Log-Euclidean Riemannian Subspace and Block-Division Appearance Model
2012
IEEE Transactions on Pattern Analysis and Machine Intelligence
Based on the log-Euclidean Riemannian metric on symmetric positive definite matrices, we propose an incremental log-Euclidean Riemannian subspace learning algorithm in which covariance matrices of image features are mapped into a vector space with the log-Euclidean Riemannian metric. ...
For each block, a low dimensional log-Euclidean Riemannian subspace model is learned online. ...
doi:10.1109/tpami.2012.42
pmid:22331855
fatcat:w2vwys5wnjgntjltuah3sq2hoy
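The abstract describes mapping covariance matrices of image features into a vector space via the log-Euclidean Riemannian metric; the sketch below shows the standard log-Euclidean embedding of an SPD matrix (matrix logarithm via eigendecomposition, then vectorization of the upper triangle with off-diagonal scaling). The feature covariance here is a synthetic stand-in rather than one computed from image features.

```python
# Log-Euclidean embedding of symmetric positive definite (SPD) matrices:
# take the matrix logarithm via an eigendecomposition, then vectorize the
# upper triangle (off-diagonal entries scaled by sqrt(2) to preserve norms).
import numpy as np

def spd_log(C):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def log_euclidean_vector(C):
    L = spd_log(C)
    d = L.shape[0]
    iu = np.triu_indices(d, k=1)
    return np.concatenate([np.diag(L), np.sqrt(2.0) * L[iu]])

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 5))
C = A @ A.T + 5 * np.eye(5)                  # synthetic feature covariance (SPD)
v = log_euclidean_vector(C)                  # point in the log-Euclidean vector space
print(v.shape)                               # d + d*(d-1)/2 = 15 for d = 5
```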
Riemannian Submanifolds: A Survey
[article]
2013
arXiv
pre-print
In this article, the author provides a broad review of Riemannian submanifolds in differential geometry. ...
Submanifold theory is a vast and very active research field which plays an important role in the development of modern differential geometry. ...
A reduction theorem. Let M be an n-dimensional submanifold of a Riemannian m-manifold N and E a subbundle of the normal bundle T^⊥M. ...
arXiv:1307.1875v1
fatcat:hl35xuvgajdklfmg7pohlwpwem
Modern Dimension Reduction
[article]
2021
arXiv
pre-print
The result is a well-stocked toolbox of unsupervised algorithms for tackling the complexities of high dimensional data so common in modern society. All code is publicly accessible on Github. ...
This Element offers readers a suite of modern unsupervised dimension reduction techniques along with hundreds of lines of R code, to efficiently represent the original high dimensional data space in a ...
At a basic level, Riemannian manifold learning is premised on the idea that a latent, simpler manifold characterizes the input space, albeit one embedded in some high dimensional ambient space. ...
arXiv:2103.06885v1
fatcat:npdejgjbxndg5d73zhaahkm4ra
The Diffusion Geometry of Fibre Bundles: Horizontal Diffusion Maps
[article]
2019
arXiv
pre-print
In a broader context, HDM reveals the sub-Riemannian structure of high-dimensional datasets, and provides a nonparametric learning framework for datasets with structural correspondences. ...
Kernel-based non-linear dimensionality reduction methods, such as Local Linear Embedding (LLE) and Laplacian Eigenmaps, rely heavily upon pairwise distances or similarity scores, with which one can construct ...
Note also that HBDM embeds the base data set B into a Euclidean space of dimension κ^2, which is of much higher dimensionality than the size of the original data set. ...
arXiv:1602.02330v4
fatcat:gxevtslj35etvikmp2vzmxjhyu
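The snippet above notes that kernel-based methods such as LLE and Laplacian eigenmaps build everything from pairwise distances or similarity scores; for orientation, a compact sketch of vanilla diffusion maps (Gaussian kernel, Markov normalization, leading non-trivial eigenvectors scaled by their eigenvalues) is included below. It illustrates ordinary diffusion maps, not the horizontal diffusion maps (HDM) of the paper, and the bandwidth and toy data are assumptions.

```python
# Vanilla diffusion maps (not the paper's horizontal diffusion maps):
# Gaussian kernel on pairwise distances -> Markov normalization ->
# leading non-trivial eigenvectors, scaled by their eigenvalues.
import numpy as np

rng = np.random.default_rng(8)
theta = rng.uniform(0, 2 * np.pi, 400)
X = np.column_stack([np.cos(theta), np.sin(theta)]) \
    + 0.05 * rng.standard_normal((400, 2))

sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
eps = np.median(sq)
K = np.exp(-sq / eps)                       # Gaussian kernel from pairwise distances
P = K / K.sum(axis=1, keepdims=True)        # row-stochastic Markov matrix

w, V = np.linalg.eig(P)
order = np.argsort(-w.real)
t = 2                                       # diffusion time
coords = (w.real[order[1:3]] ** t) * V.real[:, order[1:3]]   # skip trivial eigenvector
print(coords.shape)
```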
Showing results 1 — 15 out of 89 results