
Closed-form supervised dimensionality reduction with generalized linear models

Irina Rish, Genady Grabarnik, Guillermo Cecchi, Francisco Pereira, Geoffrey J. Gordon
2008 Proceedings of the 25th international conference on Machine learning - ICML '08  
We propose a family of supervised dimensionality reduction (SDR) algorithms that combine feature extraction (dimensionality reduction) with learning a predictive model in a unified optimization framework ... using data- and class-appropriate generalized linear models (GLMs), and handling both classification and regression problems. ... linear models, thus handling both classification and regression, with both discrete and real-valued data. ...
doi:10.1145/1390156.1390261 dblp:conf/icml/RishGCPG08 fatcat:fsgic4au45e4lmxpuy3tckwxbq
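
The unified objective described above can be pictured with a minimal sketch: a shared low-dimensional representation is fit so that it both reconstructs the data and predicts the labels, with every sub-problem solvable in closed form. The squared-loss (Gaussian GLM) specialization, the function name, and the alternating scheme below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def sdr_gaussian_sketch(X, y, k, n_iters=50, alpha=1.0):
    """Joint dimensionality reduction + prediction, squared-loss case.

    Minimizes ||X - U V||^2 + alpha * ||y - U w||^2 over U, V, w.
    Each sub-problem is linear least squares, hence closed form.
    """
    n, d = X.shape
    U = np.random.default_rng(0).standard_normal((n, k))
    for _ in range(n_iters):
        V = np.linalg.lstsq(U, X, rcond=None)[0]   # reconstruction map (k, d)
        w = np.linalg.lstsq(U, y, rcond=None)[0]   # predictive weights (k,)
        A = V @ V.T + alpha * np.outer(w, w)       # normal equations for U
        B = X @ V.T + alpha * np.outer(y, w)
        U = np.linalg.solve(A, B.T).T              # closed-form U update
    return U, V, w
```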

A General Model for Semi-Supervised Dimensionality Reduction

Xuesong Yin, Ting Shu, Qi Huang
2012 Procedia Engineering  
This paper focuses on semi-supervised dimensionality reduction. In this scenario, we present a general model for semi-supervised dimensionality reduction with pairwise constraints (SSPC). ... Experimental results on a collection of benchmark data sets show that SSPC is superior to many established dimensionality reduction methods. ... In this paper, we focus on side information in the form of pairwise constraints. ...
doi:10.1016/j.proeng.2012.01.529 fatcat:f32o37juxzbljhoa2t7joeayi4
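
One common way to turn must-link/cannot-link side information into a linear projection, offered here only as an illustration of the genre (not necessarily SSPC's exact objective), is to build a scatter-like matrix that spreads cannot-link pairs and pulls must-link pairs together, then take top eigenvectors:

```python
import numpy as np

def constrained_projection(X, must_link, cannot_link, k, alpha=1.0, beta=1.0):
    """Linear projection guided by pairwise constraints (illustrative).

    X: (n, d) data; must_link / cannot_link: lists of index pairs (i, j).
    """
    S = np.cov(X, rowvar=False)                          # variance term
    for i, j in cannot_link:                             # push apart
        diff = X[i] - X[j]
        S += alpha / max(len(cannot_link), 1) * np.outer(diff, diff)
    for i, j in must_link:                               # pull together
        diff = X[i] - X[j]
        S -= beta / max(len(must_link), 1) * np.outer(diff, diff)
    _, vecs = np.linalg.eigh(S)                          # ascending eigenvalues
    W = vecs[:, -k:]                                     # top-k directions
    return X @ W, W
```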

Supervised Exponential Family Principal Component Analysis via Convex Optimization

Yuhong Guo
2008 Neural Information Processing Systems  
... for nonlinear supervised dimensionality reduction. ... Recently, supervised dimensionality reduction has been gaining attention, owing to the realization that data labels are often available and indicate important underlying structure in the data. ... A more general supervised dimensionality reduction approach with generalized linear models (SDR-GLM) was proposed in [12]. ...
dblp:conf/nips/Guo08 fatcat:lxh5zoaypbgv3oopzt7nel3avm

Supervised principal component analysis: Visualization, classification and regression on subspaces and submanifolds

Elnaz Barshan, Ali Ghodsi, Zohreh Azimifar, Mansoor Zolghadri Jahromi
2011 Pattern Recognition  
We propose "Supervised Principal Component Analysis (Supervised PCA)", a generalization of PCA that is uniquely effective for regression and classification problems with high-dimensional input data.  ...  Furthermore, we show how the algorithm can be kernelized, which makes it applicable to non-linear dimensionality reduction tasks.  ...  Related Works Most of the research in supervised dimensionality reduction is focused on learning a linear subspace.  ... 
doi:10.1016/j.patcog.2010.12.015 fatcat:baocam3kwjh3blfd7gvziyondq
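
Supervised PCA of the kind this entry describes chooses projection directions that maximize dependence between the projected data and the targets (an HSIC-style criterion). The sketch below uses a linear kernel over the targets and may differ from the published algorithm in normalization and other details:

```python
import numpy as np

def supervised_pca(X, Y, k):
    """HSIC-flavored supervised PCA sketch. X: (n, d); Y: (n,) or (n, q)."""
    n = X.shape[0]
    Y = np.asarray(Y, dtype=float).reshape(n, -1)
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    L = Y @ Y.T                               # linear kernel over targets
    Q = X.T @ H @ L @ H @ X                   # (d, d) target-aware scatter
    _, vecs = np.linalg.eigh(Q)               # ascending eigenvalues
    W = vecs[:, -k:]                          # top-k eigenvectors
    return X @ W, W
```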

Transductive De-Noising and Dimensionality Reduction using Total Bregman Regression [chapter]

Sreangsu Acharyya
2006 Proceedings of the 2006 SIAM International Conference on Data Mining  
Our goal, on the one hand, is to use labels or other forms of ground-truth data to guide the tasks of de-noising and dimensionality reduction and to balance the objectives of better prediction and better data summarization; on the other hand, it is to explicitly model the noise in the feature values. ... Canonical Generalized Linear Model: In canonical generalized linear models (GLMs) [9], $\theta_i$ is assumed to be a linear function of $X_i$ of the form $\theta_i = \beta^{\dagger} X_i$, following which the maximum likelihood reduces ...
doi:10.1137/1.9781611972764.51 dblp:conf/sdm/Acharyya06 fatcat:hwjkpqxmozdsvjtultizmnspfe
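
For reference, the canonical-GLM fact the snippet appeals to can be written out. The response symbol $y_i$ and the exponential-family parametrization below are standard textbook conventions, not necessarily the paper's notation:

```latex
% Exponential family with natural parameter \theta_i, cumulant \psi,
% and canonical linear link:
\[
  p(y_i \mid \theta_i)
    = \exp\bigl(y_i\,\theta_i - \psi(\theta_i)\bigr)\, p_0(y_i),
  \qquad
  \theta_i = \beta^{\dagger} X_i .
\]
% Maximizing the log-likelihood in \beta then reduces to minimizing the
% Bregman divergence generated by \psi:
\[
  \hat{\beta} = \arg\min_{\beta} \sum_i
    \Bigl( \psi\bigl(\beta^{\dagger} X_i\bigr)
           - y_i\,\beta^{\dagger} X_i \Bigr).
\]
```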

A New Discriminant Principal Component Analysis Method with Partial Supervision

Dan Sun, Daoqiang Zhang
2009 Neural Processing Letters  
On the other hand, traditional supervised dimensionality reduction methods such as linear discriminant analysis operate on labeled data only. ... The derived DPCA algorithm is efficient and has a closed-form solution. ... However, their method is based on probabilistic PCA, which is a generative model. Also, their algorithm needs iteration and has no closed-form solution. Lu et al. ...
doi:10.1007/s11063-009-9112-6 fatcat:brkoeyq6lff6baffhl4maa3wvy

A Shallow High-Order Parametric Approach to Data Visualization and Compression [article]

Martin Renqiang Min, Hongyu Guo, Dongjin Song
2016 arXiv pre-print
... of supervised shallow models with high-order feature interactions. ... Compared to deep embedding models with complicated deep architectures, HOPE generates more effective high-order feature mapping through an embarrassingly simple shallow model. ... Compared to supervised deep embedding methods with complicated deep architectures, the above linear projection method has limited modeling power. ...
arXiv:1608.04689v1 fatcat:6lch322fajhsfbnpgunjife4fq

Multivariate strategies in functional magnetic resonance imaging

L HANSEN
2007 Brain and Language  
In a case study we analyze linear and non-linear dimensional reduction tools in the context of a 'mind reading' predictive multivariate fMRI model. ... We discuss aspects of multivariate fMRI modeling, including the statistical evaluation of multivariate models and means for dimensional reduction. ... To adapt models we need supervised data sets with both inputs x and outputs g. ...
doi:10.1016/j.bandl.2006.12.004 pmid:17223190 fatcat:jbprl2o2p5azdoem55k6dk3ijm

Nonnegative Matrix Factorization for Semi-supervised Dimensionality Reduction [article]

Youngmin Cho, Lawrence K. Saul
2011 arXiv pre-print
We evaluate these updates for dimensionality reduction when they are used as a precursor to linear classification. ... We show how to incorporate information from labeled examples into nonnegative matrix factorization (NMF), a popular unsupervised learning algorithm for dimensionality reduction. ... By setting its partial derivatives to zero, the bound can be minimized in closed form with respect to either of the nonnegative matrices V or H. ...
arXiv:1112.3714v1 fatcat:aqte53dh5bgebafthhdkrkgaqq
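
The closed-form bound minimization mentioned in the snippet is exactly the majorize-minimize derivation behind the standard multiplicative updates for NMF. A minimal unsupervised version is below; the paper's semi-supervised variant additionally couples the factors to label information, which is not shown here.

```python
import numpy as np

def nmf(X, k, n_iters=200, eps=1e-9):
    """Multiplicative updates (Lee & Seung) minimizing ||X - V H||_F^2.

    Each update minimizes a majorizing bound in closed form with respect
    to one of the nonnegative factors V or H.
    """
    n, d = X.shape
    rng = np.random.default_rng(0)
    V = rng.random((n, k))
    H = rng.random((k, d))
    for _ in range(n_iters):
        H *= (V.T @ X) / (V.T @ V @ H + eps)   # closed-form bound min in H
        V *= (X @ H.T) / (V @ H @ H.T + eps)   # closed-form bound min in V
    return V, H
```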

Bayesian Inference on Matrix Manifolds for Linear Dimensionality Reduction [article]

Andrew Holbrook, Alexander Vandenberg-Rodes, Babak Shahbaba
2016 arXiv pre-print
This natural paradigm extends the Bayesian framework to dimensionality reduction tasks in higher dimensions with simpler models at greater speeds. ... We reframe linear dimensionality reduction as a problem of Bayesian inference on matrix manifolds. ... These new implementations will not necessarily resemble past iterations of probabilistic linear dimensionality reduction. ...
arXiv:1606.04478v1 fatcat:5tq3pjvarzcvdpkctgmboiwtqe

Supervised Linear Dimension-Reduction Methods: Review, Extensions, and Comparisons [article]

Shaojie Xu, Joel Vaughan, Jie Chen, Agus Sudjianto, Vijayan Nair
2021 arXiv pre-print
To address this concern, several supervised linear dimension-reduction techniques have been proposed in the literature. ... Principal component analysis (PCA) is a well-known linear dimension-reduction method that has been widely used in data analysis and modeling. ... known closed-form solution. ...
arXiv:2109.04244v1 fatcat:uu57c5bnobgwtpleiqd6zp7fsq
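
The known closed-form PCA solution this review contrasts against is just an eigendecomposition of the sample covariance; a minimal sketch:

```python
import numpy as np

def pca(X, k):
    """Closed-form PCA: top-k eigenvectors of the covariance of centered data."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / (X.shape[0] - 1)   # sample covariance (d, d)
    _, vecs = np.linalg.eigh(C)        # ascending eigenvalues
    W = vecs[:, -k:][:, ::-1]          # top-k components, largest first
    return Xc @ W, W
```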

On the Principles of Parsimony and Self-Consistency for the Emergence of Intelligence [article]

Yi Ma, Doris Tsao, Heung-Yeung Shum
2022 arXiv pre-print
Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of Intelligence in general ... More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, that unifies and explains the evolution of modern deep networks ... However, it is computable (in closed form) only for a mixture of subspaces or Gaussians, not for general distributions! ...
arXiv:2207.04630v3 fatcat:yaazh2ok2fdklpv5l3kakdon2u
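
The closed-form quantity alluded to in the last fragment is, in the rate-reduction line of work behind this paper, a log-det coding rate. Exact constants and conventions vary across papers, so the following is a hedged transcription rather than a verbatim quote:

```latex
% Coding rate of a data matrix Z = [z_1, ..., z_n] in R^{d x n},
% closed form for Gaussian/subspace-like ensembles:
\[
  R(Z; \epsilon) \;=\; \frac{1}{2}
    \log\det\!\Bigl( I + \frac{d}{n\,\epsilon^{2}}\, Z Z^{\top} \Bigr).
\]
% For a mixture with membership matrices \Pi_j, the rate becomes a
% weighted sum of per-component log-det terms:
\[
  R_c(Z; \epsilon \mid \Pi) \;=\; \sum_{j}
    \frac{\operatorname{tr}(\Pi_j)}{2n}
    \log\det\!\Bigl( I + \frac{d}{\operatorname{tr}(\Pi_j)\,\epsilon^{2}}
      \, Z \Pi_j Z^{\top} \Bigr).
\]
```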

ivis Dimensionality Reduction Framework for Biomacromolecular Simulations [article]

Hao Tian, Peng Tao
2020 arXiv pre-print
Linear dimensionality reduction methods, such as principal component analysis (PCA) and time-structure based independent component analysis (t-ICA), could not preserve sufficient structural information ... Compared with other methods, ivis is shown to be superior in constructing Markov state models (MSMs), preserving information of both local and global distances, and maintaining similarity between high dimension ...
arXiv:2004.10718v2 fatcat:qhid67dx4rgnxisff7dlftctqe

Graph Diffusion-Embedding Networks [article]

Bo Jiang, Doudou Lin, Jin Tang
2018 arXiv pre-print
GDEN is motivated by our closed-form formulation of regularized feature diffusion on graphs. ... Experiments on semi-supervised learning tasks on several benchmark datasets demonstrate the better performance of the proposed GDEN compared with traditional GCN models. ... have been widely used in dimensionality reduction and label prediction. ...
arXiv:1810.00797v1 fatcat:66iqdsh5enhs3ogrtdda5ldfca
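
A guess at the kind of closed-form regularized diffusion the abstract refers to (not necessarily GDEN's exact operator): penalizing deviation from the input features plus a Laplacian smoothness term yields a single linear solve.

```python
import numpy as np

def diffusion_embedding(X, A, lam=1.0):
    """min_Z ||Z - X||_F^2 + lam * tr(Z^T L Z)  =>  Z = (I + lam*L)^{-1} X.

    A: (n, n) symmetric adjacency; L: unnormalized graph Laplacian.
    """
    L = np.diag(A.sum(axis=1)) - A                     # graph Laplacian
    return np.linalg.solve(np.eye(A.shape[0]) + lam * L, X)
```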

Learning Through Non-linearly Supervised Dimensionality Reduction [chapter]

Josif Grabocka, Lars Schmidt-Thieme
2015 Lecture Notes in Computer Science  
If the boundary separating the class regions in the original data space is non-linear, then a non-linear dimensionality reduction helps improve generalization over the test instances. ... In the light of such inspiration, we propose a novel dimensionality reduction which simultaneously reconstructs the predictors using matrix factorization and estimates the target variable via a dual-form ... space via jointly optimizing a dual form together with dimensionality reduction. ...
doi:10.1007/978-3-662-46335-2_4 fatcat:tnjtbjja3zbqfofbfoh37vkpaa
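
A compressed sketch of the joint idea in this entry: factorize the predictors while fitting a classifier on the latent factors, optimizing both at once. The paper couples matrix factorization with a dual-form (kernel) predictor; a primal logistic loss stands in below to keep the sketch short, so treat it as an analogy rather than the published method.

```python
import numpy as np

def joint_mf_classifier(X, y, k, n_iters=500, lr=0.01, alpha=1.0, seed=0):
    """Jointly minimize 0.5*||X - U V||^2 + alpha * logistic(y, U w).

    y: labels in {-1, +1}. Plain gradient descent on all three factors.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n, k))
    V = 0.1 * rng.standard_normal((k, d))
    w = np.zeros(k)
    for _ in range(n_iters):
        R = U @ V - X                            # reconstruction residual
        s = -y / (1.0 + np.exp(y * (U @ w)))     # d(logistic loss)/d(score)
        gU = R @ V.T + alpha * np.outer(s, w)    # gradients from shared R, s
        gV = U.T @ R
        gw = alpha * (U.T @ s)
        U -= lr * gU
        V -= lr * gV
        w -= lr * gw
    return U, V, w
```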