146 Hits in 9.6 sec

Canonical Correlation Analysis based on Hilbert-Schmidt Independence Criterion and Centered Kernel Target Alignment

Billy Chang, Uwe Krüger, Rafal Kustra, Junping Zhang
2013 International Conference on Machine Learning  
The aim of this article is to introduce two nonlinear CCA extensions that rely on the recently proposed Hilbert-Schmidt independence criterion and the centered kernel target alignment.  ...  Canonical correlation analysis (CCA) is a well-established technique for identifying linear relationships among two variable sets.  ...  (i) the Hilbert-Schmidt independence criterion (HSIC) (Gretton et al., 2005), and (ii) the centered kernel target alignment (KTA) (Cortes et al., 2012).  ... 
dblp:conf/icml/ChangKKZ13 fatcat:wqbva3dfc5hbvmridnk7jdlpg4
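
Both criteria named in this entry reduce to short matrix computations. Below is a minimal NumPy sketch of the biased empirical HSIC estimator from Gretton et al. (2005); the RBF bandwidth and the toy data are illustrative assumptions of ours, not anything taken from the paper.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian RBF Gram matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_biased(K, L):
    """Biased empirical HSIC: trace(K H L H) / (n - 1)^2, with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = X[:, :1] ** 2 + 0.1 * rng.normal(size=(200, 1))  # nonlinearly dependent on X
Z = rng.normal(size=(200, 1))                        # independent of X
print(hsic_biased(rbf_kernel(X), rbf_kernel(Y)))     # noticeably positive
print(hsic_biased(rbf_kernel(X), rbf_kernel(Z)))     # close to zero
```

Values near zero indicate approximate independence; the CCA extensions described in this entry build on dependence statistics of exactly this kind.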

Independence test and canonical correlation analysis based on the alignment between kernel matrices for multivariate functional data

Tomasz Górecki, Mirosław Krzyśko, Waldemar Wołyński
2018 Artificial Intelligence Review  
Gretton et al. (Springer, Berlin, pp 63-77, 2005) defined the Hilbert-Schmidt independence criterion, and Cortes et al.  ...  (J Mach Learn Res 13:795-828, 2012) subsequently introduced the concept of the centered kernel target alignment (KTA).  ...  Acknowledgements: The authors are grateful to the editor and two anonymous reviewers for their many insightful and constructive comments and suggestions, which led to the improvement of an earlier version of the manuscript.  ... 
doi:10.1007/s10462-018-9666-7 fatcat:bg6uwugzvnfl3k4zzo4u6xoe34
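
The alignment between kernel matrices that this test builds on is a normalized Frobenius inner product of centred Gram matrices, following Cortes et al. (2012). A hedged sketch (function names are ours):

```python
import numpy as np

def center_gram(K):
    """Double-center a Gram matrix: Kc = H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def centered_alignment(K, L):
    """Centered kernel alignment <Kc, Lc>_F / (||Kc||_F ||Lc||_F); lies in [0, 1] for PSD kernels."""
    Kc, Lc = center_gram(K), center_gram(L)
    return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))
```

A common way to turn such a statistic into an independence test is to recompute it under random permutations of one sample's ordering; we note this as standard practice rather than as this paper's exact procedure.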

Simultaneous Twin Kernel Learning Using Polynomial Transformations for Structured Prediction

Chetan Tonde, Ahmed Elgammal
2014 IEEE Conference on Computer Vision and Pattern Recognition  
Kernel methods such as Structured Support Vector Machines, Twin Gaussian Processes (TGP), Structured Gaussian Processes, and vector-valued Reproducing Kernel Hilbert Spaces (RKHS) offer powerful ways  ...  In this work, we propose a novel and efficient algorithm for learning kernel functions simultaneously on both input and output domains.  ...  Other techniques are similar to KCCA and optimize different criteria such as the Hilbert-Schmidt Independence Criterion (HSIC) [8, 15] or Kernel Target Alignment (KTA) [8, 10], but have similar drawbacks  ... 
doi:10.1109/cvpr.2014.547 dblp:conf/cvpr/TondeE14 fatcat:l6wdshjwxfe3pmwhoal5xnagve

Learning Kernels for Structured Prediction using Polynomial Kernel Transformations [article]

Chetan Tonde, Ahmed Elgammal
2016 arXiv   pre-print
We learn kernels over input and output for structured data such that the dependency between kernel features is maximized. We use the Hilbert-Schmidt Independence Criterion (HSIC) to measure this.  ...  We also give an efficient, matrix-decomposition-based algorithm to learn these kernel transformations, and demonstrate state-of-the-art results on several real-world datasets.  ...  Hilbert-Schmidt Independence Criterion: To measure cross-correlation or dependence between structured input and output data in kernel feature space, Gretton et al. [2005] proposed the Hilbert-Schmidt Independence  ... 
arXiv:1601.01411v1 fatcat:g3gjwhn6gnafll4kayjep3iqfm

Supervised principal component analysis: Visualization, classification and regression on subspaces and submanifolds

Elnaz Barshan, Ali Ghodsi, Zohreh Azimifar, Mansoor Zolghadri Jahromi
2011 Pattern Recognition  
Experimental results on various visualization, classification and regression problems show significant improvement over other supervised approaches both in accuracy and computational efficiency.  ...  We propose "Supervised Principal Component Analysis (Supervised PCA)", a generalization of PCA that is uniquely effective for regression and classification problems with high-dimensional input data.  ...  Hilbert-Schmidt Independence Criterion: Gretton et al. [24] proposed an independence criterion in RKHSs.  ... 
doi:10.1016/j.patcog.2010.12.015 fatcat:baocam3kwjh3blfd7gvziyondq
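
Supervised PCA as proposed by Barshan et al. reduces to an eigendecomposition: it picks the projection directions that maximize the empirical HSIC between the projected inputs and the targets. A minimal sketch under common conventions (samples in rows of X, a linear kernel on the targets); treat it as an illustration, not the authors' code:

```python
import numpy as np

def supervised_pca(X, Y, n_components=2):
    """Project X onto the top eigenvectors of X^T H L H X, where L = Y Y^T is a
    linear kernel over the targets; these directions maximize the empirical
    HSIC between the projected inputs and the targets."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    L = Y @ Y.T                              # target Gram matrix (linear kernel)
    Q = X.T @ H @ L @ H @ X                  # p x p, symmetric
    _, eigvecs = np.linalg.eigh(Q)           # eigenvalues in ascending order
    U = eigvecs[:, ::-1][:, :n_components]   # top-k directions
    return X @ U                             # reduced representation
```

Replacing L with the identity matrix turns Q into the scatter matrix of the centered data, recovering ordinary PCA; this is the sense in which the method generalizes PCA.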

Cross-Domain Matching with Squared-Loss Mutual Information

Makoto Yamada, Leonid Sigal, Michalis Raptis, Machiko Toyoda, Yi Chang, Masashi Sugiyama
2015 IEEE Transactions on Pattern Analysis and Machine Intelligence  
To overcome the limitation of KS-MI, Quadrianto et al. [2] proposed using the kernel-based dependence measure called the Hilbert-Schmidt independence criterion (HSIC) [9] for KS.  ...  To overcome the weaknesses of DTW, canonical time warping (CTW) was introduced in [3]. CTW performs sequence alignment in a common latent space found by canonical correlation analysis (CCA) [17].  ...  We thank Dr. Fernando Villavicencio and Dr. Akisato Kimura for their valuable comments. We also thank Dr. Feng Zhou and Dr. Fernando de la Torre for data and valuable discussions.  ... 
doi:10.1109/tpami.2014.2388235 pmid:26353125 fatcat:hwmu6tskcnaajb7a4p3frf7iba

Analyzing and Controlling Inter-Head Diversity in Multi-Head Attention

Hyeongu Yun, Taegwan Kang, Kyomin Jung
2021 Applied Sciences  
Singular Vector Canonical Correlation Analysis (SVCCA) and Centered Kernel Alignment (CKA).  ...  To examine our hypothesis, we deeply inspect three techniques to control inter-head diversity: (1) a Hilbert-Schmidt Independence Criterion regularizer among representation subspaces, (2) orthogonality  ...  This work was also supported by the BK21 FOUR program of the Education and Research Program for Future ICT Pioneers, Seoul National University in 2021.  ... 
doi:10.3390/app11041548 fatcat:cdwm3pslbrdvngw6nucx62yp4a

Distance-based and RKHS-based Dependence Metrics in High Dimension [article]

Changbo Zhu, Shun Yao, Xianyang Zhang, Xiaofeng Shao
2019 arXiv   pre-print
We further extend the distance correlation-based t-test to tests based on the Hilbert-Schmidt covariance and the marginal distance/Hilbert-Schmidt covariance.  ...  In this paper, we study distance covariance, Hilbert-Schmidt covariance (aka the Hilbert-Schmidt independence criterion [Gretton et al. (2008)]) and related independence tests under the high-dimensional scenario  ... 
arXiv:1902.03291v1 fatcat:kd2t75begngevatxnsu2ylny2q
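
Of the two dependence metrics studied here, distance covariance has an especially compact sample form: the mean elementwise product of double-centered pairwise distance matrices. A NumPy sketch of the biased (V-statistic) version; the high-dimensional t-tests in the paper rely on bias-corrected variants that this sketch does not reproduce:

```python
import numpy as np

def euclidean_distances(X):
    """Pairwise Euclidean distance matrix."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.sqrt(d2)

def distance_cov_sq(X, Y):
    """Biased sample distance covariance (squared): mean(A * B), where A and B
    are the double-centered pairwise distance matrices of X and Y."""
    def double_center(D):
        return D - D.mean(axis=0) - D.mean(axis=1, keepdims=True) + D.mean()
    A = double_center(euclidean_distances(X))
    B = double_center(euclidean_distances(Y))
    return (A * B).mean()
```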

Matching samples of multiple views

Abhishek Tripathi, Arto Klami, Matej Orešič, Samuel Kaski
2010 Data Mining and Knowledge Discovery  
The method finds a matching that maximizes statistical dependency between the views, which is particularly suitable for multi-view methods such as canonical correlation  ...  As a practical example, joint analysis of mRNA and protein concentrations requires mapping between genes and proteins.  ...  Acknowledgements: AK and SK belong to the Finnish Center of Excellence in Adaptive Informatics Research of the Academy of Finland.  ... 
doi:10.1007/s10618-010-0205-7 fatcat:bgsijrqslrgephcfccsyjjhhtq

Comparative analysis of molecular representations in prediction of cancer drug combination synergy and sensitivity [article]

Bulat Zagidullin, Ziyan Wang, Yuanfang Guan, Esa Pitkänen, Jing Tang
2021 bioRxiv   pre-print
We evaluate the clustering performance of molecular fingerprints and quantify their similarity by adapting the Centred Kernel Alignment metric.  ...  To this end, we compare rule-based and data-driven molecular representations in the prediction of drug combination sensitivity and drug synergy scores using standardized results of 14 high-throughput screening  ...  The Hilbert-Schmidt Independence Criterion (HSIC) is a test statistic that equals 0 when the two variables are independent [103].  ... 
doi:10.1101/2021.04.16.439299 fatcat:p5jq6uzuq5gofgjrd45fso3vtm

Discriminative Supervised Subspace Learning for Cross-modal Retrieval [article]

Haoming Zhang, Xiao-Jun Wu, Tianyang Xu, Donglin Zhang
2022 arXiv   pre-print
Subsequently, the Hilbert-Schmidt Independence Criterion (HSIC) is introduced to preserve the consistency between the feature similarity and the semantic similarity of samples.  ...  Among the mainstream approaches, those based on subspace learning focus on learning a common subspace in which the similarity among multi-modal data can be measured directly.  ...  In this paper, we use the Hilbert-Schmidt independence criterion (HSIC), briefly introduced in the preceding subsection, to maximize the kernel correlation among multi-modal data.  ... 
arXiv:2201.11843v1 fatcat:6b7c4xgewzab3b32wreh27cddm

Robust Kernel (Cross-) Covariance Operators in Reproducing Kernel Hilbert Space toward Kernel Methods [article]

Md. Ashad Alam, Kenji Fukumizu, Yu-Ping Wang
2016 arXiv   pre-print
Second, we propose the influence function of classical kernel canonical correlation analysis (classical kernel CCA).  ...  Finally, we propose a method based on robust kernel CO and robust kernel CCO, called robust kernel CCA, which is designed for contaminated data and is less sensitive to noise than classical kernel CCA.  ...  In recent years, two canonical correlation analysis (CCA) methods, based on the Hilbert-Schmidt independence criterion (hsicCCA) and on centered kernel target alignment (ktaCCA), have been proposed by Chang et  ... 
arXiv:1602.05563v1 fatcat:d66fe4357jf6rjifm2ekkkmvlm

Measuring multivariate association and beyond

Julie Josse, Susan Holmes
2016 Statistics Surveys  
Coefficients such as the RV coefficient, the distance covariance (dCov) coefficient and kernel-based coefficients are being used by different research communities.  ...  We illustrate these different strategies on several examples of real data and suggest directions for future research.  ...  We thank Persi Diaconis and Jerry Friedman for comments on the manuscript.  ... 
doi:10.1214/16-ss116 pmid:29081877 pmcid:PMC5658146 fatcat:vntza6lkgbch7bz6zttxwbvvxy
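
Among the coefficients this survey compares, the RV coefficient is the quickest to write down: a normalized trace of cross-covariance products. A short sketch on column-centered data (variable names are ours):

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV(X, Y) = tr(Sxy Syx) / sqrt(tr(Sxx^2) tr(Syy^2)) on column-centered data."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxy = Xc.T @ Yc
    Sxx = Xc.T @ Xc
    Syy = Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))
```

On centered data this quantity coincides with centered kernel alignment computed with linear kernels, one of the connections between the RV, dCov and kernel-based families that motivates a unified treatment.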

Gaussian RBF Centered Kernel Alignment (CKA) in the Large Bandwidth Limit [article]

Sergio A. Alvarez
2021 arXiv   pre-print
We prove that Centered Kernel Alignment (CKA) based on a Gaussian RBF kernel converges to linear CKA in the large-bandwidth limit.  ...  We show that the onset of convergence is sensitive to the geometry of the feature representations, and that representation eccentricity bounds the range of bandwidths for which Gaussian CKA behaves nonlinearly  ...  Acknowledgments: The author thanks Sarun Paisarnsrisomsuk, whose experiments for [16] motivated this paper, and Carolina Ruiz, for helpful comments on an earlier version of the manuscript.  ... 
arXiv:2112.09305v1 fatcat:zykqvmvm6zbbxnjjcj2qekmaby
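
The large-bandwidth limit claimed here is easy to probe numerically: after centring, a Gaussian RBF Gram matrix approaches a scaled centred linear Gram matrix as the bandwidth grows, and CKA is invariant to that scaling. A self-contained sketch; the data dimensions and bandwidth grid are arbitrary choices of ours:

```python
import numpy as np

def center_gram(K):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def cka(K, L):
    Kc, Lc = center_gram(K), center_gram(L)
    return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))

def rbf_gram(X, sigma):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
Y = rng.normal(size=(100, 7))
linear_cka = cka(X @ X.T, Y @ Y.T)
for sigma in (1.0, 10.0, 100.0, 1000.0):
    # Gaussian CKA should approach the linear value as sigma grows.
    print(f"sigma={sigma:7.1f}  gaussian={cka(rbf_gram(X, sigma), rbf_gram(Y, sigma)):.4f}"
          f"  linear={linear_cka:.4f}")
```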

Kernel Multivariate Analysis Framework for Supervised Subspace Learning: A Tutorial on Linear and Kernel Multivariate Methods

Jeronimo Arenas-Garcia, Kaare Brandt Petersen, Gustavo Camps-Valls, Lars Kai Hansen
2013 IEEE Signal Processing Magazine  
This paper provides a uniform treatment of several methods: Principal Component Analysis (PCA), Partial Least Squares (PLS), Canonical Correlation Analysis (CCA) and Orthonormalized PLS (OPLS), as well  ...  as their non-linear extensions derived by means of the theory of reproducing kernel Hilbert spaces.  ...  For instance, the Hilbert-Schmidt independence criterion (HSIC) [32] is a simple yet very effective method to estimate statistical dependence between random variables.  ... 
doi:10.1109/msp.2013.2250591 fatcat:fxcugxbr6bfzfgkroedwf23ddu
Showing results 1 — 15 out of 146 results