
Page 3242 of Mathematical Reviews Vol. , Issue 2004d [page]

2004 Mathematical Reviews  
greedy learning algorithms for sparse regression and classification with Mercer kernels.  ...  Summary: “We present some greedy learning algorithms for building sparse nonlinear regression and classification models from observational data using Mercer kernels.  ... 

Sparse Online Greedy Support Vector Regression [chapter]

Yaakov Engel, Shie Mannor, Ron Meir
2002 Lecture Notes in Computer Science  
We present a novel algorithm for sparse online greedy kernel-based nonlinear regression. This algorithm improves current approaches to kernel-based regression in two aspects.  ...  We show that the algorithm implements a form of gradient ascent and demonstrate its scaling and noise tolerance properties on three benchmark regression problems.  ...  Third, some technical improvements to the algorithm seem worthwhile. Specifically, the learning rates may be optimized, resulting in faster convergence.  ... 
doi:10.1007/3-540-36755-1_8 fatcat:26ea64jskzawrdjxa37bi7yq6a
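The kernel-based regression that the entries above build on shares one functional form: the regressor is a weighted expansion f(x) = Σᵢ αᵢ k(xᵢ, x) over Mercer kernel evaluations at training (or dictionary) points. A minimal self-contained sketch with a Gaussian kernel, not any particular paper's algorithm; the bandwidth `gamma` and regularizer `lam` are illustrative values:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=10.0):
    # Gaussian (Mercer) kernel matrix between row-vector sets X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(X, y, gamma=10.0, lam=1e-6):
    # Solve (K + lam*I) alpha = y for the expansion coefficients
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_test, gamma=10.0):
    # f(x) = sum_i alpha_i * k(x_i, x)
    return rbf_kernel(X_test, X_train, gamma) @ alpha

X = np.linspace(0.0, 1.0, 20)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
alpha = fit_kernel_ridge(X, y)
pred = predict(X, alpha, X)
```

The sparse and greedy variants surveyed here differ mainly in how they keep the expansion small, i.e. how many xᵢ are retained.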

Learning Bounds for Greedy Approximation with Explicit Feature Maps from Multiple Kernels [article]

Shahin Shahrampour, Vahid Tarokh
2018 arXiv   pre-print
Nonlinear kernels can be approximated using finite-dimensional feature maps for efficient risk minimization.  ...  In this work, we tackle this problem by efficiently choosing such features from multiple kernels in a greedy fashion.  ...  In a similar spirit is the work of [40], which concentrates on sparse regression and classification models using Mercer kernels, as well as the work of [41] that considers sparse regression with  ... 
arXiv:1810.03817v1 fatcat:qnivtflz7jgqtbd324plonc2xu

Kernel conditional random fields

John Lafferty, Xiaojin Zhu, Yan Liu
2004 Twenty-first international conference on Machine learning - ICML '04  
By incorporating kernels and implicit feature spaces into conditional graphical models, the framework enables semi-supervised learning algorithms for structured data through the use of graph kernels.  ...  A representer theorem for conditional graphical models is given which shows how kernel conditional random fields arise from risk minimization procedures defined using Mercer kernels on labeled graphs.  ...  Acknowledgments This work was supported in part by NSF ITR grants CCR-0122581, IIS-0205456 and IIS-0312814.  ... 
doi:10.1145/1015330.1015337 dblp:conf/icml/LaffertyZL04 fatcat:uhrihq6tkfgyviat42mbcxl27e

Updates for Nonlinear Discriminants

Edin Andelic, Martin Schafföner, Marcel Katz, Sven E. Krüger, Andreas Wendemuth
2007 International Joint Conference on Artificial Intelligence  
A novel training algorithm for nonlinear discriminants for classification and regression in Reproducing Kernel Hilbert Spaces (RKHSs) is presented.  ...  Various experiments for both classification and regression are performed to show the competitiveness of the proposed method.  ...  Experiments: To show the usefulness of the proposed method empirically, some experiments for regression and classification are performed.  ... 
dblp:conf/ijcai/AndelicSKKW07 fatcat:dkhwix5kjbcc5m62yfkk3jqe64

A Kernel-Based Framework for Medical Big-Data Analytics [chapter]

David Windridge, Miroslaw Bober
2014 Lecture Notes in Computer Science  
For pre-processing of image-based MR data we advocate a Deep Learning solution for contextual areal segmentation, with edit-distance based kernel measurement then used to characterize relevant morphology  ...  The challenge typically arises from the nature of the data, which may be heterogeneous, sparse, very high-dimensional, incomplete and inaccurate.  ...  Dealing with Heterogeneous Data: Kernel methods provide an excellent framework for combination of heterogeneous data for two principal reasons: Mercer Kernels Are Now for All Data Types. A kernel obeying  ... 
doi:10.1007/978-3-662-43968-5_11 fatcat:44khrkn4mbcl5acquwkluy2tky

Speed up kernel discriminant analysis

Deng Cai, Xiaofei He, Jiawei Han
2010 The VLDB journal  
In this paper, we present a new algorithm for kernel discriminant analysis, called Spectral Regression Kernel Discriminant Analysis (SRKDA).  ...  Moreover, it is easy to produce sparse projections (Sparse KDA) with an L1-norm regularizer.  ...  In [24, 25], Moghaddam et al. proposed a spectral bounds framework for sparse subspace learning. Particularly, they proposed both exact and greedy algorithms for sparse PCA and sparse LDA.  ... 
doi:10.1007/s00778-010-0189-3 fatcat:it4o4ogxlrevjno7mkan4jmwrm

Combining multiple kernels for efficient image classification

Behjat Siddiquie, Shiv N. Vitaladevuni, Larry S. Davis
2009 2009 Workshop on Applications of Computer Vision (WACV)  
To use these methods with several feature channels, one needs to combine base kernels computed from them. Multiple kernel learning is an effective method for combining the base kernels.  ...  Discriminative kernel-based methods, such as SVMs, have been shown to be quite effective for image classification.  ...  The classification results were compared with those of the Efficient Multiple Kernel Learning (EMKL) algorithm described in [22].  ... 
doi:10.1109/wacv.2009.5403040 dblp:conf/wacv/SiddiquieVD09 fatcat:dnemxn22e5fbxdxxitto65hf7a
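The base-kernel combination this entry refers to is, in its simplest form, a nonnegative weighted sum: since a nonnegative combination of positive semi-definite matrices is positive semi-definite, the result is again a valid kernel. MKL learns the weights; the toy matrices and fixed weights below are purely illustrative:

```python
import numpy as np

def combine_kernels(kernel_mats, weights):
    # Combined kernel K = sum_m beta_m * K_m; beta_m >= 0 keeps K PSD,
    # so the combination is itself a valid (Mercer) kernel
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0), "nonnegative weights preserve positive semi-definiteness"
    return sum(w * K for w, K in zip(weights, kernel_mats))

# two toy base kernels (e.g. one per feature channel) on the same three images
K_color = np.array([[1.0, 0.8, 0.1], [0.8, 1.0, 0.2], [0.1, 0.2, 1.0]])
K_shape = np.array([[1.0, 0.2, 0.9], [0.2, 1.0, 0.3], [0.9, 0.3, 1.0]])
K = combine_kernels([K_color, K_shape], [0.6, 0.4])
```

An SVM trained on `K` then uses evidence from both channels at once; MKL replaces the fixed weights with learned ones.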

The Kernel Recursive Least-Squares Algorithm

Y. Engel, S. Mannor, R. Meir
2004 IEEE Transactions on Signal Processing  
Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor.  ...  We demonstrate the performance and scaling properties of KRLS and compare it to a state-of-the-art Support Vector Regression algorithm, using both synthetic and real data.  ...  [22], along with some positive results concerning the convergence rates for sparse greedy algorithms [22, 44].  ... 
doi:10.1109/tsp.2004.830985 fatcat:mdgkagkwgvhclekfl2ochsonta
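The recursive construction mentioned in the abstract above can be illustrated without the paper's approximate-linear-dependence sparsification: each new sample grows the inverse kernel matrix by a Schur-complement block update instead of refactoring from scratch. This is a simplified sketch, not Engel et al.'s exact algorithm (no dictionary pruning; `gamma` and `lam` are illustrative):

```python
import numpy as np

def rbf(x, y, gamma):
    return np.exp(-gamma * np.sum((x - y) ** 2))

class SimpleKRLS:
    """Recursive kernel least-squares without sparsification: every sample
    enters the dictionary, and (K + lam*I)^{-1} is grown one row/column
    at a time via a block (Schur-complement) inverse update."""
    def __init__(self, gamma=30.0, lam=1e-6):
        self.gamma, self.lam = gamma, lam
        self.X, self.alpha, self.Kinv = [], None, None

    def update(self, x, y):
        if not self.X:
            k = rbf(x, x, self.gamma) + self.lam
            self.Kinv = np.array([[1.0 / k]])
            self.alpha = np.array([y / k])
        else:
            kvec = np.array([rbf(xi, x, self.gamma) for xi in self.X])
            a = self.Kinv @ kvec
            delta = rbf(x, x, self.gamma) + self.lam - kvec @ a  # Schur complement
            err = y - kvec @ self.alpha                          # prediction error
            n = len(self.X)
            Kinv = np.empty((n + 1, n + 1))
            Kinv[:n, :n] = self.Kinv + np.outer(a, a) / delta
            Kinv[:n, n] = Kinv[n, :n] = -a / delta
            Kinv[n, n] = 1.0 / delta
            self.Kinv = Kinv
            self.alpha = np.append(self.alpha - a * err / delta, err / delta)
        self.X.append(x)

    def predict(self, x):
        # f(x) = sum_i alpha_i * k(x_i, x)
        return sum(a * rbf(xi, x, self.gamma) for a, xi in zip(self.alpha, self.X))

model = SimpleKRLS()
for t in np.linspace(0.0, 1.0, 11):
    model.update(np.array([t]), np.sin(2 * np.pi * t))
```

KRLS proper adds a sparsification test so that only approximately linearly independent samples join the dictionary, which is what bounds its cost per step.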

A Mathematical Programming Approach to the Kernel Fisher Algorithm

Sebastian Mika, Gunnar Rätsch, Klaus-Robert Müller
2000 Neural Information Processing Systems  
From this understanding, we are able to outline an interesting kernel-regression technique based upon the KFD algorithm. Simulations support the usefulness of our approach.  ...  We find that both KFD and the proposed sparse KFD can be understood in a unifying probabilistic context. Furthermore, we show connections to Support Vector Machines and Relevance Vector Machines.  ...  Tsuda for helpful comments and discussions.  ... 
dblp:conf/nips/MikaRM00 fatcat:xc52ftrh2jcilkfclba4r5ejsa

Design of Non-Linear Discriminative Dictionaries for Image Classification [chapter]

Ashish Shrivastava, Hien V. Nguyen, Vishal M. Patel, Rama Chellappa
2013 Lecture Notes in Computer Science  
We propose a kernel-driven simultaneous orthogonal matching pursuit algorithm for the task of sparse coding in the feature space.  ...  learning algorithms.  ...  In [13], Zhang et al. propose a kernel version of the sparse representation-based classification algorithm, which was originally proposed for robust face recognition [15].  ... 
doi:10.1007/978-3-642-37331-2_50 fatcat:vy4oi52mtrb27aadu2mbciktzq

Super-Sparse Regression for Fast Age Estimation from Faces at Test Time [chapter]

Ambra Demontis, Battista Biggio, Giorgio Fumera, Fabio Roli
2015 Lecture Notes in Computer Science  
Many current methods for age estimation rely on extracting computationally-demanding features from face images, and then use nonlinear regression to estimate the subject's age.  ...  Given a similarity measure between faces, our technique learns a sparse set of virtual face prototypes, whose number is fixed a priori, along with a set of optimal weight coefficients to perform linear  ...  As mentioned before, interpretability of decisions is another important property to understand whether the regression algorithm has properly learned some aging pattern, and, thus, if it may correctly predict  ... 
doi:10.1007/978-3-319-23234-8_51 fatcat:wuwlhnpxxnh4bclwatggzurfoa

Fast Kernel Sparse Representation

Hanxi Li, Yongsheng Gao, Jun Sun
2011 2011 International Conference on Digital Image Computing: Techniques and Applications  
The proposed Kernel OMP (KOMP) is much faster than the existing methods, and demonstrates higher accuracy in some scenarios.  ...  A remarkable improvement (up to 2,750 times) in efficiency is reported for S-KOMP, with only a negligible loss of accuracy.  ...  Some state-of-the-art performances were reported with SR approaches. To achieve higher classification accuracy, Gao et al. [6] equip sparse representations with the kernel trick [7].  ... 
doi:10.1109/dicta.2011.20 dblp:conf/dicta/LiGS11 fatcat:7ugudibngjd6nicbsfhlyru57i
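The core of kernel OMP, before the speed-ups this entry reports, is greedy atom selection carried out entirely through kernel evaluations, so the feature map never appears explicitly. A minimal sketch under that assumption (not the paper's optimized KOMP or S-KOMP):

```python
import numpy as np

def kernel_omp(K_DD, k_Dx, n_atoms):
    """Orthogonal matching pursuit in the kernel-induced feature space.
    K_DD[i, j] = k(d_i, d_j) for dictionary atoms d_i, k_Dx[i] = k(d_i, x)."""
    support = []
    coef = None
    residual_corr = k_Dx.astype(float).copy()
    for _ in range(n_atoms):
        j = int(np.argmax(np.abs(residual_corr)))
        if j in support:
            break  # remaining correlations are (numerically) zero
        support.append(j)
        # least-squares fit of phi(x) on the selected atoms, via the Gram matrix
        coef = np.linalg.solve(K_DD[np.ix_(support, support)], k_Dx[support])
        # correlation of every atom with the residual phi(x) - Phi_S @ coef
        residual_corr = k_Dx - K_DD[:, support] @ coef
    return support, coef

# toy check with a linear kernel and orthonormal atoms e1, e2, e3:
# x = 2*e1 + 0.5*e3 should select atoms 0 and 2 with those coefficients
K_DD = np.eye(3)
k_Dx = np.array([2.0, 0.0, 0.5])
support, coef = kernel_omp(K_DD, k_Dx, n_atoms=2)
```

The simultaneous variant in the dictionary-learning entry above applies the same selection jointly across a group of signals.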

A Sparse Data Preprocessing Using Support Vector Regression
(Korean title: Preprocessing of Sparse Data Using Support Vector Regression)

Sung-Hae Jun, Jung-Eun Park, Kyung-Whan Oh
2004 Journal of Korean institute of intelligent systems  
SVR has been applied in various fields: time series and financial (noisy and risky) prediction, approximation of complex engineering analyses, convex quadratic programming and choices of loss functions  ...  Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize the generalization error bound so as to achieve generalized performance.  ...  A novel algorithm for sparse online greedy kernel-based nonlinear regression has implemented a form of gradient ascent and demonstrated its scaling and noise tolerance properties on three benchmark regression  ... 
doi:10.5391/jkiis.2004.14.6.789 fatcat:7ae6trcmqzdfnaqkwklnt7lgvy
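The loss behind the SVR formulation this entry describes is the epsilon-insensitive loss: residuals inside a tube of half-width ε cost nothing, which is what produces sparse (support-vector) solutions. A tiny worked illustration with made-up numbers:

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # residuals inside the eps-tube cost nothing; outside, cost grows linearly
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.05, 2.5, 2.0])
loss = eps_insensitive_loss(y_true, y_pred, eps=0.1)
# residuals 0.05 (inside the tube), 0.5, 1.0 give losses 0.0, 0.4, 0.9
```

SVR minimizes the sum of these losses plus an RKHS norm penalty; only points on or outside the tube receive nonzero dual weights.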

Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics [article]

Jean-Luc Akian and Luc Bonnet and Houman Owhadi and Éric Savin
2022 arXiv   pre-print
A first class of algorithms is kernel flow, which was introduced in the context of classification in machine learning.  ...  A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal in the associated RKHS.  ...  It was first used in a machine learning context for classification [55, 92] and more recently in geophysical forecasting [34] and with dynamical systems [20, 33].  ... 
arXiv:2206.02563v1 fatcat:r6biz4lt4jdxjfnk6ax42hxvge
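The kernel-flow idea mentioned above can be sketched in a few lines: a kernel is judged good if the interpolant changes little when trained on a random half of the data, measured by ρ = ||u − v||²_K / ||u||²_K, where u interpolates all data and v the subsample. Since v is the RKHS projection of u, ρ = 1 − ||v||²/||u||² and lies in [0, 1]. The Gaussian kernel, 1-D inputs, and jitter value below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def rbf_gram(x, gamma):
    # Gaussian kernel Gram matrix for 1-D inputs
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-gamma * d2)

def kernel_flow_rho(x, y, gamma, rng):
    """One-sample kernel-flow criterion: rho = 1 - ||v||^2_K / ||u||^2_K.
    Small rho means the kernel's interpolant is stable under subsampling;
    kernel-flow methods minimize rho (in expectation) over kernel parameters."""
    n = len(x)
    idx = rng.choice(n, n // 2, replace=False)
    K = rbf_gram(x, gamma) + 1e-10 * np.eye(n)   # small jitter for stability
    Kc = K[np.ix_(idx, idx)]
    full = y @ np.linalg.solve(K, y)             # ||u||^2_K
    half = y[idx] @ np.linalg.solve(Kc, y[idx])  # ||v||^2_K
    return 1.0 - half / full

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)
rho = kernel_flow_rho(x, y, gamma=10.0, rng=rng)
```

A full kernel-flow run would average ρ over many random subsamples and descend on `gamma` (or richer kernel parameters).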