29,871 Hits in 2.8 sec

Probabilistic kernel least mean squares algorithms

Il Memming Park, Sohan Seth, Steven Van Vaerenbergh
2014 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm.  ...  We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus, the KLMS can be interpreted as an approximate Bayesian filtering method.  ...  Inspired by the success of the LMS algorithm, a "kernelization" has recently been proposed under the name kernel least mean squares (KLMS) algorithm [1].  ...
doi:10.1109/icassp.2014.6855214 dblp:conf/icassp/ParkSV14 fatcat:rm6gllcivrfm5bhkht66xacnk4
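The abstract above describes KLMS as a "kernelized" LMS: the filter runs stochastic gradient descent on the mean square error in a reproducing kernel Hilbert space, keeping past inputs as a growing dictionary. A minimal sketch of that recursion, assuming a Gaussian kernel; the step size, kernel width, and toy data are illustrative and not taken from the paper:

```python
import numpy as np

def gauss_kernel(x, y, width):
    """Gaussian kernel between two input vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * width ** 2))

def klms(inputs, targets, step=0.5, width=0.3):
    """Kernel LMS: stochastic gradient descent on the MSE in feature space.

    Every processed input is kept as a dictionary centre, so the model is a
    growing kernel expansion; each new error appends one weighted centre.
    """
    centres, weights, predictions = [], [], []
    for x, d in zip(inputs, targets):
        y = sum(w * gauss_kernel(c, x, width) for c, w in zip(centres, weights))
        predictions.append(y)
        centres.append(x)
        weights.append(step * (d - y))  # the "kernelized" LMS weight update
    return np.array(predictions)

# toy nonlinear system: d = sin(3u), noise-free
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=(200, 1))
d = np.sin(3.0 * u[:, 0])
pred = klms(u, d)
late_mse = float(np.mean((d[100:] - pred[100:]) ** 2))
```

Because every input becomes a dictionary centre, the per-step cost grows linearly with time; the surprise criteria and probabilistic initializations in other results below exist largely to curb that growth.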

On the Relationship between Online Gaussian Process Regression and Kernel Least Mean Squares Algorithms [article]

Steven Van Vaerenbergh, Jesus Fernandez-Bes, Víctor Elvira
2016 arXiv   pre-print
We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms.  ...  The probabilistic perspective allows us to understand how each of them handles uncertainty, which could explain some of their performance differences.  ...  Kernel least-mean-squares (KLMS) algorithms alleviate this computational burden by performing stochastic gradient descent of the mean square error, resulting in linear complexity per time step [6] .  ... 
arXiv:1609.03164v1 fatcat:aihah5mnyncyrckyzr4dfbzakq

Improving the Accuracy of Least-Squares Probabilistic Classifiers

Makoto YAMADA, Masashi SUGIYAMA, Gordon WICHERN, Jaak SIMM
2011 IEICE transactions on information and systems  
The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression.  ...  Least-Squares Approach to Probabilistic Classification: In this section, we review the least-squares probabilistic classifier (LSPC) [1].  ...  Conclusions: The least-squares probabilistic classifier (LSPC) has been demonstrated to be a computationally efficient alternative to kernel logistic regression (KLR).  ...
doi:10.1587/transinf.e94.d.1337 fatcat:dzhndkm4f5fqjp7zgndnw5d7dq

Initialising Kernel Adaptive Filters via Probabilistic Inference [article]

Iván Castro, Cristóbal Silva, Felipe Tobar
2017 arXiv   pre-print
The proposed framework was validated on nonlinear time series of both synthetic and real-world nature, where it outperformed standard KAFs in terms of mean square error and the sparsity of the learnt dictionaries  ...  We present a probabilistic framework for both (i) determining the initial settings of kernel adaptive filters (KAFs) and (ii) constructing fully-adaptive KAFs whereby, in addition to weights and dictionaries  ...  By adapting these model parameters, algorithms such as the kernel least mean square (KLMS) [4], [5] provide an efficient way to improve signal estimation over time as more data become available.  ...
arXiv:1707.03450v1 fatcat:hh64oe4ztzffvap7x5uhwua4sa

Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting

Masashi SUGIYAMA
2010 IEICE transactions on information and systems  
In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC).  ...  In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability.  ...  In this paper, we proposed a simple probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC).  ...
doi:10.1587/transinf.e93.d.2690 fatcat:ywnz7kdisndjrkvr4iqep37yli
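The LSPC abstracts above state the key idea: model each class-posterior as a linear combination of kernel functions and fit the coefficients by regularized least squares against the 0/1 class indicators, which has a closed-form solution. A sketch under those stated ingredients; the Gaussian kernel width, regularization strength, and toy data are illustrative, and the clip-and-normalize post-processing follows the common description of LSPC rather than any specific equation from the paper:

```python
import numpy as np

def gauss_gram(A, B, width):
    """Gaussian kernel matrix between row-sample sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def lspc_fit(X, y, n_classes, width=1.0, lam=0.1):
    """Fit one coefficient vector per class by regularized least squares.

    Each class-posterior is modelled as a kernel expansion and fitted in
    closed form to the 0/1 indicator of that class: least-squares
    posterior fitting.
    """
    K = gauss_gram(X, X, width)
    A = K.T @ K / len(X) + lam * np.eye(len(X))
    coefs = [np.linalg.solve(A, K.T @ (y == c).astype(float) / len(X))
             for c in range(n_classes)]
    return np.stack(coefs)                      # (n_classes, n_train)

def lspc_predict_proba(coefs, X_train, X_test, width=1.0):
    """Clip negative model outputs to zero, then normalise across classes."""
    q = np.maximum(gauss_gram(X_test, X_train, width) @ coefs.T, 0.0)
    return q / q.sum(axis=1, keepdims=True)

# two well-separated Gaussian blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2.0, 0.5, (40, 2)), rng.normal(2.0, 0.5, (40, 2))])
y = np.array([0] * 40 + [1] * 40)
coefs = lspc_fit(X, y, n_classes=2)
proba = lspc_predict_proba(coefs, X, X)
acc = float(np.mean(proba.argmax(axis=1) == y))
```

Training is a solve of one regularized linear system per class, which is where the "superfast-trainable" claim comes from, in contrast to the iterative optimization kernel logistic regression requires.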

The Generalized LASSO

V. Roth
2004 IEEE Transactions on Neural Networks  
For regression functionals which can be modeled as iteratively reweighted least-squares (IRLS) problems, we present a highly efficient algorithm with guaranteed global convergence.  ...  In the last few years, the support vector machine (SVM) method has motivated new interest in kernel regression techniques.  ...  Furthermore, the SVM has also been generalized to squared loss functions and iteratively reweighted least-squares functionals, allowing probabilistic interpretations; see, e.g., [11]-[14].  ...
doi:10.1109/tnn.2003.809398 pmid:15387244 fatcat:65x4jj3hkfaohc5x3gxibz3dxu

On the embedding parameters in kernel identification problem of nonlinear dynamical systems

N R Antropov, E D Agafonov
2020 IOP Conference Series: Materials Science and Engineering  
The paper presents simulations of the kernel least mean squares algorithm on a one-step prediction problem for various values of the embedding lag and embedding dimension.  ...  A common question in the identification of dynamical systems is the sensitivity of kernel-based models to the selected embedding lag and embedding dimension.  ...  In particular, the probabilistic approach made it possible to develop a modification of the kernel least-squares method for non-stationary systems [18], which allows increasing the adaptability of the original  ...
doi:10.1088/1757-899x/734/1/012143 fatcat:d3gdqnqe35b75iogvgwczswgui

A least-squares approach to anomaly detection in static and sequential data

John A. Quinn, Masashi Sugiyama
2014 Pattern Recognition Letters  
We describe a probabilistic, nonparametric method for anomaly detection, based on a squared-loss objective function which has a simple analytical solution.  ...  The method shares the flexibility of other kernel-based anomaly detection methods, yet is typically much faster to train and test.  ...  Least-squares probabilistic classification: We now give a brief review of least-squares probabilistic classification (Sugiyama, 2010).  ...
doi:10.1016/j.patrec.2013.12.016 fatcat:j4gxfqqjsjdq5bpp3ejuv6azda

Fast Sparse Approximation for Least Squares Support Vector Machine

Licheng Jiao, Liefeng Bo, Ling Wang
2007 IEEE Transactions on Neural Networks  
Index Terms: Fast algorithm, greedy algorithm, least squares support vector machine (LS-SVM), sparse approximation.  ...  In this paper, we present two fast sparse approximation schemes for the least squares support vector machine (LS-SVM), named FSALS-SVM and PFSALS-SVM, to overcome the limitation of LS-SVM that it is not applicable  ...  [17] introduced a similar idea to solve kernel partial least squares regression.  ...
doi:10.1109/tnn.2006.889500 pmid:17526336 fatcat:evegripr2rgbvpq4ng7fa5dgiy
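For context on why LS-SVM sparsification matters: the least squares support vector machine replaces the SVM's quadratic programme with a single linear system over all training points, so every point generically receives a nonzero dual weight, and the model is dense. A sketch of the standard LS-SVM regression system; the Gaussian kernel width, regularization constant, and toy target are illustrative choices, not values from the paper:

```python
import numpy as np

def gauss_gram(A, B, width):
    """Gaussian kernel matrix between row-sample sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def lssvm_fit(X, y, gamma=100.0, width=0.2):
    """Solve the LS-SVM dual: one (n+1)x(n+1) linear system,
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y],
    in place of the SVM's quadratic programme."""
    n = len(X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = gauss_gram(X, X, width) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]                      # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_test, width=0.2):
    return gauss_gram(X_test, X_train, width) @ alpha + b

# smooth 1-D regression target
X = np.linspace(-1.0, 1.0, 60)[:, None]
y = np.sinc(3.0 * X[:, 0])
b, alpha = lssvm_fit(X, y)
fit_mse = float(np.mean((lssvm_predict(X, b, alpha, X) - y) ** 2))
```

Because `alpha` has one entry per training point, prediction cost scales with the full training set; greedy schemes such as FSALS-SVM approximate this dense solution with a small subset of basis points.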

A probabilistic least-mean-squares filter

Jesus Fernandez-Bes, Victor Elvira, Steven Van Vaerenbergh
2015 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
By means of an efficient approximation, this approach provides an adaptable step-size LMS algorithm together with a measure of uncertainty about the estimation.  ...  We introduce a probabilistic approach to the LMS filter.  ...  In this work, we provide a similar connection between state-space models and least-mean-squares (LMS).  ... 
doi:10.1109/icassp.2015.7178361 dblp:conf/icassp/Fernandez-BesEV15 fatcat:ygyxgfswzfgelgctepdepklgui
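The connection this abstract and the first one invoke (LMS as approximate Kalman/Bayesian filtering) can be illustrated with a scalar toy: place a random-walk prior on a single filter weight and run the exact Kalman recursion. The posterior-mean update then takes the LMS error-correction form with a step size set by the posterior variance, which doubles as the uncertainty measure. This is a one-dimensional illustration of the idea only, not the algorithm of the paper; the noise variances and toy data are assumptions:

```python
import numpy as np

def kalman_lms(u, d, q=1e-4, r=1e-2):
    """Exact Kalman filter for a scalar random-walk weight model:
    w_t = w_{t-1} + N(0, q),   d_t = u_t * w_t + N(0, r).

    The mean update is an LMS-style correction whose step size is set by
    the posterior variance P, which also quantifies estimation uncertainty.
    """
    w, P = 0.0, 1.0
    ws, Ps = [], []
    for ut, dt in zip(u, d):
        P = P + q                          # predict: random-walk diffusion
        gain = P * ut / (ut * ut * P + r)  # Kalman gain = adaptive step size
        w = w + gain * (dt - ut * w)       # LMS-like error correction
        P = (1.0 - gain * ut) * P          # posterior variance shrinks
        ws.append(w)
        Ps.append(P)
    return np.array(ws), np.array(Ps)

rng = np.random.default_rng(2)
u = rng.normal(size=500)
d = 0.7 * u + 0.1 * rng.normal(size=500)   # true weight 0.7, noise std 0.1
ws, Ps = kalman_lms(u, d)
```

Freezing `P` at a constant recovers plain fixed-step LMS; letting the filter track `P` is what yields the adaptable step size and the uncertainty estimate the abstract describes.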

Prediction error quantification through probabilistic scaling – EXTENDED VERSION [article]

Victor Mirasierra, Martina Mammarella, Fabrizio Dabbene, Teodoro Alamo
2021 arXiv   pre-print
We illustrate the results of the paper by means of a numerical example.  ...  In this paper, we address the probabilistic error quantification of a general class of prediction methods.  ...  We notice that the proposed estimator is a weighted least square estimator with a ridge regression regularization term [11, 10] .  ... 
arXiv:2105.14187v2 fatcat:s3kgr6xvjrcjnbpmztqa3y7yei

A Constrained EM Algorithm for Principal Component Analysis

Jong-Hoon Ahn, Jong-Hoon Oh
2003 Neural Computation  
The method is easily applied to kernel PCA. It is also shown that the new EM algorithm is derived from a generalized least-squares formulation.  ...  The single probabilistic PCA, especially in the case where there is no noise, can find only a vector set that is a linear superposition of principal components and requires postprocessing, such as diagonalization  ...  The constrained EM algorithm derived from them gradually recovers the actual principal components, and it is naturally applied to kernel PCA (4.2), which is derived in appendix A.  ...
doi:10.1162/089976603321043694 pmid:12590819 fatcat:33wfbbz2l5b73hskymcadepcri

Parallel Computing of Kernel Density Estimates with MPI [chapter]

Szymon Łukasik
2007 Lecture Notes in Computer Science  
Kernel density estimation is nowadays a very popular tool for nonparametric probabilistic density estimation.  ...  For an n-dimensional probabilistic variable X with a sample x_i of size m, kernel K and bandwidth h, the kernel density estimate evaluated at x is defined as the function $\hat{f}(x) = \frac{1}{m h^n} \sum_{i=1}^{m} K\!\left(\frac{x - x_i}{h}\right)$.  ...  Also, the least squares cross-validation method (LSCV) [10, 11], where selecting the optimal bandwidth is based on minimizing an objective function g(h), has the same polynomial time complexity.  ...
doi:10.1007/978-3-540-72588-6_120 fatcat:35oslyii4va5rgrg7pukbgh4hy
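The estimator defined in the snippet above translates directly to code. This is a serial sketch of the formula, assuming a Gaussian kernel and an illustrative bandwidth; the chapter's contribution, the parallel MPI evaluation, is not reproduced here, but note that every evaluation point loops over the full sample, which is what makes the computation worth distributing:

```python
import numpy as np

def kde(x, sample, h):
    """Gaussian-kernel density estimate:
    f_hat(x) = 1/(m h^n) * sum_{i=1}^{m} K((x - x_i) / h),
    for an n-dimensional sample of size m and bandwidth h.
    """
    m, n = sample.shape
    diffs = (x[:, None, :] - sample[None, :, :]) / h           # (q, m, n)
    K = np.exp(-0.5 * (diffs ** 2).sum(-1)) / (2.0 * np.pi) ** (n / 2.0)
    return K.sum(axis=1) / (m * h ** n)

rng = np.random.default_rng(3)
sample = rng.normal(0.0, 1.0, size=(5000, 1))   # standard normal sample
dens = kde(np.array([[0.0], [3.0]]), sample, h=0.3)
```

With 5000 standard-normal draws, the estimate at 0 lands near the true density 0.399 (slightly smoothed by the bandwidth), while the estimate at 3 is close to zero, as expected for a tail point.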

Nonlinear Adaptive Estimation by Kernel Least Mean Square with Surprise Criterion and Parallel Hyperslab Projection along Affine Subspaces Algorithm

Angie Forero, Celso P. Bottura
2018 Proceedings of the 15th International Conference on Informatics in Control, Automation and Robotics  
adaptive nonlinear estimation problem; the kernel least mean square with surprise criterion, which uses concepts of likelihood and Bayesian inference to predict the posterior distribution of the data, guaranteeing  ...  It is based on the combination of: (i) the reproducing kernel, to deal with the high complexity of nonlinear systems; (ii) the parallel hyperslab projection along affine subspace learning algorithm, to deal with  ...  at low computational cost using ideas of the kernel least mean square.  ...
doi:10.5220/0006867803720378 dblp:conf/icinco/ForeroB18 fatcat:o7u6h2hxlnhcbflpwavdkehqk4

http://www.ijmlc.org/index.php?m=content&c=index&a=show&catid=108&id=1140

Hiroyuki Yoda, University of Tsukuba, Ibaraki, Japan, Akira Imakura, Momo Matsuda, Xiucai Ye, Tetsuya Sakurai
2020 International Journal of Machine Learning and Computing  
Our method is inspired by the Least-Squares Probabilistic Classifier (LSPC), which is an efficient multi-class classification method.  ...  Index Terms: Novelty detection, multimodal datasets, least-squares probabilistic analysis. Our method shows competitive results on both an artificial dataset and benchmark datasets.  ...  Algorithm 1: Novelty Detection based on Least-Squares Probabilistic Analysis. Input: training samples $\{(x_i, y_i)\}_{i=1}^{n}$, test sample $x_{\mathrm{te}}$, bandwidth for the kernel function, and regularization parameter.  ...
doi:10.18178/ijmlc.2020.10.4.968 fatcat:mohj5acvubga3jwwkgl3mrocmy
Showing results 1 — 15 out of 29,871 results