A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2018 (file type: `application/pdf`); you can also visit the original URL.

### Probabilistic kernel least mean squares algorithms

2014 · 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

doi:10.1109/icassp.2014.6855214 · dblp:conf/icassp/ParkSV14 · fatcat:rm6gllcivrfm5bhkht66xacnk4

The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm. ... We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus, the KLMS can be interpreted as an approximate Bayesian filtering method. ... Inspired by the success of the LMS algorithm, a "kernelization" has recently been proposed under the name kernel least mean squares (KLMS) algorithm [1]. ...
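As a concrete illustration of the KLMS idea in this snippet (a generic sketch, not code from the paper): the filter keeps a growing dictionary of past inputs, predicts with a kernel expansion, and adds one center per sample with a coefficient proportional to the instantaneous error. The Gaussian kernel, step size `eta`, and width `sigma` are illustrative choices.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

def klms(xs, ds, eta=0.5, sigma=1.0):
    """Kernel LMS: stochastic gradient descent of the mean-square error in
    an RBF feature space.  The filter output is the kernel expansion
    f(x) = sum_i a_i k(x_i, x); each new sample joins the dictionary with
    coefficient a = eta * (d - f(x)).  Returns the dictionary, the
    coefficients, and the prediction made before each update."""
    centers, coeffs, preds = [], [], []
    for x, d in zip(xs, ds):
        y = sum(a * gaussian_kernel(c, x, sigma) for c, a in zip(centers, coeffs))
        e = d - y                  # instantaneous prediction error
        centers.append(x)          # grow the dictionary by one center
        coeffs.append(eta * e)     # coefficient proportional to the error
        preds.append(y)
    return centers, coeffs, preds
```

On a nonlinear target such as d = sin(x), the late-sample error falls well below the early-sample error, at the cost of a dictionary that grows with every sample — the burden addressed by the sparsification criteria in later kernel adaptive filtering work.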
### On the Relationship between Online Gaussian Process Regression and Kernel Least Mean Squares Algorithms [article]

2016 · arXiv · pre-print

arXiv:1609.03164v1 · fatcat:aihah5mnyncyrckyzr4dfbzakq

We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms. ... The probabilistic perspective allows us to understand how each of them handles uncertainty, which could explain some of their performance differences. ... Kernel least-mean-squares (KLMS) algorithms alleviate this computational burden by performing stochastic gradient descent of the mean square error, resulting in linear complexity per time step [6]. ...
### Improving the Accuracy of Least-Squares Probabilistic Classifiers

2011 · IEICE Transactions on Information and Systems

doi:10.1587/transinf.e94.d.1337 · fatcat:dzhndkm4f5fqjp7zgndnw5d7dq

The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression. ... Least-Squares Approach to Probabilistic Classification: In this section, we review the least-squares probabilistic classifier (LSPC) [1]. ... Conclusions: The least-squares probabilistic classifier (LSPC) has been demonstrated to be a computationally efficient alternative to kernel logistic regression (KLR). ...
### Initialising Kernel Adaptive Filters via Probabilistic Inference [article]

2017 · arXiv · pre-print

arXiv:1707.03450v1 · fatcat:hh64oe4ztzffvap7x5uhwua4sa

The proposed framework was validated on nonlinear time series of both synthetic and real-world nature, where it outperformed standard KAFs in terms of mean square error and the sparsity of the learnt dictionaries. ... We present a probabilistic framework for both (i) determining the initial settings of kernel adaptive filters (KAFs) and (ii) constructing fully-adaptive KAFs whereby, in addition to weights and dictionaries ... By adapting these model parameters, algorithms such as kernel least mean square (KLMS) [4], [5] provide an efficient way to improve signal estimation over time as more data become available. ...
### Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting

2010 · IEICE Transactions on Information and Systems

doi:10.1587/transinf.e93.d.2690 · fatcat:ywnz7kdisndjrkvr4iqep37yli

In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). ... In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. ... In this paper, we proposed a simple probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). ...
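The regularized least-squares posterior fit this abstract describes can be sketched as follows (a hedged reconstruction from the LSPC literature, not the paper's own code): each class-posterior is modeled as a linear combination of Gaussian kernels, fitted in closed form, and the outputs are clipped at zero and normalized across classes.

```python
import numpy as np

def _phi(X, centers, sigma):
    """Gaussian kernel design matrix Phi[i, l] = K(x_i, c_l)."""
    sq = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def lspc_fit(X, y, centers, sigma=1.0, lam=0.1):
    """Fit one coefficient vector per class by regularized least squares:
    theta_c = (Phi^T Phi / n + lam I)^(-1) Phi^T 1_c / n, where 1_c is the
    indicator vector of class c."""
    X, y = np.asarray(X, float), np.asarray(y, int)
    Phi = _phi(X, np.asarray(centers, float), sigma)
    n, b = Phi.shape
    onehot = np.eye(y.max() + 1)[y]                 # (n, n_classes)
    A = Phi.T @ Phi / n + lam * np.eye(b)
    return np.linalg.solve(A, Phi.T @ onehot / n)   # (b, n_classes)

def lspc_predict_proba(X, centers, Theta, sigma=1.0):
    """Evaluate the kernel models, clip negative outputs to zero, and
    normalize across classes to obtain posterior probabilities."""
    Phi = _phi(np.asarray(X, float), np.asarray(centers, float), sigma)
    P = np.maximum(Phi @ Theta, 0.0)
    return P / np.clip(P.sum(axis=1, keepdims=True), 1e-12, None)
```

The training cost is one linear solve per fit (the coefficient matrix is shared across classes), which is what makes the method "superfast-trainable" compared with the iterative optimization of kernel logistic regression.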
### The Generalized LASSO

2004 · IEEE Transactions on Neural Networks

doi:10.1109/tnn.2003.809398 · pmid:15387244 · fatcat:65x4jj3hkfaohc5x3gxibz3dxu

For regression functionals which can be modeled as iteratively reweighted least-squares (IRLS) problems, we present a highly efficient algorithm with guaranteed global convergence. ... In the last few years, the support vector machine (SVM) method has motivated new interest in kernel regression techniques. ... Furthermore, the SVM has also been generalized to squared loss functions and iteratively reweighted least-squares functionals, allowing probabilistic interpretations; see, e.g., [11]-[14]. ...
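To make the IRLS scheme mentioned here concrete, the sketch below applies it to a simpler instance — least-absolute-deviation regression, not the paper's generalized-LASSO functional: each iteration solves a weighted least-squares problem whose weights come from the current residuals.

```python
import numpy as np

def irls_lad(X, y, iters=50, eps=1e-6):
    """IRLS for least-absolute-deviation regression (an illustrative IRLS
    instance): minimizing sum_i |y_i - x_i . w| by repeatedly solving a
    weighted least-squares problem with weights 1 / max(|r_i|, eps),
    where r_i are the residuals of the current iterate."""
    w = np.linalg.lstsq(X, y, rcond=None)[0]        # ordinary LS start
    for _ in range(iters):
        r = y - X @ w
        weights = 1.0 / np.maximum(np.abs(r), eps)  # L1-loss IRLS weights
        Xw = X * weights[:, None]                   # row-scaled design matrix
        w = np.linalg.solve(X.T @ Xw, Xw.T @ y)     # (X^T W X) w = X^T W y
    return w
```

Because the effective loss grows only linearly in the residual, gross outliers receive small weights and barely move the fit, unlike ordinary least squares.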
### On the embedding parameters in kernel identification problem of nonlinear dynamical systems

2020 · IOP Conference Series: Materials Science and Engineering

doi:10.1088/1757-899x/734/1/012143 · fatcat:d3gdqnqe35b75iogvgwczswgui

The paper presents simulations of the kernel least mean squares algorithm on a one-step prediction problem for various values of embedding lag and embedding dimension. ... A common question in the identification of dynamical systems is the sensitivity of kernel-based models to the selected embedding lag and embedding dimension. ... In particular, the probabilistic approach made it possible to develop a modification of the kernel least-squares method for non-stationary systems [18], which allows increasing the adaptability of the original ...
### A least-squares approach to anomaly detection in static and sequential data

2014 · Pattern Recognition Letters

doi:10.1016/j.patrec.2013.12.016 · fatcat:j4gxfqqjsjdq5bpp3ejuv6azda

We describe a probabilistic, nonparametric method for anomaly detection, based on a squared-loss objective function which has a simple analytical solution. ... The method shares the flexibility of other kernel-based anomaly detection methods, yet is typically much faster to train and test. ... Least-squares probabilistic classification: We now give a brief review of least-squares probabilistic classification (Sugiyama, 2010). ...
### Fast Sparse Approximation for Least Squares Support Vector Machine

2007 · IEEE Transactions on Neural Networks

doi:10.1109/tnn.2006.889500 · pmid:17526336 · fatcat:evegripr2rgbvpq4ng7fa5dgiy

Index Terms: fast algorithm, greedy algorithm, least squares support vector machine (LS-SVM), sparse approximation. ... In this paper, we present two fast sparse approximation schemes for the least squares support vector machine (LS-SVM), named FSALS-SVM and PFSALS-SVM, to overcome the limitation of LS-SVM that it is not applicable ... [17] introduced a similar idea to solve kernel partial least squares regression. ...
### A probabilistic least-mean-squares filter

2015 · 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

doi:10.1109/icassp.2015.7178361 · dblp:conf/icassp/Fernandez-BesEV15 · fatcat:ygyxgfswzfgelgctepdepklgui

By means of an efficient approximation, this approach provides an adaptable step-size LMS algorithm together with a measure of uncertainty about the estimation. ... We introduce a probabilistic approach to the LMS filter. ... In this work, we provide a similar connection between state-space models and least-mean-squares (LMS). ...
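The state-space/LMS connection this abstract refers to can be sketched as follows (an assumed random-walk state model with an isotropic covariance approximation, not necessarily the authors' exact algorithm): the Kalman gain collapses to an LMS-style correction whose step size adapts to a tracked scalar uncertainty `s`.

```python
import numpy as np

def prob_lms_step(w, s, x, d, q=1e-4, r=1e-2):
    """One step of an LMS-like filter derived from the random-walk model
    w_t = w_{t-1} + N(0, q I),  d_t = x_t . w_t + N(0, r),
    keeping the covariance isotropic, P = s I (a simplifying assumption).
    The Kalman gain then yields a data-dependent LMS step size."""
    D = len(x)
    s = s + q                                       # predict: uncertainty grows
    e = d - float(np.dot(x, w))                     # innovation (prediction error)
    eta = s / (s * float(np.dot(x, x)) + r)         # adaptive step size
    w = w + eta * e * x                             # LMS-style correction
    s = s * (1.0 - eta * float(np.dot(x, x)) / D)   # trace-matched variance update
    return w, s, e
```

As the estimate improves, `s` shrinks and the step size decreases automatically; the process-noise variance `q` keeps `s` bounded away from zero so the filter can still track slow changes.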
### Prediction error quantification through probabilistic scaling – EXTENDED VERSION [article]

2021 · arXiv · pre-print

arXiv:2105.14187v2 · fatcat:s3kgr6xvjrcjnbpmztqa3y7yei

We illustrate the results of the paper by means of a numerical example. ... In this paper, we address the probabilistic error quantification of a general class of prediction methods. ... We notice that the proposed estimator is a weighted least squares estimator with a ridge regression regularization term [11, 10]. ...
### A Constrained EM Algorithm for Principal Component Analysis

2003 · Neural Computation

doi:10.1162/089976603321043694 · pmid:12590819 · fatcat:33wfbbz2l5b73hskymcadepcri

The method is easily applied to kernel PCA. It is also shown that the new EM algorithm is derived from a generalized least-squares formulation. ... The single probabilistic PCA, especially for the case where there is no noise, can find only a vector set that is a linear superposition of the principal components and requires postprocessing, such as diagonalization ... The constrained EM algorithm derived from them gradually recovers the actual principal components, and it is naturally applied to kernel PCA (4.2), which are derived in appendix A. ...
### Parallel Computing of Kernel Density Estimates with MPI [chapter]

2007 · Lecture Notes in Computer Science

Kernel density estimation is nowadays a very popular tool for nonparametric probabilistic density estimation. ... For an n-dimensional probabilistic variable X with a sample x_i of length m, kernel K, and bandwidth h, the kernel density estimate evaluated at x is defined as the function $\hat{f}(x) = \frac{1}{m h^n} \sum_{i=1}^{m} K\!\left(\frac{x - x_i}{h}\right)$ ... Also the least squares cross-validation method (LSCV) [10, 11], where selecting the optimal bandwidth is based on minimizing an objective function g(h), has the same polynomial time complexity. ...
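The estimator quoted in this snippet translates directly into code; the sketch below uses a standard n-dimensional Gaussian kernel (the kernel choice and bandwidth are illustrative, not prescribed by the chapter).

```python
import numpy as np

def gaussian_kde(x, samples, h):
    """Evaluate f_hat(x) = 1/(m h^n) * sum_{i=1}^m K((x - x_i) / h)
    for m samples of dimension n, using the standard n-dimensional
    Gaussian kernel K(u) = exp(-|u|^2 / 2) / (2 pi)^(n/2)."""
    samples = np.asarray(samples, float)
    m, n = samples.shape
    u = (np.asarray(x, float) - samples) / h                       # (m, n) scaled offsets
    K = np.exp(-0.5 * (u * u).sum(axis=1)) / (2.0 * np.pi) ** (n / 2.0)
    return float(K.sum() / (m * h ** n))
```

Each evaluation costs O(mn), which is why the chapter's parallelization over samples (and LSCV bandwidth candidates) pays off for large m.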

### Nonlinear Adaptive Estimation by Kernel Least Mean Square with Surprise Criterion and Parallel Hyperslab Projection along Affine Subspaces Algorithm

2018 · Proceedings of the 15th International Conference on Informatics in Control, Automation and Robotics

doi:10.5220/0006867803720378 · dblp:conf/icinco/ForeroB18 · fatcat:o7u6h2hxlnhcbflpwavdkehqk4

adaptive nonlinear estimation problem; the kernel least mean square with surprise criterion that uses concepts of likelihood and Bayesian inference to predict the posterior distribution of data, guaranteeing ... It is based on the combination of: the reproducing kernel, to deal with the high complexity of nonlinear systems; the parallel hyperslab projection along affine subspace learning algorithm, to deal with ... at low computational cost using ideas of the kernel least mean square. ...
### http://www.ijmlc.org/index.php?m=content&c=index&a=show&catid=108&id=1140

2020 · International Journal of Machine Learning and Computing

doi:10.18178/ijmlc.2020.10.4.968 · fatcat:mohj5acvubga3jwwkgl3mrocmy

Our method is inspired by the Least-Squares Probabilistic Classifier (LSPC), which is an efficient multi-class classification method. ... Index Terms: novelty detection, multimodal datasets, least-squares probabilistic analysis. Our method shows competitive results on both artificial and benchmark datasets. ... Algorithm 1: Novelty Detection based on Least-Squares Probabilistic Analysis. Input: training samples $\{(x_i, y_i)\}_{i=1}^{n}$, test sample $x_{\mathrm{te}}$, bandwidth for the kernel function, and regularization parameter. ...
*Showing results 1 — 15 out of 29,871 results*