
Bayesian Extensions of Kernel Least Mean Squares [article]

Il Memming Park, Sohan Seth, Steven Van Vaerenbergh
2013 arXiv   pre-print
The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm.  ...  We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus, the KLMS can be interpreted as an approximate Bayesian filtering method.  ...  Inspired by the success of the LMS algorithm, a "kernelization" has recently been proposed under the name kernel least mean squares (KLMS) algorithm [7].  ...
arXiv:1310.5347v1 fatcat:x2khe5ap2zdctmgjqndmjwywou
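
The abstract's one-line description of KLMS translates almost directly into code. Below is a minimal sketch of the idea as the summary states it: LMS run in the RKHS of a Gaussian kernel, storing each input as a center weighted by the instantaneous error. The step size, kernel width, and toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def klms(x, y, eta=0.2, sigma=0.5):
    """Kernel LMS sketch: f(x) = sum_i alpha_i * k(x_i, x), Gaussian kernel."""
    centers, alphas, preds = [], [], []
    for xt, yt in zip(x, y):
        # predict with the current kernel expansion
        ft = sum(a * np.exp(-(c - xt) ** 2 / (2 * sigma ** 2))
                 for a, c in zip(alphas, centers))
        preds.append(ft)
        # LMS-style update: store the new input as a center scaled by the error
        centers.append(xt)
        alphas.append(eta * (yt - ft))
    return np.array(preds)

# toy nonlinear system: y = sin(3x) + noise (an assumption for illustration)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(500)
preds = klms(x, y)
print("MSE over the last 100 samples:", np.mean((y[-100:] - preds[-100:]) ** 2))
```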

On the Relationship between Online Gaussian Process Regression and Kernel Least Mean Squares Algorithms [article]

Steven Van Vaerenbergh, Jesus Fernandez-Bes, Víctor Elvira
2016 arXiv   pre-print
We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms.  ...  The probabilistic perspective allows us to understand how each of them handles uncertainty, which could explain some of their performance differences.  ...  By doing so, GP regression can be considered the natural Bayesian nonlinear extension of linear minimum mean square error estimation (MMSE) algorithms, which are central in signal processing [3] .  ... 
arXiv:1609.03164v1 fatcat:aihah5mnyncyrckyzr4dfbzakq
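
To make the comparison in this abstract concrete, here is a sketch of exact online Gaussian process regression: its posterior mean plays the role of the KLMS prediction, while the posterior variance tracks the uncertainty the authors discuss. The hyperparameters and data are illustrative assumptions, and the efficient recursive updates an online GP would use in practice are omitted for brevity.

```python
import numpy as np

def k(a, b, sigma=0.5):
    return np.exp(-np.subtract.outer(a, b) ** 2 / (2 * sigma ** 2))

def online_gp(x, y, noise=0.1):
    """Exact GP prediction for each sample, using all previous samples."""
    means, variances = [], []
    for t in range(len(x)):
        if t == 0:
            means.append(0.0)                       # prior mean
            variances.append(1.0 + noise ** 2)      # prior predictive variance
            continue
        X, Y = x[:t], y[:t]
        G = k(X, X) + noise ** 2 * np.eye(t)
        kx = k(X, x[t:t + 1])[:, 0]
        w = np.linalg.solve(G, kx)
        means.append(w @ Y)                          # predictive mean
        variances.append(1.0 - w @ kx + noise ** 2)  # predictive variance
    return np.array(means), np.array(variances)

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(100)
mu, var = online_gp(x, y)
print("mean predictive variance:", var.mean())
```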

Probabilistic kernel least mean squares algorithms

Il Memming Park, Sohan Seth, Steven Van Vaerenbergh
2014 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm.  ...  We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus, the KLMS can be interpreted as an approximate Bayesian filtering method.  ...  Inspired by the success of the LMS algorithm, a "kernelization" has recently been proposed under the name kernel least mean squares (KLMS) algorithm [1].  ...
doi:10.1109/icassp.2014.6855214 dblp:conf/icassp/ParkSV14 fatcat:rm6gllcivrfm5bhkht66xacnk4
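
The claimed link between LMS and Kalman filtering can be illustrated directly: treat the weight vector as a random-walk state observed through y_t = x_t^T w_t + noise, and the Kalman gain behaves as an adaptive LMS step size. The noise levels and toy data below are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

def kalman_lms(X, y, q=1e-4, r=0.1):
    """Kalman filter for a random-walk weight state; cf. LMS w += eta*e*x."""
    n = X.shape[1]
    w = np.zeros(n)          # state estimate (filter weights)
    P = np.eye(n)            # state covariance
    for xt, yt in zip(X, y):
        P = P + q * np.eye(n)            # random-walk prediction step
        e = yt - xt @ w                  # innovation (prediction error)
        s = xt @ P @ xt + r              # innovation variance
        K = P @ xt / s                   # Kalman gain ~ adaptive step size
        w = w + K * e                    # LMS-like correction
        P = P - np.outer(K, xt @ P)      # covariance update
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(500)
print("estimated weights:", np.round(kalman_lms(X, y), 3))
```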

Selection properties of type II maximum likelihood (empirical Bayes) in linear models with individual variance components for predictors

Tahira Jamil, Cajo J.F. ter Braak
2012 Pattern Recognition Letters  
We show analytically that RVM selects predictors when the absolute z-ratio (|least squares estimate|/standard error) exceeds 1 in the case of orthogonal predictors and, for M = 2, that this still holds  ...  In extensions of RVM to obtain stronger selection, improper priors (based on the inverse gamma family) have been assigned to the inverse precisions (variances) with parameters estimated by penalized marginal  ...  Jamil's research was supported by a grant from Higher Education Commission of Pakistan through NUFFIC (The Netherlands).  ... 
doi:10.1016/j.patrec.2012.01.004 fatcat:ikr6jwm7pjbjje6vbuw6vl75gm
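
The quoted selection rule is easy to check numerically: with orthogonal predictors, compute the OLS estimates and their standard errors and retain the predictors whose absolute z-ratio exceeds 1. The design and coefficient values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 200, 5
X, _ = np.linalg.qr(rng.standard_normal((n, m)))   # orthonormal predictors
beta = np.array([2.0, 0.5, 0.0, 1.0, 0.05])        # assumed true coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

b_ols, res, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2 = res[0] / (n - m)                          # residual variance estimate
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
z = np.abs(b_ols) / se                             # |estimate| / standard error
print("selected predictors (|z| > 1):", np.flatnonzero(z > 1))
```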

Discriminant Kernels derived from the optimum nonlinear discriminant analysis

Takio Kurita
2011 The 2011 International Joint Conference on Neural Networks  
KDA is one of the nonlinear extensions of LDA and constructs a nonlinear discriminant mapping by using kernel functions.  ...  This means that the ONDA is closely related to Bayesian decision theory.  ...  MLR is known as one of the generalized linear models (GLMs), which are a flexible generalization of ordinary least squares regression.  ...
doi:10.1109/ijcnn.2011.6033235 dblp:conf/ijcnn/Kurita11 fatcat:zbsatzixpvgoxfulp6qpgeg3ki
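
For concreteness, here is a minimal two-class kernel Fisher discriminant of the kind "KDA" refers to: Fisher's criterion solved in the span of kernel functions centered at the training points, for 1-D inputs. The Gaussian kernel width and the regularizer are illustrative assumptions.

```python
import numpy as np

def kfd(X, labels, sigma=1.0, eps=1e-3):
    """Two-class kernel Fisher discriminant for 1-D inputs X."""
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
    m = [K[:, labels == c].mean(axis=1) for c in (0, 1)]   # class kernel means
    N = np.zeros_like(K)
    for c in (0, 1):
        Kc = K[:, labels == c]
        nc = Kc.shape[1]
        # within-class scatter in feature space
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1 / nc)) @ Kc.T
    alpha = np.linalg.solve(N + eps * np.eye(len(X)), m[0] - m[1])
    return alpha, K

rng = np.random.default_rng(2)
X = np.concatenate([rng.normal(-1, 0.3, 30), rng.normal(1, 0.3, 30)])
labels = np.array([0] * 30 + [1] * 30)
alpha, K = kfd(X, labels)
scores = K @ alpha        # 1-D discriminant projection
```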

Page 7718 of Mathematical Reviews Vol. , Issue 96m [page]

1996 Mathematical Reviews  
We derive mean squared error results for the closeness of this estimator to both the true density and the unbinned kernel estimator.  ...  Summary: "We provide an asymptotic formula for the mean integrated squared error (MISE) of nonlinear wavelet-based density estimators.  ...
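
A small experiment makes the first quoted result tangible: build a binned kernel density estimate and measure its closeness to the unbinned estimator. The bandwidth, grid, and bin count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
grid = np.linspace(-4, 4, 401)
h = 0.3                                   # assumed bandwidth

def kde(points, weights, t):
    """Weighted Gaussian kernel density estimate evaluated on grid t."""
    z = np.exp(-(t - points[:, None]) ** 2 / (2 * h ** 2))
    return (weights[:, None] * z).sum(axis=0) / (np.sqrt(2 * np.pi) * h)

unbinned = kde(x, np.full(len(x), 1 / len(x)), grid)
counts, edges = np.histogram(x, bins=80, range=(-4, 4))
centers = 0.5 * (edges[:-1] + edges[1:])
binned = kde(centers, counts / counts.sum(), grid)   # binned estimator
print("mean squared difference:", np.mean((binned - unbinned) ** 2))
```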

A kernel-based Bayesian approach to climatic reconstruction

I. Robertson, D. Lucy, L. Baxter, A. M. Pollard, R. G. Aykroyd, A. C. Barker, A. H.C. Carter, V. R. Switsur, J. S. Waterhouse
1999 The Holocene 9(4): 525-530
To understand recent climatic trends and possible future climatic  ...  Proxy measures of past climatic fluctuations can be used to extend this record beyond the limited period of instrumental measurements.  ...  One of the authors (IR) was supported by a Henry Giles Fellowship. The work was also supported in part by a grant from the Natural Environment Research Council (NERC GR3/11395).  ...
doi:10.1191/095968399676373488 fatcat:eca7vcjuojekbhmytiqmpvbmiy

Bayesian kernel-based system identification with quantized output data [article]

Giulio Bottegal, Gianluigi Pillonetto, Håkan Hjalmarsson
2015 arXiv   pre-print
We model the impulse response as a zero-mean Gaussian process whose covariance (kernel) is given by the recently proposed stable spline kernel, which encodes information on regularity and exponential stability  ...  Numerical simulations show a substantial improvement in the accuracy of the estimates over state-of-the-art kernel-based methods when employed in identification of systems with quantized data.  ...  Here instead, g is computed by means of (20), which is a minimum mean square error Bayes estimator.  ...
arXiv:1504.06877v1 fatcat:j2ph5fwqh5b67doe2iv6xcdrre
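
A sketch of the modeling step the abstract describes: take the impulse response to be zero-mean Gaussian with a first-order stable spline kernel K[i, j] = beta**max(i, j), and compute the posterior (minimum mean square error) mean from linear output data. The values of beta and the noise level are assumptions, and the paper's quantized-output likelihood is not reproduced here.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(3)
n, T, beta, noise = 50, 200, 0.9, 0.1
g_true = 0.8 ** np.arange(n) * np.sin(0.5 * np.arange(n))  # assumed system
u = rng.standard_normal(T)
Phi = toeplitz(u, np.zeros(n))          # convolution (regression) matrix
y = Phi @ g_true + noise * rng.standard_normal(T)

i = np.arange(1, n + 1)
K = beta ** np.maximum.outer(i, i)      # first-order stable spline kernel
# posterior mean of g given y (the MMSE Bayes estimate under Gaussian noise)
g_hat = K @ Phi.T @ np.linalg.solve(Phi @ K @ Phi.T + noise**2 * np.eye(T), y)
print("relative error:", np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true))
```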

Adaptive Bayesian nonparametric regression using kernel mixture of polynomials with application to partial linear model [article]

Fangzheng Xie, Yanxun Xu
2018 arXiv   pre-print
We propose a kernel mixture of polynomials prior for Bayesian nonparametric regression. The regression function is modeled by local averages of polynomials with kernel mixture weights.  ...  We further investigate the application of the kernel mixture of polynomials to the partial linear model and obtain both the near-optimal rate of contraction for the nonparametric component and the Bernstein-von  ...  the least-squares estimate in terms of accuracy.  ...
arXiv:1710.08017v3 fatcat:ngiiqidoondndonmlkxqgfoahu
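
The regression-function form described above can be written down in a few lines: local polynomials combined with normalized kernel mixture weights, f(x) = sum_k w_k(x) p_k(x). Knot placement, bandwidth, and polynomial degree are illustrative assumptions.

```python
import numpy as np

def kernel_mixture_poly(x, knots, coefs, h=0.2):
    """Local polynomials p_k averaged with normalized Gaussian kernel weights."""
    W = np.exp(-np.subtract.outer(x, knots) ** 2 / (2 * h ** 2))
    W /= W.sum(axis=1, keepdims=True)              # kernel mixture weights w_k(x)
    # polynomial attached to each knot, evaluated in local coordinates
    P = np.stack([np.polyval(c, x - mu) for c, mu in zip(coefs, knots)], axis=1)
    return (W * P).sum(axis=1)

knots = np.linspace(0, 1, 5)
coefs = [np.array([1.0, 0.5, np.sin(6 * mu)]) for mu in knots]  # quadratics
x = np.linspace(0, 1, 101)
f = kernel_mixture_poly(x, knots, coefs)
```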

GP CaKe: Effective brain connectivity with causal kernels [article]

Luca Ambrogioni, Max Hinne, Marcel van Gerven, Eric Maris
2017 arXiv   pre-print
We construct a novel class of causal covariance functions that enforce the desired properties of the causal kernels, an approach which we call GP CaKe.  ...  By construction, the model and its hyperparameters have biophysical meaning and are therefore easily interpretable.  ...  Figures 3 and 4: the performance of the recovery of the effective connectivity kernels in terms of the correlation and mean squared error between the actual and the recovered kernels.  ...
arXiv:1705.05603v1 fatcat:6k77dlbecfhghackq4uqe4ucru

Logistic discriminant analysis

Takio Kurita, Kenji Watanabe, Nobuyuki Otsu
2009 2009 IEEE International Conference on Systems, Man and Cybernetics  
Otsu also pointed out that LDA could be regarded as a linear approximation of the ONDA through the linear approximations of the Bayesian posterior probabilities.  ...  Linear discriminant analysis (LDA) is one of the well-known methods to extract the best features for multiclass discrimination.  ...  MLR is known as one of the generalized linear models, which are a flexible generalization of ordinary least squares regression.  ...
doi:10.1109/icsmc.2009.5346255 dblp:conf/smc/KuritaWO09 fatcat:qx33lea2mvhdpbz7bg55dowwye
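
Since the snippet leans on multinomial logistic regression (MLR) as a GLM, a minimal softmax-regression fit by gradient descent makes that side of the comparison concrete. The learning rate, iteration count, and toy data are illustrative assumptions.

```python
import numpy as np

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)      # numerical stabilization
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def fit_mlr(X, y, n_classes, lr=0.1, iters=500):
    """Multinomial logistic regression by full-batch gradient descent."""
    X1 = np.hstack([X, np.ones((len(X), 1))])     # append bias column
    W = np.zeros((X1.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                      # one-hot targets
    for _ in range(iters):
        P = softmax(X1 @ W)
        W -= lr * X1.T @ (P - Y) / len(X)         # negative log-likelihood gradient
    return W

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(m, 0.5, (40, 2)) for m in (-2, 0, 2)])
y = np.repeat([0, 1, 2], 40)
W = fit_mlr(X, y, 3)
```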

On the Linearity of Bayesian Interpolators for Non-Gaussian Continuous-Time AR(1) Processes

Arash Amini, Philippe Thevenaz, John Paul Ward, Michael Unser
2013 IEEE Transactions on Information Theory  
The Bayesian interpolator can be expressed in a convolutive form where the kernel is described in terms of exponential splines.  ...  We redefine the Bayesian estimation problem in the Fourier domain with the help of characteristic forms.  ...  In fact, the posterior mean estimator, which is also referred to as the Bayesian filter, minimizes the mean-square error whenever it is finite.  ...
doi:10.1109/tit.2013.2258371 fatcat:74hmnuj27rdjrhpq4wnbnena7q
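
In the Gaussian special case, where linearity of the Bayesian interpolator is classical, the posterior mean interpolator of a continuous-time AR(1) (Ornstein-Uhlenbeck) process is ordinary GP interpolation with the exponential covariance C(s, t) = exp(-|s - t| / tau). The sample times and tau below are illustrative assumptions.

```python
import numpy as np

def ou_interpolate(t_obs, x_obs, t_query, tau=1.0):
    """Posterior-mean (MMSE) interpolation of an OU process from noiseless samples."""
    C = lambda a, b: np.exp(-np.abs(np.subtract.outer(a, b)) / tau)
    w = np.linalg.solve(C(t_obs, t_obs), x_obs)
    return C(t_query, t_obs) @ w

t_obs = np.array([0.0, 0.7, 1.5, 2.2, 3.0])
x_obs = np.array([0.2, -0.1, 0.4, 0.0, -0.3])
t_query = np.linspace(0, 3, 7)
print(np.round(ou_interpolate(t_obs, x_obs, t_query), 3))
```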

On the positivity and magnitudes of Bayesian quadrature weights

Toni Karvonen, Motonobu Kanagawa, Simo Särkkä
2019 Statistics and computing  
Gaussian and Hardy kernels). This suggests that gradient-based optimisation of design points may be effective in constructing stable and robust Bayesian quadrature rules.  ...  This article reviews and studies the properties of Bayesian quadrature weights, which strongly affect stability and robustness of the quadrature rule.  ...  SS was supported by the Academy of Finland project 313708.  ... 
doi:10.1007/s11222-019-09901-0 fatcat:oqcj74fvdveunn6ny53had26om
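
The weights the paper studies are straightforward to compute and inspect: with a Gaussian kernel and a standard normal measure, the kernel mean has a closed form and the weights solve K w = z. The node placement and length-scale below are illustrative assumptions; the point is simply to look at signs and magnitudes.

```python
import numpy as np

ell = 0.8                                   # assumed kernel length-scale
nodes = np.linspace(-3, 3, 9)               # assumed quadrature nodes
K = np.exp(-np.subtract.outer(nodes, nodes) ** 2 / (2 * ell ** 2))
# kernel mean z_i = E[k(X, x_i)] for X ~ N(0, 1), Gaussian kernel (closed form)
z = ell / np.sqrt(ell**2 + 1) * np.exp(-nodes**2 / (2 * (ell**2 + 1)))
w = np.linalg.solve(K, z)                   # Bayesian quadrature weights
print("weights:", np.round(w, 4))
print("all positive?", bool(np.all(w > 0)), " sum:", w.sum())
```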

Assessing the Performance of Ordinary Least Square and Kernel Regression

Usman U, Garba N, Zoramawa A.B, Usman H
2020 Zenodo  
The predictive performance of Ordinary Least Squares (OLS) and kernel regression was assessed.  ...  The mean square error (MSE) and root mean square error (RMSE) were used to determine the most efficient of the estimated models.  ...  The Root Mean Square Error (RMSE), also called the root mean square deviation (RMSD), is a frequently used measure of the difference between values predicted by a model and  ...
doi:10.5281/zenodo.3764305 fatcat:wugyrmd27ndybiz4outzvwdj4u
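
A small reproduction of the comparison described: fit OLS and a Nadaraya-Watson kernel regression to the same data and compare in-sample RMSE. The data-generating curve and bandwidth are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 1, 150))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(150)

# OLS with a linear model
A = np.column_stack([x, np.ones_like(x)])
yhat_ols = A @ np.linalg.lstsq(A, y, rcond=None)[0]

# Nadaraya-Watson kernel regression (Gaussian kernel)
h = 0.05
W = np.exp(-np.subtract.outer(x, x) ** 2 / (2 * h ** 2))
yhat_ker = W @ y / W.sum(axis=1)

rmse = lambda e: np.sqrt(np.mean(e ** 2))
print("RMSE OLS:", rmse(y - yhat_ols), " RMSE kernel:", rmse(y - yhat_ker))
```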

Kernel method for corrections to scaling

Kenji Harada
2015 Physical Review E  
In all cases, as the precision of the example data increases, the inference results of the new kernel method converge correctly.  ...  We propose a new kernel method based on Gaussian process regression to fix this problem generally. We test the performance of the new kernel method for some example cases.  ...  Acknowledgments: The author thanks Keith Slevin for enlightening discussions of the least-squares method. The author also thanks Naoki Kawashima for fruitful discussions.  ...
doi:10.1103/physreve.92.012106 pmid:26274124 fatcat:lad245xnqje4lnivismypgvwj4
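
The convergence claim can be illustrated with plain GP regression: as the noise on the example data shrinks, the posterior mean approaches the true curve. The kernel, noise grid, and test function are illustrative assumptions, not the paper's finite-size-scaling setup.

```python
import numpy as np

def gp_mean(x, y, xq, noise, ell=0.3):
    """GP posterior mean with a Gaussian kernel and known noise level."""
    k = lambda a, b: np.exp(-np.subtract.outer(a, b) ** 2 / (2 * ell ** 2))
    return k(xq, x) @ np.linalg.solve(k(x, x) + noise**2 * np.eye(len(x)), y)

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 40)
f = np.sin(4 * x)
xq = np.linspace(0, 1, 200)
for noise in (0.3, 0.1, 0.03, 0.01):
    y = f + noise * rng.standard_normal(len(x))
    err = np.max(np.abs(gp_mean(x, y, xq, noise) - np.sin(4 * xq)))
    print(f"noise={noise:5.2f}  max error={err:.4f}")
```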
Showing results 1 — 15 out of 26,457