
Second-Order Non-Stationary Online Learning for Regression [article]

Nina Vaits, Edward Moroshko, Koby Crammer
2013 arXiv   pre-print
We introduce two novel algorithms for online regression, designed to work well in non-stationary environments.  ...  The goal of a learner, in standard online learning, is to have a cumulative loss not much larger than that of the best-performing function from some fixed class.  ...  We consider the classical problem of online learning for regression.  ...
arXiv:1303.0140v1 fatcat:6vmvxlpq3narxaavbndragrlie

Continuous Adaptation with Online Meta-Learning for Non-Stationary Target Regression Tasks

Taku Yamagata, Raúl Santos-Rodríguez, Peter Flach
2022 Signals  
CORAL is based on Bayesian linear regression with a sliding window and offline/online meta-learning.  ...  Being able to adapt to such non-stationary environments is vital for real-world applications of many machine learning algorithms.  ...
doi:10.3390/signals3010006 fatcat:7xurggk2xfaxrewyhkk34loqby
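
The CORAL entry above rests on Bayesian linear regression over a sliding window of recent samples. A minimal sketch of that ingredient, assuming illustrative window, prior and noise settings and leaving out the paper's offline/online meta-learning, could look like:

import numpy as np

class SlidingWindowBayesLinReg:
    """Bayesian linear regression refit on the last `window` samples only."""
    def __init__(self, dim, window=50, prior_var=1.0, noise_var=0.1):
        self.dim, self.window = dim, window
        self.prior_var, self.noise_var = prior_var, noise_var
        self.X, self.y = [], []

    def update(self, x, y):
        self.X.append(np.asarray(x, dtype=float))
        self.y.append(float(y))
        if len(self.X) > self.window:        # drop the oldest sample
            self.X.pop(0)
            self.y.pop(0)

    def posterior(self):
        X, y = np.vstack(self.X), np.asarray(self.y)
        # Gaussian posterior over the weights given only the windowed data.
        A = X.T @ X / self.noise_var + np.eye(self.dim) / self.prior_var
        cov = np.linalg.inv(A)
        mean = cov @ X.T @ y / self.noise_var
        return mean, cov

    def predict(self, x):
        mean, cov = self.posterior()
        x = np.asarray(x, dtype=float)
        return x @ mean, x @ cov @ x + self.noise_var   # predictive mean, variance

Restricting the posterior to the most recent window is what lets the regressor track a drifting target, at the cost of discarding older evidence.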

Dual Memory Architectures for Fast Deep Learning of Stream Data via an Online-Incremental-Transfer Strategy [article]

Sang-Woo Lee, Min-Oh Heo, Jiwon Kim, Jeonghee Kim, Byoung-Tak Zhang
2015 arXiv   pre-print
In this paper, we introduce dual memory architectures for online incremental deep learning.  ...  During the training phase, we use various online, incremental-ensemble, and transfer learning techniques in order to achieve a lower error for the architecture.  ...  We learn the online parameters of the neural networks from a limited amount of data, which works reasonably well for stationary data but does not work well for non-stationary data.  ...
arXiv:1506.04477v1 fatcat:ryij5slnsjhi7acauu3bhnfkry

Non-stationary Online Regression [article]

Anant Raj, Pierre Gaillard
2020 arXiv   pre-print
We show that an expected cumulative error of order Õ(n^{1/3} C_n^{2/3}) can be obtained for non-stationary online linear regression where the total variation of the parameter sequence is bounded by C_n.  ...  To the best of our knowledge, this work is the first extension of non-stationary online regression to non-stationary kernel regression.  ...  In this section, we discuss the more general problem of non-stationary online regression.  ...
arXiv:2011.06957v1 fatcat:ej3nsvbpendvpngf64tpxukrtu
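
To make the rate quoted in the entry above concrete, the usual bookkeeping is a cumulative error measured against a drifting comparator sequence together with the total variation of that sequence. One standard way to write such a guarantee (the notation below is generic, not copied from the paper) is:

% dynamic cumulative error against a drifting comparator sequence \theta_1^\star,\dots,\theta_n^\star
R_n = \sum_{t=1}^{n} \big(\hat y_t - y_t\big)^2 \;-\; \sum_{t=1}^{n} \big(\langle \theta_t^\star, x_t\rangle - y_t\big)^2,
\qquad
C_n = \sum_{t=2}^{n} \big\lVert \theta_t^\star - \theta_{t-1}^\star \big\rVert,
\qquad
\mathbb{E}[R_n] \le \tilde{O}\!\big(n^{1/3} C_n^{2/3}\big).

The larger the allowed parameter drift C_n, the weaker the guarantee, which is the price of competing with a moving target rather than a fixed one.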

Large Scale Online Multiple Kernel Regression with Application to Time-Series Prediction

Doyen Sahoo, Steven C. H. Hoi, Bin Li
2019 ACM Transactions on Knowledge Discovery from Data  
The first-order algorithms were further advanced by second-order algorithms that improved convergence by considering second-order information [12].  ...  can be non-stationary and the optimal kernel function may change over time.  ...  learning process especially when dealing with non-stationary data.  ...
doi:10.1145/3299875 fatcat:hadrjmqb35ggjf776usu43at5i

Adaptive and on-line learning in non-stationary environments

Edwin Lughofer, Moamar Sayed-Mouchaweh
2015 Evolving Systems  
Therefore, a balance between continuous learning and "forgetting" is necessary to deal with non-stationary environments.  ...  Incremental and sequential learning are essential concepts in order to avoid time-intensive re-training phases and account for the system's dynamics/changing data characteristics with low computational  ...  In closing, we hope that this special issue sheds light on some novel works on adaptive and online learning in non-stationary environments.  ...
doi:10.1007/s12530-015-9128-2 fatcat:7umh4wifbba2jfblppbtepmyjy

Adaptive regularization for Lasso models in the context of nonstationary data streams

Ricardo P. Monti, Christoforos Anagnostopoulos, Giovanni Montana
2018 Statistical analysis and data mining  
This serves to reformulate the choice of the regularization parameter in a principled framework for online learning.  ...  The proposed method is derived for linear regression and subsequently extended to generalized linear models.  ...  Second, data streams are often non-stationary and rarely satisfy the i.i.d. assumptions required for methods based on the bootstrap [1].  ...
doi:10.1002/sam.11390 fatcat:b6pxcpey2bhjbmzcke6gsca45q
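
The entry above treats the choice of the Lasso penalty as an online learning problem in its own right. A rough sketch of that idea, assuming a proximal-gradient Lasso update and a discounted prequential error over a small grid of candidate penalties rather than the paper's gradient-based adaptation, might be:

import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 penalty (elementwise soft-thresholding)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

class StreamingLasso:
    """A single Lasso model trained by proximal stochastic gradient descent."""
    def __init__(self, dim, lam, step=0.01):
        self.w, self.lam, self.step = np.zeros(dim), lam, step

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        grad = (self.predict(x) - y) * np.asarray(x, dtype=float)
        self.w = soft_threshold(self.w - self.step * grad, self.step * self.lam)

class AdaptivePenaltyLasso:
    """Tracks the discounted prequential error of a few candidate penalties and
    predicts with the current best; a simple stand-in for adaptive regularization."""
    def __init__(self, dim, lambdas=(0.001, 0.01, 0.1, 1.0), discount=0.99):
        self.models = [StreamingLasso(dim, lam) for lam in lambdas]
        self.errors = np.zeros(len(lambdas))
        self.discount = discount

    def predict(self, x):
        return self.models[int(np.argmin(self.errors))].predict(x)

    def update(self, x, y):
        for i, m in enumerate(self.models):
            err = (m.predict(x) - y) ** 2          # error recorded before seeing the label
            self.errors[i] = self.discount * self.errors[i] + (1 - self.discount) * err
            m.update(x, y)

The exponential discounting of the prequential errors is what keeps the penalty choice responsive when the stream is non-stationary.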

Re-adapting the Regularization of Weights for Non-stationary Regression [chapter]

Nina Vaits, Koby Crammer
2011 Lecture Notes in Computer Science  
We introduce a new algorithm for regression that uses a per-feature learning rate and provide a regret bound with respect to the best sequence of functions with drift.  ...  We also sketch an algorithm that achieves the best of both worlds: in the stationary setting it has log(T) regret, while in the non-stationary setting it has sub-linear regret.  ...  Our work combines both techniques with a second-order online learning algorithm for regression.  ...
doi:10.1007/978-3-642-24412-4_12 fatcat:elbu3qik5zhv3a6t3s4ivtlwvm
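
The per-feature learning rate mentioned in the entry above can be illustrated with a diagonal, AdaGrad-style scaling; this is only a stand-in, since the chapter's algorithm additionally re-adapts the regularization to cope with drift:

import numpy as np

class PerFeatureRateRegressor:
    """Online squared-loss regression with a separate step size per feature,
    derived from accumulated squared gradients (AdaGrad-style diagonal scaling)."""
    def __init__(self, dim, eta=0.5, eps=1e-8):
        self.w = np.zeros(dim)
        self.g2 = np.full(dim, eps)       # per-feature sum of squared gradients
        self.eta = eta

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        grad = (self.predict(x) - y) * x
        self.g2 += grad ** 2
        # Frequently active features accumulate large g2 and therefore get smaller steps.
        self.w -= self.eta * grad / np.sqrt(self.g2)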

Adaptive regularization for Lasso models in the context of non-stationary data streams [article]

Ricardo Pio Monti, Christoforos Anagnostopoulos, Giovanni Montana
2017 arXiv   pre-print
This serves to reformulate the choice of the regularization parameter in a principled framework for online learning.  ...  The proposed method is derived for linear regression and subsequently extended to generalized linear models.  ...  Second, data streams are often non-stationary and rarely satisfy the i.i.d. assumptions required for methods based on the bootstrap [Aggarwal, 2007].  ...
arXiv:1610.09127v2 fatcat:izbgtvo6vvh2hj26k7d2o5h5di

Online multiple kernel regression

Doyen Sahoo, Steven C.H. Hoi, Bin Li
2014 Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '14  
Kernel-based regression represents an important family of learning techniques for solving challenging regression tasks with non-linear patterns.  ...  To overcome these drawbacks, this paper presents a novel scheme of Online Multiple Kernel Regression (OMKR), which sequentially learns the kernel-based regressor in an online and scalable fashion, and  ...  learning process especially when dealing with non-stationary data.  ... 
doi:10.1145/2623330.2623712 dblp:conf/kdd/SahooHL14 fatcat:zqni4wpckngyvpkmcnwgud7fnm
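
The OMKR scheme in the entry above maintains several kernel-based regressors and combines them online. A rough sketch, assuming RBF kernels at a few bandwidths, a fixed support-vector budget, and a Hedge-style weighting of the per-kernel learners (the paper's combination and budget-maintenance strategies differ), might look like:

import numpy as np

def rbf_kernel(gamma):
    return lambda a, b: float(np.exp(-gamma * np.sum((np.asarray(a) - np.asarray(b)) ** 2)))

class OnlineKernelRegressor:
    """Single-kernel regression by online functional gradient descent, keeping
    at most `budget` support vectors (oldest dropped first)."""
    def __init__(self, kernel, step=0.2, budget=100):
        self.kernel, self.step, self.budget = kernel, step, budget
        self.sv, self.alpha = [], []

    def predict(self, x):
        return sum(a * self.kernel(s, x) for s, a in zip(self.sv, self.alpha))

    def update(self, x, y):
        err = self.predict(x) - y
        self.sv.append(np.asarray(x, dtype=float))
        self.alpha.append(-self.step * err)
        if len(self.sv) > self.budget:
            self.sv.pop(0)
            self.alpha.pop(0)

class HedgeMKR:
    """Weights the per-kernel learners by their cumulative squared losses,
    Hedge-style; losses are assumed bounded for the weights to stay meaningful."""
    def __init__(self, gammas=(0.1, 1.0, 10.0), hedge_rate=0.5):
        self.learners = [OnlineKernelRegressor(rbf_kernel(g)) for g in gammas]
        self.logw = np.zeros(len(self.learners))
        self.hedge_rate = hedge_rate

    def predict(self, x):
        w = np.exp(self.logw - self.logw.max())
        w /= w.sum()
        return float(sum(wi * l.predict(x) for wi, l in zip(w, self.learners)))

    def update(self, x, y):
        for i, l in enumerate(self.learners):
            self.logw[i] -= self.hedge_rate * (l.predict(x) - y) ** 2
            l.update(x, y)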

Input Warping for Bayesian Optimization of Non-stationary Functions [article]

Jasper Snoek, Kevin Swersky, Richard S. Zemel, Ryan P. Adams
2014 arXiv   pre-print
One of the most frequently occurring of these is the class of non-stationary functions.  ...  We develop a methodology for automatically learning a wide family of bijective transformations or warpings of the input space using the Beta cumulative distribution function.  ...
arXiv:1402.0929v3 fatcat:eslvvcqxpvgs7nfl4hn5k5b3zq
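
The warping described above passes each (rescaled) input dimension through a Beta CDF so that a stationary GP kernel on the warped inputs can model a non-stationary function. A small sketch with made-up shape parameters (in the paper these are treated as GP hyperparameters and inferred, not fixed):

import numpy as np
from scipy.stats import beta

def warp_inputs(X, alphas, betas):
    """Warp each input dimension, assumed rescaled to [0, 1], through a Beta CDF."""
    X = np.asarray(X, dtype=float)
    warped = np.empty_like(X)
    for d in range(X.shape[1]):
        warped[:, d] = beta.cdf(X[:, d], alphas[d], betas[d])
    return warped

# Example: alpha < 1 < beta stretches the region near 0 and compresses the region
# near 1, effectively giving the original input a location-dependent length-scale.
X = np.random.default_rng(0).uniform(size=(5, 2))
print(warp_inputs(X, alphas=[0.5, 2.0], betas=[2.0, 0.5]))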

Selective Sampling with Drift [article]

Edward Moroshko, Koby Crammer
2014 arXiv   pre-print
Recently there has been much work on selective sampling, an online active learning setting, in which algorithms work in rounds. On each round an algorithm receives an input and makes a prediction. Most of this work is focused on the stationary case, where it is assumed that there is a fixed target model, and the performance of the algorithm is compared to a fixed model.  ...  Table 1 of the paper groups fully supervised online RLS-based second-order algorithms by task and setting: regression in the stationary [23, 2, 11] and non-stationary [16, 22] cases, and classification in the stationary case [5, 8], with this work filling the non-stationary classification cell.  ...
arXiv:1402.4084v1 fatcat:vwxpy5pgunfktbfiewk5mkcy6a
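
The selective-sampling protocol described in the entry above (predict every round, but only occasionally pay for the true label) is commonly implemented with a margin-based query rule. The sketch below shows such a rule for online linear classification; it is a generic illustration, not the drift-aware second-order algorithm proposed in the paper:

import numpy as np

class SelectiveSamplingPerceptron:
    """Queries the label with probability b / (b + |margin|), so confident
    rounds rarely trigger a (costly) label request."""
    def __init__(self, dim, b=1.0, rate=0.1, seed=0):
        self.w = np.zeros(dim)
        self.b, self.rate = b, rate
        self.rng = np.random.default_rng(seed)
        self.queries = 0

    def round(self, x, label_oracle):
        x = np.asarray(x, dtype=float)
        margin = float(self.w @ x)
        prediction = 1 if margin >= 0 else -1
        if self.rng.random() < self.b / (self.b + abs(margin)):
            self.queries += 1
            y = label_oracle()               # true label in {-1, +1}, fetched only when queried
            if y * margin <= 0:              # perceptron-style update on mistakes
                self.w += self.rate * y * x
        return prediction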

Nonstationary Nonparametric Online Learning: Balancing Dynamic Regret and Model Parsimony [article]

Amrit Singh Bedi, Alec Koppel, Ketan Rajawat, Brian M. Sadler
2019 arXiv   pre-print
Experiments demonstrate the usefulness of the proposed technique for online nonlinear regression and classification problems with non-stationary data.  ...  Our approach hinges upon viewing non-stationary learning as online convex optimization with dynamic comparators, for which performance is quantified by dynamic regret.  ...  In this work, we focused on non-stationary learning, for which we proposed an online universal function approximator based on compressed kernel methods.  ...
arXiv:1909.05442v1 fatcat:3b2afktnvbcrzds6xbk3xgcdz4

Learning a Generator Model from Terminal Bus Data [article]

Nikolay Stulov, Dejan J Sobajic, Yury Maximov, Deepjyoti Deka, Michael Chertkov
2019 arXiv   pre-print
Two ML techniques were developed and tested: (a) a standard vector auto-regressive (VAR) model; and (b) a novel customized long short-term memory (LSTM) deep learning model.  ...  The goal is to develop an emulator which is trained online and is capable of fast predictive computations.  ...  The auto-regressive (AR) model is a proven tool in many stationary time-series applications.  ...
arXiv:1901.00781v1 fatcat:hs36dddg2fg3pnxlvdyfq5uqzy
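
Of the two techniques in the entry above, the auto-regressive baseline is the simpler one: predict the next sample as a linear combination of the previous p samples, with coefficients fit by least squares. A minimal univariate sketch (the paper's VAR model is the multivariate analogue, and its LSTM emulator is not shown here):

import numpy as np

def fit_ar(series, order):
    """Fit y_t = c + a_1*y_{t-1} + ... + a_p*y_{t-p} by ordinary least squares."""
    y = np.asarray(series, dtype=float)
    # Each column k holds the lag-(k+1) values aligned with the targets y[order:].
    X = np.column_stack([y[order - k - 1 : len(y) - k - 1] for k in range(order)])
    X = np.column_stack([np.ones(len(X)), X])        # intercept column
    coef, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
    return coef                                      # [c, a_1, ..., a_p]

def predict_next(series, coef):
    """One-step-ahead prediction from the p most recent observations."""
    p = len(coef) - 1
    recent = np.asarray(series, dtype=float)[-p:][::-1]   # y_{t-1}, ..., y_{t-p}
    return float(coef[0] + coef[1:] @ recent)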

Estimation of the forgetting factor in kernel recursive least squares

Steven Van Vaerenbergh, Ignacio Santamaria, Miguel Lazaro-Gredilla
2012 2012 IEEE International Workshop on Machine Learning for Signal Processing  
In order to guarantee optimal performance, its parameters need to be determined, specifically its kernel parameters, regularization and, most importantly in non-stationary environments, its forgetting factor  ...  In a recent work we proposed a kernel recursive least-squares tracker (KRLS-T) algorithm that is capable of tracking in non-stationary environments, thanks to a forgetting mechanism built on a Bayesian  ...  The standard GP framework that has been described in Section 2 is designed to deal with stationary regression  ...
doi:10.1109/mlsp.2012.6349749 dblp:conf/mlsp/VaerenberghSL12 fatcat:vvqipnb3i5c27dfqojyzmzjr6u
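
The forgetting factor whose estimation the entry above addresses is easiest to see in the classical linear recursive least-squares recursion, where past samples are exponentially down-weighted. The sketch below shows that linear recursion only; KRLS-T applies an analogous Bayesian forgetting mechanism inside a kernel (GP) model:

import numpy as np

class ForgettingRLS:
    """Linear RLS with exponential forgetting: lam close to 1 forgets slowly
    (near-stationary), smaller lam tracks changes faster but is noisier."""
    def __init__(self, dim, lam=0.98, delta=100.0):
        self.w = np.zeros(dim)
        self.P = delta * np.eye(dim)          # (scaled) inverse correlation matrix
        self.lam = lam

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)          # gain vector
        e = y - self.w @ x                    # a priori prediction error
        self.w = self.w + k * e
        self.P = (self.P - np.outer(k, Px)) / self.lam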
Showing results 1–15 of 17,690