A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Nonstationary Nonparametric Online Learning: Balancing Dynamic Regret and Model Parsimony
[article] arXiv pre-print, 2019
An open challenge in supervised learning is conceptual drift: a data point is initially classified under one label, but over time the notion of that label changes. Beyond linear autoregressive models, transfer and meta-learning address drift, but they require training data representative of the disparate domains at the outset. To relax this requirement, we propose a memory-efficient online universal function approximator based on compressed kernel methods. Our approach hinges upon viewing [...]
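The abstract names compressed kernel methods as the route to memory efficiency, but the excerpt cuts off before any detail. As a rough illustration only, and not the paper's actual algorithm, the sketch below runs online kernel regression by functional stochastic gradient descent and then greedily prunes dictionary atoms whose removal perturbs the learned function by less than a tolerance `eps`. All class, function, and parameter names here are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

class CompressedOnlineKernelRegressor:
    """Hypothetical sketch: online kernel least-squares via functional SGD,
    with a compressed dictionary. After each update, atoms whose removal
    (after refitting the remaining weights) changes the function by less
    than `eps` are pruned, keeping the model parsimonious."""

    def __init__(self, gamma=1.0, lr=0.1, eps=1e-3):
        self.gamma, self.lr, self.eps = gamma, lr, eps
        self.D = np.empty((0, 0))   # dictionary points, one per row
        self.w = np.empty(0)        # kernel expansion weights

    def predict(self, x):
        if self.w.size == 0:
            return 0.0
        k = rbf_kernel(x[None, :], self.D, self.gamma).ravel()
        return float(k @ self.w)

    def update(self, x, y):
        # Functional SGD step: append the new point with a weight
        # proportional to the negative prediction error.
        err = self.predict(x) - y
        if self.D.size == 0:
            self.D = x[None, :]
            self.w = np.array([-self.lr * err])
        else:
            self.D = np.vstack([self.D, x])
            self.w = np.append(self.w, -self.lr * err)
        self._compress()

    def _compress(self):
        # Greedy destructive pruning: repeatedly drop the atom whose
        # removal perturbs the function least, as long as the
        # perturbation (measured at the dictionary points) stays < eps.
        while self.w.size > 1:
            K = rbf_kernel(self.D, self.D, self.gamma)
            f = K @ self.w  # current function values at dictionary points
            best_j, best_err, best_w = None, np.inf, None
            for j in range(self.w.size):
                keep = np.arange(self.w.size) != j
                Kk = K[np.ix_(keep, keep)]
                # Refit remaining weights to best match f on kept atoms.
                w_new = np.linalg.lstsq(Kk, f[keep], rcond=None)[0]
                resid = np.linalg.norm(K[:, keep] @ w_new - f)
                if resid < best_err:
                    best_j, best_err, best_w = j, resid, w_new
            if best_err < self.eps:
                keep = np.arange(self.w.size) != best_j
                self.D, self.w = self.D[keep], best_w
            else:
                break
```

On a drifting stream one would call `update(x_t, y_t)` once per sample, so the dictionary grows only when the current regime demands new atoms; the tolerance `eps` trades approximation accuracy against dictionary size, mirroring the regret-versus-parsimony balance named in the title.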
arXiv:1909.05442v1
fatcat:3b2afktnvbcrzds6xbk3xgcdz4