A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
The file type is application/pdf.
Learning Data Representations with Sparse Coding Neural Gas
2008
The European Symposium on Artificial Neural Networks
We consider the problem of learning an unknown (overcomplete) basis from data generated as unknown sparse linear combinations of its elements. Introducing the "sparse coding neural gas" algorithm, we show how to employ a combination of the original neural gas algorithm and Oja's rule in order to learn a simple sparse code that represents each training sample by a multiple of a single basis vector. We then generalise this algorithm using orthogonal matching pursuit in order to learn a sparse code where each training sample is represented by a linear combination of a fixed number of basis vectors.
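To make the abstract's core idea more concrete, the sketch below illustrates how a neural-gas ranking can be combined with an Oja-style update to learn unit-norm basis vectors from data. This is a minimal illustration under assumed details, not the authors' published algorithm: the function name, the exponential annealing schedules, and all parameter values (lr0, lambda0, etc.) are assumptions chosen for readability; the OMP-based generalisation mentioned in the abstract is not shown.

```python
import numpy as np

def sparse_coding_neural_gas_sketch(X, n_basis, n_epochs=50,
                                    lr0=0.1, lr_final=1e-3,
                                    lambda0=None, lambda_final=0.01,
                                    seed=0):
    """Illustrative (hypothetical) neural-gas / Oja-style basis learner.

    X        : (n_samples, n_features) training data, roughly zero-mean.
    n_basis  : number of (possibly overcomplete) basis vectors to learn.
    Returns a (n_basis, n_features) matrix with unit-norm rows.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape

    # Random unit-norm initialisation of the basis.
    W = rng.standard_normal((n_basis, n_features))
    W /= np.linalg.norm(W, axis=1, keepdims=True)

    if lambda0 is None:
        lambda0 = n_basis / 2.0  # assumed initial neighbourhood size

    t_max = n_epochs * n_samples
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(n_samples):
            x = X[i]
            # Annealed learning rate and neighbourhood range,
            # following the usual neural gas schedules (assumed here).
            frac = t / t_max
            lr = lr0 * (lr_final / lr0) ** frac
            lam = lambda0 * (lambda_final / lambda0) ** frac

            # Rank basis vectors by how well they match x
            # (absolute value of the projection onto each vector).
            y = W @ x
            order = np.argsort(-np.abs(y))

            # Oja-like update, softened over the ranking as in neural gas,
            # so each sample is ultimately coded by one scaled basis vector.
            for rank, j in enumerate(order):
                h = np.exp(-rank / lam)
                W[j] += lr * h * y[j] * (x - y[j] * W[j])
                W[j] /= np.linalg.norm(W[j])  # keep the row at unit norm
            t += 1
    return W
```

As a usage sketch, calling `sparse_coding_neural_gas_sketch(X, n_basis=2 * X.shape[1])` would attempt to learn a twofold overcomplete basis; replacing the single-winner coding step with orthogonal matching pursuit (selecting several basis vectors per sample) corresponds to the generalisation described in the abstract.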
dblp:conf/esann/LabuschBM08
fatcat:pkdw6pj6hfbvnmrnl4bzbpwjei