Minimum Error Entropy Algorithms with Sparsity Penalty Constraints

Zongze Wu, Siyuan Peng, Wentao Ma, Badong Chen, Jose Principe
Entropy, 2015
Recently, sparse adaptive learning algorithms have been developed to exploit system sparsity and to mitigate various noise disturbances in many applications. In particular, in sparse channel estimation, a parameter vector with sparse structure can be well estimated from noisy measurements by a sparse adaptive filter. Most previous works use a mean square error (MSE) based cost to develop sparse filters, which is rational under the assumption of Gaussian distributions. However, the Gaussian assumption does not always hold in real-world environments. To address this issue, in this work we incorporate an l1-norm or a reweighted l1-norm into the minimum error entropy (MEE) criterion to develop new sparse adaptive filters, which may perform much better than MSE-based methods, especially in heavy-tailed non-Gaussian situations, since the error entropy can capture higher-order statistics of the errors. In addition, a new approximator of the l0-norm, based on the correntropy induced metric (CIM), is used as a sparsity penalty term (SPT). We analyze the mean square convergence of the proposed sparse adaptive filters: an energy conservation relation is derived and a sufficient condition ensuring mean square convergence is obtained. Simulation results confirm the superior performance of the new algorithms.

Keywords: sparse estimation; minimum error entropy; correntropy induced metric; mean square convergence; impulsive noise

MSC Codes: 62B10
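The core recipe the abstract describes (an MEE cost plus a sparsity penalty) can be sketched briefly. The Python sketch below is illustrative only, not the paper's exact algorithms: the function names `mee_l1_update` and `cim_l0_grad`, the kernel widths, step sizes, and the demo system are my own assumptions. It implements a sliding-window stochastic-gradient MEE filter with a zero-attracting l1 term, plus the gradient of a CIM-based l0 approximator that could replace the `sign(w)` term.

```python
import numpy as np

def mee_l1_update(w, X, d, mu=0.2, lam=1e-4, sigma=1.0):
    """One stochastic-gradient step of an MEE adaptive filter with a
    zero-attracting l1 penalty, over a sliding window of L samples.

    Maximizes the quadratic information potential
        V(e) = (1/L^2) * sum_{i,j} exp(-(e_i - e_j)^2 / (2*sigma^2))
    (equivalent to minimizing Renyi's quadratic error entropy) while
    shrinking w toward zero via the sub-gradient of lam * ||w||_1.
    Parameter names and defaults are illustrative assumptions."""
    L = len(d)
    e = d - X @ w                          # window of a-priori errors
    de = e[:, None] - e[None, :]           # pairwise differences e_i - e_j
    kern = np.exp(-de**2 / (2.0 * sigma**2))
    dx = X[:, None, :] - X[None, :, :]     # pairwise x_i - x_j
    # dV/dw = (1/(sigma^2 L^2)) * sum_{i,j} kern * (e_i - e_j) * (x_i - x_j)
    grad_V = ((kern * de)[:, :, None] * dx).sum(axis=(0, 1))
    grad_V /= sigma**2 * L**2
    return w + mu * grad_V - mu * lam * np.sign(w)

def cim_l0_grad(w, sigma_c=0.05):
    """Gradient of a CIM-based l0 approximator,
        CIM^2(w, 0) = (1/M) * sum_i (1 - exp(-w_i^2 / (2*sigma_c^2))),
    with an unnormalized Gaussian kernel; for small sigma_c this
    approaches ||w||_0. Swapping it in for sign(w) above gives a
    CIM-penalized variant of the update."""
    return (w / (len(w) * sigma_c**2)) * np.exp(-w**2 / (2.0 * sigma_c**2))

# Demo (hypothetical setup): identify a sparse 8-tap system from noisy data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, 0, 0, 0.5, 0, 0, 0, 0])
w = np.zeros(8)
msd_init = np.sum((w - w_true)**2)
for _ in range(300):
    X = rng.standard_normal((10, 8))                  # window of inputs
    d = X @ w_true + 0.01 * rng.standard_normal(10)   # noisy outputs
    w = mee_l1_update(w, X, d)
msd_final = np.sum((w - w_true)**2)
```

Because the information potential depends only on pairwise error differences, the update is insensitive to occasional large outliers (their kernel values are near zero), which is the intuition behind the claimed robustness in heavy-tailed noise.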
doi:10.3390/e17053419