
Group-Constrained Maximum Correntropy Criterion Algorithms for Estimating Sparse Mix-Noised Channels

Yanyan Wang, Yingsong Li, Felix Albu, Rui Yang
Entropy, 2017
A group-constrained maximum correntropy criterion (GC-MCC) algorithm is proposed on the basis of the compressive sensing (CS) concept and zero attracting (ZA) techniques, and its estimation behavior is verified over sparse multi-path channels. The proposed algorithm is implemented by exerting different norm penalties on the two grouped channel coefficients to improve the channel estimation performance in a mixed noise environment. As a result, a zero attraction term is obtained from the expected l0 and l1 penalty techniques. Furthermore, a reweighting factor is adopted and incorporated into the zero-attraction term of the GC-MCC algorithm, which is then denoted as the reweighted GC-MCC (RGC-MCC) algorithm, to enhance the estimation performance. Both the GC-MCC and RGC-MCC algorithms are developed to exploit the inherent sparseness of sparse multi-path channels through the expected zero-attraction terms in their iterations. The channel estimation behaviors are discussed and analyzed over sparse channels in mixed Gaussian noise environments. The computer simulation results show that the estimated steady-state error is smaller and the convergence is faster than those of the previously reported MCC and sparse MCC algorithms.

The least mean square (LMS) algorithm has been extensively investigated in channel estimation and noise cancellation owing to its simple implementation, high stability and fast convergence speed [4,8]. However, its performance is not satisfactory for sparse channel estimation at a low signal-to-noise ratio (SNR). Recently, the compressive sampling (CS) concept has been introduced into adaptive filtering (AF) algorithms to handle sparse signals [9,10]. After that, Y. Chen et al. put forward the zero attracting LMS (ZA-LMS) and reweighted ZA-LMS (RZA-LMS) algorithms [11]. The ZA-LMS and RZA-LMS algorithms are implemented by integrating the l1-norm and the reweighted l1-norm into the LMS cost function, respectively. These two algorithms achieve a lower steady-state error and faster convergence than the basic LMS algorithm when handling sparse signals, owing to their constructed zero attractors. Moreover, the l0-norm and lp-norm have also been introduced into the LMS cost function to improve on the ZA- and RZA-LMS algorithms in the sparse signal processing area [12-16].
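As a concrete illustration (not code from the paper), the ZA-LMS and RZA-LMS tap updates described above can be sketched in NumPy; the step size `mu`, zero-attraction strength `rho` and reweighting constant `eps` are illustrative choices:

```python
import numpy as np

def za_lms_step(w, x, d, mu=0.01, rho=1e-4):
    """One ZA-LMS iteration: the LMS gradient step plus an l1-norm
    zero attractor that pulls every tap toward zero."""
    e = d - w @ x                      # a-priori estimation error
    return w + mu * e * x - rho * np.sign(w)

def rza_lms_step(w, x, d, mu=0.01, rho=1e-4, eps=10.0):
    """One RZA-LMS iteration: the reweighted attractor shrinks small
    taps strongly while leaving large taps almost untouched."""
    e = d - w @ x
    return w + mu * e * x - rho * np.sign(w) / (1.0 + eps * np.abs(w))
```

The zero-attraction term biases the nonzero taps slightly (by roughly rho/mu at steady state), which is the price paid for faster shrinkage of the many zero taps of a sparse channel.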
All of these norm-constrained LMS algorithms can effectively exploit the sparse characteristics of in-nature sparse channels. However, they share a common drawback: sensitivity to input signal scaling (ISS) and noise interference. In order to reduce the effects of ISS, several improved AF algorithms have been presented that use high-order error criteria or mixed error norms, such as the least mean fourth (LMF) and least mean square-fourth (LMS/F) algorithms [17-20]. Similarly, their sparse forms have also been developed based on the above-mentioned norm penalties [17,21-27]. However, those AF algorithms and their sparse forms are still not good enough for estimating sparse channels under non-Gaussian or mixed noise environments. In recent years, information-theoretic quantities have been used to construct cost functions in adaptive systems. The effective entropy-based AF algorithms include the maximum correntropy criterion (MCC) and the minimum error entropy (MEE) [28-32]. In [28], it is shown that the MEE incurs a higher computational overhead than the MCC algorithm. Therefore, the MCC algorithm has been extensively developed for non-Gaussian environments [29,31,32]. Furthermore, the MCC has a low complexity, nearly the same as that of LMS-like algorithms. However, the performance of the MCC algorithm may degrade for sparse signal processing. In order to enhance the MCC algorithm for handling sparse signals and sparse system identification, l1-norm and reweighted l1-norm constraints have been exerted on the channel coefficient vector and integrated into the MCC cost function. Similar to the ZA-LMS and RZA-LMS algorithms, the zero attracting MCC (ZA-MCC) and reweighted ZA-MCC (RZA-MCC) algorithms [33] were obtained within the zero attracting framework.
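A minimal sketch of the MCC update and its zero-attracting variant may help here (assumed parameter values; not the paper's exact derivation). The Gaussian kernel weight exp(-e^2/(2*sigma^2)) is what makes the update robust: large, impulse-driven errors are scaled toward zero instead of amplifying the step, which is exactly where LMS-family algorithms fail in mixed noise:

```python
import numpy as np

def mcc_step(w, x, d, mu=0.05, sigma=1.0):
    """One MCC iteration: the correntropy kernel down-weights
    updates driven by large (impulsive) errors."""
    e = d - w @ x
    k = np.exp(-e**2 / (2.0 * sigma**2))   # Gaussian kernel weight
    return w + mu * k * e * x

def za_mcc_step(w, x, d, mu=0.05, sigma=1.0, rho=1e-4):
    """ZA-MCC: the MCC update plus an l1-norm zero attractor,
    mirroring how ZA-LMS extends LMS."""
    e = d - w @ x
    k = np.exp(-e**2 / (2.0 * sigma**2))
    return w + mu * k * e * x - rho * np.sign(w)
```

For an impulse with error magnitude 10 and sigma = 1, the kernel weight is exp(-50), so the outlier sample is effectively ignored, while small errors give a weight near 1 and an almost plain LMS step.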
Then, the normalized MCC (NMCC) algorithm was presented [34,35] by analogy with the normalized least mean square (NLMS) algorithm. Recently, W. Ma proposed a correntropy-induced metric (CIM) penalized MCC algorithm in [33], and Y. Li presented a soft parameter function (SPF) constrained MCC algorithm [34]. The CIM and SPF are both l0-norm approximations used to form sparse MCC algorithms, and the SPF-MCC is given in the Appendix. In these improved MCC algorithms, the l0-norm, CIM and SPF penalties are incorporated into the MCC cost function to devise the desired zero attractors. In the ZA-MCC algorithm, however, the zero attractor applies a uniform penalty to all channel coefficients, while the l0-norm approximation MCC algorithms increase the complexity.

In this paper, a group-constrained maximum correntropy criterion (GC-MCC) algorithm based on the CS concept and zero attracting (ZA) techniques is proposed in order to fully exploit the sparseness of multi-path channels. The GC-MCC algorithm is derived by incorporating a non-uniform norm into the MCC cost function; the non-uniform norm is split into two groups according to the mean of the magnitudes of the channel coefficients. For the group of large channel coefficients, the l0-norm penalty is used, while the l1-norm penalty is applied to the group of small coefficients. Then, a reweighting technique is applied within the GC-MCC algorithm to develop the reweighted GC-MCC (RGC-MCC) algorithm. The performance of the GC- and RGC-MCC algorithms is evaluated and discussed for estimating sparse channels in mixed noise. The GC- and RGC-MCC algorithms achieve superior performance in both steady-state error and convergence for sparse channel estimation at different sparsity levels. Simulation results show that the GC- and RGC-MCC algorithms can effectively
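The grouping idea can be sketched as follows. This is an illustrative reconstruction, not the paper's exact update: the threshold, the step sizes, and the choice of exponential surrogate for the l0-norm gradient are all assumptions. Taps are split at the mean magnitude of the current estimate; the large-tap group gets the (nearly vanishing) l0 surrogate attractor, the small-tap group a plain l1 attractor:

```python
import numpy as np

def gc_mcc_step(w, x, d, mu=0.05, sigma=1.0, rho=1e-4, beta=10.0):
    """One GC-MCC-style iteration (illustrative sketch). Taps are
    grouped by comparing |w_i| with the mean magnitude of w; the two
    groups receive different norm-penalty attractors."""
    e = d - w @ x
    k = np.exp(-e**2 / (2.0 * sigma**2))    # correntropy kernel weight
    thresh = np.mean(np.abs(w))             # grouping threshold
    large = np.abs(w) >= thresh
    attractor = np.where(
        large,
        # l0 surrogate: ~0 for large taps, so they are barely biased
        beta * np.sign(w) * np.exp(-beta * np.abs(w)),
        # l1 penalty: steady shrinkage of the small taps
        np.sign(w),
    )
    return w + mu * k * e * x - rho * attractor
```

The design intent is visible in the surrogate: for a tap of magnitude 0.8 with beta = 10 the l0 term is about 10*exp(-8), so the dominant taps are left essentially unbiased, while the many near-zero taps of a sparse channel are still driven to zero by the l1 term.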
doi:10.3390/e19080432