
Learning from Examples with Information Theoretic Criteria

Jose C. Principe, Dongxin Xu, Qun Zhao, John W. Fisher III
2012 Journal of VLSI Signal Processing Systems for Signal, Image and Video Technology  
A novel algorithm based on Renyi's quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization.  ...  This paper discusses a framework for learning based on information theoretic criteria.  ...  Acknowledgments: This work was partially supported by a DARPA-Air Force grant F33615-97-1019 and NSF ECS-9900394.  ... 
doi:10.1023/a:1008143417156 fatcat:jk7422trdfaxzo4hvdicfwe7ei
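
As a rough illustration of the Renyi's quadratic entropy criterion behind this entry, the sketch below estimates H2(X) from samples with a Gaussian Parzen window; the kernel width, sample sizes, and function name are illustrative assumptions, not details taken from the paper.

import numpy as np

def renyi_quadratic_entropy(x, sigma=0.5):
    """Estimate Renyi's quadratic entropy H2(X) = -log E[p(X)] from samples
    using a Gaussian Parzen window; the double kernel sum is the
    'information potential' used in information theoretic learning."""
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    diffs = x[:, None] - x[None, :]              # all pairwise sample differences
    s = np.sqrt(2.0) * sigma                     # width of the convolved Gaussian kernel
    kernel = np.exp(-diffs**2 / (2.0 * s**2)) / (np.sqrt(2.0 * np.pi) * s)
    information_potential = kernel.sum() / n**2
    return -np.log(information_potential)

# A spread-out sample has larger quadratic entropy than a concentrated one.
rng = np.random.default_rng(0)
print(renyi_quadratic_entropy(rng.normal(size=500)))         # larger value
print(renyi_quadratic_entropy(0.01 * rng.normal(size=500)))  # smaller value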

An Information-Maximization Approach to Blind Separation and Blind Deconvolution

Anthony J. Bell, Terrence J. Sejnowski
1995 Neural Computation  
We derive a new self-organizing learning algorithm that maximizes the information transferred in a network of nonlinear units.  ...  Finally, we derive dependencies of information transfer on time delays. We suggest that information maximization provides a unifying framework for problems in "blind" signal processing.  ...  Many helpful observations also came from Paul Viola, Barak Pearlmutter, Kenji Doya, Misha Tsodyks, Alexandre Pouget, Peter Dayan, Olivier Coenen, and Iris Ginzburg.  ... 
doi:10.1162/neco.1995.7.6.1129 pmid:7584893 fatcat:26psxygznbfhtbkha7q4fgjjee
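
For reference, a minimal sketch of the information-maximization (Infomax) idea in its natural-gradient form with a logistic nonlinearity; the mixing matrix, learning rate, and iteration count below are illustrative assumptions, not the authors' settings.

import numpy as np

def infomax_ica(x, lr=0.05, n_iter=300, seed=0):
    """Square ICA on zero-mean data x (variables in rows) with the
    natural-gradient Infomax update dW = lr * (I + (1 - 2*g(u)) u^T) W,
    where g is the logistic function (suited to super-Gaussian sources)."""
    rng = np.random.default_rng(seed)
    n, T = x.shape
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))
    I = np.eye(n)
    for _ in range(n_iter):
        u = W @ x                          # current source estimates
        y = 1.0 / (1.0 + np.exp(-u))       # logistic squashing
        W += lr * (I + (1.0 - 2.0 * y) @ u.T / T) @ W
    return W

# Toy demo: unmix two Laplacian sources mixed by a fixed matrix.
rng = np.random.default_rng(1)
s = rng.laplace(size=(2, 2000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
W = infomax_ica(A @ s)
print(W @ A)   # should be roughly a scaled permutation matrix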

An Introduction to Information Theoretic Learning, Part II: Applications

Daniel Silva, Denis Fantinato, Janio Canuto, Leonardo Duarte, Aline Neves, Ricardo Suyama, Jugurta Montalvão, Romis Attux
2016 Journal of Communication and Information Systems  
This is the second part of the introductory tutorial about information theoretic learning, which, after the theoretical foundations presented in Part I, now discusses the concepts of correntropy, a new  ...  similarity measure derived from the quadratic entropy, and presents example problems where the ITL framework can be successfully applied: dynamic modelling, equalization, independent component analysis  ...  ACKNOWLEDGMENT The authors thank FAPESP (Grant 2013/14185-2), CAPES and CNPq for the financial support.  ... 
doi:10.14209/jcis.2016.7 fatcat:fjsomfgggfglvcpkklr4jvmrne
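
A minimal sketch of the correntropy similarity measure discussed in this tutorial, estimated with a Gaussian kernel; the kernel size and the toy data are assumptions for illustration.

import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V_sigma(X, Y) = E[G_sigma(X - Y)],
    a kernel-localized similarity measure between two signals."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.mean(np.exp(-e**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma))

# Unlike mean square error, correntropy is barely affected by sparse large outliers.
a = np.linspace(0.0, 1.0, 100)
b = a.copy()
b[::10] += 50.0                      # contaminate every tenth sample
print(correntropy(a, a), correntropy(a, b))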

Entropy minimization for supervised digital communications channel equalization

I. Santamaria, D. Erdogmus, J.C. Principe
2002 IEEE Transactions on Signal Processing  
Moreover, for a linear equalizer, an orthogonality condition for the minimum entropy solution that leads to an alternative fixed-point iterative minimization method is derived.  ...  On the other hand, for nonlinear channels and using a multilayer perceptron (MLP) as the equalizer, differences between both criteria appear.  ...  ACKNOWLEDGMENT The authors would like to thank the referees for providing us with valuable comments and insightful suggestions that have greatly improved this paper.  ... 
doi:10.1109/78.995074 fatcat:lmqwnkqxzzawlj6pnk6zgemige

An analysis of entropy estimators for blind source separation

Kenneth E. Hild, Deniz Erdogmus, Jose C. Principe
2006 Signal Processing  
Three reasons are given why Renyi's entropy estimators for Information-Theoretic Learning (ITL), on which the proposed method is based, are to be preferred over Shannon's entropy estimators for ITL.  ...  An extensive analysis of a non-parametric, information-theoretic method for instantaneous blind source separation (BSS) is presented.  ...  These three criteria will be referred to as the (modified) Minimum Renyi's Mutual Information (MRMI), Minimum Shannon Mutual Information (MSMI), and (modified) MRMI-SIG criteria, respectively.  ... 
doi:10.1016/j.sigpro.2005.04.015 fatcat:zuybddsqsbfczp24m5ppdbfj4y

Unsupervised Learning based Modified C- ICA for Audio Source Separation in Blind Scenario

Naveen Dubey, Rajesh Mehra
2016 International Journal of Information Technology and Computer Science  
This work proposes a divergence-based algorithm designed for faster convergence together with good separation quality.  ...  The main challenges in BASS are separation quality and separation speed; convergence speed is compromised when separation techniques focus only on separation quality.  ...  Estimating the difference between the joint entropy and the marginal entropies of the different information sources leads to an ICA implementation using minimum mutual information (MMI).  ... 
doi:10.5815/ijitcs.2016.03.02 fatcat:346wb4p2ebb6poyyipi5g4xgiu
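
The joint-versus-marginal entropy difference mentioned in this entry is simply the mutual information of the output vector; written out for y = (y_1, ..., y_n):

I(y_1,\dots,y_n) \;=\; \sum_{i=1}^{n} H(y_i) \;-\; H(y_1,\dots,y_n)
\;=\; D_{\mathrm{KL}}\!\Big(p(\mathbf{y}) \,\Big\|\, \prod_{i=1}^{n} p(y_i)\Big) \;\ge\; 0,

with equality exactly when the outputs are independent, so MMI-based ICA adapts the demixing matrix to drive this quantity toward zero.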

Blind source separation of convolutive mixtures

Shoji Makino, Harold H. Szu
2006 Independent Component Analyses, Wavelets, Unsupervised Smart Sensors, and Neural Networks IV  
BSS can be regarded as an intelligent version of ABF in the sense that it can adapt without any information on the array manifold or the target direction, and sources can be simultaneously active in BSS  ...  This paper introduces the blind source separation (BSS) of convolutive mixtures of acoustic signals, especially speech.  ...  We search for the separation matrix W(ω) that minimizes the mutual information, maximizes the nongaussianity, or maximizes the likelihood of the output.  ... 
doi:10.1117/12.674413 fatcat:rjp6ttufbnf37fajrpt2msqaxy
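
For context, the frequency-domain formulation commonly used for convolutive mixtures such as these: after a short-time Fourier transform, the time-domain convolution becomes an approximately instantaneous mixture in each frequency bin,

\mathbf{x}(t) = \sum_{\tau} \mathbf{H}(\tau)\,\mathbf{s}(t-\tau)
\;\xrightarrow{\;\mathrm{STFT}\;}\;
\mathbf{X}(\omega,t) \approx \mathbf{H}(\omega)\,\mathbf{S}(\omega,t),
\qquad
\mathbf{Y}(\omega,t) = \mathbf{W}(\omega)\,\mathbf{X}(\omega,t),

and the separation matrix W(ω) is estimated bin by bin (by minimizing mutual information, maximizing non-Gaussianity, or maximizing likelihood, as noted in the snippet), followed by resolving the per-bin permutation and scaling ambiguities.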

Adaptive blind signal processing-neural network approaches

S. Amari, A. Cichocki
1998 Proceedings of the IEEE  
Learning algorithms and underlying basic mathematical ideas are presented for the problem of adaptive blind signal processing, especially instantaneous blind separation and multichannel blind deconvolution  ...  We discuss recent developments of adaptive learning algorithms based on the natural gradient approach and their properties concerning convergence, stability, and efficiency.  ...  Back, and Dr. N. Murata for their fruitful collaboration and helpful discussions.  ... 
doi:10.1109/5.720251 fatcat:jg337aeuxnd3rec634qd3qjfde
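
The natural-gradient learning rule discussed in this survey has the standard form (with φ a componentwise score function chosen to match the assumed source densities):

\Delta \mathbf{W} \;=\; \eta\left[\mathbf{I} - \boldsymbol{\varphi}(\mathbf{y})\,\mathbf{y}^{\mathsf{T}}\right]\mathbf{W},
\qquad \mathbf{y} = \mathbf{W}\mathbf{x},

i.e. the ordinary stochastic gradient of the mutual-information or likelihood cost post-multiplied by W^T W, which removes the matrix inversion and makes the convergence behaviour independent of the conditioning of the unknown mixing matrix.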

Convergence properties and data efficiency of the minimum error entropy criterion in adaline training

D. Erdogmus, J.C. Principe
2003 IEEE Transactions on Signal Processing  
Recently, we have proposed the minimum error entropy (MEE) criterion as an information theoretic alternative to the widely used mean square error criterion in supervised adaptive system training.  ...  Mathematical investigation of the proposed entropy estimator revealed interesting insights about the process of information theoretical learning.  ...  Other successful applications of the proposed nonparametric entropy estimator and MEE include maximally informative subspace projections, blind source separation [9] - [11] , and blind deconvolution  ... 
doi:10.1109/tsp.2003.812843 fatcat:bzwfdf2i3zdsvgxblyrfuehfnq
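
To make the MEE criterion concrete, here is a small sketch that trains an adaline by gradient ascent on the information potential of the error, which is equivalent to minimizing Renyi's quadratic entropy of the error; the kernel width, learning rate, and toy data are assumptions for illustration.

import numpy as np

def mee_adaline(X, d, sigma=1.0, lr=0.5, n_iter=300):
    """Adaline training under the minimum error entropy (MEE) criterion:
    ascend V(e) = (1/N^2) sum_ij G_s(e_i - e_j) with s = sqrt(2)*sigma,
    where e = d - X w; maximizing V minimizes Renyi's quadratic error entropy."""
    N, p = X.shape
    w = np.zeros(p)
    s2 = 2.0 * sigma**2                        # s^2 for the convolved kernel
    for _ in range(n_iter):
        e = d - X @ w                          # current error samples
        de = e[:, None] - e[None, :]           # pairwise error differences
        dX = X[:, None, :] - X[None, :, :]     # pairwise input differences
        G = np.exp(-de**2 / (2.0 * s2))        # Gaussian kernel (constant dropped)
        grad = (G * de / s2)[..., None] * dX   # gradient of each pairwise term w.r.t. w
        w += lr * grad.mean(axis=(0, 1))       # ascend the information potential
    return w

# Toy check: recover a linear filter from data with impulsive (Laplacian) noise.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
d = X @ w_true + 0.05 * rng.laplace(size=200)
print(mee_adaline(X, d))    # should end up close to w_true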

Self-Adaptive Blind Source Separation Based on Activation Functions Adaptation

L. Zhang, A. Cichocki, S. Amari
2004 IEEE Transactions on Neural Networks  
The learning algorithm for the activation function adaptation is consistent with the one for training the demixing model.  ...  paper to develop a general framework of blind separation from a practical point of view with special emphasis on the activation function adaptation.  ...  Various approaches, such as entropy maximization and minimization of mutual information, lead to the cost function (4) where is determined adaptively during training.  ... 
doi:10.1109/tnn.2004.824420 pmid:15384517 fatcat:kwskyv54kvdzzil7mejvlfflwa

Comparison of maximum entropy and minimal mutual information in a nonlinear setting

Fabian J. Theis, Ch. Bauer, Elmar W. Lang
2002 Signal Processing  
In blind source separation (BSS), two different separation techniques are mainly used: minimal mutual information (MMI), where minimization of the mutual output information yields an independent random vector, and maximum entropy (ME), where the output entropy is maximized.  ...  Acknowledgements We thank the referees for their helpful comments during the preparation of this paper.  ... 
doi:10.1016/s0165-1684(02)00200-1 fatcat:eoufootd7fhc7hmgruyosbj3mm
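
The link between the two criteria compared in this paper follows from the entropy decomposition of the output vector:

H(\mathbf{y}) \;=\; \sum_{i} H(y_i) \;-\; I(y_1,\dots,y_n),

so maximizing the joint output entropy (ME) agrees with minimizing the mutual information (MMI) whenever the marginal entropy terms stay fixed, for example when each output nonlinearity matches the cumulative distribution of the corresponding source; the paper examines how far this equivalence carries over to the nonlinear setting.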

Blind Identification and Separation of Noisy Source Signals : Neural Network Approaches

Andrzej CICHOCKI
1998 Systems, Control and Information  
In this paper, we have presented neural network models and a family of associated adaptive on-line learning algorithms for non-stationary environments.  ...  Emphasis was given to the neural network or adaptive multichannel filtering models and associated on-line nonlinear adaptive learning algorithms which have some biological resemblance or plausibility.  ... 
doi:10.11509/isciesci.42.2_63 fatcat:6xkansmiungsbngdudq2pxmwpy

Fast and robust fixed-point algorithms for independent component analysis

A. Hyvarinen
1999 IEEE Transactions on Neural Networks  
and/or of minimum variance.  ...  These contrast functions enable both the estimation of the whole decomposition by minimizing mutual information, and estimation of individual independent components as projection pursuit directions.  ...  The advantage of neural on-line learning rules is that the inputs can be used in the algorithm at once, thus enabling faster adaptation in a nonstationary environment.  ... 
doi:10.1109/72.761722 pmid:18252563 fatcat:5jngho43xfhs3jcs4st7cogx7q
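
As a rough companion to this entry, a one-unit fixed-point iteration of the FastICA type on whitened data with the tanh contrast; the whitening helper, the contrast choice, and the toy mixture are assumptions for the sketch, not details from the paper.

import numpy as np

def whiten(x):
    """Zero-mean, unit-covariance whitening of data with variables in rows."""
    x = x - x.mean(axis=1, keepdims=True)
    cov = x @ x.T / x.shape[1]
    d, E = np.linalg.eigh(cov)
    return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ x

def fastica_one_unit(z, n_iter=100, seed=0):
    """One-unit fixed-point iteration on whitened data z:
    w <- E[z g(w^T z)] - E[g'(w^T z)] w, then renormalize, with g = tanh."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wz = w @ z
        w_new = (z * np.tanh(wz)).mean(axis=1) - (1.0 - np.tanh(wz)**2).mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < 1e-9:   # converged (up to sign)
            return w_new
        w = w_new
    return w

# Toy demo: extract one independent component from a 2x2 instantaneous mixture.
rng = np.random.default_rng(3)
s = rng.laplace(size=(2, 5000))
z = whiten(np.array([[1.0, 0.5], [0.3, 1.0]]) @ s)
print(fastica_one_unit(z))   # one row of an (orthogonal) demixing matrix for z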

Independent Component Analysis and Blind Signal Separation: Theory, Algorithms and Applications

Eduardo F. Simas Filho, José M. de Seixas, Natanael N. Moura, Diego B. Haddad, José M. Faier, Maria C. S. Albuquerque
2012 Learning and Nonlinear Models  
An overview of the main statistical principles that guide the search for the independent components is formulated; methods for blind signal separation that require both high-order and second-order statistics  ...  Some of the most successful algorithms for both ICA and BSS are derived.  ...  Acknowledgements The authors are thankful for the support provided by CNPq and FAPERJ (Brazil), and for the Brazilian Navy Research Institute (IPqM) for providing the data set used in this work.  ... 
doi:10.21528/lnlm-vol10-no1-art4 fatcat:fewa5i5dozbilbr3euyntf4kvu

Electrical Power System Harmonic Analysis Technology Based on Fast ICA BSS Algorithm

Chen Yu, Meng Jintao, Guo Rongxing
2013 Advances in Information Sciences and Service Sciences  
Through a comparison of the separation performance obtained with different step sizes, the adaptive step-size natural gradient blind separation algorithm for the electrical power system harmonic signal  ...  The experimental results show that the electrical power system harmonic signal separation based on the gradient algorithm is accurate.  ...  Blind signal separation algorithms mainly include the information maximization (Infomax) algorithm [4] , the natural gradient algorithm, the Equivariant Adaptive Blind Separation (EASI) algorithm [  ... 
doi:10.4156/aiss.vol5.issue7.6 fatcat:kewkmkslkjbmpnqstd3q5fd6gu
Showing results 1 — 15 out of 1,795 results