
Fast dependent components for fMRI analysis

Eerika Savia, Arto Klami, Samuel Kaski
2009 IEEE International Conference on Acoustics, Speech and Signal Processing  
and the white bars the extracted stimulus features.  ...  METHOD The task in DeCA is to find linear projections of two data sets, X and Y, so that the mutual information between the projections s_x = w_x^T X and s_y = w_y^T Y is maximized.  ... 
doi:10.1109/icassp.2009.4959939 dblp:conf/icassp/SaviaKK09 fatcat:nbaeepjf6bc4nhu7cfr2x332pi
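Under a joint-Gaussian assumption (an illustrative simplification, not necessarily the estimator used in the paper), the mutual information between the one-dimensional projections s_x = w_x^T X and s_y = w_y^T Y is -1/2 log(1 - rho^2), so maximizing MI reduces to maximizing the canonical correlation rho. A minimal numpy sketch of the first such projection pair:

```python
import numpy as np

def first_cca_pair(X, Y, reg=1e-6):
    """First canonical projection pair (w_x, w_y) via SVD of the
    whitened cross-covariance. X: (n, dx), Y: (n, dy); rows are samples."""
    n = X.shape[0]
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    # Whitening transforms: Wx Cxx Wx^T = I (and likewise for Y)
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy.T)
    wx, wy, rho = Wx.T @ U[:, 0], Wy.T @ Vt[0], s[0]
    # Gaussian MI of the projected pair: -1/2 log(1 - rho^2)
    mi = -0.5 * np.log(max(1.0 - rho ** 2, 1e-12))
    return wx, wy, rho, mi
```

The SVD of the whitened cross-covariance is the textbook CCA solution; DeCA's contribution lies in how the dependency criterion is estimated, which this sketch does not reproduce.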

PET IMAGE RECONSTRUCTION USING ANATOMICAL INFORMATION THROUGH MUTUAL INFORMATION BASED PRIORS: A SCALE SPACE APPROACH

Sangeetha Somayajula, Anand Rangarajan, Richard Leahy
2007 4th IEEE International Symposium on Biomedical Imaging: From Nano to Macro  
The prior uses mutual information between feature vectors that are extracted from the anatomical and functional images using a scale space approach.  ...  We propose a mutual information based prior for incorporating information from co-registered anatomical images into PET image reconstruction.  ...  In [6] we described a non-parametric method that uses MI between feature vectors extracted from the anatomical and functional images to define a prior on the functional image.  ... 
doi:10.1109/isbi.2007.356814 dblp:conf/isbi/SomayajulaRL07 fatcat:kxs6tlwixjbotfczodx5sv7wha

Information-Theoretic Linear Feature Extraction Based on Kernel Density Estimators: A Review

José M. Leiva-Murillo, Antonio Artes-Rodríguez
2012 IEEE Transactions on Systems Man and Cybernetics Part C (Applications and Reviews)  
In this paper, we provide a unified study of the application of kernel density estimators to supervised linear feature extraction by means of criteria inspired by information and detection theory.  ...  We enrich this study by incorporating two novel criteria: the mutual information and the likelihood ratio test, and perform both a theoretical and an experimental comparison between  ...  However, in some cases this estimation may be avoided, such as in the case of the Infomax method for independent component analysis [1] or the maximization of mutual information for feature extraction  ... 
doi:10.1109/tsmcc.2012.2187191 fatcat:dje2z3af2zeydbymaob537rq4q

Multiclass Common Spatial Patterns and Information Theoretic Feature Extraction

M. Grosse-Wentrup, M. Buss
2008 IEEE Transactions on Biomedical Engineering  
We show that for two-class paradigms CSP maximizes an approximation of mutual information of extracted EEG/MEG components and class labels.  ...  (ICs) that approximately maximize mutual information of ICs and class labels.  ...  The principle of ITFE is to extract those components of the EEG/MEG data that maximize mutual information of extracted components and class labels.  ... 
doi:10.1109/tbme.2008.921154 pmid:18632362 fatcat:2lun5bkt2zaf7cz2jakq2vjcfm
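The paper relates CSP to mutual-information maximization; the underlying two-class CSP can be sketched as a whitening step followed by an eigendecomposition of one class covariance (a standard formulation, not the paper's multiclass ITFE extension):

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Two-class Common Spatial Patterns. trials_*: lists of
    (channels, samples) arrays. Returns 2*n_pairs spatial filters (rows)
    with extremal class-a variance ratios."""
    def avg_cov(trials):
        return np.mean([t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
    Ca, Cb = avg_cov(trials_a), avg_cov(trials_b)
    # Whiten the composite covariance: P (Ca + Cb) P^T = I
    d, U = np.linalg.eigh(Ca + Cb)
    P = (U / np.sqrt(d)).T
    # Eigenvalues of P Ca P^T lie in [0, 1]; the extremes separate the classes
    lam, B = np.linalg.eigh(P @ Ca @ P.T)
    W = B.T @ P
    idx = list(range(n_pairs)) + list(range(len(lam) - n_pairs, len(lam)))
    return W[idx]
```

Each returned filter w maximizes (or minimizes) the ratio of class-a to total variance of the projected signal w @ t, which is the approximation to MI with the class label that the paper analyzes.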

On feature extraction by mutual information maximization

Kari Torkkola
2002 IEEE International Conference on Acoustics Speech and Signal Processing  
Instead of a commonly used mutual information measure based on Kullback-Leibler divergence, we use a quadratic divergence measure, which allows us to make an efficient non-parametric implementation and  ...  We present a method for learning discriminative feature transforms using as criterion the mutual information between class labels and transformed features.  ...  Figure 1 : Learning feature transforms by maximizing the mutual information between class labels and transformed features.  ... 
doi:10.1109/icassp.2002.5743865 dblp:conf/icassp/Torkkola02 fatcat:uzp6s7pznjbezkzyb2of2o7fvy
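The quadratic divergence between p(y, c) and p(y)p(c) admits a closed-form Parzen-window estimate built from pairwise Gaussian kernel sums. A hedged numpy sketch of the estimator alone, without the transform-learning gradient loop of the paper (the kernel width sigma is a free assumption here):

```python
import numpy as np

def gaussian_sum(A, B, var):
    """Sum of Gaussian kernels G(a - b, var) over all pairs (a in A, b in B)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * var)).sum() / (2 * np.pi * var) ** (A.shape[1] / 2)

def quadratic_mi(Y, labels, sigma=0.5):
    """Plug-in estimate of the quadratic divergence between p(y, c) and
    p(y)p(c). Y: (n, d) continuous features; labels: discrete, length n."""
    n = len(Y)
    var = 2 * sigma ** 2                     # convolution of two Parzen kernels
    classes = np.unique(labels)
    priors = np.array([(labels == c).mean() for c in classes])
    v_in = sum(gaussian_sum(Y[labels == c], Y[labels == c], var)
               for c in classes) / n ** 2
    v_all = (priors ** 2).sum() * gaussian_sum(Y, Y, var) / n ** 2
    v_btw = sum(p * gaussian_sum(Y[labels == c], Y, var)
                for p, c in zip(priors, classes)) / n ** 2
    return v_in + v_all - 2 * v_btw          # integrated squared difference, >= 0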

Information Potential Auto-Encoders [article]

Yan Zhang and Mete Ozay and Zhun Sun and Takayuki Okatani
2017 arXiv pre-print
In order to estimate the entropy of the encoding variables and the mutual information, we propose a non-parametric method.  ...  In the proposed framework, AEs are regularized by minimization of the mutual information between input and encoding variables of AEs during the training phase.  ...  Our proposed method of estimating mutual information can be considered as a hybrid of parametric (for H(z|x)) and non-parametric (for H(z)) models.  ... 
arXiv:1706.04635v2 fatcat:wypeto32lfhq3kh563spivfpzq

Classification via Information-Theoretic Fusion of Vector-Magnetic and Acoustic Sensor Data

Richard J. Kozick, Brian M. Sadler
2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07  
We present a general approach for multi-modal sensor fusion based on nonparametric probability density estimation and maximization of a mutual information criterion.  ...  For the magnetic data, we present a parametric model with computationally efficient parameter estimation.  ...  The steps in our approach for linear feature extraction are described first, followed by an algorithm for finding features that maximize a mutual information criterion.  ... 
doi:10.1109/icassp.2007.366395 dblp:conf/icassp/KozickS07 fatcat:q54vvsuujrctdogvtroruhkc6y

Feature Extraction by Mutual Information Based on Minimal-Redundancy-Maximal-Relevance Criterion and Its Application to Classifying EEG Signal for Brain-Computer Interfaces [chapter]

Abbas Erfanian, Farid Oveisi, Ali Shadvar
2011 Recent Advances in Brain-Computer Interface Systems  
Methods: Definition of mutual information. Mutual information is a non-parametric measure of relevance between two variables.  ...  The approximation is inspired by the quadratic Renyi entropy which provides a non-parametric estimate of the mutual information.  ...  Feature Extraction by Mutual Information Based on Minimal-Redundancy-Maximal-Relevance Criterion and Its Application to Classifying EEG Signal for Brain-Computer Interfaces, Recent Advances in Brain-Computer  ... 
doi:10.5772/13935 fatcat:hzqvl7ynbzbfjmfhmvsqlx2cqe
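The mRMR criterion scores each candidate feature by its MI with the class labels minus its average MI with the already-selected features. A hedged sketch for discrete features, using a simple histogram MI estimate rather than the chapter's Renyi-entropy approximation:

```python
import numpy as np

def mutual_info(x, y):
    """MI (in nats) between two discrete 1-D arrays via the joint histogram."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xv), len(yv)))
    np.add.at(joint, (xi, yi), 1)
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def mrmr(X, y, k):
    """Greedy mRMR selection of k columns of the discrete feature matrix X."""
    n_feat = X.shape[1]
    relevance = [mutual_info(X[:, j], y) for j in range(n_feat)]
    selected = [int(np.argmax(relevance))]   # start with the most relevant
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            redundancy = np.mean([mutual_info(X[:, j], X[:, s]) for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

A duplicated feature has high relevance but equally high redundancy, so its mRMR score collapses to zero and an independent, weaker feature is preferred instead.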

A stochastic model for natural feature representation

S. Kumar, F. Ramos, B. Upcroft, M. Ridley, L. Ong, S. Sakkarieh, H. Durrant-Whyte
2005 7th International Conference on Information Fusion  
The representation combines Isomap, a non-linear manifold learning algorithm, with Expectation Maximization, a statistical learning scheme.  ...  The representation is computed offline and results in a non-linear, non-Gaussian likelihood model relating visual observations such as color and texture to the underlying visual states.  ...  ACKNOWLEDGMENT This work is supported by the ARC Center of Excellence programme, funded by the Australian Research Council (ARC) and the New South Wales (NSW) State Government.  ... 
doi:10.1109/icif.2005.1591971 fatcat:2fhrg2lhjjgnhe4qp5bontslyi

Impact Of Sample Sizes On Information Theoretic Measures For Audio-Visual Signal Processing

Ivana Arsic, Jean-Philippe Thiran, Ninoslav Marina
2005 Zenodo  
An intuitive solution is that we have to choose those features from the two modalities that have maximal mutual information.  ...  First, in Section 2 we recall the mathematical background regarding information theoretic quantities, such as entropy and mutual information, followed by an introduction of a parametric method for density  ... 
doi:10.5281/zenodo.39079 fatcat:supupewnzvgano2aulaqa3d36u

Quantum Clustering-Based Feature Subset Selection for Mammographic Image Classification

Hamdi N, Auhmani K, Hassani M.M
2015 International Journal of Computer Science & Information Technology (IJCSIT)  
It uses similarity measures such as the correlation coefficient (CC) and the mutual information (MI). The feature which maximizes this information is chosen by the algorithm.  ...  It is performed in three stages: extraction of features characterizing the tissue areas, then a feature selection was achieved by the proposed algorithm, and finally the classification phase was carried  ...  It uses similarity measures such as the correlation coefficient (CC) and the mutual information (MI). The feature which maximizes this information is chosen by the algorithm.  ... 
doi:10.5121/ijcsit.2015.7211 fatcat:3bpxlqv7pzdltdgsn55tyw4fsu

Design Choices and Theoretical Issues for Relative Feature Importance, a Metric for Nonparametric Discriminatory Power [chapter]

Hilary J. Holz, Murray H. Loew
2000 Lecture Notes in Computer Science  
We have developed relative feature importance (RFI), a metric for the classifier-independent ranking of features.  ...  Previously, we have shown the metric to rank accurately features for a wide variety of artificial and natural problems, for both two-class and multi-class problems.  ...  Using the non-parametric scatter matrices, feature extraction is based on local density estimation.  ... 
doi:10.1007/3-540-44522-6_72 fatcat:6gyz6bl2arfofbgwqupnkdymta

An unsupervised data projection that preserves the cluster structure

Lev Faivishevsky, Jacob Goldberger
2012 Pattern Recognition Letters  
Instead, we utilize a non-parametric estimation of the average cluster entropies and search for a linear projection and a clustering that maximizes the estimated mutual information between the projected  ...  Formally we attempt to find a projection that maximizes the mutual information between data points and clusters in the projected space.  ...  Instead, we utilize a non-parametric estimation of the average cluster entropies and search for a linear projection and a clustering that maximizes the estimated mutual information between the projected  ... 
doi:10.1016/j.patrec.2011.10.012 fatcat:aiefszh22rgnlnzuu6hks5wf5q
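As an illustration of this objective (using a Gaussian plug-in entropy rather than the authors' non-parametric estimator), I(projection; cluster) for a fixed clustering can be scored as the marginal entropy of the projected data minus the cluster-weighted conditional entropies:

```python
import numpy as np

def gaussian_entropy(x):
    """Differential entropy of a 1-D sample under a Gaussian plug-in estimate."""
    return 0.5 * np.log(2 * np.pi * np.e * (x.var() + 1e-12))

def projection_mi_score(X, labels, w):
    """I(projection; cluster) = H(s) - sum_c p_c H(s | c) for the
    1-D projection s = X w, with Gaussian plug-in entropies."""
    s = X @ (w / np.linalg.norm(w))
    cond = sum((labels == c).mean() * gaussian_entropy(s[labels == c])
               for c in np.unique(labels))
    return gaussian_entropy(s) - cond
```

A projection that keeps the clusters separated has a large marginal entropy relative to the within-cluster entropies, so this score is what a search over projection directions (and clusterings) would maximize.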

Estimating Cognitive State Using EEG Signals

Tian Lan, Andre Adami, Deniz Erdogmus, Michael Pavel
2005 Zenodo  
K. Torkkola, "Feature Extraction by Non-Parametric Mutual Information Maximization", Journal of Machine Learning Research, vol. 3, pp. 1415-1438, 2003.  ...  algorithms based on a variety of assumptions including maximization of non-Gaussianity, minimization of mutual information, nonstationarity of the sources, etc., exist to solve the ICA problem [18]  ... 
doi:10.5281/zenodo.39141 fatcat:5ffpj5fpundnjixnsppanpgism

Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis [article]

Wei Han, Hui Chen, Soujanya Poria
2021 arXiv   pre-print
In this work, we propose a framework named MultiModal InfoMax (MMIM), which hierarchically maximizes the Mutual Information (MI) in unimodal input pairs (inter-modality) and between multimodal fusion result  ...  To address the intractable issue of MI bounds, we further formulate a set of computationally simple parametric and non-parametric methods to approximate their truth value.  ...  Acknowledgments This project is supported by the AcRF MoE Tier-  ... 
arXiv:2109.00412v2 fatcat:ob6mwha76zfzfcnaqu5prhzvc4