FEATURE SCORING BY MUTUAL INFORMATION FOR CLASSIFICATION OF MASS SPECTRA
2006
Applied Artificial Intelligence
In this paper, it is shown how mutual information can help answer both objectives in a model-free, nonlinear way. ...
A combination of ranking and forward selection makes it possible to select several feature groups that may lead to similar classification performances, but that may lead to different results when evaluated ...
The method based on the combination of feature ranking and forward selection, and using mutual information on sets of features rather than individually, makes it possible to rank feature subsets. ...
doi:10.1142/9789812774118_0079
fatcat:j4uwf2vgf5hgfbte6okjnyqlu4
Hybrid Feature Selection Using Genetic Algorithm and Information Theory
2013
International Journal of Fuzzy Logic and Intelligent Systems
The proposed method consists of two parts: a wrapper part with an improved genetic algorithm (GA) using a new reproduction method, and a filter part using mutual information. ...
In particular, when classifying a large number of features or variables, the accuracy and computational time of the classifier can be improved by using the relevant feature subset to remove the irrelevant ...
Each feature is ranked using the evaluated mutual information. Then, we select the top-ranked features with higher mutual information to use as candidate individuals for the genetic algorithm. ...
doi:10.5391/ijfis.2013.13.1.73
fatcat:fkd22kmjzzdvhpr6fs5aqaqgyq
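The filter stage this entry describes, ranking every feature by its mutual information with the class and keeping the top-ranked ones as candidate individuals for the GA, can be sketched as below. This is a minimal illustration assuming discrete-valued features; `mutual_information` and `top_k_by_mi` are hypothetical helper names, not from the paper.

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete arrays."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)          # joint counts
    joint /= joint.sum()                   # joint probabilities P(x, y)
    px = joint.sum(axis=1, keepdims=True)  # marginal P(x)
    py = joint.sum(axis=0, keepdims=True)  # marginal P(y)
    nz = joint > 0                         # avoid log(0) on empty cells
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def top_k_by_mi(X, y, k):
    """Rank columns of X by MI with the labels y; return the top-k indices."""
    scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    return list(np.argsort(scores)[::-1][:k])
```

The returned indices would then seed the GA population, so the cheap filter keeps the expensive wrapper search confined to already-relevant features.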
Infosel++: Information Based Feature Selection C++ Library
[chapter]
2010
Lecture Notes in Computer Science
A large package of algorithms for feature ranking and selection has been developed. ...
Infosel++, Information Based Feature Selection C++ Library, is a collection of classes and utilities based on probability estimation that can help developers of machine learning methods in rapid interfacing ...
This may be done by ranking these features and selecting the most important ones, selecting a subset of relevant features or by combining (aggregating) subsets of features to create new, more informative ...
doi:10.1007/978-3-642-13208-7_49
fatcat:oafkescywfavled272wrurx6ui
Ensemble Feature Selection from Cancer Gene Expression Data using Mutual Information and Recursive Feature Elimination
2020
2020 Third International Conference on Advances in Electronics, Computers and Communications (ICAECC)
In this paper, we have proposed a randomized ensemble method for feature selection from cancer gene expression data using a combination of mutual information and recursive feature elimination. ...
We obtained a classification accuracy of 99% with a gene subset of size 316 genes and with a subset of size 4 the accuracy is 95%. ...
ACKNOWLEDGMENT Authors would like to thank Department of Science and Technology (DST), Government of India, for financially supporting this work under the scheme DST-ICPS 2019. ...
doi:10.1109/icaecc50550.2020.9339518
fatcat:6wgoytav6jb6jmsg3p7fac3hgu
Literature Review on Feature Selection Methods for High-Dimensional Data
2016
International Journal of Computer Applications
Keywords: Introduction to variable and feature selection, information gain-based feature selection, gain ratio-based feature selection, symmetric uncertainty-based feature selection, subset-based feature ...
selection, feature subset-based feature selection, feature ranking-based feature selection, attribute selection, dimensionality reduction, variable selection, survey on feature selection, feature selection ...
In this method, the mutual information measure is used to determine the relevance between each individual feature and the target class. ...
doi:10.5120/ijca2016908317
fatcat:fi3dkzxwnjgp5mop6xdr5luaze
Feature Selection Based on Information Theory Filters
[chapter]
2003
Neural Networks and Soft Computing
Feature selection is an essential component in all data mining applications. Ranking of features was performed by several inexpensive methods based on information theory. ...
Accuracy of neural, similarity-based, and decision tree classifiers was calculated with a reduced number of features. A comparison with computationally more expensive feature elimination methods was made. ...
Small values stress the importance of high mutual information between the feature and the set of classes; large values put more weight on the mutual information with the features already included in the set S. ...
doi:10.1007/978-3-7908-1902-1_23
fatcat:j3lq6yerrfceljmmgb6p6u7duq
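The trade-off described in this entry is governed by the beta parameter of Battiti's MIFS criterion, J(f) = I(f; C) - beta * sum over s in S of I(f; s). The following is a rough sketch assuming discrete features and a plug-in MI estimate; `_mi` and `mifs_select` are illustrative names, not from the chapter.

```python
import numpy as np

def _mi(a, b):
    """Empirical mutual information (in nats) between two discrete arrays."""
    _, ai = np.unique(a, return_inverse=True)
    _, bi = np.unique(b, return_inverse=True)
    j = np.zeros((ai.max() + 1, bi.max() + 1))
    np.add.at(j, (ai, bi), 1)
    j /= j.sum()
    pa = j.sum(axis=1, keepdims=True)
    pb = j.sum(axis=0, keepdims=True)
    nz = j > 0
    return float((j[nz] * np.log(j[nz] / (pa @ pb)[nz])).sum())

def mifs_select(X, y, k, beta=0.5):
    """Greedy MIFS: at each step pick argmax_f I(f; C) - beta * sum_{s in S} I(f; s).
    Small beta favors class relevance; large beta penalizes redundancy with
    the features already selected into S."""
    n_features = X.shape[1]
    relevance = [_mi(X[:, f], y) for f in range(n_features)]
    selected, remaining = [], list(range(n_features))
    while remaining and len(selected) < k:
        scores = [relevance[f] - beta * sum(_mi(X[:, f], X[:, s]) for s in selected)
                  for f in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

With beta = 0 this degenerates to pure relevance ranking; increasing beta increasingly penalizes features that duplicate information already in the selected set.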
A Feature Selection Algorithm based on Mutual Information using Local Non-uniformity Correction Estimator
2017
International Journal of Advanced Computer Science and Applications
In this paper, a novel algorithm is proposed to select the best subset of features based on mutual information and local non-uniformity correction estimator. ...
Feature subset selection is an effective approach used to select a compact subset of features from the original set. This approach is used to remove irrelevant and redundant features from datasets. ...
MIFS uses mutual information among features, and between each feature and the decision class, to determine the best k features from the original set. ...
doi:10.14569/ijacsa.2017.080656
fatcat:jlyagiho7fbgvd53hvllkzh4ii
IMPROVING FUNCTIONAL ANNOTATION OF NON-SYNONOMOUS SNPs WITH INFORMATION THEORY
2004
Biocomputing 2005
In addition, we use a greedy algorithm to identify a subset of highly informative features [1]. ...
The SVM's classification accuracy is highly correlated with the ranking of the input features by their mutual information. ...
Kroetz and the PMT project, Dr. K. Karplus for contributing to the script used in mutual information evaluations, and Dr. P. Ng for mutation data. ...
doi:10.1142/9789812702456_0038
fatcat:jha3fxab7bf5xfdqzbpe46belq
COMPARATIVE STUDY: FEATURE SELECTION METHODS IN THE BLENDED LEARNING ENVIRONMENT
2017
Facta Universitatis Series Automatic Control and Robotics
We have concluded that RelieF, Wrapper Subset Evaluation, and mutual information present the most convenient feature selection methods for the blended learning environment. ...
Information gain, Symmetrical Uncert Feature Eval, RelieF, Correlation based Feature Selection, Wrapper Subset Evaluation, and Classifier Subset Evaluator feature selection methods were implemented to find ...
The models with the best performances were created for subset RF(11), WSE (5) and S2 subset with mutual information measure MI from 0.34 to 1.67. ...
doi:10.22190/fuacr1702095d
fatcat:ah2ssbwrf5bmxjegdqzta4em34
On the Use of Variable Complementarity for Feature Selection in Cancer Classification
[chapter]
2006
Lecture Notes in Computer Science
A feature selection filter based on the DISR criterion is compared in theoretical and experimental terms to recently proposed information theoretic criteria. ...
The approach is based on the use of a new information theoretic selection criterion: the double input symmetrical relevance (DISR). ...
Variable Ranking (Rank) The ranking method returns a ranking of variables on the basis of their individual mutual information with the output. ...
doi:10.1007/11732242_9
fatcat:fsh4bxchgjhxtb6t47azmwuldq
Streaming Feature Selection for Multi-Label Data with Dynamic Sliding Windows and Feature Repulsion Loss
2019
Entropy
Finally, for the fixed sliding window, the best feature subset is selected according to this loss function. ...
Then, the interaction between features is measured by a loss function inspired by the mutual repulsion and attraction between atoms in physics. ...
The best performing algorithm receives a rank of 1, the second best receives a rank of 2, and so on (See Tables 4-8) . ...
doi:10.3390/e21121151
fatcat:du5mgztyzjhibfd67bgkbisgfq
Information Theory-Based Feature Selection: Minimum Distribution Similarity with Removed Redundancy
[chapter]
2020
Lecture Notes in Computer Science
Different from previous methods, which use mutual information and greedy iteration with a loss function to rank the features, we rank features according to their distribution similarities in the two classes, measured by relative entropy, and then remove the highly redundant features from the sorted feature subsets. ...
Battiti [9] proposed Mutual Information Feature Selection (MIFS) method, which finds feature subset via greedy selection according to the MI between feature subsets and labels. ...
doi:10.1007/978-3-030-50426-7_1
fatcat:uugtu46mzjfangdmrqzymsqeby
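The distribution-similarity idea in this entry can be illustrated with a symmetrized relative entropy (KL divergence) between a feature's two class-conditional distributions: the less similar the distributions, the more discriminative the feature. This is only a sketch of the general idea, not the paper's exact criterion; the helper names are hypothetical, and binary classes with discrete features are assumed.

```python
import numpy as np

def class_kl_score(x, y, eps=1e-9):
    """Symmetrized KL divergence between the class-conditional distributions
    of a discrete feature x under the two classes in y. Larger = the feature
    separates the classes better."""
    c0, c1 = np.unique(y)                  # assumes exactly two classes
    values = np.unique(x)
    p = np.array([np.mean(x[y == c0] == v) for v in values]) + eps
    q = np.array([np.mean(x[y == c1] == v) for v in values]) + eps
    p, q = p / p.sum(), q / q.sum()        # renormalize after smoothing
    return float((p * np.log(p / q)).sum() + (q * np.log(q / p)).sum())

def rank_by_distribution_dissimilarity(X, y):
    """Sort feature indices from most to least class-discriminative."""
    scores = [class_kl_score(X[:, j], y) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1]
```

The paper additionally removes redundant features from the sorted list; that pruning step is omitted here.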
Feature Selection Based on Information Theory for Speaker Verification
[chapter]
2009
Lecture Notes in Computer Science
To this end, a method based on mutual information is studied in order to keep as much discriminative information as possible and as little redundant information as possible. ...
The use of automatic methods able to reduce the dimension of the feature space without losing performance is an important current problem. ...
In this work, we study the use of an information-theory-based method [6] to automatically select the best subset of acoustic coefficients. ...
doi:10.1007/978-3-642-10268-4_36
fatcat:uz3gvsbezbgn7bf3hgxlls3p7q
Texture feature ranking with relevance learning to classify interstitial lung disease patterns
2012
Artificial Intelligence in Medicine
These features were ranked and selected according to their relevance obtained by GMLVQ and, for comparison, to a mutual information (MI) criterion. ...
The classification performance for different feature subsets was calculated for a k-nearest-neighbor (kNN) and a random forests classifier (RanForest), and support vector machines with a linear and a radial ...
Fig. 2. Ranking of the texture features used, for the relevance measure obtained by GMLVQ (upper panel) and the mutual information MI (lower panel). ...
doi:10.1016/j.artmed.2012.07.001
pmid:23010586
pmcid:PMC4096044
fatcat:5zip3hixszagllr4jtrzwr2q24
RETRACTED ARTICLE: Feature selection for machine learning classification problems: a recent overview
2011
Artificial Intelligence Review
So it is of fundamental importance to select the relevant and necessary features in the preprocessing step. This paper describes basic feature selection issues and current research points. ...
Learning accuracy and training speed may be significantly degraded by these superfluous features. ...
The outer optimization stage completes the global search for the best subset of features in a wrapper way, in which the mutual information between the predictive labels of a trained classifier and the ...
doi:10.1007/s10462-011-9230-1
fatcat:qkoa7kflqvdcvgmhmrvakelnzm
Showing results 1 — 15 out of 66,183 results