Stochastic Discrete First-Order Algorithm for Feature Subset Selection
2020
IEICE transactions on information and systems
However, this algorithm is unable to escape from locally optimal solutions. To resolve this, we propose a stochastic discrete first-order (SDFO) algorithm for feature subset selection. ...
This paper addresses the problem of selecting a significant subset of candidate features to use for multiple linear regression. ...
Stochastic Discrete First-Order Algorithm: This section presents our SDFO algorithm for feature subset selection. ...
doi:10.1587/transinf.2019edp7274
fatcat:vqzapqtxdrckrdafvs5lnfpbqy
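For orientation: discrete first-order methods for subset selection project a gradient step back onto the sparsity constraint by hard thresholding. A minimal sketch of that underlying deterministic iteration, assuming a least-squares objective; the paper's stochastic escape mechanism is not reproduced here.

import numpy as np

def hard_threshold(beta, k):
    """Keep the k largest-magnitude coefficients, zero out the rest."""
    out = np.zeros_like(beta)
    idx = np.argsort(np.abs(beta))[-k:]
    out[idx] = beta[idx]
    return out

def dfo_subset(X, y, k, iters=200, seed=0):
    """Discrete first-order iterations for (1/2)||y - X beta||^2 s.t. ||beta||_0 <= k."""
    rng = np.random.default_rng(seed)
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    beta = hard_threshold(rng.normal(size=X.shape[1]), k)
    for _ in range(iters):
        grad = X.T @ (X @ beta - y)
        beta = hard_threshold(beta - grad / L, k)
    return beta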
Adaptive MIMO antenna selection via discrete stochastic optimization
2005
IEEE Transactions on Signal Processing
In the same spirit as traditional adaptive filtering algorithms, we propose simulation based discrete stochastic optimization algorithms to adaptively select a better antenna subset using criteria such ...
These discrete stochastic approximation algorithms are ideally suited to minimize the error rate since computing a closed form expression for the error rate is intractable. ...
This paper presents discrete stochastic approximation algorithms for selecting the optimal antenna subset based on advanced discrete stochastic optimization techniques found in the recent operations research ...
doi:10.1109/tsp.2005.857056
fatcat:skn7b6fhbvei7hfxmet7fwfxby
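For orientation: a minimal sketch of simulation-based discrete stochastic search of the kind the abstract describes, assuming only a noisy cost oracle. Here noisy_cost stands in for an error-rate simulation; the paper's actual update and step-size rules differ.

import random
from collections import Counter

def discrete_stochastic_search(candidates, noisy_cost, iters=1000, seed=0):
    """Simulation-based discrete optimization: the most-visited candidate
    serves as the estimate of the minimizer when only noisy cost samples
    are available."""
    rng = random.Random(seed)
    state = rng.choice(candidates)
    visits = Counter({state: 1})
    for _ in range(iters):
        challenger = rng.choice(candidates)
        # One noisy sample per candidate; move only if the challenger wins.
        if noisy_cost(challenger) < noisy_cost(state):
            state = challenger
        visits[state] += 1
    return visits.most_common(1)[0][0]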
Enhancing the Performance of Classifier Using Particle Swarm Optimization (PSO)-based Dimensionality Reduction
2015
International Journal of Energy Information and Communications
This paper proposes a PSO and F-Score based feature selection algorithm for selecting the significant features that contribute to improve the classification accuracy. ...
Particle Swarm Optimization (PSO) is a computational technique which is applied in the feature selection process to get an optimal solution. ...
The filter approach first selects significant feature subset before application of any classification algorithm and removes least significant features from the given dataset. ...
doi:10.14257/ijeic.2015.6.5.03
fatcat:a3dc5n5tffay3lskqx5xpfmyxa
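For orientation: the F-score used for filtering here is the ratio of between-class separation to within-class spread per feature. A minimal sketch, assuming binary labels:

import numpy as np

def f_scores(X, y):
    """F-score of each feature for a binary labelling y in {0, 1}:
    between-class separation divided by within-class spread."""
    pos, neg = X[y == 1], X[y == 0]
    num = (pos.mean(0) - X.mean(0)) ** 2 + (neg.mean(0) - X.mean(0)) ** 2
    den = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return num / den

Features are then ranked by score and only the top-ranked ones are passed to the PSO search.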
Simulated Annealing Algorithm for Feature Selection
2016
International Journal of Computers & Technology
In this discussion the simulated annealing algorithm is applied to a pest and weather dataset for feature selection, reducing the dimensionality of the attribute set over a specified number of iterations. ...
Simulated annealing is a stochastic computational technique that searches for globally optimal solutions in optimization problems. ...
K. Thangadurai for his constant support and suggestions. Special thanks to my principal Dr. M. Subbiah for his encouragement. ...
doi:10.24297/ijct.v15i2.565
fatcat:n5opbljduvd2biu5xuoshc3waa
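For orientation: a minimal simulated-annealing loop over feature bitmasks, assuming a user-supplied cost function score (e.g., cross-validated error); the paper's cooling schedule and datasets are not reproduced.

import math, random

def sa_feature_select(n_features, score, iters=500, t0=1.0, cooling=0.99, seed=0):
    """Simulated annealing over feature bitmasks: flip one bit per step and
    accept worse subsets with probability exp(-delta / T)."""
    rng = random.Random(seed)
    mask = [rng.random() < 0.5 for _ in range(n_features)]
    best, best_cost = list(mask), score(mask)
    cost, temp = best_cost, t0
    for _ in range(iters):
        cand = list(mask)
        cand[rng.randrange(n_features)] ^= True   # flip one feature in/out
        c = score(cand)
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            mask, cost = cand, c
            if c < best_cost:
                best, best_cost = list(cand), c
        temp *= cooling                            # geometric cooling
    return best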
Bayesian network classifiers versus selective k-NN classifier
2005
Pattern Recognition
This subset is established by means of sequential feature selection methods. ...
In this paper Bayesian network classifiers are compared to the k-nearest neighbor (k-NN) classifier, which is based on a subset of features. ...
Bouchaffra for the valuable comments on this paper and to Ingo Reindl and Voest Alpine Donawitz Stahl for providing the data for the first experiment. ...
doi:10.1016/j.patcog.2004.05.012
fatcat:6ylgxmcervazld2q2wjl6gun3e
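For orientation: a minimal sequential forward selection sketch wrapped around a k-NN classifier, assuming scikit-learn is available; the paper's exact sequential methods may differ.

from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def forward_select(X, y, k_neighbors=5, max_features=None):
    """Sequential forward selection: greedily add the feature that most
    improves cross-validated k-NN accuracy, stopping when nothing helps."""
    n = X.shape[1]
    selected, best_acc = [], 0.0
    while len(selected) < (max_features or n):
        scores = {}
        for j in range(n):
            if j in selected:
                continue
            cols = selected + [j]
            clf = KNeighborsClassifier(n_neighbors=k_neighbors)
            scores[j] = cross_val_score(clf, X[:, cols], y, cv=5).mean()
        j_best = max(scores, key=scores.get)
        if scores[j_best] <= best_acc:
            break
        selected.append(j_best)
        best_acc = scores[j_best]
    return selected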
Wasserstein Learning of Determinantal Point Processes
[article]
2020
arXiv
pre-print
Determinantal point processes (DPPs) have received significant attention as an elegant probabilistic model for discrete subset selection. ...
In this work, by deriving a differentiable relaxation of a DPP sampling algorithm, we present a novel approach for learning DPPs that minimizes the Wasserstein distance between the model and data composed ...
Leveraging recent work on a DPP sampling algorithm with computational complexity that is sublinear in the size of the ground set [3], and stochastic softmax tricks for gradient estimation of discrete ...
arXiv:2011.09712v1
fatcat:onpygopy7bge7dawfsu3grlysu
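For orientation: the baseline that Wasserstein learning departs from is maximum likelihood under an L-ensemble, where P(A) = det(L_A) / det(L + I). A minimal sketch of that log-likelihood (not the paper's Wasserstein objective):

import numpy as np

def dpp_log_likelihood(L, subsets):
    """Log-likelihood of observed subsets under an L-ensemble DPP:
    P(A) = det(L_A) / det(L + I)."""
    n = L.shape[0]
    log_norm = np.linalg.slogdet(L + np.eye(n))[1]
    ll = 0.0
    for A in subsets:
        idx = np.asarray(A)
        ll += np.linalg.slogdet(L[np.ix_(idx, idx)])[1] - log_norm
    return ll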
Ensemble Relational Learning based on Selective Propositionalization
[article]
2013
arXiv
pre-print
In this paper we propose a selective propositionalization method that searches for the optimal set of relational features to be used by a probabilistic learner in order to minimize a loss function. ...
When the combination is static (static propositionalization), the constructed features are considered as boolean features and used offline as input to a statistical learner; while, when the combination ...
a naïve Bayes classifier to select an optimal subset of the features. ...
arXiv:1311.3735v1
fatcat:7qmhadxkb5grbnfrcymordqgwe
Rough Set-Based Dataset Reduction Method Using Swarm Algorithm and Cluster Validation Function
2015
2015 48th Hawaii International Conference on System Sciences
The results confirm that the proposed method provides an effective tool for solving simultaneous attribute reduction and discretization problems. ...
A Rough Set (RS) based dataset reduction method using a swarm optimization algorithm and a cluster validation function is proposed. ...
guideline for feature subset selection. ...
doi:10.1109/hicss.2015.180
dblp:conf/hicss/HuangCC15
fatcat:ko7xijfvnnacvp47hbolv6hlrm
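For orientation: rough-set attribute reduction scores a condition-attribute set by its dependency degree, the fraction of objects whose condition values determine the decision. A minimal sketch over a plain decision table; the paper couples this with swarm search and discretization.

from collections import defaultdict

def dependency_degree(rows, cond_idx, dec_idx):
    """Rough-set dependency gamma(C, D): fraction of objects in the positive
    region, i.e. rows whose condition-attribute values fix the decision."""
    decisions, counts = defaultdict(set), defaultdict(int)
    for row in rows:
        key = tuple(row[i] for i in cond_idx)
        decisions[key].add(row[dec_idx])
        counts[key] += 1
    pos = sum(c for k, c in counts.items() if len(decisions[k]) == 1)
    return pos / len(rows)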
Integrated ACOR/IACOMV-R-SVM Algorithm
2017
Zenodo
The first algorithm, ACOR-SVM, will tune SVM parameters, while the second IACOMV-R-SVM algorithm will simultaneously tune SVM parameters and select the feature subset. ...
This paper presents two algorithms that can simultaneously tune SVM parameters and select the feature subset. ...
subset selection
Input: features
Output: optimal feature subset
Begin
  calculate feature subset size randomly
  initialize pheromone table
  for i = 1 to no. of features do
    compute weight for each ...
doi:10.5281/zenodo.1314882
fatcat:7wj7pfkgjngephwyoaolznbg6y
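For orientation: a minimal sketch of the pheromone-weighted subset construction step suggested by the pseudocode above; all names and parameters here are illustrative assumptions, not the paper's.

import random

def ant_pick_subset(weights, pheromone, subset_size, alpha=1.0, beta=1.0, rng=random):
    """One ant builds a feature subset: features are drawn without replacement
    with probability proportional to pheromone^alpha * heuristic_weight^beta.
    Pheromone entries are assumed initialized to positive values."""
    remaining = list(range(len(weights)))
    subset = []
    for _ in range(subset_size):
        scores = [(pheromone[i] ** alpha) * (weights[i] ** beta) for i in remaining]
        pick = rng.choices(remaining, weights=scores, k=1)[0]
        remaining.remove(pick)
        subset.append(pick)
    return subset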
Parameter estimation for Boolean models of biological networks
2011
Theoretical Computer Science
The key feature is a discrete analog of parameter estimation for continuous models. ...
With only experimental data as input, the software can be used as a tool for reverse-engineering of Boolean network models from experimental time course data. ...
Acknowledgements The authors are grateful to the Statistical and Applied Mathematical Sciences Institute (SAMSI), the Center for Discrete Mathematics and Theoretical Computer Science (DIMACS), and the ...
doi:10.1016/j.tcs.2010.04.034
fatcat:astzshtb6rfbtp33ul2abivz7e
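For orientation: the discrete estimation idea can be illustrated by brute force, keeping only the Boolean update functions consistent with the observed transitions. The paper itself uses algebraic techniques, so this enumeration is only a toy analogue.

from itertools import product

def consistent_functions(inputs, transitions):
    """Enumerate Boolean update functions (truth tables over the given input
    nodes) that reproduce every observed state transition for one node.
    transitions: list of (full_state_tuple, next_value_of_this_node)."""
    keep = []
    for table in product([0, 1], repeat=2 ** len(inputs)):
        ok = all(
            table[int("".join(str(state[i]) for i in inputs), 2)] == nxt
            for state, nxt in transitions
        )
        if ok:
            keep.append(table)
    return keep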
Overview of Particle Swarm Optimisation for Feature Selection in Classification
[chapter]
2014
Lecture Notes in Computer Science
This paper presents a review of PSO for feature selection in classification. After describing the background of feature selection and PSO, recent work involving PSO for feature selection is reviewed. ...
Feature selection is a process of selecting a subset of relevant features from a large number of original features to achieve similar or better classification performance and improve the computation efficiency ...
However, only 5 runs were conducted for stochastic algorithms. Another improved BPSO algorithm was proposed in [26] for gene expression data. ...
doi:10.1007/978-3-319-13563-2_51
fatcat:ju3w5orwmng5vppfoxhhpkdmoa
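For orientation: a minimal sketch of the standard binary PSO update most of the reviewed work builds on, where a sigmoid of the velocity gives each bit's probability of being set.

import random, math

def bpso_step(position, velocity, personal_best, global_best,
              w=0.7, c1=1.5, c2=1.5, rng=random):
    """One binary-PSO update: velocities move toward the personal and global
    best masks, and a sigmoid of the velocity gives each bit's on-probability."""
    for i in range(len(position)):
        velocity[i] = (w * velocity[i]
                       + c1 * rng.random() * (personal_best[i] - position[i])
                       + c2 * rng.random() * (global_best[i] - position[i]))
        position[i] = 1 if rng.random() < 1 / (1 + math.exp(-velocity[i])) else 0
    return position, velocity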
Thompson Sampling for Optimizing Stochastic Local Search
[chapter]
2017
Lecture Notes in Computer Science
Stochastic local search (SLS), like many other stochastic optimization algorithms, has several parameters that need to be optimized in order for the algorithm to find high quality solutions within a short ...
We derive a regret bound for PolyTS and validate its performance on synthetic problems of varying difficulty as well as on feature selection problems. ...
doi:10.1007/978-3-319-71249-9_30
fatcat:h2oyl7mbovak3e5ug4wz6xjdoi
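For orientation: plain Beta-Bernoulli Thompson sampling over a finite set of parameter configurations looks as follows; the paper's PolyTS uses a polynomial-weight variant with better regret, not reproduced here.

import random

def thompson_pick(successes, failures, rng=random):
    """Thompson sampling over configurations with Beta(1+s, 1+f) posteriors
    on each configuration's probability of improving the incumbent."""
    samples = [rng.betavariate(1 + s, 1 + f)
               for s, f in zip(successes, failures)]
    return max(range(len(samples)), key=samples.__getitem__)

# After running SLS with the chosen configuration, record whether the run
# improved the incumbent solution and update that arm's success/failure counts.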
Optimizing Probabilistic Models for Relational Sequence Learning
[chapter]
2011
Lecture Notes in Computer Science
The second step finds an optimal subset of the constructed features that leads to high classification accuracy, by adopting a wrapper approach that uses a stochastic local search algorithm embedding a ...
This paper tackles the problem of relational sequence learning by selecting relevant features elicited from a set of labelled sequences. ...
In the second step, Lynx adopts a wrapper feature selection approach, that uses a stochastic local search procedure, embedding a naïve Bayes classifier to select an optimal subset of the features constructed ...
doi:10.1007/978-3-642-21916-0_27
fatcat:ooonnvoixbekflc5geb5lg6tri
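For orientation: a minimal sketch of a wrapper built on stochastic local search, assuming accuracy returns a classifier's cross-validated accuracy for a feature mask (the paper embeds a naïve Bayes classifier).

import random

def sls_wrapper(n_features, accuracy, iters=300, p_random=0.2, seed=0):
    """Wrapper feature selection by stochastic local search: flip the bit that
    most improves accuracy, or a random bit with probability p_random."""
    rng = random.Random(seed)
    mask = [rng.random() < 0.5 for _ in range(n_features)]
    best, best_acc = list(mask), accuracy(mask)
    for _ in range(iters):
        if rng.random() < p_random:
            j = rng.randrange(n_features)          # random-walk move
        else:
            def gain(i):
                trial = list(mask)
                trial[i] ^= True
                return accuracy(trial)
            j = max(range(n_features), key=gain)   # greedy move
        mask[j] ^= True
        acc = accuracy(mask)
        if acc > best_acc:
            best, best_acc = list(mask), acc
    return best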
Feature Selection in Genetic Fuzzy Discretization for the Pattern Classification Problems
2007
IEICE transactions on information and systems
We propose a new genetic fuzzy discretization method with feature selection for the pattern classification problems. ...
We use a genetic algorithm with feature selection not only to optimize these parameters but also to reduce the amount of transformed data by filtering the unconcerned attributes. ...
For feature selection, we encode the feature subset as an n-bit string in which each bit marks one feature as selected or excluded. ...
doi:10.1093/ietisy/e90-d.7.1047
fatcat:pwnm3qo6pbd5rocp3fcnyl5znu
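For orientation: decoding and mutating such an n-bit feature mask takes only a few lines. A minimal sketch, assuming bit value 1 means "selected" (the paper's bit convention may differ, and its chromosome also carries fuzzy discretization parameters):

import random

def decode_mask(bits, features):
    """Interpret an n-bit GA chromosome: bit i = 1 keeps feature i."""
    return [f for f, b in zip(features, bits) if b]

def mutate(bits, rate=0.01, rng=random):
    """Bit-flip mutation, the usual variation operator for such encodings."""
    return [b ^ (rng.random() < rate) for b in bits]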