2,568,803 Hits in 6.8 sec

A Comparison of Feature-Selection Methods for Intrusion Detection [chapter]

Hai Thanh Nguyen, Slobodan Petrović, Katrin Franke
2010 Lecture Notes in Computer Science  
Some IDS also include a feature selection algorithm, which determines the features to be used by the representation algorithm  ...  The task of the feature selection  ...  Individual feature evaluation is based on their relevance to intrusion detection and their relationships with other features  ...  Here we compare CFS and mRMR solved by means  ... 
doi:10.1007/978-3-642-14706-7_19 fatcat:lsxqdubisrgnrbiyd6af7tliv4
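The mRMR criterion compared in this entry ranks features by a greedy trade-off between relevance to the class label and redundancy with the features already selected. A minimal sketch in Python, assuming discrete-valued features (the `mi` helper and the toy feature names `f1`–`f3` are illustrative, not taken from the paper):

```python
from collections import Counter
from math import log

def mi(a, b):
    """Mutual information (in nats) between two discrete sequences."""
    n = len(a)
    pab, pa, pb = Counter(zip(a, b)), Counter(a), Counter(b)
    return sum((c / n) * log(c * n / (pa[x] * pb[y]))
               for (x, y), c in pab.items())

def mrmr(features, target, k):
    """Greedy mRMR: repeatedly pick the feature maximizing relevance
    MI(f, target) minus mean redundancy MI(f, s) over the features s
    selected so far."""
    selected, candidates = [], list(features)
    while candidates and len(selected) < k:
        def score(f):
            relevance = mi(features[f], target)
            redundancy = (sum(mi(features[f], features[s]) for s in selected)
                          / len(selected)) if selected else 0.0
            return relevance - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# f2 duplicates f1, so mRMR skips it in favour of the less redundant
# (though individually weaker) f3.
feats = {"f1": [0, 0, 0, 0, 1, 1, 1, 1],
         "f2": [0, 0, 0, 0, 1, 1, 1, 1],
         "f3": [0, 0, 0, 1, 0, 0, 0, 1]}
target = [0, 0, 0, 1, 1, 1, 1, 1]
chosen = mrmr(feats, target, k=2)  # ["f1", "f3"]
```

A plain relevance filter would pick the duplicated pair `f1`, `f2`; the redundancy term is what distinguishes mRMR from it.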

Using Data-Object Flow Relations to Derive Control Flow Variants in Configurable Business Processes [chapter]

Riccardo Cognini, Flavio Corradini, Andrea Polini, Barbara Re
2015 Lecture Notes in Business Information Processing  
The bpFM approach to variability modeling: the approach we propose mixes the characteristics and objectives of two different modeling contexts (BP and FM). To model BP variability, an extended version of  ...  Relations among the activities are defined similarly to what is done in FODA. A set of mapping rules from bpFM to BPMN 2.0 fragments has been defined; subsequently, according to a specific feature selection  ...  all the behavioral constraints. Conclusions: variability increasingly needs to be taken into account, also in order to reduce costs.  ... 
doi:10.1007/978-3-319-15895-2_19 fatcat:lh2qevkkszakfh2e2lv3kzttvu

BigDataGrapes D5.3 - Trust-aware Decision Support System

Nyi Nyi Htun, Diego Rojo Garcia, Katrien Verbert
2020 Zenodo  
Section 1 lays out an introduction to the deliverable describing our previous version and motivations. In Section 2, GaCoVi is described together with the development technology we utilised.  ...  This document presents an update of deliverable 5.3 where we demonstrate a decision support system that uses visualisation techniques to put domain experts in the loop of feature selection for the development  ...  variable is a common feature selection heuristic.  ... 
doi:10.5281/zenodo.4546194 fatcat:tzoitmegfjg4lnxybckynjtx7i

A procedure for Alternate Test feature design and selection

Manuel Barragan, Gildas Leger
2014 IEEE design & test  
Machine learning techniques are then used to map circuit signatures to circuit specifications.  ...  This work presents efficient algorithms for selecting information-rich signatures, and for designing new ones that will improve the prediction accuracy.  ...  Firstly, in order to get an intuitive insight into the information contained in the Monte Carlo variables, we perform our feature selection search over the 33 process variables that are defined in the  ... 
doi:10.1109/mdat.2014.2361722 fatcat:reiloehjxvcoxjmyfpwtun2c2q

Effective Feature Selection for Feature Possessing Group Structure

Yasmeen Sheikh
2017 International Journal Of Engineering And Computer Science  
Feature selection has become an interesting research topic in recent years. It is an effective method for tackling data of high dimension.  ...  Its objective is to execute feature selection both within and between groups of features, selecting discriminative features and removing redundant ones to obtain an optimal subset.  ...  After within-group selection, which estimates sparsity and the prediction error of groups, all the features are re-evaluated to remove redundancy; this stage is known as between-group variable selection  ... 
doi:10.18535/ijecs/v6i5.19 fatcat:ebcldr5rtvcrtbz76a4loy2bky
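The two-stage scheme this abstract describes (within-group selection followed by between-group redundancy removal) can be sketched as follows; the correlation-based scoring, the 0.95 redundancy threshold, and the group/feature names are illustrative assumptions, not the paper's actual method:

```python
def pearson(a, b):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def group_select(groups, target, redundancy=0.95):
    """Stage 1 (within-group): keep the most target-correlated feature
    of each group.  Stage 2 (between-group): drop winners that are
    nearly collinear with a feature kept earlier."""
    winners = []
    for group, feats in groups.items():
        best = max(feats, key=lambda f: abs(pearson(feats[f], target)))
        winners.append((group, best, feats[best]))
    kept = []
    for group, fname, values in winners:
        if all(abs(pearson(values, kv)) < redundancy for _, _, kv in kept):
            kept.append((group, fname, values))
    return [(g, f) for g, f, _ in kept]

groups = {"g1": {"a": [1, 2, 3, 4], "b": [4, 3, 2, 1]},
          "g2": {"c": [1, 2, 3, 4]},   # collinear with "a": dropped
          "g3": {"d": [1, -1, 1, -1]}}
target = [1, 2, 3, 4]
picked = group_select(groups, target)  # [("g1", "a"), ("g3", "d")]
```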

Family-based deductive verification of software product lines

Thomas Thüm, Ina Schaefer, Martin Hentschel, Sven Apel
2012 Proceedings of the 11th International Conference on Generative Programming and Component Engineering - GPCE '12  
We illustrate and evaluate our approach for software product lines written in a feature-oriented dialect of Java and specified using the Java Modeling Language.  ...  We present a family-based approach of deductive verification to prove the correctness of a software product line efficiently.  ...  ACKNOWLEDGMENTS We thank Richard Bubel for comments on earlier drafts and for assisting us with KeY. We gratefully acknowledge our beneficial discussion with Erik Ernst.  ... 
doi:10.1145/2371401.2371404 dblp:conf/gpce/ThumSHA12 fatcat:bk4rgrznz5g2dbj6fcljjpbqam

Genetic heterogeneity analysis using genetic algorithm and network science

Zhendong Sha, Yuanzhu Chen, Ting Hu
2022 Proceedings of the Genetic and Evolutionary Computation Conference Companion  
Multiple GA-based feature selection runs are used to collect an ensemble of high-performing feature subsets.  ...  We generate a feature co-selection network from the ensemble, where nodes represent genetic variables and edges represent their co-selection frequencies.  ...  An edge linking two variables indicates that the pair has been co-selected in at least one subset in 𝐸, and its weight is their co-selection frequency.  ... 
doi:10.1145/3520304.3529027 fatcat:5s2ln5mhhvbhbkm75x2lriyxtq
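The co-selection network described here is straightforward to build from an ensemble of feature subsets; a minimal sketch (the subset contents and the `snp*` variable names are invented for illustration):

```python
from collections import Counter
from itertools import combinations

def co_selection_network(subsets):
    """Weighted edge list of a feature co-selection network: nodes are
    features, and an edge's weight counts the subsets in which the pair
    of features was selected together."""
    edges = Counter()
    for subset in subsets:
        for a, b in combinations(sorted(subset), 2):
            edges[(a, b)] += 1
    return dict(edges)

# Hypothetical ensemble of high-performing subsets from repeated GA runs.
ensemble = [{"snp1", "snp2", "snp3"},
            {"snp1", "snp2"},
            {"snp2", "snp3"}]
network = co_selection_network(ensemble)
# ("snp1", "snp2") and ("snp2", "snp3") each get weight 2
```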

A Feature Selection Method for Air Quality Forecasting [chapter]

Luca Mesin, Fiammetta Orione, Riccardo Taormina, Eros Pasero
2010 Lecture Notes in Computer Science  
An application is shown regarding the forecast of PM10 concentration one day in advance, based on the selected features feeding an artificial neural network.  ...  The partial mutual information criterion is used to select the regressors which carry the maximal non-redundant information to be used to build a prediction model.  ...  Once the decisive features are selected, prediction is performed using an Artificial Neural Network (ANN).  ... 
doi:10.1007/978-3-642-15825-4_66 fatcat:fgxunowjgfawpmxidhxuxqxdty

Bio-Inspired Requirements Variability Modeling with use Case

Esraa Abdel-Ghani, Said Ghoul
2019 International Journal of Software Engineering & Applications  
It supports use case variability modeling by introducing version and revision features and related relations.  ...  The Feature Model (FM) is the most important technique used to manage variability across products in Software Product Lines (SPLs).  ...  It is generated by coherently selecting the features that an application family needs from variable domain requirements.  ... 
doi:10.5121/ijsea.2019.10205 fatcat:6i6taqli55dm5mb7ejaktaohxu

A survey on feature selection methods

Girish Chandrashekar, Ferat Sahin
2014 Computers & electrical engineering  
The objective is to provide a generic introduction to variable elimination which can be applied to a wide array of machine learning problems. We focus on Filter, Wrapper and Embedded methods.  ...  Plenty of feature selection methods are available in the literature due to the availability of data with hundreds of variables, leading to data of very high dimension.  ...  Conclusion: in this paper we have tried to provide an introduction to feature selection techniques.  ... 
doi:10.1016/j.compeleceng.2013.11.024 fatcat:7riaeenqkbakvatupdgzi35n2y

Shapley Feature Selection

Alex Gramegna, Paolo Giudici
2022 FinTech  
We propose to integrate an explainable AI approach, based on Shapley values, to provide more accurate information for feature selection.  ...  Feature selection is a popular topic. The main approaches to deal with it fall into the three main categories of filters, wrappers and embedded methods.  ...  Introduction Feature selection is an area of research of great importance in machine learning.  ... 
doi:10.3390/fintech1010006 fatcat:kie7gvgmajcctmtb7vnmq2lhwy
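For small feature sets, the Shapley value underlying this approach can be computed exactly by averaging each feature's marginal contribution over all coalitions. A self-contained sketch (the additive toy value function stands in for a real model-performance score; it is not the paper's setup):

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value):
    """Exact Shapley value of each feature under a coalition value
    function value(frozenset) -> float."""
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value(s | {f}) - value(s))
        phi[f] = total
    return phi

# Toy additive "model score": each feature contributes independently,
# so the Shapley value recovers each feature's own contribution.
contrib = {"x1": 0.5, "x2": 0.3, "x3": 0.0}
phi = shapley_values(list(contrib), lambda s: sum(contrib[g] for g in s))
# x3 contributes nothing and would be the first feature dropped.
```

The exact computation is exponential in the number of features; practical tools (e.g. the shap library) approximate it by sampling.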

A Combined Approach To Detect Key Variables In Thick Data Analytics [article]

Giovanni Antonelli, Rosa Arboretti Giancristofaro, Riccardo Ceccato, Paolo Centomo, Luca Pegoraro, Luigi Salmaso, Marco Zecca
2020 arXiv   pre-print
A comparison is carried out between the proposed approach and Lasso, which is one of the most common alternatives for feature selection available in the literature.  ...  Several industrial problems may benefit from such an approach, and an application in the field of chemical analysis is presented.  ...  INTRODUCTION Feature selection is a critical step in data preparation before machine learning modelling.  ... 
arXiv:2006.00864v1 fatcat:aspyedxvzndrxjez2m2y4yd42m
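Lasso, the baseline in this comparison, performs embedded feature selection by shrinking some coefficients exactly to zero. A minimal coordinate-descent sketch in pure Python (the data and penalty value are invented; real use would rely on an optimized implementation such as scikit-learn's):

```python
def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso on dense data (lists of rows).
    Features whose coefficient is driven to exactly zero are dropped."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # residual with feature j's current contribution excluded
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-thresholding update
            if rho > lam:
                w[j] = (rho - lam) / z
            elif rho < -lam:
                w[j] = (rho + lam) / z
            else:
                w[j] = 0.0
    return w

# y depends only on the first feature; the second is uninformative and
# its coefficient lands exactly at zero, i.e. the feature is deselected.
X = [[1, 1], [2, -1], [3, 1], [4, -1]]
y = [2, 4, 6, 8]
w = lasso_cd(X, y, lam=3.0)  # w[1] == 0.0
```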

Study on Large Data Dimension Reduction Method for Line Selection in Distribution Network

Hai-yang ZANG, Pan-pan JING, Ke LU, Zhen-kai LI, Rui-fang ZHANG
2018 DEStech Transactions on Engineering and Technology Research  
Firstly, the fault feature is extracted from the original data, and then the maximum information coefficient (MIC) is used to measure the correlation between the fault line and the line fault features, focusing on  ...  analysis of useful data and elimination of useless data, thus reducing the dimension of the big data and improving data accuracy and the speed of line selection.  ...  With 14 kinds of fault feature variables plus an original, there are 22 kinds of fault feature variables.  ... 
doi:10.12783/dtetr/apop2017/18726 fatcat:souzxehxqvc5rkjkcg6icq5bny

Variability management with feature models

Danilo Beuche, Holger Papajewski, Wolfgang Schröder-Preikschat
2004 Science of Computer Programming  
The tools presented use extended feature models as the main model for describing variability and commonality, and provide user-changeable customization of the software artifacts to be managed.  ...  Variability management in software systems requires adequate tool support to cope with the ever-increasing complexity of software systems.  ...  The feature model of a problem domain (in our case the cosine world) can be used by an application engineer, who should be able to select the features that the application requires and, if necessary,  ... 
doi:10.1016/j.scico.2003.04.005 fatcat:vljvbgrhgzgkfixgig5e23p7by

Feature selection with mutual information for regression problems

Muhammad Aliyu Sulaiman, Jane Labadin
2015 2015 9th International Conference on IT in Asia (CITA)  
Mutual information (MI) is known to be used as a relevance criterion for selecting feature subsets from an input dataset with a nonlinear relationship to the predicted attribute.  ...  However, the mutual information estimator suffers from the following limitations: it depends on smoothing parameters, the greedy feature selection methods lack theoretically justified stopping criteria, and in theory  ...  Feature selection aims at reducing the dimension of a dataset by selecting variables that are relevant to the predicted attribute(s).  ... 
doi:10.1109/cita.2015.7349826 dblp:conf/cita/SulaimanL15 fatcat:p3l4xomacrbwxbrzdt5vucfkhq
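The relevance filter this entry describes, ranking candidate regressors by their mutual information with the target, can be sketched with a simple histogram estimator (the fixed 4-bin discretization and the toy data are assumptions; the paper's estimator and stopping criteria differ):

```python
from collections import Counter
from math import log

def mutual_information(xs, ys, bins=4):
    """Histogram estimate (in nats) of I(X; Y) for numeric sequences."""
    def discretize(v):
        lo, hi = min(v), max(v)
        width = (hi - lo) / bins or 1.0
        return [min(int((x - lo) / width), bins - 1) for x in v]
    dx, dy = discretize(xs), discretize(ys)
    n = len(xs)
    pxy, px, py = Counter(zip(dx, dy)), Counter(dx), Counter(dy)
    return sum((c / n) * log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def rank_by_mi(features, target, k):
    """Keep the k candidate regressors with the highest estimated MI."""
    ranked = sorted(features,
                    key=lambda f: mutual_information(features[f], target),
                    reverse=True)
    return ranked[:k]

# x1 determines the target exactly; x2 is independent of it.
features = {"x1": list(range(16)), "x2": [i % 2 for i in range(16)]}
target = list(range(16))
top = rank_by_mi(features, target, k=1)  # ["x1"]
```

As the abstract notes, such estimates are sensitive to the smoothing (here, binning) parameters, and greedy selection built on them has no theoretically justified stopping point.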
Showing results 1 — 15 out of 2,568,803 results