Improved multiclass feature selection via list combination

Javier Izetta, Pablo F. Verdes, Pablo M. Granitto
2017, Expert Systems with Applications
Highlights
• We introduce new SVM-RFE feature selection methods for multiclass problems.
• We use binary decomposition followed by strategies to combine lists of features.
• We discuss statistical approaches and voting theory methods.
• One-vs-One methods give better results than One-vs-All methods.
• The new K-First method is the most effective at selecting relevant features.

Abstract
Feature selection is a crucial machine learning task aimed at reducing the dimensionality of the input space. By discarding useless or redundant variables, it not only improves model performance but also facilitates interpretability. The well-known Support Vector Machines-Recursive Feature Elimination (SVM-RFE) algorithm provides good performance with moderate computational effort, in particular for wide datasets. When using SVM-RFE on a multiclass classification problem, the usual strategy is to decompose it into a series of binary problems and to generate an importance statistic for each feature on each binary problem. These importances are then averaged over the set of binary problems to synthesize a single value for feature ranking. In some cases, however, this procedure can lead to poor selection. In this paper we discuss six new strategies, based on list combination, designed to yield improved selections starting from the importances given by the binary problems. We evaluate them on artificial and real-world datasets, using both One-vs-One (OVO) and One-vs-All (OVA) strategies. Our results suggest that the OVO decomposition is the most effective for feature selection on multiclass problems. We also find that in most situations the new K-First strategy can find better subsets of features than the traditional weight-average approach.
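The "traditional weight average" baseline the abstract refers to can be sketched as follows. This is not the authors' code: it is a minimal illustration, assuming a One-vs-One decomposition with linear SVMs, where each binary problem contributes squared weights as feature importances and the per-problem importances are averaged into a single ranking (a toy dataset from `make_classification` stands in for the real data).

```python
# Sketch of the averaged-importance feature ranking for multiclass SVM-RFE
# (the baseline the paper improves on), assuming a One-vs-One decomposition.
import itertools
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Toy 3-class dataset with a few informative features (illustrative setup).
X, y = make_classification(n_samples=300, n_features=10, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=0)

classes = np.unique(y)
importances = []
for a, b in itertools.combinations(classes, 2):
    # Restrict to the two classes of this binary (OVO) subproblem.
    mask = np.isin(y, [a, b])
    svm = LinearSVC(C=1.0, max_iter=10000).fit(X[mask], y[mask])
    # Squared weights are the usual SVM-RFE importance statistic per feature.
    importances.append(svm.coef_.ravel() ** 2)

# Traditional approach: average importances over all binary problems,
# then rank features by the averaged value.
avg_importance = np.mean(importances, axis=0)
ranking = np.argsort(avg_importance)[::-1]
print("features ranked by averaged importance:", ranking.tolist())
```

In full SVM-RFE this ranking step would be repeated, eliminating the lowest-ranked feature(s) and refitting at each iteration; the paper's list-combination strategies replace the averaging step with other ways of merging the per-problem rankings.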
doi:10.1016/j.eswa.2017.06.043