Algorithms for Active Classifier Selection
2017
Proceedings of the Tenth ACM International Conference on Web Search and Data Mining - WSDM '17
In this paper, we consider model-selection algorithms for these precision-constrained scenarios. ...
We develop adaptive model-selection algorithms to identify, using as few samples as possible, the best classifier from among a set of (precision) qualifying classifiers. ...
Among precision-acceptable classifiers, we want to select the one with the largest recall. ...
doi:10.1145/3018661.3018730
dblp:conf/wsdm/BennettCMZ17
fatcat:ged2kupzc5bhhdmyqezeh5s6ii
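As a rough illustration of the selection criterion quoted in the entry above (picking the highest-recall classifier among those meeting a precision target), a minimal Python sketch; it does not reproduce the paper's adaptive, sample-efficient estimation, and the classifier names and metric values are hypothetical:

# Illustrative sketch only: given (estimated) precision/recall per classifier,
# select the highest-recall classifier among those meeting a precision target.
def select_best_qualifying(metrics, precision_target):
    """metrics: dict mapping classifier name -> (precision, recall)."""
    qualifying = {name: pr for name, pr in metrics.items()
                  if pr[0] >= precision_target}
    if not qualifying:
        return None  # no classifier meets the precision constraint
    # Among precision-acceptable classifiers, pick the one with largest recall.
    return max(qualifying, key=lambda name: qualifying[name][1])

# Hypothetical example values, purely for illustration.
estimates = {"clf_a": (0.96, 0.40), "clf_b": (0.91, 0.72), "clf_c": (0.85, 0.88)}
print(select_best_qualifying(estimates, precision_target=0.90))  # -> "clf_b"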
Approximate Selection with Guarantees using Proxies
[article]
2022
arXiv
pre-print
In this work, we introduce novel algorithms for approximate selection queries with statistical accuracy guarantees. ...
We show that our algorithms can improve query result quality by up to 30x for both the precision and recall targets in both real and synthetic datasets. ...
We further thank Tadashi Fukami, Trevor Hebert, and Isaac Westlund for their helpful discussions. ...
arXiv:2004.00827v4
fatcat:5mktihaghrgc3ajx6xxhfotvpi
Tuning the ensemble selection process of schema matchers
2010
Information Systems
To the best of our knowledge, none of the existing algorithmic solutions offer such a selection feature. In this paper we provide a thorough investigation of this research topic. ...
Schema matching research has been going on for more than 25 years now. ...
To understand this phenomenon, recall that the error measure that was defined for SMB balances Precision with Recall. SMB chose matchers that improve Precision. ...
doi:10.1016/j.is.2010.04.003
fatcat:nketdrw5czavvcihc327tnmlqe
Active seed selection for constrained clustering
2017
Intelligent Data Analysis
Active learning for semi-supervised clustering allows algorithms to solicit side information from a domain expert in the form of instance constraints, for example a set of labeled instances called seeds. ...
In this paper, we propose a new active seed selection algorithm that relies on a k-nearest neighbors structure to locate dense potential clusters and efficiently query and propagate expert information. ...
active constraint selection algorithms [43, 44, 45] . ...
doi:10.3233/ida-150499
fatcat:vdnckohisfaqjanu6le7alh7de
Prototype-Based Sample Selection for Active Hashing
2015
Journal of Computer Science
For expert labeling, we select prototypes from clusters which do not contain any data points with labeled information so that all areas can be covered effectively. ...
In this study, we present an active hashing method by prototype-based sample selection. ...
Precision and recall are computed for each query point and then averaged for all the query points. ...
doi:10.3844/jcssp.2015.839.844
fatcat:qltdpb3kmfhgbbmg7mhdt7qt4u
Core-set Selection Using Metrics-based Explanations (CSUME) for multiclass ECG
[article]
2022
arXiv
pre-print
Our experimental results show a 9.67% and 8.69% precision and recall improvement with a significant training data volume reduction of 50%. ...
This also provides an understanding (for algorithm developers) as to why a sample was selected as more informative over others for the improvement of deep learning model performance. ...
These three metrics allow us to select the samples from the incoming dataset that maximize testing model performance in terms of accuracy, precision, and recall. ...
arXiv:2205.14508v1
fatcat:yw5imemrgjeqvpirneyuweezdu
Comparative Analysis of Selected Heterogeneous Classifiers for Software Defects Prediction Using Filter-Based Feature Selection Methods
2018
FUOYE Journal of Engineering and Technology
The datasets were classified by the selected classifiers, which were carefully chosen on the basis of their heterogeneity. ...
with FilterSubsetEval had the best accuracy. ...
It is used for classifying data into different classes according to some constraints. ...
doi:10.46792/fuoyejet.v3i1.178
fatcat:m4cjd63o4jhtjhsbhxd57b5bge
Software Module Fault Prediction using Convolutional Neural Network with Feature Selection
2016
International Journal of Software Engineering and Its Applications
A sequence of rigorous activities under certain constraints is followed to come up with reliable software. ...
The comparative analysis is performed on the basis of accuracy, precision, recall and F1-measure. The results clearly show better performance of the proposed CNN based technique than HySOM. ...
doi:10.14257/ijseia.2016.10.12.27
fatcat:br6q52awq5ewrm2aurkj26qmiu
Harmonious Semantic Line Detection via Maximal Weight Clique Selection
[article]
2021
arXiv
pre-print
A novel algorithm to detect an optimal set of semantic lines is proposed in this work. We develop two networks: selection network (S-Net) and harmonization network (H-Net). ...
Finally, we determine a maximal weight clique representing an optimal set of semantic lines. Moreover, to assess the overall harmony of detected lines, we propose a novel metric, called HIoU. ...
The proposed algorithm provides lower recall but higher precision than the conventional algorithms. F-measure is the harmonic mean of recall and precision. ...
arXiv:2104.06903v1
fatcat:3dukcxfhm5dupkck6ywvydb3cm
Feature Selection for Effective Text Classification using Semantic Information
2015
International Journal of Computer Applications
Different datasets are constructed, each with a different collection of features, to understand which representation of text data works best for different types of classifiers. ...
to incorporate the context information with the text in machine learning for better classification accuracy. ...
Its score is maximized when the values of recall and precision are equal or close; otherwise, the smaller of recall and precision dominates the value of F1. ...
doi:10.5120/19861-1818
fatcat:s5rozhv4krg4hofinpx6lhdrtq
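The F1 behaviour described in the snippet above follows from the standard harmonic-mean definition; a brief sketch of the relevant bounds (standard identities, not taken from the paper):

F_1 \;=\; \frac{2\,P\,R}{P + R},
\qquad
\min(P, R) \;\le\; F_1 \;\le\; 2\,\min(P, R),
\qquad
F_1 = P = R \ \text{when}\ P = R.

In other words, F1 can never exceed twice the smaller of precision and recall, so a weak score on either side caps it, and for a given pair of scores it is largest when the two coincide.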
Constraint Selection in Metric Learning
[article]
2016
arXiv
pre-print
The proposed approach relies on a loss-dependent weighted selection of constraints that are used for learning the metric. ...
This paper presents a simple way of improving accuracy and scalability of any iterative metric learning algorithm, where constraints are obtained prior to the algorithm. ...
(e.g. precision, recall, F-measure). ...
arXiv:1612.04853v1
fatcat:yp7uudeqxrgszop2nclhdu7a5u
Feature Selection using Stochastic Gates
[article]
2020
arXiv
pre-print
Feature selection problems have been extensively studied for linear estimation, for instance, Lasso, but less emphasis has been placed on feature selection for non-linear functions. ...
In this study, we propose a method for feature selection in high-dimensional non-linear function estimation problems. ...
Acknowledgements The authors thank Nicolas Casey and the anonymous reviewers for their helpful feedback. ...
arXiv:1810.04247v7
fatcat:towtuaxgtva7ff2x42lpyzb63y
Ontology-based Feature Selection: A Survey
[article]
2021
arXiv
pre-print
First, some of the most common classification and feature selection algorithms are briefly presented. ...
selection. ...
For the evaluation, they compared precision (ratio of correct features to retrieved features) and recall (ratio of correct features to ideal features) against features manually selected by human experts. ...
arXiv:2104.07720v2
fatcat:zxd5milohbfj3ewtaw3siwhi4m
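The precision and recall definitions quoted in the survey entry above reduce to simple set ratios of selected features against an expert-chosen "ideal" set; a minimal Python sketch, with hypothetical feature names:

# Illustrative sketch of the definitions quoted above:
# precision = correct (expert-approved) features / all retrieved features,
# recall    = correct features / all features in the expert "ideal" set.
def feature_selection_precision_recall(selected, ideal):
    selected, ideal = set(selected), set(ideal)
    correct = selected & ideal
    precision = len(correct) / len(selected) if selected else 0.0
    recall = len(correct) / len(ideal) if ideal else 0.0
    return precision, recall

# Hypothetical feature names, purely for illustration.
selected = ["age", "dosage", "smoker", "zip_code"]
ideal = ["age", "dosage", "smoker", "blood_pressure"]
print(feature_selection_precision_recall(selected, ideal))  # -> (0.75, 0.75)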
Feature and Region Selection for Visual Learning
2016
IEEE Transactions on Image Processing
To answer these questions, this paper presents a method for feature selection and region selection in the visual BoW model. ...
The main idea is to assign latent weights to the features or regions, and jointly optimize these latent variables with the parameters of a classifier (e.g., support vector machine). ...
To provide a quantitative measure for the localization performance, we compared all methods using precision-recall curves, as shown in Fig. 5. ...
doi:10.1109/tip.2016.2514503
pmid:26742135
fatcat:6v7zgdldzveeblnd5zs6bgsupm
On optimal service selection
2005
Proceedings of the 14th international conference on World Wide Web - WWW '05
We designed and implemented both exact and heuristic (suboptimal) algorithms for the hard case, and carried out a preliminary experimental evaluation with interesting results. ...
In this paper we formalize three kinds of optimal service selection problems, based on different criteria. Then we study their complexity and implement solutions. ...
Currently, it seems that the algorithm with a guaranteed 0.52 bound on relative error is too slow for real-time service selection over large workflows and offer sets. ...
doi:10.1145/1060745.1060823
dblp:conf/www/BonattiF05
fatcat:423zxbujfff3hjc5wcr6sejfq4
Showing results 1 — 15 out of 31,557 results