2,431 Hits in 2.8 sec

Isotonic Classification Trees [chapter]

Rémon van de Kamp, Ad Feelders, Nicola Barile
2009 Lecture Notes in Computer Science  
We propose a new algorithm for learning isotonic classification trees. It relabels non-monotone leaf nodes by performing isotonic regression on the collection of leaf nodes.  ...  We experimentally compare the performance of the new algorithm with standard classification trees.  ...  Isotonic Classification Trees: The ICT algorithm can in principle be combined with any standard classification tree algorithm.  ... 
doi:10.1007/978-3-642-03915-7_35 fatcat:qe2p5r4djjf23ce7gebhplbow4
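At its core, the relabelling step described above is an isotonic regression over the leaf estimates. Below is a minimal sketch of the pool-adjacent-violators algorithm (PAVA) for the simplest case of a total order on the leaves; ICT itself works with a partial order induced by the monotonicity constraints, so this is only illustrative, and the leaf values and weights are hypothetical.

```python
import numpy as np

def pava(y, w=None):
    """Pool Adjacent Violators: L2 isotonic regression of a sequence y
    (with optional non-negative weights w) under a total order."""
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    values, weights, sizes = [], [], []      # blocks of pooled elements
    for yi, wi in zip(y, w):
        values.append(yi); weights.append(wi); sizes.append(1)
        # Merge the last two blocks while they violate monotonicity.
        while len(values) > 1 and values[-2] > values[-1]:
            v2, w2, s2 = values.pop(), weights.pop(), sizes.pop()
            v1, w1, s1 = values.pop(), weights.pop(), sizes.pop()
            wt = w1 + w2
            values.append((w1 * v1 + w2 * v2) / wt)
            weights.append(wt); sizes.append(s1 + s2)
    return np.repeat(values, sizes)          # expand block means back out

# Hypothetical leaf estimates of P(y=1), listed in an order consistent
# with the monotonicity constraint, weighted by training cases per leaf.
leaf_probs   = [0.20, 0.45, 0.40, 0.70, 0.65, 0.90]
leaf_weights = [30, 25, 40, 10, 20, 15]
print(pava(leaf_probs, leaf_weights))   # non-decreasing relabelled estimates
```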

Firm Bankruptcy Prediction: Experimental Comparison of Isotonic Separation and Other Classification Approaches

Y.U. Ryu, W.T. Yue
2005 IEEE transactions on systems, man and cybernetics. Part A. Systems and humans  
Then, various classification methods, including discriminant analysis, neural networks, decision tree induction, learning vector quantization, rough sets, and isotonic separation, are used with the reduced  ...  Experiments show that the isotonic separation method is a viable technique, performing generally better than other methods for short-term bankruptcy prediction.  ...  tree induction methods.  ... 
doi:10.1109/tsmca.2005.843393 fatcat:y2xju74hcrcb3dx2eu3izwnwlq

Calibrating Random Forests

Henrik Boström
2008 2008 Seventh International Conference on Machine Learning and Applications  
In this work, a novel calibration method is introduced, which is based on a recent finding that probabilities predicted by forests of classification trees have a lower squared error compared to those predicted  ...  The experiment shows that random forests of PETs calibrated by the novel method significantly outperform uncalibrated random forests of both PETs and classification trees, as well as random forests calibrated  ...  trees calibrated with isotonic regression.  ... 
doi:10.1109/icmla.2008.107 dblp:conf/icmla/Bostrom08 fatcat:5txxfgqbtrb35f6sp2rkvlogbq
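For context, a baseline of the kind this paper compares against (not the paper's novel method) can be set up directly in scikit-learn: wrap a random forest in CalibratedClassifierCV with isotonic regression and compare Brier scores on held-out data. The data set and parameters below are synthetic placeholders.

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import brier_score_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Uncalibrated forest vs. the same forest calibrated with isotonic regression.
raw = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
iso = CalibratedClassifierCV(RandomForestClassifier(n_estimators=200, random_state=0),
                             method="isotonic", cv=5).fit(X_tr, y_tr)

print("Brier, uncalibrated:", brier_score_loss(y_te, raw.predict_proba(X_te)[:, 1]))
print("Brier, isotonic:    ", brier_score_loss(y_te, iso.predict_proba(X_te)[:, 1]))
```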

Page-level template detection via isotonic smoothing

Deepayan Chakrabarti, Ravi Kumar, Kunal Punera
2007 Proceedings of the 16th international conference on World Wide Web - WWW '07  
The second is the global smoothing of these per-node classifier scores by solving a regularized isotonic regression problem; the latter follows from a simple yet powerful abstraction of templateness on  ...  This cost function and the tree structure lead to a regularized version of the isotonic regression problem. Problem 2 (Regularized Tree Isotonic Regression).  ...  ISOTONIC SMOOTHING: In this section we formulate and solve the generalized isotonic regression problem on trees.  ... 
doi:10.1145/1242572.1242582 dblp:conf/www/ChakrabartiKP07 fatcat:73kxg7njy5aoxbuyhpy7vfjwce
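As a rough illustration of what a regularized tree isotonic regression objective can look like (the exact cost function and constraint direction in the paper may differ; s_v, x_v and λ are generic symbols rather than the paper's notation): raw per-node scores s_v are replaced by smoothed scores x_v that are monotone along root-to-leaf paths, trading fidelity to the raw scores against variation across edges:

$$
\min_{x}\ \sum_{v \in V(T)} \bigl(x_v - s_v\bigr)^2 \;+\; \lambda \sum_{(u,v)\in E(T)} \lvert x_u - x_v\rvert
\qquad \text{s.t. } x_v \le x_u \text{ for every edge } (u,v) \text{ with } u \text{ the parent of } v .
$$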

Probability Calibration Trees [article]

Tim Leathart, Eibe Frank, Geoffrey Holmes, Bernhard Pfahringer
2018 arXiv   pre-print
We compare probability calibration trees to two widely used calibration methods---isotonic regression and Platt scaling---and show that our method results in lower root mean squared error on average than  ...  Obtaining accurate and well-calibrated probability estimates from classifiers is useful in many applications, for example, when minimising the expected cost of classifications.  ...  Isotonic Regression: Zadrozny and Elkan (2001) use a method based on isotonic regression for probability calibration for a range of classification models.  ... 
arXiv:1808.00111v2 fatcat:sy54z3qbnbgy7hidvk3zlptoyu
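One of the two baselines named here, Platt scaling, amounts to fitting a logistic sigmoid to held-out classifier scores. A minimal sketch (not the probability calibration trees method itself; the scores and labels are hypothetical, and Platt's target-smoothing refinement is omitted):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def platt_fit(scores, labels):
    """Fit p(y=1 | s) = sigmoid(A*s + B) on a held-out calibration set."""
    lr = LogisticRegression(C=1e6)   # large C: essentially an unregularized ML fit
    lr.fit(np.asarray(scores, dtype=float).reshape(-1, 1), labels)
    return lr

def platt_predict(lr, scores):
    return lr.predict_proba(np.asarray(scores, dtype=float).reshape(-1, 1))[:, 1]

# Hypothetical raw classifier scores (e.g. margins) and their true labels.
cal_scores = [-2.1, -1.3, -0.4, 0.2, 0.9, 1.7, 2.5]
cal_labels = [0, 0, 0, 1, 0, 1, 1]
model = platt_fit(cal_scores, cal_labels)
print(platt_predict(model, [-1.0, 0.0, 1.0]))   # calibrated probabilities
```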

Breast cancer prediction using the isotonic separation technique

Young U. Ryu, R. Chandrasekaran, Varghese S. Jacob
2007 European Journal of Operational Research  
A recently developed data separation/classification method, called isotonic separation, is applied to breast cancer prediction.  ...  The experiment results show that isotonic separation is a viable and useful tool for data classification in the medical domain.  ...  In this paper we propose a classification scheme that assumes the data set satisfies an isotonic consistency condition.  ... 
doi:10.1016/j.ejor.2006.06.031 fatcat:c46sjyfmx5fs3j4npebadopjvq
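One way to read the "isotonic consistency condition" mentioned in the snippet (an illustrative interpretation, not necessarily the paper's formal definition): no training example should dominate another feature-wise while carrying a strictly lower class label. A small check, with hypothetical data:

```python
import numpy as np

def monotonicity_violations(X, y):
    """Return pairs (i, j) where x_i dominates x_j component-wise but y_i < y_j,
    i.e. pairs that contradict a monotone (isotonic) labelling."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    return [(i, j)
            for i in range(len(y)) for j in range(len(y))
            if i != j and np.all(X[i] >= X[j]) and y[i] < y[j]]

# Toy data: both features assumed positively related to the label.
X = [[3.0, 2.0], [1.0, 1.0], [2.5, 2.5], [0.5, 0.2]]
y = [0, 1, 1, 0]
print(monotonicity_violations(X, y))   # [(0, 1)] -> data are not isotonic-consistent
```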

Predicting good probabilities with supervised learning

Alexandru Niculescu-Mizil, Rich Caruana
2005 Proceedings of the 22nd international conference on Machine learning - ICML '05  
We experiment with two ways of correcting the biased probabilities predicted by some learning methods: Platt Scaling and Isotonic Regression.  ...  Other models such as neural nets and bagged trees do not have these biases and predict well calibrated probabilities.  ...  Elkan for the Isotonic Regression code, C. Young et al. at Stanford Linear Accelerator for the SLAC data, and A. Gualtieri at Goddard Space Center for help with the Indian Pines Data.  ... 
doi:10.1145/1102351.1102430 dblp:conf/icml/Niculescu-MizilC05 fatcat:c3dm7wmnfzgorgjpkx46ul554i
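A reliability (calibration) curve is the standard diagnostic behind this kind of study: bin the predicted probabilities and compare each bin's mean prediction with the observed fraction of positives. A sketch with scikit-learn on synthetic data (naive Bayes is chosen only because it is a typically ill-calibrated model; this does not reproduce the paper's experiments):

```python
from sklearn.calibration import calibration_curve
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=3000, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

probs = GaussianNB().fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
frac_pos, mean_pred = calibration_curve(y_te, probs, n_bins=10)
for p, f in zip(mean_pred, frac_pos):
    print(f"predicted {p:.2f}   observed {f:.2f}")   # far apart -> poorly calibrated
```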

Enhanced hierarchical classification via isotonic smoothing

Kunal Punera, Joydeep Ghosh
2008 Proceedings of the 17th international conference on World Wide Web - WWW '08  
This new problem generalizes the classic isotonic tree regression problem, and both the new formulation and the algorithm may be of independent interest.  ...  We formulate the task of smoothing classifier outputs as a regularized isotonic tree regression problem, and present a dynamic-programming-based method that solves it optimally.  ...  Hence, gains in classification accuracy in one part of the tree were offset by losses in others.  ... 
doi:10.1145/1367497.1367518 dblp:conf/www/PuneraG08 fatcat:4cntndrhc5ejhdw6rkwrgipb7e

On the Calibration of Nested Dichotomies for Large Multiclass Tasks [article]

Tim Leathart, Eibe Frank, Bernhard Pfahringer, Geoffrey Holmes
2018 arXiv   pre-print
A tree structure is induced that recursively splits the set of classes into subsets, and a binary classification model learns to discriminate between the two subsets of classes at each node.  ...  Nested dichotomies are used as a method of transforming a multiclass classification problem into a series of binary problems.  ...  However, with boosted trees, performance in terms of classification accuracy is less consistent, often being greater when only internal calibration is applied.  ... 
arXiv:1809.02744v3 fatcat:zup7r45aujaelfkfnojv3lqinm
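The snippet above describes the mechanism: recursively split the class set into two subsets, train a binary model at each internal node, and multiply branch probabilities along the path to each class. A minimal sketch of that idea follows; the halving of the class set is arbitrary, the data are synthetic, and the internal/external calibration studied in the paper is not shown.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def build_nd(X, y, classes):
    """Build a nested dichotomy: a binary classifier per internal node."""
    if len(classes) == 1:
        return {"leaf": classes[0]}
    left, right = classes[: len(classes) // 2], classes[len(classes) // 2 :]
    mask = np.isin(y, classes)
    Xn, yn = X[mask], np.isin(y[mask], right).astype(int)   # 0 = left, 1 = right
    clf = LogisticRegression(max_iter=1000).fit(Xn, yn)
    return {"clf": clf,
            "left": build_nd(X, y, left),
            "right": build_nd(X, y, right)}

def predict_nd(node, x, p=1.0, out=None):
    """Multiply branch probabilities along the path to each leaf class."""
    out = {} if out is None else out
    if "leaf" in node:
        out[node["leaf"]] = p
        return out
    p_right = node["clf"].predict_proba(x.reshape(1, -1))[0, 1]
    predict_nd(node["left"], x, p * (1 - p_right), out)
    predict_nd(node["right"], x, p * p_right, out)
    return out

# Toy usage with hypothetical multiclass data.
X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           n_classes=4, random_state=2)
tree = build_nd(X, y, sorted(set(y)))
print(predict_nd(tree, X[0]))   # class -> probability, summing to 1
```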

Isotonic Separation

R. Chandrasekaran, Young U. Ryu, Varghese S. Jacob, Sungchul Hong
2005 INFORMS journal on computing  
Exploiting this characteristic of isotonicity, we propose a data-classification method called isotonic separation based on linear programming, especially network programming.  ...  Data classification and prediction problems are prevalent in many domains.  ...  Madhavan for discussions on isotonic regression and its applications.  ... 
doi:10.1287/ijoc.1030.0061 fatcat:xjclnepkf5hozbozwdcpcucgqy
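In the spirit of isotonic separation, a small linear-programming sketch: relabel training points to the nearest monotone labelling by minimizing total deviation from the observed binary labels, subject to dominance constraints. The paper exploits the network structure of this class of problem; the sketch below simply hands a dense LP to SciPy, and the data are toy values.

```python
import numpy as np
from scipy.optimize import linprog

def isotonic_separation(X, y):
    """Relabelled values z in [0, 1] minimizing deviation from binary labels y,
    subject to z_i >= z_j whenever x_i dominates x_j component-wise."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    n = len(y)
    # sum_i |z_i - y_i| is linear for binary y: cost z_i if y_i=0, (1 - z_i) if y_i=1.
    c = np.where(y == 0, 1.0, -1.0)
    rows = []
    for i in range(n):
        for j in range(n):
            if i != j and np.all(X[i] >= X[j]):   # x_i dominates x_j
                row = np.zeros(n)
                row[j], row[i] = 1.0, -1.0        # z_j - z_i <= 0
                rows.append(row)
    A_ub = np.array(rows) if rows else None
    b_ub = np.zeros(len(rows)) if rows else None
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n, method="highs")
    return res.x   # monotone-consistent relabelling; threshold at 0.5 to classify

X = [[3.0, 2.0], [1.0, 1.0], [2.5, 2.5], [0.5, 0.2]]
y = [0, 1, 1, 0]
print(np.round(isotonic_separation(X, y), 3))
```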

Robust Probabilistic Calibration [chapter]

Stefan Rüping
2006 Lecture Notes in Computer Science  
A recent comparative study [1] revealed that Isotonic Regression [2] and Platt Calibration [3] are the most effective probabilistic calibration techniques for a wide range of classifiers.  ...  This paper deals with probabilistic classification by calibrating a numerical classifier.  ...  The notable exceptions are Decision Trees and k-Nearest Neighbor for the CRE measure.  ... 
doi:10.1007/11871842_75 fatcat:5dnnhliswbdkhapwrbo5z4ayb4

Comparison of Nearest Neighbor (ibk), Regression by Discretization and Isotonic Regression Classification Algorithms for Precipitation Classes Prediction

Solomon Mwanjele Mwagha, Masinde Muthoni, Peter Ochieng
2014 International Journal of Computer Applications  
We sought to train, test and evaluate the performance of nearest neighbor (ibk), regression by discretization and isotonic regression classification algorithms in predicting precipitation classes.  ...  Isotonic Regression, K-nearest neighbours, and RegressionByDiscretization classifiers were used for training and testing on the data sets.  ...  The base classifier used is the J48 class, which generates pruned or unpruned C4.5 decision trees.  ... 
doi:10.5120/16919-6729 fatcat:qerkxv277jf4fkimm2nr3uxo6a

Venn predictors for well-calibrated probability estimation trees

Ulf Johansson, Tuwe Löfström, Håkan Sundell, Henrik Linusson, Anders Gidenstam, Henrik Boström
2018 International Symposium on Conformal and Probabilistic Prediction with Applications  
In this paper, Venn predictors are compared to Platt scaling and isotonic regression, for the purpose of producing well-calibrated probabilistic predictions from decision trees.  ...  Successful use of probabilistic classification requires well-calibrated probability estimates, i.e., the predicted class probabilities must correspond to the true probabilities.  ...  However, this approach is not directly applicable when learning single trees, leaving open the question of how to improve upon Platt scaling and isotonic regression for single trees.  ... 
dblp:conf/copa/JohanssonLSLGB18 fatcat:pwdxmicn2jbsrolmcesynfiisq
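A closely related construction that is easy to sketch is the inductive Venn-ABERS predictor: fit isotonic regression on the calibration scores twice, once with the test object tentatively labelled 0 and once labelled 1, and report the two fitted values as a probability interval. This conveys the flavour of Venn prediction but is not necessarily the exact Venn predictor used in the paper; the scores below are hypothetical.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def venn_abers_interval(cal_scores, cal_labels, test_score):
    """Lower/upper probability pair (p0, p1) for one test score."""
    s = np.append(cal_scores, test_score)
    interval = []
    for tentative in (0, 1):
        y = np.append(cal_labels, tentative)          # tentatively label the test object
        iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
        iso.fit(s, y)
        interval.append(float(iso.predict([test_score])[0]))
    return tuple(interval)

cal_scores = [0.1, 0.3, 0.35, 0.6, 0.8, 0.9]   # hypothetical decision-tree scores
cal_labels = [0,   0,   1,    0,   1,   1]
print(venn_abers_interval(cal_scores, cal_labels, 0.5))
```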

Calibration of Machine Learning Classifiers for Probability of Default Modelling [article]

Pedro G. Fonseca, Hugo D. Lopes
2017 arXiv   pre-print
The calibration techniques used are Platt Scaling and Isotonic Regression.  ...  Binary classification is widely used in credit scoring for the estimation of probability of default.  ...  Tree Classification Models.  ... 
arXiv:1710.08901v1 fatcat:vxqkstua5vblricf7v54eryqdm

Weighted L∞ isotonic regression

Quentin F. Stout
2018 Journal of computer and system sciences  
While not as fast in the general case, for linear and tree orderings prefix algorithms are used to determine isotonic and unimodal regressions in Θ(n log n) time.  ...  L∞ isotonic regressions are not unique, so we examine properties of the regressions an algorithm produces, in addition to the time it takes.  ...  This completes the proof of Theorem 5. ✷  ...  River Isotonic Regression for Trees: Isotonic regression on trees arises in a variety of hierarchical classification and taxonomy problems in data mining and machine  ... 
doi:10.1016/j.jcss.2017.09.001 fatcat:7sdjergaizdcng5c3xqirtsiny
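For the unweighted special case on a linear order there is a particularly simple prefix/suffix construction: x_i = (prefix maximum of y up to i + suffix minimum of y from i) / 2 is isotonic and attains the optimal maximum absolute error. The weighted Θ(n log n) algorithms in the paper are more involved; this sketch covers only that simplest case.

```python
import numpy as np

def linf_isotonic(y):
    """Unweighted L-infinity isotonic regression on a linear order."""
    y = np.asarray(y, dtype=float)
    prefix_max = np.maximum.accumulate(y)               # max of y[:i+1]
    suffix_min = np.minimum.accumulate(y[::-1])[::-1]   # min of y[i:]
    return (prefix_max + suffix_min) / 2.0

y = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 4.5])
fit = linf_isotonic(y)
print(fit)                         # non-decreasing fitted values
print(np.max(np.abs(fit - y)))     # optimal L-infinity error (0.5 here)
```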
Showing results 1 — 15 out of 2,431 results