IMPROVING STABILITY OF DECISION TREES
2002
International journal of pattern recognition and artificial intelligence
... on decision tree learning (Breiman et al., 1984 and Quinlan, 1993), existing methods of constructing decision trees from data suffer from a major problem of instability. ...
If an algorithm is unstable, the cross-validation results become estimators with high variance (Liu and Motoda, 1998), which means that an ...
The CID3 algorithm incrementally generates a multi-layer network, where each hidden layer is associated with a decision tree grown by the ID3 algorithm (Quinlan, 1986). The idea of using a restricted set ...
doi:10.1142/s0218001402001599
fatcat:t7dvin6fn5dgjawijzlusrmy2y
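The instability discussed in the entry above can be made concrete with a small experiment. The following sketch is not the authors' stability method; it is only a generic illustration, assuming scikit-learn and its bundled iris data, of how retraining a tree on bootstrap resamples and measuring prediction disagreement exposes instability.

```python
# A minimal sketch (not the paper's method) of quantifying decision-tree
# instability: train trees on bootstrap resamples and measure how often
# two resampled trees disagree on the same points.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

def bootstrap_predictions(n_trees=20):
    preds = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))       # bootstrap resample
        tree = DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx])
        preds.append(tree.predict(X))                    # predict on the full set
    return np.array(preds)

preds = bootstrap_predictions()
# Instability score: mean pairwise disagreement between resampled trees.
pairs = [(i, j) for i in range(len(preds)) for j in range(i + 1, len(preds))]
disagreement = np.mean([np.mean(preds[i] != preds[j]) for i, j in pairs])
print(f"mean pairwise disagreement: {disagreement:.3f}")
```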
An Improved Algorithm for Incremental Induction of Decision Trees
[chapter]
1994
Machine Learning Proceedings 1994
The ID3 algorithm and its variants are compared in terms of theoretical complexity and empirical behavior. ...
... which can result in smaller decision trees. ...
Incremental Induction of Decision Trees ID3 is a useful concept-learning algorithm because it can efficiently construct a decision tree that generalizes well. ...
doi:10.1016/b978-1-55860-335-6.50046-5
dblp:conf/icml/Utgoff94
fatcat:wlzehuz7uvdahii5huz72fh2my
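Since this entry centers on ID3 and its incremental variants, a compact reminder of ID3's split criterion may help. The sketch below computes information gain on a made-up categorical toy set; it does not show Utgoff's incremental tree restructuring itself.

```python
# A minimal sketch of the information-gain criterion at the heart of ID3.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature_values, labels):
    total = entropy(labels)
    weighted = 0.0
    for v in np.unique(feature_values):
        mask = feature_values == v
        weighted += mask.mean() * entropy(labels[mask])   # weighted child entropy
    return total - weighted

# Toy categorical data (hypothetical, play-tennis style).
outlook = np.array(["sunny", "sunny", "overcast", "rain", "rain", "overcast"])
play    = np.array(["no",    "no",    "yes",      "yes",  "no",   "yes"])
print(f"gain(outlook) = {information_gain(outlook, play):.3f}")
```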
Interruptible anytime algorithms for iterative improvement of decision trees
2005
Proceedings of the 1st international workshop on Utility-based data mining - UBDM '05
Therefore, most of the existing algorithms for decision tree induction use a greedy approach based on local heuristics. ...
Finding a minimal decision tree consistent with the examples is an NP-complete problem. ...
Therefore, in order to produce a better estimate of the tree size, instead of calling ID3 once, LSID3 samples the space of "good" trees by repeatedly invoking a stochastic version of ID3 (SID3). ...
doi:10.1145/1089827.1089837
fatcat:bajalldksjbttgfe32kapxjktm
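The snippet above describes SID3, the stochastic version of ID3 that LSID3 invokes repeatedly to sample "good" trees. The sketch below illustrates the core selection step under the assumption, taken from the general description, that an attribute is drawn with probability proportional to its information gain; the gain values are invented for illustration.

```python
# A minimal sketch of SID3-style stochastic split selection: instead of always
# taking the attribute with maximal gain (greedy ID3), sample an attribute with
# probability proportional to its gain, so repeated calls explore many trees.
import numpy as np

rng = np.random.default_rng(42)

def choose_attribute_sid3(gains):
    gains = np.asarray(gains, dtype=float)
    if gains.sum() == 0:                  # no informative attribute: uniform choice
        probs = np.full(len(gains), 1.0 / len(gains))
    else:
        probs = gains / gains.sum()       # probability proportional to gain
    return rng.choice(len(gains), p=probs)

gains = [0.42, 0.05, 0.31, 0.0]           # hypothetical per-attribute gains
picks = [choose_attribute_sid3(gains) for _ in range(1000)]
print(np.bincount(picks, minlength=len(gains)) / 1000)   # empirical pick rates
```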
Improvement of Data Stream Decision Trees
2022
International Journal of Data Warehousing and Mining
Hoeffding Tree is a method for incrementally building decision trees. Since it was proposed in the literature, it has become one of the most popular tools for data stream classification. ...
Several improvements have since emerged. Hoeffding Anytime Tree was recently introduced and is considered one of the most promising algorithms. ...
Decision Trees A decision tree is a predictive model used to represent classification and regression (Oded and Lior, 2014) . ...
doi:10.4018/ijdwm.290889
fatcat:kofchkkz4fcvzdar3y52padqjm
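The Hoeffding Tree entry above rests on the Hoeffding bound, which tells the learner how many stream examples are enough to commit to a split. The sketch below evaluates that bound for a two-class problem; the delta and sample sizes are illustrative choices, not values from the paper.

```python
# A minimal sketch of the Hoeffding bound used by Hoeffding Trees: if the gain
# difference between the best and second-best attribute exceeds epsilon, the
# choice is (with probability 1 - delta) the same one a batch learner would make.
import math

def hoeffding_bound(value_range, delta, n):
    """epsilon = sqrt(R^2 * ln(1/delta) / (2n))"""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

R = math.log2(2)      # range of information gain for a 2-class problem
delta = 1e-7          # allowed probability of choosing the wrong attribute
for n in (100, 1_000, 10_000):
    eps = hoeffding_bound(R, delta, n)
    print(f"n={n:>6}: split if gain(best) - gain(second) > {eps:.4f}")
```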
Clus-DTI: improving decision-tree classification with a clustering-based decision-tree induction algorithm
2012
Journal of the Brazilian Computer Society
Our intention is to investigate how clustering data as a part of the induction process affects the accuracy and complexity of the generated models. ...
Decision-tree induction is a well-known technique for assigning objects to categories in a white-box fashion. ...
... Alex A. Freitas for his valuable comments, which helped improve this paper. ...
doi:10.1007/s13173-012-0075-5
fatcat:47erso2hgvdxrhdpnz6nbpto4e
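As a rough illustration of the idea in this entry, the sketch below combines clustering with tree induction by growing one tree per k-means cluster and routing each test point to the tree of its nearest centroid. This is only a hypothetical arrangement assuming scikit-learn and its wine dataset, not the Clus-DTI algorithm itself.

```python
# A minimal sketch of clustering-assisted tree induction: partition the training
# data with k-means and grow a separate decision tree per cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_tr)
trees = {c: DecisionTreeClassifier(random_state=0).fit(X_tr[km.labels_ == c],
                                                       y_tr[km.labels_ == c])
         for c in range(km.n_clusters)}

clusters = km.predict(X_te)                       # route each test point
pred = np.array([trees[c].predict(x.reshape(1, -1))[0]
                 for c, x in zip(clusters, X_te)])
print(f"per-cluster-tree accuracy: {(pred == y_te).mean():.3f}")
```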
Improving learning accuracy of fuzzy decision trees by hybrid neural networks
2000
IEEE transactions on fuzzy systems
This paper proposes using a hybrid neural network to improve the learning accuracy of the Fuzzy ID3 algorithm, which is a popular and powerful method of fuzzy rule extraction without much computational effort ...
The synergy between fuzzy decision tree induction and hybrid neural network offers new insight into the construction of hybrid intelligent systems. ...
FUZZY ID3 ALGORITHM One popular and powerful heuristic method for generating crisp decision trees is called ID3. ...
doi:10.1109/91.873583
fatcat:gkdcjgyccvcz5psrirdzkymtfe
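The fuzzy entropy underlying Fuzzy ID3, referenced in the entry above, replaces example counts with sums of membership degrees. The sketch below computes it on made-up membership values; the hybrid neural-network refinement from the paper is not shown.

```python
# A minimal sketch of fuzzy entropy: at a node, examples belong with a
# membership degree in [0, 1], and class frequencies are sums of memberships
# rather than counts.
import numpy as np

def fuzzy_entropy(memberships, labels):
    memberships = np.asarray(memberships, dtype=float)
    labels = np.asarray(labels)
    total = memberships.sum()
    ent = 0.0
    for c in np.unique(labels):
        p = memberships[labels == c].sum() / total   # fuzzy relative frequency
        if p > 0:
            ent -= p * np.log2(p)
    return ent

mu = [0.9, 0.7, 0.2, 0.4, 1.0]            # hypothetical membership degrees
cls = ["yes", "yes", "no", "no", "yes"]
print(f"fuzzy entropy: {fuzzy_entropy(mu, cls):.3f}")
```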
ConfDTree: Improving Decision Trees Using Confidence Intervals
2012
2012 IEEE 12th International Conference on Data Mining
The experimental study indicates that the proposed post-processing method consistently and significantly improves the predictive performance of decision trees, particularly for small, imbalanced or multi-class ...
In this paper we present ConfDTree, a post-processing method which enables decision trees to better classify outlier instances. ...
For the binary datasets both the ConfDTree COMB and ConfDTree NORM outperformed the original version of the decision tree. ...
doi:10.1109/icdm.2012.19
dblp:conf/icdm/KatzSRO12
fatcat:z6apzb4lb5axtidqaptzu4epza
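ConfDTree attaches confidence intervals to decision nodes. The sketch below is only a hypothetical illustration of that general idea, not the paper's exact procedure: when a test value falls inside an uncertainty band around a numeric threshold, the two children's class distributions are blended instead of committing to one branch.

```python
# A minimal, hypothetical sketch of confidence-aware routing at a split node.
import numpy as np

def route_with_confidence(x_value, threshold, band, p_left, p_right):
    """p_left/p_right: class-probability vectors of the two children."""
    if x_value < threshold - band:
        return p_left
    if x_value > threshold + band:
        return p_right
    # Inside the uncertainty band: average the children (a simple choice).
    return 0.5 * (np.asarray(p_left) + np.asarray(p_right))

p_left, p_right = [0.9, 0.1], [0.2, 0.8]       # hypothetical leaf distributions
for x in (4.0, 5.05, 6.0):
    print(x, route_with_confidence(x, threshold=5.0, band=0.1,
                                   p_left=p_left, p_right=p_right))
```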
An Improved Collaborative Pruning Using Ant Colony Optimization and Pessimistic Technique of C5.0 Decision Tree Algorithm
2020
Zenodo
This paper presents a collaborative pruning model to improve on the classification efficiency of DTs. The model generates two forests using gain ratio: virgin and pessimistic pruned forest. ...
Besides, it achieved an accuracy 0.98% better than the closest C5.0 DT algorithm, especially on a relatively big dataset. Keywords: Decision Tree, Ant Colony, Pheromone Trails, Rule-Based, Forest ...
A technique known as pruning was then introduced, which has the potential to improve the generalization of decision trees and reduce their size. ...
doi:10.5281/zenodo.4427699
fatcat:5rod3hcr4rh7tki7qgkrthjs2a
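The pessimistic component of the pruning model above follows the C4.5 tradition of replacing a leaf's observed error rate with a binomial upper confidence limit. The sketch below implements that standard estimate (the ant-colony part of the paper is not shown); the leaf counts are illustrative.

```python
# A minimal sketch of the pessimistic error estimate used in C4.5-style pruning:
# the observed error rate at a leaf is replaced by the upper limit of a binomial
# confidence interval, so small leaves are penalised and may be pruned.
import math

def pessimistic_error(errors, n, z=0.6745):   # z for C4.5's default 25% CF
    f = errors / n
    num = f + z * z / (2 * n) + z * math.sqrt(f / n - f * f / n + z * z / (4 * n * n))
    return num / (1 + z * z / n)

# A leaf covering 6 examples with 1 error looks good (observed error 0.167),
# but its pessimistic estimate is much higher, which can justify pruning it.
print(f"observed error:    {1/6:.3f}")
print(f"pessimistic error: {pessimistic_error(1, 6):.3f}")
```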
Theoretical Study of Decision Tree Algorithms to Identify Pivotal Factors for Performance Improvement: A Review
2016
International Journal of Computer Applications
A variety of decision tree algorithms are proposed in the literature like ID3 (Iterative Dichotomiser 3), C4.5 (successor of ID3), CART (Classification and Regression tree), CHAID (Chi-squared Automatic ...
Decision tree is a data mining technique used for the classification and forecasting of the data. ...
That is why continuous enhancement is needed in the field of decision tree generation. ...
doi:10.5120/ijca2016909926
fatcat:z3hcbr66inaalay2flia2obfyi
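Since the review above contrasts the ID3/C4.5 family with CART, a side-by-side look at their impurity measures may help. The sketch below compares entropy and the Gini index on a few hand-picked class distributions.

```python
# A minimal sketch contrasting the two impurity measures behind the listed
# algorithm families: entropy (ID3/C4.5) and the Gini index (CART).
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def gini(p):
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

for dist in ([0.5, 0.5], [0.9, 0.1], [1.0, 0.0]):
    print(f"p={dist}: entropy={entropy(dist):.3f}, gini={gini(dist):.3f}")
```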
FDT 2.0: Improving scalability of the fuzzy decision tree induction tool - integrating database storage
2014
2014 IEEE Symposium on Computational Intelligence in Healthcare and e-health (CICARE)
The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. ...
FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need ...
Acknowledgments This work was supported in part by the Georgia Cancer Coalition (RWH is a Georgia Cancer Scholar) and the Georgia State University Molecular Basis of Disease Initiative. ...
doi:10.1109/cicare.2014.7007853
pmid:29226916
pmcid:PMC5721675
dblp:conf/cicare/DurhamYH14
fatcat:hb7r27qq6rdapguiga7geh5tnm
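As a loose, generic illustration of the kind of database integration this entry describes (not the FDT 2.0 architecture), the sketch below persists a fitted scikit-learn tree into an in-memory SQLite table and reloads it without retraining.

```python
# A minimal, generic sketch of storing a trained tree in a database.
import pickle
import sqlite3
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
model = DecisionTreeClassifier(random_state=0).fit(X, y)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE models (name TEXT PRIMARY KEY, blob BLOB)")
conn.execute("INSERT INTO models VALUES (?, ?)", ("tree_v1", pickle.dumps(model)))

blob, = conn.execute("SELECT blob FROM models WHERE name = ?", ("tree_v1",)).fetchone()
restored = pickle.loads(blob)                 # reload without retraining
print(f"restored model accuracy on training data: {restored.score(X, y):.3f}")
conn.close()
```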
Real Time and Offline Network Intrusion Detection using Improved Decision Tree Algorithm
2012
International Journal of Computer Applications
Experimental results show that this improved decision tree classifier gives more effective decision rules compared to existing decision tree techniques like the ID3 and C4.5 algorithms. ...
In this paper, an improved decision tree is implemented in order to detect network attacks like TCP SYN, Ping of Death, and ARP Spoof attacks. ...
A decision tree can be expressed as a recursive partition of the instance data space. ...
doi:10.5120/7541-0482
fatcat:z4gmzy3pmfbohnkzypy75lvhiu
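The last snippet above views a decision tree as a recursive partition of the instance space. The sketch below walks such a partition with a hand-built tree; the feature names, thresholds, and attack labels are hypothetical, not the paper's induced rules.

```python
# A minimal sketch of a decision tree as nested threshold tests, evaluated by
# walking from the root to a leaf.
tree = {
    "feature": "syn_rate", "threshold": 100.0,
    "left":  {"leaf": "normal"},
    "right": {"feature": "packet_size", "threshold": 64.0,
              "left":  {"leaf": "syn_flood"},
              "right": {"leaf": "normal"}},
}

def predict(node, sample):
    if "leaf" in node:
        return node["leaf"]
    branch = "left" if sample[node["feature"]] <= node["threshold"] else "right"
    return predict(node[branch], sample)

print(predict(tree, {"syn_rate": 250.0, "packet_size": 40.0}))   # -> syn_flood
print(predict(tree, {"syn_rate": 20.0,  "packet_size": 500.0}))  # -> normal
```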
Improve Decision Trees for Probability-Based Ranking by Lazy Learners
2006
Proceedings - International Conference on Tools with Artificial Intelligence, TAI
This paper aims to improve the ranking performance under decision-tree paradigms by presenting two new models. ...
Existing work shows that classic decision trees have inherent deficiencies in obtaining a good probability-based ranking (e.g. AUC). ...
The final version is C4.4. They also pointed out that bagging, an ensemble method, could greatly improve decision trees in terms of probability-based ranking. Ferri et al. ...
doi:10.1109/ictai.2006.65
dblp:conf/ictai/LiangY06
fatcat:binl4buyk5dmtgjwcwbpibcx5m
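C4.4, cited in the entry above, improves probability-based ranking partly by smoothing leaf frequencies with the Laplace correction, so small leaves do not emit extreme scores of 0 or 1. The sketch below shows that correction on an illustrative leaf.

```python
# A minimal sketch of Laplace-corrected leaf probabilities for ranking (AUC).
def laplace_probability(pos, total, num_classes=2):
    return (pos + 1) / (total + num_classes)

# A leaf with 3 positives out of 3 examples: raw estimate 1.0, smoothed 0.8.
print(f"raw:     {3 / 3:.3f}")
print(f"laplace: {laplace_probability(3, 3):.3f}")
```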
Extracting Useful Rules Through Improved Decision Tree Induction Using Information Entropy
2013
International Journal of Information Sciences and Techniques
We suggest improvements to the existing C4.5 decision tree algorithm. ...
Modified DMQL queries are used to understand and explore the shortcomings of the decision trees generated by the C4.5 classifier for an education dataset, and the results are compared with the proposed approach ...
The C4.5 classifier [1], [2], a well-liked, tree-based classifier, is used to generate a decision tree from a set of training examples. ...
doi:10.5121/ijist.2013.3103
fatcat:hhtzcojxgvavrndb3bif32oer4
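Because this entry proposes refinements to C4.5's entropy-based splitting, the sketch below recalls C4.5's baseline gain-ratio criterion on a made-up categorical attribute; the proposed improvements themselves are not reproduced here.

```python
# A minimal sketch of C4.5's gain-ratio criterion: information gain normalised
# by the split information, penalising attributes with many distinct values.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(feature_values, labels):
    gain = entropy(labels)
    split_info = 0.0
    for v in np.unique(feature_values):
        mask = feature_values == v
        w = mask.mean()
        gain -= w * entropy(labels[mask])
        split_info -= w * np.log2(w)
    return gain / split_info if split_info > 0 else 0.0

grade = np.array(["A", "A", "B", "B", "C", "C"])     # hypothetical attribute
passed = np.array(["yes", "yes", "yes", "no", "no", "no"])
print(f"gain ratio: {gain_ratio(grade, passed):.3f}")
```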
Splitting matters: how monotone transformation of predictor variables may improve the predictions of decision tree models
[article]
2016
arXiv
pre-print
It is widely believed that the prediction accuracy of decision tree models is invariant under any strictly monotone transformation of the individual predictor variables. ...
Accordingly, this study provides guidelines for both developers and users of decision tree models (including bagging and random forest). ...
Usama M. Fayyad and Keki B. Irani. On the handling of continuous-valued attributes in decision tree generation. ...
arXiv:1611.04561v1
fatcat:pehyxvda6ncgpkz7qsbnyqnfvm
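The claim examined in this preprint can be demonstrated in a few lines. The sketch below, assuming scikit-learn, fits one tree on a raw predictor and one on its log transform; because split thresholds sit at midpoints between observed values, the two trees can disagree on an unseen input even though the transform is strictly monotone.

```python
# A minimal sketch of how a strictly monotone (but nonlinear) transform can
# move a tree's effective cut point and change predictions for unseen inputs.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

x = np.array([[1.0], [100.0]])
y = np.array([0.0, 1.0])

tree_raw = DecisionTreeRegressor().fit(x, y)             # threshold at (1+100)/2 = 50.5
tree_log = DecisionTreeRegressor().fit(np.log10(x), y)   # threshold at (0+2)/2 = 1, i.e. x = 10

x_new = 20.0
print("raw-scale tree:", tree_raw.predict([[x_new]])[0])            # 0.0
print("log-scale tree:", tree_log.predict([[np.log10(x_new)]])[0])  # 1.0
```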
Extracting Useful Rules Through Improved Decision Tree Induction Using Information Entropy
2012
International Journal of Information Sciences and Techniques
We suggest improvements to the existing C4.5 decision tree algorithm. ...
Modified DMQL queries are used to understand and explore the shortcomings of the decision trees generated by the C4.5 classifier for an education dataset, and the results are compared with the proposed approach ...
The C4.5 classifier [1], [2], a well-liked, tree-based classifier, is used to generate a decision tree from a set of training examples. ...
doi:10.5121/ijist.2012.2608
fatcat:nseyoo6ybbdaplqbjsoswswvsu
Showing results 1 — 15 out of 3,464 results