
Performance Evaluation of Classification Algorithms on Different Data Sets

Meenu Gupta, Deepak Dahiya
2016 Indian Journal of Science and Technology  
Further, the results can be used to select the best classification technique among NB, decision tree and lazy classifiers for use with different data sets.  ...  Methods/Statistical Analysis: Usually, the selection of classification techniques, such as Naive Bayes (NB), Decision Tree (DT), Lazy Classifiers (LC), Support Vector Machine, etc., depends on the type  ...  Three popular classification techniques, NB, decision tree and lazy classifiers, are used for this study.  ... 
doi:10.17485/ijst/2016/v9i40/99425 fatcat:cxkrtaeowff7de2ytpyftepc7q
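The study above compares NB, decision tree and lazy classifiers in Weka. As a rough, non-authoritative analogue, the sketch below runs the same three families through 10-fold cross-validation with scikit-learn; the Iris data and k=3 are placeholder assumptions, not the paper's setup.

```python
# Minimal comparison sketch (not the paper's Weka experiment): 10-fold
# cross-validation of a Naive Bayes, a decision tree and a lazy k-NN classifier.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
classifiers = {
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Lazy k-NN": KNeighborsClassifier(n_neighbors=3),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```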

Adjusting Dependence Relations for Semi-Lazy TAN Classifiers [chapter]

Zhihai Wang, Geoffrey I. Webb, Fei Zheng
2003 Lecture Notes in Computer Science  
In this paper, we analyze the implementations of two different TAN classifiers and their tree structures. Experiments show how different dependence relations impact on accuracy of TAN classifiers.  ...  Our extensive experimental results show that this kind of semi-lazy classifier delivers lower error than the original TAN and is more efficient than Superparent TAN.  ...  They build tree-augmented Bayesian classifiers based on a given set of training instances at training time, and classify a new unlabelled instance directly using the classifiers at classification time. We  ... 
doi:10.1007/978-3-540-24581-0_38 fatcat:g6xgrvervzblfmrfgk4r4nytje
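For context on the entry above: a tree-augmented naive Bayes (TAN) classifier lets each attribute depend on the class and on at most one other attribute, its parent in an augmenting tree. A standard statement of the TAN classification rule (not specific to the paper's two implementations) is:

```latex
\hat{c}(x) \;=\; \arg\max_{c}\; P(c)\, P\!\left(x_r \mid c\right) \prod_{i \neq r} P\!\left(x_i \mid c,\, x_{\pi(i)}\right)
```

where $x_r$ is the root attribute and $\pi(i)$ denotes the attribute parent of $x_i$; roughly speaking, the different dependence relations compared in the paper correspond to different choices of $\pi$.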

COMPARISON OF CLASSIFICATION TECHNIQUES ON HEART DISEASE DATA SET

K. K. Revathi
2017 International Journal of Advanced Research in Computer Science  
In today's world, all fields extract useful knowledge from data; Data Mining is an analytic process designed to explore data. Classification analysis is one of the main techniques used in Data Mining.  ...  Our paper presents a comparative study of commonly used machine learning algorithms for detecting heart disease.  ...  In this research, we have analyzed three classifier families, namely Bayesian, lazy and trees.  ... 
doi:10.26483/ijarcs.v8i9.4870 fatcat:ydacxohetrdubfjlld6e2ynvfy

Lazy Associative Classification

Adriano Veloso, Wagner Meira Jr., Mohammed Zaki
2006 IEEE International Conference on Data Mining. Proceedings  
against a decision tree classifier.  ...  Lazy (non-eager) associative classification overcomes this problem by focusing on the features of the given test instance, increasing the chance of generating more rules that are useful for classifying  ...  Then $p_i = s_i/|S|$ denotes the probability of class $c_i$ in $S$. The entropy of $S$ is then given as $E(S) = -\sum_i p_i \log p_i$.  ... 
doi:10.1109/icdm.2006.96 dblp:conf/icdm/VelosoMZ06 fatcat:hnouixkhunhpdjt4zr54yjaaee
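A quick worked version of the entropy expression quoted in the snippet above, assuming the usual Shannon form with base-2 logarithms (the labels are made up):

```python
# Shannon entropy of a class distribution: p_i = count_i / |S|,
# E(S) = -sum_i p_i * log2(p_i).
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(entropy(["a", "a", "b", "b"]))  # 1.0 for a 50/50 split
```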

A Comparative Analysis of Data Mining Techniques on Breast Cancer Diagnosis Data using WEKA Toolbox

Majdah Alshammari, Mohammad Mezher
2020 International Journal of Advanced Computer Science and Applications  
Results conclude that the Lazy IBK (k-NN) classifier can achieve 98% accuracy, the highest among the compared classifiers.  ...  The introduced algorithms were: Naïve Bayes, Logistic Regression, Lazy IBK (Instance-Based learning with parameter K), Lazy Kstar, Lazy Locally Weighted Learner, Rules ZeroR, Decision Stump, Decision Trees  ...  [Table fragment: Trees classifiers J48, RandomForest and RandomTree (Table IV).]  ... 
doi:10.14569/ijacsa.2020.0110829 fatcat:36gmbyobjjbnhh4xhmmckrglim
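As a hedged illustration of the headline result above, the sketch below uses scikit-learn's KNeighborsClassifier as a stand-in for Weka's Lazy IBk on the Wisconsin breast cancer data; the 70/30 split and k=3 are assumptions, so it will not reproduce the reported 98% figure exactly.

```python
# k-NN (lazy, instance-based) classifier on the Wisconsin breast cancer data.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, knn.predict(X_te)):.3f}")
```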

Improved Class Probability Estimates from Decision Tree Models [chapter]

Dragos D. Margineantu, Thomas G. Dietterich
2003 Nonlinear Estimation and Classification  
This paper introduces a new algorithm, Bagged Lazy Option Trees (B-LOTs), for constructing decision trees and compares it to an alternative, Bagged Probability Estimation Trees (B-PETs).  ...  Decision tree models typically give good classification decisions but poor probability estimates. In many applications, it is important to have good probability estimates as well.  ...  Using such a mixture model, the class conditional distribution can be computed as $\hat{P}(y \mid x) = \hat{P}(y)\,\hat{P}(x \mid y) \,/\, \sum_{y'} \hat{P}(y')\,\hat{P}(x \mid y')$.  ... 
doi:10.1007/978-0-387-21579-2_10 fatcat:ipe2ptqbm5gw7fhrm5hsza4r4a
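A small numeric check of the reconstructed formula above, with made-up priors and likelihoods for two classes:

```python
# Posterior class probabilities by Bayes' rule:
# P(y|x) = P(y) P(x|y) / sum_y' P(y') P(x|y'); numbers are illustrative only.
import numpy as np

priors = np.array([0.7, 0.3])        # P(y) for two classes (assumed)
likelihoods = np.array([0.2, 0.6])   # P(x|y) for one observation x (assumed)
posterior = priors * likelihoods
posterior /= posterior.sum()         # normalise over classes
print(posterior)                     # [0.4375 0.5625]
```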

Learning to predict channel stability using biogeomorphic features

Stephanie L. Moret, William T. Langford, Dragos D. Margineantu
2006 Ecological Modelling  
In particular, we use bagged lazy option trees (LOTs) and bagged probability estimation trees (PETs) to identify all unstable channels while making the smallest number of errors in classifying stable channels  ...  We measured the performance of the classifiers using ROC curves and found that the PETs performed better than the LOTs in situations where the number of instances of the stable and unstable classes were  ...  Lazy trees: A combination of the lazy learning idea and decision tree algorithms is an algorithm called lazy decision trees, introduced by Friedman et al. (1996).  ... 
doi:10.1016/j.ecolmodel.2005.08.011 fatcat:54zk4rugi5hfxknx6s5lchwv44
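The entry above evaluates classifiers with ROC curves on imbalanced stable/unstable classes. Below is a minimal sketch of that kind of evaluation, using a random forest on synthetic imbalanced data as a stand-in for the paper's bagged LOT/PET models; dataset and model are assumptions.

```python
# ROC curve and AUC for a probabilistic classifier on imbalanced data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import auc, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]        # probability of the rare class
fpr, tpr, _ = roc_curve(y_te, scores)
print(f"AUC = {auc(fpr, tpr):.3f}")
```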

Lazy Learning of Bayesian Rules

Zijian Zheng, Geoffrey I. Webb
2012 Machine Learning  
tree learning algorithm, a constructive Bayesian classifier that eliminates attributes and constructs new attributes using Cartesian products of existing nominal attributes, and a lazy decision tree learning  ...  This paper proposes the application of lazy learning techniques to Bayesian tree induction and presents the resulting lazy Bayesian rule learning algorithm, called Lbr.  ...  Lbr uses lazy learning.  ... 
doi:10.1023/a:1007613203719 fatcat:krd3z63krnaipilj76l742f6d4

Analisis Perbandingan Kinerja Algoritma Naïve Bayes, Decision Tree-J48 dan Lazy-IBK (Comparative Performance Analysis of the Naïve Bayes, Decision Tree-J48 and Lazy-IBK Algorithms)

Indra Rukmana, Arvin Rasheda, Faiz Fathulhuda, Muh Rizky Cahyadi, Fitriyani Fitriyani
2021 JURNAL MEDIA INFORMATIKA BUDIDARMA  
Weka software version 3.8.5 is used to carry out the classification algorithm testing.  ...  This study uses the Breast Cancer and Thoracic Surgery datasets, which were downloaded from the UCI Machine Learning Repository website.  ...  The algorithms used are Naïve Bayes, Lazy-IBK, and Decision Tree-J48.  ... 
doi:10.30865/mib.v5i3.3055 fatcat:ogzxsdsbznaj3d2jr6bxsejxr4

Simple Test Strategies for Cost-Sensitive Decision Trees [chapter]

Shengli Sheng, Charles X. Ling, Qiang Yang
2005 Lecture Notes in Computer Science  
In particular, we first propose a lazy decision tree learning algorithm that minimizes the total cost of tests and misclassifications.  ...  We study cost-sensitive learning of decision trees that incorporate both test costs and misclassification costs.  ...  In addition, lazy trees built for the same set of unknown attributes are identical, so frequently used trees can be cached in memory as a speed trade-off.  ... 
doi:10.1007/11564096_36 fatcat:j56ps3ndu5f2tplzvtkzpzsxl4
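The cost-sensitive setting above ultimately reduces each prediction to minimizing expected cost given the tree's class-probability estimates. A minimal sketch of that decision step, with an assumed (not the paper's) cost matrix:

```python
# Cost-sensitive prediction: choose the class j minimising the expected
# misclassification cost sum_i p_i * cost[i][j]. Cost values are assumptions.
import numpy as np

# cost[i][j] = cost of predicting class j when the true class is i
cost = np.array([[0.0, 1.0],    # true class 0
                 [5.0, 0.0]])   # true class 1 (missing it is expensive)

def min_cost_prediction(class_probs, cost_matrix):
    """Return the class index with the lowest expected cost."""
    expected = class_probs @ cost_matrix
    return int(np.argmin(expected))

print(min_cost_prediction(np.array([0.7, 0.3]), cost))  # predicts class 1 here
```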

A Conjoint Application of Data Mining Techniques for Analysis of Global Terrorist Attacks -- Prevention and Prediction for Combating Terrorism [article]

Vivek Kumar, Manuel Mazzara, Maj. Gen. Angelo Messina, JooYoung Lee
2019 arXiv   pre-print
It analyzes the performance of classifiers such as Lazy Tree, Multilayer Perceptron, Multiclass and Naïve Bayes classifiers for observing the trends for terrorist attacks around the world.  ...  Results and Discussion: The classifiers used for this data are: Lazy classifier IBK linear NN, Lazy classifier IBK Filtered Neighbor Search, Lazy classifier IBK Ball Tree, Lazy classifier K-star, Decision  ...  [Chart fragment: Result of Lazy Classifier IBK; accuracy axis from 88% to 94% for the Lazy IBK Ball Tree, Decision Tree and Lazy IBK classifiers.]  ... 
arXiv:1901.06483v3 fatcat:5gvzhttavzgj3nuylwadgctoly

Comparing the Performance of Different Data Mining Techniques in Evaluating Loan Applications

Arash Riasi, Deshen Wang
2016 International Business Research  
AD Tree.  ...  This study compares the performance of various data mining classifiers in order to find out which classifiers should be used for predicting whether a loan application will be approved or rejected.  ...  The 57 classifiers which we used in our study are categorized into 6 different groups by Weka. These six groups are: Bayes, rules, meta, functions, lazy, and trees.  ... 
doi:10.5539/ibr.v9n7p164 fatcat:glfuib6e4rb37lntzpvfmmemim

Prediction of Malignant & Benign Breast Cancer: A Data Mining Approach in Healthcare Applications [article]

Vivek Kumar, Brojo Kishore Mishra, Manuel Mazzara, Dang N. H. Thanh, Abhishek Verma
2019 arXiv   pre-print
The performances of these twelve algorithms: Ada Boost M1, Decision Table, J-Rip, Lazy IBK, Logistic Regression, Multiclass Classifier, Multilayer Perceptron, Naïve Bayes, Random Forest and Random Tree  ...  The Breast Cancer Wisconsin data set from the UCI repository has been used as the experimental dataset, with the attribute clump thickness being used as an evaluation class.  ...  Ada Boost M1, Decision Table, J-Rip, J48, Lazy IBK, Lazy K-star, Logistic Regression, Multiclass Classifier, Multilayer Perceptron, Naïve Bayes, Random Forest, Random Tree  ... 
arXiv:1902.03825v4 fatcat:lcq72mmbmrgdzifomn4jtktnc4

Classification accuracy performance of Naïve Bayesian (NB), Bayesian Networks (BN), Lazy Learning of Bayesian Rules (LBR) and Instance-Based Learner (IB1) - comparative study

Ayse Cufoglu, Mahi Lohi, Kambiz Madani
2008 2008 International Conference on Computer Engineering & Systems  
The following formula shows the LBR Bayes rule used for classification [17]: $P(C_i \mid V_1 \wedge V_2) = P(C_i \mid V_2)\,P(V_1 \mid C_i \wedge V_2)\,/\,P(V_1 \mid V_2)$ (3). Here $V_1$ and $V_2$ are any conjunction  ...  Furthermore, we know that the LBR classifier was proposed to improve the performance of the NB classifier by applying a lazy algorithm to the NB classifier.  ... 
doi:10.1109/icces.2008.4772998 fatcat:bp4kyxwb25byfbuwg2uor6sknq
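A toy frequency-count estimate of the reconstructed LBR rule above; the DataFrame, its columns and the binary indicators for whether each conjunction holds are all made up for illustration:

```python
# Estimate P(C | V1 ∧ V2) = P(C | V2) * P(V1 | C ∧ V2) / P(V1 | V2)
# from simple frequency counts on toy data.
import pandas as pd

df = pd.DataFrame({
    "V1": [1, 1, 0, 1, 0, 1],   # does the first conjunction hold?
    "V2": [1, 1, 1, 0, 1, 1],   # does the second conjunction hold?
    "C":  ["a", "a", "b", "a", "b", "a"],
})

v2 = df[df.V2 == 1]
p_c_given_v2 = (v2.C == "a").mean()                    # P(C=a | V2)
p_v1_given_c_v2 = (v2[v2.C == "a"].V1 == 1).mean()     # P(V1 | C=a, V2)
p_v1_given_v2 = (v2.V1 == 1).mean()                    # P(V1 | V2)
print(p_c_given_v2 * p_v1_given_c_v2 / p_v1_given_v2)  # P(C=a | V1, V2) = 1.0
```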

Global Optimization in Learning with Important Data: an FCA-Based Approach

Yury Kashnitsky, Sergei O. Kuznetsov
2016 International Conference on Concept Lattices and their Applications  
In this paper, we propose an FCA-based lazy classification technique where each test instance is classified with a set of the best (in terms of some information-based criterion) rules.  ...  Though decision trees are not accurate on their own, they make very good base learners for advanced tree-based methods such as random forests and gradient boosted trees.  ... 
dblp:conf/cla/KashnitskyK16 fatcat:k53lu2nnpvgftdyys5fjq364ta
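As a loose illustration of the lazy, instance-specific rule selection described above (scoring candidate rules by confidence rather than the paper's FCA-based, information-based criterion, and with made-up data), consider:

```python
# For a given test instance, enumerate rule antecedents drawn from its own
# attribute values, score each candidate rule on the training data, and
# classify with the best-scoring rule.
from collections import Counter
from itertools import combinations

train = [({"colour": "red", "shape": "round"}, "apple"),
         ({"colour": "red", "shape": "long"}, "pepper"),
         ({"colour": "green", "shape": "round"}, "apple")]
test = {"colour": "red", "shape": "round"}

best = None
for r in range(1, len(test) + 1):
    for keys in combinations(test, r):
        matches = [c for x, c in train if all(x.get(k) == test[k] for k in keys)]
        if not matches:
            continue
        label, count = Counter(matches).most_common(1)[0]
        confidence = count / len(matches)
        if best is None or confidence > best[0]:
            best = (confidence, label)
print(best)  # (1.0, 'apple')
```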
Showing results 1 — 15 out of 13,817 results