18 Hits in 2.6 sec

NBDT: Neural-Backed Decision Trees [article]

Alvin Wan, Lisa Dunlap, Daniel Ho, Jihan Yin, Scott Lee, Henry Jin, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
2021 arXiv   pre-print
Code and pretrained NBDTs are at https://github.com/alvinwan/neural-backed-decision-trees.  ...  We forgo this dilemma by jointly improving accuracy and interpretability using Neural-Backed Decision Trees (NBDTs).  ...  In short, soft inference can tolerate mistakes in highly uncertain decisions. METHOD Neural-Backed Decision Trees (NBDTs) replace a network's final linear layer with a decision tree.  ... 
arXiv:2004.00221v3 fatcat:u5rzlprh45hpjeefskjg2tvgpq
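The snippet above says NBDTs replace a network's final linear layer with a decision tree and use soft inference over the tree. A minimal numpy sketch of that idea, with hypothetical weights and a hand-built two-level hierarchy (not the authors' implementation; inner-node representatives are assumed to be means of their children's weight vectors):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical rows of a network's final linear layer, one per class.
class_weights = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])

# Induced hierarchy: inner-node representatives are means of the
# child classes' weight vectors.
left = class_weights[:2].mean(axis=0)   # groups classes 0, 1
right = class_weights[2:].mean(axis=0)  # groups classes 2, 3

def soft_inference(feature):
    # Root decision: inner products with the two child representatives.
    p_root = softmax(np.array([feature @ left, feature @ right]))
    # Leaf decisions within each branch, again via inner products.
    p_left = softmax(class_weights[:2] @ feature)
    p_right = softmax(class_weights[2:] @ feature)
    # A leaf's probability is the product of decisions along its path,
    # so no single uncertain decision can zero out a class.
    return np.concatenate([p_root[0] * p_left, p_root[1] * p_right])

probs = soft_inference(np.array([1.0, 0.2]))
assert np.isclose(probs.sum(), 1.0)
```

Because every root-to-leaf path keeps nonzero probability, a mistake at an uncertain inner decision can still be outvoted downstream, which is the tolerance the snippet refers to.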

NBDT: Neural-Backed Decision Tree

Alvin Wan, Lisa Dunlap, Daniel Ho, Jihan Yin, Scott Lee, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
2021 International Conference on Learning Representations  
Code and pretrained NBDTs are at github.com/alvinwan/neural-backed-decision-trees. ⇤ denotes equal contribution  ...  We forgo this dilemma by jointly improving accuracy and interpretability using Neural-Backed Decision Trees (NBDTs).  ...  In short, soft inference can tolerate mistakes in highly uncertain decisions. METHOD Neural-Backed Decision Trees (NBDTs) replace a network's final linear layer with a decision tree.  ... 
dblp:conf/iclr/WanDHYLPBG21 fatcat:tjaz6xabzrb5bnypiv4ptvxw2i

SegNBDT: Visual Decision Rules for Segmentation [article]

Alvin Wan, Daniel Ho, Younjin Song, Henk Tillman, Sarah Adel Bargal, Joseph E. Gonzalez
2020 arXiv   pre-print
We obtain semantic visual meaning by extending saliency methods to segmentation and attain accuracy by leveraging insights from neural-backed decision trees, a deep learning analog of decision trees for  ...  To address this, prior work combines neural networks with decision trees.  ...  Neural-Backed Decision Trees for Segmentation Neural-backed decision trees (NBDT) [32] are decision trees that accept neural features as input and use inner products for each decision rule [17].  ... 
arXiv:2006.06868v1 fatcat:4if6moi6nje3pbckhp2r3oxqwq

Making CNNs Interpretable by Building Dynamic Sequential Decision Forests with Top-down Hierarchy Learning [article]

Yilin Wang, Shaozuo Yu, Xiaokang Yang, Wei Shen
2021 arXiv   pre-print
We achieve this by building a differentiable decision forest on top of CNNs, which enjoys two characteristics: 1) During training, the tree hierarchies of the forest are learned in a top-down manner under  ...  the guidance from the category semantics embedded in the pre-trained CNN weights; 2) During inference, a single decision tree is dynamically selected from the forest for each input sample, enabling the  ...  We compare our dDSDF with the original neural network (NN) and two deep-decision-tree based methods, deep Neural Decision Forest (dNDF) [14] and Neural-Backed Decision Tree (NBDT) [35] .  ... 
arXiv:2106.02824v1 fatcat:fjbh476ikfearorl75hi56csw4

SONG: Self-Organizing Neural Graphs [article]

Łukasz Struski, Tomasz Danel, Marek Śmieja, Jacek Tabor, Bartosz Zieliński
2021 arXiv   pre-print
Recent years have seen a surge in research on deep interpretable neural networks with decision trees as one of the most commonly incorporated tools.  ...  However, one of the well-known drawbacks of decision trees, as compared to decision graphs, is that decision trees cannot reuse the decision nodes.  ...  SONG generalizes methods like Soft Decision Tree (SDT) [8] and Neural-Backed Decision Trees (NBDT) [38] and as a differentiable solution is applicable to any deep learning pipeline ( Figure 1 ).  ... 
arXiv:2107.13214v1 fatcat:47o7qt6mrjgmnpc7rgwc4sylbe

Dive into Decision Trees and Forests: A Theoretical Demonstration [article]

Jinxiong Zhang
2021 arXiv   pre-print
Based on decision trees, many fields have arguably made tremendous progress in recent years.  ...  In simple words, decision trees use the strategy of "divide-and-conquer" to divide the complex problem on the dependency between input features and labels into smaller ones.  ...  Neural-backed decision trees Neural-Backed Decision Trees (NBDTs) [41] are proposed to leverage the powerful feature representation of convolutional neural networks and the inherent interpretability of  ... 
arXiv:2101.08656v1 fatcat:lksqf44i7fh2jfnqlvdspddzba

A Hybrid Intelligent Approach for Network Intrusion Detection

Mrutyunjaya Panda, Ajith Abraham, Manas Ranjan Patra
2012 Procedia Engineering  
In this paper, we propose to use a hybrid intelligent approach using a combination of classifiers in order to make the decision intelligently, so that the overall performance of the resultant model is enhanced  ...  and stability in comparison to the back-propagation neural network, decision tree, and Naïve Bayes.  ...  Panda and Patra used a hybrid NBDT by combining Naïve Bayes with a decision tree along with AdaBoost to produce the best detection rate with a low false positive rate in designing IDS.  ... 
doi:10.1016/j.proeng.2012.01.827 fatcat:ttfziakfijgstkgneimp23lm6e

TUTOR: Training Neural Networks Using Decision Rules as Model Priors [article]

Shayan Hassantabar, Prerit Terway, Niraj K. Jha
2022 arXiv   pre-print
On the other hand, deep neural networks (DNNs) generally need large amounts of data and computational resources for training. However, this requirement is not met in many settings.  ...  Neural-backed decision trees (NBDTs) [14] combine decision trees with a DNN to enhance DNN explainability. NBDTs achieve performance close to that of the DNN while improving explainability.  ...  This approach uses back-propagation to learn the parameter to split at a node. Deep neural decision trees [16] use a soft binning function modeled by a DNN to split nodes into multiple leaves.  ... 
arXiv:2010.05429v3 fatcat:qzczgytwq5h4vmf6nianj2laa4
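The TUTOR snippet mentions that deep neural decision trees [16] split nodes with a differentiable soft binning function. A hedged numpy sketch of that soft-binning construction (cut points and temperature are hypothetical values, not from the paper):

```python
import numpy as np

def soft_bin(x, cut_points, temperature=0.1):
    """Differentiable binning (sketch of the soft-binning idea).

    With n sorted cut points, a scalar x is mapped to a softmax over
    n + 1 bins; as temperature -> 0 this approaches hard binning.
    """
    cut_points = np.sort(cut_points)
    n = len(cut_points)
    w = np.arange(n + 1, dtype=float)                    # slopes 0, 1, ..., n
    b = np.concatenate([[0.0], -np.cumsum(cut_points)])  # one bias per bin
    logits = (w * x + b) / temperature
    e = np.exp(logits - logits.max())
    return e / e.sum()

p = soft_bin(0.5, cut_points=np.array([0.0, 1.0]))
# x = 0.5 lies between the two cut points, so the middle bin dominates.
assert p.argmax() == 1
```

The trick is that the bin whose linear function `w_i * x + b_i` is largest at a given `x` is exactly the interval containing `x`, so the softmax softly selects the right bin while staying differentiable in the cut points.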

CDT: Cascading Decision Trees for Explainable Reinforcement Learning [article]

Zihan Ding, Pablo Hernandez-Leal, Gavin Weiguang Ding, Changjian Li, Ruitong Huang
2021 arXiv   pre-print
However, explaining the policy of RL agents remains an open problem due to several factors, one being the complexity of explaining neural network decisions.  ...  Soft decision trees (SDTs) and discretized differentiable decision trees (DDTs) have been demonstrated to achieve both good performance and share the benefit of having explainable policies.  ...  Wan et al. (2020) propose the neural-backed decision tree (NBDT), which transfers the final fully connected layer of a NN into a DT with induced hierarchies for ease of interpretation, but shares  ... 
arXiv:2011.07553v2 fatcat:2kezrg5c4jbkzaxxotnngfqedu
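The soft decision trees (SDTs) mentioned in the snippet route each input through every inner node with a sigmoid gate, and a leaf's probability is the product of the gate values along its path. A minimal sketch with hypothetical gate parameters (a depth-2 tree; not any paper's trained model):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical linear gates (weight vector, bias) per inner node; the
# gate value is the probability of routing left at that node.
gates = {
    "root": (np.array([1.0, -1.0]), 0.0),
    "l":    (np.array([0.5,  0.5]), 0.1),
    "r":    (np.array([-0.5, 1.0]), -0.1),
}

def leaf_probs(x):
    g_root = sigmoid(gates["root"][0] @ x + gates["root"][1])
    g_l = sigmoid(gates["l"][0] @ x + gates["l"][1])
    g_r = sigmoid(gates["r"][0] @ x + gates["r"][1])
    # Four leaves: LL, LR, RL, RR; each probability is the product of
    # the gating decisions along its root-to-leaf path.
    return np.array([
        g_root * g_l,
        g_root * (1 - g_l),
        (1 - g_root) * g_r,
        (1 - g_root) * (1 - g_r),
    ])

p = leaf_probs(np.array([0.3, -0.2]))
assert np.isclose(p.sum(), 1.0)
```

Since every operation is smooth, the gate parameters can be trained by gradient descent, which is what makes such trees usable as drop-in explainable policies.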

Concept Embedding Analysis: A Review [article]

Gesina Schwalbe
2022 arXiv   pre-print
Deep neural networks (DNNs) have found their way into many applications with potential impact on the safety, security, and fairness of human-machine systems.  ...  Other than previous k-means based methods, Neural-Backed Decision Trees (NBDT) (Wan et al, 2020) uses hierarchical agglomerative clustering (HAC) to find a hierarchy of CAVs in the last DNN hidden layer.  ...  NBDT (Wan et al, 2020): decision tree (semantic hierarchies)  ... 
arXiv:2203.13909v1 fatcat:tpu2eobm2zeo7evuerphqoj7di
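The survey snippet notes that NBDT builds its class hierarchy by hierarchical agglomerative clustering over last-layer vectors. A small numpy-only sketch of centroid-linkage agglomerative clustering on hypothetical per-class weight vectors (illustrative, not the paper's code):

```python
import numpy as np

# Hypothetical last-layer weight rows, one vector per class.
vecs = {i: v for i, v in enumerate(np.array(
    [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]))}
tree = {i: i for i in vecs}  # cluster id -> nested tuple of class ids
next_id = len(vecs)

# Repeatedly merge the two closest clusters (centroid linkage); the
# merge order defines a binary hierarchy over the classes.
while len(vecs) > 1:
    ids = list(vecs)
    pairs = [(np.linalg.norm(vecs[a] - vecs[b]), a, b)
             for i, a in enumerate(ids) for b in ids[i + 1:]]
    _, a, b = min(pairs)
    vecs[next_id] = (vecs.pop(a) + vecs.pop(b)) / 2
    tree[next_id] = (tree.pop(a), tree.pop(b))
    next_id += 1

hierarchy = tree[next_id - 1]
```

Here the two "cat-like" weight rows (classes 0 and 1) and the two "dog-like" rows (classes 2 and 3) merge first, so the resulting tree pairs semantically similar classes under shared inner nodes, which is the interpretability payoff of the induced hierarchy.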

Coarse-to-Fine Curriculum Learning [article]

Otilia Stretcu, Emmanouil Antonios Platanios, Tom M. Mitchell, Barnabás Póczos
2021 arXiv   pre-print
For the Resnet and WideResnet experiments we replicated the setting of [59], available at https://github.com/alvinwan/neural-backed-decision-trees, in order to directly compare with their results.  ...  A tree-based prior over the last layer of a neural network is used to encourage classes closer in the hierarchy to have similar parameters.  ... 
arXiv:2106.04072v1 fatcat:4n5iku6od5gzbks543gmq54gui

A Neural Tangent Kernel Perspective of Infinite Tree Ensembles [article]

Ryuichi Kanoh, Mahito Sugiyama
2022 arXiv   pre-print
In practical situations, the tree ensemble is one of the most popular models along with neural networks. A soft tree is a variant of a decision tree.  ...  By considering an ensemble of infinite soft trees, this paper introduces and studies the Tree Neural Tangent Kernel (TNTK), which provides new insights into the behavior of the infinite ensemble of soft  ...  NBDT: Neural-Backed Decision Tree. In International Conference on Learning Representations, 2021. Christopher K. I. Williams. Computing with Infinite Networks.  ... 
arXiv:2109.04983v2 fatcat:cnf5h4hmw5bz5nxetfkcbhu7sq

Harnessing value from data science in business: ensuring explainability and fairness of solutions [article]

Krzysztof Chomiak, Michał Miktus
2021 arXiv   pre-print
Wan et al. (2021) proposed the Neural-Backed Decision Trees (NBDT) technique to estimate a decision tree augmented with the information obtained from a simultaneous neural network.  ...  Implemented in Python: https://github.com/alvinwan/neural-backed-decision-trees  ... 
arXiv:2108.07714v1 fatcat:s36ftwpzyvbaxnawhtcdtpyobe

A Comprehensive Taxonomy for Explainable Artificial Intelligence: A Systematic Survey of Surveys on Methods and Concepts [article]

Gesina Schwalbe, Bettina Finzel
2022 arXiv   pre-print
NBDT (Wan et al, 2020) . A similar approach is followed by NBDT (Neural-Backed Decision Trees).  ...  The first approaches aiming at making neural network decisions transparent for debugging purposes date back to the mid 90's.  ... 
arXiv:2105.07190v3 fatcat:zy7vl6o4gzcbrpqkrxeyazyeuq

Application of a New Feature Generation Algorithm in Intrusion Detection System

Yingchun Niu, Chengdong Chen, Xuehua Zhang, Xiaoguang Zhou, Hongjie Liu, Wenjuan Li
2022 Wireless Communications and Mobile Computing  
For example, Panda proposed a new hybrid data mining method [7], specifically combining Naïve Bayes and decision trees to propose the NBDT algorithm, combining the decision tree with the farthest traversal  ...  First, we use the CFS method for feature processing on the KDD99 dataset and then use XGBoost, decision trees, gradient boosting trees, random forests, and Bagging classifiers (XGB, DT, GBDT, RF, and BC)  ... 
doi:10.1155/2022/3794579 fatcat:7czffbp4tnejpmedaopkfhfsc4
Showing results 1 — 15 out of 18 results