Semi-Supervised Learning in Medical Images Through Graph-Embedded Random Forest

Lin Gu, Xiaowei Zhang, Shaodi You, Shen Zhao, Zhenzhong Liu, Tatsuya Harada
2020 Frontiers in Neuroinformatics  
We identify the key bottleneck of the random forest to be the information gain calculation and replace it with a graph-embedded entropy, which is more reliable in scenarios with insufficient labeled data.  ...  and robustness against over-fitting.  ...  From Figure 2, we found that Stage 3, optimal parameter selection, is the performance bottleneck of splitting-node construction, which is also the keystone of random forest construction (Liu et  ...
doi:10.3389/fninf.2020.601829 pmid:33240071 pmcid:PMC7683389 fatcat:kyfy3gw6fjdhfol5kblkgkbcku
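The entry above replaces the information-gain criterion with a graph-embedded entropy so that unlabeled points influence node splitting. The paper's exact formulation is not given in the snippet; the sketch below is one plausible reading, assuming a k-NN graph over the node's samples and one step of label propagation before the entropy is computed (all function names are mine):

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a class-probability vector (0 log 0 := 0)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def graph_smoothed_entropy(X, y, n_classes, k=3):
    """Entropy of the node's label distribution after one step of
    propagation over a k-NN graph; unlabeled points (y == -1) pick up
    soft labels from their neighbours, so they influence the split score
    even though they carry no label themselves."""
    n = len(X)
    # soft label matrix: one-hot for labeled rows, uniform for unlabeled
    F = np.full((n, n_classes), 1.0 / n_classes)
    for i, yi in enumerate(y):
        if yi >= 0:
            F[i] = np.eye(n_classes)[yi]
    # pairwise Euclidean distances, self-distance masked out
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    for i in range(n):
        nbrs = np.argsort(D[i])[:k]
        if y[i] < 0:                      # propagate to unlabeled points only
            F[i] = F[nbrs].mean(axis=0)
    p = F.mean(axis=0)
    return entropy(p / p.sum())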

Financial Data Anomaly Detection Method Based on Decision Tree and Random Forest Algorithm

Qingyang Zhang, Miaochao Chen
2022 Journal of Mathematics  
Simulation experiments show that, compared with the other two distance-based abnormal-sample detection techniques, random-forest-based abnormal-sample detection has greater advantages than the other  ...  In this paper, the random forest algorithm is introduced into the detection of abnormal samples, and the concept of an abnormal point scale is proposed to measure the degree of abnormality of a sample based on  ...  For example, it takes only a little over ten seconds to construct a random forest of 1,000 trees for a dataset of 5,000 samples.  ...
doi:10.1155/2022/9135117 fatcat:ci5mjbwlyngzvb4s2v2emdja2e
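The snippet above mentions an "abnormal point scale" computed from a random forest but does not give its formula. A standard random-tree anomaly score with the same flavour is isolation depth: anomalies are separated from the rest of the data in fewer random splits. The sketch below uses that as a stand-in (function names and parameters are mine, not the paper's):

```python
import random

def isolation_depth(point, data, depth=0, max_depth=8):
    """Path length of `point` under one random isolation tree built on
    `data`: pick a random dimension and a random cut, recurse into the
    side containing the point. Shorter paths mean more abnormal."""
    if depth >= max_depth or len(data) <= 1:
        return depth
    dim = random.randrange(len(point))
    lo = min(row[dim] for row in data)
    hi = max(row[dim] for row in data)
    if lo == hi:
        return depth
    cut = random.uniform(lo, hi)
    # keep only the rows on the same side of the cut as the query point
    side = [row for row in data if (row[dim] < cut) == (point[dim] < cut)]
    return isolation_depth(point, side, depth + 1, max_depth)

def anomaly_score(point, data, n_trees=100):
    """Average isolation depth over a small forest; lower = more anomalous."""
    return sum(isolation_depth(point, data) for _ in range(n_trees)) / n_trees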

A New Semisupervised-Entropy Framework of Hyperspectral Image Classification Based on Random Forest

Mengmeng Sun, Chunyang Wang, Shuangting Wang, Zongze Zhao, Xiao Li
2018 Advances in Multimedia  
The framework is composed of five parts: (1) random sample selection; (2) probabilistic-output initial random forest classification based on the number of votes; (3) semisupervised classification  ...  The purpose of the algorithm presented in this paper is to select the features with the highest average separability, using the random forest method, to distinguish categories that are easy to distinguish  ...  In order to ensure the random forest classifier performs well, the number of variables used in a random forest decision tree (N-tree) and Table 2: The process of semisupervised classification of random  ...
doi:10.1155/2018/3521720 fatcat:imtl5u5uljegxaqsys2dptqnbm

A Comparative Study on Serial Decision Tree Classification Algorithms in Text Mining

Khaled M. Almunirawi, Ashraf Y. A. Maghari
2016 International Journal of Intelligent Computing Research  
We found that the Random Forest classifier is the most accurate among the compared classifiers.  ...  In this paper, we review various decision tree algorithms and their limitations, and conduct a comparative study evaluating their accuracy, learning time, and tree size on four  ...  In a random forest, each node is split using the best split among a subset of predictors randomly chosen at that node.  ...
doi:10.20533/ijicr.2042.4655.2016.0093 fatcat:is7fo23yvnffti4rylxln6i72m
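The last snippet above states the defining trick of a random forest: at each node, only a random subset of predictors is searched for the best split. A minimal sketch of that per-node search, using Gini impurity (the criterion is my choice; the snippet does not name one):

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(1.0 - np.sum(p ** 2))

def best_split(X, y, n_sub, rng):
    """Best (feature, threshold) by Gini gain, searching only a random
    subset of n_sub features -- the per-node randomisation that
    distinguishes a random forest from plain bagged trees."""
    n, d = X.shape
    parent = gini(y)
    best = (None, None, 0.0)              # (feature, threshold, gain)
    for j in rng.choice(d, size=n_sub, replace=False):
        for t in np.unique(X[:, j])[:-1]:
            left = y[X[:, j] <= t]
            right = y[X[:, j] > t]
            child = (len(left) * gini(left) + len(right) * gini(right)) / n
            gain = parent - child
            if gain > best[2]:
                best = (int(j), float(t), gain)
    return best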

Empirical Analysis of Financial Statement Fraud of Listed Companies Based on Logistic Regression and Random Forest Algorithm

Xinchun Liu, Miaochao Chen
2021 Journal of Mathematics  
Through empirical analysis of logistic regression, gradient boosting decision tree, and random forest models, preliminary results are obtained, and the random forest model is then used for secondary  ...  In this paper, two machine learning algorithms, decision tree and random forest, are used to detect the company's financial data.  ...  X represents the training data set constructed by the random forest, which is a multidimensional vector set.  ...
doi:10.1155/2021/9241338 fatcat:ho5pttcxlzajnmr5m6ok5qbe44

Weakly Supervised Classification of Objects in Images Using Soft Random Forests [chapter]

Riwal Lefort, Ronan Fablet, Jean-Marc Boucher
2010 Lecture Notes in Computer Science  
The development of robust classification models is among the important issues in computer vision.  ...  This paper deals with weakly supervised learning, which generalizes supervised and semi-supervised learning.  ...  Random forests combine a "bagging" procedure [22] and the random selection of a subset of descriptors at each node [23].  ...
doi:10.1007/978-3-642-15561-1_14 fatcat:d7tsp3opqjer3hjwngxjiiczmi

Semi-supervised Node Splitting for Random Forest Construction

Xiao Liu, Mingli Song, Dacheng Tao, Zicheng Liu, Luming Zhang, Chun Chen, Jiajun Bu
2013 2013 IEEE Conference on Computer Vision and Pattern Recognition  
Node splitting is an important issue in Random Forest, but robust splitting requires a large number of training samples.  ...  In this paper, we present semi-supervised splitting to overcome this limitation by splitting nodes with the guidance of both labeled and unlabeled data.  ...  Figure 3: The classification accuracy of Random Forests with traditional splitting criteria (dashed lines) and the proposed semisupervised splitting (solid lines).  ...
doi:10.1109/cvpr.2013.70 dblp:conf/cvpr/LiuSTLZCB13 fatcat:ywawii4ocredpml2cld6pcxtty
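The entry above splits nodes using both labeled and unlabeled data. The snippet does not give the criterion, but one common way to realise this idea is to score each candidate threshold by information gain on the labeled subset plus a variance-reduction term over all samples, so unlabeled points break ties when labels are scarce. A sketch under that assumption (the `lam` weighting is mine, not the paper's):

```python
import numpy as np

def _entropy(y):
    """Shannon entropy of a label array (0 for an empty array)."""
    if len(y) == 0:
        return 0.0
    _, c = np.unique(y, return_counts=True)
    p = c / c.sum()
    return float(-np.sum(p * np.log2(p)))

def _var(v):
    return float(np.var(v)) if len(v) else 0.0

def semi_supervised_split_score(x, y, t, lam=0.5):
    """Score threshold t on 1-D feature x. Labeled points (y != -1)
    contribute information gain; ALL points, labeled or not, contribute
    a variance-reduction term, so unlabeled data guides the split."""
    left, right = x <= t, x > t
    lab = y != -1
    # supervised part: information gain on the labeled subset
    yl = y[lab]
    gain = 0.0
    if len(yl) > 1:
        ll, lr = y[left & lab], y[right & lab]
        gain = _entropy(yl) - (len(ll) * _entropy(ll) + len(lr) * _entropy(lr)) / len(yl)
    # unsupervised part: variance reduction over all samples
    child_var = (left.sum() * _var(x[left]) + right.sum() * _var(x[right])) / len(x)
    return gain + lam * (_var(x) - child_var)
```

With one labeled point per class and two tight clusters, a threshold between the clusters and one inside a cluster can have identical labeled gain; the variance term over the unlabeled points is what prefers the former.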

3D Features for Human Action Recognition with Semi-supervised Learning

Suraj Sahoo, Samit Ari, Ulli Srinivasu
2019 IET Image Processing  
These local features are used to build the trees of the random forest. During tree building, a semi-supervised learning scheme is proposed for better splitting of data points at each node.  ...  To recognise an action, mutual information is estimated for all extracted interest points with respect to each trained class by passing them through the random forest.  ...  The mutual information is estimated using semi-supervised random forest voting. The random forest is constructed by building M independent decision trees.  ...
doi:10.1049/iet-ipr.2018.6045 fatcat:vqpek22slvaulmm5yqcguyzlwq

Semi-Supervised Nonlinear Distance Metric Learning via Forests of Max-Margin Cluster Hierarchies [article]

David M. Johnson, Caiming Xiong, Jason J. Corso
2014 arXiv   pre-print
We propose a novel nonlinear metric learning method that uses an iterative, hierarchical variant of semi-supervised max-margin clustering to construct a forest of cluster hierarchies, where each individual  ...  By introducing randomness during hierarchy training and combining the output of many of the resulting semi-random weak hierarchy metrics, we can obtain a powerful and robust nonlinear metric model.  ...  Our metric first constructs a model of the data by computing a forest of semi-random cluster hierarchies, where each tree is generated by iteratively applying a partially-randomized binary semi-supervised  ... 
arXiv:1402.5565v1 fatcat:awfqaihmzzhytkoji4unsndu4a

A Novel Computer-Aided Diagnosis Scheme on Small Annotated Set: G2C-CAD

Guangyuan Zheng, Guanghui Han, Nouman Q. Soomro, Linjuan Ma, Fuquan Zhang, Yanfeng Zhao, Xinming Zhao, Chunwu Zhou
2019 BioMed Research International  
Finally, we tested the fuzzy Co-forest and compared its performance with that of a C4.5 random decision forest and of the G2C-CAD system without the fuzzy scheme, using ROC curves and confusion matrices for evaluation  ...  Then, coupled with the simulated unlabeled samples generated by the trained DCGAN, we conducted iterative semisupervised learning, which continually improved the classification performance of the fuzzy  ...  Based on these fake samples, we conducted semisupervised learning with the fuzzy Co-forest and finally obtained a classifier with excellent performance.  ...
doi:10.1155/2019/6425963 pmid:31119180 pmcid:PMC6500711 fatcat:t7qdcifya5foboolooncela2ra

Latent Regression Forest: Structured Estimation of 3D Articulated Hand Posture

Danhang Tang, Hyung Jin Chang, Alykhan Tejani, Tae-Kyun Kim
2014 2014 IEEE Conference on Computer Vision and Pattern Recognition  
In this paper we present the Latent Regression Forest (LRF), a novel framework for real-time, 3D hand pose estimation from a single depth image.  ...  (ii) A new forest-based, discriminative framework for structured search in images, as well as an error regression step to avoid error accumulation.  ...  The splitting candidate, φ*_i, that gives the largest information gain is stored at the LRT node, which is a split node as in the standard Random Forest.  ...
doi:10.1109/cvpr.2014.490 dblp:conf/cvpr/TangCTK14 fatcat:hvpzrgh23baf7fq3gfdyxkx4lu

Semi-Supervised Streaming Learning with Emerging New Labels

Yong-Nan Zhu, Yu-Feng Li
In this paper, we tackle these issues with a new approach called SEEN, which consists of three major components: an effective novel-class detector based on clustering random trees, a robust classifier for  ...  Great efforts have been devoted recently to learning with novel concepts, typically in a supervised setting with completely supervised initialization.  ...  SEEN-Forest consists of many SEENTrees, and each SEENTree is built using a random subset of the input training set O of size φ. Algorithm 2 summarizes the construction of a SEENTree.  ...
doi:10.1609/aaai.v34i04.6186 fatcat:xjntj22r7fapxg3y5pc5nxbjgq

Weakly Supervised Learning: Application to Fish School Recognition [chapter]

Riwal Lefort, Ronan Fablet, Jean-Marc Boucher
2011 Studies in Computational Intelligence  
Experiments show that random forests outperform discriminative and generative models in supervised learning, but random forests are not robust to highly complex class proportions.  ...  Finally, a compromise is achieved by combining classifiers so as to keep the accuracy of random forests while exploiting the robustness of discriminative models.  ...  For random forests, the explanation is that the criterion used to find an acceptable split at the corresponding node m is not suited to prior labelling.  ...
doi:10.1007/978-3-642-11739-8_10 fatcat:4cctb63g4jfq3eojjfcz2zz4hu

Learning Convolutional Neural Networks for Object Detection with Very Little Training Data [chapter]

Christoph Reinders, Hanno Ackermann, Michael Ying Yang, Bodo Rosenhahn
2019 Multimodal Scene Understanding  
Random forests [17] have been shown to be robust classifiers even when little data is available. A random forest consists of multiple decision trees.  ...  As a result, random forests are still fast and additionally very robust to overfitting. An example of a decision tree and a random forest is shown in Fig. 4.8.  ...
doi:10.1016/b978-0-12-817358-9.00010-x fatcat:axggdj3vq5fpxev22v2x6657ai

SemiContour: A Semi-Supervised Learning Approach for Contour Detection

Zizhao Zhang, Fuyong Xing, Xiaoshuang Shi, Lin Yang
2016 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
...  their estimated structured labels to help the SRF perform better node splitting.  ...  Specifically, we propose a semi-supervised structured ensemble learning approach for contour detection built on structured random forests (SRF).  ...  The method is built on structured random forests (SRF), which have a learning procedure similar to that of the standard random forest (RF) [6].  ...
doi:10.1109/cvpr.2016.34 pmid:28496297 pmcid:PMC5423734 dblp:conf/cvpr/ZhangXSY16 fatcat:7rdp44gdi5fq5me2yafuiain64
Showing results 1 — 15 out of 489 results