207 Hits in 4.8 sec

Multiple Instance Cancer Detection by Boosting Regularised Trees [chapter]

Wenqi Li, Jianguo Zhang, Stephen J. McKenna
2015 Lecture Notes in Computer Science  
Regularised regression trees are then constructed and combined on the set of prototypes using a multiple instance boosting framework.  ...  We propose a novel multiple instance learning algorithm for cancer detection in histopathology images.  ...  Our algorithm extends Multiple Instance Boosting (MILBoosting) [17] by boosting regularised trees with instance-to-prototype distances as features.  ...
doi:10.1007/978-3-319-24553-9_79 fatcat:ntpauqxrhjc77moyprtsf56gxm
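
A minimal sketch of the general idea in this entry: represent each instance by its distances to a set of prototypes and classify bags with boosted trees. This is not the authors' regularised MILBoost; the k-means prototype selection, the max aggregation of instance scores, and the toy data are assumptions.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Toy bags: each bag is a set of instances (e.g. image patches); a bag is
# positive if it contains at least one "cancerous" instance.
def make_bag(positive):
    n = rng.integers(5, 15)
    X = rng.normal(0, 1, size=(n, 10))
    if positive:
        X[: rng.integers(1, 3)] += 3.0   # a few positive instances
    return X

bags = [make_bag(i % 2 == 0) for i in range(60)]
bag_labels = np.array([1 if i % 2 == 0 else 0 for i in range(60)])

# 1) Learn prototypes from all instances (assumption: k-means cluster centres).
all_instances = np.vstack(bags)
prototypes = KMeans(n_clusters=8, n_init=10, random_state=0).fit(all_instances).cluster_centers_

# 2) Represent every instance by its distances to the prototypes.
def to_distance_features(X):
    return np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)

# 3) Train boosted trees on instances labelled with their bag's label, then
#    aggregate instance scores to a bag score with a max (assumption).
X_inst = np.vstack([to_distance_features(b) for b in bags])
y_inst = np.concatenate([[lbl] * len(b) for b, lbl in zip(bags, bag_labels)])
clf = GradientBoostingClassifier(n_estimators=100, max_depth=2).fit(X_inst, y_inst)

bag_scores = [clf.predict_proba(to_distance_features(b))[:, 1].max() for b in bags]
print("bag-level accuracy:", np.mean((np.array(bag_scores) > 0.5) == bag_labels))

The max aggregation encodes the standard multiple-instance assumption that a bag is positive if at least one of its instances is.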

Performance analysis of cost-sensitive learning methods with application to imbalanced medical data

Ibomoiye Domor Mienye, Yanxia Sun
2021 Informatics in Medicine Unlocked  
This research focuses on developing robust cost-sensitive classifiers by modifying the objective functions of some famous algorithms, such as logistic regression, decision tree, extreme gradient boosting  ...  Four popular medical datasets, including the Pima Indians Diabetes, Haberman Breast Cancer, Cervical Cancer Risk Factors, and Chronic Kidney Disease datasets, are used in the experiments to validate the  ...  Cost-sensitive learning has also been utilized to detect breast cancer, one of the most prevalent cancers among women.  ... 
doi:10.1016/j.imu.2021.100690 fatcat:fbxsyvzv2zdvfcn6mina5nxdme
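
As a hedged illustration of the general cost-sensitive idea (class weights in the loss rather than the paper's specific modified objectives), a sketch with scikit-learn on an imbalanced toy dataset; the 9:1 cost ratio and hyperparameters are assumptions.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score, f1_score

# Imbalanced binary problem standing in for a medical dataset
# (roughly 9 negatives for every positive).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Misclassifying the minority (disease) class is made 9x more costly
# via class weights in the objective function.
cost = {0: 1.0, 1: 9.0}
models = {
    "logistic regression": LogisticRegression(max_iter=1000, class_weight=cost),
    "decision tree": DecisionTreeClassifier(max_depth=5, class_weight=cost),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: recall={recall_score(y_te, pred):.3f}  f1={f1_score(y_te, pred):.3f}")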

Early detection of type 2 diabetes mellitus using machine learning-based prediction models

Leon Kopitar, Primoz Kocbek, Leona Cilar, Aziz Sheikh, Gregor Stiglic
2020 Scientific Reports  
With 6 months of data available, the simple regression model achieved the lowest average RMSE of 0.838, followed by RF (0.842), LightGBM (0.846), Glmnet (0.859) and XGBoost (0.881).  ...  The work described in this article was supported by the Slovenian Research Agency (ARRS Grants P2-0057 and N2-0101), UM FHS Grant 073/217/2000-7/302 and by the European Union under the European Social  ...  Gradient boosting is a technique where new models are added to correct the errors made by existing models, in this case regression trees.  ...
doi:10.1038/s41598-020-68771-z pmid:32686721 fatcat:fo57bbloozdize2hva5l5wiesq
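
A minimal sketch of that residual-fitting idea for squared-error regression, with shallow regression trees as base learners; the toy data and hyperparameters are assumptions, not the study's setup.

import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.2, size=400)

# Gradient boosting for squared error: start from the mean, then repeatedly
# fit a small regression tree to the current residuals and add a damped
# version of its prediction to the running model.
learning_rate, n_rounds = 0.1, 200
prediction = np.full_like(y, y.mean())
trees = []
for _ in range(n_rounds):
    residuals = y - prediction                       # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

rmse = mean_squared_error(y, prediction) ** 0.5
print(f"training RMSE after {n_rounds} rounds: {rmse:.3f}")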

Machine Learning and Deep Learning Methods for Building Intelligent Systems in Medicine and Drug Discovery: A Comprehensive Survey [article]

G Jignesh Chowdary, Suganya G, Premalatha M, Asnath Victy Phamila Y, Karunamurthy K
2021 arXiv   pre-print
Many cancer-related deaths in men are due to prostate cancer. A neural network model was developed by Tsehay et al. (2017) to detect prostate cancers from multi-parametric MRI scan images.  ...  Neural network approaches were used by Ausawalaithong et al. (2018) to detect lung cancer from chest x-ray images.  ... 
arXiv:2107.14037v1 fatcat:2xb4vsemofci7c45ethux6y6aa

Ensemble deep learning: A review [article]

M.A. Ganaie and Minghui Hu and A.K. Malik and M. Tanveer and P.N. Suganthan
2022 arXiv   pre-print
The ensemble models are broadly categorised into bagging, boosting and stacking, negative correlation based deep ensemble models, explicit/implicit ensembles, and homogeneous/heterogeneous  ...  cancer and lymphoma patients.  ...  With a continuous increase in available data, there are problems that require assigning multiple labels to each instance.  ...
arXiv:2104.02395v2 fatcat:lq73jqso5vadvnqfnnmw4zul4q
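
A small sketch of one of those categories, a homogeneous bagging-style ensemble of neural networks: each member is trained on a bootstrap resample and their predicted probabilities are averaged. The dataset, network size, and ensemble size are assumptions.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rng = np.random.default_rng(0)
members = []
for seed in range(5):
    # Bagging: each member sees a bootstrap resample of the training set.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=seed)
    members.append(net.fit(X_tr[idx], y_tr[idx]))

# Homogeneous ensemble: average the members' class probabilities.
avg_proba = np.mean([m.predict_proba(X_te) for m in members], axis=0)
print("ensemble accuracy:", accuracy_score(y_te, avg_proba.argmax(axis=1)))
print("single-model accuracy:", accuracy_score(y_te, members[0].predict(X_te)))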

Detecting Hate Speech on Twitter Network using Ensemble Machine Learning

Raymond T Mutanga, Nalindren Naicker, Oludayo O Olugbara
2022 International Journal of Advanced Computer Science and Applications  
This study presents a voting ensemble machine learning method that harnesses the strengths of logistic regression, decision trees, and support vector machines for the automatic detection of hate speech  ...  Machine learning methods have been explored in numerous studies to address the inherent challenges of hate speech detection in online spaces.  ...  Popular ensemble learning methods include bagging, boosting, and stacking. Bagging minimises variance by combining the verdicts from different decision trees [21, 22].  ...
doi:10.14569/ijacsa.2022.0130341 fatcat:qptkfqybcbhp7gaomibjmxiqsm
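
A sketch of that kind of voting ensemble with scikit-learn, shown here on TF-IDF features of a tiny invented corpus rather than the study's Twitter data; the majority (hard) voting and the toy texts are assumptions.

from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier

texts = [
    "you people are worthless and should disappear",
    "I really enjoyed the match yesterday",
    "go back to where you came from",
    "what a lovely day for a walk in the park",
]
labels = [1, 0, 1, 0]   # 1 = hateful, 0 = not (toy labels)

# Majority vote over the three base learners named in the abstract.
ensemble = make_pipeline(
    TfidfVectorizer(),
    VotingClassifier(
        estimators=[
            ("lr", LogisticRegression(max_iter=1000)),
            ("dt", DecisionTreeClassifier(max_depth=3)),
            ("svm", SVC(kernel="linear")),
        ],
        voting="hard",
    ),
)
ensemble.fit(texts, labels)
print(ensemble.predict(["have a wonderful evening", "you are all subhuman trash"]))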

A review of evidence of health benefit from artificial neural networks in medical intervention

P.J.G. Lisboa
2002 Neural Networks  
This is followed by a survey of published Randomised Controlled Trials and Clinical Trials, leading to recommendations for good practice in the design and evaluation of neural networks for use in medical  ...  In a cancer mortality study, for instance, the event of interest might be death ascribed to breast cancer, with a period of study involving follow-up over five years.  ...  reporting the boost afforded over prevalence, or during training, by directly maximising predictive power rather than accuracy (Lisboa et al., 2000b).  ...
doi:10.1016/s0893-6080(01)00111-3 pmid:11958484 fatcat:oj73bxwth5ft3gd7y2wegr7lbe

A Review on Machine Learning and Deep Learning Techniques Applied to Liquid Biopsy [chapter]

Arets Paeglis, Boriss Strumfs, Dzeina Mezale, Ilze Fridrihsone
2018 Liquid Biopsy [Working Title]  
As new technologies for detecting and measuring biochemical markers from bodily fluid samples (e.g., microfluidics and labs-on-a-chip) revolutionise the industry of diagnostics and precision medicine,  ...  Mutation prediction and early lung cancer detection in liquid biopsy using convolutional neural networks. The proliferation of cancer cells is driven by specific somatic mutations in the cancer genome  ...  (ANN)-based model class used for early detection and diagnosis of lung cancer.  ...
doi:10.5772/intechopen.79404 fatcat:ydcnwek7argurcs67lrrwp7x5e

Machine Learning of Raman Spectroscopy Data for Classifying Cancers: A Review of the Recent Literature

Nathan Blake, Riana Gaifulina, Lewis D. Griffin, Ian M. Bell, Geraint M. H. Thomas
2022 Diagnostics  
We conduct a literature review to ascertain the recent machine learning methods used to classify cancers using Raman spectral data.  ...  Cancer (serum); boosted tree; not stated; 184 subjects/samples; 3 spectra per subject.  ...  Ito et al. developed a boosted tree model from serum samples taken from suspected colorectal cancer patients, classifying them into four categories: colorectal cancer, adenoma, hyperplastic polyps and  ...
doi:10.3390/diagnostics12061491 pmid:35741300 pmcid:PMC9222091 fatcat:a3vjjt4s7bb47j2vyb5bquyw24

Improved Heart Disease Prediction Using Particle Swarm Optimization Based Stacked Sparse Autoencoder

Ibomoiye Domor Mienye, Yanxia Sun
2021 Electronics  
The network consists of multiple sparse autoencoders and a softmax classifier.  ...  The optimization by the PSO improves the feature learning and classification performance of the SSAE.  ...  discriminant analysis (LDA), support vector machine (SVM), extreme gradient boosting (XGBoost), adaptive boosting (AdaBoost), random forest, softmax classifier, and some methods in recent literature.  ... 
doi:10.3390/electronics10192347 fatcat:6zszsocwhvgwbca6brcpeeno7y
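
A compact sketch of the sparse-autoencoder-plus-softmax part only. PyTorch is an assumption, the PSO hyperparameter search is omitted, and a single autoencoder with an L1 activity penalty stands in for the stacked network; the toy data is invented.

import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 13)                  # stand-in for 13 heart-disease features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).long()  # toy binary target

# Sparse autoencoder: reconstruction loss plus an L1 penalty on the hidden
# activations encourages a sparse learned representation.
encoder = nn.Sequential(nn.Linear(13, 8), nn.Sigmoid())
decoder = nn.Linear(8, 13)
ae_opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-2)
sparsity_weight = 1e-3
for _ in range(200):
    h = encoder(X)
    loss = nn.functional.mse_loss(decoder(h), X) + sparsity_weight * h.abs().mean()
    ae_opt.zero_grad(); loss.backward(); ae_opt.step()

# Softmax classifier trained on the frozen encoded features.
classifier = nn.Linear(8, 2)
clf_opt = torch.optim.Adam(classifier.parameters(), lr=1e-2)
with torch.no_grad():
    H = encoder(X)
for _ in range(200):
    loss = nn.functional.cross_entropy(classifier(H), y)
    clf_opt.zero_grad(); loss.backward(); clf_opt.step()

accuracy = (classifier(H).argmax(dim=1) == y).float().mean()
print(f"training accuracy: {accuracy:.3f}")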

Is rotation forest the best classifier for problems with continuous features? [article]

A. Bagnall, M. Flynn, J. Large, J. Line, A. Bostrom, G. Cawley
2020 arXiv   pre-print
Rotation forest is a tree-based ensemble that performs transforms on subsets of attributes prior to constructing each tree.  ...  We evaluate classifiers from three families of algorithms: support vector machines; tree-based ensembles; and neural networks tuned with a large grid search.  ...  In [31] rotation forest was applied to two well-known microarray cancer datasets (the breast cancer dataset and the prostate cancer dataset) and found to be more effective than bagging or boosting.  ...
arXiv:1809.06705v3 fatcat:xg2ohujljvg4rhihsenoggmbfa
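
A simplified sketch of that rotation idea, not the full Rodríguez et al. algorithm with class and sample subsampling: each tree gets its own random partition of the attributes, a PCA rotation is fitted per subset, and the tree is trained on the concatenated rotated features. The dataset, ensemble size, and number of subsets are assumptions.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rng = np.random.default_rng(0)
n_trees, n_subsets = 10, 3
ensemble = []
for _ in range(n_trees):
    # Random partition of the attributes into disjoint subsets.
    perm = rng.permutation(X_tr.shape[1])
    subsets = np.array_split(perm, n_subsets)
    # Fit one PCA rotation per subset on the training data only.
    rotations = [PCA().fit(X_tr[:, s]) for s in subsets]
    def rotate(X, subsets=subsets, rotations=rotations):
        return np.hstack([r.transform(X[:, s]) for s, r in zip(subsets, rotations)])
    tree = DecisionTreeClassifier(random_state=0).fit(rotate(X_tr), y_tr)
    ensemble.append((rotate, tree))

# Average the trees' class-probability estimates over all rotations.
proba = np.mean([tree.predict_proba(rotate(X_te)) for rotate, tree in ensemble], axis=0)
print("rotation-style ensemble accuracy:", (proba.argmax(axis=1) == y_te).mean())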

Supervised Learning Models for the Preliminary Detection of COVID-19 in Patients Using Demographic and Epidemiological Parameters

Aditya Pradhan, Srikanth Prabhu, Krishnaraj Chadaga, Saptarshi Sengupta, Gopal Nath
2022 Information  
is the tree (regularisation function). • AdaBoost: Adaptive boosting, also referred to as AdaBoost, is a machine learning approach that uses the ensemble methodology [73].  ...  The 'Tree-Explainer' procedure is mainly used with tree-based classifiers such as decision trees, random forests and other boosting algorithms.  ...
doi:10.3390/info13070330 fatcat:usw2wn2ylrebzckjas7aococae
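
A brief sketch of the two boosting set-ups the snippet refers to, with XGBoost's tree regularisation terms set explicitly and SHAP's TreeExplainer applied to the fitted booster. The dataset and parameter values are assumptions, and the xgboost and shap packages are assumed to be installed.

import shap
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# XGBoost with its tree regularisation terms made explicit:
# gamma penalises extra leaves, reg_lambda penalises large leaf weights.
booster = xgb.XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1,
                            gamma=1.0, reg_lambda=1.0)
booster.fit(X_tr, y_tr)
print("XGBoost accuracy:", booster.score(X_te, y_te))

# AdaBoost: an ensemble of weak learners reweighted towards hard examples.
ada = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AdaBoost accuracy:", ada.score(X_te, y_te))

# Tree-based SHAP explanations for the boosted-tree model.
shap_values = shap.TreeExplainer(booster).shap_values(X_te)
print("SHAP values shape:", shap_values.shape)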

Ada-WHIPS: explaining AdaBoost classification with applications in the health sciences

Julian Hatwell, Mohamed Medhat Gaber, R. Muhammad Atif Azad
2020 BMC Medical Informatics and Decision Making  
We compare the explanations generated by our novel approach with the state of the art in an experimental study.  ...  Using a novel formulation, Ada-WHIPS uniquely redistributes the weights among individual decision nodes of the internal decision trees of the AdaBoost model.  ...  The easiest-to-classify instances can be explained by traversing the tree, while hard-to-classify instances are left to the SVM, which remains a black box.  ...
doi:10.1186/s12911-020-01201-2 pmid:33008388 fatcat:zl2i4a6yibeofd4hgkpvcae67q
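
This is not Ada-WHIPS itself, but a sketch of the raw material it works with: the internal decision trees of a fitted scikit-learn AdaBoost model, their boosting weights, and their node-level splits. The dataset and the printing format are assumptions.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier

data = load_breast_cancer()
X, y, feature_names = data.data, data.target, data.feature_names

# Default AdaBoost members are shallow decision trees, each with a boosting
# weight; their decision nodes (feature, threshold) are what a rule-based
# explanation method has to operate on.
ada = AdaBoostClassifier(n_estimators=25, random_state=0).fit(X, y)

for i, (tree, weight) in enumerate(zip(ada.estimators_[:3], ada.estimator_weights_[:3])):
    t = tree.tree_
    print(f"tree {i} (boosting weight {weight:.3f}):")
    for node in range(t.node_count):
        if t.children_left[node] != t.children_right[node]:  # internal (split) node
            print(f"  node {node}: {feature_names[t.feature[node]]} <= {t.threshold[node]:.3f}")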

A Survey of Current Resources to Study lncRNA-Protein Interactions

Melcy Philip, Tyrone Chen, Sonika Tyagi
2021 Non-Coding RNA  
Phenotypes are driven by regulated gene expression, which in turn is mediated by complex interactions between diverse biological molecules.  ...  Their importance is further highlighted by their conservation across kingdoms. Hence, interest in LPI research is increasing.  ...  This work was supported by the MASSIVE HPC facility (www.massive.org.au, accessed on 27 May 2021).  ...
doi:10.3390/ncrna7020033 pmid:34201302 fatcat:psr5mlwzlzgyvdbpsdnzzj6lvq

Machine learning and deep learning predictive models for type 2 diabetes: a systematic review

Luis Fregoso-Aparicio, Julieta Noguez, Luis Montesinos, José A. García-García
2021 Diabetology & Metabolic Syndrome  
The review primarily followed the PRISMA methodology, enriched with the approach proposed by Keele and Durham Universities.  ...  Eighteen different types of models were compared, with tree-based algorithms showing top performances. Deep Neural Networks proved suboptimal, despite their ability to deal with big and dirty data.  ...  LSR estimators minimize the sum of the squared errors (the differences between observed and predicted values). • Multiple Instance Learning boosting (MIL): The boosting algorithm sequentially trains  ...
doi:10.1186/s13098-021-00767-9 pmid:34930452 pmcid:PMC8686642 fatcat:ybn34nsodja5lcetnidihomdvy
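
For reference, the least-squares criterion the snippet describes is, in standard notation,

\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \bigl( y_i - \mathbf{x}_i^{\top}\beta \bigr)^2,

i.e. the estimator minimises the sum of squared differences between the observed values y_i and the predicted values x_i^T beta.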
Showing results 1–15 out of 207 results