10 Hits in 14.7 sec

DropConnected Neural Network Trained with Diverse Features for Classifying Heart Sounds

Edmund Kay, Anurag Agarwal
2016 2016 Computing in Cardiology Conference (CinC)   unpublished
A fully-connected, two-hidden-layer neural network trained by error backpropagation, and regularized with DropConnect is used to classify heart sounds as normal or abnormal.  ...  Features are extracted from the heart sounds using a wavelet transform, mel-frequency cepstral coefficients, inter-beat properties, and signal complexity.  ...  Classification is done using a fully-connected, two-hidden-layer neural network, trained with error backpropagation and regularized using DropConnect.  ... 
doi:10.22489/cinc.2016.181-266 fatcat:kpdwzg3x2jfszns3q3t5x7tw5q
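The architecture described in this abstract is simple enough to sketch. Below is a minimal, illustrative NumPy rendering of a fully-connected, two-hidden-layer classifier regularized with DropConnect; the layer sizes, tanh/sigmoid activations, keep probability, and inference-time scaling are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect_forward(x, W, b, p_keep=0.5, training=True):
    """Fully-connected layer with DropConnect regularization.

    Unlike Dropout, which zeroes whole activations, DropConnect zeroes
    individual weights with probability 1 - p_keep during training.
    """
    if training:
        mask = rng.random(W.shape) < p_keep      # Bernoulli mask per weight
        W_eff = W * mask
    else:
        W_eff = W * p_keep                       # crude mean-weight approximation at inference
    return np.tanh(x @ W_eff + b)

# Illustrative two-hidden-layer network; sizes are assumptions, not the paper's.
n_features, n_hidden = 60, 100
W1 = rng.normal(0, 0.1, (n_features, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_hidden));   b2 = np.zeros(n_hidden)
W3 = rng.normal(0, 0.1, (n_hidden, 1));          b3 = np.zeros(1)

def classify(x, training=False):
    h1 = dropconnect_forward(x, W1, b1, training=training)
    h2 = dropconnect_forward(h1, W2, b2, training=training)
    logit = h2 @ W3 + b3                         # output layer left unmasked here
    return 1.0 / (1.0 + np.exp(-logit))          # probability of "abnormal"
```

Note that the original DropConnect method (Wan et al., 2013) uses a Gaussian moment-matching approximation at test time; the simple scaling by the keep probability above is only a rough stand-in for that step.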

DropConnected neural networks trained on time-frequency and inter-beat features for classifying heart sounds

Edmund Kay, Anurag Agarwal
2017 Physiological Measurement  
They are then used as the input to a fully-connected, two-hidden-layer neural network, trained by error backpropagation, and regularized with DropConnect.  ...  An algorithm has been trained, on the PhysioNet open-access heart sounds database, to classify heart sounds as normal or abnormal.  ...  Classification is done using a fully-connected, two-hidden-layer neural network, trained with error backpropagation and regularized using DropConnect.  ... 
doi:10.1088/1361-6579/aa6a3d pmid:28758641 fatcat:fjogf6ivrbgoxmg3daieg34ojq
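Both of the entries above describe features drawn from a wavelet transform and mel-frequency cepstral coefficients before classification. A rough sketch of that kind of feature extraction is below; the library choices (librosa, PyWavelets), the 13-coefficient MFCC setting, the 'db4' wavelet, and the 2000 Hz sample rate are illustrative assumptions rather than the published pipeline, which also uses inter-beat properties and signal-complexity measures not shown here.

```python
import numpy as np
import librosa   # MFCC extraction
import pywt      # discrete wavelet transform

def heart_sound_features(signal, sr=2000):
    """Illustrative time-frequency features for one heart-sound recording."""
    # Mel-frequency cepstral coefficients, averaged over time frames.
    mfcc = librosa.feature.mfcc(y=np.asarray(signal, dtype=float), sr=sr, n_mfcc=13)
    mfcc_mean = mfcc.mean(axis=1)

    # Energy of wavelet detail coefficients at several decomposition levels.
    coeffs = pywt.wavedec(signal, 'db4', level=5)
    wavelet_energy = np.array([np.sum(c ** 2) for c in coeffs[1:]])

    return np.concatenate([mfcc_mean, wavelet_energy])
```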

Assessment of Dual-Tree Complex Wavelet Transform to Improve SNR in Collaboration with Neuro-Fuzzy System for Heart-Sound Identification

Bassam Al-Naami, Hossam Fraihat, Jamal Al-Nabulsi, Nasr Y. Gharaibeh, Paolo Visconti, Abdel-Razzak Al-Hinnawi
2022 Electronics  
The research paper proposes a novel denoising method to improve the outcome of heart-sound (HS)-based heart-condition identification by applying the dual-tree complex wavelet transform (DTCWT) together with the adaptive neuro-fuzzy inference system (ANFIS) classifier.  ...  Acknowledgments: We thank the international database named PhysioNet for providing an open source of medical data (biosignal recordings) in the form of an annual challenge.  ... 
doi:10.3390/electronics11060938 fatcat:mf7d6mvrvrbifkicyuhic4fkhy
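This abstract pairs DTCWT denoising with an ANFIS classifier. As a sketch of the denoising half only, the snippet below applies soft-threshold wavelet denoising using a plain real discrete wavelet transform via PyWavelets as a stand-in for the dual-tree complex wavelet transform; the 'db6' wavelet, decomposition level, and universal threshold are assumptions, and no ANFIS step is shown.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet='db6', level=5):
    """Soft-threshold wavelet denoising of a 1-D heart-sound signal (illustrative)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Robust noise estimate from the finest detail band (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    # Shrink detail coefficients toward zero; keep the approximation band untouched.
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft') for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]
```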

Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review

Waseem Rawat, Zenghui Wang
2017 Neural Computation  
Convolutional neural networks (CNNs) have been applied to visual tasks since the late 1980s.  ...  network renaissance that has seen rapid progression since 2012.  ...  DropConnect (Wan et al., 2013): ensemble of DropConnect networks, with data augmentation (no elastic distortions), 0.21. Notes: the italicized entries represent the state of the art for this configuration  ... 
doi:10.1162/neco_a_00990 pmid:28599112 fatcat:ospodd7lbzhsbl4fkjzvpe6jgy

Opportunities and obstacles for deep learning in biology and medicine

Travers Ching, Daniel S. Himmelstein, Brett K. Beaulieu-Jones, Alexandr A. Kalinin, Brian T. Do, Gregory P. Way, Enrico Ferrero, Paul-Michael Agapow, Michael Zietz, Michael M. Hoffman, Wei Xie, Gail L. Rosen (+24 others)
2018 Journal of the Royal Society Interface  
Furthermore, the limited amount of labelled data for training presents problems in some domains, as do legal and privacy constraints on work with sensitive health records.  ...  Though progress has been made linking a specific neural network's prediction to input features, understanding how users should interpret these models to make testable hypotheses about the system under  ...  for clarifying edits to the abstract and introduction and Robert Gieseke, Ruibang Luo, Stephen Ra, Sourav Singh and GitHub user snikumbh for correcting typos, formatting and references.  ... 
doi:10.1098/rsif.2017.0387 pmid:29618526 pmcid:PMC5938574 fatcat:65o4xmp53nc6zmj37srzuht6tq

Opportunities And Obstacles For Deep Learning In Biology And Medicine [article]

Travers Ching, Daniel S. Himmelstein, Brett K. Beaulieu-Jones, Alexandr A. Kalinin, Brian T. Do, Gregory P. Way, Enrico Ferrero, Paul-Michael Agapow, Michael Zietz, Michael M Hoffman, Wei Xie, Gail L. Rosen (+24 others)
2017 bioRxiv   pre-print
Furthermore, the limited amount of labeled data for training presents problems in some domains, as do legal and privacy constraints on work with sensitive health records.  ...  Nonetheless, we foresee deep learning powering changes at both bench and bedside with the potential to transform several areas of biology and medicine.  ...  We would like to thank Anna Greene for a careful proofreading of the manuscript in advance of the first submission.  ... 
doi:10.1101/142760 fatcat:l7zvbtbgxjamtd735vir2trw6q

Almost Sure Convergence of Dropout Algorithms for Neural Networks [article]

Albert Senen-Cerda, Jaron Sanders
2020 arXiv   pre-print
We investigate the convergence and convergence rate of stochastic training algorithms for Neural Networks (NNs) that, over the years, have spawned from Dropout (Hinton et al., 2012).  ...  We also establish an upper bound on the rate of convergence of Gradient Descent (GD) on the limiting ODEs of dropout algorithms for arborescences (a class of trees) of arbitrary depth and with linear activation  ...  Dropconnected neural network trained with diverse features for classifying heart sounds. In 2016 Computing in Cardiology Conference (CinC), pp. 617-620. IEEE, 2016.  ... 
arXiv:2002.02247v1 fatcat:qjqgugdeezax7at54uacqmp5ka
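The objects this abstract refers to, dropout-style stochastic training and its limiting ODE, can be written schematically as follows; the notation is illustrative and not the paper's exact statement.

```latex
% Dropout-style SGD: weights W_k are multiplied elementwise by i.i.d. Bernoulli(p) masks F_k.
\[
  W_{k+1} \;=\; W_k \;-\; \alpha_k \,\nabla_W \,\ell\!\bigl(Y_k,\ \mathrm{NN}(X_k;\ F_k \odot W_k)\bigr)
\]
% For vanishing step sizes \alpha_k, the iterates track the limiting ODE:
% gradient flow on the mask- and data-averaged loss.
\[
  \dot W(t) \;=\; -\,\nabla_W\, \mathbb{E}_{F,X,Y}\!\bigl[\ell\bigl(Y,\ \mathrm{NN}(X;\ F \odot W)\bigr)\bigr]
\]
```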

Deep Neural Architectures for Medical Image Semantic Segmentation: Review

Muhammad Zubair Khan, Mohan Kumar Gajendran, Yugyung Lee, Muazzam A. Khan
2021 IEEE Access  
DEEP NEURAL NETWORK ARCHITECTURES 1) Convolutional neural networks (CNNs) A convolutional neural network (CNN) is an advanced neural network architecture developed for analyzing two-dimensional images [50]  ...  This mainly featured deep convolutional networks or extensions like recurrent neural networks, generative adversarial networks for image restoration, classification, segmentation, compression, and registration  ... 
doi:10.1109/access.2021.3086530 fatcat:hacpqwdxybh63j5ygebqszm7qq
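As context for the review's definition of a CNN as an architecture for two-dimensional images, here is a minimal, illustrative PyTorch model; the layer counts, channel widths, and global-average-pooling head are arbitrary choices, not an architecture from the survey.

```python
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """Minimal convolutional network for 2-D images (illustrative only)."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 2-D pooling halves the spatial size
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pooling
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                         # x: (batch, 1, H, W)
        h = self.features(x).flatten(1)
        return self.classifier(h)

logits = TinyCNN()(torch.randn(4, 1, 64, 64))     # smoke test: returns (4, 2) logits
```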

Dynamic Mathematics for Automated Machine Learning Techniques [article]

Nicholas Kuo, University, The Australian National
2021
However, modern machine learning techniques such as backpropagation training were firmly established in 1986, while computer vision was revolutionised in 2012 with the introduction of AlexNet.  ...  This enabled us to extend meta-learning beyond network configuration for network pruning and continual learning.  ...  trained the classifier networks.  ... 
doi:10.25911/zmy2-7160 fatcat:flnkwfv33rbupg2e5m4twnbaie

Dagstuhl Reports, Volume 7, Issue 2, February 2017, Complete Issue [article]

2017
Backpropagation for energy-efficient neuromorphic computing. In NIPS 2015. L. Wan et al. Regularization of Neural Networks using DropConnect. In ICML 2013. P. A. Merolla et al.  ...  Consequently, tasks like face recognition and identification can be solved using powerful methods, like Convolutional Neural Networks [13], and millions of face images for training.  ...  People have become comfortable with searching. There is a need for more sophisticated tools. Google is not designed as a learning system and yet people use it for learning.  ... 
doi:10.4230/dagrep.7.2 fatcat:ydwbggkkxjfahomhv75rjaguiy