215,967 Hits in 5.0 sec

Investigating deep learning for fNIRS based BCI

Johannes Hennrich, Christian Herff, Dominic Heger, Tanja Schultz
2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)  
The feasibility of more complex and powerful classification approaches like Deep Neural Networks has, to the best of our knowledge, not been investigated for fNIRS based BCI.  ...  These networks have recently become increasingly popular, as they outperformed conventional machine learning methods for a variety of tasks, due in part to advances in training methods for neural networks  ...  For a more appropriate comparison we cross-validated a shrinkage-LDA on the same data as the deep neural network as a third method.  ... 
doi:10.1109/embc.2015.7318984 pmid:26736884 dblp:conf/embc/HennrichHHS15 fatcat:iguk7owtffg7bffhs626uedojy
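The snippet above mentions cross-validating a shrinkage-LDA as a baseline against the deep network. As a rough, hypothetical sketch (not the authors' actual pipeline; the data shapes and parameters here are invented), scikit-learn's LDA with automatic Ledoit-Wolf shrinkage can be cross-validated like this:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Toy two-class data standing in for fNIRS feature vectors (made-up shapes).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 10)), rng.normal(1, 1, (40, 10))])
y = np.array([0] * 40 + [1] * 40)

# Shrinkage-LDA: 'auto' applies Ledoit-Wolf shrinkage to the covariance
# estimate, which requires the 'lsqr' (or 'eigen') solver in scikit-learn.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

Shrinkage is useful here because BCI feature dimensionality is often large relative to the number of trials, making the plain covariance estimate unstable.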

Radiologists versus Deep Convolutional Neural Networks: A Comparative Study for Diagnosing COVID-19

Abdulkader Helwan, Mohammad Khaleel Sallam Ma'aitah, Hani Hamdan, Dilber Uzun Ozsahin, Ozum Tuncyurek, Waqas Haider Bangyal
2021 Computational and Mathematical Methods in Medicine  
In a test set of 250 images used to evaluate the deep neural networks and the radiologists, it was found that deep networks (ResNet-18, ResNet-50, and DenseNet-201) can outperform the radiologists in terms  ...  The reverse transcriptase polymerase chain reaction (RT-PCR) is still the routinely used test for the diagnosis of SARS-CoV-2 (COVID-19).  ...  Acknowledgments The authors would like to thank the two radiologists who helped in diagnosing the chest CT images for this study.  ... 
doi:10.1155/2021/5527271 pmid:34055034 pmcid:PMC8112196 fatcat:2dojwvhgqzbkjpub7ywpi4zcc4

Abstract: Adversarial Examples as Benchmark for Medical Imaging Neural Networks [chapter]

Magdalini Paschali, Sailesh Conjeti, Fernando Navarro, Nassir Navab
2019 Handbook of Experimental Pharmacology  
Traditionally, the performance of a deep learning model is evaluated on a test dataset, originating from the same distribution as the training set.  ...  Extensive evaluation was performed on state-of-the-art classification and segmentation deep neural networks, for the challenging tasks of fine-grained skin lesion classification and whole brain segmentation  ... 
doi:10.1007/978-3-658-25326-4_4 fatcat:bch6xie7tfebdbereerp4dwwmu

Deep Forest: Towards An Alternative to Deep Neural Networks

Zhi-Hua Zhou, Ji Feng
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
In this paper, we propose gcForest, a decision tree ensemble approach with performance highly competitive to deep neural networks in a broad range of tasks.  ...  In contrast to deep neural networks which require great effort in hyper-parameter tuning, gcForest is much easier to train; even when it is applied to different data across different domains in our experiments  ...  Our purpose of using ensemble, however, is quite different. We are aiming at an alternative to deep neural networks rather than a combination with deep neural networks.  ... 
doi:10.24963/ijcai.2017/497 dblp:conf/ijcai/ZhouF17 fatcat:63e5y4c6trdtlhu4z2dgngr6am

Air Quality Measurement Based on Double-Channel Convolutional Neural Network Ensemble Learning [article]

Zhenyu Wang, Wei Zheng, Chunfeng Song
2019 arXiv pre-print
Some air quality measurement algorithms related to deep learning mostly adopt a single convolutional neural network to train on the whole image, which ignores the differences between different parts of the  ...  In this paper, we propose a method for air quality measurement based on double-channel convolutional neural network ensemble learning to solve the problem of feature extraction for different parts of environmental  ...  Considering the different composition information of different parts of the environmental image, we constructed a double-channel convolutional neural network based on the method of deep convolutional neural  ... 
arXiv:1902.06942v3 fatcat:xobaseho25ezperksjyunkqyf4

Investigating Learning in Deep Neural Networks using Layer-Wise Weight Change [article]

Ayush Manish Agrawal, Atharva Tendle, Harshvardhan Sikka, Sahib Singh, Amr Kayid
2020 arXiv pre-print
Understanding the per-layer learning dynamics of deep neural networks is of significant interest as it may provide insights into how neural networks learn and the potential for better training regimens  ...  We investigate learning in Deep Convolutional Neural Networks (CNNs) by measuring the relative weight change of layers while training.  ...  In this work, we empirically investigate the learning dynamics of different layers in various deep convolutional neural network architectures on several different vision tasks.  ... 
arXiv:2011.06735v2 fatcat:em2a63l5ffaqrcwbaz3bjhjdlq
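The layer-wise metric described in this entry — the relative weight change of a layer between training steps — can be sketched minimally as follows. This is a generic version of the idea (the exact norm and normalization the authors use may differ), demonstrated on a made-up single linear layer:

```python
import numpy as np

def relative_weight_change(w_prev, w_curr):
    # Relative weight change of one layer between consecutive steps:
    # norm of the update divided by the norm of the previous weights.
    return np.linalg.norm(w_curr - w_prev) / np.linalg.norm(w_prev)

# Toy illustration: one gradient step on a linear least-squares "layer".
rng = np.random.default_rng(1)
W = rng.normal(size=(4, 3))
X = rng.normal(size=(16, 4))
Y = rng.normal(size=(16, 3))
grad = X.T @ (X @ W - Y) / len(X)   # gradient of 0.5 * mean squared error
W_new = W - 0.1 * grad
print(relative_weight_change(W, W_new))
```

Tracking this quantity per layer over training steps gives the kind of per-layer learning-dynamics curve the abstract refers to.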

Improve Document Embedding for Text Categorization Through Deep Siamese Neural Network [article]

Erfaneh Gharavi, Hadi Veisi
2020 arXiv pre-print
To obtain representations for large texts, we propose the utilization of deep Siamese neural networks.  ...  Our Siamese network consists of two sub-networks of multi-layer perceptrons. We examine our representation for the text categorization task on the BBC news dataset.  ...  Therefore, deep neural networks perform poorly. Table 4 reports the text classification performance of the other four classifiers on different representations.  ... 
arXiv:2006.00572v1 fatcat:sgufkdhb2rfmxfz7ziovdxzkmm

SwGridNet: A Deep Convolutional Neural Network based on Grid Topology for Image Classification [article]

Atsushi Takeda
2017 arXiv pre-print
Deep convolutional neural networks (CNNs) achieve remarkable performance on image classification tasks.  ...  Recent studies, however, have demonstrated that generalization abilities are more important than the depth of neural networks for improving performance on image classification tasks.  ...  A multipath neural network has the same effect as ensemble learning [27] which improves the capability of a neural network for generalization.  ... 
arXiv:1709.07646v3 fatcat:dcliwjugejd6jido4itlhsyxe4

Multi-task Neural Networks for QSAR Predictions [article]

George E. Dahl and Navdeep Jaitly and Ruslan Salakhutdinov
2014 arXiv pre-print
assays at the same time.  ...  We compared our methods to alternative methods reported to perform well on these tasks and found that our neural net methods provided superior performance.  ...  Acknowledgements We would like to acknowledge Christopher Jordan-Squire and Geoff Hinton for their work on our team during the Merck molecular activity challenge; without the contest we would not have  ... 
arXiv:1406.1231v1 fatcat:5f3t3p6ainfojjs2obw3dyal3a

Rail track condition monitoring: a review on deep learning approaches

Albert Ji, Wai Lok Woo, Eugene Wai Leong Wong, Yang Thee Quek
2021 Intelligence & Robotics  
In the paper, we review the existing literature on applying deep learning to rail track condition monitoring.  ...  Therefore, rail track condition monitoring is an important task. Over the past decade, deep learning techniques have been rapidly developed and deployed.  ...  The opinions, findings, and conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect those of the company.  ... 
doi:10.20517/ir.2021.14 fatcat:7a7p6nqvtre4thatkaqb76asn4

MtNet: A Multi-Task Neural Network for Dynamic Malware Classification [chapter]

Wenyi Huang, Jack W. Stokes
2016 Lecture Notes in Computer Science  
For the first time, we see improvements using multiple layers in a deep neural network architecture for malware classification.  ...  The system is trained on 4.5 million files and tested on a holdout test set of 2 million files, which is the largest study to date.  ...  Acknowledgements: The authors would like to thank Mady Marinescu for helping with the data collection.  ... 
doi:10.1007/978-3-319-40667-1_20 fatcat:gyt6qqs33zcppiiwnorz2xtvl4

Multipath Graph Convolutional Neural Networks [article]

Rangan Das, Bikram Boote, Saumik Bhattacharya, Ujjwal Maulik
2021 arXiv pre-print
We train and test our model on various benchmark datasets for the task of node property prediction.  ...  In this work, we propose a novel Multipath Graph convolutional neural network that aggregates the output of multiple different shallow networks.  ...  The proposed architecture converges faster and provides a higher accuracy on the test set. This has been verified using different datasets for the task of node property prediction.  ... 
arXiv:2105.01510v1 fatcat:mey2t3k72vegpg44foywvckhqm

StochasticNet: Forming Deep Neural Networks via Stochastic Connectivity [article]

Mohammad Javad Shafiee, Parthipan Siva, Alexander Wong
2015 arXiv pre-print
One area in deep neural networks that is ripe for exploration is neural connectivity formation.  ...  To evaluate the feasibility of such a deep neural network architecture, we train a StochasticNet using four different image datasets (CIFAR-10, MNIST, SVHN, and STL-10).  ...  Second, in a deep feed-forward neural network, there can be no neural connections between neurons on the same layer.  ... 
arXiv:1508.05463v4 fatcat:jm4cqujmtjhrzdopixwux5m2me
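The stochastic connectivity formation this entry describes — keeping each potential synapse with some probability when the network is formed — can be sketched in a few lines. This is a hypothetical illustration (layer sizes and the keep probability are invented), with the key point that the mask is fixed at formation time, unlike dropout, which resamples every step:

```python
import numpy as np

# Form a sparse, randomly wired layer: each potential connection between
# consecutive layers is realized with probability p, once, at formation time.
rng = np.random.default_rng(42)
p = 0.5
W = rng.normal(size=(8, 8))
mask = rng.random(W.shape) < p          # fixed Bernoulli connectivity mask
x = rng.normal(size=8)
h = np.maximum(0, (W * mask) @ x)       # ReLU layer using only formed connections
print(mask.mean())                      # fraction of realized connections
```

Every forward and backward pass then uses the same sparse wiring, so the expected parameter count drops by roughly a factor of p.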

Research on software credibility algorithm based on deep convolutional sparse coding

Zhaosheng Xu, I. Barukčić
2021 MATEC Web of Conferences  
Firstly, it summarizes convolutional sparse coding and the trust classification system, and then constructs the algorithm from two aspects: factor processing based on a deep convolutional neural network and  ...  Based on the author's research, this paper studies a software credibility algorithm based on deep convolutional sparse coding.  ...  Trust classification based on deep convolutional sparse coding Factor processing based on deep convolutional neural network The typical convolutional neural network can be divided into three parts: feature  ... 
doi:10.1051/matecconf/202133608013 fatcat:pmh445it2zf3bi4o65jbwt6iqu

Anti-Transfer Learning for Task Invariance in Convolutional Neural Networks for Speech Processing [article]

Eric Guizzo, Tillman Weyde, Giacomo Tarroni
2021 arXiv pre-print
We introduce the novel concept of anti-transfer learning for speech processing with convolutional neural networks.  ...  We have implemented anti-transfer for convolutional neural networks in different configurations with several similarity metrics and aggregation functions, which we evaluate and analyze with several speech  ...  Implementation Our implementation is based on the VGG16 Architecture Simonyan and Zisserman [2014], a deep Convolutional Neural Network.  ... 
arXiv:2006.06494v2 fatcat:r2tzkcoofzcclcu24dmpgbgr64
Showing results 1 — 15 out of 215,967 results