14,074 Hits in 6.7 sec

Learning Morphological Transformations with Recurrent Neural Networks

Saurav Biswas, Thomas Breuel
2015 Procedia Computer Science  
In this paper we investigate the use of recurrent neural network architectures to learn useful transformations of an image (object) progressively over time.  ...  Deep learning techniques have been successfully used in recent years to learn useful image transformations and features, thus contributing significantly to the advancements in neural networks.  ...  While deep learning has been used to learn feature hierarchies, another kind of neural network architecture, the recurrent neural network (RNN), has been studied extensively to mimic the recurrent  ... 
doi:10.1016/j.procs.2015.07.326 fatcat:txjxacguwff7zkherycimry5tm

Some open questions on morphological operators and representations in the deep learning era [article]

Jesus Angulo
2021 arXiv   pre-print
Indeed, I firmly believe that the convergence between mathematical morphology and the computation methods which gravitate around deep learning (fully connected networks, convolutional neural networks, residual neural networks, recurrent neural networks, etc.) is worthwhile.  ... 
arXiv:2105.01339v2 fatcat:d2k2or6ih5difpwawoot7tty6y

Applying the Transformer to Character-level Transduction [article]

Shijie Wu, Ryan Cotterell, Mans Hulden
2021 arXiv   pre-print
The transformer has been shown to outperform recurrent neural network-based sequence-to-sequence models in various word-level NLP tasks.  ...  show that with a large enough batch size, the transformer does indeed outperform recurrent models.  ...  Yet for character-level transduction tasks like morphological inflection, the dominant model has remained a recurrent neural network-based sequence-to-sequence model with attention (Cotterell et al.,  ... 
arXiv:2005.10213v2 fatcat:xghukun42nho5owxwrmrhot6cm

Character-based recurrent neural networks for morphological relational reasoning

Olof Mogren, Richard Johansson
2017 Proceedings of the First Workshop on Subword and Character Level Models in NLP  
To address the task of predicting a word form given a demo relation (a pair of word forms) and a query word, we devise a character-based recurrent neural network architecture using three separate encoders.  ...  We present a model for predicting word forms based on morphological relational reasoning with analogies.  ...  Recurrent neural networks: A recurrent neural network (RNN) is an artificial neural network that can model a sequence of arbitrary length.  ... 
doi:10.18653/v1/w17-4108 dblp:conf/emnlp/MogrenJ17 fatcat:227sbxy53ncgrpxcbek3wdjm4u
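The snippet above defines an RNN as a network that can model a sequence of arbitrary length. A minimal sketch of why one fixed parameter set suffices for any length (all sizes, weights, and names below are invented for illustration, not taken from the paper):

```python
import numpy as np

# A vanilla (Elman) RNN step: the same weights are reused at every
# time step, so sequences of any length map to one hidden state.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden -> hidden (recurrence)
b_h = np.zeros(n_hid)

def rnn_encode(xs):
    """Fold a sequence of input vectors into a final hidden state."""
    h = np.zeros(n_hid)
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    return h

short = [rng.normal(size=n_in) for _ in range(3)]
long = [rng.normal(size=n_in) for _ in range(50)]
# The same parameters handle both lengths; output shape is always (8,).
print(rnn_encode(short).shape, rnn_encode(long).shape)
```

Character-level models like the one in this entry exploit exactly this property: a word of any length is consumed character by character by the same recurrent cell.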

Neural Morphological Tagging from Characters for Morphologically Rich Languages [article]

Georg Heigold, Guenter Neumann, Josef van Genabith
2016 arXiv   pre-print
This paper investigates neural character-based morphological tagging for languages with complex morphology and large tag sets.  ...  Our experiments for morphological tagging suggest that for "simple" model configurations, the choice of the network architecture (CNN vs. CNNHighway vs. LSTM vs.  ...  Hence, we can model the position-wise probabilities p(t | w_1, ..., w_N) with recurrent neural networks, such as long short-term memory recurrent neural networks (LSTMs) (Graves, 2012).  ... 
arXiv:1606.06640v1 fatcat:7q4exji3xvb75p7doqf5n6xcly
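The snippet above factorizes tagging into position-wise probabilities p(t | w_1, ..., w_N): each word position gets its own distribution over tags via a softmax. A hedged sketch of that last step, with a random matrix standing in for the LSTM hidden states and all dimensions invented:

```python
import numpy as np

# Position-wise tag distributions: one softmax per word position over
# the tag set. H is a stand-in for encoder (e.g. LSTM) hidden states.
rng = np.random.default_rng(1)
n_words, n_hid, n_tags = 5, 16, 7

H = rng.normal(size=(n_words, n_hid))              # one hidden state per word
W_tag = rng.normal(scale=0.1, size=(n_hid, n_tags))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)          # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

P = softmax(H @ W_tag)                             # shape (n_words, n_tags)
# Row i is p(t | w_1, ..., w_N) for position i: non-negative, sums to 1.
```

Since every hidden state conditions on the whole sentence (in a bidirectional encoder), each row really is a distribution conditioned on w_1 through w_N, not just on the local word.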

An initial study of deep learning for mangrove classification

S Faza, E B Nababan, S Efendi, M Basyuni, R F Rahmat
2018 IOP Conference Series: Materials Science and Engineering  
Deep Learning is a new breakthrough in the area of neural networks. One of its methodologies is the Deep Neural Network.  ...  Usually, a deep neural network is a method that can solve problems such as classification or prediction.  ...  Introduction Deep Learning is a new advancement in the area of neural networks [1].  ... 
doi:10.1088/1757-899x/420/1/012093 fatcat:rqm4febgxbfstcivhgs5farwha

A Survey on Multistage lung cancer Detection and Classification

Jay Jawarkar, Nishit Solanki, Meet Vaishnav, Harsh Vichare, Sheshang Degadwala
2020 International Journal of Scientific Research in Computer Science Engineering and Information Technology  
In this research we compare different machine learning techniques (SVM, KNN, RF, etc.) with deep learning techniques (CNN, CDNN) using different parameters: accuracy, precision and recall.  ...  Lung cancer is the primary cause of cancer deaths worldwide among both men and women, with more than 1 million deaths annually.  ...  A recurrent neural network (RNN) is a type of artificial neural network commonly used in speech recognition and natural language processing (NLP).  ... 
doi:10.32628/cseit20631110 fatcat:hdzjvb22yfe7liuzaa3dvbpuey

Learning Noun Cases Using Sequential Neural Networks [article]

Sina Ahmadi
2018 arXiv   pre-print
This research proposal seeks to address the degree to which Recurrent Neural Networks (RNNs) are efficient in learning to decline noun cases.  ...  can improve the performance of neural network models.  ...  In addition to the aforementioned axes, I am also interested in probing the ability of neural networks to learn more structure-sensitive dependencies in human language, such as gender agreement  ... 
arXiv:1810.03996v1 fatcat:rupladevyffnzdyhekeyvbm6hu

A Comparison of Transformer and Recurrent Neural Networks on Multilingual Neural Machine Translation [article]

Surafel M. Lakew, Mauro Cettolo, Marcello Federico
2018 arXiv   pre-print
Recently, neural machine translation (NMT) has been extended to multilinguality, that is to handle more than one translation direction with a single system.  ...  architectures in MT, which are the Recurrent and the Transformer ones; and (iii) quantitatively explores how the closeness between languages influences the zero-shot translation.  ...  We also gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan Xp GPU used for this research.  ... 
arXiv:1806.06957v2 fatcat:4uyws5peqrek7m5sevzsxuz7ay

Deep Recurrent Neural Networks for ECG Signal Denoising [article]

Karol Antczak
2019 arXiv   pre-print
We present a novel approach to denoise electrocardiographic signals with deep recurrent denoising neural networks.  ...  The results indicate that a four-layer deep recurrent neural network can outperform reference methods for heavily noised signals.  ...  It utilizes both neural networks and the wavelet transform, in the form of Wavelet Neural Networks (WNN).  ... 
arXiv:1807.11551v3 fatcat:refjgwbxgfcxpcxv2gvycddgza

Detection and Classification Brain Tumor of Magnetic Resonance Imaging Using 2D Gabor Wavelet Transform and Artificial Neural Network: A Review

S.A. Nagtode, Bhakti B Potdukhe
2015 International Journal of IT-based Public Health Management  
The artificial neural network classifies the brain tumor into three types: benign, malignant, or normal. The Probabilistic Neural Network is a kind of artificial neural network.  ...  The Probabilistic Neural Network gives better classification results than other artificial neural networks and is a promising tool for classification of brain tumors.  ...  This determines which character was read. 2] Recurrent Neural Networks (Figure 4): recurrent neural networks have arbitrary topologies.  ... 
doi:10.21742/ijiphm.2015.2.2.02 fatcat:5b7qyqkt2zbnponwzdvhbuu66q
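The Probabilistic Neural Network mentioned above is, in essence, a Parzen-window (kernel density) classifier: each class's likelihood at a query point is the average of Gaussian kernels centred on that class's training samples, and the class with the highest likelihood wins. A toy two-class sketch, with entirely synthetic feature vectors standing in for tumor features:

```python
import numpy as np

def pnn_predict(x, train_X, train_y, sigma=0.5):
    """Classify x by comparing Gaussian kernel density estimates per class."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]
        d2 = ((Xc - x) ** 2).sum(axis=1)              # squared distances to class samples
        scores.append(np.exp(-d2 / (2 * sigma**2)).mean())  # mean Gaussian kernel
    return classes[int(np.argmax(scores))]

# Two synthetic clusters: class 0 near the origin, class 1 near (1, 1).
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [1.1, 0.9]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(np.array([0.05, 0.1]), X, y))   # near class 0 -> 0
print(pnn_predict(np.array([1.05, 0.95]), X, y))  # near class 1 -> 1
```

The smoothing width sigma plays the same role as the PNN's spread parameter; in practice it is tuned on held-out data.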

Long short-term memory language models with additive morphological features for automatic speech recognition

Daniel Renshaw, Keith B. Hall
2015 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
Our novel neural network language model integrates this additive morphological representation into a long short-term memory architecture, improving Russian speech recognition word error rates by 0.9 absolute  ...  Rare and unknown words containing common morphemes can thus be represented with greater fidelity despite their sparsity.  ...  Acknowledgments We would like to thank Brian Roark, Herman Kamper, and Eva Schlinger for their contributions during useful discussions, Haşim Sak and Kaisuke Nakajima for their help with implementation  ... 
doi:10.1109/icassp.2015.7178972 dblp:conf/icassp/RenshawH15 fatcat:xkph4htwozhqrgrzpjcm44bf5e
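The additive morphological representation described in this entry amounts to summing morpheme embeddings to form a word embedding, so a rare word that shares morphemes with common words still gets a sensible vector. A minimal sketch of the additive idea only; the segmentations and all vectors below are invented for illustration:

```python
import numpy as np

# Each morpheme has its own embedding; a word is the sum of its parts.
rng = np.random.default_rng(2)
dim = 8
emb = {m: rng.normal(size=dim) for m in ["un", "break", "able", "read"]}

def word_vector(segmentation):
    """Additive representation: sum the embeddings of the morphemes."""
    return np.sum([emb[m] for m in segmentation], axis=0)

v_unbreakable = word_vector(["un", "break", "able"])  # possibly rare word
v_unreadable = word_vector(["un", "read", "able"])    # shares two morphemes
# Shared morphemes mean the two vectors differ only by
# emb["break"] - emb["read"], so related words stay close.
diff = v_unbreakable - v_unreadable
```

In the language-model setting, this lets the model assign reasonable probabilities to rare or unseen inflected forms whose morphemes were observed in training.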

Advances in Natural Language Question Answering: A Review [article]

K.S.D. Ishwari, A.K.R.R.Aneeze, S.Sudheesan, H.J.D.A. Karunaratne, A. Nugaliyadde, Y. Mallawarrachchi
2019 arXiv   pre-print
Most of the deep learning approaches have been shown to achieve higher results compared to machine learning and statistical methods.  ...  The dynamic nature of language has profited from the nonlinear learning in deep learning. This has created prominent success and a spike in work on question answering.  ...  With the composition of enough such transformations, very complex functions can be learned.  ... 
arXiv:1904.05276v1 fatcat:kgnbpaey45fwrloyzj5dzfvile

A Recurrent Neural Network Approach for Automated Neural Tracing in Multispectral 3D Images [article]

Yan Yan, Benjamin V Sadis, Douglas H Roossien, Jason J Corso, Dawen Cai
2018 bioRxiv   pre-print
These cubes are the input to the recurrent neural network. The proposed approach is simple and effective. The approach can be implemented with the deep learning toolbox "Keras" in 100 lines.  ...  To benefit from the power of deep learning, in this paper we propose an automated neural tracing approach in multispectral 3D Brainbow images based on a recurrent neural network.  ...  In this paper, we proposed a recurrent neural network approach for automated neural tracing in Brainbow image stacks. The method contains an image denoising step and an LSTM learning step.  ... 
doi:10.1101/230441 fatcat:33s4fwwpnfaufoptdcdzto5v5q

Deep Learning Based Natural Language Processing for End to End Speech Translation [article]

Sarvesh Patil
2018 arXiv   pre-print
In this paper, we will take a look at various signal processing techniques and then their application to produce a speech-to-text system using Deep Recurrent Neural Networks.  ...  Deep Learning methods employ multiple processing layers to learn hierarchical representations of data.  ...  Graves, "Sequence transduction with recurrent neural networks," in ICML Representation Learning Workshop, 2012. [5] Junyoung Chung, Caglar Gulcehre, KyungHyun Cho, Yoshua Bengio, Empirical Evaluation of  ... 
arXiv:1808.04459v1 fatcat:zhlfomp4g5ggvicydhcdam46em
Showing results 1 — 15 out of 14,074 results