1,986 Hits in 4.2 sec

On the Curse of Memory in Recurrent Neural Networks: Approximation and Optimization Analysis [article]

Zhong Li, Jiequn Han, Weinan E, Qianxiao Li
2021 arXiv   pre-print
We study the approximation properties and optimization dynamics of recurrent neural networks (RNNs) when applied to learn input-output relationships in temporal data.  ...  A unifying theme uncovered is the non-trivial effect of memory, a notion that can be made precise in our framework, on approximation and optimization: when there is long term memory in the target, it takes  ...  The curse of memory in optimization.  ... 
arXiv:2009.07799v2 fatcat:i4lxwbhxzvhbhdf7wtbnjatizu

Expressivity of Deep Neural Networks [article]

Ingo Gühring, Mones Raslan, Gitta Kutyniok
2020 arXiv   pre-print
In this review paper, we give a comprehensive overview of the large variety of approximation results for neural networks.  ...  While the main body of existing results is for general feedforward architectures, we also depict approximation results for convolutional, residual and recurrent neural networks.  ...  Moreover, they would like to thank Mark Cheng for creating most of the figures and Johannes von Lindheim for providing  ... 
arXiv:2007.04759v1 fatcat:lpneojafcvfbrgx4qx4oo5k5na

Predicting Inflation with Neural Networks [article]

Livia Paranhos
2021 arXiv   pre-print
The use of a particular recurrent neural network, the long short-term memory (LSTM) model, which summarizes macroeconomic information into common components, is a major contribution of the paper.  ...  The LSTM in particular is found to outperform the traditional feed-forward network at long horizons, suggesting an advantage of the recurrent model in capturing the long-term trend of inflation.  ...  Although the focus of the analysis resides on the recurrent model, other neural network structures are also considered for comparison.  ... 
arXiv:2104.03757v1 fatcat:2s2fpguhtbanpma4a5rtkfomti

Recurrent neural network speech predictor based on dynamical systems approach

E. Varoglu, K. Hacioglu
2000 IEE Proceedings - Vision Image and Signal Processing  
A nonlinear predictive model of speech, based on the method of time delay reconstruction, is presented and approximated using a fully connected recurrent neural network (RNN) followed by a linear combiner  ...  In all cases, the proposed network was found to be a good solution for both prediction and synthesis.  ...  [3] and a Recurrent Neural Network (RNN) [4] .  ... 
doi:10.1049/ip-vis:20000192 fatcat:6tyf6apwhzfxterzlozphuouzi

Deep Learning for Marginal Bayesian Posterior Inference with Recurrent Neural Networks

Thayer Fisher, Alex Luedtke, Marco Carone, Noah Simon
2024 Statistica sinica  
We discuss a general approach that reframes this as a multi-task learning problem and uses recurrent deep neural networks (RNNs) to approximately evaluate posterior quantiles.  ...  In Bayesian data analysis, it is often important to evaluate quantiles of the posterior distribution of a parameter of interest (e.g., to form posterior intervals).  ...  This proposed recurrent neural network can take in a dataset and return an approximation to a single posterior quantile.  ... 
doi:10.5705/ss.202020.0348 fatcat:nqfjqqw6crcqngdn5qoskzrahm
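Networks trained to return a single posterior quantile, as in the entry above, are typically fit with the quantile ("pinball") loss; the snippet does not state the paper's exact objective, so the following numpy sketch is an assumption about the standard approach:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss: its expectation is minimized when
    y_pred equals the tau-quantile of the distribution of y_true."""
    diff = y_true - y_pred
    # Under-predictions are penalized by tau, over-predictions by (1 - tau).
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))
```

By the usual check-function argument, minimizing this loss in expectation drives the network output toward the tau-quantile, which is what lets a single regression-style network approximate a posterior quantile.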

Recurrent Neural Networks Hardware Implementation on FPGA [article]

Andre Xian Ming Chang, Berin Martini, Eugenio Culurciello
2016 arXiv   pre-print
Recurrent Neural Networks (RNNs) have the ability to retain memory and learn data sequences.  ...  In this paper we present a hardware implementation of a Long Short-Term Memory (LSTM) recurrent network on the programmable logic Zynq 7020 FPGA from Xilinx.  ...  We would like to thank Vinayak Gokhale for the discussion on implementation and hardware architecture and also thank Alfredo Canziani, Aysegul Dundar and Jonghoon Jin for the support.  ... 
arXiv:1511.05552v4 fatcat:dpav4x43hjab7e3vvuu6pl362e
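The memory-retention mechanism the entry above refers to can be sketched as a single LSTM cell step (numpy; the gate ordering and variable names are illustrative assumptions, not details of the paper's FPGA design):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b, n):
    """One LSTM time step with hidden size n.
    W: (4n, n_in) input weights, U: (4n, n) recurrent weights, b: (4n,) bias.
    Assumed gate order in the stacked matrices: input, forget, candidate, output."""
    z = W @ x + U @ h + b
    i = sigmoid(z[0*n:1*n])   # input gate: how much new information to write
    f = sigmoid(z[1*n:2*n])   # forget gate: how much old memory to keep
    g = np.tanh(z[2*n:3*n])   # candidate cell update
    o = sigmoid(z[3*n:4*n])   # output gate
    c_new = f * c + i * g     # cell state: the network's retained memory
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

A hardware implementation of such a cell amounts to evaluating these matrix-vector products and elementwise gate operations each time step, which is what makes the LSTM a natural fit for fixed-function programmable logic.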

Neural network models and deep learning

Nikolaus Kriegeskorte, Tal Golan
2019 Current Biology  
We introduce feedforward and recurrent networks and explain the expressive power of this modeling framework and the backpropagation algorithm for setting the parameters.  ...  They can approximate functions and dynamics by learning from examples. Here we give a brief introduction to neural network models and deep learning for biologists.  ...  Any recurrent neural network can be unfolded along time as a feedforward network. To this end, the units of the recurrent neural network (blue, green, pink sets) are replicated for each time step.  ... 
doi:10.1016/j.cub.2019.02.034 pmid:30939301 fatcat:yuo75bhphjextbrkkys6i44i5y
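The unfolding claim in the entry above (any recurrent network is equivalent to a feedforward network with one replicated layer per time step) can be checked with a minimal vanilla-RNN sketch; the weights and sizes below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, T = 3, 4, 5
W_x = rng.normal(size=(n_hid, n_in))   # input-to-hidden weights
W_h = rng.normal(size=(n_hid, n_hid))  # recurrent hidden-to-hidden weights
xs = rng.normal(size=(T, n_in))        # an input sequence of length T

# Recurrent form: one cell applied T times to its own output.
h = np.zeros(n_hid)
for x in xs:
    h = np.tanh(W_x @ x + W_h @ h)

# Unfolded form: a T-layer feedforward network whose t-th layer reuses the
# (shared) cell weights and receives xs[t] as a side input.
layers = [lambda h_prev, t=t: np.tanh(W_x @ xs[t] + W_h @ h_prev)
          for t in range(T)]
h_ff = np.zeros(n_hid)
for layer in layers:
    h_ff = layer(h_ff)

assert np.allclose(h, h_ff)  # the two forms compute the same function
```

The equivalence is what allows backpropagation through time to be treated as ordinary backpropagation on the unrolled feedforward network.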

Scalable Recurrent Neural Network for Hyperspectral Image Classification

M. E. Paoletti, J. M. Haut, J. Plaza, A. Plaza
2022 Zenodo  
Recurrent neural networks (RNNs) have been widely used for the classification of HSI datasets, understood as a single sequence of pixel vectors with high dimensionality.  ...  In order to mitigate this problem, this paper presents a new RNN classifier based on simple recurrent units (SRUs) that performs HSI classification in a highly scalable and efficient way.  ...  Their working mode is based on the optimization of a loss function, for instance the mean squared error (MSE) between the networks' outputs and the desired outputs, through the adjustment of the networks  ... 
doi:10.5281/zenodo.6414127 fatcat:rnqct4degjdzdhr6vfqxiqfiji
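As a rough illustration of why simple recurrent units scale well, here is a simplified SRU step following the general formulation from the literature; the gate details are reduced, so the names and exact equations are assumptions rather than the paper's classifier:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sru_step(x, c, W, Wf, bf, Wr, br):
    """Simplified SRU step: the only sequential dependence is an
    elementwise update of the cell state c, so all matrix products
    involving x can be precomputed for the whole sequence."""
    xt = W @ x
    f = sigmoid(Wf @ x + bf)                  # forget gate (simplified: no c term)
    c_new = f * c + (1.0 - f) * xt            # elementwise cell update
    r = sigmoid(Wr @ x + br)                  # reset/highway gate
    h = r * np.tanh(c_new) + (1.0 - r) * xt   # output with highway mix
    return h, c_new
```

Because the recurrence touches c only elementwise, the products `W @ x`, `Wf @ x` and `Wr @ x` can be batched across all time steps up front; that is the source of the scalability the entry describes, with training then minimizing a loss such as the MSE mentioned in the snippet.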

Agent Inspired Trading Using Recurrent Reinforcement Learning and LSTM Neural Networks [article]

David W. Lu
2017 arXiv   pre-print
With the breakthrough of computational power and deep neural networks, many areas that we have not explored with the various techniques rigorously researched in the past become feasible.  ...  The learning model is implemented in Long Short-Term Memory (LSTM) recurrent structures, with Reinforcement Learning or Evolution Strategies acting as agents. The robustness and feasibility of the system  ...  ACKNOWLEDGMENT The author would like to thank Professor Ching-Ho Leu from the Department of Statistics at National Cheng-Kung University, Tainan, Taiwan for his relentless mentorship and his helpful comments  ... 
arXiv:1707.07338v1 fatcat:f3hlkanncjbd3cjfqemx5t627u
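A common objective in recurrent reinforcement learning for trading follows Moody-style direct reinforcement: maximize the returns of a position sequence net of transaction costs. The entry's exact formulation is not shown, so the sketch below is an assumption about that standard setup:

```python
import numpy as np

def strategy_returns(positions, price_returns, cost=0.001):
    """Per-period returns of a strategy that holds position F_{t-1}
    over period t and pays a transaction cost proportional to the
    change in position |F_t - F_{t-1}| (direct-reinforcement objective)."""
    f_prev = np.concatenate(([0.0], positions[:-1]))  # F_{t-1}, flat at start
    return f_prev * price_returns - cost * np.abs(positions - f_prev)
```

The agent (an LSTM or evolution-strategy policy, per the entry) outputs the position sequence, and training maximizes a statistic of these returns, such as their sum or Sharpe ratio, directly rather than through a value function.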

Deep Neural Networks [chapter]

Mariette Awad, Rahul Khanna
2015 Efficient Learning Machines  
Theoretically, it could be re-created on a neural network, but that would be very difficult, as it would require all one's memories.  ...  Among the many evolutions of ANN, deep neural networks (DNNs) (Hinton, Osindero, and Teh 2006) stand out as a promising extension of the shallow ANN structure.  ...  With such a broad definition of deep learning in mind, we can construe the combination of the backpropagation algorithm (available since 1974) with recurrent neural networks and convolutional neural networks  ... 
doi:10.1007/978-1-4302-5990-9_7 fatcat:3umxjrv465b4lcldh47k3j4aji

Regarding the temporal requirements of a hierarchical Willshaw network

João Sacramento, Francisco Burnay, Andreas Wichert
2012 Neural Networks  
In this work we compute the expected retrieval time for the random neural activity regime which maximises the capacity of the Willshaw model, and we study the task of finding the optimal hierarchy parametrisation  ...  In a recent communication, Sacramento and Wichert (2011) proposed a hierarchical retrieval prescription for Willshaw-type associative networks.  ...  Acknowledgments The authors are indebted to an anonymous reviewer who contributed many detailed comments, including an improved asymptotic bound f(n) on a, and to Ângelo Cardoso and Francisco S  ... 
doi:10.1016/j.neunet.2011.07.005 pmid:21820274 fatcat:tmji43pmszhithxclh3ynw2tzq

A Neural Network Ensemble Approach for GDP Forecasting

Luigi Longo, Massimo Riccaboni, Armando Rungi
2021 Social Science Research Network  
The analysis is based on a set of predictors encompassing a wide range of variables measured at different frequencies.  ...  Our approach combines a Recurrent Neural Network (RNN) with a Dynamic Factor model accounting for time-variation in mean with a Generalized Autoregressive Score (DFM-GAS).  ...  The recursive structure of an RNN is well suited to time series analysis, as it stores memories of information from previous time steps.  ... 
doi:10.2139/ssrn.3894861 fatcat:6x5uetofhvc35iksl2ni6r7agm

Hybrid Deep Network Scheme for Emotion Recognition in Speech

Sujay Angadi, Venkata Reddy
2019 International Journal of Intelligent Engineering and Systems  
This research paper concentrates on SER using a hybrid network composed of a Convolutional Neural Network and a Bidirectional Long Short-Term Memory network (CNN-BLSTM).  ...  The IEMOCAP dataset has been used in the experimental analysis of the proposed approach to classify different human emotions: happy, angry, sad, and neutral.  ...  A few works have proposed deep-learning-based convolutional neural networks and recurrent neural networks.  ... 
doi:10.22266/ijies2019.0630.07 fatcat:ltx4iqpqhna2zcx7ig4d7d5jgu

Transfer Learning in Sentiment Classification with Deep Neural Networks [chapter]

Andrea Pagliarani, Gianluca Moro, Roberto Pasolini, Giacomo Domeniconi
2019 Primate Life Histories, Sex Roles, and Adaptability  
Deep neural networks have recently reached the state-of-the-art in many NLP tasks, including in-domain sentiment classification, but few of them involve transfer learning and cross-domain sentiment solutions  ...  Wang et al. combined Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) for sentiment analysis of short texts, taking advantage of the coarse-grained local features generated by CNN  ... 
doi:10.1007/978-3-030-15640-4_1 dblp:conf/ic3k/PagliaraniMPD17 fatcat:gih2lwn36ze4jmuu64idncgfl4

A neural network multigrid solver for the Navier-Stokes equations [article]

Nils Margenberg, Dirk Hartmann, Christian Lessig, Thomas Richter
2021 arXiv   pre-print
DNN-MG improves computational efficiency using a judicious combination of a geometric multigrid solver and a recurrent neural network with memory.  ...  This results in a reduction in computation time through DNN-MG's highly compact neural network.  ...  Acknowledgement NM and TR acknowledge the financial support by the Federal Ministry of Education and Research of Germany, grant number 05M16NMA as well as the GRK 2297 MathCoRe, funded by the Deutsche  ... 
arXiv:2008.11520v2 fatcat:voepkt4ucvgahfyjmnbgakxwpy
Showing results 1 — 15 out of 1,986 results