Bayesian Recurrent Neural Networks
[article]
2019
arXiv
pre-print
We show how this technique is not exclusive to recurrent neural networks and can be applied more widely to train Bayesian neural networks. ...
In this work we explore a straightforward variational Bayes scheme for Recurrent Neural Networks. ...
recurrent neural networks. ...
arXiv:1704.02798v4
fatcat:ac452clc2bfd3ogyyu2csuiw7m
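The variational Bayes scheme described in this entry treats each weight as a Gaussian random variable and trains by sampling via the reparameterization trick. A minimal sketch of that sampling step; the shapes and parameter names are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational posterior over a single weight matrix: w ~ N(mu, sigma^2),
# with sigma parameterized via softplus(rho) to keep it positive.
mu = np.zeros((4, 4))
rho = np.full((4, 4), -3.0)

def sample_weights(mu, rho, rng):
    sigma = np.log1p(np.exp(rho))        # softplus, so sigma > 0
    eps = rng.standard_normal(mu.shape)  # reparameterization noise
    return mu + sigma * eps              # differentiable w.r.t. mu and rho

w = sample_weights(mu, rho, rng)
```

Because the noise is drawn independently of `mu` and `rho`, gradients of a loss on `w` flow back to both variational parameters, which is what makes the scheme trainable by ordinary backpropagation.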
Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks
2012
Biological cybernetics
We suggest that the Bayesian inversion of recurrent neural networks may be useful both as a model of brain function and as a machine learning tool. ...
Recurrent neural networks (RNNs) are widely used in computational neuroscience and machine learning applications. ...
This 'generative recurrent neural network' (gRNN) runs independently of any input and generates sensory data, i.e. observations. ...
doi:10.1007/s00422-012-0490-x
pmid:22581026
fatcat:y3prg6rhfjg2bivyndcydzmsoq
A Bayesian Approach to Recurrence in Neural Networks
[article]
2019
arXiv
pre-print
We begin by reiterating that common neural network activation functions have simple Bayesian origins. ...
In this spirit, we go on to show that Bayes's theorem also implies a simple recurrence relation; this leads to a Bayesian recurrent unit with a prescribed feedback formulation. ...
Philip N. Garner, Sibo Tong. Abstract: We begin by reiterating that common neural network activation functions have simple Bayesian origins. ...
arXiv:1910.11247v1
fatcat:s6ntnxscxveddd55a4woxqyjky
Bayesian Sparsification of Recurrent Neural Networks
[article]
2017
arXiv
pre-print
We apply this technique to sparsify recurrent neural networks. To account for recurrent specifics we also rely on Binary Variational Dropout for RNN. ...
Recurrent neural networks show state-of-the-art results in many text analysis tasks but often require a lot of memory to store their weights. ...
In a Bayesian neural network the weights ω are treated as random variables. ...
arXiv:1708.00077v1
fatcat:fjahpcop3be3pattqvhpdzfwiy
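Sparsification methods of this family prune weights whose posterior carries little information. A toy sketch of the usual signal-to-noise pruning rule, assuming per-weight Gaussian posteriors; the threshold and shapes are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy per-weight posterior: mean mu and standard deviation sigma.
mu = rng.standard_normal(100)
sigma = np.abs(rng.standard_normal(100)) + 0.5

# SparseVD-style pruning: drop weights whose posterior signal-to-noise
# ratio |mu| / sigma falls below a threshold, i.e. weights the posterior
# cannot distinguish from zero.
snr = np.abs(mu) / sigma
mask = snr > 0.5                 # keep only confident weights
sparse_mu = np.where(mask, mu, 0.0)
```

The surviving mask can then be stored sparsely, which is the memory saving the entry refers to for RNN weight matrices.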
Optimizing Bayesian Recurrent Neural Networks on an FPGA-based Accelerator
[article]
2021
arXiv
pre-print
In contrast, Bayesian recurrent neural networks (RNNs) are able to provide uncertainty estimation with improved accuracy. ...
Neural networks have demonstrated their outstanding performance in a wide range of tasks. ...
At the same time, this is the first work that is focused on accelerating Bayesian recurrent neural networks on an FPGA. ...
arXiv:2106.06048v3
fatcat:o6ucskc6afabbgjdkokmmpqrmi
Fully Bayesian Recurrent Neural Networks for Safe Reinforcement Learning
[article]
2019
arXiv
pre-print
In this work, we build on Probabilistic Backpropagation to introduce a fully Bayesian Recurrent Neural Network architecture. ...
Bayesian Neural Networks (BNNs), which thus produce fully-Bayesian model uncertainty estimates. ...
Probabilistic Backpropagation for Recurrent Neural Networks Recurrent Neural Networks (RNNs), and specifically LSTMs, are the current state-of-the-art for dynamic obstacle avoidance tasks (Alahi et al ...
arXiv:1911.03308v2
fatcat:cmrerp5yuvfcrc4lfu6ai6y5xa
Bayesian Sparsification of Gated Recurrent Neural Networks
[article]
2018
arXiv
pre-print
Bayesian methods have been successfully applied to sparsify weights of neural networks and to remove structure units from the networks, e. g. neurons. ...
We apply and further develop this approach for gated recurrent architectures. ...
In [14], the authors propose a Bayesian technique called sparse variational dropout (SparseVD) for neural network sparsification. ...
arXiv:1812.05692v1
fatcat:ex3adxatfjaqhanfsirkulum6i
Fault Detection and Identification using Bayesian Recurrent Neural Networks
[article]
2020
arXiv
pre-print
In this paper, a novel probabilistic fault detection and identification method is proposed which adopts a newly developed deep learning approach using Bayesian recurrent neural networks (BRNNs) with variational ...
This article proposes a novel end-to-end FDI framework, which adopts a recently developed Bayesian recurrent neural network (BRNN) architecture (Gal and Ghahramani, 2016b). ...
Background
Recurrent Neural Networks RNNs were developed in the 1980s (Rumelhart et al., 1986). ...
arXiv:1911.04386v2
fatcat:npkgp4srgbg5xg3l3efshczdxa
Scalable Bayesian Learning of Recurrent Neural Networks for Language Modeling
[article]
2017
arXiv
pre-print
Recurrent neural networks (RNNs) have shown promising performance for language modeling. However, traditional training of RNNs using back-propagation through time often suffers from overfitting. ...
It yields a principled Bayesian learning algorithm, adding gradient noise during training (enhancing exploration of the model-parameter space) and model averaging when testing. ...
Conclusion We propose a scalable Bayesian learning framework using SG-MCMC, to model weight uncertainty in recurrent neural networks. ...
arXiv:1611.08034v2
fatcat:txk5lx2vv5gsdfycr7atpgvwea
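SG-MCMC methods such as stochastic gradient Langevin dynamics (SGLD) realize the "gradient noise during training" idea mentioned above: each update is a gradient step on the log posterior plus Gaussian noise scaled to the step size. A self-contained sketch on a toy one-dimensional target; the step size and target are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def sgld_step(theta, grad_log_post, step, rng):
    """One SGLD update: gradient ascent on the log posterior plus
    injected Gaussian noise with variance equal to the step size."""
    noise = rng.normal(0.0, np.sqrt(step), size=theta.shape)
    return theta + 0.5 * step * grad_log_post(theta) + noise

# Toy target: a standard normal posterior, so grad log p(theta) = -theta.
theta = np.array([5.0])
samples = []
for _ in range(5000):
    theta = sgld_step(theta, lambda t: -t, 0.1, rng)
    samples.append(theta[0])
samples = np.array(samples[1000:])   # discard burn-in
```

Averaging predictions over the retained `samples` is the "model averaging when testing" step the abstract refers to.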
Nonlinear Bayesian Filters for Training Recurrent Neural Networks
[chapter]
2008
Lecture Notes in Computer Science
We compare the predictability of various Bayesian filter-trained recurrent neural networks using a chaotic time-series. ...
In this paper, we present nonlinear Bayesian filters for training recurrent neural networks with a special emphasis on a novel, more accurate, derivative-free member of the approximate Bayesian filter ...
Specifically, recurrent neural networks (RNNs) have had successes in areas such as dynamic system identification [13], nonlinear prediction [7] and control [22]. ...
doi:10.1007/978-3-540-88636-5_2
fatcat:bh5r7jvfajgg5gk5q7sww7humy
Note on Equivalence Between Recurrent Neural Network Time Series Models and Variational Bayesian Models
[article]
2016
arXiv
pre-print
We observe that the standard log likelihood training objective for a Recurrent Neural Network (RNN) model of time series data is equivalent to a variational Bayesian training objective, given the proper ...
This perspective may motivate extensions to both RNNs and variational Bayesian models. ...
Recurrent Neural Networks (RNNs)
RNN definition A Recurrent Neural Network (RNN) [3] has a visible state x_t at each time step, and a corresponding hidden state h_t. ...
arXiv:1504.08025v2
fatcat:yaxg6gpa5bfnjnfqjf5rirmfou
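The RNN definition quoted in this entry (visible state x_t, hidden state h_t) corresponds to the standard recurrence h_t = tanh(W_h h_{t-1} + W_x x_t + b). A minimal sketch with illustrative dimensions and weight scales:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative dimensions: visible state x_t in R^3, hidden state h_t in R^5.
W_x = rng.standard_normal((5, 3)) * 0.1
W_h = rng.standard_normal((5, 5)) * 0.1
b = np.zeros(5)

def rnn_step(h_prev, x_t):
    """One recurrence: the hidden state is updated from the previous
    hidden state and the current visible state."""
    return np.tanh(W_h @ h_prev + W_x @ x_t + b)

h = np.zeros(5)
for x_t in rng.standard_normal((10, 3)):   # a length-10 input sequence
    h = rnn_step(h, x_t)
```

The variational-Bayes equivalence discussed in the note concerns the training objective for models of exactly this form, not a change to the recurrence itself.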
Bayesian Recurrent Neural Network Models for Forecasting and Quantifying Uncertainty in Spatial-Temporal Data
[article]
2018
arXiv
pre-print
Recurrent neural networks (RNNs) are nonlinear dynamical models commonly used in the machine learning and dynamical systems literature to represent complex dynamical or sequential relationships between ...
Even when considered, the uncertainty is generally quantified without the use of a rigorous framework, such as a fully Bayesian setting. ...
Bayesian Spatio-Temporal Recurrent Neural Network In this section we introduce the Bayesian spatio-temporal RNN, referred to hereafter as the BAST-RNN model. ...
arXiv:1711.00636v2
fatcat:j52osxxyivanpkucodly22l33u
Opportunistic Networks Link Prediction Method Based on Bayesian Recurrent Neural Network
2019
IEEE Access
INDEX TERMS Link prediction, opportunistic networks, Bayesian recurrent neural network. ...
According to the time-varying characteristics and the node's mobility of opportunistic networks, this paper proposes a novel link prediction method based on the Bayesian recurrent neural network (BRNN-LP ...
NETWORK STRUCTURE Recurrent neural networks (RNNs) [41] are a kind of neural network that can be used to analyse time-series data. ...
doi:10.1109/access.2019.2961243
fatcat:65mso3qbqbbv5hdrebe6wdvl5m
D-optimal Bayesian Interrogation for Parameter and Noise Identification of Recurrent Neural Networks
[article]
2008
arXiv
pre-print
We introduce a novel online Bayesian method for the identification of a family of noisy recurrent neural networks (RNNs). ...
We apply a greedy technique to maximize the information gain concerning network parameters at each time step. ...
arXiv:0801.1883v1
fatcat:7no63zjkvvhnjhwetxlzl3j22i
Adaptive Probabilistic Vehicle Trajectory Prediction Through Physically Feasible Bayesian Recurrent Neural Network
2019
2019 International Conference on Robotics and Automation (ICRA)
In this work, we propose to overcome these two shortcomings by a Bayesian recurrent neural network model consisting of a Bayesian-neural-network-based policy model and a known physical model of the scenario ...
Furthermore, a particle-filter-based parameter adaptation algorithm is designed to adapt the policy Bayesian neural network to the predicted target online. ...
Physically Feasible Bayesian Recurrent Neural Network We use the BNN with random input noise to represent the policy. ...
doi:10.1109/icra.2019.8794130
dblp:conf/icra/TangCT19
fatcat:hkhfwr7ksndsxdcy6v6jnhtpgm
Showing results 1 — 15 out of 23,826 results