A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL. The file type is application/pdf.
Linear-Time Sequence Classification using Restricted Boltzmann Machines
[article]
2018
arXiv
pre-print
Classification of sequence data is a topic of interest for dynamic Bayesian models and Recurrent Neural Networks (RNNs). ...
Also, the experimental results on optical character recognition, part-of-speech tagging and text chunking demonstrate that our model is comparable to recurrent neural networks with complex memory gates ...
It indicates a consistent improvement in best-case performance from n-gram models, to non-recurrent neural networks, to recurrent neural network models, with the SCRBM outperforming all ...
arXiv:1710.02245v3
fatcat:jq6cu4ztt5gl7l66gf6cws3zna
Tracking slow modulations in synaptic gain using dynamic causal modelling: Validation in epilepsy
2015
NeuroImage
Bayesian model selection was used to identify the intrinsic (within-source) and extrinsic (between-source) connectivity. ...
Our key finding was that intrinsic synaptic changes were sufficient to explain seizure onset, where these changes showed dissociable time courses over several seconds. ...
Dynamic causal modelling of cross spectral density can be implemented using the DCM Toolbox. ...
doi:10.1016/j.neuroimage.2014.12.007
pmid:25498428
pmcid:PMC4306529
fatcat:q7yss2ng2vhw7egfaoo37tdh44
OutbreakFlow: Model-based Bayesian inference of disease outbreak dynamics with invertible neural networks and its application to the COVID-19 pandemics in Germany
[article]
2021
arXiv
pre-print
In this work, we address this problem with a novel combination of epidemiological modeling with specialized neural networks. ...
In the subsequent inference phase, the trained neural network processes the observed data of an actual outbreak and infers the parameters of the model in order to realistically reproduce the observed dynamics ...
Our neural architecture comprises three sub-networks: (i) a convolutional filtering network performing noise reduction and feature extraction on the raw observational data; (ii) a recurrent summary network ...
arXiv:2010.00300v4
fatcat:jqjs5cgdwbe3tnsfttrqtutiaq
Amortized Bayesian model comparison with evidential deep learning
[article]
2021
arXiv
pre-print
We demonstrate the utility of our method on toy examples and simulated data from non-trivial models from cognitive science and single-cell neuroscience. ...
The Bayesian probabilistic framework offers a principled way to perform model comparison and extract useful metrics for guiding decisions. ...
We also thank David Izydorczyk and Mattia Sensi for reading the paper and providing constructive feedback. ...
arXiv:2004.10629v4
fatcat:gpsjtnxm4bfftowftsdvqdjs7q
Neural Integration of Continuous Dynamics
[article]
2019
arXiv
pre-print
Modeled as constant-sized recurrent networks embedding a continuous neural differential equation, they achieve fully neural temporal output. ...
Using the polynomial class of dynamical systems, we demonstrate the equivalence of neural and numerical integration. ...
Support from ONR grant N00014-19-1-2273, the MIT Environmental Solutions Initiative, the Maryanne and John Montrym Fund, and the MIT Lincoln Laboratory are gratefully acknowledged. ...
arXiv:1911.10309v1
fatcat:ewjuh4hewbd23odgarsnev4zhi
Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks
2012
Biological Cybernetics
We suggest that the Bayesian inversion of recurrent neural networks may be useful both as a model of brain function and as a machine learning tool. ...
Recurrent neural networks (RNNs) are widely used in computational neuroscience and machine learning applications. ...
Acknowledgments We thank both anonymous reviewers for the helpful and constructive comments on a previous version of this manuscript. ...
doi:10.1007/s00422-012-0490-x
pmid:22581026
fatcat:y3prg6rhfjg2bivyndcydzmsoq
OutbreakFlow: Model-based Bayesian inference of disease outbreak dynamics with invertible neural networks and its application to the COVID-19 pandemics in Germany
2021
PLoS Computational Biology
In this work, we address this problem with a novel combination of epidemiological modeling with specialized neural networks. ...
In the subsequent inference phase, the trained neural network processes the observed data of an actual outbreak and infers the parameters of the model in order to realistically reproduce the observed dynamics ...
Our neural architecture comprises three sub-networks: (i) a convolutional filtering network performing noise reduction and feature extraction on the raw observational data; (ii) a recurrent summary network ...
doi:10.1371/journal.pcbi.1009472
pmid:34695111
pmcid:PMC8584772
fatcat:tz6ei3zd6rfd7g5ir6a7uvfy3a
Survey of Cryptocurrency Volatility Prediction Literature Using Artificial Neural Networks
2022
Business and Economic Research
Recently developed literature that attempts to predict the volatility of cryptocurrency valuations through hybrid artificial neural network models is then discussed. ...
For the major part of the paper, we delve into the details of multiple hybrid artificial neural networks that were thoroughly implemented to predict cryptocurrency volatilities. ...
The model is established on the quadratic variation of the theory of arbitrage-free price processes in time series, and it uses connections between realized volatility and the conditional covariance matrix ...
doi:10.5296/ber.v12i1.19301
fatcat:pst3igfhivdc5pi2ln4ubcdyae
Towards Automated Satellite Conjunction Management with Bayesian Deep Learning
[article]
2020
arXiv
pre-print
We introduce a Bayesian deep learning approach to this problem, and develop recurrent neural network architectures (LSTMs) that work with time series of conjunction data messages (CDMs), a standard data ...
We show that our method can be used to model all CDM features simultaneously, including the time of arrival of future CDMs, providing predictions of conjunction event evolution with associated uncertainties ...
We would like to thank Dario Izzo and Moriba Jah for sharing their technical expertise and James Parr, Jodie Hughes, Leo Silverberg, Alessandro Donati for their support. ...
arXiv:2012.12450v1
fatcat:dadqsub7uvevlfx32ce645mziu
Connectivity Inference from Neural Recording Data: Challenges, Mathematical Bases and Research Directions
[article]
2017
arXiv
pre-print
We then review connectivity inference methods based on two major mathematical foundations, namely, descriptive model-free approaches and generative model-based approaches. ...
We first identify biophysical and technical challenges in connectivity inference along the data processing pipeline. ...
and internal funding from the Okinawa Institute of Science and Technology Graduate University. ...
arXiv:1708.01888v2
fatcat:fezbmzuzenac7mqcnqhq5sveye
The graphical brain: Belief propagation and active inference
2017
Network Neuroscience
For example, Bayesian model averaging and comparison, which link discrete and continuous states, may be implemented in thalamocortical loops. ...
To accommodate mixed generative models (of discrete and continuous states), one also has to consider link nodes or factors that enable discrete and continuous representations to talk to each other. ...
Note the formal similarity between the Bayesian network and the Forney factor graph; however, also note the differences. ...
doi:10.1162/netn_a_00018
pmid:29417960
pmcid:PMC5798592
fatcat:ew5x2cczwvarfeedfz5s6ldivm
FORECASTING FOREIGN EXCHANGE RATES WITH ARTIFICIAL NEURAL NETWORKS: A REVIEW
2004
International Journal of Information Technology and Decision Making
Several design factors significantly impact the accuracy of neural network forecasts. These factors include the selection of input variables, data preparation, and network architecture. ...
We also describe the integration of ANNs with other methods and compare the performance of ANNs with that of other forecasting methods, finding mixed results. ...
Acknowledgement This project is supported by NSFC, CAS and the City University of Hong Kong. ...
doi:10.1142/s0219622004000969
fatcat:5woran6t6veidh373r6ibivfmm
Self-Supervised Inference in State-Space Models
[article]
2022
arXiv
pre-print
Without parameterizing a generative model, we apply Bayesian update formulas using a local linearity approximation parameterized by neural networks. ...
Usage of such domain knowledge is reflected in excellent results (despite our model's simplicity) on the chaotic Lorenz system compared to fully supervised and variational inference methods. ...
The model obtained by parameterizing p(x_k | y_{<k}) directly (eq. (12)) is referred to as the recurrent filter or recurrent smoother, as it only employs recurrent neural networks (and no Bayesian recursion ...
arXiv:2107.13349v3
fatcat:rp2ionq6kbbblh6m563pg6mn4y
Learning Scalable Deep Kernels with Recurrent Structure
2017
Journal of Machine Learning Research
The resulting model, GP-LSTM, fully encapsulates the inductive biases of long short-term memory (LSTM) recurrent networks, while retaining the non-parametric probabilistic advantages of Gaussian processes ...
Many applications in speech, robotics, finance, and biology deal with sequential data, where ordering matters and recurrent structures are common. ...
This work was supported in part by NIH R01GM114311, AFRL/DARPA FA87501220324, and NSF IIS-1563887. ...
pmid:30662374
pmcid:PMC6334642
fatcat:oloxsfrnvvh53kouiredghe3tm
The time dimension of neural network models
1994
ACM SIGART Bulletin
This review attempts to provide an insightful perspective on the role of time within neural network models and the use of neural networks for problems involving time. ...
The most commonly used neural network models are defined and explained, giving mention to important technical issues but avoiding great detail. ...
It is interesting to note a formal similarity between recurrent networks and TDNNs. ...
doi:10.1145/181911.181917
fatcat:qk5nunmmrzcohncrnm4k7wd3uq
Showing results 1 — 15 out of 3,048 results