
Liquid Time-constant Recurrent Neural Networks as Universal Approximators [article]

Ramin M. Hasani, Mathias Lechner, Alexander Amini, Daniela Rus, Radu Grosu
2018 arXiv   pre-print
In this paper, we introduce the notion of liquid time-constant (LTC) recurrent neural networks (RNNs), a subclass of continuous-time RNNs, with varying neuronal time-constant realized by their nonlinear  ...  Here, we also theoretically find bounds on their neuronal states and varying time-constant.  ...  In this paper, we formalize networks built based on such principles as liquid time-constant (LTC) RNNs (Sec. 2) and theoretically prove their universal approximation capabilities (Sec. 3).  ... 
arXiv:1811.00321v1 fatcat:wmyf54nl5jghrhxpilswkd3ice
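The abstract above defines LTCs by their input-dependent neuronal time constant. A minimal sketch of that idea, assuming the commonly cited ODE form dx/dt = -(1/tau + f(x, I))x + f(x, I)A with a toy choice of f, integrated with explicit Euler (all parameter values here are illustrative):

```python
import numpy as np

def ltc_step(x, I, dt, tau, A, W, b):
    """One explicit-Euler step of a liquid time-constant (LTC) neuron.

    dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
    so the effective time constant 1/(1/tau + f(x, I)) varies with the input.
    """
    f = 1.0 / (1.0 + np.exp(-(W * I + b)))      # toy bounded nonlinearity standing in for the learned one
    return x + dt * (-(1.0 / tau + f) * x + f * A)

# Drive a single neuron with a step input and watch its state respond.
x, tau, A, W, b, dt = 0.0, 1.0, 2.0, 3.0, -1.0, 0.01
trace = []
for t in range(1000):
    I = 1.0 if t > 200 else 0.0
    x = ltc_step(x, I, dt, tau, A, W, b)
    trace.append(x)
print(f"final state: {trace[-1]:.3f}")
```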

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

Wolfgang Maass, Thomas Natschläger, Henry Markram
2002 Neural Computation  
We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks.  ...  It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory.  ...  Furthermore the output of this network was available at any time, and was usually correct as soon as the liquid state of the neural circuit had absorbed enough information about the input (the initial  ... 
doi:10.1162/089976602760407955 pmid:12433288 fatcat:33ec7g6kojg3tdteeae3cjzumq

A Model for Real-Time Computation in Generic Neural Microcircuits

Wolfgang Maass, Thomas Natschläger, Henry Markram
2002 Neural Information Processing Systems  
A key challenge for neural modeling is to explain how a continuous stream of multi-modal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire  ...  neurons in real-time.  ...  The "liquid state" x(t) of the recurrent circuit consisting of n neurons was modeled by an n-dimensional vector computed by applying a low-pass filter with a time constant of 30 ms to the spike  ... 
dblp:conf/nips/MaassNM02 fatcat:xg4b5c6zwvbbdefzmwjvrre6sa
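The snippet describes the liquid state as an n-dimensional vector obtained by low-pass filtering the circuit's spike trains with a 30 ms time constant. A minimal sketch of that filtering step, with synthetic spike trains and a simple exponential (leaky-integration) filter standing in for the paper's exact preprocessing:

```python
import numpy as np

def liquid_state(spikes, dt=0.001, tau=0.030):
    """Exponentially low-pass filter binary spike trains (neurons x time bins)
    into a continuous liquid-state trajectory (time constant tau = 30 ms)."""
    n_neurons, n_bins = spikes.shape
    decay = np.exp(-dt / tau)
    x = np.zeros(n_neurons)
    states = np.empty((n_bins, n_neurons))
    for t in range(n_bins):
        x = decay * x + spikes[:, t]      # leaky integration of each spike train
        states[t] = x
    return states

rng = np.random.default_rng(0)
spikes = (rng.random((100, 500)) < 0.02).astype(float)  # 100 neurons, 500 ms at 1 ms bins
states = liquid_state(spikes)
print(states.shape)   # (500, 100): one 100-dimensional liquid state per time bin
```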

Liquid Time-constant Networks

Ramin Hasani, Mathias Lechner, Alexander Amini, Daniela Rus, Radu Grosu
2021 Proceedings of the AAAI Conference on Artificial Intelligence, 35(9)  
We introduce a new class of time-continuous recurrent neural network models.  ...  We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs.  ...  features and benefits: Liquid Time-Constant.  ... 
doi:10.1609/aaai.v35i9.16936 fatcat:nwom7mvt6zc4bnucozosxylip4

Closed-form Continuous-time Neural Models [article]

Ramin Hasani, Mathias Lechner, Alexander Amini, Lucas Liebenwein, Aaron Ray, Max Tschaikowski, Gerald Teschl, Daniela Rus
2022 arXiv   pre-print
Here, we show it is possible to closely approximate the interaction between neurons and synapses -- the building blocks of natural and artificial neural networks -- constructed by liquid time-constant  ...  Lastly, as these models are derived from liquid networks, they show remarkable performance in time series modeling, compared to advanced recurrent models.  ...  A backbone neural network layer delivers the input signals into three head networks g, f and h. f acts as a liquid time-constant for the sigmoidal time-gates of the network. g and h construct the nonlinearities  ... 
arXiv:2106.13898v2 fatcat:zab5n7l3zfbfppeznphmwmdxuu
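The snippet sketches the closed-form (CfC) architecture: a shared backbone feeding three head networks g, f and h, with f acting as a liquid time-constant for sigmoidal time gates. A hedged numpy sketch of one such cell, with single-layer heads, randomly initialized weights, and the gating form gate = sigmoid(-f * t) assumed as illustrative stand-ins for the trained networks:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cfc_cell(x, I, t, params):
    """Sketch of a closed-form continuous-time (CfC) cell: a shared backbone feeds
    three heads g, f, h; f gates a sigmoidal interpolation between g and h over time t.
    Weight names and single-layer heads are illustrative, not the paper's exact network."""
    z = np.tanh(params["Wb"] @ np.concatenate([x, I]) + params["bb"])   # backbone
    g = np.tanh(params["Wg"] @ z + params["bg"])
    f = params["Wf"] @ z + params["bf"]          # liquid time-constant head
    h = np.tanh(params["Wh"] @ z + params["bh"])
    gate = sigmoid(-f * t)                        # time-dependent sigmoidal gate
    return gate * g + (1.0 - gate) * h

rng = np.random.default_rng(1)
d_x, d_in, d_b = 4, 3, 8
params = {"Wb": rng.normal(size=(d_b, d_x + d_in)), "bb": np.zeros(d_b),
          "Wg": rng.normal(size=(d_x, d_b)), "bg": np.zeros(d_x),
          "Wf": rng.normal(size=(d_x, d_b)), "bf": np.zeros(d_x),
          "Wh": rng.normal(size=(d_x, d_b)), "bh": np.zeros(d_x)}
x_next = cfc_cell(np.zeros(d_x), rng.normal(size=d_in), t=0.1, params=params)
print(x_next)
```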

The Application of Liquid State Machines in Robot Path Planning

Yanduo Zhang, Kun Wang
2009 Journal of Computers  
Index Terms - Liquid state machines; spiking neural networks; Parallel Delta Rule; path planning  ...  This paper discusses Liquid State Machines and carries out research on spiking neural networks and the Parallel Delta Rule, using them to solve robot path-planning optimization problems, at the same  ...  But the LSM is not the same as a general recurrent neural network: it can perform real-time computation on a continuous input stream, the process being to feed the input stream into a large enough, complex recurrent neural  ... 
doi:10.4304/jcp.4.11.1182-1186 fatcat:czt7isowxzfkdixdqoq6czlo3a

Liquid Time-constant Networks [article]

Ramin Hasani, Mathias Lechner, Alexander Amini, Daniela Rus, Radu Grosu
2020 arXiv   pre-print
We introduce a new class of time-continuous recurrent neural network models.  ...  We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs.  ...  We refer to these models as liquid time-constant recurrent neural networks (LTCs). LTCs can be implemented by an arbitrary choice of ODE solvers.  ... 
arXiv:2006.04439v4 fatcat:amtvcma3pzce5b6uimjp2qq3d4
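The snippet notes that LTCs can be implemented with an arbitrary choice of ODE solver; the LTC paper itself advocates a fused explicit/implicit update. A one-step sketch of that style of update, assuming the form x_next = (x + dt*f*A) / (1 + dt*(1/tau + f)) with a toy bounded f standing in for the learned nonlinearity:

```python
import numpy as np

def ltc_fused_step(x, I, dt, tau, A, W, b):
    """One fused semi-implicit update of an LTC state:
        x_next = (x + dt * f * A) / (1 + dt * (1/tau + f))
    which stays stable for larger step sizes than a plain explicit Euler step."""
    f = 1.0 / (1.0 + np.exp(-(W * I + b)))   # illustrative bounded nonlinearity
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

x = 0.0
for t in np.arange(0.0, 1.0, 0.05):
    x = ltc_fused_step(x, I=np.sin(2 * np.pi * t), dt=0.05, tau=0.5, A=1.0, W=2.0, b=0.0)
print(f"state after 1 s: {x:.3f}")
```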

Using Deep 1D Convolutional Grated Recurrent Unit Neural Network to Optimize Quantum Molecular Properties and Predict Intramolecular Coupling Constants of Molecules of Potential Health Medications and Other Generic Molecules

David Opeoluwa Oyewola, Emmanuel Gbenga Dada, Onyeka Emebo, Olugbenga Oluseun Oluwagbemi
2022 Applied Sciences  
A one-dimensional deep convolutional gated recurrent neural network (1D-CNN-GRU) was used in this study to offer a novel forecasting model for molecular property prediction of liquids or fluids.  ...  Fluid characteristics for industrial purposes find applications in the development of various liquid household products, such as liquid detergents, drinks, beverages, and liquid health medications, amongst  ...  A comparison of the performance of molecular property systems based on a recurrent neural network, long short-term memory (LSTM), gated recurrent unit (GRU), one-dimensional convolutional neural network  ... 
doi:10.3390/app12147228 fatcat:5fifalutmzdyvdwyj55aa3jib4
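As a rough illustration of the 1D-CNN-GRU architecture named in the abstract, here is a minimal PyTorch sketch; the layer sizes, single convolutional block, and scalar regression head are assumptions, not the authors' exact configuration:

```python
import torch
from torch import nn

class Conv1dGRU(nn.Module):
    """Minimal 1D-CNN + GRU regressor of the kind the paper describes."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.gru = nn.GRU(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)          # e.g. a scalar coupling constant

    def forward(self, x):                         # x: (batch, time, n_features)
        z = self.conv(x.transpose(1, 2)).transpose(1, 2)   # convolve over the time axis
        _, h = self.gru(z)                        # final hidden state summarizes the sequence
        return self.head(h[-1])

model = Conv1dGRU(n_features=8)
print(model(torch.randn(4, 50, 8)).shape)         # torch.Size([4, 1])
```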

Liquid State Machine on SpiNNaker for Spatio-Temporal Classification Tasks

Alberto Patiño-Saucedo, Horacio Rostro-González, Teresa Serrano-Gotarredona, Bernabé Linares-Barranco
2022 Frontiers in Neuroscience  
Liquid State Machines (LSMs) are computing reservoirs composed of recurrently connected Spiking Neural Networks which have attracted research interest for their modeling capacity of biological structures  ...  The training of the readout layer is performed using a recent adaptation of back-propagation-through-time (BPTT) for SNNs, while the internal weights of the reservoir are kept static.  ...  ACKNOWLEDGMENTS We are grateful to the Advanced Processor Technologies (APT) Research Group at University of Manchester for enabling access to SpiNNaker boards and support with related software.  ... 
doi:10.3389/fnins.2022.819063 pmid:35360182 pmcid:PMC8964061 fatcat:zx74emjq2fawzghjrqib4g4tg4
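The abstract describes training only the readout with BPTT while the reservoir weights stay static. A hedged sketch of that setup, with a plain rate-based RNN standing in for the spiking reservoir (the SNN and surrogate-gradient details are omitted):

```python
import torch
from torch import nn

class FrozenReservoirClassifier(nn.Module):
    """Frozen recurrent reservoir with a trainable readout, trained by BPTT."""
    def __init__(self, n_in, n_res, n_classes):
        super().__init__()
        self.reservoir = nn.RNN(n_in, n_res, batch_first=True, nonlinearity="tanh")
        for p in self.reservoir.parameters():
            p.requires_grad = False              # keep the reservoir static
        self.readout = nn.Linear(n_res, n_classes)

    def forward(self, x):                        # x: (batch, time, n_in)
        states, _ = self.reservoir(x)
        return self.readout(states).mean(dim=1)  # average the readout over time

model = FrozenReservoirClassifier(n_in=16, n_res=200, n_classes=5)
opt = torch.optim.Adam(model.readout.parameters(), lr=1e-3)
x, y = torch.randn(8, 100, 16), torch.randint(0, 5, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()                                  # gradients flow through time into the readout only
opt.step()
print(float(loss))
```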

Neuro-Inspired Speech Recognition Based on Reservoir Computing [chapter]

Arfan Ghani
2010 Advances in Speech Recognition  
Theoretical results imply that this is a universal state machine which has no limitation on the power of the neural microcircuit as long as the reservoir and readouts fulfill the separation and approximation  ...  A major advantage of recurrent neural networks such as Elman, Bengio, and Jordan is their capability to store information for a short period of time.  ...  Furthermore, major challenges that were typically ignored in previous speech recognition research, such as noise and reverberation, appear repeatedly in recent papers.  ... 
doi:10.5772/10186 fatcat:ojsrto4ofraazaka7u76tt244i

Initialization and self‐organized optimization of recurrent neural network connectivity

Joschka Boedecker, Oliver Obst, N. Michael Mayer, Minoru Asada
2009 HFSP Journal  
Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks.  ...  RC Networks have recently received increased attention as a mathematical model for generic neural microcircuits, to investigate and explain computations in neocortical columns.  ...  Keely for proofreading the manuscript, and the Reservoir Lab at University of Ghent, Belgium, for providing the Reservoir Computing (RC) toolbox for Matlab, as well as Herbert Jaeger for making source  ... 
doi:10.2976/1.3240502 pmid:20357891 pmcid:PMC2801534 fatcat:ek2ym26zijbureujlqnhnrfunq

Fading memory and kernel properties of generic cortical microcircuit models

Wolfgang Maass, Thomas Natschläger, Henry Markram
2004 Journal of Physiology - Paris  
new results about the performance of generic neural microcircuit models for the recognition of speech that is subject to linear and non-linear time-warps, as well as for computations on time-varying firing  ...  Markram, Real-time computing without stable states: a new framework for neural computation based on perturbations, Neural Comput. 14 (11) (2002) 2531-2560, Online available as #130 from: ], and contains  ...  machine might be viewed as a universal finite state machine whose "liquid" high-dimensional analog state x(t) changes continuously over time.  ... 
doi:10.1016/j.jphysparis.2005.09.020 pmid:16310350 fatcat:juhnqbgbpre5lcs2qbvviub6ui

Perspectives of the high-dimensional dynamics of neural microcircuits from the point of view of low-dimensional readouts

Stefan Häusler, Henry Markram, Wolfgang Maass
2003 Complexity  
We demonstrate that the projection of time-varying inputs into a large recurrent neural circuit enables a linear readout neuron to classify the time-varying circuit inputs with the same power as a complex  ...  Hence a generic neural microcircuit may play a similar role for information processing as a kernel for support vector machines in machine learning.  ...  The trajectory of the recurrent neural circuit was modeled as a sequence of consecutive liquid states sampled every 20 ms.  ... 
doi:10.1002/cplx.10089 fatcat:pnzqo2hamnckvjcbfmnqvyrblm
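The snippet describes classifying time-varying circuit inputs from liquid states sampled every 20 ms with a linear readout. A minimal sketch with synthetic states and a ridge-regression readout standing in for the paper's readout training rule:

```python
import numpy as np

rng = np.random.default_rng(2)
n_states, n_dim = 400, 135
labels = rng.integers(0, 2, n_states)                                 # class of the underlying input
states = rng.normal(size=(n_states, n_dim)) + labels[:, None] * 0.3   # liquid states at 20 ms samples

X = np.hstack([states, np.ones((n_states, 1))])                       # add a bias column
lam = 1.0                                                             # ridge regularization strength
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ (2 * labels - 1))

pred = (X @ w > 0).astype(int)
print(f"training accuracy of the linear readout: {np.mean(pred == labels):.2f}")
```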

On the computational power of circuits of spiking neurons

Wolfgang Maass, Henry Markram
2004 Journal of computer and system sciences (Print)  
In this article, we initiate a rigorous mathematical analysis of the real-time computing capabilities of a new generation of models for neural computation, liquid state machines, that can be implemented  ...  computational units (neurons and synapses), as well as the traditional emphasis on offline computing in almost all theoretical approaches towards neural computation.  ...  Alternatively, one could approximate by spiking neurons any other class of feedforward neural networks that is known to have the universal approximation property, such as multi-layer perceptrons (see  ... 
doi:10.1016/j.jcss.2004.04.001 fatcat:wj4esrvwmfcgrikba6zbvltsiq

Computer Models and Analysis Tools for Neural Microcircuits [chapter]

Thomas Natschläger, Henry Markram, Wolfgang Maass
2003 Neuroscience Databases  
In particular, it describes the features of a new website (www.lsm.tugraz.at) that facilitates the creation of computer models for cortical neural microcircuits of various sizes and levels of detail, as  ...  well as tools for evaluating the computational power of these models in a Matlab environment.  ...  The basic idea is that a neural (recurrent) microcircuit may serve as an unbiased analog (fading) memory (informally referred to as "liquid") about current and preceding inputs to the circuit.  ... 
doi:10.1007/978-1-4615-1079-6_9 fatcat:uj3vjo4bbje25npzzhnzgicbme