519 Hits in 4.8 sec

Variational Latent Gaussian Process for Recovering Single-Trial Dynamics from Population Spike Trains

Yuan Zhao, Il Memming Park
2017 Neural Computation  
Recovering these latent trajectories, particularly from single-trial population recordings, may help us understand the dynamics that drive neural computation.  ...  Here, we propose a practical and efficient inference method, called the variational latent Gaussian process (vLGP).  ...  Acknowledgment We thank the reviewers for their constructive feedback. We are grateful to Arnulf Graf, Adam Kohn, Tony Movshon, and Mehrdad Jazayeri for providing the V1 dataset.  ... 
doi:10.1162/neco_a_00953 pmid:28333587 fatcat:ac4cktpxxzdnni4p57wapmjewq
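
The vLGP entry above assumes latent trajectories with a Gaussian-process prior driving Poisson spike counts. Below is a minimal sketch of that generative structure in NumPy; it simulates the kind of data such models target rather than implementing vLGP's variational inference, and all dimensions and kernel settings are illustrative.

```python
# Sketch of a GP-latent, Poisson-observation generative model (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
T, L, N = 200, 2, 30          # time bins, latent dimensions, neurons
dt = 0.01                     # bin width (s)

# Squared-exponential GP prior over each latent dimension
t = np.arange(T) * dt
K = np.exp(-0.5 * (t[:, None] - t[None, :])**2 / 0.1**2) + 1e-6 * np.eye(T)
x = rng.multivariate_normal(np.zeros(T), K, size=L).T        # (T, L) latents

# Linear-nonlinear mapping to Poisson spike counts
C = rng.normal(scale=0.5, size=(L, N))                       # loading matrix
b = np.log(5.0 * dt) * np.ones(N)                            # baseline log-rate
rates = np.exp(x @ C + b)                                    # (T, N)
spikes = rng.poisson(rates)                                  # observed counts

print(spikes.shape, spikes.mean())
```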

Temporal alignment and latent Gaussian process factor inference in population spike trains [article]

Lea Duncker, Maneesh Sahani
2018 bioRxiv   pre-print
We introduce a novel scalable approach to identifying common latent structure in neural population spike-trains, which allows for variability both in the trajectory and in the rate of progression of the  ...  We show that the new method learns to recover latent trajectories in synthetic data, and can accurately identify the trial-to-trial timing of movement-related parameters from motor cortical data without  ...  Acknowledgements We would like to thank Vincent Adam for early contributions to this project, Gergo Bohner for helpful discussions, and the Shenoy laboratory at Stanford University for sharing the centre-out  ... 
doi:10.1101/331751 fatcat:nxaijfww7fbpfakfijytgkaxo4
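
This entry addresses trial-to-trial variability in both the latent trajectory and its rate of progression. The sketch below, under simplified assumptions (a one-dimensional shared latent and a linear time warp per trial), illustrates only the time-warping idea; it is not the paper's inference method.

```python
# Sketch of trial-specific time warping of a shared latent trajectory (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
T, N, n_trials = 100, 20, 5
tau = np.linspace(0, 1, T)                       # canonical time
latent = np.sin(2 * np.pi * tau)                 # shared 1-D latent trajectory
C = rng.normal(scale=0.8, size=N)                # per-neuron loadings
b = np.log(0.05)                                 # baseline log-rate per bin

spikes = []
for _ in range(n_trials):
    speed = rng.uniform(0.7, 1.3)                # trial-specific rate of progression
    warp = np.clip(tau * speed, 0, 1)            # simple linear time warp
    x_trial = np.interp(warp, tau, latent)       # warped latent on this trial
    rates = np.exp(np.outer(x_trial, C) + b)     # (T, N) firing rates
    spikes.append(rng.poisson(rates))

print(len(spikes), spikes[0].shape)
```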

Linear dynamical neural population models through nonlinear embeddings [article]

Yuanjun Gao, Evan Archer, Liam Paninski, John P. Cunningham
2016 arXiv   pre-print
A body of recent work in modeling neural activity focuses on recovering low-dimensional latent features that capture the statistical structure of large-scale neural populations.  ...  Here, we propose fLDS, a general class of nonlinear generative models that permits the firing rate of each neuron to vary as an arbitrary smooth function of a latent, linear dynamical state.  ...  Future work includes relaxing the latent linear dynamical system assumption to incorporate more flexible latent dynamics (for example, by using a Gaussian process prior [12] or by incorporating a nonlinear  ... 
arXiv:1605.08454v2 fatcat:sgn5jdm2dre47gpz55tlx4nqcm
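
fLDS combines linear-Gaussian latent dynamics with an arbitrary smooth mapping to per-neuron firing rates. A minimal generative sketch of that structure follows; the one-hidden-layer network and all sizes are illustrative choices, not the architecture from the paper.

```python
# Sketch: linear latent dynamics, nonlinear embedding to Poisson rates (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
T, L, H, N = 150, 3, 16, 40
theta = 0.1

# Linear-Gaussian latent dynamics: x_{t+1} = A x_t + noise
A = 0.98 * np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           0.95]])
x = np.zeros((T, L))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.normal(scale=0.1, size=L)

# Nonlinear embedding: rate_n(t) = softplus(MLP(x_t))_n
W1 = rng.normal(scale=1.0, size=(L, H))
W2 = rng.normal(scale=0.3, size=(H, N))
hidden = np.tanh(x @ W1)
rates = np.log1p(np.exp(hidden @ W2))            # softplus keeps rates positive
spikes = rng.poisson(rates)
print(spikes.shape)
```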

Hierarchical models for neural population dynamics in the presence of non-stationarity [article]

Mijung Park, Jakob H. Macke
2014 arXiv   pre-print
We develop variational inference methods for learning model parameters, and demonstrate that the method can recover non-stationarities in both average firing rates and correlation structure.  ...  This variability is thought to arise from single-neuron stochasticity, neural dynamics on short time-scales, as well as from modulations of neural firing properties on long time-scales, often referred  ...  Acknowledgments We thank the authors of [5] for sharing dataset I with us and making it publicly available at: http://toliaslab.org/publications/ecker-et-al-2014/, as well as the authors of [27] for  ... 
arXiv:1410.3111v1 fatcat:dt5b3dnfojfpbinl7tgpmhoot4
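
The non-stationarity described here is slow modulation of firing properties across trials on top of fast within-trial variability. The sketch below simulates one simple version, a shared log-gain random walk across trials, to show how such modulation induces correlated trial-to-trial count variability; all parameters are illustrative.

```python
# Sketch of slow, shared gain modulation of Poisson firing rates (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
n_trials, T, N = 200, 50, 25
base_rate = rng.uniform(0.02, 0.2, size=N)       # per-bin baseline rates

# Slow modulator: a random walk in log-gain across trials (long time-scale)
log_gain = np.cumsum(rng.normal(scale=0.05, size=n_trials))

counts = np.empty((n_trials, T, N), dtype=int)
for k in range(n_trials):
    rates = base_rate * np.exp(log_gain[k])      # shared slow modulation
    counts[k] = rng.poisson(rates, size=(T, N))

# The shared gain induces correlations in trial-averaged counts across neurons
trial_means = counts.mean(axis=1)                # (n_trials, N)
print(np.corrcoef(trial_means.T)[:3, :3])
```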

Gaussian process based nonlinear latent structure discovery in multivariate spike train data

Anqi Wu, Nicholas A Roy, Stephen Keeley, Jonathan W Pillow
2017 Advances in Neural Information Processing Systems  
A large body of recent work focuses on methods for extracting low-dimensional latent structure from multi-neuron spike train data.  ...  We apply the model to spike trains recorded from hippocampal place cells and show that it compares favorably to a variety of previous methods for latent structure discovery, including variational auto-encoder  ...  In this paper, we propose the Poisson Gaussian process latent variable model (P-GPLVM) for spike train data, which allows for nonlinearity in both the latent state dynamics and in the mapping from the  ... 
pmid:31244512 pmcid:PMC6594561 fatcat:45bwbczkgfcnxlyfx4gc6z7xi4
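
P-GPLVM places Gaussian-process priors both on the latent trajectory (over time) and on each neuron's log firing rate as a function of latent position. A minimal generative sketch of that doubly nonlinear structure follows, with illustrative kernel widths; it does not implement the paper's inference.

```python
# Sketch: GP-smooth latent in time, GP tuning curves over latent space (illustrative only).
import numpy as np

rng = np.random.default_rng(4)
T, L, N = 120, 1, 15
t = np.linspace(0, 1, T)

def rbf(a, b, ell):
    d2 = np.sum((a[:, None, :] - b[None, :, :])**2, axis=-1)
    return np.exp(-0.5 * d2 / ell**2)

# GP prior over the latent trajectory (smooth in time)
Kt = rbf(t[:, None], t[:, None], ell=0.15) + 1e-6 * np.eye(T)
x = rng.multivariate_normal(np.zeros(T), Kt, size=L).T        # (T, L)

# GP prior over each neuron's log tuning curve (smooth in latent space)
Kx = rbf(x, x, ell=0.5) + 1e-6 * np.eye(T)
log_tuning = rng.multivariate_normal(np.log(0.05) * np.ones(T), Kx, size=N).T

spikes = rng.poisson(np.exp(log_tuning))                      # (T, N) counts
print(spikes.shape)
```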

Probabilistic Encoding Models for Multivariate Neural Data

Marcus A. Triplett, Geoffrey J. Goodhill
2019 Frontiers in Neural Circuits  
In particular, we discuss methods for estimating receptive fields, modeling neural population dynamics, and inferring low dimensional latent structure from a population of neurons, in the context of both  ...  An understanding of the encoding process is essential both for gaining insight into the origins of perception and for the development of brain-computer interfaces.  ...  GG is grateful for financial support from the Australian Research Council Discovery Projects DP170102263 and DP180100636.  ... 
doi:10.3389/fncir.2019.00001 pmid:30745864 pmcid:PMC6360288 fatcat:vqnhoyvhwrdotkbqlrgkeoqu3m
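
Among the topics this review covers is receptive field estimation. As one concrete example, the sketch below fits a linear receptive field by ridge-regularized regression of simulated spike counts on lagged stimulus values; the filter shape, penalty, and simulation settings are all illustrative.

```python
# Sketch of linear receptive field estimation by ridge regression (illustrative only).
import numpy as np

rng = np.random.default_rng(5)
T, D = 5000, 25                                   # time bins, filter length
stim = rng.normal(size=T)

# Design matrix of lagged stimulus values
X = np.column_stack([np.concatenate([np.zeros(d), stim[:T - d]]) for d in range(D)])
true_rf = np.exp(-np.arange(D) / 5.0) * np.sin(np.arange(D) / 2.0)
rate = np.exp(X @ true_rf * 0.5 - 1.0)
y = rng.poisson(rate)

# Ridge estimate (a Gaussian approximation to the spiking likelihood)
lam = 10.0
rf_hat = np.linalg.solve(X.T @ X + lam * np.eye(D), X.T @ y)
print(np.corrcoef(rf_hat, true_rf)[0, 1])
```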

Building population models for large-scale neural recordings: opportunities and pitfalls [article]

Cole Hurwitz, Nina Kudryashova, Arno Onken, Matthias H. Hennig
2021 arXiv   pre-print
This has driven the development of new statistical models for analyzing and interpreting neural population activity. Here we provide a broad overview of recent developments in this area.  ...  Modern recording technologies now enable simultaneous recording from large numbers of neurons.  ...  A first approach is to recover neural spike trains using deconvolution methods.  ... 
arXiv:2102.01807v4 fatcat:teymhliyd5bq7hkulxd46f2zxm
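
The snippet mentions recovering spike trains from calcium fluorescence by deconvolution. A minimal sketch of that idea follows, assuming an exponential-decay calcium kernel and using non-negative least squares; practical pipelines use more elaborate constrained or sparse deconvolution, and all parameters here are illustrative.

```python
# Sketch of non-negative deconvolution of a calcium trace (illustrative only).
import numpy as np
from scipy.optimize import nnls
from scipy.linalg import toeplitz

rng = np.random.default_rng(6)
T, decay = 300, 0.9

# Forward model: fluorescence = spikes convolved with an exponential kernel, plus noise
true_spikes = (rng.random(T) < 0.05).astype(float)
kernel = decay ** np.arange(T)
A = toeplitz(kernel, np.zeros(T))                # lower-triangular convolution matrix
fluor = A @ true_spikes + rng.normal(scale=0.1, size=T)

# Non-negative deconvolution: find s >= 0 minimizing ||A s - fluor||^2
s_hat, _ = nnls(A, fluor)
print(np.corrcoef(s_hat, true_spikes)[0, 1])
```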

Inferring single-trial neural population dynamics using sequential auto-encoders

Chethan Pandarinath, Daniel J. O'Shea, Jasmine Collins, Rafal Jozefowicz, Sergey D. Stavisky, Jonathan C. Kao, Eric M. Trautmann, Matthew T. Kaufman, Stephen I. Ryu, Leigh R. Hochberg, Jaimie M. Henderson, Krishna V. Shenoy (+2 others)
2018 Nature Methods  
We introduce Latent Factor Analysis via Dynamical Systems (LFADS), a deep learning method to infer latent dynamics from single-trial neural spiking data.  ...  LFADS uses a nonlinear dynamical system to infer the dynamics underlying observed spiking activity and to extract 'de-noised' single-trial firing rates.  ...  Churchland for contributions to data collection for monkey J, Christine Blabe and Paul Nuyujukian for assistance with research sessions with participant T5, Emad Eskandar for array implantation with participant  ... 
doi:10.1038/s41592-018-0109-9 pmid:30224673 fatcat:tpolivjosrg23aexvcfkcbx7g4
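
LFADS decodes spiking data through a recurrent generator whose state is read out into low-dimensional factors and then Poisson firing rates. The sketch below shows only that decoder-side rollout with a toy vanilla RNN; LFADS itself uses GRUs, an encoder, a controller, and variational training, and all sizes here are illustrative.

```python
# Sketch of a recurrent generator decoding into factors and Poisson rates (illustrative only).
import numpy as np

rng = np.random.default_rng(7)
T, G, F, N = 100, 32, 8, 50

W_hh = rng.normal(scale=1.0 / np.sqrt(G), size=(G, G))   # generator recurrence
W_hf = rng.normal(scale=0.3, size=(G, F))                # state -> factors
W_fr = rng.normal(scale=0.3, size=(F, N))                # factors -> log-rates
b = np.log(0.05)

h = rng.normal(size=G)                    # stand-in for an inferred initial condition
rates = np.empty((T, N))
for t in range(T):
    h = np.tanh(W_hh @ h)                 # autonomous generator dynamics
    factors = h @ W_hf                    # low-dimensional factors f_t
    rates[t] = np.exp(factors @ W_fr + b)

spikes = rng.poisson(rates)               # de-noised rates -> observed spike counts
print(spikes.shape)
```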

Deep inference of latent dynamics with spatio-temporal super-resolution using selective backpropagation through time [article]

Feng Zhu, Andrew R. Sedler, Harrison A. Grier, Nauman Ahad, Mark A. Davenport, Matthew T. Kaufman, Andrea Giovannucci, Chethan Pandarinath
2021 arXiv   pre-print
Our novel neural network training strategy, selective backpropagation through time (SBTT), enables learning of deep generative models of latent dynamics from data in which the set of observed variables  ...  The resulting models are able to infer activity for missing samples by combining observations with learned latent dynamics.  ...  Within our application domain, a variety of methods have been developed to infer latent dynamical structure from neural population activity on individual trials, including those based on Gaussian processes  ... 
arXiv:2111.00070v1 fatcat:bfnhah7ygnhmrkqkdzxcanb5v4
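
The key mechanism of SBTT is evaluating the reconstruction loss only at observed samples, so missing channels or time steps contribute no gradient. The sketch below expresses that as a masked Poisson negative log-likelihood in plain NumPy; a real implementation would apply the same mask inside an autodiff framework, and the shapes and rates here are illustrative.

```python
# Sketch of a masked Poisson negative log-likelihood over observed entries only (illustrative).
import numpy as np

def masked_poisson_nll(rates, spikes, observed_mask):
    """Mean Poisson NLL over observed entries only (constant terms dropped)."""
    nll = rates - spikes * np.log(rates + 1e-8)
    return (nll * observed_mask).sum() / observed_mask.sum()

rng = np.random.default_rng(8)
T, N = 80, 30
rates = np.full((T, N), 0.1)
spikes = rng.poisson(rates)
mask = rng.random((T, N)) < 0.3            # only 30% of samples observed
print(masked_poisson_nll(rates, spikes, mask))
```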

Inferring single-trial neural population dynamics using sequential auto-encoders [article]

Chethan Pandarinath, Daniel J. O'Shea, Jasmine Collins, Rafal Jozefowicz, Sergey D. Stavisky, Jonathan C. Kao, Eric M. Trautmann, Matthew T. Kaufman, Stephen I. Ryu, Leigh R. Hochberg, Jaimie M. Henderson, Krishna V. Shenoy (+2 others)
2017 bioRxiv   pre-print
Here we introduce Latent Factor Analysis via Dynamical Systems (LFADS), a deep learning method to infer latent dynamics from single-trial neural spiking data.  ...  LFADS uses a nonlinear dynamical system (a recurrent neural network) to infer the dynamics underlying observed population activity and to extract 'de-noised' single-trial firing rates from neural spiking  ...  for assistance with research sessions with participant T7.  ... 
doi:10.1101/152884 fatcat:vgfccx6z4fhcbjguipykw2ethe

Targeted Neural Dynamical Modeling [article]

Cole Hurwitz, Akash Srivastava, Kai Xu, Justin Jude, Matthew G. Perich, Lee E. Miller, Matthias H. Hennig
2021 arXiv   pre-print
Latent dynamics models have emerged as powerful tools for modeling and interpreting neural population activity.  ...  We implement TNDM as a sequential variational autoencoder and validate it on simulated recordings and recordings taken from the premotor and motor cortex of a monkey performing a center-out reaching task  ...  Acknowledgements We thank Alessandro Facchin and Nina Kudryashova for the code contributions and for the insightful discussions.  ... 
arXiv:2110.14853v1 fatcat:u4wfvankobejvig3dfvlz6u5mm
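
TNDM partitions the latent state into behaviorally relevant and irrelevant parts: the neural reconstruction depends on both, while behavior is decoded from the relevant part only. The sketch below illustrates that partitioning with linear read-outs and squared-error terms, which are simplifications of the actual sequential-VAE objective; all names and sizes are illustrative.

```python
# Sketch of a partitioned-latent reconstruction objective (illustrative only).
import numpy as np

def partitioned_loss(z_rel, z_irr, W_neural, W_behav, neural, behav, alpha=1.0):
    """Combined reconstruction loss; behavior is decoded from z_rel only."""
    neural_hat = np.concatenate([z_rel, z_irr], axis=1) @ W_neural
    behav_hat = z_rel @ W_behav
    return np.mean((neural_hat - neural)**2) + alpha * np.mean((behav_hat - behav)**2)

rng = np.random.default_rng(9)
T, L_rel, L_irr, N, B = 100, 2, 3, 40, 2
z_rel, z_irr = rng.normal(size=(T, L_rel)), rng.normal(size=(T, L_irr))
W_neural = rng.normal(size=(L_rel + L_irr, N))
W_behav = rng.normal(size=(L_rel, B))
neural = rng.poisson(1.0, size=(T, N)).astype(float)     # stand-in neural observations
behav = rng.normal(size=(T, B))                          # stand-in behavior (e.g. hand velocity)
print(partitioned_loss(z_rel, z_irr, W_neural, W_behav, neural, behav))
```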

Parallel inference of hierarchical latent dynamics in two-photon calcium imaging of neuronal populations [article]

Luke Yuri Prince, Shahab Bakhtiari, Colleen J. Gillon, Blake A Richards
2021 bioRxiv   pre-print
For spiking data, such latent variable modelling can treat the data as a set of point-processes, due to the fact that spiking dynamics occur on a much faster timescale than the computational dynamics being  ...  Dynamic latent variable modelling has provided a powerful tool for understanding how populations of neurons compute.  ...  C) Rotational dynamics inferred from condition-averaged (Top) and single-trial spikes (Bottom) using LFADS. D) Same as C), but for VaLPACa. E) Same as C), but for OASIS+LFADS.  ... 
doi:10.1101/2021.03.05.434105 fatcat:xfeh6jtzkrbfpkypuqaf6gjo5m
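
The modeling premise here is a timescale hierarchy: latent computational dynamics drive fast point-process spiking, which in turn drives slowly decaying calcium fluorescence. The sketch below simulates only that forward generative chain with illustrative parameters; it is not the VaLPACa inference procedure.

```python
# Sketch of the latents -> spikes -> calcium generative chain (illustrative only).
import numpy as np

rng = np.random.default_rng(10)
T, L, N, dt = 500, 2, 20, 0.01

# Slow latent dynamics (damped rotation)
A = 0.995 * np.array([[np.cos(0.05), -np.sin(0.05)],
                      [np.sin(0.05),  np.cos(0.05)]])
x = np.zeros((T, L))
x[0] = [1.0, 0.0]
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.normal(scale=0.02, size=L)

# Fast point-process observations: Poisson spike counts per bin
C = rng.normal(scale=0.8, size=(L, N))
spikes = rng.poisson(np.exp(x @ C + np.log(2.0 * dt)))

# Calcium: AR(1) convolution of spikes plus observation noise
gamma, sigma = 0.95, 0.05
calcium = np.zeros((T, N))
for t in range(1, T):
    calcium[t] = gamma * calcium[t - 1] + spikes[t]
fluor = calcium + rng.normal(scale=sigma, size=(T, N))
print(fluor.shape)
```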

Variational online learning of neural dynamics [article]

Yuan Zhao, Il Memming Park
2020 arXiv   pre-print
We developed a flexible online learning framework for latent nonlinear state dynamics and filtered latent states.  ...  It brings the challenge of learning both latent neural state and the underlying dynamical system because neither is known for neural systems a priori.  ...  Clinically, a nonlinear state space model provides a basis for nonlinear feedback control as a potential treatment for neurological diseases that arise from diseased dynamical states.  ... 
arXiv:1707.09049v5 fatcat:oq5wq7jwprbd5fiywiu3zbwv74
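
This entry concerns online (filtering) estimation of latent states under nonlinear dynamics from spiking observations. To illustrate the filtering problem itself, the sketch below runs a bootstrap particle filter on a toy nonlinear state-space model with Poisson observations; this is a different algorithm from the paper's variational recursion, and all model settings are illustrative.

```python
# Sketch: bootstrap particle filtering of a nonlinear latent state from spikes (illustrative only).
import numpy as np

rng = np.random.default_rng(11)
T, N, P = 200, 10, 500                         # time bins, neurons, particles
C = rng.normal(scale=0.7, size=N)              # loading of the 1-D latent
b = np.log(0.1)

def f(x):                                      # nonlinear latent dynamics
    return 0.9 * x + 0.2 * np.sin(x)

# Simulate ground truth and spike counts
x_true = np.zeros(T)
spikes = np.zeros((T, N), dtype=int)
for t in range(1, T):
    x_true[t] = f(x_true[t - 1]) + rng.normal(scale=0.2)
    spikes[t] = rng.poisson(np.exp(C * x_true[t] + b))

# Bootstrap particle filter: propagate, weight by Poisson likelihood, resample
particles = rng.normal(size=P)
x_filt = np.zeros(T)
for t in range(1, T):
    particles = f(particles) + rng.normal(scale=0.2, size=P)
    rates = np.exp(np.outer(particles, C) + b)                  # (P, N)
    logw = (spikes[t] * np.log(rates) - rates).sum(axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    x_filt[t] = np.sum(w * particles)
    particles = particles[rng.choice(P, size=P, p=w)]            # resample

print(np.corrcoef(x_filt[1:], x_true[1:])[0, 1])
```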

Stimulus-choice (mis)alignment in primate area MT

Yuan Zhao, Jacob L. Yates, Aaron J. Levi, Alexander C. Huk, Il Memming Park, Daniele Marinazzo
2020 PLoS Computational Biology  
For stimuli near perceptual threshold, the trial-by-trial activity of single neurons in many sensory areas is correlated with the animal's perceptual report.  ...  Using a statistical nonlinear dimensionality reduction technique on single-trial ensemble recordings from the middle temporal (MT) area during perceptual-decision-making, we extracted low-dimensional latent  ...  Acknowledgments We thank the anonymous reviewers for their helpful comments. Memming thanks Hendrikje Nienborg for stimulating discussions.  ... 
doi:10.1371/journal.pcbi.1007614 pmid:32421716 fatcat:mjfgobg3trfc3ddvyzwutjohky

Inferring the collective dynamics of neuronal populations from single-trial spike trains using mechanistic models [article]

Christian Donner, Manfred Opper, Josef Ladenbauer
2019 bioRxiv   pre-print
dynamics, all from a single trial.  ...  To efficiently estimate the model parameters and compare different model variants we compute the likelihood of observed single-trial spike trains by leveraging analytical methods for spiking neuron models  ...  Acknowledgments We thank Christian Pozzorini for making available the in-vitro data. CD was supported by the German Research Foundation via GRK1589/2 and CRC1295.  ... 
doi:10.1101/671909 fatcat:wajgugbwhfdh5mwd3r6a5w6iv4
Showing results 1 — 15 out of 519 results