Correlation-based approach to analysis of spiking networks

Michael Krumin, Shy Shoham
2010 BMC Neuroscience  
The correlation structure of neural activity is believed to play a major role in the encoding, and possibly the decoding, of information in neural populations. In addition, some of the most fundamental and widely applied system-identification tools rely on second-order statistical properties (correlations or spectra). An arsenal of tools for identifying spike-train models from their correlations, rather than from their full observed realizations, could thus form a welcome bridge between 'classical' signal-processing ideas and tools and the field of neural spike-train analysis. Recently, several methods were developed for controlling the correlation structure of multi-channel synthetic spike trains [1,3-5], and in related work, correlation-based analysis of spike trains was used for blind identification of single-neuron models [5], for identifying compact autoregressive models for multi-channel spike trains, and for facilitating their causal network analysis [6]. However, the diversity of correlation structures that can be explained by the non-recurrent generative models used in these studies is limited, and hence methods based on such models occasionally fail when analyzing correlation structures observed in neural activity. Here, we extend this framework by deriving closed-form expressions for the correlation structure of a more powerful multivariate self- and mutually-exciting Hawkes model class driven by exogenous non-negative inputs. We demonstrate that the resulting Linear-Nonlinear-Hawkes (LNH) framework can capture the dynamics of spike trains with a generally richer multi-correlation structure. We explore several new applications of this framework, including highly compact representation of multi-channel spike-train data and causal analysis of network information flow.
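To make the model class concrete, the following is a minimal sketch of the standard multivariate Hawkes intensity and its classical second-order (spectral) characterization; the closed-form expressions referred to above extend this picture to the case where the baseline drive is an exogenous, non-negative (linear-nonlinear) input rather than a constant. The notation below is illustrative and not taken from the paper.

```latex
% Conditional intensity of an n-channel self-/mutually-exciting Hawkes process
% with baseline (exogenous) drive \mu_i(t) and interaction kernels h_{ij}:
\lambda_i(t) \;=\; \mu_i(t) \;+\; \sum_{j=1}^{n} \int_{-\infty}^{t} h_{ij}(t-s)\, dN_j(s).

% For constant baselines \mu and stationary, stable kernels
% (spectral radius of \int_0^\infty H(t)\,dt below 1), the mean rates satisfy
\bar{\lambda} \;=\; \Bigl(I - \textstyle\int_0^\infty H(t)\,dt\Bigr)^{-1}\mu,

% and the spectral density matrix (Fourier transform of the covariance density)
% has the classical closed form
S(\omega) \;=\; \frac{1}{2\pi}\,
  \bigl(I - \widetilde{H}(\omega)\bigr)^{-1}\,
  \mathrm{diag}(\bar{\lambda})\,
  \bigl(I - \widetilde{H}(\omega)\bigr)^{-\ast},
\qquad
\widetilde{H}(\omega) = \int_0^\infty H(t)\, e^{-i\omega t}\, dt .
```

Relations of this kind are what make correlation-based identification possible: given estimated correlations or spectra, one can solve for the interaction kernels instead of fitting the full point-process likelihood.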
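As an illustration of the kind of data such a model generates, here is a small simulation sketch (not the authors' code): a two-channel exponential-kernel Hawkes network simulated by Ogata thinning, with constant baselines standing in for the exogenous input, followed by a crude binned cross-correlogram of the resulting spike trains. All parameter values and names are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative (assumed) parameters for a 2-channel network ---
mu = np.array([0.5, 0.3])          # baseline rates (stand-in for the exogenous input)
alpha = np.array([[0.4, 0.2],      # alpha[i, j]: jump in channel i's intensity
                  [0.3, 0.1]])     # caused by a spike in channel j
beta = 2.0                         # exponential decay rate of all kernels
T = 500.0                          # simulation length

def simulate_hawkes(mu, alpha, beta, T, rng):
    """Ogata thinning for a multivariate Hawkes process with exponential kernels."""
    n = len(mu)
    events = [[] for _ in range(n)]
    excite = np.zeros(n)           # self-/mutually-excited part of each intensity
    t = 0.0
    while True:
        bound = mu.sum() + excite.sum()        # valid upper bound: kernels only decay
        w = rng.exponential(1.0 / bound)
        t += w
        if t > T:
            break
        excite *= np.exp(-beta * w)            # decay excitation to the candidate time
        lam = mu + excite
        if rng.uniform() * bound < lam.sum():  # accept the candidate spike
            i = rng.choice(n, p=lam / lam.sum())
            events[i].append(t)
            excite += alpha[:, i]              # a spike in channel i excites all channels
    return [np.array(e) for e in events]

spikes = simulate_hawkes(mu, alpha, beta, T, rng)

# Crude empirical cross-correlogram from binned counts: the kind of second-order
# statistic that the closed-form expressions describe analytically.
dt = 0.05
edges = np.arange(0.0, T + dt, dt)
counts = np.vstack([np.histogram(s, edges)[0] for s in spikes])
c = counts - counts.mean(axis=1, keepdims=True)
xc01 = np.correlate(c[0], c[1], mode="full") / counts.shape[1]
print("spike counts per channel:", [len(s) for s in spikes])
```

The thinning step is valid here because exponential kernels only decay between events, so the total intensity evaluated at the last candidate time bounds the intensity until the next one.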
doi:10.1186/1471-2202-11-s1-p182 pmcid:PMC3090890