Neuronal message passing using Mean-field, Bethe, and Marginal approximations

Thomas Parr, Dimitrije Markovic, Stefan J. Kiebel, Karl J. Friston
2019 Scientific Reports  
Neuronal computations rely upon local interactions across synapses. For a neuronal network to perform inference, it must integrate information from locally computed messages that are propagated among elements of that network. We review the form of two popular (Bayesian) message passing schemes and consider their plausibility as descriptions of inference in biological networks. These are variational message passing and belief propagation - each of which is derived from a free energy functional that relies upon different approximations (mean-field and Bethe, respectively). We begin with an overview of these schemes and illustrate the form of the messages required to perform inference, using Hidden Markov Models as generative models. Throughout, we use factor graphs to show the form of the generative models and of the messages they entail. We consider how these messages might manifest neuronally and simulate the inferences they perform. While variational message passing offers a simple and neuronally plausible architecture, it falls short of the inferential performance of belief propagation. In contrast, belief propagation allows exact computation of marginal posteriors at the expense of the architectural simplicity of variational message passing. As a compromise between these two extremes, we offer a third approach - marginal message passing - that features a simple architecture while approximating the performance of belief propagation. Finally, we link these formal considerations to accounts of neurological and psychiatric syndromes in terms of aberrant message passing.
doi:10.1038/s41598-018-38246-3 pmid:30760782 pmcid:PMC6374414
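To make the contrast concrete: on a chain-structured model such as a Hidden Markov Model, belief propagation reduces to the forward-backward algorithm and yields exact marginal posteriors. The sketch below is a minimal, self-contained illustration of that scheme on a toy two-state HMM; all numbers are hypothetical and not taken from the paper, which works with factor graphs and free energy functionals rather than this direct recursion.

```python
# Belief propagation on a chain (forward-backward) for a toy 2-state HMM.
# Transition matrix A, likelihood matrix B, and prior pi are illustrative.

def forward_backward(A, B, pi, obs):
    """Exact posterior marginals p(s_t | o_1..o_T) on a hidden Markov chain.

    A[i][j] : probability of s_t = j given s_{t-1} = i
    B[i][k] : likelihood of observation k given state i
    pi[i]   : prior over the initial hidden state
    obs     : list of observation indices
    """
    n, T = len(pi), len(obs)
    # Forward messages: alpha_t(j) proportional to p(s_t = j, o_1..o_t)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n)]]
    for t in range(1, T):
        alpha.append([B[j][obs[t]] * sum(alpha[-1][i] * A[i][j] for i in range(n))
                      for j in range(n)])
    # Backward messages: beta_t(i) proportional to p(o_{t+1}..o_T | s_t = i)
    beta = [[1.0] * n]
    for t in range(T - 2, -1, -1):
        beta.insert(0, [sum(A[i][j] * B[j][obs[t + 1]] * beta[0][j] for j in range(n))
                        for i in range(n)])
    # Combine and normalise: posterior marginal at each time step
    posteriors = []
    for a, b in zip(alpha, beta):
        w = [x * y for x, y in zip(a, b)]
        z = sum(w)
        posteriors.append([x / z for x in w])
    return posteriors

A  = [[0.9, 0.1], [0.2, 0.8]]   # sticky transitions (hypothetical)
B  = [[0.8, 0.2], [0.3, 0.7]]   # observation likelihoods (hypothetical)
pi = [0.5, 0.5]
posteriors = forward_backward(A, B, pi, obs=[0, 0, 1])
```

Variational message passing would replace the backward recursion with iterated mean-field updates, trading the exactness of these marginals for a simpler, purely local update rule.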