6,430 Hits in 11.8 sec

Bayesian Imaging With Data-Driven Priors Encoded by Neural Networks: Theory, Methods, and Algorithms [article]

Matthew Holden, Marcelo Pereyra, Konstantinos C. Zygalakis
2021 arXiv   pre-print
data by using a variational autoencoder or a generative adversarial network.  ...  This paper proposes a new methodology for performing Bayesian inference in imaging inverse problems where the prior knowledge is available in the form of training data.  ...  Acknowledgments The authors are grateful for useful discussions with Andrés Almansa.  ... 
arXiv:2103.10182v1 fatcat:cyleliu7prfotnnf5tlns4ky24

The optimally designed autoencoder network for compressed sensing

Zufan Zhang, Yunfeng Wu, Chenquan Gan, Qingyi Zhu
2019 EURASIP Journal on Image and Video Processing  
Meanwhile, a trained decoder sub-network solves the CS recovery problem by learning the structure features within the training data.  ...  To this end, a deep learning based stacked sparse denoising autoencoder compressed sensing (SSDAE_CS) model, which mainly consists of an encoder sub-network and a decoder sub-network, is proposed and analyzed  ...  Bayesian algorithms solve the sparse recovery problem by taking into account a prior knowledge of the sparse signal distribution, e.g., Bayesian via Laplace Prior (BCS-LP) [22] and Bayesian framework  ... 
doi:10.1186/s13640-019-0460-5 fatcat:stybudg7nfgvxfpm5swoytno2e

Variational Denoising Network: Toward Blind Noise Modeling and Removal [article]

Zongsheng Yue, Hongwei Yong, Qian Zhao, Lei Zhang, Deyu Meng
2020 arXiv   pre-print
Specifically, an approximate posterior, parameterized by deep neural networks, is presented by taking the intrinsic clean image and noise variances as latent variables conditioned on the input noisy image  ...  On one hand, as other data-driven deep learning methods, our method, namely variational denoising network (VDN), can perform denoising efficiently due to its explicit form of posterior expression.  ...  Data-driven Deep Learning based Methods: Instead of pre-setting image prior, deep learning methods directly learn a denoiser (formed as a deep neural network) from noisy to clean ones on a large collection  ... 
arXiv:1908.11314v4 fatcat:adxcypvc7jhnnn7imwbocyp33u

An End-To-End Bayesian Segmentation Network Based on a Generative Adversarial Network for Remote Sensing Images

Dehui Xiong, Chu He, Xinlong Liu, Mingsheng Liao
2020 Remote Sensing  
First, fully convolutional networks (FCNs) and GANs are utilized to realize the derivation of the prior probability and the likelihood to the posterior probability in Bayesian theory.  ...  In this paper, we present an end-to-end Bayesian segmentation network based on generative adversarial networks (GANs) for remote sensing images.  ...  The first one is the encoder-decoder neural network category, which includes U-Net [15] and SegNet [16] . In the encoder network, the spatial dimension is gradually reduced with a pooling layer.  ... 
doi:10.3390/rs12020216 fatcat:nuwkc373szh2hcj3szomqipx3y

Incorporating Domain Knowledge into Deep Neural Networks [article]

Tirtharaj Dash, Sharad Chitlangia, Aditya Ahuja, Ashwin Srinivasan
2021 arXiv   pre-print
We present a survey of ways in which domain-knowledge has been included when constructing models with neural networks.  ...  This paper examines two broad approaches to encoding such knowledge, as logical and numerical constraints, and describes techniques and results obtained in several sub-categories under each of these approaches  ...  Priors Seminal work by [Neal, 1995] on Bayesian learning in neural networks showed how domain-knowledge could help build a prior probability distribution over neural network parameters.  ... 
arXiv:2103.00180v2 fatcat:g6hpc4zqw5ezlf67u6e7fhhqjy

2021 Index IEEE Transactions on Neural Networks and Learning Systems Vol. 32

2021 IEEE Transactions on Neural Networks and Learning Systems  
The primary entry includes the coauthors' names, the title of the paper or other item, and its location, specified by the publication abbreviation, year, month, and inclusive pagination.  ...  The Subject Index contains entries describing the item under all appropriate subject headings, plus the first author's name, the publication abbreviation, month, and year, and inclusive pages.  ...  Fuzzy set theory: Data-Driven Intelligent Warning Method for Membrane Fouling.  ... 
doi:10.1109/tnnls.2021.3134132 fatcat:2e7comcq2fhrziselptjubwjme

Hands-on Bayesian Neural Networks – a Tutorial for Deep Learning Users [article]

Laurent Valentin Jospin and Wray Buntine and Farid Boussaid and Hamid Laga and Mohammed Bennamoun
2022 arXiv   pre-print
Bayesian statistics offer a formalism to understand and quantify the uncertainty associated with deep neural network predictions.  ...  Stochastic Artificial Neural Networks trained using Bayesian methods.  ...  ACKNOWLEDGEMENTS This material is partially based on research sponsored by the Australian Research Council https://www.arc.gov.au/ (Grants DP150100294 and DP150104251), and Air Force Research Laboratory  ... 
arXiv:2007.06823v3 fatcat:desypxpalfdg7daugvw2u7db4a

InverseNet: Solving Inverse Problems with Splitting Networks [article]

Kai Fan, Qi Wei, Wenlin Wang, Amit Chakraborty, Katherine Heller
2017 arXiv   pre-print
with other image processing algorithms.  ...  forward model associated with the data term and one handling the denoising of the output from the former network, i.e., the inverted version, associated with the prior/regularization term.  ...  By leveraging the powerful approximation ability of deep neural networks, these deep learning based data-driven methods have achieved state-of-the-art performance in many challenging inverse problems like  ... 
arXiv:1712.00202v1 fatcat:p3aootffcjgevmcqp27vlgxf3y

Multi-Scale Information, Network, Causality, and Dynamics: Mathematical Computation and Bayesian Inference to Cognitive Neuroscience and Aging [chapter]

Michelle Yongmei Wang
2013 Functional Brain Mapping and the Endeavor to Understand the Working Brain  
Acknowledgements Preparation of this chapter is supported in part by a grant from the National Institute of Aging, K25AG033725.  ...  Author details Michelle Yongmei Wang * Address all correspondence to: ymw@illinois.edu Departments of Statistics, Psychology, and Bioengineering, Beckman Institute, University of Illinois at Urbana-Champaign  ...  Bayesian statistical methods in conjunction with Bayesian networks offer an efficient and principled approach for avoiding data overfitting.  ... 
doi:10.5772/55262 fatcat:go2r6jruyzdqrp4lqcnt64j7va

An efficient MDL-based construction of RBF networks

Aleš Leonardis, Horst Bischof
1998 Neural Networks  
We test the proposed method on function approximation and classification tasks, and compare it with some other recently proposed methods.  ...  By iteratively combining these two procedures we achieve a controlled way of training and modifying RBF networks, which balances accuracy, training time, and complexity of the resulting network.  ...  Leonardis also acknowledges partial support by the Ministry of Science and Technology of Republic of Slovenia (Projects J2-6187, J2-8829), the European Union Copernicus Program (Grant 1068 RECCAD), and  ... 
doi:10.1016/s0893-6080(98)00051-3 pmid:12662797 fatcat:m7p5rvqamzfkzagy2tdmzgjg2u

Simplifying Neural Networks by Soft Weight-Sharing

Steven J. Nowlan, Geoffrey E. Hinton
1992 Neural Computation  
This is a sensible and quite efficient algorithm to use for estimating the mixture parameters when we are dealing with a stationary data distribution.  ...  negligible so the weights are initially driven primarily by the derivative of the data-misfit term.  ... 
doi:10.1162/neco.1992.4.4.473 fatcat:g5diinzi5ja53gk2dkuoi3rhiy

Object Recognition Using Deep Neural Networks: A Survey [article]

Soren Goyal, Paul Benjamin
2014 arXiv   pre-print
The paper briefly describes the history of research in Neural Networks and describes several of the recent advances in this field.  ...  Recognition of objects using Deep Neural Networks is an active area of research and many breakthroughs have been made in the last few years.  ...  First, neural networks are data driven self-adaptive algorithms; they require no prior knowledge of the data or underlying properties.  ... 
arXiv:1412.3684v1 fatcat:zh2jlxncbzgofipvkgbfwsmt4i

Brain-Inspired Hardware Solutions for Inference in Bayesian Networks

Leila Bagheriye, Johan Kwisthout
2021 Frontiers in Neuroscience  
(i.e., computing posterior probabilities) in Bayesian networks using a conventional computing paradigm turns out to be inefficient in terms of energy, time, and space, due to the substantial resources required  ...  This comprehensive review paper discusses different hardware implementations of Bayesian networks considering different devices, circuits, and architectures, as well as a more futuristic overview to solve  ...  on medical image data.  ... 
doi:10.3389/fnins.2021.728086 pmid:34924925 pmcid:PMC8677599 fatcat:tihogzl6tfbpjdybwpggllwd5u

Equilibrated Zeroth-Order Unrolled Deep Networks for Accelerated MRI [article]

Zhuo-Xu Cui, Jing Cheng, Qingyong Zhu, Yuanyuan Liu, Sen Jia, Kankan Zhao, Ziwen Ke, Wenqi Huang, Haifeng Wang, Yanjie Zhu, Dong Liang
2021 arXiv   pre-print
) of the regularizer with a network module, which appears more explainable and predictable compared to common data-driven networks.  ...  Recently, model-driven deep learning unrolls a certain iterative algorithm of a regularization model into a cascade network by replacing the first-order information (i.e., (sub)gradient or proximal operator  ...  Uncertainty estimation in medical image denoising with bayesian deep image prior.  ... 
arXiv:2112.09891v2 fatcat:zcfmho7opjahlbosqevc4wk3yy

BayesFlow: Learning complex stochastic models with invertible neural networks [article]

Stefan T. Radev, Ulf K. Mertens, Andreas Voss, Lynton Ardizzone, Ullrich Köthe
2020 arXiv   pre-print
With this work, we propose a novel method for globally amortized Bayesian inference based on invertible neural networks which we call BayesFlow.  ...  In addition, our method incorporates a summary network trained to embed the observed data into maximally informative summary statistics.  ...  We also thank Francis Tuerlinckx and Stijn Verdonck for their support and thought-provoking ideas. References  ... 
arXiv:2003.06281v4 fatcat:fsoeg74glrdkxkn63d6fdjbtzm
Showing results 1 — 15 out of 6,430 results