208 Hits in 2.6 sec

Untrained Graph Neural Networks for Denoising [article]

Samuel Rey, Santiago Segarra, Reinhard Heckel, Antonio G. Marques
2021 arXiv   pre-print
This paper introduces two untrained graph neural network architectures for graph signal denoising, provides theoretical guarantees for their denoising capabilities in a simple setup, and numerically validates ... While there are many well-performing methods for denoising signals defined on regular supports, such as images defined on two-dimensional grids of pixels, many important classes of signals are defined ... Successful examples of this approach can be found in the subareas of network analytics, machine learning over graphs, and graph signal processing (GSP) [2]-[4], with graph neural networks (GNNs) and ...
arXiv:2109.11700v1 fatcat:aawkjh55rfhc7jgo4w3sc6bigy
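As background for this entry, graph-signal denoising is commonly benchmarked against classical Laplacian (Tikhonov) regularization. The sketch below illustrates that baseline on a toy graph; it is a generic GSP example, not the paper's untrained-GNN architecture, and the graph, signal, and `alpha` are made-up illustrative values.

```python
import numpy as np

# Classical graph-signal denoising baseline:
#   x_hat = argmin_x ||x - y||^2 + alpha * x^T L x
#         = (I + alpha * L)^{-1} y
# where L is the combinatorial graph Laplacian.

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def tikhonov_denoise(y, A, alpha=1.0):
    n = len(y)
    return np.linalg.solve(np.eye(n) + alpha * laplacian(A), y)

# 4-node path graph; a smooth signal plus noise
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
x = np.array([1.0, 1.1, 1.2, 1.3])       # smooth on the path graph
y = x + 0.3 * rng.standard_normal(4)     # noisy observation
x_hat = tikhonov_denoise(y, A, alpha=2.0)
```

Larger `alpha` enforces more smoothness across edges of the graph; `alpha=0` returns the noisy observation unchanged.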

Denoising convolutional neural networks for photoacoustic microscopy [article]

Xianlin Song, Kanggao Tang, Jianshuang Wei, Lingfang Song
2020 arXiv   pre-print
In order to solve the problem of low SNR of photoacoustic images, we use a feedforward denoising convolutional neural network to further process the obtained images, so as to obtain higher-SNR images and ... segmented a training set containing 400 images, and then used it for network training. ... The denoising convolutional neural network is a new deep learning network, greatly enhanced on the basis of the traditional denoising neural network, that innovatively realizes the learning for ...
arXiv:2009.13913v1 fatcat:x7vnw4qxpfd4lkx6af2sy4r4ki

On Architecture Selection for Linear Inverse Problems with Untrained Neural Networks

Yang Sun, Hangdong Zhao, Jonathan Scarlett
2021 Entropy  
While pre-trained generative models are perhaps the most common, it has additionally been shown that even untrained neural networks can serve as excellent priors in various imaging applications.  ...  In recent years, neural network based image priors have been shown to be highly effective for linear inverse problems, often significantly outperforming conventional methods that are based on sparsity  ...  Broadly speaking, our work is motivated by the following gaps in the literature on inverse problems with untrained neural networks: • Untrained neural networks invariably come with architectural hyperparameters  ... 
doi:10.3390/e23111481 pmid:34828179 pmcid:PMC8623203 fatcat:qna2tmzjjvdcfiigcnpfwtlena

How Powerful is Graph Convolution for Recommendation? [article]

Yifei Shen, Yongji Wu, Yao Zhang, Caihua Shan, Jun Zhang, Khaled B. Letaief, Dongsheng Li
2021 arXiv   pre-print
Graph convolutional networks (GCNs) have recently enabled a popular class of algorithms for collaborative filtering (CF).  ...  By identifying the critical role of smoothness, a key concept in graph signal processing, we develop a unified graph convolution-based framework for CF.  ...  Nevertheless, linear functions are non-trivial to learn for a neural network trained with SGD.  ... 
arXiv:2108.07567v1 fatcat:qv54soia5vguro46isbv5ukgbu

An Underparametrized Deep Decoder Architecture for Graph Signals [article]

Samuel Rey, Antonio G. Marques, Santiago Segarra
2019 arXiv   pre-print
A novel element is the incorporation of upsampling operators accounting for the structure of the supporting graph, which is achieved by considering a systematic graph coarsening approach based on hierarchical ... can also outperform state-of-the-art methods in several tasks such as image compression and denoising. ... transform (GFT) and that can be used for tasks such as compression, denoising, and inpainting of signals on graphs.
arXiv:1908.00878v1 fatcat:a62szb4npfeo3hgpojgwd7kwby

Artificial Language Training Reveals the Neural Substrates Underlying Addressed and Assembled Phonologies

Leilei Mei, Gui Xue, Zhong-Lin Lu, Qinghua He, Mingxia Zhang, Miao Wei, Feng Xue, Chuansheng Chen, Qi Dong, Emmanuel Andreas Stamatakis
2014 PLoS ONE  
Specifically, compared to untrained words, trained words in the assembled phonology group showed stronger activation in the addressed phonology network and less activation in the assembled phonology network  ...  At the neural level, we found a clear dissociation of the neural pathways for addressed and assembled phonologies: There was greater involvement of the anterior cingulate cortex, posterior cingulate cortex  ...  In contrast, activation in the assembled phonology network (i.e., bilateral PCG/IFG and SMG) was weaker for trained words than for untrained words (Table 2 ).  ... 
doi:10.1371/journal.pone.0093548 pmid:24676060 pmcid:PMC3968146 fatcat:iofjvqi7lndxvguxjhfczqhvxy

Multi-Scale Analysis and Pattern Recognition of Ultrasonic Signals of PD in a Liquid/Solid Composite of an Oil-Filled Terminal

Wang, Zhang, Li, Li, Gao, Guo
2020 Energies  
In summary, the method of combining multi-scale analysis and neural networks is used to distinguish the five discharge types by extracting the characteristic values of the characteristic signals. ... Finally, we designed six characteristic parameters of the ultrasound signal, and screened three feature quantities by a back-propagation (BP) neural network to distinguish between plate-to-plate air gap ... The algorithm of the BP network [26, 27] is described as follows: when applying an untrained network for data classification and pattern recognition, it is first necessary to determine the data of ...
doi:10.3390/en13020366 fatcat:j47wbmvpxjhwnpvvn7khfqybpy
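The BP (back-propagation) training procedure this entry relies on can be sketched generically: forward pass, squared-error loss, chain-rule gradients, gradient-descent weight update. The network shape, learning rate, and XOR stand-in task below are illustrative assumptions, not the paper's PD-classification setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-4-1 sigmoid network trained on XOR as a stand-in classification task
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def loss():
    return ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - t) ** 2).mean()

loss_before = loss()
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)              # forward: hidden layer
    y = sigmoid(h @ W2 + b2)              # forward: output layer
    d2 = (y - t) * y * (1 - y)            # back-propagated output delta
    d1 = (d2 @ W2.T) * h * (1 - h)        # back-propagated hidden delta
    W2 -= 0.5 * h.T @ d2; b2 -= 0.5 * d2.sum(axis=0)
    W1 -= 0.5 * X.T @ d1; b1 -= 0.5 * d1.sum(axis=0)
loss_after = loss()
```

In the paper's setting the inputs would instead be the extracted feature quantities of the ultrasonic signals and the outputs the five discharge classes; the update rule is the same.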

Blind Poissonian Image Deblurring Regularized by a Denoiser Constraint and Deep Image Prior

Yayuan Feng, Yu Shi, Dianjun Sun
2020 Mathematical Problems in Engineering  
To better solve the denoising and deblurring of Poissonian images simultaneously, we learn the implicit deep image prior from a single degraded image and use the denoiser as a regularization term to constrain ... The denoising and deblurring of Poisson images are opposite inverse problems. Single-image deblurring methods are sensitive to image noise. ... Conclusions: This paper proposed an algorithm for denoising and deblurring Poisson images using neural networks.
doi:10.1155/2020/9483521 fatcat:22iy2p6dzzffnepvljlme5lb4a

On Measuring and Controlling the Spectral Bias of the Deep Image Prior [article]

Zenglin Shi, Pascal Mettes, Subhransu Maji, Cees G. M. Snoek
2021 arXiv   pre-print
First, it remains unclear how to control the prior beyond the choice of the network architecture. ... Finally, we show that our approach obtains favorable results compared to current approaches across various denoising, deblocking, inpainting, super-resolution and detail enhancement tasks. ... Related Work: The deep image prior, introduced by Ulyanov et al. (2018, 2020), revealed for the first time the remarkable ability of untrained neural networks to solve challenging inverse ...
arXiv:2107.01125v3 fatcat:wrsuiofpwnarrkukirovt74e4u

Hardware-irrelevant parallel processing system [article]

Xiuting Zou, Shaofu Xu, Anyi Deng, Rui Wang, Weiwen Zou
2020 arXiv   pre-print
Parallel processing technology has been a primary tool for achieving high-speed, high-accuracy, and broadband processing for many years across modern information systems and data processing such as optical ... Under one system state, a category of signals with two different mismatch degrees is utilized to train the CRAE, which can then compensate for mismatches in various categories of signals with multiple ... For example, a residual neural network has been trained with a single category of signals, collected when the parallel system is in one mismatch degree, to achieve mismatch compensation.
arXiv:2006.13443v1 fatcat:f4zqcqznljecfl6llix3vvbngy

Native language experience shapes neural basis of addressed and assembled phonologies

Leilei Mei, Gui Xue, Zhong-Lin Lu, Qinghua He, Miao Wei, Mingxia Zhang, Qi Dong, Chuansheng Chen
2015 NeuroImage  
These results provide direct neuroimaging evidence for the effect of native language experience on the neural mechanisms of phonological access in a new language and support the assimilation-accommodation ... However, it is not clear whether native language experience shapes the neural mechanisms of addressed and assembled phonologies. ... The assimilation hypothesis assumes that the human brain will read a second language as if it is the native language and use the neural network for the native language to support the second language.
doi:10.1016/j.neuroimage.2015.03.075 pmid:25858447 pmcid:PMC4446231 fatcat:ufhmtuvbcvfhznsyn5zudgkjhq

Neural Architecture Search for Deep Image Prior [article]

Kary Ho, Andrew Gilbert, Hailin Jin, John Collomosse
2020 arXiv   pre-print
Our binary representation encodes the design space for an asymmetric E-D network that typically converges to yield a content-specific DIP within 10-20 generations using a population size of 500.  ...  We present a neural architecture search (NAS) technique to enhance the performance of unsupervised image de-noising, in-painting and super-resolution under the recently proposed Deep Image Prior (DIP).  ...  Instead, we use an untrained (randomly initialized) network to perform the tasks by overfitting such a network to a single image under a task-specific loss using neural architecture search.  ... 
arXiv:2001.04776v1 fatcat:uto2umzjtzf23jquj6ywqt6yqi

Learning Sparse High Dimensional Filters: Image Filtering, Dense CRFs and Bilateral Neural Networks [article]

Varun Jampani and Martin Kiefel and Peter V. Gehler
2015 arXiv   pre-print
Finally, we introduce layers of bilateral filters in CNNs and propose bilateral neural networks for the use of high-dimensional sparse data.  ...  First, we demonstrate the use in applications where single filter applications are desired for runtime reasons.  ...  Acknowledgements We thank Jonas Wulff, Laura Sevilla, Abhilash Srikantha, Christoph Lassner, Andreas Lehrmann, Thomas Nestmeyer, Andreas Geiger, Fatma Güney and Gerard Pons-Moll for their valuable feedback  ... 
arXiv:1503.04949v3 fatcat:wcfekolyyvcmtar6jorisvahli

Deep Learning for Visual SLAM in Transportation Robotics: A review

Chao Duan, Steffen Junginger, Jiahao Huang, Kairong Jin, Kerstin Thurow
2019 Transportation Safety and Environment  
[71] proposed a graph-regularization stacked denoising auto-encoder (G-SDA) network and the manifold learning graph regularization structure. ... But so far, deep networks are often trained for closed-world scenarios. A deep network tends to perform well on the datasets it was trained on, and tends to fail on untrained datasets.
doi:10.1093/tse/tdz019 fatcat:c5tj64xro5ftvcw6qwz7rgrgky

SONAR IMAGE RECOGNITION BASED ON MACHINE LEARNING FRAMEWORK

M. Dong, H. Qiu, H. Wang, P. Zhi, Z. Xu
2022 The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences  
In order to improve the robustness and generalization ability of model recognition, sonar images are enhanced by preprocessing such as coordinate conversion, interpolation, denoising, and enhancement, ... chain layers) and the transfer learning method under the Python deep learning framework with the Inception-ResNet-v2 model for sonar image training and recognition. ... At present, commonly used deep learning methods are the deep belief network (DBN), convolutional neural network (CNN), and recurrent neural network (RNN), etc., of which the convolutional ...
doi:10.5194/isprs-archives-xlvi-3-w1-2022-45-2022 fatcat:an3s6vcjw5cazkefupw4hyyc7e