7,304 Hits in 3.1 sec

Generalized Denoising Auto-Encoders as Generative Models [article]

Yoshua Bengio, Li Yao, Guillaume Alain, Pascal Vincent
2013 arXiv pre-print
However, it remained unclear how to connect the training procedure of regularized auto-encoders to the implicit estimation of the underlying data-generating distribution when the data are discrete, or ... Recent work has shown how denoising and contractive auto-encoders implicitly capture the structure of the data-generating density, in the case where the corruption noise is Gaussian, the reconstruction ... Cho as well as funding from NSERC, CIFAR (YB is a CIFAR Fellow), and Canada Research Chairs. ...
arXiv:1305.6663v4 fatcat:v5bcqadbcjh35jokfsotnqwpsi
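The denoising objective these papers build on fits in a few lines. Below is a minimal illustrative sketch (toy data, arbitrary sizes and hyperparameters, plain NumPy; not any paper's implementation): each input is corrupted with Gaussian noise, and the network is trained to reconstruct the clean input.

```python
import numpy as np

# Minimal denoising auto-encoder sketch: corrupt the input with Gaussian
# noise, train to reconstruct the *clean* input. Sizes, learning rate and
# noise level are toy choices for illustration only.
rng = np.random.default_rng(0)

n, d, h = 256, 8, 4                       # samples, input dim, hidden dim
X = rng.normal(size=(n, 3)) @ rng.normal(size=(3, d))   # low-rank toy data

W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)   # encoder
W2 = rng.normal(scale=0.1, size=(h, d)); b2 = np.zeros(d)   # decoder

def forward(Xc):
    H = np.tanh(Xc @ W1 + b1)             # hidden code from corrupted input
    return H, H @ W2 + b2                 # linear reconstruction

def mse(R):
    return float(np.mean((R - X) ** 2))   # error against the CLEAN data

lr, sigma = 0.1, 0.3                      # sigma: Gaussian corruption level
loss0 = mse(forward(X + rng.normal(scale=sigma, size=X.shape))[1])

for _ in range(500):
    Xc = X + rng.normal(scale=sigma, size=X.shape)   # fresh corruption
    H, R = forward(Xc)
    G = 2.0 * (R - X) / X.size                       # dLoss/dR
    GH = (G @ W2.T) * (1.0 - H ** 2)                 # backprop through tanh
    W2 -= lr * (H.T @ G);  b2 -= lr * G.sum(axis=0)
    W1 -= lr * (Xc.T @ GH); b1 -= lr * GH.sum(axis=0)

loss1 = mse(forward(X + rng.normal(scale=sigma, size=X.shape))[1])
```

Because the bottleneck has h < d, the network cannot learn the identity map; reconstructing clean data from freshly corrupted inputs is what pushes it toward the data-generating structure these papers analyze.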

Cascading Denoising Auto-Encoder as a Deep Directed Generative Model [article]

Dong-Hyun Lee
2017 arXiv pre-print
..., 2013) has shown how Denoising Auto-Encoders (DAE) become generative models as a density estimator. ... By cascading these models, we propose Cascading Denoising Auto-Encoders (CDAE), which can generate samples of the data distribution from a tractable prior distribution under the assumption that the probabilistic distribution ... Method: Denoising Auto-Encoder as a Directed Generative Model. In general, features obtained from the encoder in an Auto-Encoder have no probabilistic interpretation without a sampling process to match ...
arXiv:1511.07118v2 fatcat:ipyhvemybjb2rkbwppj3sk2q2q

Improving Generative Adversarial Networks with Denoising Feature Matching

David Warde-Farley, Yoshua Bengio
2017 International Conference on Learning Representations  
We estimate and track the distribution of these features, as computed from data, with a denoising auto-encoder, and use it to propose high-level targets for the generator.  ...  We propose an augmented training procedure for generative adversarial networks designed to address shortcomings of the original by directing the generator towards probable configurations of abstract discriminator  ...  We thank Vincent Dumoulin and Ishmael Belghazi for making available code and model parameters used in comparison to ALI, as well as Alec Radford for making available the code and model parameters for his  ... 
dblp:conf/iclr/Warde-FarleyB17 fatcat:2b3rrigydjgzxg3uip4gb7k63q

A two-stage learning method for protein-protein interaction prediction [article]

Amir Ahooye Atashin, Parsa Bagherzadeh, Kamaledin Ghiasi-Shirazi
2016 arXiv pre-print
Protein-protein interaction; Denoising auto-encoder; Robust features; Unlabelled data ... In the proposed method, denoising auto-encoders are employed for learning robust features. The obtained robust features are used to train a classifier with better performance. ... The remainder of this paper is organized as follows: Section II presents some preliminaries concerning the auto-encoder, the denoising auto-encoder and the stacked auto-encoder [16]. ...
arXiv:1606.04561v2 fatcat:7asp7wdsljd5fbdd7az4hstxpa

A Cloud Computing Fault Detection Method Based on Deep Learning

Weipeng Gao, Youchan Zhu
2017 Journal of Computer and Communications  
An auto-encoder with sparse denoising is used to construct a parallel structure network. ... The expression for ρ̂_j is as follows: Denoising Auto-Encoder: the denoising auto-encoder is able to train more robust encoders by regularizing the auto-encoder. The central idea: in the input layer ... from the encoder; such a structure generalizes better than other models. ...
doi:10.4236/jcc.2017.512003 fatcat:uonygekeojdavn7lz2igrm5jym
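In the standard sparse-auto-encoder formulation, ρ̂_j (the quantity named in the snippet) denotes the mean activation of hidden unit j over the m training inputs, ρ̂_j = (1/m) Σ_i a_j(x_i), penalized toward a target sparsity ρ via a KL-divergence term. The sketch below assumes that conventional formulation; the paper's exact expression is not reproduced here.

```python
import numpy as np

# Conventional sparsity penalty: rho_hat_j is the mean activation of hidden
# unit j, and the penalty is sum_j KL(rho || rho_hat_j). Assumed standard
# formulation, shown for illustration only.

def sparsity_penalty(A, rho=0.05):
    """A: (m, h) sigmoid hidden activations, entries strictly in (0, 1)."""
    rho_hat = A.mean(axis=0)              # rho_hat_j = (1/m) * sum_i a_j(x_i)
    kl = rho * np.log(rho / rho_hat) \
         + (1 - rho) * np.log((1 - rho) / (1 - rho_hat))
    return float(kl.sum())

rng = np.random.default_rng(1)
A_dense = rng.uniform(0.01, 0.99, size=(100, 16))
p_dense = sparsity_penalty(A_dense)                    # far from rho: penalized
p_sparse = sparsity_penalty(np.full((100, 16), 0.05))  # exactly rho: zero
```

The penalty is zero exactly when every unit's average activation equals ρ, and grows as units fire more (or less) often than the target, which is what drives the sparse codes the abstract refers to.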

Denoising convolutional autoencoder based B-mode ultrasound tongue image feature extraction [article]

Bo Li, Kele Xu, Dawei Feng, Haibo Mi, Huaimin Wang, Jian Zhu
2019 arXiv pre-print
A Word Error Rate of 6.17% is obtained with DCAE, compared to the state-of-the-art value of 6.45% using the discrete cosine transform as the feature extractor. ... By quantitative comparison between different unsupervised feature extraction approaches, the denoising convolutional autoencoder (DCAE)-based method outperforms the other feature extraction methods on ... The visual-articulatory modeling with the denoising CAE gives better results than those obtained using a non-convolutional architecture-based auto-encoder as input. ...
arXiv:1903.00888v1 fatcat:4xato7b7zbe2nb33zsz43hqxc4

A novel stellar spectrum denoising method based on deep Bayesian modeling [article]

Xin Kang, Shiyuan He, Yanxia Zhang
2021 arXiv pre-print
Our denoising method demonstrates superior performance to the standard denoising auto-encoder with respect to denoising quality and missing-flux imputation. ... The construction of our model includes a prior distribution for each stellar subclass, a spectrum generator and a flow-based noise model. ... The generator is obtained from the auto-encoder framework, but with an additional local isometry constraint. ...
arXiv:2103.02896v1 fatcat:224j4revbnevfff2re2zcwmsii

Denoising Gravitational Waves with Enhanced Deep Recurrent Denoising Auto-Encoders [article]

Hongyu Shen, Daniel George, E. A. Huerta, Zhizhen Zhao
2019 arXiv pre-print
To address this issue, we design a novel model, referred to as 'Enhanced Deep Recurrent Denoising Auto-Encoder' (EDRDAE), that incorporates a signal amplifier layer and applies curriculum learning by ... For this task, a combination of recurrent neural networks (RNNs) with Denoising Auto-Encoders (DAEs) has shown promising results. ... The exploitation of deep learning algorithms has led to spectacular progress in signal denoising research. Denoising Auto-Encoders (DAEs) stand out among denoising methods [1, 2]. ...
arXiv:1903.03105v1 fatcat:2vvmjkgtnjdndha5qgnphgi5xe

Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data [article]

Wei Zhao, Liang Wang, Kewei Shen, Ruoyu Jia, Jingming Liu
2019 arXiv pre-print
We pre-train the copy-augmented architecture with a denoising auto-encoder using the unlabeled One Billion Benchmark and make comparisons between the fully pre-trained model and a partially pre-trained model. ... Denoising Auto-encoder: denoising auto-encoders (Vincent et al., 2008) are commonly used for model initialization to extract and select features from inputs. ...
arXiv:1903.00138v3 fatcat:z3tyvcqg5ndjbehxuuq5aa2hhy
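Denoising pre-training of this kind needs only unlabeled text: corrupt each sentence and train the model to recover the original. The sketch below shows hypothetical pair generation; the corruption operations (random word dropping and bounded local shuffling) are common choices for this setup, not necessarily the exact ones used in the paper.

```python
import random

# Hypothetical (noisy, clean) pair generation for denoising pre-training
# from unlabeled text. p_drop and shuffle_window are illustrative knobs.

def corrupt(sentence, p_drop=0.1, shuffle_window=3, seed=None):
    rng = random.Random(seed)
    words = sentence.split()
    # randomly drop words (fall back to the full sentence if all are dropped)
    kept = [w for w in words if rng.random() > p_drop] or words
    # local shuffle: jitter each position by a bounded random key, then sort,
    # so words move only within a small window of their original position
    keys = [i + rng.uniform(0, shuffle_window) for i in range(len(kept))]
    return " ".join(w for _, w in sorted(zip(keys, kept), key=lambda t: t[0]))

clean = "the quick brown fox jumps over the lazy dog"
pair = (corrupt(clean, seed=0), clean)   # (noisy input, clean target)
```

Every unlabeled sentence thus yields a supervised training pair for free, which is what makes pre-training on a corpus like the One Billion Benchmark feasible without annotated corrections.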

Deep Denoising Auto-encoder for Statistical Speech Synthesis [article]

Zhenzhou Wu, Shinji Takaki, Junichi Yamagishi
2015 arXiv pre-print
This paper proposes a deep denoising auto-encoder technique to extract better acoustic features for speech synthesis.  ...  Denoising Auto-encoder The denoising auto-encoder is a variant of the basic auto-encoder.  ...  The denoising auto-encoder is trained such that the reconstructed z is as close as possible to the original data x.  ... 
arXiv:1506.05268v1 fatcat:gwe2zstvtnhrzjjt2vq5f7ur24

Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data

Wei Zhao, Liang Wang, Kewei Shen, Ruoyu Jia, Jingming Liu
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics
We pre-train the copy-augmented architecture with a denoising auto-encoder using the unlabeled One Billion Benchmark and make comparisons between the fully pre-trained model and a partially pre-trained model. ... Denoising Auto-encoder: denoising auto-encoders (Vincent et al., 2008) are commonly used for model initialization to extract and select features from inputs. ...
doi:10.18653/v1/n19-1014 dblp:conf/naacl/ZhaoWSJL19 fatcat:gnstwmpncfemhbhi4gqpynd2ni

DiVAE: Photorealistic Images Synthesis with Denoising Diffusion Decoder [article]

Jie Shi, Chenfei Wu, Jian Liang, Xiang Liu, Nan Duan
2022 arXiv pre-print
... image and a prior model to generate image embeddings. ... In addition, we apply DiVAE with an auto-regressive generator on conditional synthesis tasks to produce more natural and detailed samples. ... The auto-regressive generator is a transformer encoder-decoder framework covering language and image to realize conditioned synthesis. ...
arXiv:2206.00386v1 fatcat:cfdbdqcrrrbzpmbbo2uwy5jxca

Deep Adversarial Multi-view Clustering Network

Zhaoyang Li, Qianqian Wang, Zhiqiang Tao, Quanxue Gao, Zhaohua Yang
2019 Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence  
Specifically, our model adopts deep auto-encoders to learn latent representations shared by multiple views, and meanwhile leverages adversarial training to further capture the data distribution and disentangle ... As with the first step, we get V² outputs corresponding to V views by the multi-view denoising encoder E and the multi-view denoising generator G. ... Multi-view denoising generator G: our multi-view denoising generator network has an architecture opposite to that of our multi-view denoising encoder E. ...
doi:10.24963/ijcai.2019/409 dblp:conf/ijcai/LiWTGY19 fatcat:ucydkiitnfe5jda35is2tvww4u

Ladder Networks: Learning under Massive Label Deficit

Behroz Mirza, Tahir Syed, Jamshed Memon, Yameen Malik
2017 International Journal of Advanced Computer Science and Applications  
Advancements in deep unsupervised learning are finally bringing machine learning close to natural learning, which happens with as few as one labeled instance. ... The model learns from the structure rather than from the labels alone, transforming it from a label learner into a structural observer. ... This forces the auto-encoder to learn how to denoise the corrupted inputs. ...
doi:10.14569/ijacsa.2017.080769 fatcat:ix4dlifjc5hwjka6hohdrswz2m

Scrambled Translation Problem: A Problem of Denoising UNMT [article]

Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya
2021 arXiv pre-print
We observe that UNMT models which use word-shuffle noise (as in the case of Undreamt) can generate correct words, but fail to stitch them together to form phrases. ... We hypothesise that the reason behind the scrambled translation problem is the 'shuffling noise' introduced in every input sentence as a denoising strategy. ... Our simple retraining strategy, i.e. retraining the trained models after removing the denoising component from the auto-encoder objective (AE), results in significant improvements in BLEU scores for four language ...
arXiv:1911.01212v2 fatcat:kpor67kkvnbq5c6qgckcytjnra