6,771 Hits in 3.9 sec

GEN Model: An Alternative Approach to Deep Neural Network Models [article]

Jiawei Zhang and Limeng Cui and Fisher B. Gouza
2018 arXiv   pre-print
In this paper, we introduce an alternative approach to deep learning models, namely the GEN (Genetic Evolution Network) model.  ...  Instead of building one single deep model, GEN adopts a genetic-evolutionary learning strategy to build a group of unit models generation by generation.  ...  In this paper, we propose the GEN (Genetic Evolution Network) model, which can serve as an alternative to deep learning models.  ... 
arXiv:1805.07508v1 fatcat:aha4z6z6jvekdb6wr5lrqmsqqu
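The generation-by-generation strategy this abstract describes can be sketched in miniature. Everything below is illustrative, not the paper's GEN algorithm: a hypothetical population of one-weight "unit models" is evolved by elitist selection and Gaussian mutation toward a target relation.

```python
import random

def fitness(w, data):
    # Negative squared error of the unit model y = w * x on the data.
    return -sum((w * x - y) ** 2 for x, y in data)

def evolve(data, pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    # Generation 0: random population of unit models (one weight each).
    pop = [rng.uniform(-1.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half survives unchanged (elitism).
        pop.sort(key=lambda w: fitness(w, data), reverse=True)
        survivors = pop[: pop_size // 2]
        # Variation: offspring are mutated copies of the survivors.
        children = [w + rng.gauss(0.0, 0.15) for w in survivors]
        pop = survivors + children
    return max(pop, key=lambda w: fitness(w, data))

# Target relation y = 2x; the evolved weight should approach 2.
data = [(x, 2.0 * x) for x in range(-5, 6)]
best = evolve(data)
```

Because survivors are carried over unchanged, the best model's fitness is monotone non-decreasing across generations, which is the property that makes such a loop a usable stand-in for gradient-based training of a single model.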

Dynamic imaging using a deep generative SToRM (Gen-SToRM) model [article]

Qing Zou, Abdul Haseeb Ahmed, Prashant Nagpal, Stanley Kruger, Mathews Jacob
2021 arXiv   pre-print
We use the deep convolutional neural network (CNN) to represent the non-linear transformation.  ...  To minimize the computational complexity of the algorithm, we introduce an efficient progressive training-in-time approach and an approximate cost function.  ...  ACKNOWLEDGEMENT The authors would like to thank Ms. Melanie Laverman from the University of Iowa for making editorial corrections to refine this paper.  ... 
arXiv:2102.00034v3 fatcat:cf3sprh2jbgyvfvkckn4grgiyu

Interactive sketching of urban procedural models

Gen Nishida, Ignacio Garcia-Dorado, Daniel G. Aliaga, Bedrich Benes, Adrien Bousseau
2016 ACM Transactions on Graphics  
We use nonphotorealistic rendering to generate artificial data for training convolutional neural networks capable of quickly recognizing the procedural rule intended by a sketch and estimating its parameters  ...  We use a machine learning approach to solve the inverse problem of finding the procedural model that best explains a user sketch.  ...  We asked some of the participants to create more complex buildings after showing them examples. These imaginary buildings were sketched in less than 15 minutes.  ... 
doi:10.1145/2897824.2925951 fatcat:3bgo7szfozfj7kic65rab27vqm

GEN: highly efficient SMILES explorer using autodidactic generative examination networks

Ruud van Deursen, Peter Ertl, Igor V. Tetko, Guillaume Godin
2020 Journal of Cheminformatics  
In this study, we introduce Generative Examination Networks (GEN) as a new approach to train deep generative networks for SMILES generation.  ...  Recurrent neural networks have been widely used to generate millions of de novo molecules in defined chemical spaces.  ...  The neural network of the generator is subsequently converted into a generative examination network (GEN). In GENs, the deep generative models autonomously learn to write valid molecular SMILES.  ... 
doi:10.1186/s13321-020-00425-8 pmid:33430998 fatcat:3zmc4jt4obej5iqasooe2rjfi4
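The generate-then-examine loop described here can be illustrated with a deliberately tiny stand-in: a first-order character chain plays the "generator" and a balanced-parentheses check plays the "examination". The paper trains RNN generators and uses real SMILES validity tests; everything below is illustrative.

```python
import random
from collections import defaultdict

# A tiny illustrative training set of valid SMILES strings.
train = ["CCO", "CC(=O)O", "c1ccccc1", "CC(C)O", "CCN(CC)CC"]

# Generator: first-order character transition table ("^" start, "$" end).
trans = defaultdict(list)
for s in train:
    seq = "^" + s + "$"
    for a, b in zip(seq, seq[1:]):
        trans[a].append(b)

def sample(rng, max_len=20):
    out, c = [], "^"
    for _ in range(max_len):
        c = rng.choice(trans[c])
        if c == "$":
            break
        out.append(c)
    return "".join(out)

def passes_exam(s):
    # Toy "examination": string is non-empty with balanced parentheses.
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0 and len(s) > 0

rng = random.Random(1)
candidates = [sample(rng) for _ in range(200)]
accepted = [s for s in candidates if passes_exam(s)]
```

The acceptance rate of the examination stage is the kind of statistic a GEN-style setup monitors while training the generator.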

The Efficacy of L_1 Regularization in Two-Layer Neural Networks [article]

Gen Li, Yuantao Gu, Jie Ding
2020 arXiv   pre-print
A crucial problem in neural networks is to select the most appropriate number of hidden neurons and obtain tight statistical risk bounds.  ...  As an alternative to selecting the number of neurons, we theoretically show that L_1 regularization can control the generalization error and sparsify the input dimension.  ...  Second, it would be interesting to emulate the current approach to yield similarly tight risk bounds for deep forward neural networks.  ... 
arXiv:2010.01048v1 fatcat:hxm5oycddvagpnsemdsrdiwzt4
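The sparsifying effect of an L1 penalty that this abstract refers to can be seen even in a one-layer (linear) toy problem solved with ISTA (proximal gradient descent). The setup below — one informative input and one irrelevant input constructed to be uncorrelated with the target — is ours, not the paper's two-layer analysis.

```python
def soft_threshold(z, t):
    # Proximal operator of t * |w|: shrink toward zero, clip small values.
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

# y depends only on the first input; the second input (x0^2, uncorrelated
# with y by symmetry) is an irrelevant dimension that L1 should zero out.
data = [((x0, x0 * x0), 3.0 * x0) for x0 in [i / 10 for i in range(-10, 11)]]

w = [0.0, 0.0]
lr, lam = 0.05, 0.1
for _ in range(500):
    # ISTA: gradient step on the squared loss, then the L1 proximal step.
    g = [0.0, 0.0]
    for (x0, x1), y in data:
        err = w[0] * x0 + w[1] * x1 - y
        g[0] += 2 * err * x0 / len(data)
        g[1] += 2 * err * x1 / len(data)
    w = [soft_threshold(w[j] - lr * g[j], lr * lam) for j in range(2)]
```

The irrelevant weight is clipped to exactly zero by the proximal step, while the informative weight converges to the slightly shrunken lasso solution 3 − λ/(2·E[x₀²]) ≈ 2.86.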

HarrisZ^+: Harris Corner Selection for Next-Gen Image Matching Pipelines [article]

Fabio Bellavia, Dmytro Mishkin
2022 arXiv   pre-print
These results are quite close to those obtained by the more recent fully deep end-to-end trainable approaches and show that there is still a proper margin of improvement that can be granted by the research  ...  Due to its role in many computer vision tasks, image matching has been the subject of active investigation by researchers, which has led to better and more discriminant feature descriptors and to more  ...  design better deep networks as well as to provide more valid training data.  ... 
arXiv:2109.12925v4 fatcat:5be3hpyp3rakdoy5ze6xrkk3im
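The Harris corner score the title builds on is the classical response R = det(M) − k·tr(M)², computed from the structure tensor M of local image gradients. A minimal, unoptimized sketch with a 3×3 box window and a toy image follows; HarrisZ^+ itself adds normalization and selection criteria not shown here.

```python
def harris_response(img, k=0.04):
    # Harris corner score R = det(M) - k * trace(M)^2, where M is the
    # structure tensor of the image gradients over a 3x3 window.
    h, w = len(img), len(img[0])
    Ix = [[0.0] * w for _ in range(h)]
    Iy = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Central-difference gradients (borders stay zero).
            Ix[y][x] = (img[y][x + 1] - img[y][x - 1]) / 2
            Iy[y][x] = (img[y + 1][x] - img[y - 1][x]) / 2
    R = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = b = c = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ix, iy = Ix[y + dy][x + dx], Iy[y + dy][x + dx]
                    a += ix * ix
                    b += ix * iy
                    c += iy * iy
            R[y][x] = a * c - b * b - k * (a + c) ** 2
    return R

# A bright square whose top-left corner sits at pixel (3, 3).
img = [[1.0 if (y >= 3 and x >= 3) else 0.0 for x in range(7)] for y in range(7)]
R = harris_response(img)
```

The response is large and positive at the corner, negative along the straight edge, and zero in flat regions — which is exactly what makes it usable for corner selection.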

Procedural Modeling of a Building from a Single Image

Gen Nishida, Adrien Bousseau, Daniel G. Aliaga
2018 Computer graphics forum (Print)  
Each stage of our pipeline combines convolutional neural networks (CNNs) and optimization to select and parameterize procedural grammars that reproduce the building elements of the picture.  ...  Figure 1 : Procedural modeling generation from a single image. 1) Given an image and a silhouette of a building, 2) our approach estimates the camera parameters and building mass as a first step.  ...  Acknowledgements We would like to thank the authors of DeepFacade [LZZH17] for helping us generate façade parsing results, and the anonymous reviewers for their constructive suggestions.  ... 
doi:10.1111/cgf.13372 fatcat:vnisacbqujasbowvqh5qzfdya4

Full Face-and-Head 3D Model with Photorealistic Texture

Yangyu Fan, Yang Liu, Guoyun Lv, Shiya Liu, Gen Li, Yanhui Huang
2020 IEEE Access  
Additionally, to infer the invisible region texture information corresponding to the input face image, we design an effective architecture with a generative adversarial network (GAN) for panoramic UV  ...  To this end, we introduce a pipeline to integrate the highly-detailed face model into the basic model.  ...  Mid-layer features can be extracted from a deep convolutional neural network, and the texture map is synthesized by iterative optimization.  ... 
doi:10.1109/access.2020.3031886 fatcat:zzwmtc5ugbc75i5ln5xf66d3zy

Efficient Communication Acceleration for Next-Gen Scale-up Deep Learning Training Platforms [article]

Saeed Rashidi, Srinivas Sridharan, Sudarshan Srinivasan, Matthew Denton, Tushar Krishna
2020 arXiv   pre-print
We evaluate the benefits of ACE with micro-benchmarks (e.g., single-collective performance) and popular DL models using an end-to-end DL training simulator.  ...  ACE is a smart network interface (NIC) tuned to cope with the high-bandwidth and low-latency requirements of scale-up networks and is able to efficiently drive the various scale-up network systems (e.g  ...  INTRODUCTION Deep Learning (DL) and Deep Neural Network (DNN) models are being deployed pervasively across a wide range of real-world application domains such as image classification, natural language  ... 
arXiv:2007.00156v3 fatcat:jz6uh7b3zncl3h4k7zs3me644u
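A concrete example of the collectives such an accelerator offloads is the ring all-reduce. The simulation below shows the standard reduce-scatter/all-gather schedule on n workers holding n chunks each; it illustrates the communication pattern only and is not tied to the ACE design.

```python
def ring_allreduce(values):
    # values[w][c]: chunk c held by worker w; n workers, n chunks each.
    n = len(values)
    data = [list(row) for row in values]
    # Phase 1 (reduce-scatter): at step s, worker w sends chunk (w - s) mod n
    # to its ring neighbor w + 1, which adds it to its own copy.
    for s in range(n - 1):
        for w in range(n):
            c = (w - s) % n
            data[(w + 1) % n][c] += data[w][c]
    # After n - 1 steps, worker w holds the fully reduced chunk (w + 1) mod n.
    # Phase 2 (all-gather): circulate the reduced chunks around the ring,
    # overwriting stale copies.
    for s in range(n - 1):
        for w in range(n):
            c = (w + 1 - s) % n
            data[(w + 1) % n][c] = data[w][c]
    return data

result = ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
```

Each worker sends and receives 2·(n−1) chunks in total, which is why the ring schedule is bandwidth-optimal and a natural target for NIC offload.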

The Rate of Convergence of Variation-Constrained Deep Neural Networks [article]

Gen Li, Jie Ding
2022 arXiv   pre-print
An important and fundamental problem is to understand the learnability of a network model through its statistical risk, or the expected prediction error on future data.  ...  Our result also provides insight into the phenomenon that deep neural networks do not easily suffer from overfitting when the number of neurons and learning parameters rapidly grow with n or even surpass  ...  Deep neural network model class Recall that our goal is to learn a regression function f_n : x → f_n(x) for prediction.  ... 
arXiv:2106.12068v2 fatcat:uq2rkdyoe5ayxaacewtso6hld4
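The "statistical risk" in this abstract is the expected prediction error; in a regression setting it is conventionally written (our notation, not necessarily the paper's) as

```latex
R(\hat f_n) = \mathbb{E}\big[(Y - \hat f_n(X))^2\big],
\qquad
R(\hat f_n) - R(f^*)
= \underbrace{\Big(\inf_{f \in \mathcal{F}} R(f) - R(f^*)\Big)}_{\text{approximation error}}
+ \underbrace{\Big(R(\hat f_n) - \inf_{f \in \mathcal{F}} R(f)\Big)}_{\text{estimation error}}
```

where f* is the optimal regression function and 𝓕 the (variation-constrained) network class; a rate-of-convergence result bounds how fast the excess risk shrinks as the sample size n grows.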

Dext-Gen: Dexterous Grasping in Sparse Reward Environments with Full Orientation Control [article]

Martin Schuck and Jan Brüdigam and Alexandre Capone and Stefan Sosnowski and Sandra Hirche
2022 arXiv   pre-print
Our approach has reasonable training durations and provides the option to include desired prior knowledge.  ...  We present Dext-Gen, a reinforcement learning framework for Dexterous Grasping in sparse reward ENvironments that is applicable to a variety of grippers and learns unbiased and intricate policies.  ...  Therefore, such representations are not well suited for neural networks [19] .  ... 
arXiv:2206.13966v1 fatcat:x6nunp52hza4fhr7ucqatmgqma

Bayesian Semisupervised Learning with Deep Generative Models [article]

Jonathan Gordon, José Miguel Hernández-Lobato
2017 arXiv   pre-print
Neural network based generative models with discriminative components are a powerful approach for semi-supervised learning.  ...  We show how an efficient Gibbs sampling procedure can marginalize the stochastic inputs when inferring missing labels in this model.  ...  Closely related to deep generative models are Bayesian neural networks (BNNs) (Neal, 2012) .  ... 
arXiv:1706.09751v1 fatcat:4uobcisytrdh5pcwje6aigongq
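The label-marginalizing Gibbs procedure can be illustrated with a toy two-Gaussian model in which each missing label's conditional posterior is available in closed form. In the paper's model the conditionals come from the deep generative network; this sketch only shows the sample-and-average mechanics.

```python
import math
import random

# Toy semi-supervised setup: p(x | y) = Normal(mu[y], 1), equal class priors.
mu = {0: -2.0, 1: 2.0}

def label_posterior(x):
    # p(y = 1 | x) by Bayes' rule under the toy model.
    l0 = math.exp(-0.5 * (x - mu[0]) ** 2)
    l1 = math.exp(-0.5 * (x - mu[1]) ** 2)
    return l1 / (l0 + l1)

def gibbs_marginalize(xs, sweeps=2000, seed=0):
    # Repeatedly resample every missing label from its conditional posterior
    # and average the samples, marginalizing the labels out by Monte Carlo.
    rng = random.Random(seed)
    counts = [0] * len(xs)
    for _ in range(sweeps):
        for i, x in enumerate(xs):
            if rng.random() < label_posterior(x):
                counts[i] += 1
    return [c / sweeps for c in counts]

# Three unlabeled points: clearly class 0, clearly class 1, and ambiguous.
est = gibbs_marginalize([-2.1, 1.9, 0.0])
```

In this toy model the labels are conditionally independent, so the Gibbs sweep is exact sampling; in the paper's setting the same loop alternates with sampling the other latent variables.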

Deep generative models in DataSHIELD

Stefan Lenz, Moritz Hess, Harald Binder
2021 BMC Medical Research Methodology  
Here, we employ deep Boltzmann machines (DBMs) as generative models.  ...  If a desired algorithm is not implemented in DataSHIELD or cannot be reformulated in such a way, using artificial data is an alternative.  ...  MH contributed to design and implementation of the experiment for comparing the different generative models on genetic variant data.  ... 
doi:10.1186/s12874-021-01237-6 pmid:33812380 fatcat:yqfivrotdff5fayvpgxwkeze6q

Modeling Sparse Deviations for Compressed Sensing using Generative Models [article]

Manik Dhar, Aditya Grover, Stefano Ermon
2018 arXiv   pre-print
due to a generative model prior.  ...  In compressed sensing, a small number of linear measurements can be used to reconstruct an unknown signal.  ...  Acknowledgements We are thankful to Tri Dao, Jonathan Kuck, Daniel Levy, Aditi Raghunathan, and Yang Song for helpful comments on early drafts.  ... 
arXiv:1807.01442v2 fatcat:mnor57pdbbcsnby2k2cjlrkrma
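The reconstruction principle — search the generator's latent space for a code whose decoded signal matches the few linear measurements — can be sketched with a toy 1-D-latent "generator" and grid search standing in for the gradient-based latent optimization used in practice.

```python
import math
import random

def G(z):
    # Toy "generator": maps a 1-D latent code to an 8-D signal.
    return [math.sin(z * (i + 1)) for i in range(8)]

def measure(x, A):
    # Linear measurements y = A x.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

rng = random.Random(0)
A = [[rng.gauss(0.0, 1.0) for _ in range(8)] for _ in range(3)]  # 3 << 8
y = measure(G(0.7), A)  # observed measurements of the unknown signal

def residual(z):
    # Squared mismatch between predicted and observed measurements.
    return sum((m - t) ** 2 for m, t in zip(measure(G(z), A), y))

# Recover the latent by brute-force search over a grid; the recovered
# signal is the generator output at the best latent code.
z_hat = min((i / 1000 for i in range(-2000, 2001)), key=residual)
x_hat = G(z_hat)
```

Three measurements cannot determine an 8-D signal in general; recovery works here because the generator prior restricts candidate signals to a 1-D family, which is the core idea of compressed sensing with generative models.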

Deep Generative Modeling of LiDAR Data [article]

Lucas Caccia, Herke van Hoof, Aaron Courville, Joelle Pineau
2019 arXiv   pre-print
In this work, we show that one can adapt deep generative models for this task by unravelling lidar scans into a 2D point map.  ...  We show that this helps robustness to noisy and imputed input; the learned model can recover the underlying lidar scan from seemingly uninformative data  ...  Grid-based lidar generation An alternative approach for generative modeling of lidar data is from Ondruska et al [14] .  ... 
arXiv:1812.01180v4 fatcat:qzwlqrf7lbgj5py7mvj7nopxjq
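The "unravelling" of a lidar scan into a 2-D point map can be sketched as binning 3-D points by azimuth and elevation into a range image; the grid size and angle conventions below are illustrative, not the paper's.

```python
import math

def scan_to_grid(points, h=4, w=8):
    # Unravel a 3-D lidar point cloud into a 2-D range image indexed by
    # (elevation row, azimuth column); cells with no return keep range 0.
    grid = [[0.0] * w for _ in range(h)]
    for x, y, z in points:
        r = math.sqrt(x * x + y * y + z * z)
        az = (math.atan2(y, x) + math.pi) / (2 * math.pi)              # [0, 1)
        el = (math.atan2(z, math.hypot(x, y)) + math.pi / 2) / math.pi  # [0, 1]
        col = min(int(az * w), w - 1)
        row = min(int(el * h), h - 1)
        grid[row][col] = r
    return grid

pts = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (-1.0, -1.0, 0.5)]
grid = scan_to_grid(pts)
```

Once the scan lives on a regular 2-D grid, off-the-shelf image-based generative models (convolutional VAEs, GANs) can be applied directly, which is what makes this representation attractive.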
Showing results 1 — 15 out of 6,771 results