Emerging Convolutions for Generative Normalizing Flows
[article]
2019
arXiv
pre-print
Recently, Kingma & Dhariwal (2018) demonstrated with Glow that generative flows are capable of generating high quality images. ...
We propose two methods to produce invertible convolutions that have receptive fields identical to standard convolutions: Emerging convolutions are obtained by chaining specific autoregressive convolutions ...
The definition of several generative normalizing flows. All flow functions have an inverse and determinant that are straightforward to compute. ...
arXiv:1901.11137v3
fatcat:6dfh43y4zbg5fdc4xasyx2u6hi
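To make the "chaining specific autoregressive convolutions" idea in this entry concrete, here is a minimal single-channel NumPy sketch (an illustration, not the authors' code). A 3×3 kernel masked so that each output pixel sees only inputs at or before it in raster order has a triangular Jacobian, so its log-determinant reduces to the centre tap; chaining it with a second convolution masked in the opposite ordering yields a receptive field like a standard 3×3 convolution.

    import numpy as np

    def masked_conv2d(x, k):
        # single-channel 3x3 cross-correlation with zero padding
        H, W = x.shape
        xp = np.pad(x, 1)
        return np.array([[np.sum(k * xp[i:i + 3, j:j + 3]) for j in range(W)]
                         for i in range(H)])

    rng = np.random.default_rng(0)
    k = rng.normal(size=(3, 3))
    k[2, :] = 0.0      # raster-order autoregressive mask: drop the row below the centre
    k[1, 2] = 0.0      # ... and the pixel to the right of the centre

    # Build the full (H*W) x (H*W) Jacobian by pushing unit impulses through the layer.
    H, W = 5, 5
    J = np.zeros((H * W, H * W))
    for p in range(H * W):
        e = np.zeros(H * W)
        e[p] = 1.0
        J[:, p] = masked_conv2d(e.reshape(H, W), k).ravel()

    # The Jacobian is triangular in raster order, so log|det| = H*W*log|centre tap|.
    print(np.allclose(np.linalg.slogdet(J)[1], H * W * np.log(abs(k[1, 1]))))   # True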
CInC Flow: Characterizable Invertible 3x3 Convolution
[article]
2021
arXiv
pre-print
Normalizing flows are an essential alternative to GANs for generative modelling, which can be optimized directly on the maximum likelihood of the dataset. ...
We study conditions such that 3×3 CNNs are invertible, allowing them to construct expressive normalizing flows. We derive necessary and sufficient conditions on a padded CNN for it to be invertible. ...
Normalizing flow-based models are an essential type of likelihood-based generative model. ...
arXiv:2107.01358v1
fatcat:pcnjdwfsyjh7pbaknl7x6it3u4
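For reference, the maximum-likelihood training mentioned in this entry relies on the standard change-of-variables formula for an invertible map f sending data x to latent z = f(x) with prior p_Z:

    \log p_X(x) = \log p_Z\big(f(x)\big) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|

An invertible 3×3 CNN is usable as a flow layer precisely when both pieces are tractable: the inverse for sampling and the Jacobian log-determinant for the likelihood.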
Fast Flow Reconstruction via Robust Invertible nxn Convolution
[article]
2022
arXiv
pre-print
Flow-based generative models have recently become one of the most efficient approaches to model data generation. ...
Glow first introduced a simple type of generative flow using an invertible 1 × 1 convolution. However, the 1 × 1 convolution suffers from limited flexibility compared to the standard convolutions. ...
Table 1. Comparative invertible functions in several generative normalizing flows. ...
arXiv:1905.10170v3
fatcat:wejue6zthnf27jcbh7qi6ztuwi
Fast Flow Reconstruction via Robust Invertible n×n Convolution
2021
Future Internet
Flow-based generative models have recently become one of the most efficient approaches to model data generation. ...
Glow first introduced a simple type of generative flow using an invertible 1×1 convolution. However, the 1×1 convolution suffers from limited flexibility compared to the standard convolutions. ...
The authors would like to thank the reviewers for their valuable comments.
Conflicts of Interest: The authors declare no conflict of interest. ...
doi:10.3390/fi13070179
fatcat:565c33rsvndjbj3qmrqmgfbtom
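The invertible 1×1 convolution from Glow that both versions of this paper start from is a shared C×C channel-mixing matrix applied at every pixel. A minimal NumPy sketch of that baseline layer (an illustration of the standard operation, not this paper's n×n generalization):

    import numpy as np

    rng = np.random.default_rng(1)
    C, H, W = 4, 8, 8
    Wmat = rng.normal(size=(C, C))             # channel-mixing matrix of the 1x1 convolution
    x = rng.normal(size=(C, H, W))

    z = np.einsum('ij,jhw->ihw', Wmat, x)                       # forward pass
    x_rec = np.einsum('ij,jhw->ihw', np.linalg.inv(Wmat), z)    # exact inverse
    print(np.allclose(x, x_rec))                                # True

    # Jacobian log-determinant of the layer: the same matrix acts at every pixel.
    logdet = H * W * np.linalg.slogdet(Wmat)[1]

The limited flexibility noted in the abstracts comes from the fact that this layer only mixes channels; its spatial receptive field is a single pixel.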
Woodbury Transformations for Deep Generative Flows
[article]
2021
arXiv
pre-print
Normalizing flows are deep generative models that allow efficient likelihood calculation and sampling. ...
Other similar operations, such as 1x1 convolutions, emerging convolutions, or periodic convolutions allow at most two of these three advantages. ...
Acknowledgments We thank NVIDIA's GPU Grant Program and Amazon's AWS Cloud Credits for Research program for their support. ...
arXiv:2002.12229v3
fatcat:fvdm3ucyr5hrjhrfv3mqu6wskq
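The layer's name refers to the standard Woodbury matrix identity and the matching matrix determinant lemma, which let a low-rank update A + UCV be inverted and its determinant evaluated at the cost of small matrices only (a reminder of the classical identities, not the paper's exact parameterization):

    (A + UCV)^{-1} = A^{-1} - A^{-1} U \left(C^{-1} + V A^{-1} U\right)^{-1} V A^{-1}
    \det(A + UCV) = \det\left(C^{-1} + V A^{-1} U\right) \det(C) \det(A)

With A cheap to invert (e.g. diagonal) and U, V of low rank, this is what allows efficient inversion, likelihood computation, and sampling at the same time.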
The Convolution Exponential and Generalized Sylvester Flows
[article]
2020
arXiv
pre-print
Empirically, we show that the convolution exponential outperforms other linear transformations in generative flows on CIFAR10 and the graph convolution exponential improves the performance of graph normalizing ...
In addition, we generalize Sylvester Flows and propose Convolutional Sylvester Flows which are based on the generalization and the convolution exponential as basis change. ...
However, periodicity is generally not a good inductive bias for images, and emerging convolutions are autoregressive and their inverse is solved iteratively over dimensions. ...
arXiv:2006.01910v2
fatcat:t3bdtfbkbbfqphe6ww2pf22fbm
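As the name suggests, the convolution exponential applies z = exp(M)x, where M is the linear map of an ordinary convolution: the exponential is evaluated with a truncated power series, its inverse is exp(-M), and log|det exp(M)| = tr(M), which for a zero-padded convolution is H·W times the centre tap. A minimal single-channel NumPy sketch under those assumptions (not the paper's multi-channel implementation):

    import numpy as np

    def conv3x3(x, k):
        # the linear map M: single-channel 3x3 cross-correlation with zero padding
        H, W = x.shape
        xp = np.pad(x, 1)
        return np.array([[np.sum(k * xp[i:i + 3, j:j + 3]) for j in range(W)]
                         for i in range(H)])

    def conv_exp(x, k, terms=25):
        # z = exp(M) x via the truncated power series sum_n M^n x / n!
        z, term = x.copy(), x.copy()
        for n in range(1, terms):
            term = conv3x3(term, k) / n
            z = z + term
        return z

    rng = np.random.default_rng(2)
    k = 0.3 * rng.normal(size=(3, 3))     # small kernel so the series converges quickly
    x = rng.normal(size=(6, 6))

    z = conv_exp(x, k)                    # forward
    x_rec = conv_exp(z, -k)               # inverse: exp(-M) undoes exp(M)
    print(np.allclose(x, x_rec))          # True up to truncation error

    logdet = x.size * k[1, 1]             # log|det exp(M)| = tr(M) = H*W * (centre tap)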
MaCow: Masked Convolutional Generative Flow
[article]
2019
arXiv
pre-print
In this work, we introduce masked convolutional generative flow (MaCow), a simple yet effective architecture of generative flow using masked convolution. ...
Despite their computational efficiency, the density estimation performance of flow-based generative models significantly falls behind those of state-of-the-art autoregressive models. ...
Masked Convolutional Generative Flows In this section, we describe the architectural components of the masked convolutional generative flow (MACOW). ...
arXiv:1902.04208v5
fatcat:u4djxn3hwjf4ljwd63j7akyrwm
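Masked convolutions of this kind are inverted sequentially: in raster order, each output mixes the current input pixel (through the centre tap) only with pixels that have already been recovered, so the inverse is a pixel-by-pixel substitution. A minimal single-channel NumPy sketch of that iterative inverse (an illustration of the generic masked-convolution mechanism, not MaCow's multi-channel architecture):

    import numpy as np

    def masked_conv2d(x, k):
        # single-channel 3x3 cross-correlation with zero padding
        H, W = x.shape
        xp = np.pad(x, 1)
        return np.array([[np.sum(k * xp[i:i + 3, j:j + 3]) for j in range(W)]
                         for i in range(H)])

    def masked_conv2d_inverse(z, k):
        # recover x pixel by pixel in raster order; requires a nonzero centre tap
        H, W = z.shape
        xp = np.pad(np.zeros_like(z), 1)
        for i in range(H):
            for j in range(W):
                ctx = np.sum(k * xp[i:i + 3, j:j + 3])   # centre of xp is still 0 here
                xp[i + 1, j + 1] = (z[i, j] - ctx) / k[1, 1]
        return xp[1:-1, 1:-1]

    rng = np.random.default_rng(3)
    k = rng.normal(size=(3, 3))
    k[2, :] = 0.0                          # raster-order autoregressive mask
    k[1, 2] = 0.0
    x = rng.normal(size=(6, 6))
    print(np.allclose(masked_conv2d_inverse(masked_conv2d(x, k), k), x))   # True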
Invertible Convolutional Flow
2019
Neural Information Processing Systems
We show that these transforms allow more effective normalizing flow models to be developed for generative image models. ...
As an alternative, we investigate a set of novel normalizing flows based on the circular and symmetric convolutions. ...
For example, f_* denotes the convolutional flow in general and σ_α is used to specify the pointwise nonlinear bijectors with its inverse being φ_α. ...
dblp:conf/nips/KaramiSSDD19
fatcat:sfkmxc3wprejhiwijheiadgoay
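The circular convolutions in this entry are invertible because a circular convolution is diagonalized by the 2-D DFT: the forward pass is an elementwise product in the Fourier domain, the inverse is an elementwise division, and the Jacobian log-determinant is the sum of log-magnitudes of the kernel's DFT. A minimal NumPy sketch of that circular case (the paper's symmetric-convolution variant is not shown):

    import numpy as np

    rng = np.random.default_rng(4)
    H, W = 8, 8
    x = rng.normal(size=(H, W))
    k = rng.normal(size=(H, W))            # kernel, zero-padded to the image size

    K = np.fft.fft2(k)                                       # DFT of the kernel
    z = np.real(np.fft.ifft2(np.fft.fft2(x) * K))            # forward: circular convolution
    x_rec = np.real(np.fft.ifft2(np.fft.fft2(z) / K))        # inverse: elementwise division
    print(np.allclose(x, x_rec))                             # True (K has no zero entries here)

    logdet = np.sum(np.log(np.abs(K)))     # log|det| of the circular convolution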
Intrusion Detection Algorithm Based on Convolutional Neural Network
2018
DEStech Transactions on Engineering and Technology Research
Therefore, the training weights of the different flow types in the IDS model cannot be balanced. Compared with other kinds of flows, DoS attacks and normal flows are easier to detect. ...
This is the main reason why this thesis chose the KDD99 dataset. There are 4,898,431 network flows in the dataset, and each flow has 41 features. ...
doi:10.12783/dtetr/iceta2017/19916
fatcat:ss7fsz3ryrfddeiq7albfs6r4i
Flow-based Spatio-Temporal Structured Prediction of Dynamics
[article]
2022
arXiv
pre-print
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and large interdimensional correlations, making them appealing ...
We specifically propose to use conditional priors to factorize the latent space for the time dependent modeling. We also exploit the use of masked convolutions as autoregressive conditionals in CNFs. ...
As shown in Figure 2, masked convolutions with spatial and temporal orderings generate conditional weights for the steps of the normalizing flows. ...
arXiv:2104.04391v2
fatcat:adddsj6dfzbldk2p2zgkzuq6li
ButterflyFlow: Building Invertible Layers with Butterfly Matrices
2022
International Conference on Machine Learning
Normalizing flows model complex probability distributions using maps obtained by composing invertible layers. ...
Based on our invertible butterfly layers, we construct a new class of normalizing flow models called ButterflyFlow. ...
The authors would like to thank Jiaming Song for the constructive feedback. This research was supported by NSF (#1651565), AFOSR (FA95501910024), ARO (W911NF-21-1-0125) and a Sloan Fellowship. ...
dblp:conf/icml/MengZCDE22
fatcat:ug6w4dsal5gyhmpofeg2syapui
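A butterfly layer composes log2(n) sparse stages; in each stage, index pairs a fixed stride apart are mixed by independent invertible 2×2 blocks, so the whole map is invertible stage by stage and its log-determinant is the sum of the blocks' 2×2 log-determinants. A minimal dense NumPy sketch of that structure (an illustration of butterfly-structured maps in general, not the paper's implementation, which may parameterize the factors differently):

    import numpy as np

    def butterfly_stage(y, stride, blocks):
        # mix index pairs (i, i + stride) with one invertible 2x2 matrix per pair
        y = y.copy()
        b = 0
        for start in range(0, len(y), 2 * stride):
            for off in range(stride):
                i, j = start + off, start + off + stride
                m = blocks[b]; b += 1
                y[i], y[j] = m[0, 0] * y[i] + m[0, 1] * y[j], m[1, 0] * y[i] + m[1, 1] * y[j]
        return y

    n = 8
    rng = np.random.default_rng(5)
    x = rng.normal(size=n)
    stages = [(2 ** s, rng.normal(size=(n // 2, 2, 2))) for s in range(int(np.log2(n)))]

    y = x.copy()
    for stride, blocks in stages:                 # forward: strides 1, 2, 4, ...
        y = butterfly_stage(y, stride, blocks)

    x_rec = y.copy()
    for stride, blocks in reversed(stages):       # inverse: undo stages in reverse order
        x_rec = butterfly_stage(x_rec, stride, np.linalg.inv(blocks))
    print(np.allclose(x, x_rec))                  # True

    logdet = sum(np.linalg.slogdet(b)[1] for _, blocks in stages for b in blocks)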
Self Normalizing Flows
[article]
2021
arXiv
pre-print
In this work, we propose Self Normalizing Flows, a flexible framework for training normalizing flows by replacing expensive terms in the gradient by learned approximate inverses at each layer. ...
Efficient gradient computation of the Jacobian determinant term is a core problem in many machine learning settings, and especially so in the normalizing flow framework. ...
A General Framework for Self Normalizing Flows
Preliminaries. Given an observation x ∈ R^D, it is assumed that x is generated from an underlying real vector z ∈ R^D through an invertible and differentiable ...
arXiv:2011.07248v2
fatcat:2v2xzvp2tzb3blaj5popop6xxe
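For a linear flow layer z = Wx, the expensive term in the gradient is the derivative of the log-determinant, which standard matrix calculus gives as the inverse transpose; following the abstract, the framework replaces it with a learned approximate inverse R ≈ W^{-1} (the identity below is standard, its use here is inferred from the abstract):

    \frac{\partial}{\partial W} \log\left|\det W\right| = W^{-\top} \approx R^{\top}

Computing W^{-T} exactly costs O(D^3) for a dense D×D weight matrix, which is what makes the learned approximation attractive.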
A convolution method to assess subgrid‐scale interactions between flow and patchy vegetation in biogeomorphic models
2020
Journal of Advances in Modeling Earth Systems
This new methodology allows for spatially refining coarse-resolution hydrodynamic simulations of flow velocities (order of m) around fine-resolution patchy vegetation patterns (order of 10 cm). ...
Finally, we estimate that replacing a fine-resolution model by a coarser-resolution model associated with the convolution method could reduce the computational time of real-life fluctuating flow simulations ...
For SRM+, the general trend is easier to read. The relative error is smallest for the smallest patches because the convolution method is more efficient at that scale. ...
doi:10.1029/2020ms002116
fatcat:zd5lcode7vewrjyjft2nr2de6y
A new line integral convolution algorithm for visualizing time-varying flow fields
1998
IEEE Transactions on Visualization and Computer Graphics
To visualize data generated from these simulations, traditional techniques, such as displaying particle traces, can only reveal flow phenomena in preselected local regions and, thus, are unable to track ...
In addition, our algorithm maintains the coherence of the flow animation by successively updating the convolution results over time. ...
Special thanks to Randy Kaemmerer for his meticulous proofreading of this manuscript, and to Michael Cox and David Ellsworth for interesting discussions and valuable suggestions in the parallel implementation ...
doi:10.1109/2945.694952
fatcat:tkmn626qhfeyrn2ltf4xa3k6fq
A Deep Pedestrian Tracking SSD-Based Model in the Sudden Emergency or Violent Environment
2021
Journal of Advanced Transportation
In general, the model has good tracking results and credibility for multitarget tracking in emergency environments. ...
The research provides technical support for safety assurance and behavior monitoring in emergency environments. ...
accuracy and speed are improved. The detection speed for abnormal states is higher than for normal states; conversely, the detection accuracy for normal states is higher than for abnormal states. ...
doi:10.1155/2021/2085876
fatcat:l7hotbholnhsjebzk52okkj5nu
Showing results 1 — 15 out of 54,018 results