824,933 Hits in 5.1 sec

Deep Complex Networks [article]

Chiheb Trabelsi, Olexa Bilaniuk, Ying Zhang, Dmitriy Serdyuk, Sandeep Subramanian, João Felipe Santos, Soroush Mehri, Negar Rostamzadeh, Yoshua Bengio, Christopher J Pal
2018 arXiv   pre-print
In this work, we provide the key atomic components for complex-valued deep neural networks and apply them to convolutional feed-forward networks and convolutional LSTMs.  ...  Despite their attractive properties and potential for opening up entirely new neural architectures, complex-valued deep neural networks have been marginalized due to the absence of the building blocks  ...  neural networks by providing theoretical and mathematical motivation for using complex-valued deep networks.  ... 
arXiv:1705.09792v4 fatcat:4jmhouzdfjespiytrb3f4hyzuy
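
A note for context: the central building block such networks rely on, complex convolution, factors into four real-valued convolutions via the identity (A + iB) * (x + iy) = (A*x - B*y) + i(B*x + A*y). A minimal NumPy sketch of that identity (illustrative only, not the authors' released code):

```python
import numpy as np

def complex_conv1d(x_re, x_im, w_re, w_im):
    """Complex convolution built from real convolutions:
    (A + iB) * (x + iy) = (A*x - B*y) + i(B*x + A*y)."""
    conv = lambda s, k: np.convolve(s, k, mode="valid")
    out_re = conv(x_re, w_re) - conv(x_im, w_im)
    out_im = conv(x_re, w_im) + conv(x_im, w_re)
    return out_re, out_im

# Sanity check against NumPy's native complex convolution.
x = np.random.randn(16) + 1j * np.random.randn(16)
w = np.random.randn(3) + 1j * np.random.randn(3)
re, im = complex_conv1d(x.real, x.imag, w.real, w.imag)
assert np.allclose(re + 1j * im, np.convolve(x, w, mode="valid"))
```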

Neuronal Synchrony in Complex-Valued Deep Networks [article]

David P. Reichert, Thomas Serre
2014 arXiv   pre-print
We introduce a neural network formulation based on complex-valued neuronal units that is not only biologically meaningful but also amenable to a variety of deep learning frameworks.  ...  Thus, neuronal synchrony could be a flexible mechanism that fulfills multiple functional roles in deep networks.  ...  Here, we consider how to incorporate such notions into deep networks.  ... 
arXiv:1312.6115v5 fatcat:ak4datu4wjcxvl5pxrjrcnlmai
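
The core intuition, that synchronous inputs reinforce while desynchronized inputs cancel, follows directly from complex arithmetic once magnitude stands in for firing rate and phase for timing. A toy illustration of that idea (a sketch of the general principle, not the paper's exact formulation):

```python
import numpy as np

rate = 1.0
in_phase   = rate * np.exp(1j * 0.0) + rate * np.exp(1j * 0.0)    # synchronous pair
anti_phase = rate * np.exp(1j * 0.0) + rate * np.exp(1j * np.pi)  # desynchronized pair

print(abs(in_phase))    # 2.0 -> aligned phases reinforce
print(abs(anti_phase))  # ~0  -> opposite phases cancel
```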

Deep learning of contagion dynamics on complex networks [article]

Charles Murphy, Edward Laurence, Antoine Allard
2021 arXiv   pre-print
Our results demonstrate how deep learning offers a new and complementary perspective to build effective models of contagion dynamics on networks.  ...  Here, we propose a complementary approach based on deep learning where the effective local mechanisms governing a dynamic on a network are learned from time series data.  ...  We believe this work establishes solid foundations for the use of deep learning in the design of realistic effective models of complex systems.  ... 
arXiv:2006.05410v5 fatcat:bugrcugfpjf55dssiwhvwivvhe
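
A generic message-passing layer of the kind such models build on: each node updates its state from its own features plus an aggregate over its neighbours. A minimal sketch with hypothetical parameter names (the paper's architecture differs in its details):

```python
import numpy as np

def gnn_layer(A, X, W_self, W_nbr):
    """One message-passing step: combine each node's own state (X @ W_self)
    with the sum of its neighbours' states ((A @ X) @ W_nbr)."""
    return np.tanh(X @ W_self + (A @ X) @ W_nbr)

# Toy 4-node ring with 3-dimensional node states.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)
H = gnn_layer(A, X, np.random.randn(3, 3), np.random.randn(3, 3))
```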

Deep-Waveform: A Learned OFDM Receiver Based on Deep Complex-valued Convolutional Networks [article]

Zhongyuan Zhao, Mehmet C. Vuran, Fujuan Guo, Stephen D. Scott
2021 arXiv   pre-print
In this paper, a deep complex-valued convolutional network (DCCN) is developed to recover bits from time-domain OFDM signals without relying on any explicit DFT/IDFT.  ...  The proposed approach benefits from the expressive nature of complex-valued neural networks, which, however, currently lack support from popular deep learning platforms.  ...  In this paper, we propose a deep complex-valued convolutional network (DCCN) design to recover bits from synchronized time-domain OFDM signals.  ... 
arXiv:1810.07181v6 fatcat:nelcdghnbzhcrhrvcu2fmo5sve
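
The observation motivating such a receiver is that the DFT is itself a complex linear map, so a single complex-valued layer can express it exactly and, in principle, learn it from data. A NumPy sanity check, assuming a unitary DFT convention:

```python
import numpy as np

n = 64                                   # number of subcarriers
F = np.fft.fft(np.eye(n)) / np.sqrt(n)  # unitary DFT matrix

# Frequency-domain QPSK symbols -> time-domain OFDM symbol (IDFT at the transmitter).
sym = (np.random.choice([-1, 1], n) + 1j * np.random.choice([-1, 1], n)) / np.sqrt(2)
time_domain = np.conj(F).T @ sym

# A complex linear layer whose weight matrix equals F recovers the subcarriers,
# i.e. the receiver's DFT is expressible by one complex-valued layer.
assert np.allclose(F @ time_domain, sym)
```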

Complex network prediction using deep learning [article]

Yoshihisa Tanaka, Ryosuke Kojima, Shoichi Ishida, Fumiyoshi Yamashita, Yasushi Okuno
2021 arXiv   pre-print
In this work, we propose a deep learning approach to this problem based on Graph Convolutional Networks for predicting networks while preserving their original structural properties.  ...  Real-world networks typically exhibit complex topologies whose structural properties are key factors in characterizing and further exploring the networks themselves.  ...  Discussion: We propose a deep learning approach for network prediction (Fig. 1) based on GCNs that predicts complex networks while preserving their structural properties (Fig. 2, Fig. 3).  ... 
arXiv:2104.03871v1 fatcat:p2dxz7lvfrcodhitwkxkl5n5bm
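
For reference, the standard GCN propagation rule underlying this approach is H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W); the paper's exact variant may differ:

```python
import numpy as np

def gcn_layer(A, H, W):
    """Kipf-and-Welling-style propagation with added self-loops and
    symmetric degree normalization."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)
```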

Deep learning systems as complex networks [article]

Alberto Testolin, Michele Piccolini, Samir Suweis
2018 arXiv   pre-print
In this article we propose to study deep belief networks using techniques commonly employed in the study of complex networks, in order to gain some insights into the structural and functional properties  ...  In particular, one class of deep learning models, known as deep belief networks, can discover intricate statistical structure in large data sets in a completely unsupervised fashion, by learning a generative  ...  Deep Belief Networks: A groundbreaking discovery is that RBMs can be used as building blocks to build more complex neural network architectures, where the hidden variables of the generative model are organized  ... 
arXiv:1809.10941v1 fatcat:qxeiq4qehbawbe3gpdfaujylrm

Complexity of Linear Regions in Deep Networks [article]

Boris Hanin, David Rolnick
2019 arXiv   pre-print
It is well-known that the expressivity of a neural network depends on its architecture, with deeper networks expressing more complex functions.  ...  It is possible to construct networks with merely a single region, or for which the number of linear regions grows exponentially with depth; it is not clear where within this range most networks fall in  ...  Moreover, both our theoretical and empirical findings suggest that for certain measures of complexity, trained deep networks are remarkably similar to the same networks at initialization.  ... 
arXiv:1901.09021v2 fatcat:4jbr7udbnzhj5jk6mprviabeeu
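
This is easy to probe empirically: along a line through input space, every change of ReLU activation pattern marks a boundary between linear regions, so counting distinct patterns counts the regions the line crosses. A sketch with a random untrained network and hypothetical sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((20, 2)), rng.standard_normal(20)
W2, b2 = rng.standard_normal((20, 20)), rng.standard_normal(20)

# Sample points densely along a line through the 2-D input space.
t = np.linspace(-5, 5, 10_000)
X = np.stack([t, 0.3 * t + 1.0], axis=1)

h1 = np.maximum(X @ W1.T + b1, 0)
h2 = np.maximum(h1 @ W2.T + b2, 0)

# Each distinct on/off pattern of the ReLUs is a distinct linear region.
patterns = np.concatenate([h1 > 0, h2 > 0], axis=1)
print(len({tuple(p) for p in patterns}))  # regions crossed by the line
```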

Dropout Rademacher Complexity of Deep Neural Networks [article]

Wei Gao, Zhi-Hua Zhou
2014 arXiv   pre-print
For shallow neural networks, dropout reduces the Rademacher complexity only polynomially, whereas for deep neural networks it can lead to an exponential reduction of the Rademacher complexity.  ...  Great successes of deep neural networks have been witnessed in various real applications.  ... 
arXiv:1402.3811v2 fatcat:iclmwdkqu5htziileuj2p3gagq
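
For reference, the quantity being bounded is the empirical Rademacher complexity of a function class $\mathcal{F}$ on a sample $S = (x_1, \dots, x_n)$, with i.i.d. uniform $\pm 1$ variables $\sigma_i$:

$$\hat{\mathfrak{R}}_S(\mathcal{F}) = \mathbb{E}_{\boldsymbol\sigma}\left[\sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i\, f(x_i)\right]$$

The paper's contrast is between a polynomial reduction of this quantity under dropout for shallow networks and an exponential reduction for deep ones.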

Reducing Deep Network Complexity with Fourier Transform Methods [article]

Andrew Kiruluta
2018 arXiv   pre-print
The consequence of using this input is a reduced-complexity neural network, a reduced computational load, and the lifting of the requirement for a large number of training examples to achieve high classification  ...  We propose a novel way that uses shallow, densely connected neural network architectures to achieve superior performance to convolutional neural network (CNN) approaches, with the added benefits of  ...  An additional complexity introduced by these deep networks is the requirement for multiple layers of dropout and pooling (in the case of CNNs), which adds to the computational training load.  ... 
arXiv:1801.01451v2 fatcat:z4fbspswrfaajiyhigonsyjmti
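
A minimal sketch of the preprocessing this describes, replacing raw pixels with a log-magnitude Fourier spectrum before a shallow dense network (illustrative; the paper's exact pipeline may differ):

```python
import numpy as np

def fourier_features(img):
    """Map an image to its centered log-magnitude spectrum, flattened
    so it can feed a densely connected network instead of a CNN."""
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    return np.log1p(np.abs(spectrum)).ravel()

img = np.random.rand(28, 28)   # stand-in for a 28x28 grayscale digit
x = fourier_features(img)      # 784-dim input vector for a shallow dense net
```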

Augmenting Physical Models with Deep Networks for Complex Dynamics Forecasting [article]

Yuan Yin, Vincent Le Guen, Jérémie Dona, Emmanuel de Bézenac, Ibrahim Ayed, Nicolas Thome, Patrick Gallinari
2021 arXiv   pre-print
Forecasting complex dynamical phenomena in settings where only partial knowledge of their dynamics is available is a prevalent problem across various scientific fields.  ...  In this work, we introduce the APHYNITY framework, a principled approach for augmenting incomplete physical dynamics described by differential equations with deep data-driven models.  ...  Solving APHYNITY with deep neural networks: In the following, both terms of the decomposition are parametrized and are denoted as $F^{p}_{\theta_p}$ and $F^{a}_{\theta_a}$.  ... 
arXiv:2010.04456v5 fatcat:bdct4vuywndi7mpnchx732usle
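
In this notation the dynamics are decomposed as $F = F^{p}_{\theta_p} + F^{a}_{\theta_a}$, with $F^{p}_{\theta_p}$ the physical model and $F^{a}_{\theta_a}$ the data-driven augmentation; the framework fits trajectories while keeping the learned augmentation minimal, roughly (a paraphrase; see the paper for the exact norm and function spaces):

$$\min_{\theta_p,\, \theta_a} \left\lVert F^{a}_{\theta_a} \right\rVert \quad \text{subject to} \quad \forall X,\ \forall t: \quad \frac{dX_t}{dt} = \left(F^{p}_{\theta_p} + F^{a}_{\theta_a}\right)(X_t)$$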

Complexity for deep neural networks and other characteristics of deep feature representations [article]

Romuald A. Janik, Przemek Witaszczyk
2021 arXiv   pre-print
We define a notion of complexity, which quantifies the nonlinearity of the computation of a neural network, as well as a complementary measure of the effective dimension of feature representations.  ...  The entropic character of the proposed notion of complexity should make it possible to transfer modes of analysis from neuroscience and statistical physics to the domain of artificial neural networks.  ...  This work was supported by the Foundation for Polish Science (FNP) project Bio-inspired Artificial Neural Networks POIR.04.04.00-00-14DE/18-00.  ... 
arXiv:2006.04791v2 fatcat:6lkcotpwfvenvpojqd6pxwadty

Bayesian Sparsification Methods for Deep Complex-valued Networks [article]

Ivan Nazarov, Evgeny Burnaev
2020 arXiv   pre-print
With continual miniaturization, ever more applications of deep learning can be found in embedded systems, where it is common to encounter data with natural complex domain representation.  ...  We replicate the state-of-the-art result by Trabelsi et al. [2018] on MusicNet with a complex-valued network compressed by 50-100x at a small performance penalty.  ...  despite the somewhat higher arithmetic complexity of ℂ-valued networks.  ... 
arXiv:2003.11413v2 fatcat:loaezwnv4ffvdalzripgjsxcxy

Low-Complexity Vector Quantized Compressed Sensing via Deep Neural Networks [article]

Markus Leinonen, Marian Codreanu
2020 arXiv   pre-print
We propose a deep encoder-decoder architecture, consisting of an encoder deep neural network (DNN), a quantizer, and a decoder DNN, that realizes low-complexity vector quantization aiming at minimizing  ...  Simulation results show that the proposed non-iterative DNN-based QCS method achieves higher rate-distortion performance with lower algorithm complexity as compared to standard QCS methods, conducive to  ...  As will be elaborated later, the method realizes (low-complexity) vector quantization (VQ); hence, we dub the proposed deep encoder-decoder architecture for QCS as DeepVQCS.  ... 
arXiv:2005.08385v3 fatcat:hmt4ofci7fcpjajinpim2wvni4
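
A skeletal version of the encoder-quantizer-decoder pipeline the abstract describes, using untrained single-layer stand-ins for the DNNs and a plain uniform scalar quantizer; all names and sizes here are hypothetical:

```python
import numpy as np

def uniform_quantize(z, levels=16, lo=-1.0, hi=1.0):
    """Uniform scalar quantizer (stand-in for the learned quantizer)."""
    step = (hi - lo) / (levels - 1)
    return np.round((np.clip(z, lo, hi) - lo) / step) * step + lo

rng = np.random.default_rng(1)
n, m, code_dim = 128, 32, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)   # compressed-sensing matrix
W_enc = rng.standard_normal((code_dim, m))     # one-layer stand-in encoder "DNN"
W_dec = rng.standard_normal((n, code_dim))     # one-layer stand-in decoder "DNN"

x = np.zeros(n)
x[rng.choice(n, 5, replace=False)] = rng.standard_normal(5)  # sparse signal
y = A @ x                                      # compressed measurements
code = uniform_quantize(np.tanh(W_enc @ y))    # encoder -> quantizer
x_hat = W_dec @ code                           # decoder reconstruction (untrained)
```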

Characterizing Learning Dynamics of Deep Neural Networks via Complex Networks [article]

Emanuele La Malfa, Gabriele La Malfa, Giuseppe Nicosia, Vito Latora
2021 arXiv   pre-print
In this paper, we interpret Deep Neural Networks with Complex Network Theory.  ...  Complex Network Theory (CNT) represents Deep Neural Networks (DNNs) as directed weighted graphs to study them as dynamical systems.  ...  In [18], CNT metrics are used to distill information from Deep Belief Networks, which are generative models that differ from feedforward neural networks in that the learning phase is generally  ... 
arXiv:2110.02628v2 fatcat:4h42xxkvsve3vemm5fdsda4uim
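
The graph representation CNT starts from is simple to construct: one node per unit and one weighted directed edge per connection. A sketch using networkx (one common mapping, not necessarily the paper's exact construction):

```python
import numpy as np
import networkx as nx

def mlp_to_digraph(weight_matrices):
    """Build a directed weighted graph from an MLP: nodes are (layer, unit)
    pairs, edges carry the corresponding connection weights."""
    G = nx.DiGraph()
    for layer, W in enumerate(weight_matrices):   # W has shape (n_out, n_in)
        n_out, n_in = W.shape
        for i in range(n_in):
            for j in range(n_out):
                G.add_edge((layer, i), (layer + 1, j), weight=float(W[j, i]))
    return G

rng = np.random.default_rng(0)
G = mlp_to_digraph([rng.standard_normal((5, 3)), rng.standard_normal((2, 5))])
print(G.number_of_nodes(), G.number_of_edges())   # 10 nodes, 25 edges
```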

Data-dependent Sample Complexity of Deep Neural Networks via Lipschitz Augmentation [article]

Colin Wei, Tengyu Ma
2020 arXiv   pre-print
For feedforward neural nets as well as RNNs, we obtain tighter Rademacher complexity bounds by considering additional data-dependent properties of the network: the norms of the hidden layers of the network  ...  Existing Rademacher complexity bounds for neural networks rely only on norm control of the weight matrices and depend exponentially on depth via a product of the matrix norms.  ...  as deep neural networks.  ... 
arXiv:1905.03684v3 fatcat:4txfh3dwtfdydm4cbe2mfimlm4
Showing results 1 — 15 out of 824,933 results