13,540 Hits in 5.8 sec

Motif-Matching Based Subgraph-Level Attentional Convolutional Network for Graph Classification

Hao Peng, Jianxin Li, Qiran Gong, Yuanxin Ning, Senzhang Wang, Lifang He
2020 Proceedings of the AAAI Conference on Artificial Intelligence (AAAI)
In this work, we present a novel motif-based attentional graph convolutional neural network for graph classification, which can learn more discriminative and richer graph features.  ...  Graph classification is critically important to many real-world applications associated with graph data, such as chemical drug analysis and social network mining.  ...  Second, we design subgraph-independent convolutional neural networks to learn different levels of features for each subgraph without pooling operators.  ... 
doi:10.1609/aaai.v34i04.5987 fatcat:h3v7tuootnho3o4tj2zeqnghvu

Analyzing the Performance of Graph Neural Networks with Pipe Parallelism [article]

Matthew T. Dearing, Xiaoyan Wang
2021 arXiv   pre-print
In this study, we focus on Graph Neural Networks (GNN) that have found great success in tasks such as node or edge classification and link prediction.  ...  While new approaches for processing larger networks are needed to advance graph techniques, and several have been proposed, we study how GNNs could be parallelized using existing tools and frameworks that  ...  Rick Stevens, and Peng Ding for the guidance and feedback on this paper.  ... 
arXiv:2012.10840v2 fatcat:ygantb35i5ghrkulczmx7dyb2e

Semiparallel deep neural network hybrid architecture: first application on depth from monocular camera

Shabab Bazrafkan, Hossein Javidnia, Joseph Lemley
2018 Journal of Electronic Imaging (JEI)  
Eight different networks are designed to perform depth estimation, each of them suitable for a particular feature level. Networks with different pooling sizes determine different feature levels.  ...  Deep neural networks have been applied to a wide range of problems in recent years.  ...  This paper also introduces the use of Semi Parallel Deep Neural Networks (SPDNN).  ... 
doi:10.1117/1.jei.27.4.043041 fatcat:45krx4q4arfe7gnzsjx7w3zyl4

Improving the expressiveness of deep learning frameworks with recursion

Eunji Jeong, Joo Seong Jeong, Soojeong Kim, Gyeong-In Yu, Byung-Gon Chun
2018 Proceedings of the Thirteenth EuroSys Conference on - EuroSys '18  
...  between nodes for efficient execution based on parallel computation.  ...  However, embedded control flow deep learning frameworks such as TensorFlow, Theano, Caffe2, and MXNet fail to efficiently represent and execute such neural networks, due to lack of support for recursion.  ...  Instead, they create a new static computation graph for every activated control flow. This approach enables fast prototyping and easy development of various deep neural networks.  ... 
doi:10.1145/3190508.3190530 dblp:conf/eurosys/JeongJKYC18 fatcat:yjt6dn2sw5fdppk664dtgliqg4

AMPNet: Asynchronous Model-Parallel Training for Dynamic Neural Networks [article]

Alexander L. Gaunt, Matthew A. Johnson, Maik Riechert, Daniel Tarlow, Ryota Tomioka, Dimitrios Vytiniotis, Sam Webster
2017 arXiv   pre-print
We present an asynchronous model-parallel (AMP) training algorithm that is specifically motivated by training on networks of interconnected devices.  ...  Our framework opens the door for scaling up a new class of deep learning models that cannot be efficiently trained today.  ...  John Langford for a discussion on asynchrony and reproducibility, and Frank Seide for discussions on dynamic networks.  ... 
arXiv:1705.09786v3 fatcat:4lnnnip4pvb4fm6nquzjl7a6c4

A Comprehensive Survey on Graph Neural Networks [article]

Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu
2019 arXiv   pre-print
We propose a new taxonomy to divide the state-of-the-art graph neural networks into four categories, namely recurrent graph neural networks, convolutional graph neural networks, graph autoencoders, and spatial-temporal graph neural networks.  ...  Neural Network for Graphs (NN4G) [24], proposed in parallel with GNN*, is the first work towards spatial-based ConvGNNs.  ... 
arXiv:1901.00596v4 fatcat:xxuchvawonhczay2sgjgzw5wgu

Using Graph Neural Networks to model the performance of Deep Neural Networks [article]

Shikhar Singh, Benoit Steiner, James Hegarty, Hugh Leather
2021 arXiv   pre-print
Graphs present a natural and intuitive way to model deep-learning networks, where each node represents a computational stage or operation.  ...  Existing performance models employ feed-forward networks, recurrent networks, or decision tree ensembles to estimate the performance of different implementations of a neural network.  ...  Index Terms: Deep Learning, Neural Networks, Code Optimization, Performance Modeling, Graph Neural Networks.  ... 
arXiv:2108.12489v1 fatcat:xvzxtg33uvhjzij4v2qrbnbfra

Scalable Deep Generative Modeling for Sparse Graphs [article]

Hanjun Dai, Azade Nazi, Yujia Li, Bo Dai, Dale Schuurmans
2020 arXiv   pre-print
However, current deep neural methods suffer from limited scalability: for a graph with n nodes and m edges, existing deep neural methods require Ω(n^2) complexity by building up the adjacency matrix.  ...  Learning graph generative models is a challenging task for deep learning and has wide applicability to a range of domains like chemistry, biology, and social science.  ...  Acknowledgements We would like to thank Azalia Mirhoseini, Polo Chau, Sherry Yang and anonymous reviewers for valuable comments and suggestions.  ... 
arXiv:2006.15502v1 fatcat:wkmy2mqskjfkvg5vjg5yc45ksi
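The Ω(n^2) scalability point in the snippet above is easy to see concretely: a dense adjacency matrix grows quadratically in the node count, while an edge list grows only with the edge count. A minimal sketch (function names and figures are illustrative, not from the paper):

```python
# Storage cost of a dense adjacency matrix vs. a sparse edge list.
# For sparse graphs (m << n^2) the dense representation dominates,
# which is the Omega(n^2) bottleneck the abstract refers to.

def dense_adjacency_entries(n: int) -> int:
    """Entries stored by an n x n dense adjacency matrix."""
    return n * n

def edge_list_entries(m: int) -> int:
    """Entries stored by an edge list: two endpoints per edge."""
    return 2 * m

n, m = 100_000, 500_000  # e.g. a sparse graph with average degree ~10
print(dense_adjacency_entries(n))  # 10000000000
print(edge_list_entries(m))        # 1000000
```

With average degree around 10, the edge list is four orders of magnitude smaller, which is why scalable generators avoid materializing the full adjacency matrix.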

Learning to Decode Linear Codes Using Deep Learning [article]

Eliya Nachmani, Yair Beery, David Burshtein
2016 arXiv   pre-print
A novel deep learning method for improving the belief propagation algorithm is proposed.  ...  The method generalizes the standard belief propagation algorithm by assigning weights to the edges of the Tanner graph. These weights are then trained using deep learning techniques.  ...  The Tesla K40c used for this research was donated by the NVIDIA Corporation.  ... 
arXiv:1607.04793v2 fatcat:noyvfqo7grf7xjgkpwx3cra3ly
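The idea in the snippet above, standard belief propagation generalized by a trainable weight on each Tanner-graph edge, can be illustrated with a single weighted variable-to-check message update. This is a hedged sketch with illustrative values, not the paper's exact formulation:

```python
import numpy as np

# One variable-to-check message in weighted belief propagation:
# the channel LLR and each incoming check-to-variable message get a
# trainable weight; setting all weights to 1 recovers plain BP.

llr = 1.5                              # channel LLR at variable node v
incoming = np.array([0.4, -0.2, 0.9])  # check-to-variable messages into v
w_llr = 1.0                            # learnable weight on the LLR
w_edge = np.ones(3)                    # one learnable weight per Tanner edge

# Message from v to check c0 sums all weighted inputs except the one
# that came from c0 itself (index 0), per the extrinsic-information rule.
msg_v_to_c0 = w_llr * llr + np.sum(w_edge[1:] * incoming[1:])
print(round(float(msg_v_to_c0), 2))  # 2.2 with unit weights
```

In training, the weights would be optimized by gradient descent over unrolled decoder iterations; with all weights fixed at 1 the update is exactly the classical BP message.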

Chainer: A Deep Learning Framework for Accelerating the Research Cycle [article]

Seiya Tokui, Ryosuke Okuta, Takuya Akiba, Yusuke Niitani, Toru Ogawa, Shunta Saito, Shuji Suzuki, Kota Uenishi, Brian Vogel, Hiroyuki Yamazaki Vincent
2019 arXiv   pre-print
Software frameworks for neural networks play a key role in the development and application of deep learning methods.  ...  packages for state-of-the-art computer vision models as well as distributed training.  ...  However, for implementing other types of NN models, two major problems arise. The first is that it can be cumbersome to support general dynamic graphs, i.e., neural networks with control flow.  ... 
arXiv:1908.00213v1 fatcat:abyp464b7vagndtiwcet5cagtq

Scalable algorithms for physics-informed neural and graph networks [article]

Khemraj Shukla, Mengjia Xu, Nathaniel Trask, George Em Karniadakis
2022 arXiv   pre-print
Such physics-informed machine learning integrates multimodality and multifidelity data with mathematical models, and implements them using neural networks or graph networks.  ...  Unlike commercial machine learning, where training of deep neural networks requires big data, in PIML big data are not available.  ...  In contrast, Graph Neural Networks (GNNs) form a class of deep neural network methods designed to handle unstructured data.  ... 
arXiv:2205.08332v1 fatcat:3bq25a266vhyfhoommuj2uqtp4

Spatial Structured Prediction Models: Applications, Challenges, and Techniques

Zhe Jiang
2020 IEEE Access  
Graph neural networks are a generalization of deep neural networks from images to graphs.  ...  Specifically, it can be categorized into deep convolutional neural networks for raster imagery or regular spatial grids and graph neural networks for spatial graphs.  ... 
doi:10.1109/access.2020.2975584 fatcat:oeseqyr3dbhx3hznh6omk2c7mm

Symbolic Relational Deep Reinforcement Learning based on Graph Neural Networks [article]

Jaromír Janisch, Tomáš Pevný, Viliam Lisý
2021 arXiv   pre-print
We present a deep RL framework based on graph neural networks and auto-regressive policy decomposition that naturally works with these problems and is completely domain-independent.  ...  In goal-oriented BlockWorld, we demonstrate multi-parameter actions with pre-conditions. In SysAdmin, we show how to select multiple objects simultaneously.  ...  The GPU used for this research was donated by the NVIDIA Corporation.  ... 
arXiv:2009.12462v3 fatcat:wwpu3u4zmrb4tczzzopu7y4kcy

Learning Latent Causal Structures with a Redundant Input Neural Network [article]

Jonathan D. Young, Bryan Andrews, Gregory F. Cooper, Xinghua Lu
2020 arXiv   pre-print
We developed a deep learning model, which we call a redundant input neural network (RINN), with a modified architecture and a regularized objective function to find causal relationships between input,  ...  More specifically, our model allows input variables to directly interact with all latent variables in a neural network to influence what information the latent variables should encode in order to generate  ...  Acknowledgments Funding for this work was provided by NIH grant R01 LM012011. Author BA was supported by training grant T15LM007059 from the NLM.  ... 
arXiv:2003.13135v3 fatcat:hcmmqcg2vvcwnji7x4gs3b235e

Urban Traffic Flow Prediction with Deep Neural Network

Jin Yang, Muhammad Arif
2022 Security and Communication Networks  
Focusing on spatiotemporal characteristics, this paper studies two aspects and designs a traffic flow prediction model with a deep neural network.  ...  First, this work proposes a traffic flow spatial-feature learning algorithm that combines a graph convolutional neural network with an attention mechanism.  ...  Because of its deep layers and slow training speed, the deep belief network is usually used for traffic prediction research with large amounts of data.  ... 
doi:10.1155/2022/8711873 fatcat:67hvmmyjyjgltaqm2xg57htk6u