103,526 Hits in 4.3 sec

How Powerful are Graph Neural Networks? [article]

Keyulu Xu, Weihua Hu, Jure Leskovec, Stefanie Jegelka
2019 arXiv   pre-print
Graph Neural Networks (GNNs) are an effective framework for representation learning of graphs.  ...  Our results characterize the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, and show that they cannot learn to distinguish certain simple graph structures  ...  BUILDING POWERFUL GRAPH NEURAL NETWORKS First, we characterize the maximum representational capacity of a general class of GNN-based models.  ... 
arXiv:1810.00826v3 fatcat:rdxjhxusbvgqrhjoalsbmez2kq
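
As an aside on the excerpt above (not taken from the paper itself): the point that mean-style aggregators cannot distinguish certain neighborhoods can be seen with a toy multiset comparison; the NumPy sketch and feature values below are invented for illustration.

```python
import numpy as np

# Toy multiset comparison: a mean aggregator maps both neighborhoods to the
# same value, while a sum aggregator keeps them apart.  Feature values are
# invented for illustration.
neighborhood_a = np.array([[1.0], [1.0]])          # feature multiset {1, 1}
neighborhood_b = np.array([[1.0], [1.0], [1.0]])   # feature multiset {1, 1, 1}

print("mean:", neighborhood_a.mean(axis=0), neighborhood_b.mean(axis=0))  # identical
print("sum: ", neighborhood_a.sum(axis=0), neighborhood_b.sum(axis=0))    # different
```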

Graph Neural Networks in Network Neuroscience [article]

Alaa Bessadok, Mohamed Ali Mahjoub, Islem Rekik
2021 arXiv   pre-print
Relying on its non-Euclidean data type, the graph neural network (GNN) provides a clever way of learning the deep graph structure; it is rapidly becoming the state of the art, leading to enhanced performance in various network neuroscience tasks.  ...  WHAT DO GRAPH NEURAL NETWORKS OFFER TO NETWORK NEUROSCIENCE?  ...
arXiv:2106.03535v1 fatcat:jx7ixd7xjngthaq6qhb25gssm4

Network In Graph Neural Network [article]

Xiang Song and Runjie Ma and Jiahang Li and Muhan Zhang and David Paul Wipf
2021 arXiv   pre-print
Graph Neural Networks (GNNs) have shown success in learning from graph-structured data containing node/edge feature information, with applications to social networks, recommendation, fraud detection and  ...  Network In Graph Neural Network (NGNN) allows arbitrary GNN models to increase their model capacity by making the model deeper.  ...  INTRODUCTION Graph Neural Networks (GNNs) capture local graph structure and feature information in a trainable fashion to derive powerful node representations.  ...
arXiv:2111.11638v1 fatcat:n5ud4qfibjf67ch4qm2ai6dboe
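
A minimal sketch of the deepening idea the excerpt describes, assuming a plain NumPy mean-aggregation layer with a small inner MLP; shapes and activations are illustrative choices, not the authors' NGNN implementation.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def gnn_layer_with_inner_mlp(adj, features, w_msg, w_inner1, w_inner2):
    """One mean-aggregation GNN layer whose update is deepened by a small
    inner MLP, instead of stacking another round of message passing."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
    aggregated = adj @ features / deg       # mean over neighbors
    hidden = relu(aggregated @ w_msg)       # usual per-layer transform
    hidden = relu(hidden @ w_inner1)        # extra "network in" the layer
    return hidden @ w_inner2

# Toy 3-node graph and random weights; all shapes are illustrative only.
rng = np.random.default_rng(0)
adj = np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
x = rng.normal(size=(3, 4))
out = gnn_layer_with_inner_mlp(adj, x,
                               rng.normal(size=(4, 8)),
                               rng.normal(size=(8, 8)),
                               rng.normal(size=(8, 2)))
print(out.shape)  # (3, 2)
```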

Sparse Neural Networks Topologies [article]

Alfred Bourely, John Patrick Boueri, Krzysztof Choromonski
2017 arXiv   pre-print
We propose Sparse Neural Network architectures that are based on random or structured bipartite graph topologies.  ...  In this paper, we investigate how the accuracy and training speed of the models depend on the topology and sparsity of the neural network.  ...  We investigate how algebraic connectivity impacts a neural network's power, and show a strong correlation between how interconnected a neural network is and its resulting testing accuracy.  ...
arXiv:1706.05683v1 fatcat:ehnqw5rx7rewvi2pe36oox4vum
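
The last excerpt mentions algebraic connectivity; as a hedged illustration (not the paper's experimental setup), it can be computed as the second-smallest eigenvalue of the graph Laplacian, and a denser graph scores higher than a sparser one.

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the graph Laplacian (the Fiedler value)."""
    laplacian = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(laplacian))[1]

# A 4-vertex path is far less interconnected than the complete graph K4.
path = np.array([[0., 1., 0., 0.],
                 [1., 0., 1., 0.],
                 [0., 1., 0., 1.],
                 [0., 0., 1., 0.]])
complete = np.ones((4, 4)) - np.eye(4)
print(algebraic_connectivity(path))      # ~0.586
print(algebraic_connectivity(complete))  # 4.0
```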

Neural Network based Network Traffic Predictability for IoT Environment

Min-Su Seok, JinYeong Um* (Dept. of Computer Science Engineering, Dongguk Univ., Seoul, Republic of Korea)
2020 The International Journal of Internet of Things and its Applications  
Experiments show how neural networks can be used to predict network changes and whether they can predict changes in real networks.  ...  The purpose of this paper is to predict and compare network changes using neural networks.  ...  The input gate determines how much of the input data enters the neural network. The output gate adjusts the output of the neural network.  ...
doi:10.21742/ijiota.2020.4.1.03 fatcat:lf2jhqs7qnabzdmn2n54dueon4
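
The excerpt's mention of input and output gates points to an LSTM-style predictor; the sketch below shows those two gates under standard LSTM conventions and is not taken from the paper itself.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_gates_step(x_t, h_prev, c_prev, W_i, U_i, W_o, U_o, W_c, U_c):
    """One simplified LSTM step; the forget gate is omitted to keep the focus
    on the two gates named in the excerpt."""
    i_t = sigmoid(x_t @ W_i + h_prev @ U_i)      # input gate: how much new data enters
    c_tilde = np.tanh(x_t @ W_c + h_prev @ U_c)  # candidate cell state
    c_t = c_prev + i_t * c_tilde                 # cell update scaled by the input gate
    o_t = sigmoid(x_t @ W_o + h_prev @ U_o)      # output gate: how much state is exposed
    h_t = o_t * np.tanh(c_t)
    return h_t, c_t

# Illustrative shapes: 3 traffic features in, 5 hidden units.
rng = np.random.default_rng(0)
d, h = 3, 5
weights = [rng.normal(size=s) for s in [(d, h), (h, h)] * 3]
h_t, c_t = lstm_gates_step(rng.normal(size=d), np.zeros(h), np.zeros(h), *weights)
print(h_t.shape)  # (5,)
```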

Modeling Brain Networks with Artificial Neural Networks [article]

Baran Baris Kivilcim, Itir Onal Ertugrul, Fatos T. Yarman Vural
2018 arXiv   pre-print
The representation power of the suggested brain networks is tested on a task-fMRI dataset of the Human Connectome Project and a Complex Problem Solving dataset.  ...  In this study, we propose a neural network approach to capture the functional connectivities among anatomic brain regions.  ...  the weights of the neural network are assigned to the edge weights of the corresponding brain graph, a_{i,j} ← w_{i,j}, ∀ i,j.  ...
arXiv:1807.08368v1 fatcat:ayoduddyt5hnniql7fhcuzs5qu
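
A hedged reading of the excerpt's assignment a_{i,j} ← w_{i,j}: the trained weight between units standing for two brain regions becomes the edge weight between those regions. The 4-region example and random weights below are purely illustrative.

```python
import numpy as np

def weights_to_brain_graph(w):
    """Edge weights of the brain graph from trained connection weights:
    a[i, j] <- w[i, j] for all i, j."""
    n_regions = w.shape[0]
    a = np.zeros_like(w)
    for i in range(n_regions):
        for j in range(n_regions):
            a[i, j] = w[i, j]
    return a

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))        # "trained" weights between 4 regions (illustrative)
print(weights_to_brain_graph(w))   # weighted adjacency of the brain graph
```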

Hyperbolic Attention Networks [article]

Caglar Gulcehre, Misha Denil, Mateusz Malinowski, Ali Razavi, Razvan Pascanu, Karl Moritz Hermann, Peter Battaglia, Victor Bapst, David Raposo, Adam Santoro, Nando de Freitas
2018 arXiv   pre-print
We introduce hyperbolic attention networks to endow neural networks with enough capacity to match the complexity of data with hierarchical and power-law structure.  ...  Our method shows improvements in terms of generalization on neural machine translation, learning on graphs and visual question answering tasks while keeping the neural representations compact.  ...  Relation Networks (RNs) [34] are a neural network architecture designed for reasoning about the relationships between objects.  ...
arXiv:1805.09786v1 fatcat:46gpijmaavbv5pfra3wi54bdmi
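
Not the paper's construction, but a small sketch of the hyperbolic distance such models build on, here in the Poincaré ball model; using it inside attention scores is the assumption being illustrated.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball."""
    sq_diff = np.dot(u - v, u - v)
    denom = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v)) + eps
    return np.arccosh(1.0 + 2.0 * sq_diff / denom)

# Points near the boundary end up very far apart, which is what lets
# hyperbolic space host tree-like, power-law structured data compactly.
print(poincare_distance(np.array([0.0, 0.0]), np.array([0.5, 0.0])))   # ~1.10
print(poincare_distance(np.array([-0.9, 0.0]), np.array([0.9, 0.0])))  # ~5.89
```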

An Overview on the Application of Graph Neural Networks in Wireless Networks [article]

S. He, S. Xiong, Y. Ou, J. Zhang, J. Wang, Y. Huang, Y. Zhang
2021 arXiv   pre-print
To effectively exploit the information of graph-structured data as well as contextual information, graph neural networks (GNNs) have been introduced to address a series of optimization problems of wireless  ...  In recent years, with the rapid enhancement of computing power, deep learning methods have been widely applied in wireless networks and achieved impressive performance.  ...  There are two types of neural networks that can take advantage of the permutation invariance, namely, graph neural networks (GNNs) [11] and deep sets [12].  ...
arXiv:2107.03029v3 fatcat:2hf2gelxrjbwpiwzyjw5cat5wu
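
As a hedged illustration of the permutation invariance the excerpt refers to, the sketch below uses a deep-sets-style readout (sum of per-node transforms) and checks that reordering the nodes leaves the output unchanged; sizes and values are invented.

```python
import numpy as np

def deep_sets_readout(node_features, w_phi, w_rho):
    """Permutation-invariant readout: rho( sum_i phi(x_i) )."""
    phi = np.tanh(node_features @ w_phi)     # per-node transform
    return np.tanh(phi.sum(axis=0) @ w_rho)  # order-independent pooling

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
w_phi, w_rho = rng.normal(size=(3, 8)), rng.normal(size=(8, 2))

perm = rng.permutation(5)
same = np.allclose(deep_sets_readout(x, w_phi, w_rho),
                   deep_sets_readout(x[perm], w_phi, w_rho))
print(same)  # True: reordering the nodes does not change the output
```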

Typed Graph Networks [article]

Marcelo O. R. Prates, Pedro H. C. Avelar, Henrique Lemos, Marco Gori, Luis Lamb
2019 arXiv   pre-print
Architectures of this family have been referred to with several definitions in the literature, such as Graph Neural Networks, Message-passing Neural Networks, Relational Networks and Graph Networks.  ...  To illustrate the generality of the original model, we present a Graph Neural Network formalisation, which partitions the vertices of a graph into a number of types.  ...  We are hopeful that by thinking about graph neural networks in terms of types, these powerful techniques can be understood and effectively employed by a wider audience of deep learning researchers.  ... 
arXiv:1901.07984v3 fatcat:xtcepapm3rakha2r7kj5sqecvq

Diffusion-Convolutional Neural Networks [article]

James Atwood, Don Towsley
2016 arXiv   pre-print
We present diffusion-convolutional neural networks (DCNNs), a new model for graph-structured data.  ...  Through the introduction of a diffusion-convolution operation, we show how diffusion-based representations can be learned from graph-structured data and used as an effective basis for node classification  ...  Other Graph-Based Neural Network Models Other researchers have investigated how CNNs can be extended from grid-structured to more general graph-structured data.  ... 
arXiv:1511.02136v6 fatcat:oslxcpxekndtlfcapumngfrc3a
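
A hedged sketch along the lines the abstract describes: stack powers of the degree-normalized transition matrix applied to the node features. The number of hops and the toy graph are illustrative choices, not the exact DCNN parameterization.

```python
import numpy as np

def diffusion_features(adj, features, num_hops=3):
    """Stack features diffused over 0..num_hops-1 steps of the row-stochastic
    transition matrix P; output shape (num_nodes, num_hops, num_features)."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
    transition = adj / deg
    hops = [features]                       # P^0 X
    for _ in range(num_hops - 1):
        hops.append(transition @ hops[-1])  # P^k X
    return np.stack(hops, axis=1)

adj = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # a 3-node path
x = np.eye(3)                               # one-hot node features for illustration
print(diffusion_features(adj, x).shape)     # (3, 3, 3)
```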

Beyond Graph Neural Networks with Lifted Relational Neural Networks [article]

Gustav Sourek, Filip Zelezny, Ondrej Kuzelka
2020 arXiv   pre-print
We illustrate how this idea can be used for an efficient encoding of a diverse range of existing advanced neural architectures, with a particular focus on Graph Neural Networks (GNNs).  ...  We demonstrate a declarative differentiable programming framework based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode relational learning  ...  Graph Neural Networks (GNN) can be seen as a further extension of the principle to completely irregular graph structures (Bronstein et al., 2017).  ...
arXiv:2007.06286v1 fatcat:s4sssahrovc7tahohktmgkvvmy

NetXplain: Real-time explainability of graph neural networks applied to networking

David Pujol-Perich, José Suárez-Varela, Shihan Xiao, Bo Wu, Albert Cabellos-Aparicio, Pere Barlet-Ros
2021 ITU Journal  
This paper focuses on the explainability of Graph Neural Networks (GNNs) applied to networking. GNNs are a novel DL family with unique properties to generalize over graphs.  ...  However, existing DL-based solutions are often considered black boxes with high inner complexity.  ...  BACKGROUND Graph neural networks are a novel neural network family designed to operate over graph-structured data by capturing and modeling the inherent patterns in a graph.  ...
doi:10.52953/akzh5253 fatcat:lvdlftwe6bddlabj4tsf4skfda

Network Representation [chapter]

Zhiyuan Liu, Yankai Lin, Maosong Sun
2020 Representation Learning for Natural Language Processing  
Network representation learning aims to embed the vertices of a network into low-dimensional dense representations, in which similar vertices in the network should have "close" representations (usually  ...  The representations can be used as vertex features and applied to many network study tasks. In this chapter, we introduce network representation learning algorithms from the past decade.  ...  These works achieved better performance than previous algorithms, which proved the representation power of graph neural networks. Nowak et al.  ...
doi:10.1007/978-981-15-5573-2_8 fatcat:2fljfkgpozhudbgqr7tgv45vxi
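
One simple, hedged way to make "close representations for similar vertices" concrete is a truncated SVD of the adjacency matrix; it stands in for, and is not equivalent to, the algorithms the chapter surveys.

```python
import numpy as np

def svd_embeddings(adj, dim=2):
    """Embed vertices via a rank-`dim` truncated SVD of the adjacency matrix."""
    u, s, _ = np.linalg.svd(adj)
    return u[:, :dim] * s[:dim]          # scale components by singular values

# Two triangles joined by a bridge edge (2-3); purely a toy graph.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
adj = np.zeros((6, 6))
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0

emb = svd_embeddings(adj, dim=2)
# Vertices 0 and 1 play the same structural role, so their embeddings coincide.
print(np.round(emb[0], 3), np.round(emb[1], 3))
```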

Directed hypergraph neural network [article]

Loc Hoang Tran, Linh Hoang Tran
2022 arXiv   pre-print
To deal with irregular data structures, graph convolutional neural networks have been developed by many data scientists.  ...  Among the classic directed graph based semi-supervised learning method, the novel directed hypergraph based semi-supervised learning method, and the novel directed hypergraph neural network method that are  ...  There are two classes of graph convolutional neural networks. The first class of graph convolutional neural network is the spatial-based approach.  ...
arXiv:2008.03626v2 fatcat:uyk2zndaqvarbbrbldny2t7vee

Scalable Power Control/Beamforming in Heterogeneous Wireless Networks with Graph Neural Networks [article]

Xiaochen Zhang, Haitao Zhao, Jun Xiong, Li Zhou, Jibo Wei
2021 arXiv   pre-print
the heterogeneous interference graph neural network (HIGNN) to handle these challenges.  ...  Although superb performance is achieved on small and simple networks, most existing ML-based approaches are confronted with difficulties when heterogeneity occurs and network size expands.  ...  Segarra, “Unfolding WMMSE using graph neural networks for efficient power allocation.”  ...
arXiv:2104.05463v2 fatcat:t4gyp47cyvhzzeemgakz7gecwu
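
As a sketch only (not the HIGNN architecture): a GNN-style power-control step can aggregate features of interfering links and map each link to a transmit-power fraction; the graph, features, and layer sizes below are invented.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gnn_power_control(interference_adj, link_features, w_self, w_neigh, w_out):
    """One message-passing step over an interference graph, then a per-link
    transmit-power fraction in [0, 1]."""
    neigh = interference_adj @ link_features                  # sum over interfering links
    hidden = np.tanh(link_features @ w_self + neigh @ w_neigh)
    return sigmoid(hidden @ w_out)

# Four links with random interference relations and features (illustrative only).
rng = np.random.default_rng(0)
adj = (rng.random((4, 4)) > 0.5).astype(float)
np.fill_diagonal(adj, 0.0)
feats = rng.normal(size=(4, 3))
powers = gnn_power_control(adj, feats,
                           rng.normal(size=(3, 8)),
                           rng.normal(size=(3, 8)),
                           rng.normal(size=(8, 1)))
print(powers.ravel())  # one power fraction per link
```
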
Showing results 1 — 15 out of 103,526 results