25,618 Hits in 5.1 sec

Graph Random Neural Features for Distance-Preserving Graph Representations [article]

Daniele Zambon, Cesare Alippi, Lorenzo Livi
2020 arXiv   pre-print
We present Graph Random Neural Features (GRNF), a novel method for embedding graph-structured data into real vectors, based on a family of graph neural networks.  ...  The embedding naturally deals with graph isomorphism and preserves the metric structure of the graph domain, in probability.  ...  Acknowledgements: This research is funded by the Swiss National Science Foundation project 200021 172671: "ALPSFORT: A Learning graPh-baSed framework FOr cybeR-physical sysTems."  ...
arXiv:1909.03790v3 fatcat:2eink7sbbrhhdjsmnuszm3kd7e

Fast Sequence-Based Embedding with Diffusion Graphs [article]

Benedek Rozemberczki, Rik Sarkar
2020 arXiv   pre-print
A graph embedding is a representation of graph vertices in a low-dimensional space, which approximately preserves properties such as distances between nodes.  ...  Vertex sequence-based embedding procedures use features extracted from linear sequences of nodes to create embeddings using a neural network.  ...  Acknowledgements: Benedek Rozemberczki was supported by the Centre for Doctoral Training in Data Science, funded by EPSRC (grant EP/L016427/1).  ... 
arXiv:2001.07463v1 fatcat:ngjsvc3s3fezfndhuadcl5hzoa

CommPOOL: An Interpretable Graph Pooling Framework for Hierarchical Graph Representation Learning [article]

Haoteng Tang, Guixiang Ma, Lifang He, Heng Huang, Liang Zhan
2020 arXiv   pre-print
Recent years have witnessed the emergence and flourishing of hierarchical graph pooling neural networks (HGPNNs), which are effective graph representation learning approaches for graph-level tasks such  ...  In this paper, we propose a new interpretable graph pooling framework, CommPOOL, which can capture and preserve the hierarchical community structure of graphs in the graph representation learning process.  ...  A Graph Neural Network (GNN) is an effective message-passing architecture for embedding graph nodes and their local structures.  ...
arXiv:2012.05980v1 fatcat:toelhmu3mjb6ngtn3nnb56fz7e
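The CommPOOL snippet above describes GNNs as message-passing architectures that embed nodes and their local structures. As a generic, hypothetical illustration (not the CommPOOL implementation), one round of neighborhood mean-aggregation on a toy graph can be sketched as:

```python
# One round of mean-aggregation message passing on a toy graph.
# Generic sketch of the message-passing idea, not CommPOOL itself.

def message_passing_round(adj, features):
    """adj: {node: [neighbors]}, features: {node: float}.
    Each node's new feature is the mean of its own and its
    neighbors' current features."""
    new_features = {}
    for node, neighbors in adj.items():
        vals = [features[node]] + [features[n] for n in neighbors]
        new_features[node] = sum(vals) / len(vals)
    return new_features

# Triangle graph: every node is adjacent to the other two.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
feats = {0: 0.0, 1: 3.0, 2: 6.0}
print(message_passing_round(adj, feats))  # -> {0: 3.0, 1: 3.0, 2: 3.0}
```

Stacking several such rounds (with learned weights instead of a plain mean) is what lets a GNN encode progressively larger neighborhoods around each node.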

Fast Sequence-Based Embedding with Diffusion Graphs [chapter]

Benedek Rozemberczki, Rik Sarkar
2018 Complex Networks IX  
A graph embedding is a representation of the vertices of a graph in a low-dimensional space, which approximately preserves properties such as distances between nodes.  ...  Vertex sequence-based embedding procedures use features extracted from linear sequences of vertices to create embeddings using a neural network.  ...  Sequence-based graph embedding methods, on the other hand, obtain their vertex sequences by random walks on graphs and then apply analogous neural network methods for the embedding.  ...
doi:10.1007/978-3-319-73198-8_9 fatcat:mbg4ctrzhrg75etcqdi676r4tu
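The sequence-based methods described in the two entries above share a common first step: generating vertex sequences by random walks, which a skip-gram model (e.g. word2vec) then consumes to produce embeddings. A minimal sketch of that walk-generation step, as a generic assumption rather than the paper's Diffusion Graphs algorithm:

```python
import random

def random_walk(adj, start, length, rng):
    """Uniform random walk of `length` steps over adjacency dict `adj`."""
    walk = [start]
    for _ in range(length):
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

# 4-cycle: 0-1-2-3-0
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
rng = random.Random(42)

# Two walks of 5 steps from each vertex; each walk is a "sentence"
# of nodes that a skip-gram model would train on.
walks = [random_walk(adj, v, 5, rng) for v in adj for _ in range(2)]
```

Diffusion-based variants differ mainly in how these sequences are sampled (diffusion subtrees instead of plain walks), but the downstream neural embedding step is analogous.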

Network Representation Learning: From Traditional Feature Learning to Deep Learning

Ke Sun, Lei Wang, Bo Xu, Wenhong Zhao, Shyh Wei Teng, Feng Xia
2020 IEEE Access  
[Figures 4-6: the Node2vec random walk strategy; the structure of deep network embedding; Deep Neural Graph Learning components.]  ...  Global feature learning mainly focuses on preserving the global information of the data.  ...
doi:10.1109/access.2020.3037118 fatcat:kca6htfarjdjpmtwcvbsppfzui

Graph representation learning: a survey

Fenxiao Chen, Yun-Cheng Wang, Bin Wang, C.-C. Jay Kuo
2020 APSIPA Transactions on Signal and Information Processing  
Various graph embedding techniques have been developed to convert the raw graph data into a low-dimensional vector representation while preserving the intrinsic graph properties.  ...  Research on graph representation learning has received great attention in recent years since most data in real-world applications come in the form of graphs.  ...  It preserves spatial distances.  ... 
doi:10.1017/atsip.2020.13 fatcat:lirq3kp25jfilgkf66u2rlkhky

Graph Representation Learning: A Survey [article]

Fenxiao Chen, Yuncheng Wang, Bin Wang, C.-C. Jay Kuo
2019 arXiv   pre-print
Various graph embedding techniques have been developed to convert the raw graph data into a low-dimensional vector representation while preserving the intrinsic graph properties.  ...  Research on graph representation learning has received a lot of attention in recent years since much data in real-world applications comes in the form of graphs.  ...  It preserves spatial distances.  ...
arXiv:1909.00958v1 fatcat:6wbxy5jjx5ditbiiviwbuqyww4

A Tutorial on Network Embeddings [article]

Haochen Chen, Bryan Perozzi, Rami Al-Rfou, Steven Skiena
2018 arXiv   pre-print
These representations can be used as features for a wide range of tasks on graphs, such as classification, clustering, link prediction, and visualization.  ...  Network embedding methods aim at learning low-dimensional latent representations of nodes in a network.  ...  To solve these problems, HARP [11] proposes a meta-strategy for embedding graph datasets which preserves higher-order structural features.  ...
arXiv:1808.02590v1 fatcat:ramuqdavczfabb4o7r42kice7q

Machine Learning on Graphs: A Model and Comprehensive Taxonomy [article]

Ines Chami, Sami Abu-El-Haija, Bryan Perozzi, Christopher Ré, Kevin Murphy
2022 arXiv   pre-print
The second, graph regularized neural networks, leverages graphs to augment neural network losses with a regularization objective for semi-supervised learning.  ...  There has been a surge of recent interest in learning representations for graph-structured data.  ...
arXiv:2005.03675v3 fatcat:6eoicgprdvfbze732nsmpaumqe

Isometric Graph Neural Networks [article]

Matthew Walker, Bo Yan, Yiou Xiao, Yafei Wang, Ayan Acharya
2020 arXiv   pre-print
...  that the learned embeddings do account for graph distances.  ...  Geometric techniques to extract such representations scale poorly to large graphs, and recent advances in Graph Neural Network (GNN) algorithms have limited ability to reflect graph distance  ...  Therefore, learning an embedding from such a representation of the features may not preserve the distances in the graph.  ...
arXiv:2006.09554v1 fatcat:jytiaunwt5aqhaeuj4u5o6y42e

Survey of network embedding techniques for social networks

2019 Turkish Journal of Electrical Engineering and Computer Sciences  
This required the use of kernel functions or graph statistics, making the entire exercise task-dependent. NRL frameworks eliminated the need for feature engineering and made SNA task-independent.  ...  Before network representation learning methods were proposed, social network analysis (SNA) was performed by means of extensive feature engineering.  ...  The four classes of NRL techniques discussed in the literature are adjacency-preserving methods [9-11, 13-18, 36], multihop distance-preserving methods, neighborhood overlap-preserving methods, and random  ...
doi:10.3906/elk-1807-333 fatcat:dugob6cllja5jmtdj63a2hzm7i

Learning Graph-Level Representations with Recurrent Neural Networks [article]

Yu Jin, Joseph F. JaJa
2018 arXiv   pre-print
Recurrent neural network (RNN) units are modified to accommodate both the node representations and their neighborhood information.  ...  Graph nodes are mapped into node sequences sampled from random walk approaches approximated by the Gumbel-Softmax distribution.  ...
arXiv:1805.07683v4 fatcat:lsrbrfswtjejzcdbtggm77sa7y

Learning Graph Neural Networks with Positive and Unlabeled Nodes [article]

Man Wu, Shirui Pan, Lan Du, Xingquan Zhu
2021 arXiv   pre-print
Graph neural networks (GNNs) are important tools for transductive learning tasks, such as node classification in graphs, due to their expressive power in capturing complex interdependencies between nodes  ...  In this paper, we propose a novel graph neural network framework, long-short distance aggregation networks (LSDAN), to overcome these limitations.  ...  For two-step graph embedding methods, graph-embedding-based algorithms first embed the nodes of a given graph into vector representations, preserving structure, node content, and other side information  ...
arXiv:2103.04683v1 fatcat:7fofa7qkd5ggfdusspij5axwvu

Survey on graph embeddings and their applications to machine learning problems on graphs

Ilya Makarov, Dmitrii Kiselev, Nikita Nikitinsky, Lovro Subelj
2021 PeerJ Computer Science  
Using the constructed feature spaces, many machine learning problems on graphs can be solved via standard frameworks suitable for vectorized feature representations.  ...  So-called graph embeddings provide a powerful tool to construct vectorized feature spaces for graphs and their components, such as nodes, edges, and subgraphs, while preserving intrinsic graph properties.  ...  Rows depict different models, which are grouped by model type: matrix factorization, random walks, and graph neural networks with and without features.  ...
doi:10.7717/peerj-cs.357 pmid:33817007 pmcid:PMC7959646 fatcat:ntronyrbgfbedez5dks6h4hoq4

Using Laplacian Spectrum as Graph Feature Representation [article]

Edouard Pineau
2019 arXiv   pre-print
To circumvent this problem and learn on graphs, a graph feature representation is required.  ...  In particular, we derive bounds for the distance between two GLS that are related to the divergence to isomorphism, a standard, computationally expensive graph divergence.  ...  At the same time, alternatives to GNNs exist and are related to random walk embeddings. In [23], neural networks help to sample paths which preserve significant graph properties.  ...
arXiv:1912.00735v1 fatcat:xxc3o53ocfdjhkgkw5gkf6lfhm
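The entry above uses the graph Laplacian spectrum as a permutation-invariant graph feature. A minimal generic sketch of that idea (not the paper's GLS representation; numpy is assumed): compute the eigenvalues of the combinatorial Laplacian L = D - A and compare two graphs by the distance between their sorted spectra.

```python
import numpy as np

def laplacian_spectrum(adj_matrix):
    """Sorted eigenvalues of the combinatorial Laplacian L = D - A."""
    A = np.asarray(adj_matrix, dtype=float)
    L = np.diag(A.sum(axis=1)) - A  # degree matrix minus adjacency
    return np.sort(np.linalg.eigvalsh(L))

# Single edge (path on 2 nodes): spectrum is [0, 2].
edge = [[0, 1], [1, 0]]

# Spectral distance between two graphs of equal size:
# the triangle (spectrum [0, 3, 3]) vs. the 3-node path ([0, 1, 3]).
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
dist = np.linalg.norm(laplacian_spectrum(triangle) - laplacian_spectrum(path3))
```

Since the spectrum is invariant under node relabeling, isomorphic graphs have identical spectra, which is what makes such a distance a cheap (if lossy) proxy for graph divergence.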
Showing results 1-15 of 25,618