17,375 Hits in 6.0 sec

Graph Random Neural Features for Distance-Preserving Graph Representations [article]

Daniele Zambon, Cesare Alippi, Lorenzo Livi
2020 arXiv pre-print
We present Graph Random Neural Features (GRNF), a novel embedding method from graph-structured data to real vectors based on a family of graph neural networks. ... The embedding naturally deals with graph isomorphism and preserves the metric structure of the graph domain, in probability. ... Acknowledgements: This research is funded by the Swiss National Science Foundation project 200021 172671: "ALPSFORT: A Learning graPh-baSed framework FOr cybeR-physical sysTems." The work of L. ...
arXiv:1909.03790v3 fatcat:2eink7sbbrhhdjsmnuszm3kd7e

Fast Sequence-Based Embedding with Diffusion Graphs [article]

Benedek Rozemberczki, Rik Sarkar
2020 arXiv pre-print
Vertex sequence-based embedding procedures use features extracted from linear sequences of nodes to create embeddings using a neural network. ... A graph embedding is a representation of graph vertices in a low-dimensional space, which approximately preserves properties such as distances between nodes. ... In particular, it scales better with increasing density (vertex degrees) of graphs. Our experiments also show that the embedding preserves graph distances to a high accuracy. ...
arXiv:2001.07463v1 fatcat:ngjsvc3s3fezfndhuadcl5hzoa

Fast Sequence-Based Embedding with Diffusion Graphs [chapter]

Benedek Rozemberczki, Rik Sarkar
2018 Complex Networks IX
Vertex sequence-based embedding procedures use features extracted from linear sequences of vertices to create embeddings using a neural network. ... A graph embedding is a representation of the vertices of a graph in a low-dimensional space, which approximately preserves properties such as distances between nodes. ... Learning an embedding from the features. ...
doi:10.1007/978-3-319-73198-8_9 fatcat:mbg4ctrzhrg75etcqdi676r4tu

Survey of network embedding techniques for social networks

2019 Turkish Journal of Electrical Engineering and Computer Sciences
This required use of kernel functions or graph statistics, making the entire exercise task-dependent. NRL frameworks eliminated the need for feature engineering and made SNA task-independent. ... Due to the scientific interest in this domain there has been a mushrooming of embedding techniques. ... Random-walk co-occurrence preserving techniques: these techniques perform random walks on the graph starting from different nodes in the network. ...
doi:10.3906/elk-1807-333 fatcat:dugob6cllja5jmtdj63a2hzm7i
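
Several of the results above group DeepWalk-style methods by the statistic they preserve: how often two nodes co-occur within a short window on random walks over the graph. As a rough, stdlib-only Python sketch of that statistic (toy graph and function names are illustrative, not code from any listed paper):

```python
import random
from collections import defaultdict

def random_walks(adj, walk_len=5, walks_per_node=10, seed=0):
    """Generate uniform random walks starting from every node.

    adj: dict mapping node -> list of neighbours (a toy stand-in
    for the graphs discussed in the surveys)."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_len:
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

def cooccurrence(walks, window=2):
    """Count how often two nodes appear within `window` steps of each
    other on a walk -- the statistic that skip-gram style embeddings
    (DeepWalk, node2vec) are trained to preserve."""
    counts = defaultdict(int)
    for walk in walks:
        for i, u in enumerate(walk):
            for v in walk[i + 1 : i + 1 + window]:
                counts[(u, v)] += 1
                counts[(v, u)] += 1
    return counts

# A 4-node path graph: 0 - 1 - 2 - 3.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
counts = cooccurrence(random_walks(adj))
# Nearby nodes co-occur often; the endpoints (3 hops apart) never
# fall inside a 2-step window on this graph.
assert counts[(0, 1)] > counts.get((0, 3), 0)
```

A skip-gram model trained on these walks then learns vectors whose inner products reflect the co-occurrence counts, which is why such embeddings approximately preserve proximity in the graph.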

Graph representation learning: a survey

Fenxiao Chen, Yun-Cheng Wang, Bin Wang, C.-C. Jay Kuo
2020 APSIPA Transactions on Signal and Information Processing
Various graph embedding techniques have been developed to convert the raw graph data into a low-dimensional vector representation while preserving the intrinsic graph properties. ... In this review, we first explain the graph embedding task and its challenges. Next, we review a wide range of graph embedding techniques with insights. ... It preserves spatial distances. ...
doi:10.1017/atsip.2020.13 fatcat:lirq3kp25jfilgkf66u2rlkhky

A Tutorial on Network Embeddings [article]

Haochen Chen, Bryan Perozzi, Rami Al-Rfou, Steven Skiena
2018 arXiv pre-print
These representations can be used as features for a wide range of tasks on graphs such as classification, clustering, link prediction, and visualization. ... We first discuss the desirable properties of network embeddings and briefly introduce the history of network embedding algorithms. ... Locally Linear Embedding (LLE): unlike MDS, which preserves pairwise distances between feature vectors, LLE [39] only exploits the local neighborhood of data points and does not attempt to estimate distance ...
arXiv:1808.02590v1 fatcat:ramuqdavczfabb4o7r42kice7q

Graph Representation Learning: A Survey [article]

Fenxiao Chen, Yuncheng Wang, Bin Wang, C.-C. Jay Kuo
2019 arXiv pre-print
Various graph embedding techniques have been developed to convert the raw graph data into a low-dimensional vector representation while preserving the intrinsic graph properties. ... In this review, we first explain the graph embedding task and its challenges. Next, we review a wide range of graph embedding techniques with insights. ... It preserves spatial distances. ...
arXiv:1909.00958v1 fatcat:6wbxy5jjx5ditbiiviwbuqyww4

Machine Learning on Graphs: A Model and Comprehensive Taxonomy [article]

Ines Chami, Sami Abu-El-Haija, Bryan Perozzi, Christopher Ré, Kevin Murphy
2022 arXiv pre-print
Here, we aim to bridge the gap between graph neural networks, network embedding and graph regularization models. ... The first, network embedding (such as shallow graph embedding or graph auto-encoders), focuses on learning unsupervised representations of relational structure. ... in a multi-step pipeline where random walks are first generated from the graph and then used to learn embeddings. ...
arXiv:2005.03675v3 fatcat:6eoicgprdvfbze732nsmpaumqe

Network Representation Learning: From Traditional Feature Learning to Deep Learning

Ke Sun, Lei Wang, Bo Xu, Wenhong Zhao, Shyh Wei Teng, Feng Xia
2020 IEEE Access
The transductive graph embedding is applied for predicting class labels and graph context based on the input features of observed labeled data and embeddings extracted from the graph structure. ... Node features can be obtained from the summation of neighbor features, so that the algorithm has the ability to preserve the locally linear structure of the neighborhood. ...
doi:10.1109/access.2020.3037118 fatcat:kca6htfarjdjpmtwcvbsppfzui

Survey on graph embeddings and their applications to machine learning problems on graphs

Ilya Makarov, Dmitrii Kiselev, Nikita Nikitinsky, Lovro Subelj
2021 PeerJ Computer Science
So-called graph embeddings provide a powerful tool to construct vectorized feature spaces for graphs and their components, such as nodes, edges and subgraphs, while preserving inner graph properties. ... First, we start with the methodological approach and extract three types of graph embedding models, based on matrix factorization, random walks and deep learning approaches. ... IsoMap and LLE were proposed to model global structure while preserving local distances or sampling from the local neighborhood of nodes. ...
doi:10.7717/peerj-cs.357 pmid:33817007 pmcid:PMC7959646 fatcat:ntronyrbgfbedez5dks6h4hoq4

Isometric Graph Neural Networks [article]

Matthew Walker, Bo Yan, Yiou Xiao, Yafei Wang, Ayan Acharya
2020 arXiv pre-print
... that the learned embeddings do account for graph distances. ... Geometric techniques to extract such representations have poor scaling over large graph size, and recent advances in Graph Neural Network (GNN) algorithms have limited ability to reflect graph distance ... Therefore, learning an embedding from such a representation of the features may not preserve the distances in the graph. ...
arXiv:2006.09554v1 fatcat:jytiaunwt5aqhaeuj4u5o6y42e
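
The "Isometric Graph Neural Networks" entry frames distance preservation as keeping embedding distances close to graph (hop) distances. A minimal stdlib-only sketch of how that property can be measured, assuming a toy graph and a hypothetical 1-D embedding (names are illustrative, not from the paper):

```python
from collections import deque

def bfs_distances(adj, source):
    """Single-source shortest-path (hop) distances via BFS."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def distortion(adj, emb):
    """Worst-case multiplicative distortion of a 1-D embedding
    relative to hop distances -- the quantity an isometric
    embedding drives toward 1."""
    worst = 1.0
    for u in adj:
        for v, duv in bfs_distances(adj, u).items():
            if duv == 0:
                continue
            ratio = abs(emb[u] - emb[v]) / duv
            worst = max(worst, ratio, 1.0 / ratio)
    return worst

# Path graph 0-1-2-3; placing node i at coordinate i is exactly isometric.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
emb = {0: 0.0, 1: 1.0, 2: 2.0, 3: 3.0}
assert distortion(adj, emb) == 1.0
```

An isometric embedding achieves distortion exactly 1; ordinary embeddings are judged by how far above 1 they land, which is the kind of criterion the distance-preserving papers in this listing optimize.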

Semantic preserving embeddings for multi-relational graphs

Pedro Almagro Blanco, Fernando Sancho Caparrini
2017 2017 Computing Conference
It shows how vector representations that maintain semantic and topological features of the original data can be obtained from neural encoding architectures and by considering the topological properties of the graph. ... In this way, we are interested in finding embeddings that can reflect, within the vector space features (distance, linearity, clustering, etc.), some semantic features of the original graph. ...
doi:10.1109/sai.2017.8252079 fatcat:dvu6u7jvl5fqjdar4nvgdioz4m

CommPOOL: An Interpretable Graph Pooling Framework for Hierarchical Graph Representation Learning [article]

Haoteng Tang, Guixiang Ma, Lifang He, Heng Huang, Liang Zhan
2020 arXiv pre-print
Recent years have witnessed the emergence and flourishing of hierarchical graph pooling neural networks (HGPNNs), which are effective graph representation learning approaches for graph-level tasks such ... baseline methods, and its effectiveness in capturing and preserving the community structure of graphs. ... Graph Neural Network: the Graph Neural Network (GNN) is an effective message-passing architecture for embedding the graph nodes and their local structures. ...
arXiv:2012.05980v1 fatcat:toelhmu3mjb6ngtn3nnb56fz7e

Machine Learning for Scent: Learning Generalizable Perceptual Representations of Small Molecules [article]

Benjamin Sanchez-Lengeling, Jennifer N. Wei, Brian K. Lee, Richard C. Gerkin, Alán Aspuru-Guzik, Alexander B. Wiltschko
2019 arXiv pre-print
Additional analysis shows that the learned embeddings from graph neural networks capture a meaningful odor space representation of the underlying relationship between structure and odor, as demonstrated ... We propose the use of graph neural networks for QSOR, and show they significantly outperform prior methods on a novel data set labeled by olfactory experts. ... Each Graph Neural Network (GNN) layer, here represented as different colors, transforms the features from the previous layer. ...
arXiv:1910.10685v2 fatcat:2smeyl4bargkvfjx5jelubgwbi

Learning Graph-Level Representations with Recurrent Neural Networks [article]

Yu Jin, Joseph F. JaJa
2018 arXiv pre-print
Graph nodes are mapped into node sequences sampled by random walk approaches, approximated by the Gumbel-Softmax distribution. ... The majority of these methods start by embedding the graph nodes into a low-dimensional vector space, followed by using some scheme to aggregate the node embeddings. ... Figure 1: Graph recurrent neural network model to learn graph-level representations. Step 1: node embeddings are learned from the graph structures and node features over the entire training samples. ...
arXiv:1805.07683v4 fatcat:lsrbrfswtjejzcdbtggm77sa7y
Showing results 1 — 15 out of 17,375 results