18,005 Hits in 4.9 sec

Node Similarity Preserving Graph Convolutional Networks [article]

Wei Jin, Tyler Derr, Yiqi Wang, Yao Ma, Zitao Liu, Jiliang Tang
2021 arXiv   pre-print
Specifically, to balance information from graph structure and node features, we propose a feature similarity preserving aggregation which adaptively integrates graph structure and node features.  ...  This has motivated the proposed framework SimP-GCN, which can effectively and efficiently preserve node similarity while exploiting graph structure.  ...  Hence, we are prompted to develop a new graph neural network capable of preserving node similarity in the original feature space.  ...
arXiv:2011.09643v2 fatcat:tpacfkgf3bctvddecr5ukqrvii
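
The aggregation described in this snippet mixes the observed graph with a graph built from node-feature similarity. Below is a minimal numpy sketch of that general idea, assuming a kNN cosine-similarity feature graph and a fixed scalar balance coefficient; SimP-GCN itself weighs the two sources adaptively, and the function and variable names here are illustrative, not the authors' code.

    import numpy as np

    def normalize_adj(A):
        # symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2
        A_hat = A + np.eye(A.shape[0])
        d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
        return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    def knn_feature_graph(X, k=2):
        # kNN graph from cosine similarity of node features
        Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
        S = Xn @ Xn.T
        np.fill_diagonal(S, -np.inf)              # never pick the node itself
        A_f = np.zeros_like(S)
        for i in range(S.shape[0]):
            A_f[i, np.argsort(S[i])[-k:]] = 1.0   # k most feature-similar nodes
        return np.maximum(A_f, A_f.T)             # symmetrize

    def similarity_preserving_layer(A, X, W, balance=0.5):
        # propagate over a mixture of the structure graph and the feature-kNN graph;
        # `balance` is a fixed scalar purely for illustration (the snippet's
        # aggregation integrates the two sources adaptively)
        P = balance * normalize_adj(A) + (1.0 - balance) * normalize_adj(knn_feature_graph(X))
        return np.maximum(P @ X @ W, 0)           # ReLU

    # toy run: 4 nodes, 3 input features, 2 output features
    A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
    rng = np.random.default_rng(0)
    X, W = rng.normal(size=(4, 3)), rng.normal(size=(3, 2))
    print(similarity_preserving_layer(A, X, W).shape)   # (4, 2)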

Learning Asymmetric Embedding for Attributed Networks via Convolutional Neural Network [article]

Mohammadreza Radmanesh, Hossein Ghorbanzadeh, Ahmad Asgharian Rezaei, Mahdi Jalili, Xinghuo Yu
2022 arXiv   pre-print
Here, we propose a novel deep asymmetric attributed network embedding model based on convolutional graph neural network, called AAGCN.  ...  The main idea is to maximally preserve the asymmetric proximity and asymmetric similarity of directed attributed networks.  ...  Here we incorporate the network topology and node features to represent non-linear relations between network nodes.  ... 
arXiv:2202.06307v1 fatcat:5s4nptehkrgjrel2rlwbjyv5iy
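
"Asymmetric proximity" in a directed network means the score for u -> v need not equal the score for v -> u, which is commonly handled by giving every node two embeddings, a source role and a target role. The snippet above preserves such asymmetry with a graph convolutional encoder; the fragment below only illustrates the two-role idea itself, with made-up names, and is not the AAGCN model.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 5, 4                      # 5 nodes, 4-dimensional embeddings (toy sizes)
    S = rng.normal(size=(n, d))      # source-role embeddings
    T = rng.normal(size=(n, d))      # target-role embeddings

    def directed_proximity(u, v):
        # score for the directed pair u -> v; in general it differs from v -> u
        return float(S[u] @ T[v])

    print(round(directed_proximity(0, 1), 3), round(directed_proximity(1, 0), 3))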

Community-preserving Graph Convolutions for Structural and Functional Joint Embedding of Brain Networks [article]

Jiahao Liu, Guixiang Ma, Fei Jiang, Chun-Ta Lu, Philip S. Yu, Ann B. Ragin
2019 arXiv   pre-print
In this paper, we propose a framework of Siamese community-preserving graph convolutional network (SCP-GCN) to learn the structural and functional joint embedding of brain networks.  ...  Specifically, we use graph convolutions to learn the structural and functional joint embedding, where the graph structure is defined with structural connectivity and node features are from the functional  ...  the Siamese community-preserving graph convolutional network.  ... 
arXiv:1911.03583v1 fatcat:ahuskrkwyjdfrabpx23lrlwfkm

K-Core based Temporal Graph Convolutional Network for Dynamic Graphs [article]

Jingxin Liu, Chang Xu, Chang Yin, Weiqiang Wu, You Song
2020 arXiv   pre-print
Inspired by the success of graph convolutional networks(GCNs) in static graph embedding, we propose a novel k-core based temporal graph convolutional network, the CTGCN, to learn node representations for  ...  In contrast to previous dynamic graph embedding methods, CTGCN can preserve both local connective proximity and global structural similarity while simultaneously capturing graph dynamics.  ...  convolutional network, namely, the CTGCN, to preserve both connective proximity and structural similarity in dynamic graphs.  ... 
arXiv:2003.09902v3 fatcat:6apuy6vp5zalhphkihas7rab5m
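
The CTGCN snippet builds on k-core decomposition. As a quick reminder of what a k-core is, here is a small networkx example (networkx provides core_number and k_core); how the cores are fed into the temporal GCN is, of course, specific to the paper.

    import networkx as nx

    # toy graph: a triangle {0, 1, 2} with a path 2-3-4 hanging off it
    G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 4)])

    # core number = the largest k for which the node still belongs to the k-core
    print(nx.core_number(G))                  # {0: 2, 1: 2, 2: 2, 3: 1, 4: 1}

    # the k-core itself: maximal subgraph where every node has degree >= k inside it
    print(sorted(nx.k_core(G, k=2).nodes()))  # [0, 1, 2]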

Multiplex Network Embedding Model with High-Order Node Dependence

Nianwen Ning, Qiuyue Li, Kai Zhao, Bin Wu, Shenghua Liu
2021 Complexity  
In the intralayer embedding phase, we present a symmetric graph convolution-deconvolution model to embed high-order proximity information as the intralayer embedding of nodes in an unsupervised manner.  ...  To address this problem, we propose a novel multiplex network embedding model with high-order node dependence, called HMNE.  ...  Preserving High-Order Proximity 4.1.1. Graph Convolution.  ... 
doi:10.1155/2021/6644111 fatcat:qsxs4kyoqrgv5ne2xpuns4qsru
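
The "symmetric graph convolution-deconvolution" phrase suggests an encoder-decoder pair trained without labels: convolve to compress node attributes into an embedding, then mirror the operation to reconstruct them. A rough numpy sketch of that unsupervised autoencoding pattern, with a simple mean-neighbor aggregation standing in for the paper's exact operators:

    import numpy as np

    def mean_aggregate(A, H, W):
        # one propagation step: average each node with its neighbors, then transform (ReLU)
        A_hat = A + np.eye(A.shape[0])                   # include the node itself
        P = A_hat / A_hat.sum(axis=1, keepdims=True)     # row-normalized (mean) aggregation
        return np.maximum(P @ H @ W, 0)

    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
    X = rng.normal(size=(4, 6))                          # node attributes

    W_enc = rng.normal(size=(6, 2))                      # "convolution": compress to a 2-d embedding
    W_dec = rng.normal(size=(2, 6))                      # "deconvolution": expand back to attributes

    Z = mean_aggregate(A, X, W_enc)                      # candidate intralayer node embeddings
    X_rec = mean_aggregate(A, Z, W_dec)                  # reconstruction for an unsupervised loss
    print(Z.shape, round(float(np.mean((X - X_rec) ** 2)), 3))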

Distilling Knowledge from Graph Convolutional Networks [article]

Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang
2021 arXiv   pre-print
Existing knowledge distillation methods focus on convolutional neural networks (CNNs), where the input samples like images lie in a grid domain, and have largely overlooked graph convolutional networks  ...  Moreover, the proposed approach is readily extendable to dynamic graph models, where the input graphs for the teacher and the student may differ.  ...  Graph Convolutional Network.  ... 
arXiv:2003.10477v4 fatcat:vtwlv7dsjrefxhhxmwtqbufrdu

CoANE: Modeling Context Co-occurrence for Attributed Network Embedding [article]

I-Chung Hsieh, Cheng-Te Li
2021 arXiv   pre-print
Attributed network embedding (ANE) aims to learn low-dimensional vectors so that not only the network structure but also node attributes are preserved in the embedding space.  ...  To better encode structural and semantic knowledge of nodes, we devise a three-way objective function, consisting of positive graph likelihood, contextual negative sampling, and attribute reconstruction  ...  The Graph Convolutional Network (GCN) [14] was proposed to adopt spectral graph convolution for semi-supervised node classification.  ...
arXiv:2106.09241v1 fatcat:mwdbq3wqojat5ggdh34yodream
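
The three-way objective named in the snippet can be read as a weighted sum of (i) a likelihood term for observed node pairs, (ii) a negative-sampling term, and (iii) an attribute-reconstruction term. The sketch below only illustrates how such terms are commonly combined; the weights, similarity function, and reconstruction form are placeholders rather than CoANE's actual definitions.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def three_way_loss(z_u, z_pos, z_negs, x_u, x_rec, alpha=1.0, beta=1.0):
        # z_u, z_pos : embeddings of a node and one observed (positive) context node
        # z_negs     : embeddings of sampled negative nodes, shape (num_neg, dim)
        # x_u, x_rec : original and reconstructed attribute vectors of the node
        pos_ll = -np.log(sigmoid(z_u @ z_pos) + 1e-12)              # positive graph likelihood
        neg_ll = -np.sum(np.log(sigmoid(-(z_negs @ z_u)) + 1e-12))  # contextual negative sampling
        rec    = np.sum((x_u - x_rec) ** 2)                         # attribute reconstruction
        return pos_ll + alpha * neg_ll + beta * rec

    rng = np.random.default_rng(0)
    loss = three_way_loss(rng.normal(size=4), rng.normal(size=4),
                          rng.normal(size=(5, 4)), rng.normal(size=8), rng.normal(size=8))
    print(round(float(loss), 3))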

DANE: Domain Adaptive Network Embedding [article]

Yizhou Zhang, Guojie Song, Lun Du, Shuwen Yang, Yilun Jin
2019 arXiv   pre-print
In this paper, we propose a novel Domain Adaptive Network Embedding framework, which applies graph convolutional network to learn transferable embeddings.  ...  In DANE, nodes from multiple networks are encoded to vectors via a shared set of learnable parameters so that the vectors share an aligned embedding space.  ...  DANE consists of two major components: (a) shared weight graph convolutional network (SWGCN) projects the nodes from two networks into a shared embedding space and preserve cross-network similarity; (b  ... 
arXiv:1906.00684v2 fatcat:5tmdx55hvjg2to7s6ie44ef454
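
The SWGCN component described above hinges on one detail: the same GCN parameters are applied to both networks, so their node embeddings land in a comparable space. A minimal sketch of that weight sharing, with toy graphs of different sizes; DANE's adversarial alignment of the embedding distributions is omitted here.

    import numpy as np

    def shared_gcn_layer(A, X, W):
        # one GCN layer; the point is that the SAME W is reused for every input graph
        A_hat = A + np.eye(A.shape[0])
        d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
        P = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
        return np.maximum(P @ X @ W, 0)

    rng = np.random.default_rng(0)
    W_shared = rng.normal(size=(5, 3))          # one parameter set shared by both networks

    # two networks of different sizes but with the same attribute dimensionality (5)
    A_src = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
    X_src = rng.normal(size=(4, 5))
    A_tgt = np.ones((6, 6)) - np.eye(6)         # a complete graph on 6 nodes
    X_tgt = rng.normal(size=(6, 5))

    Z_src = shared_gcn_layer(A_src, X_src, W_shared)   # source-network embeddings
    Z_tgt = shared_gcn_layer(A_tgt, X_tgt, W_shared)   # target-network embeddings, same space
    print(Z_src.shape, Z_tgt.shape)                    # (4, 3) (6, 3)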

Distilling Knowledge From Graph Convolutional Networks

Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang
2020 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
As we can see, the model trained with LSP learns a structure similar to that of the teacher, while the model without LSP fails to do so.  ...  Top Row: structures obtained from the teacher; Middle Row: structures obtained from the student trained with the local structure preserving (LSP) module; Bottom Row: structures obtained from the student  ...  Graph Convolutional Network.  ...
doi:10.1109/cvpr42600.2020.00710 dblp:conf/cvpr/YangQSTW20 fatcat:nzswe5fls5brhgj2n3lof7zjmy
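
The local structure preserving (LSP) idea mentioned in the snippet can be phrased as: for each node, form a distribution over its neighbors from embedding similarities, and train the student so its distributions match the teacher's. The sketch below follows that reading, using a softmax over negative squared distances and a KL term; the paper's exact kernel and loss should be taken from the paper itself.

    import numpy as np

    def local_structure(Z, neighbors):
        # for each node: a softmax distribution over similarities to its neighbors
        dists = []
        for u, nbrs in neighbors.items():
            sims = np.array([-np.sum((Z[u] - Z[v]) ** 2) for v in nbrs])  # negative squared distance
            e = np.exp(sims - sims.max())
            dists.append(e / e.sum())
        return dists

    def lsp_loss(Z_teacher, Z_student, neighbors):
        # KL divergence between teacher and student local-structure distributions
        P = local_structure(Z_teacher, neighbors)
        Q = local_structure(Z_student, neighbors)
        return sum(float(np.sum(p * np.log((p + 1e-12) / (q + 1e-12)))) for p, q in zip(P, Q))

    rng = np.random.default_rng(0)
    neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}   # adjacency lists of a toy graph
    Z_t = rng.normal(size=(4, 8))     # teacher embeddings (wider)
    Z_s = rng.normal(size=(4, 4))     # student embeddings (narrower); widths may differ
    print(round(lsp_loss(Z_t, Z_s, neighbors), 3))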

Machine Learning on Graphs: A Model and Comprehensive Taxonomy [article]

Ines Chami, Sami Abu-El-Haija, Bryan Perozzi, Christopher Ré, Kevin Murphy
2022 arXiv   pre-print
GraphSage, Graph Convolutional Networks, Graph Attention Networks), and unsupervised learning of graph representations (e.g. DeepWalk, node2vec, etc) into a single consistent approach.  ...  Here, we aim to bridge the gap between graph neural networks, network embedding and graph regularization models.  ...  On the other hand, structure-aware embeddings capture local structural information about nodes in a graph, i.e. nodes with similar node features or similar structural roles in a network should have similar  ... 
arXiv:2005.03675v3 fatcat:6eoicgprdvfbze732nsmpaumqe

Capturing High-order Structures on Motif-based Graph Neural Networks [article]

Kejia Zhang
2022 arXiv   pre-print
However, the local topology of many nodes in a network is similar, the representation obtained by shallow embedding is susceptible to structural noise, and the low-order  ...  Graph Neural Networks (GNNs) are effective models for graph embedding.  ...  A graph convolutional neural network applies deep learning to graph data and obtains node representations by aggregating neighbor information.  ...
arXiv:2205.00867v2 fatcat:q2gj6glk7rcmjluarrkwn3nh7u

Multi-Label Graph Convolutional Network Representation Learning [article]

Min Shi, Yufei Tang, Xingquan Zhu, Jianxun Liu
2019 arXiv   pre-print
In this paper, we propose a novel multi-label graph convolutional network (ML-GCN) for learning node representation for multi-label networks.  ...  To fully explore label-label correlation and network topology structures, we propose to model a multi-label network as two Siamese GCNs: a node-node-label graph and a label-label-node graph.  ...  Graph Convolutional Networks. GCN [22] is a general class of convolutional neural networks that operate directly on graphs for node representation learning and classification by encoding both the graph  ...
arXiv:1912.11757v1 fatcat:2cvhv2nu6jeghm4nrynm5jg3uq
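
The snippet's characterization of GCN, encoding graph structure and node features together, corresponds to the standard propagation rule H^(l+1) = ReLU(D^-1/2 (A + I) D^-1/2 H^(l) W^(l)) of Kipf and Welling. A compact numpy rendering, with toy shapes chosen only for illustration:

    import numpy as np

    def gcn_layer(A, H, W):
        # H_next = ReLU( D^-1/2 (A + I) D^-1/2  @  H  @  W )
        A_tilde = A + np.eye(A.shape[0])                 # add self-loops
        d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
        A_norm = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
        return np.maximum(A_norm @ H @ W, 0)

    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 0, 0, 1],
                  [1, 0, 1, 0, 0],
                  [0, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [1, 0, 0, 1, 0]], float)               # a 5-node cycle
    H0 = rng.normal(size=(5, 4))                         # initial node features
    H1 = gcn_layer(A, H0, rng.normal(size=(4, 8)))       # structure + features mixed once
    H2 = gcn_layer(A, H1, rng.normal(size=(8, 3)))       # e.g. 3 class scores per node
    print(H2.shape)                                      # (5, 3)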

iPool – Information-based Pooling in Hierarchical Graph Neural Networks [article]

Xing Gao, Hongkai Xiong, Pascal Frossard
2019 arXiv   pre-print
The majority of these works, however, focus on adapting the convolution operator to graph representation.  ...  This new criterion determines how nodes are selected and how coarsened graphs are constructed in the pooling layer.  ...  Notably, the hyper-parameter s plays a role similar to that of the stride hyper-parameter in  ...  Figure 2: Hierarchical graph convolution network architecture.  ...
arXiv:1907.00832v2 fatcat:ulyngrifq5a3bjkqbas4azxpdu
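
iPool's information-based selection criterion is specific to the paper, but the surrounding snippet describes the generic pooling mechanics: score nodes, keep a fraction of them (the ratio acting roughly like a stride), and induce a coarsened graph on the survivors. A generic top-k pooling sketch with a placeholder scoring rule:

    import numpy as np

    def topk_pool(A, X, ratio=0.5):
        # score nodes, keep the top `ratio` fraction, and induce the coarsened adjacency;
        # the scoring rule (feature norm) is a placeholder, not iPool's information criterion
        scores = np.linalg.norm(X, axis=1)
        k = max(1, int(round(ratio * A.shape[0])))
        keep = np.sort(np.argsort(scores)[-k:])          # indices of retained nodes
        return A[np.ix_(keep, keep)], X[keep], keep

    rng = np.random.default_rng(0)
    A = (rng.random((6, 6)) < 0.4).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                          # random symmetric adjacency, no self-loops
    X = rng.normal(size=(6, 3))

    A_c, X_c, kept = topk_pool(A, X, ratio=0.5)          # keeps about half the nodes, stride-like
    print(kept, A_c.shape, X_c.shape)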

Locality Preserving Dense Graph Convolutional Networks with Graph Context-Aware Node Representations [article]

Wenfeng Liu, Maoguo Gong, Zedong Tang, A. K. Qin
2020 arXiv   pre-print
Graph convolutional networks (GCNs) have been widely used for representation learning on graph data, which can capture structural patterns on a graph via specifically designed convolution and readout operations  ...  In this work, we propose a locality-preserving dense GCN with graph context-aware node representations.  ...  Conclusion and Future Work In this paper, we develop a locality preserving dense graph convolutional network architecture with graph context-aware node representations for graph classification.  ... 
arXiv:2010.05404v1 fatcat:mslxe7uxt5fcrbrmntgqdt2oxy

Hashing Graph Convolution for Node Classification

Wenting Zhao, Zhen Cui, Chunyan Xu, Chengzheng Li, Tong Zhang, Jian Yang
2019 Proceedings of the 28th ACM International Conference on Information and Knowledge Management - CIKM '19  
their orders but also makes a graph convolution run like the standard shape-gridded convolution.  ...  Convolution on graphs has aroused great interest in AI due to its potential applications to non-gridded data.  ...  Based on the assumption that connected nodes should be similar, graph convolution is, in essence, a smoothing strategy, which aggregates the features of neighbors at the reference vertex.  ...
doi:10.1145/3357384.3357922 dblp:conf/cikm/ZhaoCXLZ019 fatcat:lp3wcblwtjestcqdmx7swd5zey
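
The snippet frames graph convolution as aggregating neighbor features at a reference vertex, and the hashing idea suggested by the title and first fragment is to map a variable-sized neighborhood into a fixed number of buckets so a grid-like, fixed-size filter can be applied. A rough sketch of that idea with a placeholder hash (node id modulo bucket count), not the paper's hash function:

    import numpy as np

    def hashed_neighborhood(X, nbrs, n_buckets=4):
        # aggregate a variable-sized neighborhood into a fixed number of buckets so a
        # fixed-size (grid-like) filter can be applied regardless of the vertex degree;
        # the bucket assignment (node id modulo n_buckets) is a placeholder hash
        buckets = np.zeros((n_buckets, X.shape[1]))
        counts = np.zeros(n_buckets)
        for v in nbrs:
            b = v % n_buckets
            buckets[b] += X[v]
            counts[b] += 1
        counts[counts == 0] = 1.0                        # avoid division by zero for empty buckets
        return buckets / counts[:, None]                 # mean feature per bucket

    rng = np.random.default_rng(0)
    X = rng.normal(size=(7, 3))                          # 7 nodes, 3 features each
    nbrs_of_0 = [1, 2, 4, 5, 6]                          # neighborhood of reference vertex 0

    B = hashed_neighborhood(X, nbrs_of_0)                # shape (4, 3), independent of degree
    w = rng.normal(size=B.shape)                         # a fixed-size filter, as on a grid
    print(round(float(np.sum(B * w)), 3))                # one output value for vertex 0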
Showing results 1 — 15 out of 18,005 results