23,939 Hits in 3.0 sec

Self-supervised Contrastive Attributed Graph Clustering [article]

Wei Xia, Quanxue Gao, Ming Yang, Xinbo Gao
2021 arXiv   pre-print
To address these issues, we propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC).  ...  In SCAGC, by leveraging inaccurate clustering labels, a self-supervised contrastive loss, which aims to maximize the similarities of intra-cluster nodes while minimizing the similarities of inter-cluster  ...  We also propose a new self-supervised contrastive loss based on imprecise clustering labels to improve the quality of node representations.  ... 
arXiv:2110.08264v1 fatcat:gcdfo4ulhbhj3ax6e3obpwoxqu
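The intra-/inter-cluster objective described in this abstract resembles a supervised-contrastive loss driven by (possibly noisy) cluster labels. A minimal NumPy illustration of that idea, using a hypothetical helper `cluster_contrastive_loss` rather than SCAGC's exact formulation:

```python
import numpy as np

def cluster_contrastive_loss(z, labels, tau=0.5):
    """SupCon-style sketch: pull same-cluster nodes together in cosine
    space and push different-cluster nodes apart."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize embeddings
    sim = np.exp(z @ z.T / tau)                        # temperature-scaled similarities
    np.fill_diagonal(sim, 0.0)                         # exclude self-pairs
    same = (labels[:, None] == labels[None, :]) & ~np.eye(len(labels), dtype=bool)
    losses = []
    for i in range(len(labels)):
        pos = sim[i, same[i]].sum()                    # mass on intra-cluster pairs
        if pos > 0:
            losses.append(-np.log(pos / sim[i].sum()))
    return float(np.mean(losses))

# toy check: correct cluster labels yield a lower loss than mismatched ones
rng = np.random.default_rng(0)
z = np.vstack([rng.normal(0, 0.1, (8, 16)) + 3, rng.normal(0, 0.1, (8, 16)) - 3])
labels = np.array([0] * 8 + [1] * 8)
assert cluster_contrastive_loss(z, labels) < cluster_contrastive_loss(z, np.array([0, 1] * 8))
```

The loss decreases as intra-cluster similarities dominate each node's similarity mass, which is the effect the snippet describes.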

Graph Self-Supervised Learning: A Survey [article]

Yixin Liu, Ming Jin, Shirui Pan, Chuan Zhou, Yu Zheng, Feng Xia, Philip S. Yu
2022 arXiv   pre-print
Under the umbrella of graph self-supervised learning, we present a timely and comprehensive review of the existing approaches which employ SSL techniques for graph data.  ...  To address these issues, self-supervised learning (SSL), which extracts informative knowledge through well-designed pretext tasks without relying on manual labels, has become a promising and trending learning  ...  Supervised Learning, Unsupervised Learning and Self-Supervised Learning.  ... 
arXiv:2103.00111v4 fatcat:y3zfg4ennnbnhhvmujd5rvltty

NCAGC: A Neighborhood Contrast Framework for Attributed Graph Clustering [article]

Tong Wang, Guanyu Yang, Qijia He, Zhenquan Zhang, Junhua Wu
2022 arXiv   pre-print
In this paper, we propose a Neighborhood Contrast Framework for Attributed Graph Clustering, namely NCAGC, seeking to overcome the aforementioned limitations.  ...  Attributed graph clustering is one of the most fundamental tasks in the graph learning field, the goal of which is to group nodes with similar representations into the same cluster without human annotations  ...  ] introduces multi-scale self-expression layers and a self-supervised method to improve clustering performance. • GRACE [30] is an unsupervised graph embedding method with graph contrastive learning  ... 
arXiv:2206.07897v2 fatcat:bhetrud665fndgsippndgytlga

Self-supervised Learning on Graphs: Contrastive, Generative, or Predictive [article]

Lirong Wu, Haitao Lin, Zhangyang Gao, Cheng Tan, Stan Z. Li
2021 arXiv   pre-print
Latest advances in graph SSL are summarized in a GitHub repository https://github.com/LirongWu/awesome-graph-self-supervised-learning.  ...  Finally, we discuss the technical challenges and potential future directions for improving graph self-supervised learning.  ...  graph level for self-supervised learning.  ... 
arXiv:2105.07342v4 fatcat:iak3xwlx5nci3mlzerxhcylojm

Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [article]

Di Jin, Cuiying Huo, Jianwu Dang, Peican Zhu, Weixiong Zhang, Witold Pedrycz, Lingfei Wu
2022 arXiv   pre-print
Self-supervised contrastive learning has been proposed to address the problem of requiring annotated data by mining intrinsic information hidden within the given data.  ...  Most existing HGNN-based approaches are supervised or semi-supervised learning methods requiring graphs to be annotated, which is costly and time-consuming.  ...  Overview HGCL is a self-supervised contrastive learning approach for heterogeneous graphs.  ... 
arXiv:2205.00256v1 fatcat:w2e7lrsqcjgt3jz74a2ekvl2si

Unsupervised Constrained Community Detection via Self-Expressive Graph Neural Network [article]

Sambaran Bandyopadhyay, Vishal Peter
2021 arXiv   pre-print
Traditionally, GNNs are trained on a semi-supervised or self-supervised loss function and then clustering algorithms are applied to detect communities.  ...  To tackle this problem, we combine the principle of self-expressiveness with the framework of self-supervised graph neural network for unsupervised community detection for the first time in literature.  ...  More recently, unsupervised and self-supervised graph neural networks have been proposed where a reconstruction loss [4, 21] or noise contrastive loss [32, 44] is used to train the networks.  ... 
arXiv:2011.14078v2 fatcat:s73ymcectffujc5bosjupqtlva

Self-Supervised Learning of Graph Neural Networks: A Unified Review [article]

Yaochen Xie, Zhao Xu, Jingtun Zhang, Zhengyang Wang, Shuiwang Ji
2022 arXiv   pre-print
When labeled samples are limited, self-supervised learning (SSL) is emerging as a new paradigm for making use of large amounts of unlabeled samples.  ...  Specifically, we categorize SSL methods into contrastive and predictive models.  ...  M3S [58] and ICF-GCN [59] employ self-training and node clustering to provide self-supervision.  ... 
arXiv:2102.10757v5 fatcat:mau6lbphw5hxjhc7oyejmo2zpu

Graph Communal Contrastive Learning [article]

Bolian Li, Baoyu Jing, Hanghang Tong
2021 arXiv   pre-print
Specifically, the proposed gCooL consists of two components: a Dense Community Aggregation (DeCA) algorithm for community detection and a Reweighted Self-supervised Cross-contrastive (ReSC) training scheme  ...  To address this issue, we propose a novel Graph Communal Contrastive Learning (gCooL) framework to jointly learn the community partition and node representations in an end-to-end fashion.  ...  ) Reweighted Self-supervised Cross-contrastive Training We propose the Reweighted Self-supervised Cross-contrastive (ReSC) training scheme in this section.  ... 
arXiv:2110.14863v1 fatcat:p2f3jqm3vregvpnd77yvfg4fym

CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning [article]

Yanqiao Zhu and Yichen Xu and Feng Yu and Shu Wu and Liang Wang
2020 arXiv   pre-print
In this paper, we present a novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques.  ...  Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision while preserving graph topological structures and node attributive features.  ...  Self-supervised Learning on Graphs by Clustering Typically, GNN models are trained using the classification objective in a supervised manner.  ... 
arXiv:2009.01674v1 fatcat:3wihohgjzrbmpba3jgnoevedii

CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [article]

Di Jin, Luzhi Wang, Yizhen Zheng, Xiang Li, Fei Jiang, Wei Lin, Shirui Pan
2022 arXiv   pre-print
To this end, we propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning in order to calculate the similarity between any two input graph objects.  ...  In addition, existing unsupervised graph similarity learning methods are mainly clustering-based, which ignores the valuable information embodied in graph pairs.  ...  Graph Contrastive Learning Contrastive learning is popular in self-supervised graph representation learning, which aims to learn discriminative representations by comparing positive and negative samples  ... 
arXiv:2205.15083v2 fatcat:c6aqnv27nvcx7c7lkq4tl43nwm

Structural and Semantic Contrastive Learning for Unsupervised Node Representation Learning [article]

Kaize Ding, Yancheng Wang, Yingzhen Yang, Huan Liu
2022 arXiv   pre-print
Graph Contrastive Learning (GCL) recently has drawn much research interest for learning generalizable node representations in a self-supervised manner.  ...  In general, the contrastive learning process in GCL is performed on top of the representations learned by a graph neural network (GNN) backbone, which transforms and propagates the node contextual information  ...  Hence, an attributed graph can be described as G = (X, A) for simplicity. Graph Contrastive Learning.  ... 
arXiv:2202.08480v2 fatcat:7t4xmfyxyjab3jqagzbyjqfvrq
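This snippet describes the common GCL recipe: a GNN backbone produces node representations for two graph views, and a contrastive objective aligns them. A minimal NumPy sketch of the node-level InfoNCE loss typically used in that setting (the helper `info_nce` and its shapes are illustrative assumptions, not this paper's exact loss):

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE across two views: node i in view 1 is positive with
    node i in view 2; every other node in view 2 is a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                       # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))      # positives sit on the diagonal

# identical views are easy positives; opposite views are maximally misaligned
views = np.random.default_rng(1).normal(size=(12, 8))
assert info_nce(views, views) < info_nce(views, -views)
```

In practice `z1` and `z2` would come from the same GNN encoder applied to two augmented (or, as in this paper, structurally and semantically distinct) views of the graph.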

Rumor Detection with Self-supervised Learning on Texts and Social Graph [article]

Yuan Gao, Xiang Wang, Xiangnan He, Huamin Feng, Yongdong Zhang
2022 arXiv   pre-print
In this work, we explore contrastive self-supervised learning on heterogeneous information sources, so as to reveal their relations and characterize rumors better.  ...  Technically, we supplement the main supervised task of detection with an auxiliary self-supervised task, which enriches post representations via post self-discrimination.  ...  Contrastive models are another important branch in self-supervised learning. Michael et al.  ... 
arXiv:2204.08838v1 fatcat:ybmyd4ipxfh3zamwxcd53ha7k4

Prototypical Graph Contrastive Learning [article]

Shuai Lin, Pan Zhou, Zi-Yuan Hu, Shuojia Wang, Ruihui Zhao, Yefeng Zheng, Liang Lin, Eric Xing, Xiaodan Liang
2022 arXiv   pre-print
To mitigate this sampling bias issue, in this paper, we propose a Prototypical Graph Contrastive Learning (PGCL) approach.  ...  Specifically, PGCL models the underlying semantic structure of the graph data via clustering semantically similar graphs into the same group, and simultaneously encourages the clustering consistency for  ...  Concretely, a self-supervised contrastive attributed graph clustering approach [67] is proposed to benefit from imprecise clustering labels for the node classification task.  ... 
arXiv:2106.09645v2 fatcat:35degmdeqrcuzopjgav2yyxhwa

Deep Contrastive Learning for Multi-View Network Embedding [article]

Mengqi Zhang, Yanqiao Zhu, Shu Wu, Liang Wang
2021 arXiv   pre-print
However, most contrastive learning-based methods rely on high-quality graph embeddings and pay little attention to the relationships between different graph views.  ...  Multi-view network embedding aims at projecting nodes in the network to low-dimensional vectors, while preserving their multiple relations and attribute information.  ...  To capture the attributes and structure information together, some others [1, 14] combine graph neural networks and relational reconstruction tasks for self-supervised learning.  ... 
arXiv:2108.08296v1 fatcat:ogwffoabb5htbnkib6fhyzkpju

Augmentation-Free Self-Supervised Learning on Graphs

Namkyeong Lee, Junseok Lee, Chanyoung Park
2022 Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI-22)
Inspired by the recent success of self-supervised methods applied to images, self-supervised learning on graph-structured data has seen rapid growth, especially centered on augmentation-based contrastive  ...  In this paper, we propose a novel augmentation-free self-supervised learning framework for graphs, named AFGRL.  ...  Related Work Recently, motivated by the great success of self-supervised methods on images, contrastive methods have been increasingly adopted to graphs.  ... 
doi:10.1609/aaai.v36i7.20700 fatcat:4g74znhrujgw5mnvinwesv24z4
Showing results 1 — 15 out of 23,939 results