
Self-supervised Auxiliary Learning for Graph Neural Networks via Meta-Learning [article]

Dasol Hwang, Jinyoung Park, Sunyoung Kwon, Kyung-Min Kim, Jung-Woo Ha, Hyunwoo J. Kim
2021 arXiv   pre-print
In this paper, we propose a novel self-supervised auxiliary learning framework to effectively learn graph neural networks.  ...  Moreover, this work is the first study showing that meta-path prediction is beneficial as a self-supervised auxiliary task for heterogeneous graphs.  ...  We introduce meta-path prediction as a self-supervised auxiliary task to improve the representational power of graph neural networks.  ...
arXiv:2103.00771v2 fatcat:rrq6lhmtdze4ljr3yikrfwktvi

Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs [article]

Dasol Hwang, Jinyoung Park, Sunyoung Kwon, Kyung-Min Kim, Jung-Woo Ha, Hyunwoo J. Kim
2021 arXiv   pre-print
In this paper, to learn graph neural networks on heterogeneous graphs, we propose a novel self-supervised auxiliary learning method using meta-paths, which are composite relations of multiple edge types.  ...  Our proposed method learns to learn a primary task by predicting meta-paths as auxiliary tasks. This can be viewed as a type of meta-learning.  ...  To our knowledge, meta-path prediction has not been studied in the context of self-supervised learning for graph neural networks in the literature.  ...
arXiv:2007.08294v5 fatcat:k2s54yfa45efhiglqs7fndgoy4
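The two entries above combine a primary objective with self-supervised meta-path prediction as auxiliary tasks. A minimal, hypothetical sketch of such a weighted loss combination (function name and fixed weights are illustrative only; the papers meta-learn the weighting rather than fixing it):

```python
# Hypothetical sketch: primary task loss plus weighted self-supervised
# auxiliary losses (e.g. meta-path prediction heads on a GNN encoder).
# In the papers above the weights themselves are meta-learned; here they
# are fixed constants purely for illustration.

def combined_loss(primary_loss, aux_losses, aux_weights):
    """Total objective: primary task plus weighted auxiliary tasks."""
    assert len(aux_losses) == len(aux_weights)
    return primary_loss + sum(w * l for w, l in zip(aux_weights, aux_losses))

# Example: a node-classification loss plus two meta-path prediction losses.
total = combined_loss(0.9, aux_losses=[0.4, 0.2], aux_weights=[0.5, 0.5])
```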

Contrastive Meta Learning with Behavior Multiplicity for Recommendation [article]

Wei Wei and Chao Huang and Lianghao Xia and Yong Xu and Jiashu Zhao and Dawei Yin
2022 arXiv   pre-print
In addition, to capture the diverse multi-behavior patterns, we design a contrastive meta network to encode the customized behavior heterogeneity for different users.  ...  To tackle the above challenges, we devise a new model, Contrastive Meta Learning (CML), to maintain dedicated cross-type behavior dependency for different users.  ...  The behavior-aware graph neural architecture with multi-behavior self-supervision brings benefits to heterogeneous relational learning for recommendation.  ...
arXiv:2202.08523v1 fatcat:mvx3u5hxkvh2hekxisptmupyp4

Bootstrapping Informative Graph Augmentation via A Meta Learning Approach [article]

Hang Gao, Jiangmeng Li, Wenwen Qiang, Lingyu Si, Fuchun Sun, Changwen Zheng
2022 arXiv   pre-print
Recent works explore learning graph representations in a self-supervised manner. In graph contrastive learning, benchmark methods apply various graph augmentation approaches.  ...  The objective of the graph augmenter is to encourage our feature extraction network to learn a more discriminative feature representation, which motivates us to propose a meta-learning paradigm.  ...  Introduction Recently, there has been a surge of interest in learning graph representations via self-supervised Graph Neural Network (GNN) approaches.  ...
arXiv:2201.03812v3 fatcat:pokbe4si3nasrnl7d7qptjcdlu

Adaptive Transfer Learning on Graph Neural Networks [article]

Xueting Han, Zhenhuan Huang, Bang An, Jing Bai
2021 arXiv   pre-print
Graph neural networks (GNNs) are widely used to learn powerful representations of graph-structured data.  ...  In addition, we learn the weighting model through meta-learning.  ...  INTRODUCTION Graph neural networks (GNNs) [1, 17] have attracted considerable attention in representation learning for graph-structured data.  ...
arXiv:2107.08765v2 fatcat:ngnexsimgfezjfrvqhrtl4cslq

Meta Propagation Networks for Graph Few-shot Semi-supervised Learning [article]

Kaize Ding, Jianling Wang, James Caverlee, Huan Liu
2022 arXiv   pre-print
Inspired by the extensive success of deep learning, graph neural networks (GNNs) have been proposed to learn expressive node representations and demonstrated promising performance in various graph learning  ...  In this paper, we propose a decoupled network architecture equipped with a novel meta-learning algorithm to solve this problem.  ...  Graph neural networks (GNNs), a family of neural models for learning latent node representations in a graph, have achieved gratifying success in different graph learning tasks (Defferrard, Bresson, and  ... 
arXiv:2112.09810v2 fatcat:723hwqhbaff55lqifrnscmh6ay

Trainable Class Prototypes for Few-Shot Learning [article]

Jianyi Li, Guizhong Liu
2021 arXiv   pre-print
Overall, we solve the few-shot tasks in two phases: meta-training a transferable feature extractor via self-supervised learning and training the prototypes for metric classification.  ...  Also, to avoid the disadvantages that episodic meta-training brings, we adopt non-episodic meta-training based on self-supervised learning.  ...
arXiv:2106.10846v1 fatcat:jejraksfvjep5jacdrosnrrafq
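The entry above classifies queries by their distance to class prototypes in an embedding space. A minimal, hypothetical sketch of prototype construction and nearest-prototype classification (the paper additionally makes the prototypes trainable; here they are plain support-set means for illustration):

```python
import math

def prototypes(support):
    """support: {class: [embedding vectors]} -> per-class mean embeddings."""
    return {c: [sum(dim) / len(vecs) for dim in zip(*vecs)]
            for c, vecs in support.items()}

def classify(query, protos):
    """Assign the query to the class with the nearest prototype (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(protos, key=lambda c: dist(query, protos[c]))
```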

Meta-Learning with Graph Neural Networks: Methods and Applications [article]

Debmalya Mandal, Sourav Medya, Brian Uzzi, Charu Aggarwal
2021 arXiv   pre-print
Graph Neural Networks (GNNs), a generalization of deep neural networks to graph data, have been widely used in various domains, ranging from drug discovery to recommender systems.  ...  Meta-learning has been an important framework for addressing the lack of samples in machine learning, and in recent years, researchers have started to apply meta-learning to GNNs.  ...  Conclusion In this survey, we have performed a comprehensive review of works that combine graph neural networks (GNNs) and meta-learning.  ...
arXiv:2103.00137v3 fatcat:odsdbw34hjazxg43slwvfz7nki
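Many of the methods surveyed in the entry above build on gradient-based meta-learning. A minimal, hypothetical sketch of one first-order MAML-style meta-update on a scalar parameter (names and learning rates are illustrative; the survey covers many variants, including full second-order MAML):

```python
def fomaml_step(theta, task_grads, inner_lr=0.1, outer_lr=0.01):
    """One first-order MAML meta-update.

    task_grads: list of functions g(theta) returning dL_task/dtheta.
    Each task adapts theta with one inner gradient step; the outer
    gradient is approximated by the gradient at the adapted parameters.
    """
    meta_grad = 0.0
    for g in task_grads:
        adapted = theta - inner_lr * g(theta)  # inner-loop adaptation
        meta_grad += g(adapted)                # first-order outer gradient
    return theta - outer_lr * meta_grad / len(task_grads)
```

With two symmetric quadratic tasks (targets 0 and 2), the meta-gradients cancel and theta stays at the midpoint; with a single task, theta moves toward that task's optimum.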

Few-Shot Image Classification via Contrastive Self-Supervised Learning [article]

Jianyi Li, Guizhong Liu
2020 arXiv   pre-print
We solve the few-shot tasks in two phases: meta-training a transferable feature extractor via contrastive self-supervised learning and training a classifier using graph aggregation, self-distillation and  ...  Most previous few-shot learning algorithms are based on meta-training with fake few-shot tasks as training samples, where large labeled base classes are required.  ...
arXiv:2008.09942v1 fatcat:bj7qyiacqnfnfnd4dqeklyj4li
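The contrastive pre-training phase described above typically optimizes an InfoNCE-style objective: pull an anchor toward its augmented positive and push it away from negatives. A minimal, hypothetical sketch for a single anchor (function name and temperature value are illustrative, not the paper's exact formulation):

```python
import math

def info_nce(sim_pos, sim_negs, temperature=0.1):
    """InfoNCE loss for one anchor: negative log-softmax of the
    positive similarity over positive + negative similarities."""
    logits = [sim_pos / temperature] + [s / temperature for s in sim_negs]
    log_denom = math.log(sum(math.exp(x) for x in logits))
    return -(logits[0] - log_denom)
```

When the positive similarity is high relative to the negatives, the loss is small; a positive indistinguishable from one negative yields exactly log 2.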

Predictive and Contrastive: Dual-Auxiliary Learning for Recommendation [article]

Yinghui Tao, Min Gao, Junliang Yu, Zongwei Wang, Qingyu Xiong, Xu Wang
2022 arXiv   pre-print
Self-supervised learning (SSL) has recently achieved outstanding success in recommendation.  ...  Based on this finding, we design two auxiliary tasks that are tightly coupled with the target task (one is predictive and the other is contrastive) towards connecting recommendation with the self-supervision  ...  In light of this, we propose a self-supervised Dual-Auxiliary Learning framework (DUAL) in this paper.  ...
arXiv:2203.03982v2 fatcat:wvqlbmet6vhfzldyvis2nn3774

Self-supervised edge features for improved Graph Neural Network training [article]

Arijit Sehanobish, Neal G. Ravindra, David van Dijk
2020 arXiv   pre-print
In this work, we present a framework for creating new edge features, applicable to any domain, via a combination of self-supervised and unsupervised learning.  ...  Graph Neural Networks (GNN) have been extensively used to extract meaningful representations from graph structured data and to perform predictive tasks such as node classification and link prediction.  ...  Acknowledgements We acknowledge the Yale Center for Research Computing for our use of their High Performance Computing infrastructure. We thank Mia Madel Alfajaro and Craig B.  ... 
arXiv:2007.04777v1 fatcat:vcgjxc5z6zgdbmscf2odvdpcd4

Curriculum Pre-Training Heterogeneous Subgraph Transformer for Top-N Recommendation [article]

Hui Wang, Kun Zhou, Wayne Xin Zhao, Jingyuan Wang, Ji-Rong Wen
2021 arXiv   pre-print
Due to the flexibility in modelling data heterogeneity, heterogeneous information network (HIN) has been adopted to characterize complex and heterogeneous auxiliary data in top-N recommender systems, called  ...  Then we capture the rich semantics (graph structure and path semantics) within the subgraph via a heterogeneous subgraph Transformer, where we encode the subgraph with multi-slot sequence representations  ...  [38] utilize contrastive learning to capture the universal network topological properties across multiple networks, which empowers graph neural networks to learn the intrinsic and transferable structural  ... 
arXiv:2106.06722v1 fatcat:5sxsricufveodho3fblxxrhb2a

Self-Supervised Learning of Graph Neural Networks: A Unified Review [article]

Yaochen Xie, Zhao Xu, Jingtun Zhang, Zhengyang Wang, Shuiwang Ji
2022 arXiv   pre-print
SSL has achieved promising performance on natural language and image learning tasks. Recently, there has been a trend to extend such success to graph data using graph neural networks (GNNs).  ...  When labeled samples are limited, self-supervised learning (SSL) is emerging as a new paradigm for making use of large amounts of unlabeled samples.  ...  We thank the scientific community for providing valuable feedback and comments, which led to improvements of this work.  ...
arXiv:2102.10757v5 fatcat:mau6lbphw5hxjhc7oyejmo2zpu

Deep Metric Learning for Few-Shot Image Classification: A Review of Recent Developments [article]

Xiaoxu Li, Xiaochen Yang, Zhanyu Ma, Jing-Hao Xue
2022 arXiv   pre-print
These methods, by classifying unseen samples according to their distances to a few seen samples in an embedding space learned by powerful deep neural networks, can avoid overfitting to few training images  ...  In this paper, we provide an up-to-date review of deep metric learning methods for few-shot image classification from 2018 to 2022 and categorize them into three groups according to three stages of metric  ...  Learning similarity scores via neural networks The Relation Network [12] is the first work to introduce a neural network to model the similarity of feature embeddings in few-shot learning.  ...
arXiv:2105.08149v2 fatcat:yxsvfdspbrhfpcrzgnny27vgjy

Informative Pseudo-Labeling for Graph Neural Networks with Few Labels [article]

Yayong Li, Jie Yin, Ling Chen
2022 arXiv   pre-print
Graph Neural Networks (GNNs) have achieved state-of-the-art results for semi-supervised node classification on graphs.  ...  Extensive experiments on six real-world graph datasets demonstrate that our proposed approach significantly outperforms state-of-the-art baselines and strong self-supervised methods on graphs.  ...  For example, Ding et al. [6] proposed a graph prototypical network for node classification, which learns a transferable metric space via meta-learning, such that the model can extract meta-knowledge to  ... 
arXiv:2201.07951v1 fatcat:k5lb5bbnlfailn2va7u5w4beha
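The entry above expands scarce labels by pseudo-labeling unlabeled nodes. A minimal, hypothetical sketch of the basic confidence-threshold step (names and threshold are illustrative; the paper's contribution is selecting *informative* pseudo-labels beyond a plain threshold):

```python
def pseudo_label(probs, threshold=0.9):
    """Assign a pseudo-label to each node whose maximum predicted class
    probability meets the confidence threshold; skip the rest.

    probs: {node_id: [class probabilities]} -> {node_id: class index}
    """
    labels = {}
    for node, p in probs.items():
        best = max(range(len(p)), key=p.__getitem__)
        if p[best] >= threshold:
            labels[node] = best
    return labels
```

The pseudo-labeled nodes are then added to the training set for the next round of semi-supervised GNN training.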