7,333 Hits in 4.9 sec

Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning [article]

Qimai Li, Zhichao Han, Xiao-Ming Wu
2018 arXiv   pre-print
For graph-based semi-supervised learning, a recent important development is graph convolutional networks (GCNs), which nicely integrate local vertex features and graph topology in the convolutional layers  ...  Extensive experiments on benchmarks have verified our theory and proposals.  ...  The authors would like to thank the reviewers for their insightful comments and useful discussions.  ... 
arXiv:1801.07606v1 fatcat:jeuaoipf7zhp5gp5bjdfppusiq

Deeper Insights Into Graph Convolutional Networks for Semi-Supervised Learning

Qimai Li, Zhichao Han, Xiao-ming Wu
2018 Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence and the Thirtieth Innovative Applications of Artificial Intelligence Conference  
For graph-based semi-supervised learning, a recent important development is graph convolutional networks (GCNs), which nicely integrate local vertex features and graph topology in the convolutional layers  ...  Extensive experiments on benchmarks have verified our theory and proposals.  ...  The authors would like to thank the reviewers for their insightful comments and useful discussions.  ... 
doi:10.1609/aaai.v32i1.11604 fatcat:cfaq4vntibbgvb6zey3zwvil34
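The GCN layer referenced in the two records above, which "integrates local vertex features and graph topology in the convolutional layers", can be sketched in NumPy. This is a minimal illustration of the standard symmetrically normalized propagation rule; the toy graph, features, and weights are made-up values, not from the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(deg ** -0.5)         # symmetric normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)    # ReLU

# Toy 3-node path graph with 2-d node features and a 2x2 weight matrix
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.eye(3, 2)
W = np.eye(2)
out = gcn_layer(A, H, W)   # shape (3, 2): one filtered feature row per node
```

Each output row mixes a node's own features with its neighbors', which is exactly the local-feature/topology integration the abstract describes.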

Towards Gene Expression Convolutions using Gene Interaction Graphs [article]

Francis Dutil, Joseph Paul Cohen, Martin Weiss, Georgy Derevyanko, Yoshua Bengio
2018 arXiv   pre-print
We explore the usage of Graph Convolutional Neural Networks coupled with dropout and gene embeddings to utilize the graph information.  ...  by convolutions on an image.  ...  Gene Graph Convolutions Most existing work with graph convolutional networks focuses on settings where data is confined to a graph structure, for example point clouds, social networks, or protein structures  ... 
arXiv:1806.06975v1 fatcat:nflujghoqrcwlbkdti7aux2v3a

RGCNN: Regularized Graph CNN for Point Cloud Segmentation [article]

Gusi Te, Wei Hu, Zongming Guo, Amin Zheng
2018 arXiv   pre-print
In this paper, we instead propose a regularized graph convolutional neural network (RGCNN) that directly consumes point clouds.  ...  Due to the irregularity of the data format, previous deep learning works often convert point clouds to regular 3D voxel grids or collections of images before feeding them into neural networks, which leads  ...  Graph Convolutional Neural Network As CNNs only deal with data defined on regular grids, they have been extended to graphs for irregular data, which is referred to as GCNN.  ... 
arXiv:1806.02952v1 fatcat:fdxhxw3eivcdrlgg2c74a425ua

Graph Convolutional Networks: analysis, improvements and results [article]

Ihsan Ullah, Mario Manzo, Mitul Shah, Michael Madden
2019 arXiv   pre-print
Graph convolutional networks were introduced to apply the concepts of convolutional models, which have shown good results.  ...  In this context, we enhanced two of the existing graph convolutional network models by proposing four enhancements.  ...  Kipf and Welling [2017] present a variant of convolutional neural networks, called Graph Convolutional Networks (GCNs), which operate directly on graphs.  ... 
arXiv:1912.09592v1 fatcat:f7xobt3nqrexzl7bhxoxferuli

Spectral Networks and Locally Connected Networks on Graphs [article]

Joan Bruna, Wojciech Zaremba, Arthur Szlam, Yann LeCun
2014 arXiv   pre-print
Convolutional Neural Networks are extremely efficient architectures in image and audio recognition tasks, thanks to their ability to exploit the local translational invariance of signal classes over their  ...  In particular, we propose two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.  ...  In this work, we will discuss constructions of deep neural networks on graphs other than regular grids. We propose two different constructions.  ... 
arXiv:1312.6203v3 fatcat:ubgtphe57bgjvgctfd5i6xny6i
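The spectral construction named in the record above filters a graph signal by acting on the eigenvalues of the graph Laplacian. A minimal sketch, assuming a small undirected graph and an arbitrary heat-kernel-style filter function (both illustrative, not from the paper):

```python
import numpy as np

def spectral_filter(A, x, g):
    """Filter graph signal x by applying g to the Laplacian eigenvalues:
    y = U g(Lambda) U^T x."""
    L = np.diag(A.sum(axis=1)) - A      # combinatorial Laplacian L = D - A
    lam, U = np.linalg.eigh(L)          # eigendecomposition (L is symmetric)
    return U @ (g(lam) * (U.T @ x))    # scale each spectral component by g

# Toy triangle graph and a 3-node signal
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
x = np.array([1., 2., 3.])
y = spectral_filter(A, x, lambda lam: np.exp(-lam))  # low-pass smoothing
```

With the identity filter `g(lam) = 1` the signal is returned unchanged, which is a quick sanity check that the decomposition round-trips correctly.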

An Enhanced Neural Graph based Collaborative Filtering with Item Knowledge Graph

Sangeetha M., Meera Devi Thiagarajan
2022 International Journal of Computers Communications & Control  
Based on a comparison of the recall and NDCG metrics, the proposed neural graph-based filtering outperforms collaborative filtering based on a graph convolutional neural network.  ...  This technique can suggest better recommendations as compared to the existing graph-based or convolution-based networks.  ...  based collaborative filtering (NGCF), graph-based convolutional neural network using two metrics, recall and NDCG.  ... 
doi:10.15837/ijccc.2022.4.4568 fatcat:b6dqcrnl3ndkff457vhidar3b4
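The recall and NDCG metrics this record compares models on are standard ranking measures; they can be sketched as follows (toy relevance values are illustrative, not from the paper):

```python
import numpy as np

def dcg(rels):
    """Discounted cumulative gain: relevance discounted by log2 of rank."""
    return sum(r / np.log2(i + 2) for i, r in enumerate(rels))

def ndcg(ranked_rels, k=None):
    """NDCG: DCG of the ranked list divided by DCG of the ideal ordering."""
    ideal = sorted(ranked_rels, reverse=True)
    denom = dcg(ideal[:k])
    return dcg(ranked_rels[:k]) / denom if denom > 0 else 0.0

def recall_at_k(ranked_items, relevant, k):
    """Fraction of the relevant items that appear in the top-k of the ranking."""
    return len(set(ranked_items[:k]) & relevant) / len(relevant)

score = ndcg([3, 2, 0, 1])             # slightly imperfect ordering, < 1.0
rec = recall_at_k(['a', 'b', 'c'], {'a', 'c'}, k=2)
```

A perfectly ordered list scores NDCG of exactly 1.0, which makes the metric easy to compare across recommendation models of different scales.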

DropCluster: A structured dropout for convolutional networks [article]

Liyan Chen, Philip Gautier, Sergul Aydore
2020 arXiv   pre-print
Dropout as a regularizer in deep neural networks has been less effective in convolutional layers than in fully connected layers. This is due to the fact that dropout drops features randomly.  ...  It finds clusters of correlated features in convolutional layer outputs and drops the clusters randomly at each iteration.  ...  Introduction Convolutional neural networks (CNNs) have become a foundational tool for computer vision problems.  ... 
arXiv:2002.02997v1 fatcat:xo463w37sjgmpodcga25iy7cay
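The structured dropout idea described above, dropping clusters of correlated channels together rather than individual features, can be sketched like this. The cluster assignment is a fixed toy array here; in the paper the clusters are found from correlations in the layer outputs.

```python
import numpy as np

rng = np.random.default_rng(1)

def drop_cluster(feats, clusters, p=0.3):
    """feats: (C, H, W) feature maps; clusters: cluster id per channel.
    Drop every channel of a cluster together, then rescale as in dropout."""
    n_clusters = clusters.max() + 1
    keep_cluster = rng.random(n_clusters) > p   # one coin flip per cluster
    keep = keep_cluster[clusters]               # broadcast decision to channels
    return feats * keep[:, None, None] / (1 - p)

feats = np.ones((6, 4, 4))                 # 6 channels of 4x4 activations
clusters = np.array([0, 0, 1, 1, 2, 2])    # illustrative fixed assignment
out = drop_cluster(feats, clusters)
```

Channels sharing a cluster id are always kept or dropped as a unit, which is what makes the regularization "structured" compared with plain element-wise dropout.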

Unsupervised Neural-Based Graph Clustering for Variable-Length Speech Representation Discovery of Zero-Resource Languages

Shun Takahashi, Sakriani Sakti, Satoshi Nakamura
2021 Conference of the International Speech Communication Association  
Subsequently, we extract and encode the topological features of nodes in the graph to cluster them using graph convolution. By this process, we can obtain a coarsened speech representation.  ...  In this work, to discover variable-length, low bit-rate speech representation from a limited amount of unannotated speech data, we propose an approach based on graph neural networks (GNNs), and we study  ...  [17] addressed a text classification task as node classification by using graph convolutional neural networks (GCNs) [18] and successfully outperformed the traditional convolutional neural network  ... 
doi:10.21437/interspeech.2021-1340 dblp:conf/interspeech/TakahashiSN21 fatcat:umdir7ldejfrznsuesgx4z5dra

Modeling Physico-Chemical ADMET Endpoints with Multitask Graph Convolutional Networks

Floriane Montanari, Lara Kuhnke, Antonius Ter Laak, Djork-Arné Clevert
2019 Molecules  
For seven endpoints of interest, we compared the performance of that approach to fully connected neural networks and different single task models.  ...  In this paper, we report our finding that, especially for predicting physicochemical ADMET endpoints, a multitask graph convolutional approach appears a highly competitive choice.  ...  PotentialNet is a type of graph convolutional network that has been designed to predict protein-ligand affinities based on gated graph neural networks.  ... 
doi:10.3390/molecules25010044 pmid:31877719 pmcid:PMC6982787 fatcat:4jfr73hkd5hvhbzilp56vkmb64

Modeling physico-chemical ADMET endpoints with multitask graph convolutional networks

Floriane Montanari, Lara Kuhnke, Antonius Ter Laak, Djork-Arné Clevert
2021 figshare.com  
We report that, for the endpoints studied here, a multitask graph convolutional network appears a highly competitive choice.  ...  The new model shows increased predictive performance on all endpoints compared to previous modeling methods.  ...  We compared this correlation with the one obtained on the cluster split test set by our multitask graph convolutional network.  ... 
doi:10.6084/m9.figshare.14499807.v1 fatcat:jsu4h4scbvcrjfhrtppo7srti4

Partial Graph Reasoning for Neural Network Regularization [article]

Tiange Xiang, Chaoyi Zhang, Yang Song, Siqi Liu, Hongliang Yuan, Weidong Cai
2022 arXiv   pre-print
This add-on graph regularizes the network during training and can be completely skipped during inference.  ...  Regularizers help deep neural networks prevent feature co-adaptations. Dropout, as a commonly used regularization technique, stochastically disables neuron activations during network optimization.  ...  Conclusion In this paper, we introduce the very first learning-based framework for neural network regularization, namely Drop-Graph.  ... 
arXiv:2106.01805v2 fatcat:lhbrtvxnhvhvbn5oz2omoichr4

Exploring convolutional auto-encoders for representation learning on networks

Pranav Nerurkar, Madhav Chandane, Sunil Bhirud
2019 Computer Science  
To address this challenge, the technical focus of this dissertation is on the use of graph neural networks for network representation learning (NRL); i.e., learning the vector representations of nodes  ...  Extensive experiments are performed on publicly available benchmark network datasets to highlight the validity of this approach.  ...  In the literature, deep-learning architectures are based on graph convolutional neural (GCN) networks or graph auto-encoder (GAE) networks.  ... 
doi:10.7494/csci.2019.20.3.3167 fatcat:lgwnh6nqbvdpxj22kyqfwapiwa

Variationally Regularized Graph-based Representation Learning for Electronic Health Records [article]

Weicheng Zhu, Narges Razavian
2021 arXiv   pre-print
Recent progress on graph neural networks (GNN) enables end-to-end learning of topological structures for non-grid or non-sequential data.  ...  Besides the improvements in empirical experiment performances, we provide an interpretation of the effect of variational regularization compared to standard graph neural network, using singular value analysis  ...  Graph neural networks (GNNs) have been considered an effective way to generalize convolutional neural networks (CNNs) in extracting signals from non-grid structured data [5, 20] .  ... 
arXiv:1912.03761v2 fatcat:e42r44tatzdebeqnjzw5nsdvgq

Graph Convolutional Neural Networks with Node Transition Probability-based Message Passing and DropNode Regularization [article]

Tien Huu Do, Duc Minh Nguyen, Giannis Bekoulis, Adrian Munteanu, Nikos Deligiannis
2020 arXiv   pre-print
Graph convolutional neural networks (GCNNs) have received much attention recently, owing to their capability in handling graph-structured data.  ...  DropNode randomly discards part of a graph, thus creating multiple deformed versions of the graph and leading to a data-augmentation regularization effect.  ...  In order to handle data in such domains, graph neural networks (GNNs) have been proposed.  ... 
arXiv:2008.12578v1 fatcat:ub7pio6fbbbknj6wqsl472xqfy
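The DropNode idea described above, randomly discarding part of a graph during training to create deformed versions of it, can be sketched as follows. The keep probability and the toy graph are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def drop_node(A, X, p=0.5):
    """Keep each node with probability 1-p; zero out the adjacency rows,
    columns, and feature rows of dropped nodes."""
    keep = rng.random(A.shape[0]) > p     # one keep/drop decision per node
    mask = np.outer(keep, keep)           # edge survives only if both ends do
    return A * mask, X * keep[:, None]

# Toy complete graph on 4 nodes with 2-d node features
A = np.ones((4, 4)) - np.eye(4)
X = np.arange(8.0).reshape(4, 2)
A_d, X_d = drop_node(A, X, p=0.5)
```

Each training iteration sees a different surviving subgraph, which is where the data-augmentation-style regularization effect mentioned in the abstract comes from.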
Showing results 1 — 15 out of 7,333 results