DeepGCNs: Can GCNs Go as Deep as CNNs?
[article]
2019
arXiv
pre-print
We believe that the community can greatly benefit from this work, as it opens up many opportunities for advancing GCN-based research. ...
In this work, we present new ways to successfully train very deep GCNs. ...
Extensive experiments show that by adding skip connections to GCNs, we can alleviate the difficulty of training, which is the primary problem impeding GCNs from going deeper. ...
arXiv:1904.03751v2
fatcat:kkcxgwcchvb3nc7mxfdvrhaoyy
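The skip-connection idea in this entry lends itself to a short illustration. Below is a minimal PyTorch sketch, not the authors' code: a plain graph convolution wrapped in a residual connection, with a dense normalized adjacency for simplicity. The identity path is what keeps very deep stacks trainable.

```python
import torch
import torch.nn as nn

class ResGCNBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.linear = nn.Linear(channels, channels)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x, adj):
        # x: (N, C) node features; adj: (N, N) normalized adjacency.
        h = torch.relu(self.norm(self.linear(adj @ x)))
        return x + h  # the skip connection that eases deep training

# Stacking many blocks stays trainable thanks to the identity path.
blocks = nn.ModuleList(ResGCNBlock(64) for _ in range(56))
x, adj = torch.randn(100, 64), torch.eye(100)
for block in blocks:
    x = block(x, adj)
```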
DeepGCNs: Making GCNs Go as Deep as CNNs
[article]
2020
arXiv
pre-print
This work transfers concepts such as residual/dense connections and dilated convolutions from CNNs to GCNs in order to successfully train very deep GCNs. ...
We show the benefit of using deep GCNs (with as many as 112 layers) experimentally across various datasets and tasks. ...
... from CNNs (i.e. residual/dense connections and dilated convolutions) can be transferred to GCNs in order to make GCNs go as deep as CNNs. ...
arXiv:1910.06849v2
fatcat:4rjqgbw3y5ae7hhthuxnwjjodi
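The dilated graph convolution this entry mentions can be made concrete. A sketch, assuming the feature-space kNN graphs common in this line of point-cloud work: take the k*d nearest neighbors but keep only every d-th one, widening the receptive field at no extra aggregation cost.

```python
import torch

def dilated_knn(x, k=9, d=2):
    # x: (N, C) node features; returns (N, k) dilated neighbor indices.
    dist = torch.cdist(x, x)                                  # pairwise distances
    idx = dist.topk(k * d + 1, largest=False).indices[:, 1:]  # drop self-match
    return idx[:, ::d]                                        # keep every d-th neighbor

x = torch.randn(100, 64)
neighbors = dilated_knn(x)             # (100, 9), but spanning k*d candidates
agg = x[neighbors].max(dim=1).values   # simple max-aggregation over neighbors
```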
DeeperGCN: All You Need to Train Deeper GCNs
[article]
2020
arXiv
pre-print
Unlike Convolutional Neural Networks (CNNs), which are able to take advantage of stacking very deep layers, GCNs suffer from vanishing gradient, over-smoothing and over-fitting issues when going deeper ...
These challenges limit the representation power of GCNs on large-scale graphs. This paper proposes DeeperGCN that is capable of successfully and reliably training very deep GCNs. ...
Inspired by the benefit of training deep CNN-based networks [He et al., 2016a; Huang et al., 2017; Yu and Koltun, 2016], DeepGCNs [Li et al., 2019b] propose to train very deep GCNs (56 layers) by ...
arXiv:2006.07739v1
fatcat:vgpxy3ksrreephzbbporfbh4sq
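The snippet above does not spell out DeeperGCN's recipe; among its ingredients (alongside pre-activation residual connections) is a generalized aggregation function. A hedged sketch of a softmax-style aggregator with a learnable temperature `beta`, which interpolates between mean-like (small beta) and max-like (large beta) behavior:

```python
import torch
import torch.nn as nn

class SoftmaxAggregation(nn.Module):
    def __init__(self):
        super().__init__()
        self.beta = nn.Parameter(torch.ones(1))  # learnable temperature

    def forward(self, msgs):
        # msgs: (N, K, C) messages from the K neighbors of each node.
        w = torch.softmax(self.beta * msgs, dim=1)  # per-channel neighbor weights
        return (w * msgs).sum(dim=1)                # (N, C) aggregated features

agg = SoftmaxAggregation()
out = agg(torch.randn(100, 9, 64))  # beta is trained jointly with the network
```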
Attacking Point Cloud Segmentation with Color-only Perturbation
[article]
2021
arXiv
pre-print
Recent research efforts on 3D point-cloud semantic segmentation have achieved outstanding performance by adopting deep CNNs (convolutional neural networks) and GCNs (graph convolutional networks). ...
By evaluating COLPER on an indoor dataset (S3DIS) and an outdoor dataset (Semantic3D) against three point cloud segmentation models (PointNet++, DeepGCNs, and RandLA-Net), we found color-only perturbation ...
Multi-Path Region Mining for Weakly Supervised 3D Semantic Segmentation on Point Clouds. ...
DeepGCNs: Can GCNs go as deep as CNNs? In Proceedings ...
arXiv:2112.05871v2
fatcat:rru4yw6drjbrvpboxaruqo7asa
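A generic sketch of a color-only attack in the spirit of this entry (not COLPER's actual optimization; `model(xyz, rgb)` is a hypothetical segmentation-model interface): gradient steps perturb only the RGB channels while the xyz geometry stays untouched.

```python
import torch
import torch.nn.functional as F

def color_only_attack(model, xyz, rgb, labels, eps=0.05, steps=10):
    # Perturb colors only; point coordinates (xyz) are never modified.
    delta = torch.zeros_like(rgb, requires_grad=True)
    for _ in range(steps):
        loss = F.cross_entropy(model(xyz, (rgb + delta).clamp(0, 1)), labels)
        loss.backward()
        with torch.no_grad():
            delta += (eps / steps) * delta.grad.sign()  # signed gradient step
            delta.clamp_(-eps, eps)                     # stay within budget
        delta.grad.zero_()
    return (rgb + delta).clamp(0, 1).detach()
```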
PU-GCN: Point Cloud Upsampling using Graph Convolutional Networks
[article]
2020
arXiv
pre-print
These upsampling modules are versatile and can be incorporated into any point cloud upsampling pipeline. ...
We propose three novel point upsampling modules: Multi-branch GCN, Clone GCN, and NodeShuffle. ...
... instead of CNNs. ...
arXiv:1912.03264v2
fatcat:ac5ddulxlbgttmhyrrgopvjgxi
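A minimal sketch of a NodeShuffle-style upsampler as the abstract describes it, with a plain linear layer standing in for the paper's GCN expansion: features are widened r-fold and then periodically shuffled into r new points per input point, analogous to pixel shuffle in images.

```python
import torch
import torch.nn as nn

class NodeShuffle(nn.Module):
    def __init__(self, channels, r):
        super().__init__()
        self.r = r
        self.expand = nn.Linear(channels, channels * r)  # stand-in for a GCN layer

    def forward(self, x, adj):
        h = self.expand(adj @ x)                     # (N, r*C) graph-aware expansion
        n, rc = h.shape
        return h.reshape(n * self.r, rc // self.r)   # shuffle into (r*N, C) points

up = NodeShuffle(64, r=4)
out = up(torch.randn(100, 64), torch.eye(100))  # (400, 64): 4x more points
```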
Dense Graph Convolutional Neural Networks on 3D Meshes for 3D Object Segmentation and Classification
[article]
2021
arXiv
pre-print
We use the faces of the mesh as basic processing units and represent a 3D mesh as a graph where each node corresponds to a face. ...
... efficient practical GCN models for 3D object classification and segmentation. ...
Ghanem, DeepGCNs: Can GCNs go as deep as CNNs?, in: Proceedings of the IEEE International Conference on Computer Vision, 2019, pp. 9267–9276.
[23] H. Choi, G. Moon, K. M. ...
arXiv:2106.15778v1
fatcat:bja4cmbuxvd33p6aogftfpwqeq
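The graph construction in this entry is easy to make concrete. A minimal sketch: each triangular face becomes a node, and two faces are connected whenever they share an edge.

```python
from collections import defaultdict

def face_adjacency(faces):
    # faces: list of (v0, v1, v2) vertex-index triangles.
    edge_to_faces = defaultdict(list)
    for fi, (a, b, c) in enumerate(faces):
        for e in ((a, b), (b, c), (c, a)):
            edge_to_faces[tuple(sorted(e))].append(fi)
    pairs = set()
    for shared in edge_to_faces.values():  # faces meeting at one edge
        pairs.update((i, j) for i in shared for j in shared if i < j)
    return sorted(pairs)

# Faces 0/1 share edge (1, 2) and faces 1/2 share edge (2, 3):
print(face_adjacency([(0, 1, 2), (1, 2, 3), (2, 3, 4)]))  # [(0, 1), (1, 2)]
```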
Going Deeper with Lean Point Networks
[article]
2020
arXiv
pre-print
We report systematic accuracy and memory consumption improvements on multiple publicly available segmentation tasks by using our generic modules as drop-in replacements for the blocks of multiple architectures ...
We attain similar performance to Deep GCN, while relying on our generic memory-efficient network blocks and while based on a weaker baseline compared to Deep GCN (i.e., DGCNN). ...
S4 Memory and speed efficiency of our deep network Deep LPN with respect to two different implementations of DeepGCNs. ...
arXiv:1907.00960v2
fatcat:goym2wkhmretba255e6d44vuoy
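This entry credits its gains to memory-efficient blocks; the paper's own block designs differ, but one standard way to get the same kind of memory saving is gradient checkpointing, sketched here on generic per-point MLP blocks.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

blocks = nn.ModuleList(
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()) for _ in range(50)
)

def forward(x):
    for block in blocks:
        # Intermediate activations inside `block` are discarded and
        # recomputed during backward, trading compute for memory.
        x = checkpoint(block, x, use_reentrant=False)
    return x

out = forward(torch.randn(1024, 64, requires_grad=True))
out.sum().backward()
```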
G-TAD: Sub-Graph Localization for Temporal Action Detection
2020
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
With graph convolution as the basic operation, we design a GCN block called GCNeXt, which learns the features of each node by aggregating its context and dynamically updates the edges in the graph. ...
In this work, we propose a graph convolutional network (GCN) model to adaptively incorporate multi-level semantic context into video features and cast temporal action detection as a sub-graph localization ...
[25, 26] propose DeepGCNs to enable GCNs to go as deep as 100 layers using residual/dense graph connections and dilated graph convolutions, and explore ways to automatically design GCNs [27]. ...
doi:10.1109/cvpr42600.2020.01017
dblp:conf/cvpr/XuZRTG20
fatcat:pbrohzod6fexzjgz5h42ducduq
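The "dynamically updates the edges" part of GCNeXt can be sketched as feature-space kNN recomputed at every layer (a simplification; the actual block also keeps fixed temporal edges).

```python
import torch

def dynamic_edges(x, k=3):
    # x: (N, C) current node features; edges are recomputed from them,
    # so the graph changes as the features evolve layer by layer.
    dist = torch.cdist(x, x)
    return dist.topk(k + 1, largest=False).indices[:, 1:]  # drop self

x = torch.randn(100, 256)
for _ in range(4):                        # a few GCNeXt-like layers
    nbrs = dynamic_edges(x)               # dynamic edge update
    x = 0.5 * x + 0.5 * x[nbrs].mean(1)   # toy context aggregation
```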
Boosting-GNN: Boosting Algorithm for Graph Networks on Imbalanced Node Classification
2021
Frontiers in Neurorobotics
This study proposes an ensemble model called Boosting-GNN, which uses GNNs as the base classifiers during boosting. ...
Traditional methods such as resampling, reweighting, and synthetic samples that deal with imbalanced datasets are no longer applicable in GNN. ...
“DeepGCNs: Can GCNs go as deep as CNNs?” in 2019 IEEE/CVF International Conference on Computer Vision (ICCV) (Seoul), 9266–9275. ...
doi:10.3389/fnbot.2021.775688
pmid:34899230
pmcid:PMC8655128
fatcat:pphvv2e3xnhwbeh4pt3xqacxbm
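A hedged sketch of the boosting loop the abstract describes, in AdaBoost style; `train_gnn` is a hypothetical helper that fits one GNN base classifier under per-node sample weights.

```python
import torch

def boosting_gnn(train_gnn, features, adj, labels, rounds=5):
    n = labels.shape[0]
    weights = torch.full((n,), 1.0 / n)   # per-node sample weights
    learners, alphas = [], []
    for _ in range(rounds):
        gnn = train_gnn(features, adj, labels, weights)   # weighted loss
        pred = gnn(features, adj).argmax(dim=1)
        err = weights[pred != labels].sum().clamp(1e-6, 1 - 1e-6)
        alpha = 0.5 * torch.log((1 - err) / err)          # learner weight
        # Upweight misclassified (often minority-class) nodes:
        weights = weights * torch.where(pred == labels, (-alpha).exp(), alpha.exp())
        weights = weights / weights.sum()
        learners.append(gnn)
        alphas.append(alpha)
    return learners, alphas
```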
G-TAD: Sub-Graph Localization for Temporal Action Detection
[article]
2020
arXiv
pre-print
With graph convolution as the basic operation, we design a GCN block called GCNeXt, which learns the features of each node by aggregating its context and dynamically updates the edges in the graph. ...
In this work, we propose a graph convolutional network (GCN) model to adaptively incorporate multi-level semantic context into video features and cast temporal action detection as a sub-graph localization ...
[25, 26] propose DeepGCNs to enable GCNs to go as deep as 100 layers using residual/dense graph connections and dilated graph convolutions, and explore ways to automatically design GCNs [27]. ...
arXiv:1911.11462v2
fatcat:qndnvzcl45hwjp6hb7lr2zz4ca
Graph Convolutional Networks for Multi-modality Medical Imaging: Methods, Architectures, and Clinical Applications
[article]
2022
arXiv
pre-print
The development of graph convolutional networks (GCNs) has created the opportunity to address this information complexity via graph-driven architectures, since GCNs can perform feature aggregation, interaction ...
To foster cross-disciplinary research, we present GCNs technical advancements, emerging medical applications, identify common challenges in the use of image-based GCNs and their extensions in model interpretation ...
In this study, dynamic superpixels, which are generated from the CNN feature extraction, can be viewed as a key bridge between the CNN and the GCN model. ...
arXiv:2202.08916v3
fatcat:zskcqvgjpnb6vdklmyy5rozswq
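The "superpixels as a bridge between CNN and GCN" pattern this survey highlights can be sketched as region pooling: average a CNN feature map over each superpixel to obtain one graph node per region (edge construction between adjacent superpixels is omitted here).

```python
import torch

def superpixel_nodes(feat_map, seg):
    # feat_map: (C, H, W) CNN features; seg: (H, W) superpixel labels.
    c = feat_map.shape[0]
    flat_feat = feat_map.reshape(c, -1)   # (C, H*W)
    flat_seg = seg.reshape(-1)
    nodes = [flat_feat[:, flat_seg == s].mean(dim=1)  # pool each region
             for s in flat_seg.unique()]
    return torch.stack(nodes)             # (num_superpixels, C)

feat = torch.randn(64, 32, 32)
seg = torch.randint(0, 10, (32, 32))      # toy superpixel segmentation
print(superpixel_nodes(feat, seg).shape)  # (num_superpixels, 64)
```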
Deep Graph Neural Networks with Shallow Subgraph Samplers
[article]
2022
arXiv
pre-print
We propose a simple "deep GNN, shallow sampler" design principle to improve both the GNN accuracy and efficiency -- to generate the representation of a target node, we use a deep GNN to pass messages only ...
We theoretically justify why the combination of deep GNNs with shallow samplers yields the best learning performance. ...
DeepGCNs: Can GCNs go as deep as CNNs? In The IEEE International Conference on Computer Vision (ICCV), 2019. Guohao Li, Chenxin Xiong, Ali Thabet, and Bernard Ghanem. ...
arXiv:2012.01380v3
fatcat:qbztim226rc2vhpnri7hdzzbou
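A minimal sketch of the "deep GNN, shallow sampler" principle from the abstract: extract a small L-hop neighborhood around the target node (L kept small) and run the deep GNN on that subgraph alone.

```python
import torch

def l_hop_subgraph(adj, target, l=2):
    # adj: (N, N) 0/1 adjacency; returns node indices within l hops.
    reached = torch.zeros(adj.shape[0], dtype=torch.bool)
    reached[target] = True
    for _ in range(l):                        # shallow sampler: few hops
        reached |= adj[reached].sum(dim=0) > 0
    return reached.nonzero().flatten()

adj = (torch.rand(100, 100) < 0.03).float()
nodes = l_hop_subgraph(adj, target=0)
sub_adj = adj[nodes][:, nodes]  # feed this small graph to a *deep* GNN
```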
Dissecting the Diffusion Process in Linear Graph Convolutional Networks
[article]
2021
arXiv
pre-print
Recent works show that a linear GCN can achieve comparable performance to the original non-linear GCN while being much more computationally efficient. ...
Experiments demonstrate that our proposed DGC improves linear GCNs by a large margin and makes them competitive with many modern variants of non-linear GCNs. ...
DeepGCNs [11] show that residual connections and dilated convolutions can make GCNs go as deep as CNNs, although increased depth does not contribute much. ...
arXiv:2102.10739v2
fatcat:fbdmfrynhnbw3atavfjwnvrtni
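A minimal sketch of the linear GCNs this entry refers to, in the SGC style: remove all nonlinearities so the model reduces to K diffusion steps S^K X followed by one linear classifier (DGC's contribution is to dissect how those diffusion steps should be scheduled).

```python
import torch
import torch.nn as nn

def linear_gcn(x, s, k, classifier):
    for _ in range(k):        # K linear feature-diffusion steps
        x = s @ x             # s: normalized adjacency with self-loops
    return classifier(x)      # the only learned, purely linear, part

n, c = 100, 32
s = torch.eye(n)              # stand-in for D^{-1/2} (A + I) D^{-1/2}
logits = linear_gcn(torch.randn(n, c), s, k=10, classifier=nn.Linear(c, 7))
```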
Training Graph Neural Networks with 1000 Layers
[article]
2022
arXiv
pre-print
Deep graph neural networks (GNNs) have achieved excellent results on various tasks on increasingly large graph datasets with millions of nodes and edges. ...
However, memory complexity has become a major obstacle when training deep GNNs for practical applications due to the immense number of nodes, edges, and intermediate activations. ...
For the other datasets such as ogbn-products and ogbn-arxiv, we observe less improvement when going very deep. ...
arXiv:2106.07476v3
fatcat:fa5bkhyy6fhmpkap2koir4jbz4
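The memory obstacle this entry describes is what reversible connections address. A hedged RevNet-style sketch: channels are split into two groups so every layer's input can be reconstructed exactly from its output, and activations need not be stored.

```python
import torch
import torch.nn as nn

class ReversibleBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.f = nn.Linear(channels // 2, channels // 2)
        self.g = nn.Linear(channels // 2, channels // 2)

    def forward(self, x1, x2):
        y1 = x1 + torch.relu(self.f(x2))
        y2 = x2 + torch.relu(self.g(y1))
        return y1, y2

    def inverse(self, y1, y2):
        # Recover the inputs exactly; nothing needs to be cached.
        x2 = y2 - torch.relu(self.g(y1))
        x1 = y1 - torch.relu(self.f(x2))
        return x1, x2

block = ReversibleBlock(64)
x1, x2 = torch.randn(10, 32), torch.randn(10, 32)
y1, y2 = block(x1, x2)
r1, r2 = block.inverse(y1, y2)
assert torch.allclose(x1, r1, atol=1e-6) and torch.allclose(x2, r2, atol=1e-6)
```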
Graph Neural Networks: A Review of Methods and Applications
[article]
2021
arXiv
pre-print
In recent years, variants of GNNs such as graph convolutional network (GCN), graph attention network (GAT), graph recurrent network (GRN) have demonstrated ground-breaking performances on many deep learning ...
In other domains such as learning from non-structural data like texts and images, reasoning on extracted structures (like the dependency trees of sentences and the scene graphs of images) is an important ...
It can also be combined with models like GCN, GraphSAGE and GAT to improve their performance. DeepGCNs. ...
arXiv:1812.08434v6
fatcat:ncz44kny6nairjjnysrqd5qjoi
Showing results 1–15 of 24.