17,571 Hits in 4.8 sec

Multi-scale Graph Convolutional Networks with Self-Attention [article]

Zhilong Xiong, Jia Cai
2021 arXiv   pre-print
Graph convolutional networks (GCNs) have recently achieved remarkable learning ability on various graph-structured data.  ...  In this paper, we propose two novel multi-scale GCN frameworks by incorporating a self-attention mechanism and multi-scale information into the design of GCNs.  ...  Conclusion and future work: In this paper, we propose two novel multi-scale graph convolutional networks based on self-attention.  ... 
arXiv:2112.03262v1 fatcat:t4jbajb4djbgfcei3sy34kn6gy
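The snippet above describes fusing multi-scale propagation with self-attention inside a GCN. A minimal numpy sketch of that idea, under assumptions of my own (symmetric adjacency normalization, a per-scale scoring vector `a`, softmax fusion over scales), not the paper's actual architecture:

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in standard GCNs.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def multi_scale_gcn_layer(A, X, W, a, num_scales=3):
    # Propagate transformed features over 1..num_scales hops, then fuse the
    # scales with softmax attention scores computed from scoring vector `a`
    # (an illustrative assumption, not the paper's fusion rule).
    A_norm = normalize_adj(A)
    H = X @ W
    scales = []
    for _ in range(num_scales):
        H = A_norm @ H                      # one additional hop of propagation
        scales.append(H)
    S = np.stack(scales)                    # (num_scales, N, F_out)
    scores = np.tanh(S.mean(axis=1) @ a)    # one scalar score per scale
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax over scales
    return np.tensordot(alpha, S, axes=1)   # attention-weighted sum, (N, F_out)
```

Each scale is one more application of the normalized adjacency, so the fused output mixes neighborhood information at several radii.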

Stack Attention-Pruning Aggregates Multiscale Graph Convolution Networks for Hyperspectral Remote Sensing Image Classification

Na Liu, Bin Zhang, Qiuhuan Ma, Qingqing Zhu, Xiaoling Liu
2021 IEEE Access  
INDEX TERMS Hyperspectral remote sensing image classification, stack attention-pruning, multiscale graph convolution networks, long-distance joint interaction, multiscale spatial-temporal information,  ...  In this work, we propose a stack attention-pruning multiscale aggregation graph convolution network (SAP-MAGACN).  ...  (e) graph convolution block with multi-scale dense connections, whose adjacency matrices are constructed by multi-head self-attention networks; the mask fusion layer integrates feature information  ... 
doi:10.1109/access.2021.3061489 fatcat:jqsobopyxnhb7ptvcldvepwk5e

Multi-hop Graph Convolutional Network with High-order Chebyshev Approximation for Text Reasoning [article]

Shuoran Jiang, Qingcai Chen, Xin Liu, Baotian Hu, Lisai Zhang
2021 arXiv   pre-print
In this study, we define the spectral graph convolutional network with the high-order dynamic Chebyshev approximation (HDGCN), which augments multi-hop graph reasoning by fusing messages aggregated  ...  Graph convolutional networks (GCNs) have become popular in various natural language processing (NLP) tasks owing to their superiority in modeling long-term and non-consecutive word interactions.  ...  We design a multi-vote-based cross-attention (MVCAttn) mechanism. MVCAttn reduces the O(N²) computation complexity of self-attention to O(N).  ... 
arXiv:2106.05221v1 fatcat:5cw563x46fbhxmeoe35xxzz6qm
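The high-order Chebyshev approximation named in this entry builds on the standard ChebNet-style spectral convolution. A generic numpy sketch of that base operation (not HDGCN itself; it assumes a symmetric graph Laplacian and at least two filter orders):

```python
import numpy as np

def cheb_graph_conv(L, X, Ws):
    # High-order Chebyshev spectral graph convolution:
    #   sum_k T_k(L~) X W_k,  T_0 = I, T_1 = L~, T_k = 2 L~ T_{k-1} - T_{k-2},
    # where L~ is the Laplacian rescaled so its spectrum lies in [-1, 1].
    # Ws is a list of K >= 2 weight matrices, one per polynomial order.
    N = L.shape[0]
    lam_max = np.linalg.eigvalsh(L).max()
    L_t = 2.0 * L / lam_max - np.eye(N)
    T_prev, T_curr = np.eye(N), L_t
    out = T_prev @ X @ Ws[0] + T_curr @ X @ Ws[1]     # orders 0 and 1
    for W in Ws[2:]:                                  # orders 2 .. K-1
        T_prev, T_curr = T_curr, 2.0 * L_t @ T_curr - T_prev
        out = out + T_curr @ X @ W
    return out
```

Raising the polynomial order K widens the receptive field to K-hop neighborhoods without ever computing an eigendecomposition of the full Laplacian.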

MSASGCN : Multi-Head Self-Attention Spatiotemporal Graph Convolutional Network for Traffic Flow Forecasting

Yang Cao, Detian Liu, Qizheng Yin, Fei Xue, Hengliang Tang, Yong Zhang
2022 Journal of Advanced Transportation  
Therefore, we propose a multi-head self-attention spatiotemporal graph convolutional network (MSASGCN) model.  ...  The multi-head self-attention mechanism is a valuable method to capture dynamic spatial-temporal correlations, and combining it with graph convolutional networks is a promising solution.  ...  Graph Convolution and Multi-Head Self-Attention Module.  ... 
doi:10.1155/2022/2811961 fatcat:ieo4kxzsw5hbdgxhob5rnxgazy
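The multi-head self-attention mechanism this entry combines with graph convolution is the standard scaled dot-product form. A self-contained numpy sketch (weight shapes `(F, F)` are my assumption, not the paper's configuration):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, num_heads):
    # Scaled dot-product self-attention over N positions, split into heads.
    # X: (N, F); Wq/Wk/Wv: (F, F); F must be divisible by num_heads.
    N, F = X.shape
    d = F // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(num_heads):
        q, k, v = (M[:, h * d:(h + 1) * d] for M in (Q, K, V))
        att = softmax(q @ k.T / np.sqrt(d))   # (N, N) weights, rows sum to 1
        heads.append(att @ v)
    return np.concatenate(heads, axis=1)      # (N, F)
```

Because the attention matrix is recomputed from the current inputs, the mechanism captures the dynamic spatial-temporal correlations the abstract refers to, rather than a fixed adjacency.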

SGA-Net: Self-Constructing Graph Attention Neural Network for Semantic Segmentation of Remote Sensing Images

Wenjie Zi, Wei Xiong, Hao Chen, Jun Li, Ning Jing
2021 Remote Sensing  
In this paper, a novel self-constructing graph attention neural network is proposed for such a purpose.  ...  Second, pixel-wise dependency graphs are constructed from the feature maps of images, and a graph attention network is designed to extract the correlations between pixels of the remote sensing images.  ...  Third, multi-view feature maps are used to obtain self-constructing graphs A_0, A_1, A_2, and A_3 by a convolutional neural network, separately.  ... 
doi:10.3390/rs13214201 fatcat:4wfmg6n3uraarlx3ccjogjr4ee

DP-GCN: Node Classification Based on Both Connectivity and Topology Structure Convolutions for Risky Seller Detection [article]

Chen Zhe, Aixin Sun
2021 arXiv   pre-print
self-attention module to align both properties.  ...  Motivated by business needs, we present a dual-path graph convolution network, named DP-GCN, for node classification. DP-GCN considers both node connectivity and topology structure similarity.  ...  Large-scale learnable graph convolutional networks. [29] Ryan A. Rossi and Nesreen K. Ahmed. 2015. The Network Data Repository  ... 
arXiv:2112.04757v1 fatcat:w2fozgyeqjfv5bsh7gi4ohg564

3DCTN: 3D Convolution-Transformer Network for Point Cloud Classification [article]

Dening Lu, Qian Xie, Linlin Xu, Jonathan Li
2022 arXiv   pre-print
are implemented by using Graph Convolution and Transformer respectively.  ...  This paper presents a novel hierarchical framework that incorporates convolution with Transformer for point cloud classification, named 3D Convolution-Transformer Network (3DCTN), to combine the strong  ...  Multi-scale Strategy. We compare multi-scale local feature aggregation with the single-scale (middle-scale) approach, and show the results in Table III (row 3).  ... 
arXiv:2203.00828v1 fatcat:zlcs3l2xtfhivabprb5kjgysrm

Multi-scale Mixed Dense Graph Convolution Network for Skeleton-based Action Recognition

Hailun Xia, Xinkai Gao
2021 IEEE Access  
In this paper, we design a multi-scale mixed dense graph convolutional network (MMDGCN) to overcome both shortcomings.  ...  INDEX TERMS Dense graph convolution, spatial and temporal attention module, multi-scale mixed temporal convolution, skeleton-based action recognition.  ...  Illustration of the overall architecture of the multi-stream attention-enhanced dense graph convolutional network.  ... 
doi:10.1109/access.2020.3049029 fatcat:xlmmcsmp3vbnvj422wctwwjiei

Traffic Flow Forecasting with Spatial-Temporal Graph Diffusion Network [article]

Xiyue Zhang, Chao Huang, Yong Xu, Lianghao Xia, Peng Dai, Liefeng Bo, Junbo Zhang, Yu Zheng
2021 arXiv   pre-print
Furthermore, a multi-scale attention network is developed to empower ST-GDN with the capability of capturing multi-level temporal dynamics.  ...  To tackle these challenges, we develop a new traffic prediction framework-Spatial-Temporal Graph Diffusion Network (ST-GDN).  ...  latent representations, with the cooperation of the designed multi-scale self-attention network and temporal hierarchy aggregation layer. • ST-GDN preserves both local and global region-wise dependencies  ... 
arXiv:2110.04038v1 fatcat:o5httsdvevewvmc43nwplp7hj4

GraphFPN: Graph Feature Pyramid Network for Object Detection [article]

Gangming Zhao, Weifeng Ge, Yizhou Yu
2022 arXiv   pre-print
To make these layers more powerful, we introduce two types of local channel attention for graph neural networks by generalizing global channel attention for convolutional neural networks.  ...  State-of-the-art methods for multi-scale feature learning focus on performing feature interactions across space and scales using neural networks with a fixed topology.  ...  GraphFPN receives mapped multi-scale features from the convolutional backbone.  ... 
arXiv:2108.00580v3 fatcat:il7d5yj54vfovdhrwlio55dyr4

Spatiotemporal Graph Convolutional Network for Multi-Scale Traffic Forecasting

Yi Wang, Changfeng Jing
2022 ISPRS International Journal of Geo-Information  
To address this multi-scale problem, we adopted the idea of Res2Net and designed a hierarchical temporal attention layer and hierarchical adaptive graph convolution layer.  ...  Based on the above methods, a novel model, called Temporal Residual II Graph Convolutional Network (Tres2GCN), was proposed to capture not only multi-scale spatiotemporal but also fine-grained features  ...  With the increasing popularity of GNN, several approaches to GNN have emerged in recent years, such as graph convolutional networks (GCN) [19], Chebyshev networks (ChebNet) [20], graph attention networks  ... 
doi:10.3390/ijgi11020102 fatcat:vfbjdc7t3bfyre3cbu5df3m4mm

Spatio-Temporal meets Wavelet: Disentangled Traffic Flow Forecasting via Efficient Spectral Graph Attention Network [article]

Yuchen Fang, Yanjun Qin, Haiyong Luo, Fang Zhao, Bingbing Xu, Chenxing Wang, Liang Zeng
2022 arXiv   pre-print
To achieve effective traffic flow forecasting, we propose an efficient spectral graph attention network with disentangled traffic sequences.  ...  positional encoding limit the extraction of spatial information in the commonly used full graph attention network; iii) the quadratic complexity of the full graph attention introduces heavy computational  ...  GMAN: A graph multi-attention network for traffic prediction. In Proceedings of AAAI, 2020.  ... 
arXiv:2112.02740v2 fatcat:xbaudqqkbzhz5jjiva3gmkx33y
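Several entries above (this one and MVCAttn) target the quadratic cost of full attention. One widely used way to get linear complexity is kernelized "linear" attention; the numpy sketch below uses the feature map phi(x) = elu(x) + 1 as an assumption of its own, and is not the mechanism of any specific paper listed here:

```python
import numpy as np

def linear_attention(Q, K, V):
    # Kernelized attention with positive feature map phi(x) = elu(x) + 1:
    #   out_i = phi(q_i) (sum_j phi(k_j) v_j^T) / (phi(q_i) . sum_j phi(k_j))
    # The key/value summary KV is computed once, so the cost is linear in N,
    # versus the O(N^2) pairwise score matrix of full softmax attention.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                       # (d, d_v) summary of all keys/values
    Z = Qp @ Kp.sum(axis=0)             # per-query normalizer, shape (N,)
    return (Qp @ KV) / Z[:, None]
```

By associativity, phi(Q) (phi(K)^T V) equals (phi(Q) phi(K)^T) V, so the result matches explicitly normalized kernel attention while never materializing the N-by-N matrix.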

Multi-Stream Semantics-Guided Dynamic Aggregation Graph Convolution Networks to Extract Overlapping Relations

XiuShan Liu, Jun Cheng, Qin Zhang
2021 IEEE Access  
convolution network (SG-DAGCN) to realize the extraction of overlapping relations.  ...  Subsequently, this framework models the relational graphs between the entities through a dynamic aggregation graph convolution module and gradually produces the discriminative embedded features and a refined  ...  of the graph convolution layers, and W is the weight matrix. 2) MULTI-SCALE GRAPH CONSTRUCTION: The structural information of the multi-scale graph is useful for entity and relation extraction.  ... 
doi:10.1109/access.2021.3062231 fatcat:zu4rrzkalfcsxdol7x63lzhadu

Temporal‐enhanced graph convolution network for skeleton‐based action recognition

Yulai Xie, Yang Zhang, Fang Ren
2022 IET Computer Vision  
Graph convolution networks (GCNs) have drawn attention for skeleton-based action recognition.  ...  Experimental results on three large-scale datasets, NTU-RGB + D, Kinetics-Skeleton, and UAV-Human, indicate that the authors' network achieves accuracy improvement with better generalisation capability  ...  [22] present a similar approach in which the multi-scale convolutional filters are used to encode the spatial graph structure.  ... 
doi:10.1049/cvi2.12086 fatcat:7dixtccaevculmwknpccrtgiam

Fast and Accurate: Structure Coherence Component for Face Alignment [article]

Beier Zhu, Chunze Lin, Quan Wang, Renjie Liao, Chen Qian
2020 arXiv   pre-print
Instead, our structure coherence component leverages a dynamic sparse graph structure to pass features among the most related landmarks.  ...  Extensive experiments on three popular benchmarks, including WFLW, COFW and 300W, demonstrate the effectiveness of the proposed method, achieving state-of-the-art performance with fast speed.  ...  (c) Sparse and relation-aware graph convolutional layer. Fig. 3. The architecture of the attention-guided multi-scale feature learning module.  ... 
arXiv:2006.11697v1 fatcat:o2jfaz5ivjfjvix2gdf4urshmu
Showing results 1 — 15 out of 17,571 results