2,337 Hits in 1.9 sec

Depth-based Subgraph Convolutional Neural Networks

Chuanyu Xu, Dong Wang, Zhihong Zhang, Beizhan Wang, Da Zhou, Guijun Ren, Lu Bai, Lixin Cui, Edwin R. Hancock
2018 24th International Conference on Pattern Recognition (ICPR)  
This paper proposes a new graph convolutional neural architecture based on a depth-based representation of graph structure, called the depth-based subgraph convolutional neural networks (DS-CNNs), which  ...  The idea is to apply convolution filters sliding over the entire subgraphs of a vertex to extract the local features analogous to the standard convolution operation on grid data.  ...  Then, we present our depth-based subgraph convolution neural networks for the K-level m-ary tree. Figure 1 shows an example of the whole process with K = 4 and m = 3.  ... 
doi:10.1109/icpr.2018.8545090 dblp:conf/icpr/XuWZWZR0CH18 fatcat:hkqnbgelareopg3jljh7m65k3e
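As a rough illustration of the convolution-over-subgraphs idea described in the snippet, the sketch below gathers each vertex's depth-K expansion subgraph and slides a 1-D filter over the resulting vertex sequence. The BFS ordering, padding length, and function names are assumptions for illustration, not the authors' DS-CNN.

```python
# Hypothetical sketch of depth-based subgraph convolution (not the authors' code).
# For each vertex, collect vertices reachable within K hops (its depth-K expansion
# subgraph), arrange their features into a fixed-length padded sequence, and slide
# a 1-D filter over that sequence -- the analogue of grid convolution.
import numpy as np

def k_hop_subgraph(adj, root, K):
    """Breadth-first expansion of `root` up to depth K; returns vertices in BFS order."""
    visited, frontier, seen = [root], [root], {root}
    for _ in range(K):
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    visited.append(v)
                    nxt.append(v)
        frontier = nxt
    return visited

def subgraph_conv(adj, feats, K=2, width=3, max_len=8, seed=0):
    """Slide a 1-D filter of size `width` over each vertex's depth-K subgraph sequence."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((width, feats.shape[1]))    # one convolution filter
    out = np.zeros(len(adj))
    for v in adj:
        seq = k_hop_subgraph(adj, v, K)[:max_len]
        x = np.zeros((max_len, feats.shape[1]))
        x[:len(seq)] = feats[seq]                       # pad short sequences with zeros
        # valid 1-D convolution over the ordered subgraph vertices, then max-pool
        resp = [np.sum(w * x[i:i + width]) for i in range(max_len - width + 1)]
        out[v] = max(resp)
    return out

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
feats = np.eye(5)                                       # toy one-hot vertex features
print(subgraph_conv(adj, feats, K=2))
```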

Broken Character Recognition using Connected Components and Convolutional Neural Network

2020 International journal of recent technology and engineering  
The proposed method uses a hybrid approach combining connected component concepts and a convolutional neural network to identify the broken characters.  ...  The convolutional neural networks are already trained and tested using the chars74k dataset. We use convolutional neural networks due to their higher recognition rate when compared to others.  ...  The convolutional neural network can recognize any type and kind of character.  ... 
doi:10.35940/ijrte.e1011.0285s20 fatcat:h37euk4h5vgufdxrpv3hqzyogm
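A minimal sketch of the hybrid idea named in the snippet: label connected components of a binarized image, merge fragments that plausibly belong to one broken character, and classify each merged patch with a small CNN. The merge heuristic, network sizes, and class count are assumptions, and the CNN here is untrained.

```python
# Hedged sketch: connected components + a small CNN classifier (illustrative only).
import numpy as np
import torch
import torch.nn as nn
from scipy import ndimage

class TinyCharCNN(nn.Module):
    def __init__(self, n_classes=62):                        # chars74k has 62 classes
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(32 * 8 * 8, n_classes))
    def forward(self, x):
        return self.net(x)

def character_boxes(binary, max_gap=3):
    """Label connected components, then merge boxes whose horizontal gap is small
    (a heuristic stand-in for re-joining the pieces of a broken character)."""
    labels, _ = ndimage.label(binary)
    boxes = [b for b in ndimage.find_objects(labels) if b is not None]
    boxes.sort(key=lambda s: s[1].start)                      # left-to-right
    merged = []
    for b in boxes:
        if merged and b[1].start - merged[-1][1].stop <= max_gap:
            prev = merged.pop()
            merged.append((slice(min(prev[0].start, b[0].start), max(prev[0].stop, b[0].stop)),
                           slice(prev[1].start, b[1].stop)))
        else:
            merged.append(b)
    return merged

img = np.zeros((32, 64), dtype=np.uint8)
img[8:24, 5:15] = 1                                           # one "character" split in two
img[8:24, 17:27] = 1
model = TinyCharCNN()
for box in character_boxes(img):
    patch = torch.zeros(1, 1, 32, 32)
    crop = torch.from_numpy(img[box].astype(np.float32))
    patch[0, 0, :crop.shape[0], :crop.shape[1]] = crop        # pad crop into a 32x32 canvas
    print(model(patch).argmax(dim=1))                         # predicted class index
```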

Automated flow for compressing convolution neural networks for efficient edge-computation with FPGA [article]

Farhan Shafiq, Takato Yamada, Antonio T. Vilchez, Sakyasingha Dasgupta
2017 arXiv   pre-print
Deep convolutional neural networks (CNN) based solutions are the current state-of-the-art for computer vision tasks.  ...  This flow involves quantization of model parameters and activations, generation of the network and model in embedded-C, followed by automatic generation of the FPGA accelerator for binary convolutions.  ...  Deep Convolutional Neural Networks (CNN) have achieved significant results in computer vision, speech recognition and language translation.  ... 
arXiv:1712.06272v1 fatcat:3dwv7runyndtdlvicdctb633bm
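The quantization step mentioned in the snippet can be illustrated with a simple symmetric uniform quantizer plus a sign-based binarization (for the binary convolutions); the embedded-C and FPGA generation stages of the flow are not reproduced here, and the scaling choices below are assumptions.

```python
# Minimal quantization sketch (assumed schemes: symmetric uniform and sign binarization).
import numpy as np

def quantize_uniform(x, bits=8):
    """Symmetric uniform quantizer: map floats to signed integers and back via one scale."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(np.max(np.abs(x)) / qmax, 1e-12)
    q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
    return q, scale                      # store q on-device, multiply by scale on the fly

def binarize(x):
    """Binary-convolution style weight quantization: keep only the sign and one scale."""
    alpha = np.mean(np.abs(x))           # per-tensor scaling factor
    return np.where(x >= 0, 1, -1).astype(np.int8), alpha

w = np.random.randn(3, 3, 16, 32).astype(np.float32)
q, s = quantize_uniform(w)
b, a = binarize(w)
print("8-bit reconstruction error:", np.abs(w - q * s).max())
print("binary reconstruction error:", np.abs(w - b * a).max())
```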

Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [article]

Lei Cai, Zhengzhang Chen, Chen Luo, Jiaping Gui, Jingchao Ni, Ding Li, Haifeng Chen
2020 arXiv   pre-print
Previous network embedding based methods have been mostly focusing on learning good node representations, whereas largely ignoring the subgraph structural changes related to the target nodes in dynamic  ...  In this paper, we propose StrGNN, an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.  ...  Then, GSFE module leverages Graph Convolution Neural Network and pooling technologies to extract fixed-size feature from each subgraph.  ... 
arXiv:2005.07427v2 fatcat:w6tpcpvehfag5nrfcyae463axm
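A plain-NumPy stand-in for the GSFE step described in the snippet: take the h-hop subgraph around a candidate edge, run one graph-convolution step, and mean-pool to a fixed-size feature vector. Hop count, normalization, and pooling are illustrative assumptions, not the StrGNN implementation.

```python
# Hedged sketch: enclosing-subgraph extraction + one graph convolution + pooling.
import numpy as np

def h_hop_nodes(adj, edge, h=1):
    """Nodes within h hops of either endpoint of `edge`."""
    nodes = set(edge)
    for _ in range(h):
        nodes |= {v for u in list(nodes) for v in adj[u]}
    return sorted(nodes)

def subgraph_feature(adj, feats, edge, h=1):
    nodes = h_hop_nodes(adj, edge, h)
    idx = {v: i for i, v in enumerate(nodes)}
    A = np.zeros((len(nodes), len(nodes)))
    for u in nodes:
        for v in adj[u]:
            if v in idx:
                A[idx[u], idx[v]] = 1.0
    A_hat = A + np.eye(len(nodes))                      # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    H = np.tanh(D_inv @ A_hat @ feats[nodes])           # one graph-convolution step
    return H.mean(axis=0)                               # mean-pool -> fixed-size vector

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
feats = np.eye(4)
print(subgraph_feature(adj, feats, edge=(0, 2), h=1))   # feature for candidate edge (0, 2)
```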

Abnormal Event Detection via Feature Expectation Subgraph Calibrating Classification in Video Surveillance Scenes

Ou Ye, Jun Deng, Zhenhua Yu, Tao Liu, Lihong Dong
2020 IEEE Access  
First, we employ convolutional neural network and long short-term memory models to extract the spatiotemporal features of each video frame, and then construct the feature expectation subgraph for each key frame  ...  At present, the existing abnormal event detection models based on deep learning mainly focus on data represented in a vectorial form, and pay little attention to the impact of the internal structure  ...  Convolutional neural networks are a common kind of deep neural network, well suited to learning spatial relationships from raw input data.  ... 
doi:10.1109/access.2020.2997357 fatcat:yrytr47smnapdla7cwo7sbyqma
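A minimal CNN + LSTM sketch of the spatiotemporal extractor described in the snippet: a small CNN produces per-frame features that are then fed through an LSTM. Layer sizes, input shapes, and the last-hidden-state readout are assumptions, not the paper's configuration.

```python
# Illustrative CNN + LSTM spatiotemporal encoder (shapes and sizes are assumptions).
import torch
import torch.nn as nn

class SpatioTemporalEncoder(nn.Module):
    def __init__(self, feat_dim=64, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(                        # spatial features per frame
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim))
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)   # temporal model

    def forward(self, clip):                             # clip: (batch, time, 3, H, W)
        b, t = clip.shape[:2]
        frame_feats = self.cnn(clip.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(frame_feats)
        return out[:, -1]                                # last hidden state summarizes the clip

clip = torch.randn(2, 8, 3, 64, 64)                      # 2 clips of 8 RGB frames
print(SpatioTemporalEncoder()(clip).shape)               # torch.Size([2, 128])
```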

DLA: Compiler and FPGA Overlay for Neural Network Inference Acceleration [article]

Mohamed S. Abdelfattah, David Han, Andrew Bitar, Roberto DiCecco, Shane OConnell, Nitika Shanker, Joseph Chu, Ian Prins, Joshua Fender, Andrew C. Ling, Gordon R. Chiu
2018 arXiv   pre-print
We show how our graph compiler performs architecture-driven software optimizations to significantly boost performance of both convolutional and recurrent neural networks (CNNs/RNNs) - we demonstrate a  ...  ) network.  ...  For example, with convolutional neural networks (CNNs), a subgraph is typically a single convolution with an optional pooling layer afterwards.  ... 
arXiv:1807.06434v1 fatcat:xmtpjod5vnarvi6kkhogwwiwti
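The subgraph notion in the snippet (a convolution with an optional trailing pooling layer treated as one unit) can be illustrated with a toy compiler pass that fuses such pairs into single scheduling nodes; the pass and type names below are hypothetical, not the DLA compiler's.

```python
# Toy graph-compiler pass: fuse conv (+ optional pool) into one dispatch unit.
from dataclasses import dataclass, field

@dataclass
class Op:
    kind: str                        # "conv", "pool", "relu", ...
    name: str

@dataclass
class FusedSubgraph:
    ops: list = field(default_factory=list)

def fuse_conv_pool(ops):
    """Greedy pass: a conv and the pool immediately after it become one scheduling unit."""
    subgraphs, i = [], 0
    while i < len(ops):
        group = [ops[i]]
        if ops[i].kind == "conv" and i + 1 < len(ops) and ops[i + 1].kind == "pool":
            group.append(ops[i + 1])
            i += 1
        subgraphs.append(FusedSubgraph(group))
        i += 1
    return subgraphs

net = [Op("conv", "conv1"), Op("pool", "pool1"), Op("conv", "conv2"), Op("relu", "relu1")]
for sg in fuse_conv_pool(net):
    print([op.name for op in sg.ops])    # ['conv1', 'pool1'], ['conv2'], ['relu1']
```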

Anomalous Subgraph Detection in Given Expected Degree Networks with Deep Learning

Mingan Luan, Bo Wang, Yanping Zhao, Fengye Hu
2021 IEEE Access  
INDEX TERMS Anomalous subgraph detection, given expected degree models, deep learning, convolutional neural network.  ...  Furthermore, based on the developed framework, we propose a residual matrix-based convolutional neural network (RM-CNN) algorithm with respect to the given expected degree models, which are more general  ...  Based on the developed detection framework, we propose a residual matrix-based convolutional neural network (RM-CNN) algorithm for the anomalous subgraph detection associated with the given expected degree  ... 
doi:10.1109/access.2021.3073696 fatcat:hrj45ljmjzghjh7fyudgf76msu
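A sketch of a residual matrix under a given-expected-degree (Chung-Lu style) null model, where the expected edge probability is w_i w_j / sum(w) and the residual is the observed adjacency minus that expectation; an RM-CNN would take such a matrix as an image-like CNN input. The null-model form and the planted anomaly below are assumptions for illustration, not the paper's algorithm.

```python
# Residual matrix under a given-expected-degree null model (assumed Chung-Lu form).
import numpy as np

def residual_matrix(A, w):
    """Observed adjacency minus expected adjacency under expected degrees w."""
    expected = np.outer(w, w) / w.sum()
    np.fill_diagonal(expected, 0.0)
    return A - expected

rng = np.random.default_rng(0)
n, w = 20, np.full(20, 4.0)                       # every node expects degree ~4
P = np.minimum(np.outer(w, w) / w.sum(), 1.0)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                    # undirected, no self-loops
A[:5, :5] = 1 - np.eye(5)                         # plant a dense anomalous subgraph
R = residual_matrix(A, w)
print("mean residual inside planted block:", R[:5, :5].mean())
print("mean residual elsewhere:", R[5:, 5:].mean())
```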

Residual convolutional graph neural network with subgraph attention pooling

Yutai Duan, Jianming Wang, Haoran Ma, Yukuan Sun
2022 Tsinghua Science and Technology  
In this work, we propose a residual convolutional graph neural network to tackle the problem of losing key classification features.  ...  By feeding discarded features back into the network architecture, we reduce the probability of losing critical features for graph classification. (3) We propose a new method for graph-level representation  ...  The recent success of deep learning neural networks (e.g., convolutional neural networks (CNNs) [1] and recurrent neural networks (RNNs) [2]) has boosted research on pattern recognition  ... 
doi:10.26599/tst.2021.9010058 fatcat:bz4cfxu2nvf4xcd7wn7sotuow4
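Two ideas from the snippet, sketched with illustrative assumptions: a graph-convolution layer with a residual (skip) connection, and a readout that feeds the discarded nodes' features back into the graph-level representation instead of dropping them. The feedback weighting and layer sizes are not from the paper.

```python
# Residual graph convolution + readout with feedback of discarded features (illustrative).
import numpy as np

def residual_gcn_layer(A, H, W):
    """H' = ReLU(D^-1 (A + I) H W) + H  -- the skip connection preserves input features."""
    A_hat = A + np.eye(A.shape[0])
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    return np.maximum(D_inv @ A_hat @ H @ W, 0.0) + H

def readout_with_feedback(H, scores, k):
    """Keep the top-k scored nodes, but add the mean of discarded nodes back in."""
    order = np.argsort(scores)[::-1]
    kept, dropped = order[:k], order[k:]
    readout = H[kept].mean(axis=0)
    if len(dropped):
        readout = readout + 0.5 * H[dropped].mean(axis=0)   # feed discarded features back
    return readout

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
H = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 8))
H = residual_gcn_layer(A, H, W)
print(readout_with_feedback(H, scores=H.sum(axis=1), k=2))
```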

EWS-GCN: Edge Weight-Shared Graph Convolutional Network for Transactional Banking Data [article]

Ivan Sukharev, Valentina Shumovskaia, Kirill Fedyanin, Maxim Panov, Dmitry Berestnev
2020 arXiv   pre-print
As a final solution, we develop a new graph neural network model, EWS-GCN, that combines ideas of graph convolutional and recurrent neural networks via an attention mechanism.  ...  We also demonstrate that our model outperforms the state-of-the-art graph neural networks, achieving excellent results  ...  In this work, we consider models based only on the client purchases (for example, the Recurrent Neural Network (RNN)) as well as models explicitly employing the graph structure such as Graph Neural Network  ... 
arXiv:2009.14588v1 fatcat:3sgawtastjhanjm5yfuddecmi4
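A rough sketch of the combination named in the snippet: neighbours' transaction sequences are encoded with an RNN, and the client representation is an attention-weighted aggregation over those encodings (one graph-convolution step). Dimensions and the attention form are assumptions, not the EWS-GCN architecture.

```python
# Illustrative RNN-over-transactions + attention aggregation over graph neighbours.
import torch
import torch.nn as nn

class TxnGraphLayer(nn.Module):
    def __init__(self, txn_dim=8, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(txn_dim, hidden, batch_first=True)    # encode transaction sequences
        self.attn = nn.Linear(2 * hidden, 1)                     # score each neighbour

    def forward(self, client_seq, neighbour_seqs):
        _, h_c = self.rnn(client_seq.unsqueeze(0))                # (1, 1, hidden)
        h_c = h_c.squeeze(0)                                      # (1, hidden)
        _, h_n = self.rnn(neighbour_seqs)                         # (1, N, hidden)
        h_n = h_n.squeeze(0)                                      # (N, hidden)
        scores = self.attn(torch.cat([h_c.expand_as(h_n), h_n], dim=1))
        alpha = torch.softmax(scores, dim=0)                      # attention over neighbours
        return h_c + (alpha * h_n).sum(dim=0, keepdim=True)       # aggregated client embedding

client = torch.randn(12, 8)                                       # 12 transactions, 8 features
neighbours = torch.randn(5, 12, 8)                                # 5 neighbours' sequences
print(TxnGraphLayer()(client, neighbours).shape)                  # torch.Size([1, 32])
```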

HAO: Hardware-aware neural Architecture Optimization for Efficient Inference [article]

Zhen Dong, Yizhao Gao, Qijing Huang, John Wawrzynek, Hayden K.H. So, Kurt Keutzer
2021 arXiv   pre-print
However, this process remains challenging due to the intractable search space of neural network architectures and hardware accelerator implementation.  ...  Differing from existing hardware-aware neural architecture search (NAS) algorithms that rely solely on the expensive learning-based approaches, our work incorporates integer programming into the search  ...  DW Conv stands for depth-wise convolution. power consumption with no workload running on the programming logic side and 5.5W power when running the network.  ... 
arXiv:2104.12766v1 fatcat:wvpt6sil4zhf5dknqhv5zj76lu
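The snippet mentions depth-wise (DW) convolution as a building block of the search space. The sketch below only contrasts a standard convolution with a depth-wise separable one and counts parameters; it does not reproduce the HAO search or its integer program.

```python
# Standard vs. depth-wise separable convolution: same output shape, far fewer parameters.
import torch
import torch.nn as nn

cin, cout, k = 64, 128, 3

standard = nn.Conv2d(cin, cout, k, padding=1)
depthwise_separable = nn.Sequential(
    nn.Conv2d(cin, cin, k, padding=1, groups=cin),    # depth-wise: one filter per channel
    nn.Conv2d(cin, cout, 1))                           # point-wise: 1x1 channel mixing

def n_params(m):
    return sum(p.numel() for p in m.parameters())

x = torch.randn(1, cin, 32, 32)
assert standard(x).shape == depthwise_separable(x).shape
print("standard conv params:       ", n_params(standard))             # ~74k
print("depth-wise separable params:", n_params(depthwise_separable))  # ~9k
```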

RRGCCAN: Re-ranking via Graph Convolution Channel Attention Network for Person Re-Identification

Xiaoqiang Chen, Ling Zheng, Chong Zhao, Qicong Wang, Maozhen Li
2020 IEEE Access  
Experimental study shows that the proposed network structure is superior to the state-of-the-art deep neural networks on three very challenging datasets that are popular in examining person re-identification  ...  INDEX TERMS Person re-identification, graph convolution network, attention mechanism, context information.  ...  A series of advanced convolutional neural networks are designed to deal with graph structure data.  ... 
doi:10.1109/access.2020.3009653 fatcat:qjiss6dbtfboxnd6adroyndjji

Heterogeneous Graph Neural Networks for Malicious Account Detection [article]

Ziqi Liu, Chaochao Chen, Xinxing Yang, Jun Zhou, Xiaolong Li, Le Song
2020 arXiv   pre-print
We present GEM, the first heterogeneous graph neural network approach for detecting malicious accounts at Alipay, one of the world's leading mobile cashless payment platforms.  ...  Our approach, inspired by a connected subgraph approach, adaptively learns discriminative embeddings from heterogeneous account-device graphs based on two fundamental weaknesses of attackers, i.e. device  ...  For graph convolutional network-style methods including our methods, we set the embedding size to 16 with a depth of 5 convolution layers, unless otherwise stated.  ... 
arXiv:2002.12307v1 fatcat:zatv56fb2bgddbwpf77i32zq6y
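A toy stand-in for the configuration quoted in the snippet (embedding size 16, convolution depth 5) on a small account-device graph; the propagation rule is a generic mean-aggregation graph convolution, not GEM's heterogeneous aggregator.

```python
# Five-layer, 16-dimensional graph convolution over an account-device graph (illustrative).
import numpy as np

def gcn_stack(A, X, depth=5, dim=16, seed=0):
    rng = np.random.default_rng(seed)
    A_hat = A + np.eye(A.shape[0])
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    H = X
    for _ in range(depth):
        W = rng.standard_normal((H.shape[1], dim)) / np.sqrt(H.shape[1])
        H = np.maximum(D_inv @ A_hat @ H @ W, 0.0)      # mean-aggregate, project, ReLU
    return H                                             # (num_nodes, 16) embeddings

# 4 accounts + 3 devices; an edge means "account logged in from device"
A = np.zeros((7, 7))
for acct, dev in [(0, 4), (1, 4), (2, 5), (3, 6), (1, 5)]:
    A[acct, dev] = A[dev, acct] = 1.0
X = np.eye(7)                                            # one-hot initial features
print(gcn_stack(A, X).shape)                             # (7, 16)
```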

A Convolutional Neural Network for Modelling Sentences [article]

Nal Kalchbrenner, Edward Grefenstette, Phil Blunsom
2014 arXiv   pre-print
We describe a convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) that we adopt for the semantic modelling of sentences.  ...  The network does not rely on a parse tree and is easily applicable to any language.  ...  A central class of models are those based on neural networks.  ... 
arXiv:1404.2188v1 fatcat:nxrvbawn7bbehk6g5st2nm6gbm
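A minimal sketch of convolving over a word-embedding sequence followed by k-max pooling, the pooling operation the DCNN builds on; the single wide convolution layer, embedding sizes, and classifier head are simplifications of the full architecture.

```python
# One wide 1-D convolution over word embeddings with k-max pooling (simplified sketch).
import torch
import torch.nn as nn

class OneLayerSentenceConv(nn.Module):
    def __init__(self, vocab=1000, emb=48, filters=64, width=5, k=4, classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, filters, width, padding=width - 1)  # wide convolution
        self.k = k
        self.out = nn.Linear(filters * k, classes)

    def forward(self, tokens):                       # tokens: (batch, seq_len)
        x = self.emb(tokens).transpose(1, 2)         # (batch, emb, seq_len)
        x = torch.tanh(self.conv(x))                 # (batch, filters, seq_len + width - 1)
        # keep the k largest activations per filter (the paper's k-max pooling also
        # preserves their original order, which this simplification omits)
        x = x.topk(self.k, dim=2).values
        return self.out(x.flatten(1))

sent = torch.randint(0, 1000, (3, 12))               # 3 sentences of 12 token ids
print(OneLayerSentenceConv()(sent).shape)            # torch.Size([3, 2])
```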

Decoupling the Depth and Scope of Graph Neural Networks [article]

Hanqing Zeng, Muhan Zhang, Yinglong Xia, Ajitesh Srivastava, Andrey Malevich, Rajgopal Kannan, Viktor Prasanna, Long Jin, Ren Chen
2022 arXiv   pre-print
State-of-the-art Graph Neural Networks (GNNs) have limited scalability with respect to the graph and model sizes.  ...  scope, and then apply a GNN of arbitrary depth on top of the subgraph.  ...  Graph convolutional neural networks for web-scale recommender systems.  ... 
arXiv:2201.07858v1 fatcat:lzoilhqdrnbefntai4onjjoie4
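A sketch of the decoupling stated in the snippet: fix the scope by extracting a small subgraph around the target node first, then run a GNN whose depth exceeds the subgraph's hop radius, so message passing stays confined to the extracted scope. Plain NumPy with illustrative sizes, not the authors' system.

```python
# Decoupled depth and scope: 1-hop extracted subgraph, 6-layer GNN on top of it.
import numpy as np

def extract_scope(adj, root, hops=1):
    nodes = {root}
    for _ in range(hops):
        nodes |= {v for u in list(nodes) for v in adj[u]}
    return sorted(nodes)

def deep_gnn_on_scope(adj, feats, root, hops=1, depth=6, dim=8, seed=0):
    rng = np.random.default_rng(seed)
    nodes = extract_scope(adj, root, hops)              # scope: small localized subgraph ...
    idx = {v: i for i, v in enumerate(nodes)}
    A = np.eye(len(nodes))                              # self-loops
    for u in nodes:
        for v in adj[u]:
            if v in idx:
                A[idx[u], idx[v]] = 1.0
    D_inv = np.diag(1.0 / A.sum(axis=1))
    H = feats[nodes]
    for _ in range(depth):                              # ... but an arbitrarily deep GNN on it
        W = rng.standard_normal((H.shape[1], dim)) / np.sqrt(H.shape[1])
        H = np.maximum(D_inv @ A @ H @ W, 0.0)
    return H[idx[root]]

adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2, 4], 4: [3]}
feats = np.eye(5)
print(deep_gnn_on_scope(adj, feats, root=0))            # embedding of node 0
```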

A Convolutional Neural Network for Modelling Sentences

Nal Kalchbrenner, Edward Grefenstette, Phil Blunsom
2014 Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
We describe a convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) that we adopt for the semantic modelling of sentences.  ...  The network does not rely on a parse tree and is easily applicable to any language.  ...  A central class of models are those based on neural networks.  ... 
doi:10.3115/v1/p14-1062 dblp:conf/acl/KalchbrennerGB14 fatcat:zqy3b2v2qzajnnpz7lvfz7eea4
Showing results 1 — 15 out of 2,337 results