8,994 Hits in 2.8 sec

Improving Recommendation Techniques by Deep Learning and Large Scale Graph Partitioning

Gourav Bathla, Rinkle Rani, Himanshu Aggarwal
2018 International Journal of Advanced Computer Science and Applications  
A social big graph is most suitable for large-scale social data. Further improvements for recommendations are explained with the use of large-scale graph partitioning.  ...  Recommendation is a crucial technique for social networking sites and business organizations.  ...  The convolutional neural network, deep feedforward model, recurrent neural network model, and deep belief model are described with their relevance.  ...
doi:10.14569/ijacsa.2018.091049 fatcat:duhdn5ipknbflekirpkpnpvdhi

A Review of Graph Signal Processing with Neural Networks

Yuzhong Yan, Cajetan M. Akujuobi
2022 North Atlantic University Union: International Journal of Circuits, Systems and Signal Processing  
For the popular topics of processing graph data with neural networks, the main models/frameworks, datasets, and applications are discussed in detail.  ...  In this paper, we review the development of the traditional graph signal processing methodology and the recent research areas that apply graph neural networks to graph data.  ...  At the same time, with the great success of deep neural networks in image analytics and natural language processing, a more intuitive thought is to apply deep neural networks to the large  ... 
doi:10.46300/9106.2022.16.91 fatcat:l6llhel3xfbhjhfkxzyoxp4oqm

PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models

Benedek Rozemberczki, Paul Scherer, Yixuan He, George Panagopoulos, Alexander Riedel, Maria Astefanoaei, Oliver Kiss, Ferenc Beres, Guzmán López, Nicolas Collignon, Rik Sarkar
2021 Proceedings of the 30th ACM International Conference on Information & Knowledge Management  
PyTorch Geometric Temporal was created with foundations on existing libraries in the PyTorch ecosystem, streamlined neural network layer definitions, temporal snapshot generators for batching, and integrated  ...  We present PyTorch Geometric Temporal, a deep learning framework combining state-of-the-art machine learning algorithms for neural spatiotemporal signal processing.  ...  Listing 6: Evaluating the recurrent graph convolutional neural network with GPU-based acceleration.  ... 
doi:10.1145/3459637.3482014 fatcat:gzljodd7c5cqjg2cdnbfawq7hq
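The entry above describes PyTorch Geometric Temporal's temporal snapshot batching, recurrent graph convolutional layers, and GPU-based evaluation. A minimal sketch in the style of the library's public examples follows; the `ChickenpoxDatasetLoader` dataset, the `DCRNN` layer, and the exact signatures are assumptions that may differ between library versions.

```python
import torch
import torch.nn.functional as F
from torch_geometric_temporal.dataset import ChickenpoxDatasetLoader
from torch_geometric_temporal.signal import temporal_signal_split
from torch_geometric_temporal.nn.recurrent import DCRNN

# Temporal snapshot iterator: each snapshot is a small PyTorch Geometric graph.
dataset = ChickenpoxDatasetLoader().get_dataset()
train_data, test_data = temporal_signal_split(dataset, train_ratio=0.9)

class RecurrentGCN(torch.nn.Module):
    def __init__(self, node_features):
        super().__init__()
        self.recurrent = DCRNN(node_features, 32, 1)  # diffusion-convolutional recurrent layer
        self.linear = torch.nn.Linear(32, 1)

    def forward(self, x, edge_index, edge_weight):
        h = F.relu(self.recurrent(x, edge_index, edge_weight))
        return self.linear(h).squeeze(-1)             # one prediction per node

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")  # GPU acceleration if available
model = RecurrentGCN(node_features=4).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for epoch in range(50):
    cost = 0
    for t, snapshot in enumerate(train_data):
        snapshot = snapshot.to(device)
        y_hat = model(snapshot.x, snapshot.edge_index, snapshot.edge_attr)
        cost = cost + torch.mean((y_hat - snapshot.y) ** 2)
    cost = cost / (t + 1)
    cost.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The nested loop mirrors the abstract's description: the recurrent layer carries information across time while each snapshot supplies the graph structure for message passing at that time point.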

PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models [article]

Benedek Rozemberczki and Paul Scherer and Yixuan He and George Panagopoulos and Alexander Riedel and Maria Astefanoaei and Oliver Kiss and Ferenc Beres and Guzmán López and Nicolas Collignon and Rik Sarkar
2021 arXiv   pre-print
PyTorch Geometric Temporal was created with foundations on existing libraries in the PyTorch ecosystem, streamlined neural network layer definitions, temporal snapshot generators for batching, and integrated  ...  We present PyTorch Geometric Temporal, a deep learning framework combining state-of-the-art machine learning algorithms for neural spatiotemporal signal processing.  ...  Operating on a temporal graph sequence, these models perform message passing at each time point with a graph neural network block, and the new temporal information is incorporated by a temporal deep learning  ... 
arXiv:2104.07788v3 fatcat:ktnji6kjrzd7no6blp6zimfutu

Survey on Software Tools that Implement Deep Learning Algorithms on Intel/x86 and IBM/Power8/Power9 Platforms

2019 Supercomputing Frontiers and Innovations  
But to get these results, neural networks become progressively more complex and thus need far more training; training a neural network today can take weeks.  ...  Neural networks are becoming more and more popular in science and in industry.  ...  Acknowledgements: The results described in this paper were obtained with the financial support of a grant from the Russian Federation President Fund (MK-2330.2019.9).  ... 
doi:10.14529/jsfi190404 fatcat:7ou5vt4bdvdljcvzjz2jsx7osa

Analyzing the Performance of Graph Neural Networks with Pipe Parallelism [article]

Matthew T. Dearing, Xiaoyan Wang
2021 arXiv   pre-print
In this study, we focus on Graph Neural Networks (GNNs), which have found great success in tasks such as node or edge classification and link prediction.  ...  Many interesting datasets ubiquitous in machine learning and deep learning can be described via graphs.  ...  The neural network model was implemented in PyTorch with the graph frameworks PyTorch Geometric (PyG) (Fey & Lenssen, 2019) and Deep Graph Library (DGL) (Wang et al., 2019).  ... 
arXiv:2012.10840v2 fatcat:ygantb35i5ghrkulczmx7dyb2e
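The snippet above notes that the models were built with PyTorch Geometric and DGL. As a point of reference only, and not the authors' actual model or their pipe-parallel setup, here is a minimal two-layer GCN for node classification in PyG; the graph and dimensions are illustrative.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class TwoLayerGCN(torch.nn.Module):
    """Plain two-layer graph convolutional network for node classification."""
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        h = F.dropout(h, p=0.5, training=self.training)
        return self.conv2(h, edge_index)  # per-node class logits

# Tiny synthetic graph: 4 nodes, 3 undirected edges, 16-dim features, 3 classes.
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])
logits = TwoLayerGCN(16, 32, 3)(x, edge_index)
print(logits.shape)  # torch.Size([4, 3])
```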

RPC Considered Harmful: Fast Distributed Deep Learning on RDMA [article]

Jilong Xue, Youshan Miao, Cheng Chen, Ming Wu, Lintao Zhang, Lidong Zhou
2018 arXiv   pre-print
The tensor abstraction and data-flow graph, coupled with an RDMA network, offer the opportunity to reduce unnecessary overhead (e.g., memory copies) without sacrificing programmability and generality  ...  We show that RPC is sub-optimal for distributed deep learning computation, especially on an RDMA-capable network.  ...  convolutional neural network (CNN), recurrent neural network (RNN), and fully connected neural network (FCN).  ... 
arXiv:1805.08430v1 fatcat:6ifs7r7suvdlzojyvcsocbu5ia

NeuGraph: Parallel Deep Neural Network Computation on Large Graphs

Lingxiao Ma, Zhi Yang, Youshan Miao, Jilong Xue, Ming Wu, Lidong Zhou, Yafei Dai
2019 USENIX Annual Technical Conference  
We present NeuGraph, a new framework that bridges the graph and dataflow models to support efficient and scalable parallel neural network computation on graphs.  ...  This evolution has led to large graph-based neural network models that go beyond what existing deep learning frameworks or graph computing systems are designed for.  ...  These methods, known as graph neural networks (GNNs), combine standard neural networks with iterative graph propagation: the property of a vertex is computed recursively (with neural networks) from the  ... 
dblp:conf/usenix/MaYMXWZD19 fatcat:zr2sgdhlefa3rj77j3hi3bsvnq
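The NeuGraph snippet above summarizes the GNN recipe of combining standard neural networks with iterative graph propagation, where a vertex's property is computed from its neighbours. The following is a toy from-scratch sketch of one propagation round in plain PyTorch, illustrative only and not NeuGraph's dataflow implementation.

```python
import torch

def gnn_layer(h, edge_index, weight):
    """One round of graph propagation: every vertex sums the transformed
    features of its in-neighbours, then applies a nonlinearity."""
    src, dst = edge_index                              # [2, num_edges] source/destination ids
    messages = h[src] @ weight                         # per-edge messages from source vertices
    out = messages.new_zeros(h.size(0), weight.size(1))
    out.index_add_(0, dst, messages)                   # aggregate messages at each destination
    return torch.relu(out)

# Two stacked rounds: each vertex ends up depending on its 2-hop neighbourhood.
h = torch.randn(5, 8)                                  # 5 vertices, 8-dim features
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
w1, w2 = torch.randn(8, 16), torch.randn(16, 16)
h = gnn_layer(gnn_layer(h, edge_index, w1), edge_index, w2)
```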

ChemicalX: A Deep Learning Library for Drug Pair Scoring [article]

Benedek Rozemberczki, Charles Tapley Hoyt, Anna Gogleva, Piotr Grabowski, Klas Karis, Andrej Lamov, Andriy Nikolov, Sebastian Nilsson, Michael Ughetto, Yu Wang, Tyler Derr, Benjamin M Gyori
2022 arXiv   pre-print
Our system provides neural network layers, custom pair scoring architectures, data loaders, and batch iterators for end users.  ...  In this paper, we introduce ChemicalX, a PyTorch-based deep learning library designed to provide a range of state-of-the-art models for solving the drug pair scoring task.  ...  It can use various architectures such as feedforward neural networks or graph neural networks to achieve this. Definition 5. Context encoder.  ... 
arXiv:2202.05240v3 fatcat:fikimsykfrey3kchdrezkbal5y
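The ChemicalX entry above describes pair scoring models that encode two drugs together with a biological context and output a score. Below is a generic plain-PyTorch illustration of that idea; it is not ChemicalX's actual layer or data-loader API, and all names and dimensions are hypothetical.

```python
import torch

class PairScorer(torch.nn.Module):
    """Toy feedforward pair scorer: embed each drug's features, concatenate
    them with a context vector, and output a probability for the pair."""
    def __init__(self, drug_dim, context_dim, hidden=64):
        super().__init__()
        self.drug_encoder = torch.nn.Sequential(
            torch.nn.Linear(drug_dim, hidden), torch.nn.ReLU())
        self.head = torch.nn.Sequential(
            torch.nn.Linear(2 * hidden + context_dim, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1))

    def forward(self, drug_a, drug_b, context):
        h = torch.cat([self.drug_encoder(drug_a),
                       self.drug_encoder(drug_b),
                       context], dim=-1)
        return torch.sigmoid(self.head(h))

# Batch of 32 hypothetical pairs: 256-dim drug fingerprints, 10-dim context features.
scores = PairScorer(256, 10)(torch.rand(32, 256), torch.rand(32, 256), torch.rand(32, 10))
print(scores.shape)  # torch.Size([32, 1])
```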

Solving differential equations with unknown constitutive relations as recurrent neural networks [article]

Tobias Hagge, Panos Stinis, Enoch Yeung, Alexandre M. Tartakovsky
2017 arXiv   pre-print
Use of techniques from the recent deep learning literature enables training of functions whose behavior manifests over thousands of time steps.  ...  We extend TensorFlow's recurrent neural network architecture to create a simple but scalable and effective solver for the unknown functions, and apply it to a fed-batch bioreactor simulation problem.  ...  TensorFlow Basics: TensorFlow [1] is an open-source software library for scalable, vectorized numerical computation.  ... 
arXiv:1710.02242v1 fatcat:2kuqcgfanvax5jt6bz4cmqjjpi
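The entry above describes unrolling a time-stepping scheme as a recurrent network whose unknown constitutive relation is a trainable function. A minimal sketch of that idea follows, written here in plain PyTorch rather than the paper's TensorFlow extension; the dynamics, step size, and network size are illustrative assumptions.

```python
import torch

# Unknown constitutive relation f(y), approximated by a small MLP.
f_net = torch.nn.Sequential(torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))

def rollout(y0, dt, steps):
    """Unroll forward-Euler time stepping like an RNN: the 'hidden state' is the
    ODE state y, and every cell applies y <- y + dt * f_net(y)."""
    y, traj = y0, []
    for _ in range(steps):
        y = y + dt * f_net(y)
        traj.append(y)
    return torch.stack(traj)

# Fit f_net so the rollout matches observed data from dy/dt = -y (synthetic target).
dt, steps = 0.05, 100
t = dt * torch.arange(1, steps + 1, dtype=torch.float32)
y_obs = torch.exp(-t).unsqueeze(-1)                    # [steps, 1] reference trajectory
optimizer = torch.optim.Adam(f_net.parameters(), lr=1e-2)
for epoch in range(300):
    loss = torch.mean((rollout(torch.ones(1), dt, steps) - y_obs) ** 2)
    optimizer.zero_grad()
    loss.backward()                                    # backpropagate through the unrolled steps
    optimizer.step()
```

Backpropagating through the unrolled loop is what makes the solver trainable over many time steps, which is the difficulty the abstract highlights.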

Scalable variational Monte Carlo with graph neural ansatz [article]

Li Yang, Wenjun Hu, Li Li
2020 arXiv   pre-print
Deep neural networks have been shown to be a potentially powerful ansatz in variational Monte Carlo for solving quantum many-body problems. We propose two improvements in this direction.  ...  The first is the graph neural ansatz (GNA), a variational wavefunction universal to arbitrary geometry.  ...  Broader Impact: This research develops a scalable variational Monte Carlo (VMC) algorithm and a universal graph neural ansatz (GNA).  ... 
arXiv:2011.12453v1 fatcat:vrccelrumzfvjadebwvwvjfmxu

Scalable Power Control/Beamforming in Heterogeneous Wireless Networks with Graph Neural Networks [article]

Xiaochen Zhang, Haitao Zhao, Jun Xiong, Li Zhou, Jibo Wei
2021 arXiv   pre-print
interference graph neural network (HIGNN) to handle these challenges.  ...  It is noteworthy that HIGNN is scalable to wireless networks of growing size, with robust performance after being trained on small networks.  ...  Sun et al., where deep neural networks (DNNs) are adopted to imitate the input-output  ... 
arXiv:2104.05463v2 fatcat:t4gyp47cyvhzzeemgakz7gecwu

Resources [chapter]

Zhiyuan Liu, Yankai Lin, Maosong Sun
2020 Representation Learning for Natural Language Processing  
However, training a deep neural network is usually a very time-intensive process and requires a lot of code to build the related models.  ...  To alleviate these issues, some deep learning frameworks have been developed and released, which incorporate the necessary arithmetic operators for constructing neural networks.  ...  Moreover, GraphVite is designed to be scalable. Even with limited memory, GraphVite can process node embedding tasks on billion-scale graphs.  ... 
doi:10.1007/978-981-15-5573-2_10 fatcat:qs6uihkvjnfmndbdtr32buqt7y

Review on Different Software Tools for Deep Learning

Anshuja Anand Meshram
2022 International Journal for Research in Applied Science and Engineering Technology  
In this review paper, we discuss the features of some popular open-source software tools available for deep learning, along with their advantages and disadvantages.  ...  Abstract: Deep learning applications have been applied in various domains in recent years. Training a deep learning model is a very time-consuming task.  ...  programming support for deep neural networks and machine learning techniques. c) It includes highly scalable computation with various data sets. d) TensorFlow uses GPU computing, automating  ... 
doi:10.22214/ijraset.2022.39873 fatcat:dyrasieoujgw3damqexfq2urfm

Representation Learning on Spatial Networks

Zheng Zhan, Liang Zhao
2021 Neural Information Processing Systems  
Hence it cannot be modeled merely with either spatial or network models individually.  ...  Spatial networks are networks whose nodes and edges are constrained by geometry and embedded in real space, which has crucial effects on their topological properties.  ...  • Geometric Deep Learning on Graphs. The earliest attempts we are aware of to generalize neural networks to graphs are attributed to M. Gori et al. [40].  ... 
dblp:conf/nips/ZhanZ21 fatcat:djdflwk635anpf6sqyjbacbs5q
Showing results 1 — 15 out of 8,994 results