7,475 Hits in 7.1 sec

Smooth Variational Graph Embeddings for Efficient Neural Architecture Search [article]

Jovita Lukasik and David Friede and Arber Zela and Frank Hutter and Margret Keuper
2021 arXiv   pre-print
Architecture optimization from a learned embedding space, for example through graph neural network based variational autoencoders, builds a middle ground and leverages advantages from both sides.  ...  In this paper, we propose a two-sided variational graph autoencoder, which allows us to smoothly encode and accurately reconstruct neural architectures from various search spaces.  ...  Conclusion: In this paper, we proposed SVGe and SVGe triplet, a Smooth Variational Graph embedding model for NAS.  ...
arXiv:2010.04683v3 fatcat:geopbtghf5ftfcyxwwr4ckd7ym
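
For orientation, here is a minimal sketch of the encoder half of such a variational graph autoencoder, assuming one-hot operation labels and a dense adjacency matrix; the layer sizes and GRU-based update are illustrative stand-ins, not the SVGe implementation.

```python
# Minimal sketch of a graph-VAE encoder for neural architectures
# (hypothetical shapes and names; not the authors' code).
import torch
import torch.nn as nn

class GraphVAEEncoder(nn.Module):
    def __init__(self, num_ops: int, hidden: int = 64, latent: int = 32):
        super().__init__()
        self.embed = nn.Linear(num_ops, hidden)   # one-hot op -> node feature
        self.msg = nn.Linear(hidden, hidden)      # message transform
        self.upd = nn.GRUCell(hidden, hidden)     # node state update
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)

    def forward(self, ops: torch.Tensor, adj: torch.Tensor, rounds: int = 3):
        # ops: (n, num_ops) one-hot operation labels; adj: (n, n) adjacency.
        h = torch.relu(self.embed(ops))
        for _ in range(rounds):                   # synchronous message passing
            m = adj @ self.msg(h)                 # aggregate neighbor messages
            h = self.upd(m, h)
        g = h.mean(dim=0, keepdim=True)           # mean-pool graph readout
        mu, logvar = self.mu(g), self.logvar(g)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return z, mu, logvar

enc = GraphVAEEncoder(num_ops=5)
ops = torch.eye(5)[torch.tensor([0, 1, 2, 3, 4])]  # 5-node toy cell
adj = torch.triu(torch.ones(5, 5), diagonal=1)     # fully connected DAG
z, mu, logvar = enc(ops, adj)
```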

Graph-based Neural Architecture Search with Operation Embeddings [article]

Michail Chatzianastasis, George Dasoulas, Georgios Siolas, Michalis Vazirgiannis
2021 arXiv   pre-print
Neural Architecture Search (NAS) has recently gained increased attention as a class of approaches that automatically search an input space of network architectures.  ...  Finally, our method produces top-performing architectures that share similar operation and graph patterns, highlighting a strong correlation between the structural properties of the architecture and its  ...  Graph Representation Learning for Neural Architecture Search.  ...
arXiv:2105.04885v2 fatcat:sb4d3jnlw5bvzp6pyw5oevynpy
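
In its simplest form, the operation-embedding idea amounts to swapping fixed one-hot operation encodings for a trainable lookup table whose output feeds the GNN; a hedged sketch (dimensions and operation ids are made up):

```python
# Hypothetical sketch: learned operation embeddings as GNN node features,
# replacing a fixed one-hot encoding.
import torch
import torch.nn as nn

NUM_OPS, DIM = 7, 32
op_embed = nn.Embedding(NUM_OPS, DIM)   # trainable, unlike a fixed one-hot

op_ids = torch.tensor([0, 3, 3, 5, 1])  # operations of a 5-node cell
node_feats = op_embed(op_ids)           # (5, DIM) features fed to the GNN
```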

Neural Architecture Optimization with Graph VAE [article]

Jian Li, Yong Liu, Jiankun Liu, Weiping Wang
2020 arXiv   pre-print
In this paper, we propose an efficient NAS approach to optimize network architectures in a continuous space, where the latent space is built upon a variational autoencoder (VAE) and graph neural networks  ...  Due to their high computational efficiency in a continuous space, gradient optimization methods have shown great potential in the neural architecture search (NAS) domain.  ...  Related Work: Variational Graph Autoencoder. A variational graph autoencoder is a kind of graph generative model that applies a variational autoencoder (VAE) to graph embeddings.  ...
arXiv:2006.10310v1 fatcat:knuhg6vd6fei5bxtrri2mcx2tq
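
The continuous-space optimization the abstract refers to can be pictured as gradient ascent on a learned performance predictor over latent points; the sketch below is a generic stand-in, not the paper's pipeline (`predictor` and the decode step are assumptions):

```python
# Hedged sketch of gradient-based architecture optimization in a VAE latent
# space: ascend a differentiable performance predictor f(z), then decode.
import torch
import torch.nn as nn

latent = 32
predictor = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, 1))

z = torch.randn(1, latent, requires_grad=True)   # start from a random point
opt = torch.optim.Adam([z], lr=0.05)
for _ in range(100):
    opt.zero_grad()
    loss = -predictor(z).sum()                   # maximize predicted accuracy
    loss.backward()
    opt.step()
# arch = decoder(z)  # decode the optimized latent point back to a graph
```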

Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? [article]

Shen Yan, Yu Zheng, Wei Ao, Xiao Zeng, Mi Zhang
2020 arXiv   pre-print
...  efficiency.  ...  This helps map neural architectures with similar performance to the same regions of the latent space and makes the transition of architectures in the latent space relatively smooth, which considerably  ...  Conducting architecture search on such a smooth performance surface is much easier and hence more efficient.  ...
arXiv:2006.06936v2 fatcat:us3ka7vop5cr7auvfbjnt3kgee
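
The "smooth latent space" claim can be checked empirically by correlating latent distances with performance gaps over architecture pairs; the snippet below uses random stand-in data, where a real check would use the pretrained embeddings and benchmark accuracies.

```python
# Illustrative smoothness check: do nearby embeddings have similar accuracy?
import torch

emb = torch.randn(100, 32)         # stand-in: pretrained architecture embeddings
acc = torch.rand(100)              # stand-in: matching validation accuracies
i = torch.randint(0, 100, (500,))
j = torch.randint(0, 100, (500,))
d_lat = (emb[i] - emb[j]).norm(dim=1)     # latent-space distances
d_acc = (acc[i] - acc[j]).abs()           # performance gaps
corr = torch.corrcoef(torch.stack([d_lat, d_acc]))[0, 1]  # smooth => corr > 0
print(f"distance/accuracy-gap correlation: {corr:.3f}")
```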

D-VAE: A Variational Autoencoder for Directed Acyclic Graphs [article]

Muhan Zhang, Shali Jiang, Zhicheng Cui, Roman Garnett, Yixin Chen
2019 arXiv   pre-print
In this paper, we study deep generative models for DAGs, and propose a novel DAG variational autoencoder (D-VAE). To encode DAGs into the latent space, we leverage graph neural networks.  ...  We demonstrate the effectiveness of our proposed D-VAE through two tasks: neural architecture search and Bayesian network structure learning.  ...  The authors would like to thank Liran Wang for the helpful discussions.  ...
arXiv:1904.11088v4 fatcat:6aqylg4l4rbwjiqvs35pnwlwv4
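
The key D-VAE ingredient is encoding a DAG by visiting nodes in topological order, so each node's state depends on all of its predecessors; a minimal sketch under assumed shapes (the GRU update mirrors the paper's spirit, not its exact code):

```python
# Sketch of asynchronous (topological-order) message passing over a DAG.
import torch
import torch.nn as nn

class DAGEncoderStep(nn.Module):
    def __init__(self, num_ops: int, hidden: int = 64):
        super().__init__()
        self.embed = nn.Linear(num_ops, hidden)
        self.cell = nn.GRUCell(hidden, hidden)

    def forward(self, ops, adj, topo_order):
        # ops: (n, num_ops); adj[i, j] = 1 if edge i -> j; topo_order: list.
        n, hidden = ops.size(0), self.cell.hidden_size
        h = torch.zeros(n, hidden)
        x = self.embed(ops)
        for v in topo_order:                       # process nodes one by one
            preds = adj[:, v].nonzero().flatten()  # predecessors of v
            agg = h[preds].sum(0) if len(preds) else torch.zeros(hidden)
            h[v] = self.cell(x[v:v+1], agg.unsqueeze(0))[0]
        return h[topo_order[-1]]                   # final node summarizes DAG
```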

Automating Neural Architecture Design without Search [article]

Zixuan Liang, Yanan Sun
2022 arXiv   pre-print
We implemented the proposed approach using a graph neural network for link prediction and acquired the knowledge from NAS-Bench-101.  ...  Neural architecture search (NAS), as the mainstream approach to automating deep neural architecture design, has achieved much success in recent years.  ...  This design could smooth the automation process. • We show that NAL can efficiently and effectively automate neural architecture design for CIFAR-10 when the knowledge is obtained from NAS-Bench-101, and also  ...
arXiv:2204.11838v1 fatcat:ge3ewpsqkjd4pjwz3wxwoq55xy
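
Framing architecture construction as link prediction can be illustrated with the simplest possible edge scorer, a dot product over GNN node embeddings; this is a generic stand-in for the trained predictor described in the entry above, not the paper's model.

```python
# Hedged sketch: score a candidate edge (u, v) from node embeddings.
import torch

def edge_score(h: torch.Tensor, u: int, v: int) -> torch.Tensor:
    # h: (n, d) node embeddings produced by any GNN encoder.
    return torch.sigmoid(h[u] @ h[v])   # probability of keeping edge u -> v

h = torch.randn(7, 32)                  # stand-in embeddings
keep = edge_score(h, 0, 3) > 0.5        # greedy construction decision
```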

Graph-RISE: Graph-Regularized Image Semantic Embedding [article]

Da-Cheng Juan, Chun-Ta Lu, Zhen Li, Futang Peng, Aleksei Timofeev, Yi-Ting Chen, Yaxi Gao, Tom Duerig, Andrew Tomkins, Sujith Ravi
2019 arXiv   pre-print
In this paper, we present Graph-Regularized Image Semantic Embedding (Graph-RISE), a large-scale neural graph learning framework that allows us to train embeddings to discriminate an unprecedented O(40M  ...  Graph-RISE outperforms state-of-the-art image embedding algorithms on several evaluation tasks, including image classification and triplet ranking.  ...  Thomas Leung for the reviews and suggestions. We also thank Expander, Image Understanding and several related teams for the technical support.  ... 
arXiv:1902.10814v1 fatcat:m5a6vg7yz5g5bcayu3st7lua2m
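
Graph regularization of this kind typically adds a neighbor-agreement term to the supervised loss; a hedged sketch follows (the loss form, `alpha`, and the neighbor pairs are illustrative, not Graph-RISE's exact objective):

```python
# Sketch of a graph-regularized objective: supervised loss plus a term
# pulling embeddings of graph neighbors together.
import torch
import torch.nn.functional as F

def graph_regularized_loss(logits, labels, emb, nbr_pairs, alpha=0.1):
    # logits: (B, C); emb: (B, d); nbr_pairs: list of (i, j) index pairs.
    sup = F.cross_entropy(logits, labels)
    if nbr_pairs:
        i, j = zip(*nbr_pairs)
        reg = F.mse_loss(emb[list(i)], emb[list(j)])  # neighbor agreement
    else:
        reg = torch.zeros((), device=emb.device)
    return sup + alpha * reg
```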

Evolutionary Architecture Search for Graph Neural Networks [article]

Min Shi, David A. Wilson, Xingquan Zhu, Yu Huang, Yuan Zhuang, Jianxun Liu, Yufei Tang
2020 arXiv   pre-print
To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models.  ...  In particular, Neural Architecture Search (NAS) has seen significant attention throughout the AutoML research community, and has pushed forward the state-of-the-art in a number of neural models to address  ...  Definition 2 (Graph Neural Architecture Search).  ... 
arXiv:2009.10199v1 fatcat:k2h23byz2jfebbiw3mzckveblm
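
Evolutionary architecture search, stripped to its core, is a mutate-select loop over genotypes; the toy loop below uses a random placeholder fitness where the real method would train and validate each candidate GNN (the search space is illustrative).

```python
# Toy evolutionary loop over GNN architecture "genotypes".
import random

SPACE = {"agg": ["mean", "max", "sum"], "hidden": [16, 32, 64], "layers": [1, 2, 3]}

def mutate(g):
    g = dict(g)
    k = random.choice(list(SPACE))
    g[k] = random.choice(SPACE[k])     # resample one gene
    return g

def fitness(g):                        # placeholder: train GNN, return val acc
    return random.random()

pop = [{k: random.choice(v) for k, v in SPACE.items()} for _ in range(10)]
for _ in range(20):                    # simple (mu + lambda)-style evolution
    pop.sort(key=fitness, reverse=True)
    pop = pop[:5] + [mutate(random.choice(pop[:5])) for _ in range(5)]
best = max(pop, key=fitness)
```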

Network representation learning: A macro and micro view

Xueyi Liu, Jie Tang
2021 AI Open  
Existing algorithms can be categorized into three groups: shallow embedding models, heterogeneous network embedding models, and graph neural network based models.  ...  We review state-of-the-art algorithms for each category and discuss the essential differences between these algorithms.  ...  Acknowledgments The work is supported by the National Key R&D Program of China (2018YFB1402600), NSFC for Distinguished Young Scholar (61825602), NSFC (61672313), NSFC (61836013), and Tsinghua-Bosch Joint  ...
doi:10.1016/j.aiopen.2021.02.001 fatcat:6ktfheijvjdnfhja5oqobse5b4

Learning Where To Look – Generative NAS is Surprisingly Efficient [article]

Jovita Lukasik, Steffen Jung, Margret Keuper
2022 arXiv   pre-print
The efficient, automated search for well-performing neural architectures (NAS) has drawn increasing attention in the recent past.  ...  Thereby, the predominant research objective is to reduce the necessity of costly evaluations of neural architectures while efficiently exploring large search spaces.  ...  a generator with a supernet and searches for neural architectures for different device information. [65] facilitates [62] with an MLP decoder. [35] proposes smooth variational graph embeddings (SVGe  ... 
arXiv:2203.08734v2 fatcat:mrjt27bauvhh5gdlsll2wzohee

Anisotropic Graph Convolutional Network for Semi-supervised Learning [article]

Mahsa Mesgaran, A. Ben Hamza
2020 arXiv   pre-print
However, these networks suffer from over-smoothing and a shrinking effect of the graph, due in large part to the fact that they diffuse features across the edges of the graph using a linear Laplacian  ...  Graph convolutional networks learn effective node embeddings that have proven useful for achieving high-accuracy predictions in semi-supervised learning tasks, such as node classification.  ...  [27] present the graph attention network, a graph-based neural network architecture that uses an attention mechanism to assign self-attention scores to neighboring node embeddings.  ...
arXiv:2010.10284v1 fatcat:b3mjk6th2fajhlvx2clcseyfye
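
The attention mechanism the snippet refers to (GAT-style) scores each neighbor before aggregation; a minimal dense-matrix sketch, with self-loops added so every row keeps at least one finite score:

```python
# Minimal GAT-style attention scores over a dense adjacency matrix.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATScores(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):
        z = self.W(h)                                        # (n, out_dim)
        n = z.size(0)
        pair = torch.cat([z.unsqueeze(1).expand(n, n, -1),   # all (i, j) pairs
                          z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.a(pair).squeeze(-1))           # raw scores (n, n)
        e = e.masked_fill(adj == 0, float("-inf"))           # keep real edges
        return torch.softmax(e, dim=1)                       # attention weights

h, adj = torch.randn(5, 16), torch.triu(torch.ones(5, 5), diagonal=1)
adj = adj + torch.eye(5)           # self-loops keep every row finite
att = GATScores(16, 8)(h, adj)     # (5, 5) row-stochastic attention
```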

RoSGAS: Adaptive Social Bot Detection with Reinforced Self-Supervised GNN Architecture Search [article]

Yingguang Yang, Renyu Yang, Yangyang Li, Kai Cui, Zhiqin Yang, Yue Wang, Jie Xu, Haiyong Xie
2022 arXiv   pre-print
to design a dedicated neural network architecture for a specific classification task.  ...  RoSGAS uses a multi-agent deep reinforcement learning (RL) mechanism to navigate the search for the optimal neighborhood and network layers, learning the subgraph embedding individually for each target user  ...  ACKNOWLEDGMENT We thank anonymous reviewers for the helpful comments they provided on earlier drafts of the manuscript. Zhiqin  ...
arXiv:2206.06757v1 fatcat:hcpaigg55jcbbhb77leoplm6ba

PACE: A Parallelizable Computation Encoder for Directed Acyclic Graphs [article]

Zehao Dong, Muhan Zhang, Fuhai Li, Yixin Chen
2022 arXiv   pre-print
Optimization of directed acyclic graph (DAG) structures has many applications, such as neural architecture search (NAS) and probabilistic graphical model learning.  ...  We demonstrate the superiority of PACE through encoder-dependent optimization subroutines that search for the optimal DAG structure based on the learned DAG embeddings.  ...  For NAS301, since we do not have an oracle for the (offline) best neural architecture, we use the test accuracy of the best searched neural architecture instead. OGBG-CODE2.  ...
arXiv:2203.10304v1 fatcat:qkuei5c7mzeppiy43jgkl57isa
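
The parallelism PACE targets can be pictured as a single masked self-attention pass in which each node attends only to its DAG ancestors, rather than a sequential node-by-node recurrence; the mask construction below is a generic sketch, not the paper's positional-encoding scheme.

```python
# Sketch: build an ancestor mask so one attention pass respects DAG order.
import torch

def ancestor_mask(adj: torch.Tensor) -> torch.Tensor:
    # Transitive closure via boolean matrix powers; adj[i, j] = 1 for i -> j.
    n = adj.size(0)
    reach = adj.bool()
    for _ in range(n):
        reach = reach | (reach.float() @ adj).bool()
    return reach.T | torch.eye(n, dtype=torch.bool)  # j attends to ancestors

adj = torch.triu(torch.ones(4, 4), diagonal=1)
mask = ancestor_mask(adj)
# Pass `~mask` as attn_mask (bool True = blocked) to nn.MultiheadAttention.
```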

Network representation learning: A macro and micro view [article]

Xueyi Liu, Jie Tang
2021 arXiv   pre-print
Existing algorithms can be categorized into three groups: shallow embedding models, heterogeneous network embedding models, and graph neural network based models.  ...  We review state-of-the-art algorithms for each category and discuss the essential differences between these algorithms.  ...  Acknowledgments The work is supported by the National Key R&D Program of China (2018YFB1402600), NSFC for Distinguished Young Scholar (61825602), NSFC (61672313), NSFC (61836013), and Tsinghua-Bosch Joint  ...
arXiv:2111.10772v1 fatcat:rmmplc4qbzhkxloauzz3micbay

Learning To Solve Circuit-SAT: An Unsupervised Differentiable Approach

Saeed Amizadeh, Sergiy Matusevych, Markus Weimer
2019 International Conference on Learning Representations  
Our framework is built upon two fundamental contributions: a rich embedding architecture that encodes the problem structure, and an end-to-end differentiable training procedure that mimics Reinforcement  ...  Recent efforts to combine Representation Learning with Formal Methods, commonly known as Neuro-Symbolic Methods, have given rise to a new trend of applying rich neural architectures to solve classical  ...  For this very reason, there has recently been a strong push to employ structure-aware architectures such as different variations of neural graph embedding.  ...
dblp:conf/iclr/AmizadehMW19 fatcat:byst75hnqrce7ch3gjd4ugc4dq
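
An end-to-end differentiable treatment of circuit evaluation usually replaces Boolean gates with smooth surrogates so gradients can reach the input assignment; the gate relaxations below are illustrative (the paper's exact smoothing and training procedure differ).

```python
# Hedged sketch: soft logic gates make circuit evaluation differentiable.
import torch

def soft_not(x):            # NOT(x) -> 1 - x
    return 1.0 - x

def soft_and(xs, t=10.0):   # AND -> smooth minimum (softmin-weighted mean)
    w = torch.softmax(-t * xs, dim=0)
    return (w * xs).sum()

def soft_or(xs, t=10.0):    # OR -> smooth maximum
    w = torch.softmax(t * xs, dim=0)
    return (w * xs).sum()

x = torch.tensor([0.9, 0.2, 0.8], requires_grad=True)  # soft input assignment
sat = soft_or(torch.stack([soft_and(x[:2]), soft_not(x[2])]))
sat.backward()              # gradients suggest how to adjust the assignment
```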
Showing results 1 — 15 out of 7,475 results