
Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [article]

Xin Chen, Lingxi Xie, Jun Wu, Longhui Wei, Yuhui Xu, Qi Tian
2020 arXiv   pre-print
We alleviate this issue by training a graph convolutional network to fit the performance of sampled sub-networks so that the impact of random errors becomes minimal.  ...  of architectures in the entire search space.  ...  Conclusions This paper introduces a novel idea that uses a graph convolutional network to assist weight-sharing neural architecture search.  ... 
arXiv:2004.08423v2 fatcat:jak2qi2wm5ffbflnpylyfq2txq
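The core mechanism this entry describes, a GCN trained as a performance predictor over architecture graphs, can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation; the toy cell graph, one-hot operation encoding, and all dimensions are assumptions:

```python
import numpy as np

def normalize_adj(A):
    # symmetric normalization with self-loops: A_hat = D^-1/2 (A + I) D^-1/2
    A_loop = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_loop.sum(axis=1))
    return A_loop * np.outer(d_inv_sqrt, d_inv_sqrt)

def gcn_predict(A, X, W1, w2):
    # one graph-convolution layer with ReLU, then mean pooling and a linear head
    H = np.maximum(normalize_adj(A) @ X @ W1, 0.0)
    return float(H.mean(axis=0) @ w2)   # scalar performance estimate

# toy 3-node cell: edges 0-1, 0-2, 1-2, treated as undirected for the GCN
A = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
X = np.eye(3)                           # one-hot encoding of each node's operation
rng = np.random.default_rng(0)
score = gcn_predict(A, X, rng.normal(size=(3, 4)), rng.normal(size=4))
```

In practice the weights W1 and w2 would be fitted by regressing on (architecture, measured accuracy) pairs collected from sampled sub-networks, which is what lets the predictor smooth over random evaluation errors.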

Efficient Deep Reinforcement Learning via Adaptive Policy Transfer

Tianpei Yang, Jianye Hao, Zhaopeng Meng, Zongzhang Zhang, Yujing Hu, Yingfeng Chen, Changjie Fan, Weixun Wang, Wulong Liu, Zhaodong Wang, Jiajie Peng
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
PTF can be easily combined with existing DRL methods and experimental results show it significantly accelerates RL and surpasses state-of-the-art policy transfer methods in terms of learning efficiency  ...  and final performance in both discrete and continuous action spaces.  ...  A neural architecture in the search space can be seen as a sub-graph of the one-shot model.  ... 
doi:10.24963/ijcai.2020/424 dblp:conf/ijcai/WangXYYHS20 fatcat:u4byp6tcq5cafbpt54533hjvn4

Neural Graph Embedding for Neural Architecture Search

Wei Li, Shaogang Gong, Xiatian Zhu
Existing neural architecture search (NAS) methods often operate in discrete or continuous spaces directly, which ignores the graphical topology knowledge of neural networks.  ...  Specifically, we represent the building block (i.e. the cell) of neural networks with a neural DAG, and learn it by leveraging a Graph Convolutional Network to propagate and model the intrinsic topology  ...  Acknowledgments This work is supported by the China Scholarship Council, the Alan Turing Institute, and Innovate UK Industrial Challenge Project (98111-571149).  ... 
doi:10.1609/aaai.v34i04.5903 fatcat:iphvazyxbbhirkhh5cop2e7tya

An Introduction to Neural Architecture Search for Convolutional Networks [article]

George Kyriakides, Konstantinos Margaritis
2020 arXiv   pre-print
In this work, we provide an introduction to the basic concepts of NAS for convolutional networks, along with the major advances in search spaces, algorithms and evaluation techniques.  ...  Neural Architecture Search (NAS) is a research field concerned with utilizing optimization algorithms to design optimal neural network architectures.  ...  Evaluation methods may further introduce bias in the search, as they can augment the fitness landscape of the search space.  ... 
arXiv:2005.11074v1 fatcat:dlbggh7f5vfgvirjklonhudx24

Graph HyperNetworks for Neural Architecture Search [article]

Chris Zhang, Mengye Ren, Raquel Urtasun
2020 arXiv   pre-print
To perform NAS, we randomly sample architectures and use the validation accuracy of networks with GHN generated weights as the surrogate search signal.  ...  In this work, we propose the Graph HyperNetwork (GHN) to amortize the search cost: given an architecture, it directly generates the weights by running inference on a graph neural network.  ...  We review the two major building blocks of our model: graph neural networks and hypernetworks.  ... 
arXiv:1810.05749v3 fatcat:u6ox5k5tifacvlveblgri45eha
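The GHN idea summarized above, generating a candidate network's weights by running inference on a graph neural network over its architecture graph, can be sketched as follows. This is a minimal numpy sketch under assumed dimensions; the single message-passing step and the linear decoder are illustrative, not the paper's model:

```python
import numpy as np

def generate_weights(A, X, W_msg, W_dec, shape=(4, 4)):
    """One message-passing step over the architecture graph, then decode each
    node embedding into a weight tensor for that node's operation."""
    H = np.maximum((A + np.eye(len(A))) @ X @ W_msg, 0.0)    # propagate + ReLU
    return [(h @ W_dec).reshape(shape) for h in H]           # one tensor per op

# two-node toy architecture; node features are one-hot operation codes
A = np.array([[0., 1.], [1., 0.]])
X = np.eye(2)
rng = np.random.default_rng(0)
weights = generate_weights(A, X, rng.normal(size=(2, 8)), rng.normal(size=(8, 16)))
```

A surrogate search then plugs these generated weights into each sampled architecture and uses its validation accuracy as the search signal, instead of training every candidate from scratch.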

GraphNAS: Graph Neural Architecture Search with Reinforcement Learning [article]

Yang Gao, Hong Yang, Peng Zhang, Chuan Zhou, Yue Hu
2019 arXiv   pre-print
Specifically, GraphNAS first uses a recurrent network to generate variable-length strings that describe the architectures of graph neural networks, and then trains the recurrent network with reinforcement  ...  In this paper, we propose a Graph Neural Architecture Search method (GraphNAS for short) that enables automatic search of the best graph neural architecture based on reinforcement learning.  ...  The controller network used in Graph-NAS is implemented as a recurrent neural network which requires a state space.  ... 
arXiv:1904.09981v2 fatcat:5pyltylpyrf7rhaqhrm32xlmuy
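The controller described in this entry, a recurrent network emitting variable-length strings that describe GNN architectures, can be approximated for illustration by a stateless sampler over per-step softmax distributions. All option names and the three-step space below are hypothetical, not GraphNAS's actual search space:

```python
import numpy as np

# Hypothetical per-step choices for a GNN layer (illustrative names only)
CHOICES = [["gcn", "gat", "sage"],      # aggregator
           ["sum", "mean", "max"],      # combine function
           ["relu", "tanh", "elu"]]     # activation

def sample_architecture(logits, rng):
    """Sample one token per decision step, mimicking a controller that emits
    a string describing the architecture; logits has one row per step."""
    tokens = []
    for step, options in enumerate(CHOICES):
        p = np.exp(logits[step]) / np.exp(logits[step]).sum()   # softmax
        tokens.append(rng.choice(options, p=p))
    return tokens

rng = np.random.default_rng(0)
logits = np.zeros((3, 3))   # an untrained controller: uniform over options
arch = sample_architecture(logits, rng)
```

In the RL formulation, the validation accuracy of the network built from the sampled string is the reward used to update the controller's logits (via an RNN and policy gradient in the actual method).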

FL-AGCNS: Federated Learning Framework for Automatic Graph Convolutional Network Search [article]

Chunnan Wang, Bozhou Chen, Geng Li, Hongzhi Wang
2021 arXiv   pre-print
Recently, some Neural Architecture Search (NAS) techniques have been proposed for the automatic design of Graph Convolutional Network (GCN) architectures.  ...  Besides, it applies the GCN SuperNet and a weight sharing strategy to speed up the evaluation of GCN models.  ...  Graph Convolutional Network Graph Convolutional Networks (GCNs) are a class of neural networks that generalize the operation of convolution from grid data to graph data (Wu et al., 2019c).  ... 
arXiv:2104.04141v1 fatcat:i4uhec6gcjbwtfwgbhkvy33nd4

Evolutionary Architecture Search for Graph Neural Networks [article]

Min Shi, David A. Wilson, Xingquan Zhu, Yu Huang, Yuan Zhuang, Jianxun Liu, Yufei Tang
2020 arXiv   pre-print
find the best fit of each other.  ...  However, very little work has been done on Graph Neural Network (GNN) learning on unstructured network data.  ...  through the design of multiple graph convolution layers.  ... 
arXiv:2009.10199v1 fatcat:k2h23byz2jfebbiw3mzckveblm

Evolutionary NAS with Gene Expression Programming of Cellular Encoding [article]

Clifford Broni-Bediako, Yuki Murata, Luiz Henrique Mormille, Masayasu Atsumi
2020 arXiv   pre-print
The renaissance of neural architecture search (NAS) has seen classical methods such as genetic algorithms (GA) and genetic programming (GP) being exploited for convolutional neural network (CNN) architectures  ...  tasks; and achieves a competitive classification error rate with the existing NAS methods using fewer GPU resources.  ...  Search Space The basic search units in the search space consist of regular convolutions with batch-norm and ReLU, and CE program symbols.  ... 
arXiv:2005.13110v2 fatcat:7j3t2eb6hvbjbotnu7wq6oorv4

Bridging the Gap between Sample-based and One-shot Neural Architecture Search with BONAS [article]

Han Shi, Renjie Pi, Hang Xu, Zhenguo Li, James T. Kwok, Tong Zhang
2020 arXiv   pre-print
However, due to the weight-sharing of vastly different networks, the one-shot approach is less reliable than the sample-based approach.  ...  Sample-based NAS is the most reliable approach which aims at exploring the search space and evaluating the most promising architectures. However, it is computationally very costly.  ...  As weight-sharing is now performed only on a small subset of similarly-performing sub-networks with high BO scores, this is more reasonable than sharing the weights of all sub-networks in the search space  ... 
arXiv:1911.09336v4 fatcat:md2fzod77zg55lf7zxmbzewama

NAS-FCOS: Fast Neural Architecture Search for Object Detection [article]

Ning Wang, Yang Gao, Hao Chen, Peng Wang, Zhi Tian, Chunhua Shen, Yanning Zhang
2020 arXiv   pre-print
With carefully designed search space, search algorithms and strategies for evaluating network quality, we are able to efficiently search a top-performing detection architecture within 4 days using 8 V100  ...  Here we propose to search for the decoder structure of object detectors with search efficiency being taken into consideration.  ...  Fig. 5 shows a trend graph of head weight sharing during search. We set 50 structures as a statistical cycle.  ... 
arXiv:1906.04423v4 fatcat:ujblotcgjnd4boaq3fmegv5zvu

Poisoning the Search Space in Neural Architecture Search [article]

Robert Wu, Nayan Saxena, Rohan Jain
2021 arXiv   pre-print
In this paper, we evaluate the robustness of one such algorithm known as Efficient NAS (ENAS) against data agnostic poisoning attacks on the original search space with carefully designed ineffective operations  ...  More recently, this process of finding the most optimal architectures, given an initial search space of possible operations, was automated by Neural Architecture Search (NAS).  ...  We are also grateful to Kanav Singla (University of Toronto) and Benjamin Zhuo (University of Toronto) for their initial contributions to the codebase.  ... 
arXiv:2106.14406v1 fatcat:djzxfy3kerei5mabmqg2o5a7va

Optimizing Neural Architecture Search using Limited GPU Time in a Dynamic Search Space: A Gene Expression Programming Approach [article]

Jeovane Honorio Alves, Lucas Ferrari de Oliveira
2020 arXiv   pre-print
Despite having limited GPU resource time and a broad search space, our proposal achieved results similar to state-of-the-art manually-designed convolutional networks and also NAS-generated ones, even beating similar  ...  in evolutionary-based NAS, for search and network representation improvements.  ... 
arXiv:2005.07669v1 fatcat:xvbfzdiphzduplawqupb3wptsu

Learning Graph Convolutional Network for Skeleton-Based Human Action Recognition by Neural Searching

Wei Peng, Xiaopeng Hong, Haoyu Chen, Guoying Zhao
Human action recognition from skeleton data, fuelled by the Graph Convolutional Network (GCN) with its powerful capability of modeling non-Euclidean data, has attracted lots of attention.  ...  Specifically, we explore the spatial-temporal correlations between nodes and build a search space with multiple dynamic graph modules.  ...  /2017), Infotech Oulu, and the National Natural Science Foundation of China (Grants No. 61772419).  ... 
doi:10.1609/aaai.v34i03.5652 fatcat:caoobxaaj5cexedhx4bkp4nzje

Efficient Architecture Search for Deep Neural Networks

Ram Deepak Gottapu, Cihan H Dagli
2020 Procedia Computer Science  
This paper addresses the scalability challenge of automatic deep neural architecture search by implementing a parameter sharing approach with regularized genetic algorithm (RGE).  ...  Because of parameter sharing the trained weights in each generation are carried to the next, thereby reducing the GPU hours required for maximizing the validation accuracy.  ...  These random modifications make the genetic algorithm traverse the search space to find an architecture that maximizes the fitness value/validation accuracy.  ... 
doi:10.1016/j.procs.2020.02.246 fatcat:vvxkomnj7zegditowshkp2t6t4
Showing results 1 — 15 out of 1,972 results