17,970 Hits in 5.7 sec

Accelerating Multi-Objective Neural Architecture Search by Random-Weight Evaluation [article]

Shengran Hu, Ran Cheng, Cheng He, Zhichao Lu, Jing Wang, Miao Zhang
2021 arXiv   pre-print
For the goal of automated design of high-performance deep convolutional neural networks (CNNs), Neural Architecture Search (NAS) methodology is becoming increasingly important for both academia and industry. Due  ...  a complexity metric is adopted for multi-objective NAS to balance the model size and performance.  ...  Then we present our proposed approach in Section 3, including the detailed Random-Weight Evaluation, search strategy, and search space and encoding.  ... 
arXiv:2110.05242v1 fatcat:ka663tjptngwtpov2nmbf7ydfq

Searching Toward Pareto-Optimal Device-Aware Neural Architectures [article]

An-Chieh Cheng, Jin-Dong Dong, Chi-Hung Hsu, Shu-Huan Chang, Min Sun, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, Da-Cheng Juan
2018 arXiv   pre-print
Both MONAS and DPP-Net are capable of optimizing accuracy and other objectives imposed by devices, searching for neural architectures that can be best deployed on a wide spectrum of devices: from embedded  ...  Recent breakthroughs in Neural Architecture Search (NAS) have achieved state-of-the-art performance in many tasks such as image classification and language understanding.  ...  , (c) search acceleration, and (d) multi-objective search.  ... 
arXiv:1808.09830v2 fatcat:423unviiuzabhdxnkwhaglt64m

Automated design of error-resilient and hardware-efficient deep neural networks [article]

Christoph Schorn, Thomas Elsken, Sebastian Vogel, Armin Runge, Andre Guntoro, Gerd Ascheid
2019 arXiv   pre-print
Since there are numerous design choices for the architecture of DNNs, with partially opposing effects on the preferred characteristics (such as small error rates at low latency), multi-objective optimization  ...  For this purpose, we derive a set of easily computable objective functions, which enable the fast evaluation of DNN architectures with respect to their hardware efficiency and error resilience solely based  ...  multi-objective optimization (Section 2.3) and neural architecture search (NAS) (Section 2.4).  ... 
arXiv:1909.13844v1 fatcat:x4pdboypqjepbnss4papdgt7ce

A Hardware-Aware Framework for Accelerating Neural Architecture Search Across Modalities [article]

Daniel Cummings, Anthony Sarah, Sharath Nittur Sridhar, Maciej Szankin, Juan Pablo Munoz, Sairam Sundaresan
2022 arXiv   pre-print
Specifically, we show how evolutionary algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate architecture search in a multi-objective setting for various  ...  Recent advances in Neural Architecture Search (NAS) such as one-shot NAS offer the ability to extract specialized hardware-aware sub-network configurations from a task-specific super-network.  ...  Figure 1: Generalizable framework for accelerating super-network type neural architecture search.  ... 
arXiv:2205.10358v1 fatcat:xu7ezcvzhfhcplbvzmp5za7mi4

Accelerating Evolutionary Neural Architecture Search via Multi-Fidelity Evaluation [article]

Shangshang Yang, Ye Tian, Xiaoshu Xiang, Shichen Peng, Xingyi Zhang
2021 arXiv   pre-print
Evolutionary neural architecture search (ENAS) has recently received increasing attention by effectively finding high-quality neural architectures, which, however, incurs a high computational cost by training  ...  To address this issue, in this paper we propose an accelerated ENAS via multi-fidelity evaluation, termed MFENAS, where the individual evaluation cost is significantly reduced by training the architecture  ...  Following the above idea, we propose an accelerated ENAS approach via multi-fidelity evaluation, named MFENAS, where each individual is evaluated by training the architecture encoded by each individual  ... 
arXiv:2108.04541v1 fatcat:hlp7mmnfwbe6pjnrxisryye3jq

Accelerating neural architecture exploration across modalities using genetic algorithms

Daniel Cummings, Sharath Nittur Sridhar, Anthony Sarah, Maciej Szankin
2022 Proceedings of the Genetic and Evolutionary Computation Conference Companion  
In this work, we show how genetic algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate multi-objective architectural exploration in the modalities of both  ...  Neural architecture search (NAS), the study of automating the discovery of optimal deep neural network architectures for tasks in domains such as computer vision and natural language processing, has seen  ...  The goal of this work was to demonstrate how GAs can be uniquely leveraged to accelerate multi-objective neural architecture search for the modalities of machine translation and image classification  ... 
doi:10.1145/3520304.3528786 fatcat:kvf65k6xerckznllnmgt6vtfty

Rethinking Co-design of Neural Architectures and Hardware Accelerators [article]

Yanqi Zhou, Xuanyi Dong, Berkin Akin, Mingxing Tan, Daiyi Peng, Tianjian Meng, Amir Yazdanbakhsh, Da Huang, Ravi Narayanaswami, James Laudon
2021 arXiv   pre-print
Neural architectures and hardware accelerators have been two driving forces for the progress in deep learning.  ...  We systematically study the importance and strategies of co-designing neural architectures and hardware accelerators.  ...  ., 2019) used random forests in automatic tuning of hardware accelerator configuration in a multi-objective setting.  ... 
arXiv:2102.08619v1 fatcat:rah3mjwrdna5hk7hqdipuynmva

Multi-objective Neural Architecture Search with Almost No Training [article]

Shengran Hu, Ran Cheng, Cheng He, Zhichao Lu
2020 arXiv   pre-print
In this work, we propose an effective alternative, dubbed Random-Weight Evaluation (RWE), to rapidly estimate the performance of network architectures.  ...  When integrated within an evolutionary multi-objective algorithm, RWE obtains a set of efficient architectures with state-of-the-art performance on CIFAR-10 with less than two hours' searching on a single  ...  To address the aforementioned issues, we propose the Random-Weight Evaluation (RWE), a flexible and effective method to accelerate the performance evaluation of architectures.  ... 
arXiv:2011.13591v1 fatcat:3mm3x5kg4nboxevba6hlgtgrla

FTT-NAS: Discovering Fault-Tolerant Convolutional Neural Architecture [article]

Xuefei Ning, Guangjun Ge, Wenshuo Li, Zhenhua Zhu, Yin Zheng, Xiaoming Chen, Zhen Gao, Yu Wang, Huazhong Yang
2021 arXiv   pre-print
We propose Fault-Tolerant Neural Architecture Search (FT-NAS) to automatically discover convolutional neural network (CNN) architectures that are resilient to various faults in today's devices.  ...  By inspecting the discovered architectures, we find that the operation primitives, the weight quantization range, the capacity of the model, and the connection pattern influence the fault resilience  ...  faults and weight faults, respectively. • We establish a multi-objective neural architecture search framework.  ... 
arXiv:2003.10375v2 fatcat:2fmahmgy7rckvlqaobtd6jco2i

MOO-DNAS: Efficient Neural Network Design via Differentiable Architecture Search Based on Multi-Objective Optimization

Hui Wei, Feifei Lee, Chunyan Hu, Qiu Chen
2022 IEEE Access  
In this paper, we propose an efficient CNN architecture search framework, MOO-DNAS, with multi-objective optimization based on differentiable neural architecture search.  ...  Fortunately, the emergence of Neural Architecture Search improves the speed of network design, but most existing works optimize only for high accuracy without penalizing model complexity.  ...  INDEX TERMS Differentiable neural architecture search, CNNs, multi-objective optimization, accuracy-efficiency trade-off.  ... 
doi:10.1109/access.2022.3148323 fatcat:j33vitmq2rgbtdtkgqll3pttfe

Multi-Objective Neural Architecture Search Based on Diverse Structures and Adaptive Recommendation [article]

Chunnan Wang, Hongzhi Wang, Guosheng Feng, Fei Geng
2020 arXiv   pre-print
In addition, we design a novel multi-objective method to effectively analyze the historical evaluation information, so as to efficiently search for the Pareto optimal architectures with high accuracy  ...  The search space of neural architecture search (NAS) for convolutional neural networks (CNNs) is huge.  ...  The explored network architecture is transferable to ImageNet and 5 additional datasets, and achieves good results, e.g., 76.0% top-1 accuracy with only 4.9M parameters on ImageNet.  ... 
arXiv:2007.02749v2 fatcat:zaf2fbakkbc6bkkd7rwe55usmu


Li Chen, Hua Xu
2022 Proceedings of the Genetic and Evolutionary Computation Conference Companion  
Neural Architecture Search (NAS) aims to automatically find neural network architectures competitive with human-designed ones.  ...  Inspired by MFEA, we model the NAS task as a two-factorial problem and propose a multifactorial evolutionary neural architecture search (MFENAS) algorithm to solve it.  ...  The simplest way to tackle a multi-objective optimization problem is by converting it into a single objective optimization problem with a linear weighted factor [12] or a nonlinear penalty term [1]  ... 
doi:10.1145/3520304.3528882 fatcat:k2et64ejknhvlkoenfhkajesvi

Neural Architecture Search Survey: A Hardware Perspective

Krishna Teja Chitty-Venkata, Arun K. Somani
2022 ACM Computing Surveys  
of neural algorithm and hardware accelerator specifications.  ...  Hardware-Aware Neural Architecture Search (HW-NAS) automates the architectural design process of DNNs to alleviate human effort, and generate efficient models accomplishing acceptable accuracy-performance  ...  Acknowledgements: The research reported in this paper is supported by the Philip and Virginia Sproul Professorship at Iowa State University.  ... 
doi:10.1145/3524500 fatcat:4ibnwmgbdnbhjpk4u7soc6aom4

A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions [article]

Pengzhen Ren, Yun Xiao, Xiaojun Chang, Po-Yao Huang, Zhihui Li, Xiaojiang Chen, Xin Wang
2021 arXiv   pre-print
Neural Architecture Search (NAS) is just such a revolutionary algorithm, and the related research work is complicated and rich.  ...  Previously related surveys have begun to classify existing work mainly based on the key components of NAS: search space, search strategy, and evaluation strategy.  ...  for multi-objective optimization tasks.  ... 
arXiv:2006.02903v3 fatcat:u3k66cclarcbjd7dhkh5rdb6cu

Device-Circuit-Architecture Co-Exploration for Computing-in-Memory Neural Accelerators [article]

Weiwen Jiang, Qiuwen Lou, Zheyu Yan, Lei Yang, Jingtong Hu, Xiaobo Sharon Hu, Yiyu Shi
2020 arXiv   pre-print
However, state-of-the-art neural architecture search algorithms for the co-exploration are dedicated to the conventional von Neumann computing architecture, whose performance is heavily limited by the  ...  In this paper, we are the first to bring the computing-in-memory architecture, which can easily transcend the memory wall, to interplay with the neural architecture search, aiming to find the most efficient  ...  In this work, we bring the CiM neural accelerator design to interplay with the neural architecture search, aiming to automatically identify the best device, circuit, and neural architecture coupled with  ... 
arXiv:1911.00139v2 fatcat:we43szndbzduzizoq75c64rhhq
Showing results 1 — 15 out of 17,970 results